I am new to web development and not familiar with Django. I have written some Python scripts that do intensive calculations and plot graphs using packages such as numpy, matplotlib, and so on. I want to publish them as a web application on a server so they can be accessed from other computers.
So I am wondering: do I need to copy all the required packages into the project directory before deploying the application to a server? Or will Django automatically handle the Python package dependencies upon deployment?
All the needed packages must be installed on the server where the web application runs. Whatever packages you installed on your personal computer during development will need to be installed on the server as well.
You should create a virtual environment for this.
A virtual environment is a tool that keeps the dependencies required by different projects separate by creating an isolated Python environment for each of them.
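A minimal sketch of that workflow on the server, assuming a Linux host with virtualenv available (the package names are just the ones you mentioned; adjust to your project):

virtualenv venv
source venv/bin/activate
pip install django numpy matplotlib

With the environment activated, pip installs the packages into venv/ instead of into the system Python, so nothing needs to be copied into your project directory and your application's dependencies stay isolated from other projects on the server.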
Related
I'm working on creating my first "real" web app using Django.
Yesterday I learned I should be using a web server like Nginx to serve static files and pass off requests for dynamic content to my web app. I also learned that I need something like Gunicorn as the intermediary between the web server (Nginx) and my Django app.
My question is about virtualenv. It makes sense that we would contain app-related software in its own separate environment. What should I install in the virtualenv, and what gets installed system wide? For example, in this guide we seem to install Python, Nginx and the database system wide (because they're installed before virtualenv is installed), while Django and Gunicorn are installed in the virtualenv. It makes sense that Gunicorn has to go in the virtualenv since it imports our Python app, as explained here. Are the other things required to be installed system wide? Or can I pick either way? Is one way preferred over another?
Thanks!
Virtualenv is for managing Python libraries. It is not for managing Python itself, or for external services such as databases; it does however manage the Python libraries you use to access the database.
There's no room for confusion here, because there's simply no way to install Python itself or a database within a virtualenv.
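A rough illustration of that split, assuming a Debian/Ubuntu server (the package names below are examples only, not a prescription):

# installed system wide: Python itself, the web server, and the database server
sudo apt-get install python3 nginx postgresql

# installed inside the virtualenv: your app's Python libraries
source venv/bin/activate
pip install django gunicorn psycopg2-binary

The system packages are shared by everything on the machine; the pip installs affect only this one environment.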
In development I use Anaconda to manage environments. I have not yet developed a Python project for production. In this context I have two related questions.
First, which solution incurs lower technical debt: A. Install Anaconda on production servers; or B. deploy python as deb packages?
Second, what is the simplest structure of Python project folders and files to test the functionality of make-deb and dh-virtualenv as described in the last section of the Nylas blog article?
Nylas blog (How We Deploy Python Code: Building, packaging & deploying Python using versioned artifacts in Debian packages)
https://www.nylas.com/blog/packaging-deploying-python/
Make-deb:
https://github.com/nylas/make-deb
dh-virtualenv:
https://github.com/spotify/dh-virtualenv
Package Python Application for Linux (Link Added November 2022):
https://opensource.com/article/20/4/package-python-applications-linux
For a test I would only add the Requests package to a standard Python 2.7 environment and write one module to download and save a small CSV file. Then I would like to test make-deb and dh-virtualenv by deploying to a cloud server or Raspberry Pi server. Then I want to run code to verify that the download app works as expected on the server. Then I want to further develop the application and keep testing the deployment tools (make-deb and dh-virtualenv) to see whether I can manage development for production more effectively.
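A minimal sketch of such a test module, assuming Requests is installed in the environment (the URL and output filename are placeholders):

# download_csv.py -- sketch only; replace the URL and filename with your own
import requests

CSV_URL = "https://example.com/data.csv"
OUTPUT_PATH = "data.csv"

def download_csv(url=CSV_URL, path=OUTPUT_PATH):
    """Download a small CSV file and save it to disk."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()  # fail loudly if the server returns an error status
    with open(path, "wb") as f:
        f.write(response.content)
    return path

if __name__ == "__main__":
    print("Saved %s" % download_csv())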
Edit: Based on some further research, it appears Anaconda cannot be made to export a pip-style requirements.txt file. The options appear to be either to use virtualenv, make-deb, and dh-virtualenv, or to use Anaconda and Miniconda roughly as described in the following blog articles (a rough sketch of that conda workflow follows the links):
https://tdhopper.com/blog/2015/Nov/24/my-python-environment-workflow-with-conda/
https://www.thoughtvector.io/blog/deployment-with-anaconda/
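For reference, the conda-based workflow those articles describe looks roughly like this (the environment name is a placeholder):

conda create --name csvtest python=2.7 requests
conda env export > environment.yml      # commit this file instead of requirements.txt
conda env create -f environment.yml     # recreate the environment on another machine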
I'm working on deploying a Django app, and I'm looking at a few tutorials which install Apache within the virtualenv.
http://thecodeship.com/deployment/deploy-django-apache-virtualenv-and-mod_wsgi/
http://michal.karzynski.pl/blog/2013/09/14/django-in-virtualenv-on-webfactions-apache-with-mod-wsgi/
My question is: if I'm trying to deploy to a server that already has Apache installed on it, would installing a separate version of Apache within the environment (as you would Django in general) overwrite any of the Apache settings currently on the server?
Using virtualenv doesn't mean installing a separate version of Apache. In fact, that's not even possible, because virtualenv is for Python libraries only.
Your Django app plus all its libraries lives in the virtualenv, but you use the system's Apache to serve it.
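For illustration, the usual pattern (as in tutorials like the ones linked above) is to keep the system Apache and point mod_wsgi at a wsgi.py that activates the virtualenv. A rough sketch, where the paths and the project name myproject are placeholders for your own layout:

# myproject/wsgi.py -- sketch only; adjust the paths to your own project and virtualenv
import os
import sys

# make the Django project importable
sys.path.append('/home/user/myproject')

# activate the virtualenv so mod_wsgi uses its site-packages
# (activate_this.py is created by the virtualenv tool inside the env's bin/ directory)
activate_this = '/home/user/.virtualenvs/myproject/bin/activate_this.py'
with open(activate_this) as f:
    exec(f.read(), {'__file__': activate_this})

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()

Newer versions of mod_wsgi can instead point directly at the virtualenv with the python-home option of the WSGIDaemonProcess directive; either way, nothing about the system Apache installation gets overwritten.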
I have my development environment set up on Win 7 like this:
Django development structure
Apache -server- C:\Program Files (x86)\Apache Software Foundation\Apache2.4
PostgreSQL -database- C:\Program Files\PostgreSQL\9.2
Django -framework- C:\Python27\Lib\site-packages\django
Python -code- C:\Python27
Project -root- C:\mysite
|----------apps
|----------HTML
|----------CSS
|----------JavaScript
|----------assets
I am attempting to keep this extremely simple to start out. There are 5 main directories each with a distinct purpose. All the code resides in the project folder.
Compared to the WAMP structure:
C:\WAMP
|----------C:\Apache
|----------C:\MySQL
|----------C:\PHP
|----------C:\www
I like how Apache, MySQL, and PHP all reside in one neat directory. I know to keep the project root OUTSIDE the web server's document root in Django, for security reasons.
Is it fine that Apache, PostgreSQL, and Python are installed all over the place in the Django environment?
Did I miss a core Django component and/or directory?
Will deploying and scaling be a problem?
I want this to be a guideline for beginning Django web programmers.
I can answer the questions one by one:
Is it fine that Apache, PostgreSQL, and Python are installed all over the place in the Django environment?
"All over the place" sounds messy, but yes, it is totally fine.
Did I miss a core Django component and/or directory?
No, you didn't miss anything. Django's core is already in the site-packages folder, and your site code is in mysite, which can be located anywhere you want.
Will deploying and scaling be a problem?
No, it won't be a problem with the current structure. You will deploy only your mysite project; the other components will be installed separately on the server.
Something you should get familiar with when starting with Django development:
Most likely your project will be deployed on a Linux server, so it is worth installing and learning Linux.
virtualenv: Soon you will have to install Django, and then a bunch of external packages to support your project. virtualenv helps you isolate your working environment; it is an "unofficial" must when you start with Python development.
virtualenvwrapper to make your life easier when working with virtualenv
git and GitHub or Bitbucket: if you don't know git yet, you should learn it now.
Apache is just a web server; it is used to serve files, but you do not necessarily need it to make a website. Django comes with its own development server:
python manage.py runserver
Apache is required when you are developing PHP websites because your computer does not know how to interpret PHP on its own. But for Django you use the Python language, which you have already installed if you are using Django.
Read https://docs.djangoproject.com/en/1.5/intro/tutorial01/
And when it is time to set up your own server using Apache, look at:
https://docs.djangoproject.com/en/dev/howto/deployment/wsgi/modwsgi/.
Scaling will be a problem on Windows. Python in Apache on Windows gets 64 threads in one process. Couple this with the GIL and you will have scaling issues.
Python and Apache on Linux don't have this same problem. Under Linux wsgi can create multiple processes that have multiple threads each, minimizing GIL issues.
WSGI in Apache on Windows is not a scalable solution, in my opinion.
However, you can develop there and move to Linux for deployment; I do it all the time.
You will want to take advantage of the Apache Alias directive to serve all your static content, such as CSS, JS, and favicon.ico. This frees up Python to handle only the requests that require logic.
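For the Alias approach to work, Django's static files are usually collected into a single directory first. A rough sketch of the relevant settings (the paths are placeholders):

# settings.py (excerpt) -- STATIC_ROOT is wherever you want Apache to serve from
STATIC_URL = '/static/'
STATIC_ROOT = '/var/www/myproject/static'

Running python manage.py collectstatic then copies all static files into STATIC_ROOT, and the Apache Alias directive can point /static/ at that directory.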
I have created a Python web virtual environment containing all the Django- and Pylons-related packages. I use a host Ubuntu desktop PC at home, and I have an Ubuntu virtual machine running on a Windows PC laptop.
Both operating systems are Linux only. I will be using the same environment for production, which will be an Ubuntu server.
Is it possible to store my Python virtual environment in version control and use the same files for the Ubuntu desktop, the laptop's Ubuntu VM, and the Ubuntu server in production?
You might want to look into virtualenv. This will allow you to set up your working environment, 'freeze' the list of packages that are needed to replicate it, and store that list of requirements in version control so that others can check it out and rebuild the environment with a single step.
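A minimal sketch of that freeze-and-rebuild cycle (the file and directory names are conventional, not required):

# on the machine where the environment already works
pip freeze > requirements.txt     # commit this file to version control

# on any other machine (desktop, VM, or production server)
virtualenv venv
source venv/bin/activate
pip install -r requirements.txt

The virtualenv directory itself is best kept out of version control, since it contains compiled, platform-specific files; only requirements.txt needs to be tracked.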
You can, but you don't really need version control for that. Setting up your environment is a one-time job; after that you'll just use it, so why version control it?
If you already have a VM set up, you can export it so that others can copy it and start their own instance with everything installed. VirtualBox and VMware both support VMDK images, and Xen has its own type of VM images.
That is probably not a solution for setting up servers. I like using Turnkey Linux's appliances for development/staging/deployment servers. They are solid Ubuntu servers preconfigured for a particular application: Django, Rails, LAMP, etc. They come as Ubuntu LiveCD ISO files (for installation) or as virtual machine VMDK packages, and can be deployed to Amazon EC2. You might still have to customize that environment further prior to deploying and testing your code, but it can get you further along than a bare Linux server.