Deploying a Django app to production - Azure - Python

I have a Django app in a Github repo. Through a Github action, it is deployed as a Python app in Azure.
In the Azure portal:
1- In "Configuration > Application" settings, I've defined POST_BUILD_COMMAND as
python manage.py makemigrations && python manage.py migrate
as described in Configure a Linux Python app for Azure App Service.
2- I have configured a deployment slot and a production slot. It offers a 'Swap' option, to push the live app in the deployment slot to production.
However, I'm under the impression that doing that doesn't run the POST_BUILD_COMMAND command for the production Django app, leaving the database unaltered - which means that the production frontend gets the new fields/updates, but the migrations don't occur.
What's the best way to perform the migrations in production?
Would the correct way be to set "Configurations > General settings > Startup Command" to 'python manage.py makemigrations && python manage.py migrate'? Would that work?

The best way to perform the migration is through the YAML pipeline configuration.
When installing the required libraries in production, be careful to exclude the venv folder from the migration.
Create a dev environment with live data, and make sure to add the database environment variables in the Azure portal.
Once the database environment variables are in place, the dev environment can work against the live data.
Create the production repo by branching from the dev repo instead of creating the connection from GitHub again.
Make the dev repo the parent of the prod repo, and use the same methodology described in the documentation.
Configure the pipeline based on the environment variables.
Create an automation procedure that updates the production environment at regular intervals.
python manage.py makemigrations && python manage.py migrate
Use the command above to perform the operations in dev. You can use the Azure portal terminal to create the YAML files and change the parameters; this can also be done from VS Code (the best method).
The documentation is the reference for the overall flow:
GitHub -> Azure Dev Repo -> Backup -> Create Prod Repo -> Migrate from Dev to Prod -> Exclude gitignore folder
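A slot swap only moves the built content, so one common workaround is to run migrate from the startup command instead, so it executes whenever a slot boots. Below is a minimal sketch of such a startup script; gunicorn and the myproject module name are placeholders, and note that makemigrations belongs in development (where the generated migration files get committed), while production should only run migrate:

```shell
# startup.sh - hypothetical App Service startup script (works in any slot)
# Apply committed migrations; --noinput keeps this non-interactive, and it
# is safe to re-run because already-applied migrations are skipped.
python manage.py migrate --noinput
# Start the app server; 'myproject' is a placeholder module name.
gunicorn --bind=0.0.0.0:8000 myproject.wsgi
```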

Django on GAE - How to automatically 'migrate' on deploy?

Django v1.11
Postgresql v9.6
Currently, I use 2 Google CloudSQL databases, one for development and one for production. Whenever I make changes to my models, I run python manage.py migrate to update the tables in the development database. This migration does not affect the production database, however.
Now, whenever I git push changes to my Django project, TravisCI automatically runs tests and deploys the code to Google App Engine. Currently, it runs on GAE flexible environment (so I can use Python 3.5)
What I want to have is for Travis or GAE to automatically run python manage.py migrate on the production database before runserver. However, I can't figure out how to run custom commands during a deploy.
I've tried looking around GAE and Travis documentation and adding scripts to .travis.yml and app.yaml, but to no avail.
As of now, anytime there is a model change, I have to migrate the production database locally in a very hacky way. Ideally, GAE will migrate at the beginning of every deploy.
Not sure if you have seen this:
Travis CI Script Deployment
A reference from a similar issue:
How can I run a script as part of a Travis CI build?
Also, consider a database migration tool embedded with your source code; PostgreSQL is supported (something similar to FlywayDB migrations):
Yoyo database migrations

Deploying django app to production server, should I include environment (env) in git clone?

I have a working Django app running on my localhost, inside a virtual environment.
Now I want to deploy the same project to Google Compute Engine.
For that I have a question.
After I set up the production server, including starting the virtual environment with virtualenv env, do I need to clone in the project code from git including the env directory, or only the source code including manage.py?
The process is described differently in different places, so it is a bit confusing.
The main problem is the lack of clarity around deploying the Django app to production and setting up the virtual environment, using git for the code transfer.
Thanks in advance for a full explanation.
My local structure is the following:
valuation <-- project directory w/ manage.py
valuation <-- project w/ settings.py
prophet <-- app
In my production server I have the following structure
opt/valuation <-- virtual environment
valuation <-- empty directory, [this][1] says I should clone the code here
My question is what I should clone from my local project and what I should leave out (mainly manage.py, settings.py, etc.) so that the project will run.
Thanks.
No, you don't need to clone the env folder. Just create a requirements.txt file, which will track all the packages used in the project. You can update the requirements file with:
pip freeze > requirements.txt
On the server, just create a new env and install all the packages with:
pip install -r requirements.txt
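For illustration, the name==version lines that pip freeze records can be approximated with the standard library alone (importlib.metadata, Python 3.8+):

```python
from importlib import metadata

# Build "name==version" lines for every installed distribution,
# roughly what `pip freeze > requirements.txt` writes to the file.
lines = sorted(
    f"{dist.metadata['Name']}=={dist.version}"
    for dist in metadata.distributions()
)
print("\n".join(lines))
```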

How can I use Flask-migrate across multiple development environments

I have flask-migrate (version 1.8.0) working well with a sqlite database in a development environment. Now I would like to migrate our data to MySQL and maintain all of our migration history (so it stays in sync with our Flask-SQLAlchemy models in our git repository).
I created an empty MySQL database, and after changing my SQLALCHEMY_DATABASE_URI, I tried running:
python manage.py db upgrade
That resulted in an error about not being able to drop the table migrate_version. (Which makes sense, since this is a new database, although sqlite actually contains the table 'alembic_version' not 'migrate_version'.)
So, I tried to initialize this new database:
python manage.py db init
Now I get an error: "Directory migrations already exists".
I can rename that folder and re-run the command with no problem, but then I lose all of my previous migrations. I think we would have the same issues when we also transition to our test and production environments.
I've seen in the docs Flask-Migrate has multiple database support, but I think that looks to be more for maintaining multiple databases in a single development environment. Is there a way to have Flask-Migrate track changes across multiple development environments?
To address the real issue in the OP's question, you need to use the --directory flag to initiate a migrations directory specific to each environment's database.
From the flask-migrate documentation:
All commands also take a --directory DIRECTORY option that points to
the directory containing the migration scripts. If this argument is
omitted the directory used is migrations.
So:
flask db init --directory=[DIRECTORY NAME]
Flask-Migrate itself has no memory of your database, so when running migration commands with flask db, it will reference the specified migrations directory (by default, when the --directory flag is not used, this is called 'migrations').
flask db migrate --directory=[DIRECTORY_NAME]
etc.
It goes without saying that the flask command will reference the application context as configured by your config file or environment variables.
I typically create a migration directory for each environment with an explicit reference to the environment: e.g. development and staging, with something like 'migrations_dev' and 'migrations_stg'.
Hope this is helpful.
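Putting the commands above together, a sketch of the per-environment workflow; the directory names and the -m message are just examples, and each set should be run with the corresponding app context configured:

```shell
# One migrations directory per environment (names are examples)
flask db init --directory migrations_dev
flask db migrate --directory migrations_dev -m "add user table"
flask db upgrade --directory migrations_dev

# Same pattern for staging, with the staging config active
flask db init --directory migrations_stg
flask db upgrade --directory migrations_stg
```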
Here are the steps I took to transition from SQLite to MySQL and maintain all the migration history. I highly suspect there is a better way to do this, but it worked for me.
Initialize the new, blank database using another folder for your "new" migrations
python manage.py db init -d tmp
Create a migration
python manage.py db migrate -d tmp -m "Bring MySQL up to date"
Apply the migration
python manage.py db upgrade -d tmp
Now, you can delete the "tmp" migrations folder. You no longer need it. Find the HEAD migration. Look for 'Rev: your_revision_num (head)'
python manage.py db show
Run an update statement against your MySQL database
update alembic_version set version_num = 'your_revision_num'
Now your MySQL database schema should match your old SQLite schema and you'll have your full migration history as well.
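The UPDATE in the last step can be illustrated with an in-memory SQLite database from the standard library (a stand-in here; in this scenario you would run the same statement against MySQL, and 'abc123' is a placeholder revision id):

```python
import sqlite3

# Alembic tracks the current schema revision in a one-row table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE alembic_version (version_num VARCHAR(32) NOT NULL)")
conn.execute("INSERT INTO alembic_version (version_num) VALUES ('old_rev')")

# Stamp the database with the HEAD revision found via `db show`.
conn.execute("UPDATE alembic_version SET version_num = 'abc123'")
version = conn.execute("SELECT version_num FROM alembic_version").fetchone()[0]
print(version)  # abc123
```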
The table migrate_version is used to track migrations by the package sqlalchemy-migrate. Alembic, the package used by Flask-Migrate, uses an alembic_version table, as you know.
So my guess, is that this MySQL database that you want to use has been used before by an application that was under sqlalchemy-migrate control.
I recommend that you delete the MySQL database and make a brand new one.
I had the same need. I wanted to reproduce the command that exists in the Laravel framework for running a migration in different environments:
php artisan migrate --env prod
With this kind of command, you can launch a migration in different environments. I have not found a directly equivalent command in flask.
The "flask db upgrade --env prod" command does not exist; in particular, there is no --env argument.
As a workaround, I created a command that allows me to change the environment:
flask env --name prod
That command is a custom flask command that will copy the content of the .env.prod file to .env.
This way, the application is in the prod environment. And we can then launch the migration command on the prod environment.
How to use the custom env command to migrate to different environments?
To start the migration in the staging environment, just run these two commands:
flask env --name staging
flask db upgrade
Then if you want to start the migration in the production environment, just run these two commands:
flask env --name prod
flask db upgrade
How to create the custom command flask env?
First, you need to know how to create a custom command in flask. Just follow the official flask documentation.
Here is the content of my custom command which allows to change the environment:
import os
import shutil

import click
from flask.cli import with_appcontext

@click.command('env')
@click.option("--name", is_flag=False, flag_value="Flag", default="")
@with_appcontext
def env(name):
    # With no --name argument, just report the current environment.
    if name == '':
        print(os.getenv('FLASK_ENV'))
        return
    if name not in ['dev', 'prod', 'local', 'staging']:
        print('That env does not exist')
        return
    # Copy .env.<name> over .env, which the app loads at start-up.
    shutil.copyfile('.env.' + name, '.env')
In my setup, I have 4 environments: local, dev, staging, prod.
And I have 4 corresponding .env files: .env.local, .env.staging, .env.prod, .env.dev
The custom flask env command also copies the contents of the environment files into the .env file that the flask application loads at start-up.

Moving Django 1.6 to new server

I'm wondering what steps are involved in moving a Django project to a new server.
Basically, I'm completely new to Django and have a few questions. The server it is on now is not stable, so I need to act fast. I did not build the app that is there, but I have pulled down the www folder from the root server. The server is running CentOS.
Questions:
Is Django backwards compatible, or will I need to ensure that the same version is installed?
Apart from moving the files, what other steps are involved in running the app?
Will I need to use CentOS, or will any Linux server do?
I have a database cluster of PostgreSQL ready to go also.
Start with the docs here - this will give you a good overview.
To your specific questions:
1/ Django is not backwards compatible. You should install 1.6.x. Likely, there's a requirements.txt file in the root directory of your app. On your new server, install pip and then pip install -r requirements.txt will install your dependencies. I would personally use virtualenvwrapper to manage your dependencies on the server
2/ Check the docs, but the main steps are:
Choose a web server. I personally use nginx. You'll need to setup your nginx.conf.
Choose a Python WSGI HTTP Server - I use gunicorn. You'll also need to configure this. This tutorial is a great place to start.
If you use the DigitalOcean tutorial above, any linux server will do. Last, you'll need to upload your Postgres database to the server but sounds like you're able to do that.
3/ You will need to edit your settings.py of the Django project and update certain variables.
If you're changing your database, as well as the app deployment, you'll need to edit the database connection, run ./manage.py syncdb and ./manage.py migrate (if you're using South) to set up the database schema.
It's also recommended to change the SECRET_KEY between deployments.
If you're deploying on a different host, you'll need to edit ALLOWED_HOSTS appropriately for your new deployment as well.
Good luck!

Getting Django 1.7 to work on Google App Engine

Can anyone help to point us to instructions on how to get Django >1.5 working on Google App Engine? I have seen a number of people claim they have Django 1.6 working. We'd like to get 1.6 or 1.7 running. I have searched here for instructions on how to set this up. No luck so far.
Update:
In our development machine we have Django 1.7 installed (both /user/local and on virtualenv). However, if we modify GAE yaml to use Django 1.7 we get the following error messages:
google.appengine.api.yaml_errors.EventError: django version "1.7" is not supported, use one of: "1.2", "1.3", "1.4", "1.5" or "latest" ("latest" recommended for development only) in "./app.yaml",
The version 1.9.12 GoogleAppEngine sdk install in our /Applications/GoogleAppEngineLauncher.app/Contents/Resources/GoogleAppEngine-default.bundle/Contents/Resources/google_appengine/lib directory shows the following Django versions listed:
django-0.96 django-1.2 django-1.3 django-1.4 django-1.5
My question is related to how to get our development environment setup correctly for Django 1.7 on Google App Engine and how to make sure we successfully deploy our app with Django 1.7 when we deploy to Google App Engine in production. How do we get the Django 1.7 to deploy on GAE when we deploy our app?
You can use any pure Python third party libraries in your Google App Engine application. In order to use a third party library, simply include the files in your application's directory, and they will be uploaded with your application when you deploy it to our system. You can import the files as you would any other Python files with your application.
I have application using Django 1.7 this way and everything is working fine. However, sometimes you may need to sort of hack something due to the App Engine limitations and its specifics. But it depends on your use cases.
I would also suggest using a virtual environment for your project. Install each library that is not supported by App Engine directly via pip, and then create a symlink in your application directory pointing to the given library.
This way you can keep all required packages in a file (e.g. requirements.txt) that can be stored in SCM system (e.g. Git) along with your source files and other team members can quite easily replicate your working environment.
Provided that you use virtual environment and install all needed libraries (Django, ...) via pip, here is the directory layout that should work for you.
virtual-env-root
.Python
bin
include
lib
app-engine-project-root
app.yaml
django-project-root
django-app-root
symlink-to-django -> lib/python2.7/site-packages/django
symlink-to-another-lib -> lib/python2.7/site-packages/...
Such a layout can be easily deployed with the below command.
$ appcfg.py update app-engine-project-root
Or tested with App Engine development server.
$ dev_appserver.py app-engine-project-root
UPDATE
Since App Engine Python SDK version 1.9.15 you can use the vendoring mechanism to set up third party libraries. You do not have to create symlinks in your application directory pointing to the Python lib folder anymore.
Create lib directory directly in your application root and tell your app how to find libraries in this directory by means of appengine_config.py file.
from google.appengine.ext import vendor
# Add any libraries installed in the "lib" folder.
vendor.add('lib')
New directory layout follows.
virtual-env-root
.Python
bin
include
lib
app-engine-project-root
lib
app.yaml
appengine_config.py
django-project-root
django-app-root
Use pip with the -t lib flag to install libraries in this directory.
$ pip install -t lib [lib-name]
Or
$ pip install -t lib -r requirements.txt
You cannot - GAE only supports 1.5, and even that is marked as experimental. If you need django 1.7, perhaps you should use Google Compute Engine, which is Google's brand name for virtual machines that you can spool up.
If you are not married to Google App Engine, Heroku supports django 1.7 without issues.
Do you have a specific guide on how to move a Django 1.7 project to
Google Compute Engine? There is a bunch of Google stuff without any
guides on how to make it work.
Here are the steps, but they are the same had you deployed on any other server because GCE just gives you a linux instance:
First, make sure your developer account has a billing method attached to it.
Go to the developer console
Create a new project by clicking on Projects, then Create Project.
Wait as the project is being created (you'll see a progress window on the bottom right of your screen).
Once the project is finished creating, the console will automatically shift to that project's settings:
You can create a new instance, or deploy a ready-made template from the second column. You can see there are popular stacks and software applications for which templates are created.
As there is no django template yet, you will start by creating an instance.
Billing is controlled on a per-project basis, so you'll have to enable billing at this point if you haven't done so already.
The next page is where you configure the instance. The fields are self-explanatory. You set the type of machine you like (how many virtual CPUs and memory), where (physically) you prefer the machine to be located, if you want both HTTP and HTTPS ports open, and then a disk image from which the instance will boot:
Once you have configured the machine, it will be booted up and brought online, and then you'll have access to the terminal via SSH.
From this point forward, you should treat this instance like any linux server. Install whatever you need to make your project work using the normal packaging tools; upload your files, etc.
For Amazon, the process is a bit simpler as there is a large library of AMIs that you can use for a one-click deployment process. AMI is Amazon Machine Image - a template from which you can deploy an instance.
For Heroku, as its a PaaS, you don't have to worry about the hardware components; however as with most PaaS platforms, you don't have write access to the filesystem. So to manage your static assets you have to do some extra work. The easiest option is to create a S3 bucket on Amazon and use that with django-storages. The official django tutorial at heroku suggests the use of dj-static to serve files directly from Heroku. This works fine for testing, but if you want to start uploading files, then you need to handle those correctly.
However, once you sort that out the steps are even simpler:
Pre-requisites:
git
heroku toolbelt
dj-database-url Python package
gunicorn Python package
The basic steps:
Create a git repository (if you have not done already) in your source code directory with git init.
Create a requirements.txt at the root of your project. pip freeze > requirements.txt should do it if you are using a virtual environment. Otherwise, you can create a text file and list the packages you need.
Adjust your settings.py, by adding this line at the very bottom: import dj_database_url
DATABASES['default'] = dj_database_url.config()
Create a Procfile (case is important). This is how you tell Heroku what kind of dyno (process) you need for your application. For django, you need a web dyno so in this file the following line should do: web: gunicorn yourproject.wsgi --log-file -
Create an app on Heroku and deploy. You should run these commands from your source code directory:
heroku create --buildpack https://github.com/heroku/heroku-buildpack-python
heroku addons:add heroku-postgresql:dev
git push heroku master
heroku run python yourproject/manage.py migrate --noinput
heroku run python yourproject/manage.py collectstatic
You only do the first two steps once, then whenever you need to update your application simply git push heroku master to create a new revision on Heroku.
App Engine's Python environment currently knows how to provide Django up to version 1.5 via the libraries: configuration mechanism. This doesn't mean that later versions of Django won't work, only that they aren't yet built in. (I'm not sure why the latest built-in version is 1.5. It may have something to do with AE's historical policy of bundling each supported version of Django with the SDK, which probably needs to be revised to keep the SDK from getting too large.)
You can try to include Django 1.7 with your application files. I haven't tried this with 1.7 specifically yet, but it's worked with previous versions. Some adjustments to sys.path will be needed in your main.py.
Note that there is a limit of 10,000 application files. If you're concerned about this limit, one option is to use Python's zipimport and include Django as a zip archive. https://docs.python.org/2/library/zipimport.html
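The zipimport mechanism mentioned above is part of the standard library; here is a minimal sketch of how it works (the module and attribute names are made up for illustration):

```python
import os
import sys
import tempfile
import zipfile

# Package a tiny module into a zip archive, the way you might bundle Django.
tmp = tempfile.mkdtemp()
zip_path = os.path.join(tmp, "libs.zip")
with zipfile.ZipFile(zip_path, "w") as zf:
    zf.writestr("mylib/__init__.py", "VERSION = '1.7'\n")

# Putting a .zip archive on sys.path makes Python import from it via zipimport.
sys.path.insert(0, zip_path)
import mylib

print(mylib.VERSION)  # 1.7
```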
