How to run makemigrations for dockerized django app - python

I have been trying to set up a Django app using this guide:
https://testdriven.io/blog/dockerizing-django-with-postgres-gunicorn-and-nginx/
It was a great resource for getting everything up and running. However, I now have a problem with migrations. Say I add a model, add data to my database, and later change the model. I can run makemigrations using the exec method on the container, but the generated migration files are not stored in the local files I commit to my git project; instead they are lost as I spin down the container.
Does anyone know how to solve this? How do you run makemigrations in such a setup, where you run two dockerized Django/Postgres dev/prod environments?
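One common fix, not from the original post but a minimal sketch, is to bind-mount the project directory into the development container so that files generated inside it, including new migration files, land directly in the local source tree. Assuming a compose service named web with a working directory of /usr/src/app, as in the linked guide, the dev docker-compose.yml would contain something like:
services:
  web:
    build: .
    volumes:
      - .:/usr/src/app/
With that mount in place, running docker-compose exec web python manage.py makemigrations writes the migration files to the host, where they can be committed to git as usual. The service name and path are assumptions; adjust them to match your own compose file.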

Related

Delete db.sqlite via terminal (Django)

I'm currently deploying a project to Heroku; however, it returns the following error: ValueError: Related model 'store.user' cannot be resolved.
But there is a way to avoid this error locally. You simply run:
py manage.py migrate store
And then
py manage.py migrate
In other words, if I migrate separately, I won't face such an error. However, Heroku migrates everything together, and then the deploy fails because of this error.
If I proceed locally as Heroku does, i.e. run
py manage.py migrate
I can effortlessly click on and delete the db.sqlite3 file, then run makemigrations and migrate separately, and the issue is resolved. However, that's not possible when deploying to Heroku. So how can I delete this file via the terminal only? I have been searching, but people only say to click on the file and then delete it, which is not possible for me.
Thanks
However, Heroku migrates everything together, and then the deploy fails because of this error
Heroku does nothing of the sort.
Migrations only run on Heroku when you tell them to, either by running something like
heroku run python manage.py migrate
or because you have declared a release process that should run when you deploy a new version, e.g.:
web: gunicorn app.wsgi
release: python manage.py migrate
In both cases you are in charge of what that command is. If you don't want to run python manage.py migrate every time you deploy, remove the release process from your Procfile.
if I migrate separately, I won't face such an error
This is concerning.
There could be several causes for this. One common issue is missing dependencies in your migration. If you write a migration in app1 that depends on certain models and tables from app2 existing, you need to add a dependency to the relevant migration in app1, e.g. something like this example from the documentation:
class Migration(migrations.Migration):

    dependencies = [
        ('app1', '0001_initial'),
        # added dependency to enable using models from app2 in move_m1
        ('app2', '0004_foobar'),
    ]

    operations = [
        migrations.RunPython(move_m1),
    ]
If migrations are written properly, such that they accurately reflect the code in each commit, build off each other, have dependencies declared, etc. you should always be able to safely apply them to your database.
Finally, you ask about deleting db.sqlite.
Heroku's filesystem is ephemeral. It gets baked into your application slug at build time, and any changes you make to it get reset every time your dyno restarts. This happens frequently (at least once per day).
This means you can't effectively delete your database file. But it also means you can't save data and expect it to be there when you look it up later! SQLite isn't a good fit for Heroku.
A client-server database like PostgreSQL is a much better choice. Heroku offers its own Postgres service with a free tier.
If you switch, I strongly urge you to switch in development as well. Django's ORM makes it relatively easy to change, but database engines aren't drop-in replacements for each other. I've seen real-world examples where developing on SQLite and deploying to Postgres or MySQL causes things to fail in production that worked in development.
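If it helps, here is a minimal sketch of a local PostgreSQL configuration in settings.py; the database name, user, and password are placeholders, not values from the original post:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'myproject',
        'USER': 'myprojectuser',
        'PASSWORD': 'changeme',  # placeholder credentials
        'HOST': 'localhost',
        'PORT': '5432',
    }
}
On Heroku itself you would instead read the connection details from the DATABASE_URL config var rather than hard-coding them.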

Why do I get "CommandError: App 'app_name' does not have migrations." when using Heroku?

So I have deployed my Django project to Heroku, and now I'm trying to migrate the database. I have everything working fine on my local server. Then I tried to run the commands on Heroku, as below.
heroku run python manage.py makemigrations app_name
This worked fine.
Migrations for 'app_name':
contents\migrations\0001_initial.py
- Create model Book
- Create model Category
- Create model Content
Then, of course, I tried: heroku run python manage.py migrate app_name.
But then I got this error CommandError: App 'app_name' does not have migrations.
I've done some research for possible issues, none of which were relevant to mine.
For example, I do have __init__.py in the migrations folder inside the app_name directory. Also, I've tried heroku run python manage.py migrate as well as heroku run python manage.py migrate app_name. I'm very confused. What do you think is the problem? Thanks in advance. :)
I had the same problem. I don't know why, but the following works for me (two commands on one line):
python manage.py makemigrations && python manage.py migrate app_name
As explained in Heroku's help article Why are my file uploads missing/deleted?:
The Heroku filesystem is ephemeral - that means that any changes to
the filesystem whilst the dyno is running only last until that dyno is
shut down or restarted.
That is, you do generate the migration files, but they cease to exist shortly afterwards when the app is reloaded.
In general, generating migrations on the server is not a good approach, as it can cause various issues. Say you have multiple production servers and each generates its own migrations: that causes a conflict! And in your situation, suppose you do migrate this way and later want to add some new models; that will cause you problems again.
One should always add migrations to version control (git, etc.), commit them, and push them to production (here, your Heroku dyno). That is: run python manage.py makemigrations locally, commit the generated migrations, push them, and then run heroku run python manage.py migrate.
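A minimal sketch of that workflow; the app name, commit message, and branch name are placeholders and may differ in your project:
python manage.py makemigrations app_name
git add app_name/migrations/
git commit -m "Add migrations for app_name"
git push heroku main
heroku run python manage.py migrate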

Django on GAE - How to automatically 'migrate' on deploy?

Django v1.11
Postgresql v9.6
Currently, I use 2 Google CloudSQL databases, one for development and one for production. Whenever I make changes to my models, I run python manage.py migrate to update the tables in the development database. This migration does not affect the production database, however.
Now, whenever I git push changes to my Django project, TravisCI automatically runs tests and deploys the code to Google App Engine. Currently, it runs on the GAE flexible environment (so I can use Python 3.5).
What I want to have is for Travis or GAE to automatically run python manage.py migrate on the production database before runserver. However, I can't figure out how to run custom commands during a deploy.
I've tried looking around GAE and Travis documentation and adding scripts to .travis.yml and app.yaml, but to no avail.
As of now, anytime there is a model change, I have to migrate the production database locally in a very hacky way. Ideally, GAE will migrate at the beginning of every deploy.
Not sure if you have seen this:
Travis CI Script Deployment
A reference from a similar issue:
How can I run a script as part of a Travis CI build?
Also, consider a database migration tool embedded with your source code; PostgreSQL is supported (something similar to FlywayDB migrations):
Yoyo database migrations
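As a rough illustration of the script-deployment idea, not taken from the original answer, a .travis.yml deploy section could run the migration as part of deploying to App Engine. Note that the Travis worker needs network access and credentials for the Cloud SQL instance (for example via the Cloud SQL proxy) for migrate to reach the production database:
deploy:
  provider: script
  script: python manage.py migrate --noinput && gcloud app deploy --quiet
  skip_cleanup: true
  on:
    branch: master
The exact commands, the branch, and the settings module used for migrate are assumptions; adapt them to your project.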

Django Migration Process for Elasticbeanstalk / Multiple Databases

I am developing a small web application using Django and Elasticbeanstalk.
I created an EB application with two environments (staging and production), created an RDS instance, and assigned it to my EB environments.
For development I use a local database, because deploying to AWS takes quite some time.
However, I am having trouble with the migrations. Because I develop and test locally every couple of minutes, I tend to have different migrations locally and on the two environments.
So once I deploy the current version of the app to a certain environment, "manage.py migrate" fails most of the time because tables already exist or do not exist even though they should (because another environment already created the tables).
So I was wondering how to handle the migration process when using multiple environments for development, staging and production with some common and some exclusive database instances that might not reflect the same structure all the time?
Should I exclude the migration files from the code repository and the eb deployment and run makemigrations & migrate after every deployment? Should I not run migrations automatically using the .ebextensions and apply all the migrations manually through one of the instances?
What's the recommended way of using the same Django application with different database instances on different environments?
It seems that you might have deleted a table or some migrations at some point.
When you run makemigrations, Django creates migration files, and when you run migrate, it applies them to whichever database is specified in the settings file.
If you keep creating migrations and do not apply them to a particular database, that is absolutely fine. Whenever you switch to that database and run migrate, Django handles it: each database records the point up to which migrations have been applied in its django_migrations table, and only the subsequent migrations will be run.
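To see what each database thinks has been applied, Django's built-in showmigrations command can be run against each environment:
python manage.py showmigrations
It lists every migration per app and marks the ones recorded as applied in that database with an [X].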
To solve your problem, since you are perhaps just testing right now, you can delete all the databases and migration files and start afresh. Things will go fine until you delete a migration or a database on any of the servers.
If you have precious data, you will need to dig into the migration files and tables to analyse and manage things.
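Regarding the .ebextensions option mentioned in the question: a common pattern is to keep migrations in the repository and let only one instance apply them during deployment. A minimal sketch, with the file name and platform details as assumptions (older EB Python platforms may also require sourcing the app environment before running manage.py):
container_commands:
  01_migrate:
    command: "python manage.py migrate --noinput"
    leader_only: true
The leader_only flag makes Elastic Beanstalk run the command on a single instance, which avoids several instances racing to apply the same migrations.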

Heroku and Django Running Server

I'm following this tutorial: http://tutorial.djangogirls.org/en/domain/README.html
When I do python manage.py runserver it works fine. It also works when I run
heroku ps:scale web=1
then
heroku open
With python manage.py runserver it shows my blog posts and everything that I had added. But when I run the server with heroku open there are no posts, as if the database is missing or something.
Why is that? Why do the two commands launch the same web page but with different posts/different databases?
Which brings me to my follow-up question: how do I know when I need to run migrate or makemigrations again for the server? Would doing so fix the problem? And what exactly do those commands do / why are they necessary?
Thanks
EDIT:
Bonus question: Why do my posts display in descending order of time? The new posts are on the bottom of the page rather than the top. How can I change this?
There is a difference between your local development and your deployed project.
I assume you created your posts locally, so they are saved in your local database. Locally you use a file-based database, defined in the settings as 'django.db.backends.sqlite3'; that means when you run manage.py syncdb the file is created with all the tables inside. When you deploy your code to Heroku, the code is pushed to the server and runs from there. That server can be anywhere, so it can't connect to your local database file. You have to set up a database on Heroku for your project as well. I recommend reading this article. When you want to transfer your data, you can create a database dump locally and load all the data into your Heroku database. Described here and here.
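A minimal sketch of that dump-and-load step; the flags, file name, and branch are illustrative, not from the original answer:
python manage.py dumpdata --exclude=contenttypes --exclude=auth.permission > db.json
git add db.json
git commit -m "Add local data dump"
git push heroku master
heroku run python manage.py loaddata db.json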
Another point: you don't run the server with heroku open or by firing python manage.py runserver. The Heroku server is started automatically when your git push heroku master is done. It uses the configuration from your Procfile.
When you want to migrate your Heroku database, you have to run heroku run python manage.py migrate <app_name>; the migration is then done remotely on the Heroku server. You have to run this command every time you change a model and add a migration file with python manage.py makemigrations <app_name>. When you do this, you have to migrate your database both locally and remotely. That means you change the database structure to match your models. Remember, the models are only an abstraction (ORM) over your database.
I don't know your project, but the order seems legitimate. Try to imagine this as rows: the first row comes first, so the last entered row is at the bottom. You can change the order of the queryset with something like .order_by('-id'), so you get all entries in reverse order.
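For example, assuming a Post model like the one in the Django Girls tutorial (the model and field names here are assumptions):
# newest entries first, by primary key
posts = Post.objects.order_by('-id')
# or order by a date field such as published_date, if the model has one
posts = Post.objects.order_by('-published_date')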
