Delete db.sqlite via terminal (Django)

I'm currently deploying a project to Heroku, but it gives me the following error: ValueError: Related model 'store.user' cannot be resolved.
There is a way to avoid this error locally. You simply run:
py manage.py migrate store
And then
py manage.py migrate
In other words, if I migrate separately, I don't face this error. However, Heroku runs all the migrations together, so the deploy fails because of this error.
If I proceed locally as Heroku does, i.e. run
py manage.py migrate
I can simply click on the db.sqlite3 file and delete it, then makemigrations and migrate separately, and the issue is resolved. However, that's not possible when deploying to Heroku. So how can I delete this file via the terminal alone? I have been searching, but people only say to click on the file and then delete it, which is not possible in my case.
Thanks

However, Heroku runs all the migrations together, so the deploy fails because of this error
Heroku does nothing of the sort.
Migrations only run on Heroku when you tell them to, either by running something like
heroku run python manage.py migrate
or because you have declared a release process that should run when you deploy a new version, e.g.:
web: gunicorn app.wsgi
release: python manage.py migrate
In both cases you are in charge of what that command is. If you don't want to run python manage.py migrate every time you deploy, remove the release process from your Procfile.
if I migrate separately, I don't face this error
This is concerning.
There could be several causes for this. One common issue is missing dependencies in your migration. If you write a migration in app1 that depends on certain models and tables from app2 existing, you need to add a dependency to the relevant migration in app1, e.g. something like this example from the documentation:
class Migration(migrations.Migration):

    dependencies = [
        ('app1', '0001_initial'),
        # added dependency to enable using models from app2 in move_m1
        ('app2', '0004_foobar'),
    ]

    operations = [
        migrations.RunPython(move_m1),
    ]
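The move_m1 referenced here is a RunPython data-migration function. A minimal sketch of what it might look like (the model and field names below are hypothetical illustrations, not the documentation's):

def move_m1(apps, schema_editor):
    # Use the historical model states, not the current models.py classes.
    M1 = apps.get_model('app1', 'M1')      # hypothetical model
    M2 = apps.get_model('app2', 'M2')      # hypothetical model
    for obj in M2.objects.all():
        M1.objects.create(name=obj.name)   # hypothetical field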
If migrations are written properly, such that they accurately reflect the code in each commit, build off each other, have dependencies declared, etc. you should always be able to safely apply them to your database.
Finally, you ask about deleting db.sqlite3.
Heroku's filesystem is ephemeral. It gets baked into your application slug at build time, and any changes you make to it get reset every time your dyno restarts. This happens frequently (at least once per day).
This means you can't effectively delete your database file. But it also means you can't save data and expect it to be there when you look it up later! SQLite isn't a good fit for Heroku.
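For completeness, you can remove the file from a one-off dyno, e.g.

heroku run bash
rm db.sqlite3

but each dyno gets its own copy of the ephemeral filesystem, so this deletes the file only inside that one-off dyno; your web dynos, and every future restart, still start from the copy baked into the slug.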
A client-server database like PostgreSQL is a much better choice. Heroku offers its own Postgres service with a free tier.
If you switch, I strongly urge you to switch in development as well. Django's ORM makes it relatively easy to change, but database engines aren't drop-in replacements for each other. I've seen real-world examples where developing on SQLite and deploying to Postgres or MySQL causes things to fail in production that worked in development.
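If you do switch, a common settings.py pattern on Heroku uses the third-party dj-database-url package, which reads the DATABASE_URL environment variable that Heroku Postgres sets. A sketch, where the local fallback URL is hypothetical:

import dj_database_url

# Use DATABASE_URL when present (e.g. on Heroku), falling back to a
# local Postgres database in development.
DATABASES = {
    'default': dj_database_url.config(
        default='postgres://localhost:5432/myapp',
        conn_max_age=600,
    )
}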

Related

Why do I get "CommandError: App 'app_name' does not have migrations." when using Heroku?

So I have deployed my Django project to Heroku, and now I'm trying to migrate the database. Everything works fine on my local server. Then I tried to run the commands on Heroku, as below.
heroku run python manage.py makemigrations app_name
This worked fine.
Migrations for 'app_name':
contents\migrations\0001_initial.py
- Create model Book
- Create model Category
- Create model Content
Then, of course, I tried: heroku run python manage.py migrate app_name
But then I got this error: CommandError: App 'app_name' does not have migrations.
I've done some research on possible issues, but none were relevant to mine.
For example, I do have __init__.py in the migrations folder inside the app_name directory. Also, I've tried heroku run python manage.py migrate as well as heroku run python manage.py migrate app_name. I'm very confused. What do you think is the problem? Thanks in advance. :)
I had the same problem. I don't know why, but the following works for me (two commands in one line):
python manage.py makemigrations && python manage.py migrate app_name
As explained in Heroku's help article Why are my file uploads missing/deleted?:
The Heroku filesystem is ephemeral - that means that any changes to the filesystem whilst the dyno is running only last until that dyno is shut down or restarted.
That is, you do generate the migration files, but they cease to exist shortly afterwards, when the app is reloaded.
In general, this strategy is not a good approach to migrations, as it can cause various issues. Say you have multiple production servers: each generates its own migrations, which causes conflicts! And in your situation, suppose you do migrate this way and later want to add some new models; that will cause you problems again.
One should always add their migrations to version control (git, etc.), commit them, and push them to production (here, your Heroku dyno). That is, run python manage.py makemigrations locally, commit the generated migrations, push them, and then run heroku run python manage.py migrate.
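Concretely, the workflow looks like this (assuming a Git-based Heroku deploy; your migration paths, remote, and branch names may differ):

python manage.py makemigrations
python manage.py migrate    # apply and test locally first
git add */migrations/*.py
git commit -m "Add migrations"
git push heroku main
heroku run python manage.py migrate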

How to run makemigrations for dockerized django app

I have been trying to setup a django app using this guide:
https://testdriven.io/blog/dockerizing-django-with-postgres-gunicorn-and-nginx/
And it was a great resource for getting everything up and running. However, I now have a problem when it comes to migrations. Say I add a model, add data to my db, and later change the model. I can run makemigrations using the exec method on the container, but when I shut down, the migrations are not stored in the local files I save to my git project; instead, they are lost as I spin down the container.
Does anyone know how to solve this? How do you run makemigrations in such a setup, where you run two dockerized Django/Postgres dev/prod environments?
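A common fix, sketched here under the assumption that the setup follows the linked guide (the web service name and /usr/src/app workdir come from that guide and may differ in your project), is to bind-mount the project directory in the development compose file, so that files created inside the container, including new migrations, appear on the host:

# docker-compose.yml (development) - mount the source tree into the container
services:
  web:
    volumes:
      - .:/usr/src/app

With that mount in place, docker-compose exec web python manage.py makemigrations writes the migration files into your local working tree, where they can be committed to git as usual.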

Django on GAE - How to automatically 'migrate' on deploy?

Django v1.11
Postgresql v9.6
Currently, I use 2 Google CloudSQL databases, one for development and one for production. Whenever I make changes to my models, I run python manage.py migrate to update the tables in the development database. This migration does not affect the production database, however.
Now, whenever I git push changes to my Django project, TravisCI automatically runs tests and deploys the code to Google App Engine. Currently, it runs on the GAE flexible environment (so I can use Python 3.5).
What I want to have is for Travis or GAE to automatically run python manage.py migrate on the production database before runserver. However, I can't figure out how to run custom commands during a deploy.
I've tried looking around GAE and Travis documentation and adding scripts to .travis.yml and app.yaml, but to no avail.
As of now, anytime there is a model change, I have to migrate the production database locally in a very hacky way. Ideally, GAE will migrate at the beginning of every deploy.
Not sure if you have seen this:
Travis CI Script Deployment
A reference from a similar issue:
How can I run a script as part of a Travis CI build?
Also, consider a database migration tool embedded with your source code; PostgreSQL is supported (something similar to FlywayDB migrations):
Yoyo database migrations
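To give a sense of what that looks like, here is a minimal Yoyo migration sketch (the table and SQL are made up for illustration):

from yoyo import step

# Each step pairs an apply statement with a rollback statement.
steps = [
    step(
        "CREATE TABLE example (id SERIAL PRIMARY KEY, name VARCHAR(100))",
        "DROP TABLE example",
    )
]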

Django project: track migrations

While developing a Django project tracked with git and GitHub, how should I manage migrations?
Sometimes when I deploy a release to production, some migrations crash because of files I deleted after the migration was created.
How can I avoid this?
Thanks.
There are other threads on this, but basically these are the rules I use:
You should definitely track migration files using Git.
Never run makemigrations in the production environment; always run it in development.
Now, let's say you made a change to one of your models (in development, I hope). You will run a normal makemigrations. Then run migrate (still in dev) in order to test everything. When you're ready, commit and push the created files, then pull in prod and run migrate to update the database schema.
This will assure good versioning of your migration files. It will also greatly help you in the long run, because running makemigrations in production and in dev simultaneously will just cause more conflicts in your migration files, which can be a pain.

Django Migration Process for Elasticbeanstalk / Multiple Databases

I am developing a small web application using Django and Elastic Beanstalk.
I created an EB application with two environments (staging and production), created an RDS instance, and assigned it to my EB environments.
For development I use a local database, because deploying to AWS takes quite some time.
However, I am having troubles with the migrations. Because I develop and test locally every couple of minutes, I tend to have different migrations locally and on the two environments.
So once I deploy the current version of the app to a certain environment, "manage.py migrate" fails most of the time because tables already exist or do not exist even though they should (because another environment already created them).
So I was wondering how to handle the migration process when using multiple environments for development, staging and production with some common and some exclusive database instances that might not reflect the same structure all the time?
Should I exclude the migration files from the code repository and the eb deployment and run makemigrations & migrate after every deployment? Should I not run migrations automatically using the .ebextensions and apply all the migrations manually through one of the instances?
What's the recommended way of using the same Django application with different database instances on different environments?
It seems that you might have deleted the table or the migrations at some point.
When you run makemigrations, Django creates the migration files, and when you run migrate, it applies them to whichever database is specified in the settings file.
If you keep creating migrations and do not run them against a particular database, that is absolutely fine. Whenever you switch to a database and run migrations, Django will handle it, because each database stores the point up to which migrations have been applied in its django_migrations table, and will only run the migrations that come after that point.
To solve your problem, you can delete all databases and migration files and start afresh, as you are perhaps just testing right now. Things will go fine until you delete a migration or a database on any of the servers.
If you have precious data, you should dig into the migration files and tables to analyse and manage things.
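One way to see what a given database thinks has been applied is to inspect that table. The sketch below uses Django's internal MigrationRecorder, which backs the django_migrations table (python manage.py showmigrations gives a similar overview):

from django.db import connection
from django.db.migrations.recorder import MigrationRecorder

# List every migration recorded as applied in the default database,
# in the order they were applied.
recorder = MigrationRecorder(connection)
for migration in recorder.migration_qs.order_by("applied"):
    print(migration.app, migration.name, migration.applied)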
