I have a small team of 5 and I need help finding resources or advice on how to have one centralized MySQL database. We all work remotely and host all the files in one GitHub repository to centralize everything, so it would be a pain to have five local databases on different computers, all with the same information, as we work on scripting logic from the models and so on that will be the same.
github_cm_dev: the main directory inside GitHub
venv: the virtual environment (so anyone on the team can activate the environment easily)
indieitude-project/indieitude directory: has all of my Django applications and files
indieitude-project/indieitude/indieitude directory: has all the config files, i.e. settings.py, urls.py, etc.
Any advice or thoughts on the matter?
Normally there is one master database -- the production DB. There are three general options here:
1) If the data isn't too sensitive and a dev doesn't need every last change, you can simply let each dev take a dump file from production and apply it to their machine.
2) If the data is sensitive (e.g. w/ user passwords), you can create a "dummy" version of the DB and use that as the canonical data for local dev and for testing.
3) If you need every change on each user's machine, then you can preload the entries you need with Django fixture files. See https://docs.djangoproject.com/en/dev/howto/initial-data/ You can create a fixture file from a DB pretty easily, and since a fixture file is just text, you can put it in your git repo (see the sketch below).
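For example, here is a minimal sketch of option 3. It assumes an app label of "catalog" (a placeholder) and that Django settings are already configured, e.g. because you run it inside python manage.py shell:

    # Dump the non-sensitive tables of the hypothetical "catalog" app to a
    # fixture file that can be committed to the shared git repo.
    from django.core.management import call_command

    with open("catalog_fixture.json", "w") as f:
        call_command("dumpdata", "catalog", indent=2, stdout=f)

    # Each teammate then loads the committed fixture into their local MySQL DB:
    call_command("loaddata", "catalog_fixture.json")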
So I have a MySQL database for my Django project which I think is located in the default directory (wherever that is). When I want to move my Django project to DigitalOcean, how do I move that database? The my.cnf file is located in the same directory as the settings.py file for the Django project. I am using macOS.
In short: How do I copy my database file to my Django project directory and reconfigure its path to that?
Compare the MySQL versions of both servers. Ensure the new version isn't less than the current version, and also check that it's not two major versions ahead of the current one.
Ensure MySQL is stopped.
Copy/move the entire datadir to the new location. There should be no files in this new location that weren't in the original.
In the my.cnf file set datadir to the new location.
Copy the my.cnf file to the location where MySQL expects to read it on the new server. Look at the top of the mysqld --help --verbose output if you have doubts about where this is.
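As a rough sketch of the copy step (the paths below are placeholders, and this assumes MySQL is already stopped):

    # Copy the entire datadir to its new location; adjust paths to your system.
    import shutil

    OLD_DATADIR = "/usr/local/mysql/data"  # hypothetical old location
    NEW_DATADIR = "/srv/mysql/data"        # hypothetical new location

    shutil.copytree(OLD_DATADIR, NEW_DATADIR)

    # After copying, make sure the mysql user owns the new directory
    # (e.g. chown -R mysql:mysql /srv/mysql/data), then set datadir in
    # my.cnf to NEW_DATADIR before starting MySQL again.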
There are many things to consider here:
On DigitalOcean, would you have the MySQL server running on the same machine as the Django server, i.e. just like your local machine?
If yes, then you might not have to change anything in your Django settings (if you keep the user and password the same as locally).
If no, you might need to change the host in your Django settings to the IP of that server (there are many things involved here...).
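For illustration, a minimal sketch of that settings change, with placeholder credentials and a placeholder IP:

    # settings.py -- hypothetical values; only HOST really changes between
    # "same machine" (localhost) and "separate DB server" (its IP).
    DATABASES = {
        "default": {
            "ENGINE": "django.db.backends.mysql",
            "NAME": "mydb",            # placeholder database name
            "USER": "app_user",        # placeholder
            "PASSWORD": "change-me",   # keep real secrets out of git
            "HOST": "203.0.113.10",    # or "localhost" on a single droplet
            "PORT": "3306",
        }
    }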
Regarding the data transfer, it is always best practice to dump MySQL and import it into the new DB. There are many guides available on the web; here is one from DigitalOcean.
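A minimal sketch of that dump-and-import flow, driving the standard mysqldump/mysql command-line tools (the database name is a placeholder):

    # Dump on the old machine, import on the new one. Both commands will
    # prompt for the MySQL password because of -p.
    import subprocess

    DB = "mydb"  # placeholder

    # Old machine: dump schema + data into one SQL file.
    with open("backup.sql", "w") as out:
        subprocess.run(["mysqldump", "-u", "root", "-p", DB], stdout=out, check=True)

    # New server, after creating an empty database named "mydb":
    with open("backup.sql") as src:
        subprocess.run(["mysql", "-u", "root", "-p", DB], stdin=src, check=True)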
Say I’m working on a complex Python program for keeping track of shipment data along with Git for version control. On my local machine, is it best to create two different folder structures, one being the develop folder and the other being the production folder?
So, I push my final (develop) changes to the master branch, then pull those changes into the production folder?
I like the idea of keeping separate folders in order to prevent mistakes, but I think it totally depends on the standards/conventions of your group/environment. If you communicate this approach to them and they approve of it, then I would recommend it. Note, though, that version control already lets you check out any committed version from anywhere, so by using a single folder location you're also saving space on your machine.
I am working on a Django-based application whose location on my disk is home/user/Documents/project/application. This application takes in some values from the user and writes them into a file located in a folder under the project directory, i.e. home/user/Documents/project/folder/file. While running the development server with the command python manage.py runserver everything worked fine; however, after deployment the application/views.py, which accesses the file via open('folder/path','w'), is no longer able to access it, because by default it looks in the /var/www folder when deployed via an Apache2 server using mod_wsgi.
Now, I am not putting the folder into /var/www because it is not good practice to put any Python code there, as it might become readable by clients, which is a major security threat. Please let me know how I can point the deployed application to read and write the correct file.
The real solution is to install your data files in /srv/data/myapp or some such, so that you can give the webserver user the correct permissions to only those directories. Whether you choose to put your code in /var/www or not is a separate question, but I would suggest putting at least your wsgi file there (and, of course, specifying your DocumentRoot correctly).
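Either way, the immediate fix is to stop relying on the working directory and build an absolute path instead. A minimal sketch, assuming the BASE_DIR that django-admin startproject defines in settings.py (the directory and file names here are placeholders):

    # views.py -- build an absolute path so the same code works under
    # runserver and under mod_wsgi, regardless of the working directory.
    import os
    from django.conf import settings

    # Could equally be a fixed path such as "/srv/data/myapp".
    DATA_DIR = os.path.join(settings.BASE_DIR, "folder")

    def save_user_values(text):
        with open(os.path.join(DATA_DIR, "file"), "w") as f:
            f.write(text)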
This question is mostly about the technical details + some best practices of how to efficiently deploy a python web app that's built using platter.
Taking Django for instance, I have a project that's already built into a tarball distribution. This includes all wheels of all deps + the package of the app itself.
My repo directory also contains some other files that need to be distributed with the deployed code, such as: manage.py, a fabfile package with fabric utils, and some configuration files (for supervisor, nginx, etc).
So my questions are:
How can I wrap these extra files into the distribution that contains the project?
If I simply use git to clone/pull the project on the server, I have these files, but then I have a duplicate of the source code, both unpacked in the project and zipped in the tarball. How can I avoid that? By committing the tarball into a separate repo?
Perhaps the duplication is not so bad, and I'll end up with multiple tarballs in my dist/ directory with only one symlinked as current, from which I deploy?
Same goes for a Tornado based app.
My first rule of deployment is "whatever works". Every production environment has different requirements. But to give opinions on your questions:
Not everything should be in your Python project. Perhaps there is a way to do it, but I think it's using the wrong hammer.
You can create a separate Git repo that holds configuration and asset files for your production deployment (this doesn't even need to be managed by Git if you don't care about keeping old, irrelevant configuration files). This does not have to be a Python project, just the files for the production deployment. You may optionally put a Python script or two in there (or just a README.txt, fab files, or a Buildout config) to automate tasks such as unpacking your platter or copying config files around.
It's tempting (and possible) to put production config things in your main Git repo. This is even suggested by apps that create boilerplate files for development and production configuration. This doesn't mean it's the best way to do things though.
My rule is that the main Git repo is "development only". It's cloned by developers who are setting up and working in development environments. It conflates a Python project far too much to try to be a Python application and also a place to manage a production system, IMHO.
Production is managed separately. Sometimes by people different from the developers or at least the developer is wearing a different hat when thinking about a production deployment. This way you can also have a small, clean repo that tracks just changes to your production system.
Playing with symlinks within a single deployment that represents different builds is an extra layer of confusion. And the impetus to do so comes from trying to do everything from a single Python project.
Deploy your Python application to something like /var/myapp/build-2015-10-29/, then create a symlink at /var/myapp/current/ that points to this location. This way you can create a full deployment at /var/myapp/build-2015-11-05/, tweak the config to start on a separate port, bring the app up and ensure everything works, then just switch the symlink from the old build to the new build with minimal downtime.
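A minimal sketch of that switch, with placeholder paths, done atomically so "current" never dangles:

    # Point a temporary link at the new build, then rename it over the old
    # link; os.replace is atomic on POSIX filesystems.
    import os

    NEW_BUILD = "/var/myapp/build-2015-11-05"  # placeholder build dir
    CURRENT = "/var/myapp/current"

    tmp = CURRENT + ".tmp"
    os.symlink(NEW_BUILD, tmp)
    os.replace(tmp, CURRENT)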
As a fledgling Django developer, I was wondering if it was customary, or indeed possible, to create a site with Django then transfer the complete file structure to a different machine where it would "go live".
Thanks,
~Caitlin
You could use Git or Mercurial (or another version control system) to put the site structure on a central server. After that you could deploy the site to multiple servers, for example with Fabric. For the deployment process you should also consider using virtualenv to isolate the project from global Python packages and requirements.
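As a hedged illustration of the Fabric part (Fabric 1.x API; the host, paths, and reload trick are placeholders):

    # fabfile.py -- run with `fab deploy`.
    from fabric.api import cd, env, run

    env.hosts = ["deploy@example.com"]  # placeholder server

    def deploy():
        with cd("/srv/mysite"):                          # placeholder path
            run("git pull origin master")                # fetch latest code
            run("venv/bin/pip install -r requirements.txt")
            run("touch mysite/wsgi.py")                  # nudge mod_wsgi to reload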
Of course that's possible, and in fact it's the only way to "go live". You don't want to develop on your live server, do you? And this is true for any platform, not just Django.
If I understood your question correctly, you need a system to push your development code to live.
Use a version control system: Git, SVN, Mercurial, etc.
Identify environment-specific code, like settings/config files, and have separate instances of them for each environment (see the sketch after this list).
Create a testing/staging/pre-production environment which has live or live-like data, and deploy your code there before pushing it to live.
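One common Django way to handle the environment-specific settings is a settings package with per-environment modules; a sketch, where the module and domain names are placeholders:

    # settings/prod.py -- imports shared defaults from settings/base.py and
    # overrides only what differs in production.
    from .base import *  # noqa: F401,F403  (shared defaults)

    DEBUG = False
    ALLOWED_HOSTS = ["example.com"]  # placeholder domain

    # Point DJANGO_SETTINGS_MODULE at "mysite.settings.prod" in production
    # and at "mysite.settings.dev" locally.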
To avoid downtime during the deployment process, usually a symbolic link is created which points to the current code folder. When a new release is to be pushed, a new folder is created with the new code; after all other dependencies are handled (like settings and database changes), the symlink is pointed to the new folder.