How to apply a Python package upgrade without stopping the Django server?

I have a Django server that uses a package X, which I upgrade daily. I don't want to stop my Django instance every day when the package is updated.
I have a daily cron job to upgrade the package (pip install -U package),
but Django doesn't use the newly upgraded version. Thanks

Generally, applying a software update to a running service implies some kind of restart or reload of that service. In many cases, if you force an in-place update without one, it may work for some amount of time but can cause bugs that are difficult to track down; Java is infamous for running fine until the next occasional subroutine (e.g., log collection) executes after the JDK has been upgraded in place like this.
I'm not sure what kind of modules you're modifying here, but in some cases you might try https://github.com/django-extensions/django-extensions and see whether manage.py shell_plus with autoreload serves your needs.
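In practice the usual fix is a graceful reload rather than a hard stop: most WSGI servers restart their worker processes on request, and the new workers re-import your code. As a minimal sketch, assuming the app runs under gunicorn with a pidfile (the pidfile path and package name below are assumptions), the daily cron job could upgrade the package and then signal the gunicorn master, which replaces workers without dropping in-flight connections:

# upgrade_and_reload.py - hedged sketch; PACKAGE and PIDFILE are assumptions
import os
import signal
import subprocess

PACKAGE = "mypackage"           # hypothetical name of the package upgraded daily
PIDFILE = "/run/gunicorn.pid"   # hypothetical gunicorn pidfile location

# upgrade the package first
subprocess.check_call(["pip", "install", "-U", PACKAGE])

# SIGHUP asks the gunicorn master to gracefully restart its workers,
# so the new worker processes import the freshly installed version
with open(PIDFILE) as f:
    master_pid = int(f.read().strip())
os.kill(master_pid, signal.SIGHUP)

Note that this relies on the server not preloading the application in the master process; with gunicorn's --preload option a full restart is still needed.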

Related

Setting up a server to host multiple domains using django, virtualenv, gunicorn and nginx

I am setting up a new server machine which will host multiple Django websites.
I should point out that I own (developed, and am in absolute control of) all the websites that will run on the server.
I am pretty certain that ALL of the websites will be using the same versions of:
django
gunicorn
nginx
PostgreSQL and psycopg2 (although some websites will be using geospatial and other extensions)
The only thing that I know will differ between the Django applications is:
the Python modules used (which may have implications for the version of Python required)
I can understand using virtualenv to manage cases where a project has specific Python module (or even Python version) requirements, but it seems pretty wasteful to me, in terms of resources, for each project to carry its own separate installation of django, nginx, gunicorn, etc. via virtualenv.
My question then is this:
Is it 'acceptable' (or considered best practice in scenarios such as the one outlined above) to globally install django, gunicorn, nginx, PostgreSQL and psycopg2, and simply use virtualenv to manage only the parts (e.g. Python modules/versions) that differ between projects?
Note: In this scenario there'll be one nginx server handling multiple domains.
Last but not least, is it possible to use virtualenv to manage different PostgreSQL extensions in different projects?
No. It would probably work, but it would be a bad idea.
Firstly, it's not clear what kind of "resources" you think would be wasted. The only relevant thing is disk space, and we're talking about a few megabytes only; not even worth thinking about.
Secondly, you'd now make it impossible to upgrade any of them individually; for anything beyond a trivial upgrade, you'd need to test and release them all together, rather than just doing what you need and deploying that one on its own.
I wouldn't consider it advisable.
By doing that, you are creating a dependency between the projects, which means you'll never be able to upgrade one without all the others. That would be a massive PITA.
Eventually you would reach a point where you could never upgrade at all, because Project A's dependency foo doesn't work with Django 1.N while Project B's dependency bar requires at least 1.N. At that point you fall back to the cleaner solution anyway: separate environments.
That applies to the Django side of things at least; it may work slightly better for Postgres and Nginx.
I would suggest using Docker so that every project has its own scope and doesn't interfere with the other projects.
I currently run such a configuration on multiple servers and I'm really happy with it, because it keeps me flexible and, just as importantly, secure: if any one project has a critical bug in it, the other projects are still safe.

How should I debug an app running on a server with Django 1.3/Postgres 8.4 when my local machine runs Django 1.7/Postgres 9.3?

As a Django / Python newbie, should I try to debug on a server running four-year-old software versions, try to recreate the old software installation locally, or just try to run the software under current versions of Django/Python/Postgres/PostGIS on my local Mac OS X 10.9.5?
Background:
On a project where I was supposed to just load data into Postgres/PostGIS, I need to debug why a Django / Postgres / PostGIS project from 2010 is getting an error. I'm a LAMP developer who has never used Django or done much in Python, but I've been able to get a staging site working on the server and make one or two changes. I thought it would make sense to debug locally on my Mac OS X 10.9.5, so I've used Homebrew to install Django 1.7 and Postgres 9.3. Looking at the version differences, I'm now worried it will be more of a hassle to migrate and upgrade the project than to attempt to debug it on the staging instance running on the server.
FWIW, I know the lines of code that I'd like to investigate (it seems like maybe an object is not getting loaded properly from the db, since the data is there in the db), but I'm not positive what sort of echo-style debug statements to use. I like using IDEs to inspect things. The project is a bit of an orphan, as the first professional project of a developer who is no longer available to help. And of course, the deadline was last week. :(
Differences between your production and development environments can cause a myriad of headaches.
I recommend you use a tool such as Vagrant to set up a development environment running inside of a virtual machine that mirrors your production server.
Use virtualenv to install the required Django version. PostgreSQL is trickier: in theory you can have a second instance of the required version running simultaneously, but that can also cause very subtle conflicts. It would be better to have it running on another machine (virtual or physical) and access it over your local network.
The simplest way, I think, is to look at using unittest and mock objects to set up some unit tests on the functions you suspect are causing the problem. By using unittest and mock objects, you can control how the existing code interacts with Django and Postgres objects, and allow for version differences by setting the expected return values.
With mock objects, you can mock all or just part of an existing Python object, which reduces the dependencies your development environment requires. Depending on how the code is structured, you might not need to install Django or Postgres at all, or a web server for that matter. This blog explains mock objects in detail.
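To make that concrete, here is a minimal sketch of the approach; the module, function and model names (myapp.views, load_report, Report) are hypothetical stand-ins, and on the Python 2 installs of that era unittest.mock was the separate "mock" package:

# test_load_report.py - hedged sketch; myapp.views, load_report and Report are made-up names
import unittest
from unittest import mock   # on Python 2, use the external "mock" package instead

import myapp.views          # hypothetical module under test

class LoadReportTest(unittest.TestCase):
    @mock.patch("myapp.views.Report")
    def test_object_is_loaded(self, MockReport):
        # control what the ORM "returns" without touching Django or Postgres
        MockReport.objects.get.return_value = mock.Mock(title="fake report")
        result = myapp.views.load_report(42)
        # verify the suspect function asked the db layer for the right object
        MockReport.objects.get.assert_called_once_with(pk=42)
        self.assertEqual(result.title, "fake report")

if __name__ == "__main__":
    unittest.main()

This runs the real code against a fake data layer, so it behaves the same under Django 1.3 or 1.7.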
Even though you're pressed for time, you could do worse than setting up unit tests for the whole project; future developers will thank you.
In terms of debugging, I personally can't recommend pudb enough: it's an interactive command-line debugger which you can use with unittest to zero in on the part of the code that is causing the problem.
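For what it's worth, dropping into pudb at the suspect line is a one-liner (assuming pip install pudb); execution pauses there in a full-screen console debugger where you can inspect locals, step through code, and set breakpoints:

import pudb; pudb.set_trace()   # pause here and open the interactive debugger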
If you do need to install Django and Postgres, I would suggest looking at virtualenv, which allows you to set up a virtual environment for Python. That way you can install just the specific dependencies you need without interfering with your global, system-wide installation. You can also install earlier versions of packages, which would do the trick for emulating the existing system's state.

How to install (run) Jumo (Python/Django) on localhost

I am trying to run the Jumo open source platform ( https://github.com/jumoconnect/openjumo ) on my local machine (Windows 7). I have Python and Django installed and they work: I can create a new project without problems, but I am not sure how to set up an existing project and make it work.
I am not experienced with Python/Django, so first I need to know whether this is a hard task or not. I guess it's much more complicated than installing WordPress or Joomla, but does it require a lot of work to set up?
Can someone write some kind of guide if it is not too complicated? Or if there is a tutorial that explains how to do it, that would be great.
Just to be clear, I am not trying to learn everything in one day; I just want to see if this first step in using the platform is too hard, and if it's not, I'll continue learning, because this is exactly what I need for one project.
I use Python 2.7 and Django 1.3.1
Thanks
This project has a lot of external requirements which are not listed anywhere, and unless you are comfortable and familiar with Django, I wouldn't recommend installing it. Here are some, but not all, of the requirements you'll need running:
Celery and django-celery
RabbitMQ (a broker for Celery), or another broker, but then you'll have to edit settings.py again.
memcached (this is optional, as you can use Django's own session middleware for development), but if you simply download the code and try to run it, you'll run into an error, since the default settings.py has the default middleware commented out (see the settings sketch after this list).
grappelli
django-jenkins (not required, but again, unless you edit the default settings.py, you'll get errors).
django-tastypie
django-debug-toolbar
Data science toolkit server
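To illustrate the kind of settings.py edits described above, here is a hedged, development-only sketch using Django 1.3-era setting names; treat it as an assumption about the shape of the file rather than openjumo's actual configuration:

# settings.py (fragment) - hedged sketch with Django 1.3-era names
CACHES = {
    'default': {
        # local-memory cache for development instead of memcached
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
    }
}

# database-backed sessions, so memcached is not required
SESSION_ENGINE = 'django.contrib.sessions.backends.db'

MIDDLEWARE_CLASSES = (
    'django.middleware.common.CommonMiddleware',
    # re-enable this if the shipped settings.py has it commented out
    'django.contrib.sessions.middleware.SessionMiddleware',
)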
As you may know, Django is a powerful web framework and Python is a programming language; for Python itself you need more than a few months to become some kind of expert.
It's very easy to start, but not that easy to learn the whole stack. Django is one of the most powerful web frameworks, maybe the most powerful (beside Ruby on Rails).
My advice to you:
learn Python's basics (there are many books, such as Dive Into Python, ...)
learn Django's basics (the Django Book)
use Django on GNU/Linux (for example Ubuntu)
Learning the basics of Django can be done in a few weeks.
Bear in mind that deploying a Django website is not that easy; you need to know a little about the Apache web server.
www.djangoproject.com
Note that in Python (as in other programming languages) you need to set the working path correctly; the errors mentioned in the question's comments are usually caused by this.
You should also check the Python installation path. Sometimes these kinds of errors occur when there is a project with the same name as yours inside the Python installation path.
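To make the path problem concrete, here is a minimal sketch of what a Django 1.3-era project effectively needs before its settings will import; the checkout location is a hypothetical placeholder:

# hedged sketch - run in a Python shell to verify the project is importable
import os
import sys

PROJECT_DIR = '/path/to/openjumo'   # hypothetical checkout location
sys.path.insert(0, PROJECT_DIR)     # put the project ahead of anything with the same name
os.environ['DJANGO_SETTINGS_MODULE'] = 'settings'   # old-style flat settings module

import settings                     # succeeds only if the path above is right
print('settings imported from', settings.__file__)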

What is the best way to distribute code across servers?

I have a directory of Python programs, classes and packages that I currently distribute to 5 servers. It seems I'm continually going to be adding more servers, and right now I'm just doing a basic rsync over from my local box to the servers.
What would a better approach be for distributing code across n servers?
thanks
I use Mercurial with Fabric to deploy all the source code. Fabric is written in Python, so it'll be easy for you to get started. Updating the production service is as simple as fab production deploy, which ends up doing something like this:
Shut down all the services and put an "Upgrade in Progress" page.
Update the source code directory.
Run all migrations.
Start up all services.
It's pretty awesome seeing this all happen automatically.
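A hedged sketch of what such a fabfile might look like follows; the host name, paths and service names are assumptions, not the actual setup described above. With this in place, fab production deploy runs the four steps listed:

# fabfile.py - sketch using the Fabric 1.x API; host/paths/service names are made up
from fabric.api import cd, env, run, sudo

def production():
    env.hosts = ['app1.example.com']        # hypothetical production host

def deploy():
    sudo('service myapp stop')              # 1. shut down the services (swap in the
                                            #    "Upgrade in Progress" page here too)
    with cd('/srv/myapp'):                  # 2. update the source code directory
        run('hg pull && hg update')         #    Mercurial, as mentioned above
        run('python manage.py migrate')     # 3. run all migrations
    sudo('service myapp start')             # 4. start the services back up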
First, make sure to keep all code under revision control (if you're not already doing that), so that you can check out new versions of the code from a repository instead of having to copy it to the servers from your workstation.
With revision control in place you can use a tool such as Capistrano to automatically check out the code on each server without having to log in to each machine and do a manual checkout.
With such a setup, deploying a new version to all servers can be as simple as running
$ cap deploy
from your local machine.
While I also use version control for this, another approach you might consider is to package up the source using whatever package management your host systems use (for example RPMs or dpkgs), and set up the systems to use a custom repository. Then an "apt-get upgrade" or "yum update" will update the software on the systems, and you can use something like "mussh" to run the stop/update/start commands on all the machines.
Ideally, you'd push it to a "testing" repository first, have your staging systems install it, and once the testing of that was signed off on you could move it to the production repository.
It's very similar to the recommendations of using fabric or version control in general, just another alternative which may suit some people better.
The downside to using packages is that you're probably using version control anyway, and on top of that you have to manage version numbers for the packages. I do this using revision tags within my version control, so I could just as easily do an "svn update" or similar on the destination systems.
In either case, you need to consider the migration from one version to the next: if a user loads a page that contains references to other elements, and your update makes those elements go away, what happens? You may wish to handle this either within your deployment scripting or within your code, by first pushing out a version that contains the new page but keeps the old referenced elements, deploying that, and only later removing the old elements in a second deploy.
That way users won't see broken elements within the page.

Django version selection

Greetings,
I am currently working on a long-term project that uses Django 1.1.1, and we are planning to release it around March of 2010.
While surfing I came upon this article, which says the planned release date of Django 1.2.0 is March 9, 2010.
Now I am a bit confused about whether I should continue developing under 1.1.1 or start developing using the 1.2.0 beta.
I'd say only develop against the latest version if there is a specific feature you need or like. Read up on it, of course, so you know what is in store.
From 1.0 onwards I've found swapping Django versions to be relatively trouble-free. At any stage, all you need to do is swap symlinks to a source tree on your test server. And of course, running that thorough unit test suite you've written will show up any version-skew bugs.
The upgrade path is not difficult. I would familiarize yourself with the differences and avoid deprecated features, but continue on the battle-tested 1.1 branch. While you should never, ever trust software release dates, you also probably don't want to bet the farm on a brand-new branch with features you clearly don't need yet. You've got plenty to do to release in a month; upgrade when you have time. If your release were scheduled for June you might consider it, but for now stick with stability. That's my humble opinion.
Part of our current application under development is being put into production now, but we hope to ship on Django 1.2 final. Our strategy is to write code, test and deploy using Django 1.1.1, but also to test under 1.2 using virtualenv. Whatever your deployment decision, there's really no reason not to test your code under 1.2, because you'll eventually want it to be compatible with 1.2.
virtualenv makes the whole process painless and is the key to quickly switching between environments. It's incredibly easy to set up:
easy_install virtualenv
virtualenv django12
cd django12
source bin/activate
Then download and install Django 1.2 in the virtual environment and run your tests. I run the development server in virtualenv on port 8081, so I can have both servers -- using the same application code -- running at the same time, ports 8080, 8081.
In our case we had to remove one import and wrap a few others in try/except blocks. I had to write a dummy csrf_token template tag for CSRF to work; the Django developers have informed me they'll include a dummy tag in 1.2 final. We also upgraded the South migration tool to 0.7-pre, as the current release doesn't support Django 1.2.
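The try/except import wrapping mentioned above typically looks like the sketch below; the json/simplejson pair is just one common example of a module that moved between versions in that era, and the dummy tag follows the standard custom-template-tag layout (the library file name is an assumption):

# compatibility shim for an import that differs between versions
try:
    import json                                     # Python 2.6+
except ImportError:
    from django.utils import simplejson as json    # older fallback

# templatetags/compat.py - a dummy {% csrf_token %} tag so templates written
# for the newer Django also render under the older one
from django import template

register = template.Library()

@register.simple_tag
def csrf_token():
    return ''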
Bottom line: Regardless of your deployment decision, a case can be made for testing both versions of Django if at all possible.
