Django code which runs independently of events - Python

I am creating a web application in Django and I want to create a backend app which runs continuously instead of running only when the "view" is called. How do I do this?
Any help would be appreciated.
Thank you.

Have a look at Celery. It is a task queue that tightly integrates with Django.
You can also create a custom management command that contains a while True: ... sleep loop.
In any case, you should set DEBUG to False; with DEBUG enabled, Django keeps a record of every database query in memory, which will grow without bound in a long-running process.
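For illustration, a minimal sketch of such a management command; the command name and the work inside the loop are placeholders (save it as e.g. myapp/management/commands/poll_events.py):

import time

from django.core.management.base import BaseCommand

class Command(BaseCommand):
    help = "Runs continuously, independent of any view."

    def handle(self, *args, **options):
        while True:
            # do the periodic work here
            self.stdout.write("polling...")
            time.sleep(10)  # sleep so the loop does not busy-wait

You would then start it with python manage.py poll_events and keep it alive with a process manager.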

http://code.google.com/p/django-cron/
This is a plugin that lets you set up tasks for independent execution.
This question also contains a good solution: Django - Set Up A Scheduled Job?

Django is not especially suited for this; that said, you can use Django's facilities and just write a program that executes continuously.

Write a management command and daemonize it with supervisord.
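For illustration, a supervisord program entry for such a command might look like this (all paths and names are placeholders):

[program:poll_events]
command=/path/to/venv/bin/python /path/to/project/manage.py poll_events
directory=/path/to/project
autostart=true
autorestart=true

supervisord then restarts the command if it ever crashes.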
By the way, technically Django itself runs continuously, not only when a view is called.
What are you trying to achieve?

Related

Does django-celery-beat deal with multiple instances of Django processes created by web servers

I have a fairly complex set of periodic tasks that needs to be offloaded from the Django context. django-celery-beat looks promising. While I was going through the celery-beat docs, I found this:
You have to ensure only a single scheduler is running for a schedule at a time, otherwise you’d end up with duplicate tasks. Using a centralized approach means the schedule doesn’t have to be synchronized, and the service can operate without using locks.
A typical production deployment will spawn a pool of worker-processes each running a django instance. Will that result in creation of multiple scheduler processes as well? Do I need to have some synchronisation logic?
Thanks for your time!
It does not.
You can dig into the issues page on their GitHub repo for confirmation. I think it's weird that the documentation doesn't call this out, but I suppose you have to assume that's how all Celery beat schedulers work unless they specify otherwise.
In theory, you could build your own synchronization, but it will probably be a better experience to use a different scheduler that has that functionality built in, like Heroku's RedBeat: https://blog.heroku.com/redbeat-celery-beat-scheduler.
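For illustration, switching to RedBeat is mostly configuration; a sketch, assuming a Redis instance on localhost (the project name proj is a placeholder):

# celery.py of your project
app.conf.beat_scheduler = "redbeat.RedBeatScheduler"
app.conf.redbeat_redis_url = "redis://localhost:6379/1"

# then run beat as usual:
#   celery -A proj beat

RedBeat keeps the schedule in Redis and uses a lock so that concurrent beat processes don't double-fire tasks.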

Django does not exit after finishing tests when using threading module

I am doing integration tests using django.test.SimpleTestCase.
After running python manage.py test, the tests run successfully and the terminal hangs with the message:
---------------------------
Ran 5 tests in 1.365s
OK
The problem is that currently I have to get back to the terminal by pressing CTRL+C, but I want automated tests in my CI/CD pipeline.
Did I do something wrong in the way I executed the tests? Or is this behaviour normal? If so, is there a way in Bash to programmatically run the tests and then exit?
EDIT:
After analysing my app in depth, I was able to identify what was causing that behaviour. I am using threading in a way like the following in my views.py:
import threading

def __pooling():
    wait_time = 10
    call_remote_server()
    threading.Timer(wait_time, __pooling).start()

__pooling()
Basically I need my application to do something from time to time, asynchronously.
Should I change the way I am doing the pooling? Or should I disable it (how?) during the tests?
I was able to identify what was causing that behaviour. I am using threading in a way like the following in my views.py:
def __pooling():
    wait_time = 10
    call_remote_server()
    threading.Timer(wait_time, __pooling).start()

__pooling()
Basically I need my application to do something from time to time, asynchronously. Should I change the way I am doing the pooling?
I don't fully understand your needs, but a more traditional approach would be to schedule a task (probably a management command) outside of Django itself. An OS-level scheduler like cron or Windows Task Scheduler, something like APScheduler, or a task queue like Celery would all be reasonable choices.
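As one illustration, a standalone script using APScheduler could replace the timer chain; call_remote_server is the function from the question, and the interval is a placeholder:

from apscheduler.schedulers.blocking import BlockingScheduler

sched = BlockingScheduler()

@sched.scheduled_job("interval", seconds=10)
def poll():
    call_remote_server()

sched.start()  # blocks; run this script under a process manager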
Or should I disable it (how?) during the tests?
I don't recommend continuing to use your __pooling() function as it exists today. In my opinion this kind of thing doesn't belong in your views.py. But if you want to keep it, something like
from django.conf import settings

if not settings.DEBUG:
    __pooling()
might help. Your __pooling() function would only be called when DEBUG is falsy, as it should be in production. (If it is also falsy in your CI environment you could choose another existing setting, or add something to your settings.py specifically to control this.)
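For example, a dedicated flag keeps the intent explicit; the name ENABLE_POOLING is hypothetical:

# settings.py
ENABLE_POOLING = True  # override to False in your test/CI settings

# views.py
from django.conf import settings

if getattr(settings, "ENABLE_POOLING", False):
    __pooling()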

Running python bot in django background

I've got a Django project with a simple form to take in users' details. I want a Python bot running in the background, constantly checking the Django database for any changes. Is Celery the right tool for this job? Any other solutions? Thank you.
I don't think Celery is really what you want here - Celery is primarily for moving tasks that don't need to be dealt with in the same process to a separate worker, such as sending registration emails.
For this situation I'd be inclined to use Django's signals to trigger the required functionality whenever the appropriate changes are made to the database. For instance, if it needed to be triggered when a particular type of object was created, such as a new user, then you might use the post_save signal of the user model.
The bot would be in a separate process, but it's not too hard to communicate between processes using Redis. Just have the signal publish a message to Redis, and have the bot listen for that message and carry out the required action on that event.
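A rough sketch of that wiring, assuming the user model and an invented channel name bot-events:

# signals.py in the Django project
import json

import redis
from django.conf import settings
from django.db.models.signals import post_save
from django.dispatch import receiver

r = redis.Redis()  # assumes Redis on localhost

@receiver(post_save, sender=settings.AUTH_USER_MODEL)
def notify_bot(sender, instance, created, **kwargs):
    if created:
        r.publish("bot-events", json.dumps({"user_id": instance.pk}))

# in the separate bot process
pubsub = redis.Redis().pubsub()
pubsub.subscribe("bot-events")
for message in pubsub.listen():
    if message["type"] == "message":
        handle_event(json.loads(message["data"]))  # handle_event is a placeholder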
I don't have the details of your needs, but there are a few ways to achieve such things:
The "constantly checking" approach:
A crontab entry which launches your Python script every minute.
Or, like you said, you could use Celery beat to achieve what a crontab would, inside your Python environment.
The "on change" approach:
Probably the best, if you have control of the Django project: have your script run on the form validation/save. For this you can add a Celery task, run the Python script, use Django signals... (see the sketch below)
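For instance, with a Celery task the hook can be quite small; the task and view names here are placeholders:

# tasks.py
from celery import shared_task

@shared_task
def handle_new_details(pk):
    ...  # whatever the bot should do with the new row

# views.py
from django.views.generic.edit import CreateView

from .tasks import handle_new_details

class DetailsCreateView(CreateView):
    ...  # model/fields as in your project

    def form_valid(self, form):
        response = super().form_valid(form)  # saves the object
        handle_new_details.delay(self.object.pk)
        return response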

Django Task/Command Execution Best Practice/Understanding

I've got a little problem with understanding Django management commands. I have a web application which displays some network traffic information from eth0. For that, I've created a Python class which analyses the traffic and creates/updates the corresponding data in the database. Something like this:
class Analyzer:
    def doSomething(self):
        # analyze the traffic, create/update data in the DB
        ...

    def startAnalyzing(self):
        while 1:
            self.doSomething()
Then I create a management command which creates this class instance and runs startAnalyzing().
Now my question:
Is a management command the correct way to do this, given that the task never terminates (it runs the whole time) and is not started/stopped via the web application? Or what is the correct way?
Is it perhaps better to start the Analyzer outside of Django? I'm new to Django and want to do it the right way.
Is it possible to start sniffing the traffic when I run manage.py runserver 0.0.0.0:8080?
Many thanks in advance.
What you're doing is not what management commands are intended for. In fact, management commands are what the name implies: a command to manage something, a quick action, not a process that keeps running for the entire lifetime of the web app.
To achieve what you want, you should write a simple Python script and keep it running with a process manager (supervisord, for example). You then just have to set up Django at the beginning of the script so you have access to Django's ORM, which is probably the reason you chose Django.
So all in all, your script would look something like the following:
import sys, os

# make the project importable and point Django at its settings
sys.path.insert(0, "/path/to/parent/of/project")  # e.g. /home/projects/django-proj
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj.settings")

import django
django.setup()

from proj.app.models import DBModel
This way you can use Django's ORM as you would in a normal Django application. You can also provide templates and views of the database as you normally would.
The only thing that remains is to keep the script running, and that you can simply do with supervisord.
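To round out the sketch, the rest of such a script is typically just a loop over the ORM; the processed field is invented for the example:

import time

while True:
    for obj in DBModel.objects.filter(processed=False):
        ...  # handle the row
        obj.processed = True
        obj.save()
    time.sleep(5)  # poll interval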

Simple approach to launching background task in Django

I have a Django website, and one page has a button (or link) that when clicked will launch a somewhat long running task. Obviously I want to launch this task as a background task and immediately return a result to the user. I want to implement this using a simple approach that will not require me to install and learn a whole new messaging architecture like Celery for example. I do not want to use Celery! I just want to use a simple approach that I can set up and get running over the next half hour or so. Isn't there a simple way to do this in Django without having to add (yet another) 3rd party package?
Just use a thread.
import threading

t = threading.Thread(target=long_process,
                     args=args,
                     kwargs=kwargs,
                     daemon=True)  # daemon thread won't block interpreter exit
t.start()
return HttpResponse()
See this question for more details:
Can Django do multi-thread works?
Have a look at django-background-tasks - it does exactly what you need and doesn't need any additional services to be running like RabbitMQ or Redis. It manages a task queue in the database and has a Django management command which you can run once or as a cron job.
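A minimal sketch with django-background-tasks (the task name is a placeholder):

# tasks.py
from background_task import background

@background(schedule=60)  # run roughly 60 seconds after being queued
def long_process():
    ...

Calling long_process() from a view just queues it; a separate python manage.py process_tasks run (once, or via cron) actually executes it.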
If you're willing to install a 3rd party library, but you want something a whole lot simpler than Celery, check out Redis Queue. It does require Redis, which is pretty easy in itself, but that can provide a lot of other benefits as well.
RQ itself has almost zero configuration. It's startlingly simple.
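To give a feel for it, enqueueing work from a view takes a couple of lines; long_process stands in for your task function, and a local Redis is assumed:

from redis import Redis
from rq import Queue

q = Queue(connection=Redis())
q.enqueue(long_process, some_arg)  # returns immediately

A worker started with rq worker picks the job up and runs it.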
References:
http://python-rq.org/
http://nvie.com/posts/introducing-rq/
https://devcenter.heroku.com/articles/python-rq (RQ on Heroku)
