I am doing integration tests using django.test.SimpleTestCase.
After running python manage.py test, the tests run successfully, but the terminal hangs with the message:
---------------------------
Ran 5 tests in 1.365s
OK
The problem is that currently I get back to the terminal using CTRL+C, but I want to have automated tests in my CI/CD pipeline.
Did I do something wrong in the way I executed the tests? Or is this behaviour normal? If so, is there a way in Bash to programmatically execute the tests and then exit?
EDIT:
After analysing my app in depth, I was able to identify what was causing that behaviour. I am using threading in a way like the following in my views.py:
import threading

def __pooling():
    wait_time = 10
    call_remote_server()
    threading.Timer(wait_time, __pooling).start()

__pooling()
Basically I need my application to do something from time to time, asynchronously.
Should I change the way I am doing the pooling? Or should I disable it during the tests (and if so, how)?
I was able to identify what was causing that behaviour. I am using threading in a way like the following in my views.py:
def __pooling():
    wait_time = 10
    call_remote_server()
    threading.Timer(wait_time, __pooling).start()

__pooling()
Basically I need my application to do something from time to time, asynchronously. Should I change the way I am doing the pooling?
I don't fully understand your needs, but a more traditional approach would be to schedule a task (probably a management command) outside of Django itself. An OS-level scheduler like cron or Windows Task Scheduler, something like APScheduler, or a task queue like Celery would all be reasonable choices.
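For illustration, here is a minimal sketch of such a management command; the app name, command name, and the import path for call_remote_server are assumptions:

# myapp/management/commands/poll_remote.py  (hypothetical app and command names)
from django.core.management.base import BaseCommand

from myapp.views import call_remote_server  # assumed import path


class Command(BaseCommand):
    help = "Call the remote server once; schedule externally for periodic runs."

    def handle(self, *args, **options):
        call_remote_server()

A scheduler would then invoke python manage.py poll_remote periodically. Note that cron's granularity is one minute, so for the original 10-second interval APScheduler or Celery beat would be a better fit.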
Or should I disable it (how?) during the tests?
I don't recommend continuing to use your __pooling() function as it exists today. In my opinion this kind of thing doesn't belong in your views.py. But if you want to keep it, something like
from django.conf import settings

if not settings.DEBUG:
    __pooling()
might help. Your __pooling() function would only be called when DEBUG is falsy, as it should be in production. (If it is also falsy in your CI environment you could choose another existing setting, or add something to your settings.py specifically to control this.)
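One more hedged note: if the timer must stay enabled, marking it as a daemon thread would at least stop it from keeping the test process alive, since Python exits without waiting for daemon threads. A minimal sketch of that variant:

import threading

def __pooling():
    wait_time = 10
    call_remote_server()
    timer = threading.Timer(wait_time, __pooling)
    timer.daemon = True  # daemon threads do not block interpreter shutdown
    timer.start()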
Related
There is a specific periodic task that needs to be removed from the message queue. I am using a Redis and Celery configuration here.
tasks.py
@periodic_task(run_every=crontab(minute='*/6'))
def task_abcd():
    """
    some operations here
    """
There are other periodic tasks in the project as well, but I need this specific task to stop from now on.
As explained in this answer, will the following code work?
@periodic_task(run_every=crontab(minute='*/6'))
def task_abcd():
    pass
In this example the periodic task schedule is defined directly in code, meaning it is hard-coded and cannot be altered dynamically without a code change and an app re-deploy.
The provided code, with the task logic deleted or with a simple return at the beginning, will work, but it is not the answer to the question: the task will still run, there just is no code inside it.
Also, it is recommended NOT to use @periodic_task, which is deprecated; its docstring reads:
"""Deprecated decorator, please use :setting:beat_schedule."""
First, change the method from a @periodic_task to a regular Celery @task; and because you are using Django, it is better to go straight for @shared_task:
from celery import shared_task

@shared_task
def task_abcd():
    ...
Now this is just an ordinary Celery task, which needs to be called explicitly; or it can run periodically if added to the Celery beat schedule.
For production, and if using multiple workers, it is not recommended to run the Celery worker with embedded beat (-B); run a separate instance of the celery beat scheduler instead.
The schedule can be specified in celery.py or in the Django project settings (settings.py).
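For instance, a minimal sketch of a settings-based schedule, assuming the standard Django integration where celery.py calls app.config_from_object('django.conf:settings', namespace='CELERY'); the dotted task path is an assumption:

# settings.py
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    "task-abcd-every-6-minutes": {
        "task": "myapp.tasks.task_abcd",  # dotted path is an assumption
        "schedule": crontab(minute="*/6"),
    },
}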
It is still not very dynamic: to re-read the settings, the app needs to be reloaded.
Then, use the Database Scheduler (django-celery-beat), which allows creating schedules dynamically: which tasks need to run, when, and with what arguments. It even provides nice Django admin views for administration!
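A rough sketch of creating such a schedule from code with django-celery-beat installed; the task's dotted path is an assumption:

from django_celery_beat.models import CrontabSchedule, PeriodicTask

schedule, _ = CrontabSchedule.objects.get_or_create(minute="*/6")
PeriodicTask.objects.create(
    crontab=schedule,
    name="Run task_abcd every 6 minutes",
    task="myapp.tasks.task_abcd",  # dotted path is an assumption
)

Disabling the task later is then just a matter of flipping its enabled flag in the admin, with no re-deploy.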
That code will work but I'd go for something that doesn't force you to update your code every time you need to disable/enable the task.
What you could do is use a configurable variable, whose value could come from an admin panel, a configuration file, or wherever you want, and have the task return before its code runs if it is disabled.
For instance:
from celery.schedules import crontab
from celery.task import periodic_task  # deprecated import path; see the note above

@periodic_task(run_every=crontab(minute='*/6'))
def task_abcd():
    config = load_config_for_task_abcd()
    if not config.is_enabled:
        return
    # some operations here
In this way, even if your task is scheduled, its operations won't be executed.
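load_config_for_task_abcd is left to the reader; here is a minimal sketch backed by a Django model, with entirely hypothetical names:

# models.py (hypothetical; register TaskConfig in the admin to get the toggle)
from django.db import models


class TaskConfig(models.Model):
    task_name = models.CharField(max_length=100, unique=True)
    is_enabled = models.BooleanField(default=True)


def load_config_for_task_abcd():
    # falls back to "enabled" if no row exists yet
    config, _ = TaskConfig.objects.get_or_create(task_name="task_abcd")
    return config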
If you simply want to remove the periodic task, have you tried removing the function and then restarting your Celery service? You can restart your Redis service, as well as your Django server, for good measure.
Make sure that the function you removed is not referenced anywhere else.
I need a thread running when I start my Django server; basically the thread just periodically processes some items from a database.
Where is the best place to start this thread?
I think this is generally a bad idea. You shouldn't have that kind of periodic thread running in the frontend process.
I would create a management command that does the processing. Then I would set up a cron job (or any other mechanism provided by the hosting) calling the management command. This way you divide the work into logical places and you can also test the processing much more easily.
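A minimal sketch of such a command, assuming an Item model with a processed flag (both names are hypothetical):

# myapp/management/commands/process_items.py  (hypothetical paths)
from django.core.management.base import BaseCommand

from myapp.models import Item  # assumed model


class Command(BaseCommand):
    help = "Process pending items once; run periodically via cron."

    def handle(self, *args, **options):
        for item in Item.objects.filter(processed=False):
            ...  # the actual processing goes here
            item.processed = True
            item.save()

A crontab entry calling python manage.py process_items then takes care of the scheduling.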
You want to execute code in the top-level urls.py. That module is imported and executed once on server startup.
In your urls.py:
from django.conf.urls.defaults import *
from my_app import one_time_startup

urlpatterns = ...

one_time_startup()  # This is your function that you want to execute.
I've got a little problem with understanding Django management commands. I've got a web application which displays some network traffic information from eth0. For this I've created a Python class which analyses the traffic and creates/updates the specific data in the database. Something like this:
class Analyzer:
    def doSomething(self):
        # analyze the traffic, create/update data in the db
        ...

    def startAnalyzing(self):
        while 1:
            self.doSomething()
Then I create a management command which creates this class instance and runs startAnalyzing().
Now my questions:
Is this the correct way to do it via a management command, given that the task does not terminate (it runs the whole time) and is not started/stopped via the web application? Or what is the correct way?
Is it perhaps better to start the "Analyzer" not via Django? I'm new to Django and want to do it the right way.
Is it possible to start sniffing the traffic when I run manage.py runserver 0.0.0.0:8080?
Many thanks in advance.
What you're doing is not what management commands are intended for. Management commands are, as the name implies, commands to manage something: quick actions, not whole processes kept running for the entire lifetime of the web app.
To achieve what you want, you should write a simple Python script and keep it running with a process manager (Supervisor?). You then just have to set up Django at the beginning of the script so you can have access to Django's ORM, which probably is the reason you've chosen Django.
So all in all, your script would look something like the following:
import sys, os

sys.path.insert(0, "/path/to/parent/of/project")  # e.g. /home/projects/django-proj
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "proj.settings")

import django
django.setup()

from proj.app.models import DBModel
This way you can use Django's ORM as you would in a normal Django application. You can also provide templates and views of the database as you normally would.
The only thing that remains is to keep the script running, and that you can simply do with supervisord.
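To round the sketch out, the rest of such a script could be a simple polling loop; the processed field and the 30-second interval are assumptions:

import time

def main():
    while True:
        for obj in DBModel.objects.filter(processed=False):  # field name is an assumption
            ...  # process the item here
            obj.processed = True
            obj.save()
        time.sleep(30)  # interval is arbitrary

if __name__ == "__main__":
    main()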
I am creating a web application in Django and I want to create a backend app which runs continuously, instead of running only when a view is called. How do I do this?
Any help would be appreciated.
Thank you.
Have a look at Celery. It is a task queue that tightly integrates with Django.
You can also create a custom management command that contains a while True: ... sleep loop.
In any case, you should set DEBUG to False, otherwise Django will eat up your memory (with DEBUG enabled, Django keeps a record of every database query, which grows without bound in a long-running process).
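A minimal sketch of such a management command; the app and command names are hypothetical:

# myapp/management/commands/run_backend.py  (hypothetical paths)
import time

from django.core.management.base import BaseCommand


class Command(BaseCommand):
    help = "Run the continuous backend loop."

    def handle(self, *args, **options):
        while True:
            ...  # do the periodic work here
            time.sleep(30)  # interval is arbitrary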
http://code.google.com/p/django-cron/
This is a plugin which allows you to set up tasks for independent execution.
This question also contains a good solution: Django - Set Up A Scheduled Job?
Django is not especially meant for this; that said, you can use Django's facilities and just write a program that executes continuously.
Write a management command and daemonize it with supervisord.
By the way, technically Django itself is running continuously, not only when a view is called.
What are you trying to achieve?
I want to perform some one-time initialization when the Django server starts, such as starting a background thread that populates a cache every 30 minutes, so it will not block users from visiting the website. Where should I place all this code in Django?
Putting it into the settings.py file does not work. It seems to cause a circular dependency.
Putting it into the __init__.py file does not work either; the Django server calls it many times (what is the reason?).
I just create standalone scripts and schedule them with cron. Admittedly it's a bit low-tech, but It Just Works. Just place this at the top of a script in your project's top-level directory and call it as needed.
#!/usr/bin/env python
# Note: setup_environ and commit_unless_managed are old Django APIs
# (removed in Django 1.6 and 1.8 respectively).
from django.core.management import setup_environ

import settings

setup_environ(settings)

from django.db import transaction

# random interesting things

# If you change the database, make sure you use this next line
transaction.commit_unless_managed()
We put one-time startup scripts in the top-level urls.py. This is often where your admin bindings go -- they're one-time startup, also.
Some folks like to put these things in settings.py but that seems to conflate settings (which don't do much) with the rest of the site's code (which does stuff).
For a one-time operation at server start you can use custom commands; if you want a periodic task or a queue of tasks, you can use Celery.
__init__.py will be called every time the app is imported. So if you're using mod_wsgi with Apache, for instance with the prefork method, then every new process created effectively 'starts' the project, thus importing __init__.py. It sounds like your best option would be to create a new management command, and then cron it to run every so often, if that's an option. Either that, or run the management command before starting the server. You could write a quick script that runs the management command and then starts the server, for instance.
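A hedged sketch of such a wrapper script; the warm_cache command is hypothetical, and DJANGO_SETTINGS_MODULE is assumed to be set in the environment:

# run_with_startup.py
import django
from django.core.management import call_command

if __name__ == "__main__":
    django.setup()
    call_command("warm_cache")  # hypothetical one-time startup command
    call_command("runserver", "0.0.0.0:8000")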