I am setting up my scheduler using APScheduler inside a file called Scheduler.py -
from apscheduler.schedulers.background import BackgroundScheduler

def start_scheduler():
    scheduler = BackgroundScheduler()
    scheduler.add_job(xyz_job, trigger='cron', hour=21, minute=0)
    scheduler.start()
Now inside my apps.py -
from django.apps import AppConfig

from .Scheduler import start_scheduler

class AppNameConfig(AppConfig):
    name = 'appname'
    start_scheduler()
Then from cmd I use - python manage.py runserver
My scheduler works fine: every day at 9 pm, the job runs in the background.
P.S. - This is an API with endpoints; I tested it using Postman and got the correct output.
Now on my Linux server I am doing the same thing, but instead of Django's development server I am using gunicorn to expose my API.
I am using the command -
gunicorn -b server_ip:port project.wsgi --workers=2 --daemon
Using the above gunicorn command my API still works fine and I get the output, but my scheduler isn't running.
Can anyone give some insight into a possible solution for this?
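For reference (not the poster's code), the conventional place to start background machinery like this is AppConfig.ready(), which runs once per process after Django's app registry is loaded. A minimal sketch, assuming the same Scheduler.py module; note that with --workers=2 each gunicorn worker is a separate process, so ready() runs in each worker:

from django.apps import AppConfig

class AppNameConfig(AppConfig):
    name = 'appname'

    def ready(self):
        # called once per process after the app registry is populated
        from .Scheduler import start_scheduler
        start_scheduler()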
I'm trying to figure out the woeful instructions here.
Under the section "Configuring a Celery app" I'm not sure where I put this code:
import os
app.conf.update(BROKER_URL=os.environ['REDIS_URL'],
                CELERY_RESULT_BACKEND=os.environ['REDIS_URL'])
Any clarification of these instructions is greatly appreciated.
The instructions are indicating you should put that code in your tasks.py module. However, that's not exactly extensible for multiple packages, each with their own tasks.py module. What I'd recommend is creating a celery.py file in the same directory as your settings.py file.
# tasks.py
import os

import celery

app = celery.Celery('example')
app.conf.update(BROKER_URL=os.environ['REDIS_URL'],
                CELERY_RESULT_BACKEND=os.environ['REDIS_URL'])
Or you can specify your settings in settings.py and configure celery as such:
# settings.py
import os

broker_url = os.environ['REDIS_URL']
result_backend = os.environ['REDIS_URL']
# celery.py
import os

from celery import Celery
from celery.utils.collections import DictAttribute
from celery.loaders.base import BaseLoader
from django.conf import settings
from django.apps import apps

class ProjectLoader(BaseLoader):
    def read_configuration(self):
        """Load configuration from Django settings.

        This may not be needed to be honest. It's what I use in my project.
        """
        return DictAttribute(settings)

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "project.settings")
# CELERY_LOADER must be set in the environment. Setting the ``loader``
# kwarg for the app instance does _not_ do what we need it to.
os.environ.setdefault("CELERY_LOADER", "project.celery:ProjectLoader")

app = Celery("project")
app.config_from_object("django.conf:settings")
app.autodiscover_tasks(lambda: [n.name for n in apps.get_app_configs()])
# Procfile
worker: celery worker --app=project.celery
Disclaimer, some of these configs will require adjustment for your project.
Following are the steps I took to make a minimal Heroku/Django/Celery/Redis project, in conjunction with the instructions here and other sources I found on the web. Hopefully someone will find this useful.
In your terminal, use the "heroku login" command to log in to the Heroku CLI.
"git clone https://github.com/heroku/python-getting-started.git" to copy a basic django skeleton project to your local.
rename python-getting-started to whatever.
cd into this directory.
run the following command: "pip install -r requirements.txt"
Note: Postgres must be properly installed for this step to work.
run the following command: "python manage.py collectstatic"
Install redis on Mac: "brew install redis"
Start the redis server: "redis-server &" (the & at the end runs it as a background process)
Test if Redis server is running: "redis-cli ping". If it replies “PONG”, then it’s good to go!
Install celery: "pip install celery"
Make a tasks.py file in your application directory with the following code:
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def add(x, y):
    return x + y
"cd .." back into root directory.
Run celery: "celery worker -A tasks &" (where tasks is the name of the module containing your tasks).
run: "python manage.py shell" in your root directory.
Now that your celery worker has been started, you can run your task just by importing the tasks.py module, e.g. from the Python interpreter's interactive mode:
import hello.tasks
hello.tasks.add.delay(1, 1)
This should return an AsyncResult object.
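For example, you can poke at the returned object like this; note that fetching the actual result with result.get() also requires a result backend, e.g. passing backend='redis://localhost:6379/0' to the Celery() call above:

result = hello.tasks.add.delay(1, 1)
result.ready()          # True once the worker has executed the task
result.get(timeout=5)   # returns 2 (needs the result backend mentioned above)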
Push your local to heroku master.
** Note: If you run "celery worker -A tasks &" and it gives the message:
consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 61]
Connection refused.
Try restarting the redis server with the command: "brew services restart redis"
There you have it: a minimal Heroku/Django/Celery/Redis project! You can download it here. Instructions on how to deploy this to Heroku.
** Note: In the working project the "celery worker" command is already included in the Procfile.
I have written this task in the tasks.py file under my Django app directory myapp.
import logging
from datetime import datetime

from celery.schedules import crontab
from celery.task import periodic_task

from .models import Url

logger = logging.getLogger(__name__)

# periodic task that runs every minute
@periodic_task(run_every=crontab(hour="*", minute="*", day_of_week="*"))
def news():
    '''
    Grab url
    '''
    logger.info("Start task")
    now = datetime.now()
    urls = []
    urls.append(crawler())  # crawler returns a dict object
    for url_dict in list(reversed(urls)):
        for title, url in url_dict.items():
            # Save all the scraped urls in the database
            Url.objects.create(title=title, url=url)
    logger.info("Task finished: result = %s" % url)
The main objective of this task is to push the url and title into the Django database every minute.
To run this celery task I need to invoke these commands via Django's ./manage.py utility, and I am planning to host this app on Heroku:
python manage.py celeryd --verbosity=2 --loglevel=DEBUG
python manage.py celerybeat --verbosity=2 --loglevel=DEBUG
but I need to run these two commands as daemons in the background. How can I run these commands as daemons so that my celery tasks can run?
A quick fix is to put "&" after your commands, i.e.
python manage.py celeryd --verbosity=2 --loglevel=DEBUG &
python manage.py celerybeat --verbosity=2 --loglevel=DEBUG &
After hitting enter, these tasks will run in the background and still print out useful debug info, so this is fine for the initial stage and sometimes for small applications that do not rely heavily on celery.
For deployment purposes I suggest using supervisor. See THIS POST, which gives really nice info on celery, django and supervisor integration; read the "Running Celery workers as daemons" part of the post.
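For illustration, a minimal supervisor config might look like the following; the paths, user name and virtualenv location here are placeholders of mine, not values from the post:

# /etc/supervisor/conf.d/celery.conf
[program:celeryd]
command=/home/myuser/venv/bin/python manage.py celeryd --loglevel=INFO
directory=/home/myuser/myproject
user=myuser
autostart=true
autorestart=true
stdout_logfile=/var/log/celery/worker.log
redirect_stderr=true

[program:celerybeat]
command=/home/myuser/venv/bin/python manage.py celerybeat --loglevel=INFO
directory=/home/myuser/myproject
user=myuser
autostart=true
autorestart=true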
I am working on a Django-based web app. While unit testing, I need to write a test which needs a Celery worker running in the background.
I have already used:
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True
CELERY_ALWAYS_EAGER = True
BROKER_BACKEND = 'memory'
in my overridden settings, but these do not run a celery worker for me in the background when needed.
Any help would be much appreciated.
Celery won't get run by Django automatically.
You can start a worker process by running from your project root:
$ celery -A my_proj worker
my_proj should be the application name you configured with app = Celery('my_proj')
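Alternatively, if the goal is only to exercise tasks inside unit tests without a separate worker, a sketch along these lines applies the eager settings per test (the add task and myapp names are placeholders of mine, and whether the override is picked up depends on how your Celery app reads its configuration):

from django.test import TestCase, override_settings

from myapp.tasks import add

class AddTaskTests(TestCase):
    @override_settings(CELERY_ALWAYS_EAGER=True,
                       CELERY_EAGER_PROPAGATES_EXCEPTIONS=True,
                       BROKER_BACKEND='memory')
    def test_add_runs_eagerly(self):
        # with ALWAYS_EAGER the task body runs synchronously in-process,
        # so no worker or broker is required
        result = add.delay(2, 3)
        self.assertEqual(result.get(), 5)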
Here's my celery app config:
from __future__ import absolute_import

import os

from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'tshirtmafia.settings')

app = Celery('tshirtmafia')
app.conf.update(
    CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend',
)
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
settings.py:
INSTALLED_APPS:
'kombu.transport.django',
'djcelery',
also:
BROKER_URL = 'django://'
Here's my task:
@shared_task
def test():
    send_mail('nesamone bus', 'Files have been successfully generated.', 'marijus.merkevicius@gmail.com',
              ['marijus.merkevicius@gmail.com'], fail_silently=False)
Now when I run python manage.py celeryd locally and then run test.delay() from the shell, it works.
Now I'm trying to deploy my app. With the exact same configuration I run python manage.py celeryd, open a shell in another window and run the test task, and it doesn't work.
I've also tried to set up a background daemon like this:
/etc/default/celeryd configuration:
# Name of nodes to start, here we have a single node
CELERYD_NODES="w1"
# or we could have three nodes:
#CELERYD_NODES="w1 w2 w3"
# Where to chdir at start. (CATMAID Django project dir.)
CELERYD_CHDIR="/home/tshirtnation/"
# Python interpreter from environment. (in CATMAID Django dir)
ENV_PYTHON="/usr/bin/python"
# How to call "manage.py celeryd_multi"
CELERYD_MULTI="$ENV_PYTHON $CELERYD_CHDIR/manage.py celeryd_multi"
# How to call "manage.py celeryctl"
CELERYCTL="$ENV_PYTHON $CELERYD_CHDIR/manage.py celeryctl"
# Extra arguments to celeryd
CELERYD_OPTS="--time-limit=300 --concurrency=1"
# Name of the celery config module.
CELERY_CONFIG_MODULE="celeryconfig"
# %n will be replaced with the nodename.
CELERYD_LOG_FILE="/var/log/celery/%n.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"
# Workers should run as an unprivileged user.
CELERYD_USER="celery"
CELERYD_GROUP="celery"
# Name of the projects settings module.
export DJANGO_SETTINGS_MODULE="settings"
And I use the default celery /etc/init.d/celeryd script.
So basically it seems like celeryd starts but doesn't work. I have no idea how to debug this or what might be wrong.
Let me know if you need anything else.
Celery turned out to be a very capricious child in Django's otherwise robust ecosystem, in my experience.
There is too little initial data here to understand the reason for your problems.
The most common reason for a Celery daemon failure is file system permissions.
But to pin down the reason, I'd try the following:
Start celery from the command line as the user who owns the Django project:
celery -A proj worker -l info
If it works OK, go further.
Start celery verbosely as root, just as the daemon would run:
sudo sh -x /etc/init.d/celeryd start
This will show most of the problems with the daemon script - the celery user and group used - but not all of them, unfortunately: permission failures are not visible.
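For instance, to chase invisible permission failures you might check (and, if needed, fix) ownership of the log and pid directories from the config above; the celery:celery target just mirrors the CELERYD_USER/CELERYD_GROUP in the question:

ls -ld /var/log/celery /var/run/celery
sudo chown -R celery:celery /var/log/celery /var/run/celery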
A little remark of mine: usually Celery is started by its own celery user, and the Django project runs as another user. After a long fight with celery and the system, I gave up on the celery user and ran the celery process as the Django project's user.
And... do not forget to run, once:
update-rc.d celerybeat defaults
update-rc.d celeryd defaults
This is for the Ubuntu daemon start, of course.
Good luck
This does not work for me
$> cat /etc/lsb-release
DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=12.04
DISTRIB_CODENAME=precise
DISTRIB_DESCRIPTION="Ubuntu 12.04.2 LTS"
django 1.7rc3
celery 3.1.13
python 2.7
I attempt to run
celery worker -A <project_name>
and I get
django.core.exceptions.AppRegistryNotReady: Models aren't loaded yet.
The runserver command works fine, so I don't think the problem is with my settings:
python manage.py runserver 0.0.0.0:8080
I've double-checked celery.py and confirmed it has the correct values on the following lines:
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
app = Celery('proj')
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
Is there something else I should be doing?
Django 1.7 now requires a different initialization for standalone scripts. When running outside the manage.py context, you now need to include:
import django
django.setup()
Try adding it before the app = Celery('proj') in your script.
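For example, the top of a celery.py with that fix applied might look like this (a sketch following the question's proj naming):

import os

import django

# the settings module must be set before django.setup() can load the registry
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
django.setup()  # populates the app registry; required for standalone scripts on 1.7+

from celery import Celery

app = Celery('proj')
app.config_from_object('django.conf:settings')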
I found out that the reason this was failing is that I had this in one of my tasks.py modules:
CURRENT_DOMAIN = Site.objects.get_current().domain
I side-stepped this for now with
CURRENT_DOMAIN = lambda: Site.objects.get_current().domain
I'm currently waiting to see if anyone on GitHub offers a better recommendation:
https://github.com/celery/celery/issues/2227
I will update if I get one; if not, I will probably just write a helper function that lazily returns the value I want.
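Such a lazy helper might look like this (a sketch of the idea, not code from the issue; it assumes django.contrib.sites is installed):

def get_current_domain():
    # import and query at call time, after Django's app registry is ready
    from django.contrib.sites.models import Site
    return Site.objects.get_current().domain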
Update
At the suggestion of the celery author, I've refactored my code so I don't make that call at the module level.
He has also addressed the issue by making sure django.setup() is called before importing task modules:
https://github.com/celery/celery/issues/2227
I'm seeing the same issue, however only with:
CELERYBEAT_SCHEDULER='djcelery.schedulers.DatabaseScheduler'
Do you have this enabled? If I run it with the default celery scheduler it loads fine, but I can't get it to load with the Django scheduler.