django 1.7rc3 and celery 3.1.13 - AppRegistryNotReady - python

This does not work for me:
$> cat /etc/lsb-release
DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=12.04
DISTRIB_CODENAME=precise
DISTRIB_DESCRIPTION="Ubuntu 12.04.2 LTS"
django 1.7rc3
celery 3.1.13
python 2.7
I attempt to run
celery worker -A <project_name>
and I get
django.core.exceptions.AppRegistryNotReady: Models aren't loaded yet.
The runserver command works fine, so I don't think it's an issue with my settings:
python manage.py runserver 0.0.0.0:8080
I've double checked celery.py and confirm it has the correct values for the following lines:
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
app = Celery('proj')
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
Is there something else I should be doing?

Django 1.7 requires a different initialization for standalone scripts. When running outside the manage.py context, you now need to include:
import django
django.setup()
Try adding this before the app = Celery('proj') line in your script.
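Putting it together, the top of your celery.py would look something like this (a minimal sketch following the layout quoted above):
from __future__ import absolute_import

import os

import django
from celery import Celery

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

# Populate Django's app registry before any models are touched.
django.setup()

app = Celery('proj')
app.config_from_object('django.conf:settings')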

I found out that the reason this was failing is that I had this in one of my tasks.py files:
CURRENT_DOMAIN = Site.objects.get_current().domain
I side-stepped this for now with
CURRENT_DOMAIN = lambda: Site.objects.get_current().domain
and am currently waiting to see if anyone on GitHub cares to offer a better recommendation:
https://github.com/celery/celery/issues/2227
I'll update if I get one; if not, I'll probably just write a helper function that lazily returns the value I want.
Update: at the suggestion of the Celery author, I've refactored my code so I don't make that call at the module level. He's also addressed the issue by making sure django.setup() is called before task modules are imported:
https://github.com/celery/celery/issues/2227
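For reference, the refactor amounts to deferring the query to call time; a minimal sketch (current_domain is a name I made up):
from django.contrib.sites.models import Site

def current_domain():
    # Runs the query at call time, once the app registry is ready,
    # instead of at import time.
    return Site.objects.get_current().domain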

I'm seeing the same issue, but only with:
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
Do you have this enabled? If I run it with the default celery scheduler it loads fine, but I can't get it to load with the Django scheduler.
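For comparison, the two beat invocations look like this (a sketch; proj stands in for the project name):
# default file-based scheduler
celery -A proj beat
# djcelery's database-backed scheduler
celery -A proj beat -S djcelery.schedulers.DatabaseScheduler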

Related

Django hosting on Cloudera applications: server not starting

I am trying to port a Django web application from another server to Cloudera "applications" (Data Science Workbench). I managed to do so with Flask and FastAPI applications; only the Django framework is missing. My problem: the base setup (https://docs.djangoproject.com/en/3.2/intro/tutorial01/) works like a charm locally, but when I try to start up the server from an instance in Cloudera, the server does not start and, more curiously, I get a weird output related to some package of the image I am spinning up. (You can see I am bypassing the traditional runserver command in Django, because on the Cloudera applications side I cannot run shells directly, and also because I would like to tie it up with some environment variable.)
Below is my manage.py:
import os
import sys

if __name__ == '__main__':
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")
    from django.core.management import execute_from_command_line
    port_to_pass = 8000
    args = ['name', 'runserver', '127.0.0.1:%d' % port_to_pass]
    execute_from_command_line(args)
weird "print" (the instance I am spinning has got the exact same package versions installed locally):
when I should be getting:
any idea what might the problem be?
You need an extra entry.py script:
!pip install django
!python manage.py runserver 8000
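If the workbench's ! shell syntax isn't available, a plain-Python entry.py along these lines should be equivalent (a sketch; CDSW_APP_PORT is my assumption about the port variable the workbench provides):
# entry.py -- plain-Python equivalent of the two shell lines above.
import os
import sys

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings")

from django.core.management import execute_from_command_line

# CDSW_APP_PORT is an assumption about the variable the workbench uses
# to tell the app which port to bind; fall back to 8000.
port = os.environ.get("CDSW_APP_PORT", "8000")
# --noreload keeps runserver in a single process, which hosted
# environments tend to handle better.
execute_from_command_line([sys.argv[0], "runserver", "127.0.0.1:%s" % port, "--noreload"])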

How can I run the full Flask Tutorial app in the Wingware IDE?

I've been using Flask under Wing Pro 7.2 for some time, and can get control because I start Flask by doing app.run() in Wing.
I conceived a wish to trace through the official working version of the completed tutorial, obtained by
git clone https://github.com/pallets/flask
This works fine (using 'flask run'), and I now have the complete source. But there's no app.run() anywhere. I tried putting one in __init__.py:
def create_app(test_config=None):
    # ...
    db.init_app(app)
    return app

RUN = True
if RUN:
    app = create_app()
    app.run()
and flask starts up, but throws an error on request 'localhost:5000/', which normally fires up a database form.
Is there a starting point in the Python code somewhere?
Or, is it possible to attach Wing to a running flask, and tell it about the source files? There is a bit in the Wing manual about attaching, but it seems to demand information about the target that we lack.
I managed to start the tutorial by creating a file main.py in the same directory as the flaskr package, with these contents:
import flaskr
app = flaskr.create_app()
app.debug = False
app.run(use_reloader=True)
Then I set this as the main debug file in Wing.
To make debugging work correctly, you may also need to set the Python Executable in Project Properties (from the Project menu) to the Python command line or activated env you want to use.
Also, it is important to set Debug/Execute > Debug Child Processes in Project Properties to Always Debug Child Processes. Otherwise the process actually running the app code is not debugged.
This works, but results in a SQL error (the table 'post' does not exist) if you did not already run the following to initialize the database:
$ export FLASK_APP=flaskr
$ export FLASK_ENV=development
$ flask init-db
Once I did that, everything worked for me.
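For anyone on Windows, the equivalent of the export lines above uses set (cmd.exe syntax):
> set FLASK_APP=flaskr
> set FLASK_ENV=development
> flask init-db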

Clarification of guide to Heroku Celery

I'm trying to figure out the woeful instructions here
Under the section "Configuring a Celery app" I'm not sure where I should put this code:
import os

app.conf.update(BROKER_URL=os.environ['REDIS_URL'],
                CELERY_RESULT_BACKEND=os.environ['REDIS_URL'])
Any clarification of these instructions is greatly appreciated.
The instructions are indicating you should put that code in your tasks.py module. However, that's not exactly extensible for multiple packages, each with their own tasks.py module. What I'd recommend instead is creating a celery.py file in the same directory as your settings.py file:
# tasks.py
import os

import celery

app = celery.Celery('example')
app.conf.update(BROKER_URL=os.environ['REDIS_URL'],
                CELERY_RESULT_BACKEND=os.environ['REDIS_URL'])
Or you can specify your settings in settings.py and configure celery as such:
# settings.py
broker_url = os.environ['REDIS_URL']
result_backend = os.environ['REDIS_URL']

# celery.py
import os

from celery import Celery
from celery.utils.collections import DictAttribute
from celery.loaders.base import BaseLoader
from django.conf import settings
from django.apps import apps

class ProjectLoader(BaseLoader):
    def read_configuration(self):
        """Load configuration from Django settings.

        This may not be needed, to be honest. It's what I use in my project.
        """
        return DictAttribute(settings)

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "project.settings")
# CELERY_LOADER must be set in the environment. Setting the ``loader``
# kwarg for the app instance does _not_ do what we need it to.
os.environ.setdefault("CELERY_LOADER", "project.celery:ProjectLoader")

app = Celery("project")
app.config_from_object("django.conf:settings")
app.autodiscover_tasks(lambda: [n.name for n in apps.get_app_configs()])

# Procfile
worker: celery worker --app=project.celery
Disclaimer, some of these configs will require adjustment for your project.
Following are the steps I took to make a minimal heroku/django/celery/redis project, in conjunction with the instructions here and other sources I found on the web. Hopefully someone will find this useful.
In your terminal, use the "heroku login" command to log in to the Heroku CLI.
"git clone https://github.com/heroku/python-getting-started.git" to copy a basic django skeleton project to your local.
rename python-getting-started to whatever.
cd into this directory.
run the following command: "pip install -r requirements.txt"
Note: Postgres must be properly installed in order for this step to work properly.
run the following command: "python manage.py collectstatic"
Install redis on Mac: "brew install redis"
Start redis server: "redis-server&" (The & at the end is to run it as a background process)
Test if Redis server is running: "redis-cli ping". If it replies “PONG”, then it’s good to go!
Install celery: "pip install celery"
Make a tasks.py file in your application directory with the following code:
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def add(x, y):
    return x + y
"cd .." back into root directory.
Run celery: "celery worker -A <location of tasks> &"
run: "python manage.py shell" in your root directory.
Now that your celery worker has been started, you can use it to run your task just by importing the tasks.py script, e.g. from the Python interpreter's interactive mode:
import hello.tasks
hello.tasks.add.delay(1, 1)
This should return an AsyncResult.
Push your local to heroku master.
** Note: If you run "celery worker -A <location of tasks.py> &" and it gives the message:
consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 61]
Connection refused.
Try restarting the redis server with the command: "brew services restart redis"
There you have it: a minimal heroku/django/celery/redis project! You can download it here, along with instructions on how to deploy this to heroku.
** Note: In the working project the "celery worker" command is already included in the Procfile.
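For reference, the relevant Procfile lines look something like this (a sketch; the module names follow the getting-started project, so adjust them to your own layout):
web: gunicorn gettingstarted.wsgi
worker: celery worker -A hello.tasks -l info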

How to decouple Django and Celery?

Among my celery tasks, I have a task which uses a Python module (theano) that runs on the GPU; this module can only be imported by one thread at a time. But to start the website, I have to run:
python manage.py runserver
celery -A celery_try worker -l info
So the module will be imported by both celery and the django website, which is a conflict. Is there a way to decouple Django and Celery so that the module is only imported once?
For testing purposes you can run django development server in single-threaded mode: python manage.py runserver --nothreading.
You want to import theano only in celery worker process, not in django web server process, right? Ok, let's make import conditional, so that it is imported in celery and not in django.
import os

try:
    # the next line will raise an exception in django, but will work fine in celery
    is_worker = os.environ['celery_worker']
    import theano  # celery will import theano, django won't
except Exception as exc:
    # django will catch the exception that celery_worker doesn't exist and print it here
    print exc
And start your celery worker with celery_worker environment variable set:
celery_worker=yes celery -A celery_try worker -l info
To discriminate between the celery worker and django, let's set a bash environment variable in the celery process, but not in the django process. I called that variable celery_worker. In order to set it, I prepended celery -A celery_try worker -l info with a per-command env variable assignment: celery_worker=yes. Now, in the python code, I check whether the environment variable is present. If it is, we are in the celery worker and need to import theano.
If we're in django, os.environ['celery_worker'] isn't defined and raises an exception.
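A slightly tidier variant of the same idea avoids the exception altogether (a sketch):
import os

# Truthy only when the worker was started with celery_worker=yes.
if os.environ.get('celery_worker'):
    import theano  # imported in the celery worker only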

django celery daemon doesn't work

Here's my celery app config:
from __future__ import absolute_import

import os

from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'tshirtmafia.settings')

app = Celery('tshirtmafia')
app.conf.update(
    CELERY_RESULT_BACKEND='djcelery.backends.database:DatabaseBackend',
)
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
settings.py:
INSTALLED_APPS:
'kombu.transport.django',
'djcelery',
also:
BROKER_URL = 'django://'
Here's my task:
@shared_task
def test():
    send_mail('nesamone bus', 'Files have been successfully generated.', 'marijus.merkevicius@gmail.com',
              ['marijus.merkevicius@gmail.com'], fail_silently=False)
Now, when I run python manage.py celeryd locally and then run test.delay() from the shell, it works.
Now I'm trying to deploy my app. With the exact same configuration, when I run python manage.py celeryd and run the test task from a shell in another window, it doesn't work.
I've also tried to setup background daemon like this:
/etc/default/celeryd configuration:
# Name of nodes to start, here we have a single node
CELERYD_NODES="w1"
# or we could have three nodes:
#CELERYD_NODES="w1 w2 w3"
# Where to chdir at start. (CATMAID Django project dir.)
CELERYD_CHDIR="/home/tshirtnation/"
# Python interpreter from environment. (in CATMAID Django dir)
ENV_PYTHON="/usr/bin/python"
# How to call "manage.py celeryd_multi"
CELERYD_MULTI="$ENV_PYTHON $CELERYD_CHDIR/manage.py celeryd_multi"
# How to call "manage.py celeryctl"
CELERYCTL="$ENV_PYTHON $CELERYD_CHDIR/manage.py celeryctl"
# Extra arguments to celeryd
CELERYD_OPTS="--time-limit=300 --concurrency=1"
# Name of the celery config module.
CELERY_CONFIG_MODULE="celeryconfig"
# %n will be replaced with the nodename.
CELERYD_LOG_FILE="/var/log/celery/%n.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"
# Workers should run as an unprivileged user.
CELERYD_USER="celery"
CELERYD_GROUP="celery"
# Name of the projects settings module.
export DJANGO_SETTINGS_MODULE="settings"
And I use default celery /etc/init.d/celeryd script.
So basically it seems like celeryd starts but doesn't work. I have no idea how to debug this or what might be wrong.
Let me know if you need anything else.
Celery turned out to be a very capricious child in the robust Django system, in my experience.
There is too little initial data here to pinpoint the reason for your problems.
The most common reason a Celery daemon fails is file system permissions.
But to narrow down the cause, I'd try the following.
Start celery from a command line as the user who owns the django project:
celery -A proj worker -l info
If that works OK, go further.
Start celery in verbose mode as the root user, just as the daemon would:
sudo sh -x /etc/init.d/celeryd start
This will reveal most of the problems with the daemon script (the celery user and group used), but unfortunately not all of them: permission failures are not visible.
One little remark of mine: usually Celery is started by its own celery user, while the django project runs as another one. After a long fight with celery and the system, I gave up on the celery user and ran the celery process as the django project user.
And do not forget to run, once:
update-rc.d celerybeat defaults
update-rc.d celeryd defaults
This is for the Ubuntu daemon start, of course.
Good luck
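Concretely, that last change is just two lines in the asker's /etc/default/celeryd (assuming the project user is tshirtnation, per CELERYD_CHDIR above):
# Run the worker as the django project user instead of a dedicated celery user
CELERYD_USER="tshirtnation"
CELERYD_GROUP="tshirtnation"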
