I am building an app in Flask, using RabbitMQ as the message broker and a Celery worker in the backend. I also use SocketIO to report the Celery worker's status back to the client. When I run my app I get the following error:
I would appreciate it if you could let me know why I get this error.
app.py
from flask import Flask, render_template
from flask_socketio import SocketIO

app = Flask(__name__)
app.config['SECRET_KEY'] = ''
app.config.update(
    CELERY_BROKER_URL='amqp://localhost//',
    CELERY_RESULT_BACKEND='amqp://localhost//'
)
socketio = SocketIO(app, message_queue='amqp://')
celery = make_celery(app)  # make_celery is defined in the module shown below

@app.route('/')
def my_form():
    return render_template("form.html")
JavaScript
var socket = io.connect(location.protocol + '//' + document.domain + ':' + location.port );
make_celery module
from celery import Celery

def make_celery(app):
    celery = Celery(app.import_name, backend=app.config['CELERY_RESULT_BACKEND'],
                    broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task

    class ContextTask(TaskBase):
        abstract = True

        def __call__(self, *args, **kwargs):
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)

    celery.Task = ContextTask
    return celery
Oops, the error message has been copy/pasted from another module, and I forgot to update it. The message should have read "Kombu requires a monkey patched socket library to work with gevent".
Basically, this is saying that without monkey patching, socket operations will block gevent. See http://www.gevent.org/gevent.monkey.html for more details about this.
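For illustration, a minimal sketch of what the monkey patching looks like; the patch has to run before anything else imports the standard socket module, so it goes at the very top of the entry-point script (the async_mode argument is normally auto-detected and is shown explicitly here only for clarity):

from gevent import monkey
monkey.patch_all()  # replace blocking stdlib primitives with gevent-friendly ones

from flask import Flask
from flask_socketio import SocketIO

app = Flask(__name__)
socketio = SocketIO(app, message_queue='amqp://', async_mode='gevent')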
Related
I am doing inference with a PyTorch machine learning model and want to use Celery and Redis to queue the predictions. I have written the inference code as a Flask REST API, and I am now confused about how to add the Celery tasks and the Redis message broker.
# Flask REST API code for inference
import flask
from flask import Flask, request

app = Flask(__name__)
response = {}

@app.route('/predict', methods=["POST", "GET"])
def predict():
    if request.method == 'POST':
        solute = request.form["solute"]
        solvent = request.form["solvent"]
    else:
        solute = request.args.get("solute")
        solvent = request.args.get("solvent")
    results = predictions(solute, solvent)  # predictions() comes from the model inference code
    response["predictions"] = results[0].item()
    response["interaction_map"] = (results[1].detach().numpy()).tolist()
    return flask.jsonify({'result': response})
# celery tasks
from celery import Celery, current_task
from celery.result import AsyncResult

REDIS_URL = 'redis://redis:6379/0'
BROKER_URL = 'redis://redis:6379/0'

CELERY = Celery('tasks',
                backend=REDIS_URL,
                broker=BROKER_URL)

def get_job(job_id):
    return AsyncResult(job_id, app=CELERY)
I have tried running it using celery worker -b redis://localhost:6379 --app= main.celery -l (where -b is the broker), but then I got the error
Error: No such option: -b.
What am I doing wrong?
I have a Flask application that uses the factory function pattern (from the intro tutorial), and I am attempting to offload a long-running job to a background worker with a Redis queue. I am invoking the background work from a Blueprint and am unable to pass the application context along with the invocation. The intention is to use the application context and the SQLite configuration to perform writes from the background thread. What am I missing here? I think this may be more of a "you just don't know enough about how Flask works" issue, and if that is the case, please let me know what I'm doing wrong! Thanks.
ERROR
RuntimeError: Working outside of application context.
db.py
import sqlite3

import click
from flask import current_app, g
from flask.cli import with_appcontext

def get_db():
    if 'db' not in g:
        g.db = sqlite3.connect(
            current_app.config['DATABASE'],
            detect_types=sqlite3.PARSE_DECLTYPES
        )
        g.db.row_factory = sqlite3.Row

    return g.db
Blueprint module:
from flask import (
    Blueprint, flash, g, redirect, render_template, request, url_for, Response, current_app
)
import sqlite3

from app.db import get_db
from rq import Queue
from redis import Redis

bp = Blueprint('perform_work', __name__)
q = Queue(connection=Redis())

def some_func():
    db = get_db()
    ...

def generate_work():
    result = q.enqueue(some_func)
    ...

@bp.route('/perform_work', methods=['POST'])
def perform_work():
    ...
    generate_work()
worker.py
import os
import redis
from rq import Worker, Queue, Connection

listen = ['default']
redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')
conn = redis.from_url(redis_url)

if __name__ == '__main__':
    with Connection(conn):
        worker = Worker(list(map(Queue, listen)))
        worker.work()
Alright, well, I'm glad I typed this out, I guess. The application context was never registered with the worker, which should have occurred in worker.py, so the worker had no idea about the application itself. Here's the updated worker.py that registers the app's context:
import os
import redis
from rq import Worker, Queue, Connection

from app import create_app

listen = ['default']
redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')
conn = redis.from_url(redis_url)

app = create_app()
app.app_context().push()

if __name__ == '__main__':
    with Connection(conn):
        worker = Worker(list(map(Queue, listen)))
        worker.work()
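With the context pushed, the worker can then be started directly (a sketch; this assumes the app package exposes create_app as in the tutorial's factory pattern), and any job it picks up, such as some_func above, can call get_db against that context:

python worker.py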
I have used this tutorial to set up Celery on my Flask application, but I keep getting the following error:
File "C:\Users\user\AppData\Local\Programs\Python\Python38\lib\site-packages\celery\app\base.py", line 141, in data
return self.callback()
celery.exceptions.ImproperlyConfigured:
Cannot mix new setting names with old setting names, please
rename the following settings to use the old format:
include -> CELERY_INCLUDE
Or change all of the settings to use the new format :)
What am I doing wrong? The code I used is basically the same as the tutorial's:
__init__.py
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from flask_migrate import Migrate
from celery import Celery
from config import Config  # assumed: the project's config module

app = Flask(__name__)
app.config.from_object(Config)
app.config['TESTING'] = True
db = SQLAlchemy(app)
migrate = Migrate(app, db)

def make_celery(app):
    celery = Celery(
        app.import_name,
        backend=app.config['CELERY_RESULT_BACKEND'],
        broker=app.config['CELERY_BROKER_URL']
    )
    celery.conf.update(app.config)

    class ContextTask(celery.Task):
        def __call__(self, *args, **kwargs):
            with app.app_context():
                return self.run(*args, **kwargs)

    celery.Task = ContextTask
    return celery

app.config.update(
    CELERY_BROKER_URL='redis://localhost:6379',
    CELERY_RESULT_BACKEND='redis://localhost:6379'
)
celery = make_celery(app)
The code you are using uses the old setting names. Change these lines:
app.config.update(
    CELERY_BROKER_URL='redis://localhost:6379',
    CELERY_RESULT_BACKEND='redis://localhost:6379'
)
to
app.config.update(
    broker_url='redis://localhost:6379',
    result_backend='redis://localhost:6379'
)
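Note that make_celery in the question still reads the old keys (app.config['CELERY_RESULT_BACKEND'] and app.config['CELERY_BROKER_URL']), so if you switch to the new lowercase names, those lookups have to change as well. A sketch of the adjusted constructor call:

celery = Celery(
    app.import_name,
    backend=app.config['result_backend'],
    broker=app.config['broker_url']
)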
A similar Celery/Flask implementation was working fine on Ubuntu with Python 3.8, but gave errors like the above on Windows 10 (Python 3.9.0 and Celery 4.4.6 (cliffs)).
It finally worked for me by adding -P solo to the celery worker command (ref - https://github.com/celery/celery/issues/3759)
$ celery worker -A proj -Q yourqueuename -P solo --loglevel=INFO
When I try to send a socket message from outside the SocketIO event context, the message does not arrive at the client.
Method outside of the context:
@main.route('/import/event', methods=['POST'])
def update_client():
    from .. import socketio
    userid = request.json['userid']
    data = request.json
    current_app.logger.info("New data: {}".format(data))
    current_app.logger.info("Userid: {}".format(userid))
    socketio.emit('import', data, namespace='/events', room=userid)
    return 'ok'
I also tried:
with current_app.app_context():
    socketio.emit('import', data, namespace='/events', room=userid)
On the SocketIO context, the 'connect' handler:
@socketio.on('connect', namespace='/events')
def events_connect():
    current_app.logger.info("CONNECT {}".format(request.namespace))
    userid = request.sid
    current_app.clients[userid] = request.namespace
The method update_client will be called from a thread.
On the Client side:
$(document).ready(function(){
    var socket = io.connect('http://' + document.domain + ':' + location.port + '/events');
    var userid;

    socket.on('connect', function() {
        console.log("on connect");
    });

    socket.on('import', function (event) {
        console.log("On import " + event);
    });
});
When I call emit('import', 'test') in the @socketio.on('connect') handler, the message arrives at the client and the log message is printed.
There is an example in the documentation:
@app.route('/ping')
def ping():
    socketio.emit('ping event', {'data': 42}, namespace='/chat')
Am I missing something, or why does the message not arrive at the client?
Edit:
The socketio object is created in app/__init__.py:
socketio = SocketIO()

def create_app():
    app = Flask(__name__)
    socketio.init_app(app)
manage.py
from app import socketio

if __name__ == '__main__':
    socketio.run(app)
Flask-SocketIO==2.6
eventlet==0.19.0
I found a solution.
When I run the application with the flask internal server, the messages are not received by the client.
python manage.py run
But when I run the server with gunicorn, everything works like a charm.
So the solution here is to use gunicorn with eventlet.
gunicorn --worker-class eventlet -w 1 manage:app
I use Flask 0.11 with Flask-Migrate 2.0. Perhaps I missed something, since Flask 0.11 has a new startup command.
I have a Flask app that roughly looks like this:
import json

from flask import Flask, request

app = Flask(__name__)

@app.route('/', methods=['POST'])
def foo():
    data = json.loads(request.data)
    # do some stuff
    return "OK"
Now, in addition, I would like to run a function every ten seconds from that script, and I don't want to use sleep for that. I have the following Celery script as well:
from celery import Celery
from datetime import timedelta

celery = Celery('__name__')

CELERYBEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': 'tasks.add',
        'schedule': timedelta(seconds=10)
    },
}

@celery.task(name='tasks.add')
def hello():
    app.logger.info('run my function')
The script works fine, but the logger.info is not executed. What am I missing?
Do you have a Celery worker and Celery beat running? Scheduled tasks are handled by beat, which queues the task when appropriate; the worker then actually crunches the numbers and executes your task.
celery worker --app myproject --loglevel=info
celery beat --app myproject
Your task, however, looks like it's calling the Flask app's logger. When using the worker, you probably don't have the Flask application around (since it's in another process). Try using a normal Python logger for the demo task.
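For example, a minimal sketch using Celery's task logger (a plain logging.getLogger(__name__) would work just as well) instead of the Flask app's logger:

from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

@celery.task(name='tasks.add')
def hello():
    logger.info('run my function')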
Well, celery beat can be embedded in a regular celery worker as well, with the -B parameter in your command.
celery worker --app myproject --loglevel=info -B
This is only recommended for the development environment. For production, you should run beat and the celery workers separately, as the documentation mentions. Otherwise, your periodic task will run more than once.
A Celery task by default runs outside of the Flask app context and thus won't have access to the Flask app instance. However, it's very easy to create the Flask app context while running a task by using the app_context method of the Flask app object.
app = Flask(__name__)
celery = Celery(app.name)

@celery.task
def task():
    with app.app_context():
        app.logger.info('running my task')
This article by Miguel Grinberg is a very good place to get a primer on the basics of using Celery in a Flask application.
First, install Redis on the machine and check whether it is running.
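For example, a quick check (assuming the redis-cli client is installed alongside the server):

redis-cli ping
# should reply with PONG if the server is up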
Install the Python dependencies:
celery
redis
flask
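A sketch of installing them with pip:

pip install celery redis flask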
folder structure
project/
    app/
        __init__.py
        task.py
    main.py
write task.py
from celery import Celery
from celery.schedules import crontab
from app import app
from app.scrap import product_data
from celery.utils.log import get_task_logger

logger = get_task_logger(__name__)

def make_celery(app):
    # Celery configuration
    app.config['CELERY_BROKER_URL'] = 'redis://127.0.0.1:6379'
    app.config['CELERY_RESULT_BACKEND'] = 'db+postgresql://user:password@172.17.0.3:5432/mydatabase'
    app.config['CELERY_RESULT_EXTENDED'] = True
    app.config['CELERYBEAT_SCHEDULE'] = {
        # Executes every minute
        'periodic_task-every-minute': {
            'task': 'periodic_task',
            'schedule': crontab(minute="*")
        }
    }

    celery = Celery(app.import_name, broker=app.config['CELERY_BROKER_URL'])
    celery.conf.update(app.config)
    TaskBase = celery.Task

    class ContextTask(TaskBase):
        abstract = True

        def __call__(self, *args, **kwargs):
            with app.app_context():
                return TaskBase.__call__(self, *args, **kwargs)

    celery.Task = ContextTask
    return celery

celery = make_celery(app)

@celery.task(name="periodic_task", bind=True)
def testing(self):
    file1 = open("../myfile.txt", "a")
    # writing newline character
    file1.write("\n")
    file1.write("Today")
    # faik
    print("Running")
    self.request.task_name = "state"
    logger.info("Hello! from periodic task")
    return "Done"
write __init__.py
from flask import Flask, Blueprint, request
from flask_restx import Api, Resource, fields
from flask_sqlalchemy import SQLAlchemy
import redis
from rq import Queue

app = Flask(__name__)
app.config['SECRET_KEY'] = '7c09ebc8801a0ce8fb82b3d2ec51b4db'
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///site.db'
db = SQLAlchemy(app)
Commands to run the Celery beat and worker:
celery -A app.task.celery beat
celery -A app.task.celery worker --loglevel=info