views.py:
class UserCreate(Resource):
    def post(self):
        celery = create_user_employee.delay(auth_header, form_data, employee_id, employee_response)
        if "exc" in celery.get:
            return deliver_response(False, 'Celery task failed'), 400
        return deliver_response(True, message="User Creation in Progress", status=200, data=data), 200
tasks.py:
@celery.task(name='accounts.tasks.create_user_employee')
def create_user_employee(auth_header, form_data, employee_id, employee_response):
    try:
        if "streetLane" in form_data:
            user_id = form_data.get("personalEmail", "")
            employee_address = address_post__(form_data, auth_header, user_id)
        return deliver_response(True, message="success", status=200), 200
    except Exception as e:
        return deliver_response(False, str(e), status=500), 500
Note: I am not able to return the response from tasks.py to the Flask app. The objective is to break out of the views.py function if tasks.py returns an error response, but I am not able to import or print the result in views.py. Any help would be great.
Since Celery runs tasks asynchronously, you will not get an immediate response from the job. So this line:
celery = create_user_employee.delay(auth_header, form_data, employee_id, employee_response)
doesn't give you the result of the task; the celery variable doesn't tell you anything about whether the task completed. Also, don't use celery as a variable name.
To fetch the result of a Celery task:
from celery.result import AsyncResult
res = AsyncResult(job_id)
The res object has res.status and res.result, which give you the status and the result of the task with that job_id. You can set your own job id when you call the task.
When to fetch those results is up to you: either poll the job status periodically, or check it whenever the result is required.
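As a minimal sketch of the polling approach (wait_for_task, timeout, and interval are illustrative names, not part of the original code; AsyncResult(job_id) resolves against the default Celery app, so this assumes your app is configured and importable):

import time
from celery.result import AsyncResult

def wait_for_task(job_id, timeout=30, interval=1):
    """Poll a Celery task until it finishes or the timeout expires."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        res = AsyncResult(job_id)
        if res.ready():  # True for SUCCESS, FAILURE and REVOKED
            return res.status, res.result
        time.sleep(interval)
    return 'PENDING', None

Keep in mind that blocking a request handler on a loop like this defeats the point of offloading the work, so in a web view you would usually return the task id immediately and let the client poll.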
I would expect an error to be thrown at the place where you have if "exc" in celery.get: ... something like TypeError: argument of type 'method' is not iterable.
Did you try to slightly change that line to if "exc" in celery.get(): (notice the parentheses)?
Also, giving the result of delay() the name "celery" is misleading: whoever reads your code may think it is an instance of the Celery class, which it is not. It is an instance of AsyncResult, and get() is just one of its methods.
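Putting both fixes together, a rough sketch of the corrected view (assuming the task and deliver_response from the question; note that calling .get() blocks the request until the task finishes, which largely defeats the point of queuing it):

class UserCreate(Resource):
    def post(self):
        # name the AsyncResult for what it is, not "celery"
        task_result = create_user_employee.delay(
            auth_header, form_data, employee_id, employee_response
        )
        try:
            # .get() (with parentheses) returns the task's return value,
            # or re-raises the exception the task raised
            body, status = task_result.get(timeout=30)
        except Exception:
            return deliver_response(False, 'Celery task failed'), 400
        return deliver_response(True, message="User created", status=200, data=body), 200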
Related
In the following code, an API hands a task to a task broker, which puts it in a queue, where it is picked up by a worker. The worker executes the task and notifies the task broker (over a Redis message channel) that it is done, after which the task broker removes the task from its queue. This works.
What I'd like is for the task broker to then return the result of the task to the API, but I'm unsure how to do that, since this is asynchronous code and I'm having difficulty figuring it out. Can you help?
Simplified, the code is roughly as follows (and incomplete).
The API code:
@router.post('', response_model=BaseDocument)
async def post_document(document: BaseDocument):
    """Create the document with a specific type and an optional name given in the payload"""
    task = DocumentTask({ <SNIP>
    })
    task_broker.give_task(task)
    result = await task_broker.get_task_result(task)
    return result
The task broker code: the first part gives the task, the second part removes it, and the final part is what I assume should be a blocking call on the status of the removed task.
def give_task(self, task_obj):
    self.add_task_to_queue(task_obj)
    <SNIP>
    self.message_channel.publish(task_obj)

# ...

def remove_task_from_queue(self, task):
    id_task_to_remove = task.id
    for i in range(len(task_queue)):
        if task_queue[i]["id"] == id_task_to_remove:
            removed_task = task_queue.pop(i)
            logger.debug(
                f"[TaskBroker] Task with id '{id_task_to_remove}' successfully removed!"
            )
            removed_task["status"] = "DONE"
            return

# ...

async def get_task_result(self, task):
    return task.result
My intuition is to implement get_task_result so that it blocks on task.result until it is set, and to set it in remove_task_from_queue when the task is removed from the queue (and thus done).
Any idea how to do this asynchronously?
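One way to sketch that intuition, assuming each task object can carry an asyncio.Event (the done attribute below is hypothetical, not in the original code): get_task_result awaits the event, and remove_task_from_queue sets it once the task is done.

import asyncio

class DocumentTask:
    def __init__(self, payload):
        self.payload = payload
        self.result = None
        self.done = asyncio.Event()  # hypothetical completion signal

class TaskBroker:
    # ...
    def remove_task_from_queue(self, task):
        # ... existing removal logic ...
        task.result = {"status": "DONE"}
        task.done.set()  # wake up whoever is awaiting this task

    async def get_task_result(self, task, timeout=30):
        # waits without blocking the event loop
        await asyncio.wait_for(task.done.wait(), timeout)
        return task.result

If the Redis message handler that calls remove_task_from_queue runs on a different thread than the event loop, set the event with loop.call_soon_threadsafe(task.done.set) instead of calling it directly.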
I am new to Celery but am failing at what should be simple:
The backend and broker are both configured for RabbitMQ.
The task is as follows:
@app.task
def add(x, y):
    return x + y
Test Code:
File 1:
from tasks import add
from celery import uuid

task_id = uuid()
result = add.delay(7, 2)
task_id = result.task_id
print(task_id)
# output: 05f3f783-a538-45ed-89e3-c836a2623e8a

print(result.get())
# output: 9
File 2:
from tasks import add
from celery.result import AsyncResult

res = AsyncResult('05f3f783-a538-45ed-89e3-c836a2623e8a')
print(res.state)
# output: PENDING

print('Result = %s' % res.get())
My understanding is that File 2 should retrieve the state SUCCESS and the result 9.
I have installed Flower, and it reports SUCCESS and 9 for the result.
Help. This is driving me nuts.
Thank you
Maybe you should read the FineManual and think twice?
RPC Result Backend (RabbitMQ/QPid)
The RPC result backend (rpc://) is special as it doesn't actually store the states, but rather sends them as messages. This is an important difference, as it means that a result can only be retrieved once, and only by the client that initiated the task. Two different processes can't wait for the same result.
(...)
The messages are transient (non-persistent) by default, so the results will disappear if the broker restarts. You can configure the result backend to send persistent messages using the result_persistent setting.
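That explains the symptom: File 1 already consumed the result, so File 2's AsyncResult stays PENDING. A minimal sketch of one fix, assuming you can run a store-based result backend (Redis here; a database or MongoDB backend would work the same way) so that states are actually stored and readable from a second process:

from celery import Celery

# keep RabbitMQ as the broker, but store results instead of
# sending them as one-shot messages
app = Celery(
    'tasks',
    broker='amqp://guest@localhost//',
    backend='redis://localhost:6379/0',
)

@app.task
def add(x, y):
    return x + y

With a store-based backend, File 2 can read the state and result for the same task id, even after File 1 has already fetched them.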
In a Flask app, I need to execute another task's checkJob function (which checks the job status and emails the user) after executing return render_template(page). The user should see the confirm page while a background job keeps checking the job status.
I tried to use Celery (https://blog.miguelgrinberg.com/post/using-celery-with-flask) for the background job, but it does not work: anything after return render_template(page) is not executed.
Here's the code fragment:
#app.route("/myprocess", methods=['POST'])
def myprocess():
//.... do work
#r = checkJob()
return render_template('confirm.html')
r = checkJob()
#celery.task()
def checkJob():
bb=1
while bb == 1:
print "checkJob"
time.sleep(10)
As suggested in the comments, you should use apply_async().
#app.route("/myprocess", methods=['POST'])
def myprocess():
#.... do work
r = checkJob.apply_async()
return render_template('confirm.html')
Note that, as with the example, you do not want to invoke checkJob() but rather keep it like checkJob.
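For reference, delay() is just a shortcut for apply_async() without options, while apply_async() also accepts scheduling options; a small sketch (the countdown value is purely illustrative):

# equivalent ways to enqueue the task
r = checkJob.delay()
r = checkJob.apply_async()

# apply_async() also accepts options, e.g. run 60 seconds from now
r = checkJob.apply_async(countdown=60)

# r is an AsyncResult; keep r.id if you want to check on the job later
print(r.id, r.state)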
I'm trying to create multiple Google Docs in a background task.
I'm trying to use the Task Queue from Google App Engine, but I must be misunderstanding something, because I keep getting this message:
INFO 2016-05-17 15:38:46,393 module.py:787] default: "POST /update_docs HTTP/1.1" 302 -
WARNING 2016-05-17 15:38:46,393 taskqueue_stub.py:1981] Task task1 failed to execute. This task will retry in 0.800 seconds
Here is my code. I make multiple calls to the UpdateDocs handler, which needs to be executed from the queue.
# Creates a GDoc (called from the queue)
class UpdateDocs(BaseHandler):
    @decorator.oauth_required
    def post(self):
        try:
            http = decorator.http()
            service = discovery.build("drive", "v2", http=http)
            # create the file
            docs_name = self.request.get('docs_name')
            body = {
                'mimeType': DOCS_MIMETYPE,
                'title': docs_name,
            }
            service.files().insert(body=body).execute()
        except AccessTokenRefreshError:
            self.redirect("/")
# Creates multiple GDocs by enqueuing tasks
class QueueMultiDocsCreator(BaseHandler):
    def get(self):
        try:
            for i in range(5):
                name = "File_n" + str(i)
                taskqueue.add(
                    url='/update_docs',
                    params={
                        'docs_name': name,
                    })
            self.redirect('/files')
        except AccessTokenRefreshError:
            self.redirect('/')
I can see the push queue in the App Engine console, and every task is inside it, but they never run, and I don't get why.
Kindly try to specify the worker module in your code.
As shown in Creating a new task, the taskqueue.add() call targets the module named worker and invokes its handler by setting the URL /update_counter.
class EnqueueTaskHandler(webapp2.RequestHandler):
    def post(self):
        amount = int(self.request.get('amount'))
        task = taskqueue.add(
            url='/update_counter',
            target='worker',
            params={'amount': amount})
        self.response.write(
            'Task {} enqueued, ETA {}.'.format(task.name, task.eta))
And from what I have read in this blog, a worker is an important piece of a task queue: it is a Python process that reads jobs from the queue and executes them one at a time.
I hope that helps.
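As a hedged sketch of the worker side (the handler name and counter logic are illustrative, following the App Engine docs pattern): the worker module serves /update_counter and must answer with a 2xx status, since Task Queue treats anything else, like the 302 redirect in your log, as a failure and retries the task.

import webapp2

class UpdateCounterHandler(webapp2.RequestHandler):
    """Runs in the 'worker' module; Task Queue POSTs each task here."""
    def post(self):
        amount = int(self.request.get('amount'))
        # ... do the actual work with `amount` ...
        # a 2xx response marks the task as done; anything else retries it
        self.response.set_status(200)

app = webapp2.WSGIApplication([
    ('/update_counter', UpdateCounterHandler),
], debug=True)

The worker module itself is declared in its own module configuration file (e.g. worker.yaml) so that target='worker' has somewhere to route to.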
I am using Celery with a Flask app to launch some background tasks, with MongoDB as the result backend.
I would like to store in the backend some information about the task being launched, and then be able to retrieve it.
I believe the key is the use of self.update_state(state=..., meta={...}), where meta holds my custom information, but I cannot get anything to work.
Assume you have a task like this:
@celery.task(bind=True)
def counter(self):
    for i in range(100):
        time.sleep(1)
        self.update_state(state='PROGRESS', meta={'current': i})
    return {'status': 'complete'}
and a Flask route like this:
@app.route('/count/')
def count_100():
    """
    Start a counter task and return a response immediately.
    """
    task = counter.delay()
    # return an empty JSON body with a 202 status code, meaning the
    # request is still in progress, plus a Location header to poll
    return jsonify(), 202, dict(Location=url_for('status', task_id=task.id))
and finally your task status route is something like this:
from celery.result import AsyncResult
...

@app.route('/status/<task_id>/')
def status(task_id):
    task = AsyncResult(task_id)  # retrieve the task we started
    if task.state == 'PROGRESS':
        response = {
            'state': task.state,
            'current': task.info.get('current', 0),
        }
    else:
        # PENDING, SUCCESS, FAILURE, ...; once the task has finished,
        # task.info holds its return value (or the raised exception)
        response = {'state': task.state}
    return jsonify(response)
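A hedged usage sketch (host and port are assumptions): the client starts the task, then polls the Location header it got back until the state leaves the in-progress states.

import time
from urllib.parse import urljoin
import requests  # assumes the requests library is installed

# start the counter task; the Location header points at the status route
start = requests.get('http://localhost:5000/count/')
status_url = urljoin('http://localhost:5000/', start.headers['Location'])

# poll until the task reports something other than an in-progress state
while True:
    info = requests.get(status_url).json()
    if info['state'] not in ('PENDING', 'PROGRESS'):
        break
    print(info)  # e.g. {'state': 'PROGRESS', 'current': 42}
    time.sleep(1)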