I'm using Celery, and it works for async tasks, but I need to schedule an operation for a specific datetime.
For example "on the 30th of August, 2019, at 11:35, do this."
My celery.py is very simple now, but it works:
import time
from datetime import datetime, timedelta
from datetime import date

from celery import Celery, current_task, shared_task

app = Celery()

@app.task
def test():
    print('1')
    todaynow = datetime.now()
    print(todaynow)
I call it from a view, and the print output shows up on the worker (via RabbitMQ).
Any idea how to schedule a task for a specific time?
Thanks.
EDIT:
I tried calling "test" from the view:

test.apply_async(eta=datetime(2019, 7, 31, 6, 28))

In Flower the task is received, but it is not executed. Why?
To run a task at a specific time, you can pass the eta parameter to apply_async:
test.apply_async(eta=datetime.datetime(2019, 8, 30, 11, 35))
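One thing worth checking when an eta task is "received" but never seems to run: by default (enable_utc=True) Celery treats naive datetimes as UTC, so an eta written in local time can end up scheduled hours later than intended. A minimal sketch using a timezone-aware datetime instead (the zone name here is only an example, adjust it to yours):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib in Python 3.9+

# An aware datetime removes the ambiguity: Celery converts it to UTC itself.
eta = datetime(2019, 8, 30, 11, 35, tzinfo=ZoneInfo("Europe/Rome"))  # example zone

# test.apply_async(eta=eta)  # then pass it exactly as before
```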
The Celery component responsible for scheduling tasks to run at a specific time, or repeatedly at some interval, is Celery Beat (the scheduler). The Celery documentation has a complete section describing it, including how to run it and how to configure it. If you are familiar with crontab, you will easily create your own scheduled task runs.
You can create a run-once periodic task for "on the 30th of August 2019, at 11:35, do this" like so:
import time
from datetime import datetime, timedelta
from datetime import date

from celery import Celery, current_task, shared_task
from celery.schedules import crontab

app = Celery()

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    sender.add_periodic_task(
        crontab(hour=11, minute=35, day_of_month=30, month_of_year=8),
        test.s(),
        expires=1,
    )

@app.task
def test():
    print('1')
    todaynow = datetime.now()
    print(todaynow)
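Note that tasks registered with add_periodic_task are only dispatched while a beat process is running alongside the worker. Assuming the module above is importable as proj (an assumption, adjust to your project name), a typical invocation is:

```shell
# Start a worker to consume the tasks
celery -A proj worker --loglevel=info

# In another terminal, start the beat scheduler that sends them on schedule
celery -A proj beat --loglevel=info
```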
To schedule a task you need to use Celery Beat.
from datetime import datetime

from celery.schedules import crontab
from celery.task import periodic_task

@periodic_task(run_every=crontab(minute="*"))  # runs the task every minute
def schedule_task():
    print('1')
    todaynow = datetime.now()
    print(todaynow)
You can schedule your task for any specific time using a periodic task.
To learn more, see https://docs.celeryproject.org/en/latest/userguide/periodic-tasks.html.
Don't forget to restart Celery Beat after creating the task.
Related
I need to run a job at 15-minute intervals between 11:15 am and 2:15 pm IST every day.
I tried the code below, but I still cannot get it to work.
Any answers would be really appreciated.
from apscheduler.schedulers.background import BackgroundScheduler, BlockingScheduler
from datetime import datetime

def my_job():
    print(f"custom job called at {datetime.now()}")

scheduler = BlockingScheduler()
scheduler.add_job(my_job, 'cron', minute='1', hour='12-14')
scheduler.start()
The easiest way I see to achieve those times is to combine multiple triggers with an OrTrigger. I also used cron step operators to make it run every 15 minutes. Pass the timezone to each CronTrigger; passing it to add_job alongside a ready-made trigger object has no effect:

from apscheduler.schedulers.blocking import BlockingScheduler
from apscheduler.triggers.combining import OrTrigger
from apscheduler.triggers.cron import CronTrigger

scheduler = BlockingScheduler()

trigger = OrTrigger([CronTrigger(hour='11', minute='15-59/15', timezone='Asia/Kolkata'),
                     CronTrigger(hour='12-13', minute='0-59/15', timezone='Asia/Kolkata'),
                     CronTrigger(hour='14', minute='0-15/15', timezone='Asia/Kolkata')])
scheduler.add_job(my_job, trigger)
scheduler.start()
Now I'm using Celery and Flower for async jobs.
When I define tasks.py like this:
import os
import time

from celery import Celery, Task

celery = Celery(__name__)
celery.conf.broker_url = os.environ.get("CELERY_BROKER_URL", "redis://localhost:6379")
celery.conf.result_backend = os.environ.get("CELERY_RESULT_BACKEND", "redis://localhost:6379")

@celery.task(name="create_task")
def create_task(task_type):
    time.sleep(int(task_type) * 10)
    return True
the executed tasks are shown on ${flower host}/tasks, but
when I define create_task() like this, the executed tasks aren't shown on ${flower host}/tasks:
class MyTask(Task):
    def run(self, task_type):
        time.sleep(int(task_type) * 10)
        return True

create_task = celery.register_task(MyTask())
Both of them execute tasks successfully, and I can see the number of executed tasks from there.
As far as I can see from the documentation, the task definitions are fine:
https://docs.celeryq.dev/en/stable/userguide/tasks.html#custom-task-classes
What's the difference?
I found the reason:
when I execute the task, I need to specify the argument names (in this case, task_type) explicitly.

task = create_task.delay(task_type=int(task_type))

sorry for my TED Talk.
I am not able to schedule the sub-schedule file (it should run only once).

30 1 * * * python3 majorfile.py

The major file is scripted like this:
import datetime
import time

import schedule

if today == some_specific_date:  # pseudocode: only set up the job on the target date
    def run1():
        exec(open('sub_file.py').read())
        return schedule.CancelJob

    schedule.every().day.at("05:30").do(run1)

(this job needs to run only once)
With the second part I am facing an issue (this part also has situations like concurrent events). It is not running. Can anyone help me with this?
Error: the pending jobs must be checked in a loop.
Code:
import datetime
from time import sleep

import schedule

def run1():
    exec(open('sub_file.py').read())
    return schedule.CancelJob

schedule.every().day.at("05:30").do(run1)

while True:
    schedule.run_pending()
    sleep(1)
I can't seem to figure out how to get this working. I want to run a function every ten seconds:
from __future__ import absolute_import, unicode_literals, print_function

from celery import Celery

import app as x  # the library which holds the func I want to run as a task

app = Celery(
    'myapp',
    broker='amqp://guest@localhost//',
)
app.conf.timezone = 'UTC'

@app.task
def shed_task():
    x.run()

@app.on_after_configure.connect
def setup_periodic_tasks(sender, **kwargs):
    # Calls shed_task() every 10 seconds.
    sender.add_periodic_task(10.0, shed_task.s(), name='add every 10')

if __name__ == '__main__':
    app.start()
Then when I run the script, it just shows me a bunch of commands I can use with Celery. How can I get this running? Do I have to run it from the command line or something?
Additionally, when I get it running, will I be able to see a list of completed tasks along with any errors?
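Running the module directly just invokes the Celery CLI (that is what app.start() does), which is why it prints the command list. The usual way, assuming the file above is saved as tasks.py (an assumption, adjust to your module name), is to start a worker with the beat scheduler embedded:

```shell
# -A tasks points at the module containing the Celery app
# -B embeds the beat scheduler in the worker process (fine for development;
#    in production, run a separate `celery -A tasks beat` process instead)
celery -A tasks worker -B --loglevel=info
```

The worker's console output then shows each task as it is sent, executed, and succeeds or fails.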
You can do it simply with the Python threading module, like below:
import time, threading

def foo():
    print(time.ctime())
    threading.Timer(10, foo).start()  # re-arm: schedule the next run in 10 seconds

foo()
I am trying to schedule a job in Python using the APScheduler package. This answer looked great, but its syntax is out of date. I went to the user guide for the current version 3, but I cannot find a basic example where I can pass a datetime object to the scheduler like in the linked answer.
from datetime import date
from datetime import datetime

from apscheduler.schedulers.background import BackgroundScheduler as Scheduler

# Start the scheduler
sched = Scheduler()
sched.start()

# Define the function that is to be executed
def my_job(text):
    print(text)

# Schedule job
job = sched.add_job(my_job, next_run_time=datetime(2015, 5, 11, 1, 5, 5), args=['Hello World!'])
This yields the error: No handlers could be found for logger "apscheduler.executors.default".