Heroku scheduled task with Postgres connection and Python

I have a web app running on Heroku using Flask and SQLAlchemy. I am now wondering how I can set up a scheduled task that runs daily and does some database-related work (deleting some rows, if you need to know :)
The documentation on Heroku recommends APScheduler, but I would like to do it with Heroku Scheduler. Regardless of that decision, I would like to know how to connect to my Postgres database from such a scheduler task. I could not find any example or hint for that.
Thanks for your time,
Torsten

Heroku Scheduler will run any command you throw at it. The typical way would be to create a Python script/command as part of your Flask app, for example with Flask-Script (http://flask-script.readthedocs.org/en/latest/). Then within the scheduler you would schedule it with something like:
python manage.py mytask
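As a rough sketch of such a manage.py using Flask-Script (the myapp package, the db handle, and the Entry model with its expired column are illustrative names, not from the question):

# manage.py -- a minimal sketch; myapp, db, and Entry are hypothetical
from flask_script import Manager

from myapp import app, db          # your Flask app and SQLAlchemy handle
from myapp.models import Entry     # whatever model holds the rows to delete

manager = Manager(app)

@manager.command
def mytask():
    # Daily cleanup: delete expired rows, then commit.
    with app.app_context():  # make the database context explicit
        Entry.query.filter(Entry.expired.is_(True)).delete(synchronize_session=False)
        db.session.commit()

if __name__ == '__main__':
    manager.run()

Because the command runs through the same Flask app, it picks up the same database configuration as the web process (on Heroku, typically derived from the DATABASE_URL config var), so no separate connection setup is needed.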

Related

Run a script every day in a django-rest API on Heroku

I want to run a particular script every day in the morning for my REST API deployed on Heroku. The script itself is only a database filler. How can that be done? I forgot to mention that I'm using Django REST framework.
There are quite a few options. Here's what I do on an app I recently built that regularly scrapes a number of news websites.
In the root of my project I created a clock.py and then used apscheduler to create a scheduled job. Here's a simple version:
from apscheduler.schedulers.blocking import BlockingScheduler

# IMPORTANT! Have to make sure Django is set up before anything else runs
from django import setup
setup()

sched = BlockingScheduler()

@sched.scheduled_job('interval', days=1)
def timed_job():
    # do something here
    ...

sched.start()
I deployed my project on Heroku, so I then went through the process of spinning up a separate dyno (a clock process) to run my clock.py file. Once it was up and running, the process ran as intended.
There are a number of options besides interval that you can use to specify when you want the job to run. You can use cron-style scheduling if you want it to run at specific times and dates (see the example below). You can read more about that in the APScheduler documentation.
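For instance, a cron-style job reusing the sched object from the snippet above might look like this (the time is illustrative):

@sched.scheduled_job('cron', hour=6, minute=30)
def daily_morning_job():
    # runs every day at 06:30 instead of on a rolling interval
    ...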

Scheduled Python tasks in Heroku Application

I'm hosting a Flask application on Heroku (free tier) which acts as an API and reads from an SQLite database file.
When the project ran on my computer, I had scheduled Python scripts which would run every night and append new data to my SQLite database, which in turn could be used by my Flask application.
However, hosted on Heroku, I don't think I will be able to run my Flask application and a Python script 24/7. I know there is an alternative solution, APScheduler on Flask, which would carry out tasks in Python functions inside the Flask application. However, according to Heroku's free-tier guidelines, if there is no traffic to my page for 30 minutes, the application will "sleep." I'm assuming that means scheduled tasks will no longer run once the application is asleep, which defeats the purpose of using APScheduler.
Are there any alternatives I could use to go about this?

How can I run a function asynchronously to do calculations in parallel on Heroku with a Django app?

I have to run a function 500 times with different arguments in my Django app on Heroku (hobby plan), at a specific time and in the shortest period possible. I noticed that when I use Heroku Scheduler, every task runs in parallel and asynchronously, each with its own worker. So, for example, 10 functions run this way finish in about the same time as a single one. As mentioned, I need to run 500 functions with different arguments. I could create 500 Heroku Scheduler jobs and run them separately, but that doesn't seem to be supported by Heroku, or maybe I am wrong? If so, maybe someone knows how this could be solved another way?
Heroku doesn't support running that many workers at the same time on the hobby plan.
You can use Celery to run this asynchronously with the number of workers you want. The Heroku hobby plan only supports 1 worker, but Celery will at least run your tasks in the background (if that helps).
If you want to go with Celery, there's a guide to getting started with Celery on Django.
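As a rough sketch of what that could look like (the task name, broker URL, and arguments are illustrative, not from the question):

# tasks.py -- a minimal Celery app; the Redis broker URL is an assumption
from celery import Celery

app = Celery('tasks', broker='redis://localhost:6379/0')

@app.task
def calculate(arg):
    ...  # the actual computation goes here

if __name__ == '__main__':
    # enqueue all 500 calls at once; the single worker then
    # drains the queue in the background
    for arg in range(500):
        calculate.delay(arg)

With one hobby-plan worker the 500 tasks run sequentially rather than in parallel, but they are queued immediately and processed without blocking the web dyno.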

Using celery with django app and backend server

I have a follow-on / clarification question related to an older question.
I have 2 servers (for now). One server runs a Django web application. The other server runs pure Python scripts that are cron-scheduled data acquisition & processing jobs for the web app.
There is a use case where user activity in the web application (updating a certain field) should trigger a series of actions by the backend server. I could stick with cron, but as we scale up I can imagine running into trouble. Celery seems like a good solution, except I'm unclear how to implement it. (Yes, I did read the getting started guide.)
I want the web application to send tasks to a specific queue but the backend server to actually execute the work.
Assuming that both servers are using the same broker URL:
- Do I need to define stub tasks in Django, or can I just use the celery.send_task method?
- Should I still be using django-celery?
- Meanwhile, will the backend server be running Celery with the full implementation of the tasks and workers?
I decided to try it and work through any issues that came up.
On my Django server, I did not use django-celery. I installed celery and redis (via pip) and followed most of the instructions in the First Steps with Django guide:
- updated the proj/proj/settings.py file to include the bare minimum of configuration for Celery, such as the BROKER_URL
- created the proj/proj/celery.py file, but without the task defined at the bottom (see the sketch below)
- updated the proj/proj/__init__.py file as documented
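A sketch of that proj/proj/celery.py, roughly following the Celery 3.x "First Steps with Django" guide the answer references (minus the example task at the bottom):

from __future__ import absolute_import
import os

from celery import Celery
from django.conf import settings

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
app.config_from_object('django.conf:settings')  # reads BROKER_URL etc. from settings.py
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)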
Since the server running Django wasn't actually going to execute any Celery tasks, in the view that would trigger a task I added the following:
from proj.celery import app as celery_app

try:
    # send it to celery for backend processing
    celery_app.send_task('tasks.mytask',
                         kwargs={'some_id': obj.id, 'another_att': obj.att},
                         queue='my-queue')
except Exception as err:
    print('Issue sending task to Celery')
    print(err)
The other server had the following installed: celery and redis (I used an AWS ElastiCache Redis instance for this testing).
This server had the following files:
- celeryconfig.py with all of my Celery configuration and queues defined, pointing to the same BROKER_URL as the Django server (see the sketch below)
- tasks.py with the actual code for all of my tasks
The celery workers were then started on this server, using the standard command: celery -A tasks worker -Q my-queue1,my-queue2
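For reference, a minimal sketch of those two files (the queue names match the command above; the broker URL and task body are illustrative):

# celeryconfig.py -- same BROKER_URL as the Django server
from kombu import Queue

BROKER_URL = 'redis://my-elasticache-host:6379/0'  # assumption: your actual broker URL
CELERY_QUEUES = (
    Queue('my-queue1'),
    Queue('my-queue2'),
)

# tasks.py -- the actual task implementations
from celery import Celery

app = Celery('tasks')
app.config_from_object('celeryconfig')

@app.task
def mytask(some_id=None, another_att=None):
    ...  # real processing goes here

Because the module is named tasks.py, the task registers as 'tasks.mytask', which is exactly the name the Django server passes to send_task above.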
For testing, the above worked. Now I just need to make Celery run in the background and optimize the number of workers per queue.
If anyone has additional comments or improvements, I'd love to hear them!

Celery with Django - deployment

I am considering using Celery in my project. I found a lot of information about how to use it, etc. What I am interested in is how to deploy/package my solution.
I need to run two components: the Django app, and a celeryd worker (the component that sends emails). For example, I would like my Django app to use an email_ticket task that would email support tickets. I created tasks.py in the Django app:
from celery.task import task

@task
def email_ticket(sender, message):  # "from" is a reserved word in Python, so the parameter is renamed here
    ...
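The app would then queue the task with its delay method, along these lines (the import path and arguments are hypothetical):

# somewhere in a Django view; adjust the import to wherever tasks.py lives
from tasks import email_ticket

email_ticket.delay('user@example.com', 'Support ticket body goes here')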
Do I deploy my Django app and then just run celeryd as a separate process from the same path?
./manage.py celeryd ...
What about workers on different servers? Do I deploy the whole Django application and run only celeryd? I understand I could use Celery only for the workers, but I would also like to use celerycam and celerybeat.
Any feedback is appreciated. Thanks.
This is covered in the documentation here. The gist is that you need to download some init scripts and set up some config. Once that's done, celeryd will start on boot and you'll be off and running.
