How to create a scheduler in FastAPI (uvicorn server) - Python

I am working on a Python service in which I have to create one FastAPI REST endpoint, and alongside it two scheduler tasks should run in the background.
How can we implement a scheduler in FastAPI?
If that's not feasible, what would be the best way to implement such a requirement? Would creating two services be the right way?
Thanks & regards,
Aru
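One common way to handle this inside a single service is to start a scheduler from the FastAPI application itself, so the REST endpoint and the background jobs share one uvicorn process. Below is a minimal sketch assuming APScheduler and a FastAPI version that supports the lifespan parameter; the two job functions and their schedules are hypothetical placeholders for the real tasks.

    # Minimal sketch: two background jobs running inside the same uvicorn process.
    # Assumes APScheduler is installed (pip install apscheduler); task_one/task_two
    # and their schedules are hypothetical placeholders.
    from contextlib import asynccontextmanager

    from apscheduler.schedulers.asyncio import AsyncIOScheduler
    from fastapi import FastAPI

    scheduler = AsyncIOScheduler()


    def task_one():
        print("running task one")


    def task_two():
        print("running task two")


    @asynccontextmanager
    async def lifespan(app: FastAPI):
        # Register the jobs and start the scheduler on the event loop uvicorn
        # runs the app on, then shut it down cleanly when the app stops.
        scheduler.add_job(task_one, "interval", minutes=5)
        scheduler.add_job(task_two, "cron", hour=0)
        scheduler.start()
        yield
        scheduler.shutdown()


    app = FastAPI(lifespan=lifespan)


    @app.get("/health")
    async def health():
        return {"status": "ok"}

With this layout a second service is not strictly necessary, though splitting the scheduler out (or moving to something like Celery beat) is still reasonable if the jobs are heavy enough to interfere with request handling.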

Related

Django: how best to perform API requests for large jobs

I need some direction as to how to achieve the following functionality using Django.
I want my application to enable multiple users to submit jobs to make calls to an API.
Each user job will require multiple API calls and will store the results in a db or a file.
Each user should be able to submit multiple jobs.
In case of a failure, such as the network being blocked or the API not returning results, I want the application to pause for a while and then resume completing that job.
Basically, I want the application to pick up from where it left off.
Any ideas on how I could implement this, any technologies such as Celery I should be looking at, or even an open-source project where I can learn how to do this, would be a great help.
You can do this with RabbitMQ and Celery.
This post might be helpful:
https://medium.com/@ffreitasalves/executing-time-consuming-tasks-asynchronously-with-django-and-celery-8578eebab356
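To get the pause-and-resume behaviour, each job can be wrapped in a Celery task that tracks how far it got and retries itself after a delay on failure. A minimal sketch assuming a RabbitMQ broker; fetch_page() and save_result() are hypothetical helpers standing in for the real API call and the db/file persistence.

    # Minimal sketch of a resumable Celery job, assuming a RabbitMQ broker.
    # fetch_page() and save_result() are hypothetical helpers standing in for
    # the real API call and the db/file persistence.
    from celery import Celery

    app = Celery("jobs", broker="amqp://guest@localhost//")


    def fetch_page(job_id, page):
        """Hypothetical: perform one API call for this job; return None when done."""
        ...


    def save_result(job_id, page, result):
        """Hypothetical: store the result of one call in a db or a file."""
        ...


    @app.task(bind=True, max_retries=5, default_retry_delay=300)
    def run_user_job(self, job_id, page=0):
        """Work through one user's job, resuming from the last completed page."""
        try:
            while True:
                result = fetch_page(job_id, page)
                if result is None:  # no more calls left for this job
                    break
                save_result(job_id, page, result)
                page += 1
        except ConnectionError as exc:
            # Pause, then retry later starting from the page we got stuck on.
            raise self.retry(exc=exc, args=(job_id, page))

Each job a user submits then becomes one run_user_job.delay(job_id) call, and Celery's retry machinery handles the pause and the pick-up-where-it-left-off part.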

Consuming webservice from multiple sources and save to Db

I am building an application in Django that collects hotel information from various sources and converts this data into a uniform format. Thereafter I need to expose an API, using django-rest-framework, to give web apps and devices access to the hotel data.
So, for example, suppose I have 4 sources:
[HotelPlus, xHotelService, HotelSignup, HotelSource]
So please let me know the best implementation practice in terms of Django. Being a PHP developer, I would prefer to do this by writing custom third-party services implementing a common interface, so that adding more sources becomes easy. That way I only need to call the execute() method from the cron task and the rest is done by the service controller (fetching the feed and populating it in the database).
But I am new to Python and Django, so I don't have much idea of whether creating services or middleware is the right fit for this task.
For fetching data from the sources you will need dedicated worker processes and a broker, so that your main Django process won't be blocked. You can use Celery for that, and it already supports Django.
After writing the tasks for fetching and formatting the data, you will need a scheduler to call these tasks periodically. You can use Celery beat for that.
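A minimal sketch of that layout, assuming Celery with a Redis broker; the SOURCES registry and the per-source fetcher classes are hypothetical stand-ins for the interface-per-source services described in the question.

    # Minimal sketch: one Celery task per source, run on a schedule by celery beat.
    # Assumes a Redis broker; SOURCES and the fetcher classes are hypothetical
    # stand-ins for the per-source service classes.
    from celery import Celery
    from celery.schedules import crontab

    app = Celery("hotels", broker="redis://localhost:6379/0")

    # Hypothetical registry: source name -> class exposing fetch() and save().
    SOURCES = {}


    @app.task
    def fetch_source(source_name):
        """Fetch one feed, normalise it and persist it in the uniform format."""
        fetcher = SOURCES[source_name]()
        records = fetcher.fetch()
        fetcher.save(records)


    # Ask celery beat to enqueue each source's task at the top of every hour.
    app.conf.beat_schedule = {
        f"fetch-{name}": {
            "task": fetch_source.name,
            "schedule": crontab(minute=0),
            "args": (name,),
        }
        for name in ["HotelPlus", "xHotelService", "HotelSignup", "HotelSource"]
    }

Adding a new source then only means registering another fetcher class and another beat entry.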

Serving Multiple RASA bots on Django Backend

I'm currently trying to serve multiple bots (running different models) and to allow users to interact with them on a website. I've had a look at the following: http://www.rasa.com/docs/nlu/http/, http://www.rasa.com/docs/core/http/ and http://www.rasa.com/docs/nlu/python/, but I'm still having trouble figuring out how it can be done.
Some of the solutions I’ve considered are either:
Serve the bot on an HTTP server and have my website interact with the Rasa HTTP server
Create the website on Django Framework or REST API, and run Rasa Core and NLU on the backend.
What would be the best way to go about doing this? And, could anyone please briefly explain how this can be done (with multiple bot models and instances running)?
Any help would be greatly appreciated!
For anyone else searching for an answer, I ended up using Flask as the server, along with Flask-SocketIO for real-time communication. The server exposes an API that lets clients communicate with it via SocketIO; it determines which bot to interact with, gets the response, and sends it back to the client.
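A minimal sketch of that layout, assuming Flask-SocketIO; the AGENTS registry, the handle_text() interface and the bot_id payload field are hypothetical placeholders for however the bot models are loaded and queried.

    # Minimal sketch: one SocketIO server routing messages to several bots.
    # AGENTS, handle_text() and the "bot_id" payload field are hypothetical
    # placeholders for however the bot models are loaded and queried.
    from flask import Flask
    from flask_socketio import SocketIO, emit

    app = Flask(__name__)
    socketio = SocketIO(app)

    # Hypothetical registry: bot id -> a loaded agent exposing handle_text(text).
    AGENTS = {}


    @socketio.on("user_message")
    def handle_message(payload):
        """Pick the bot named in the payload, get its reply and emit it back."""
        agent = AGENTS[payload["bot_id"]]
        replies = agent.handle_text(payload["text"])
        emit("bot_message", {"replies": replies})


    if __name__ == "__main__":
        socketio.run(app)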

Web Scraping with Google Compute Engine / App Engine

I've written a Python script that uses Selenium to scrape information from a website and stores it in a CSV file. It works well on my local machine when I execute it manually, but I now want to run the script automatically once per hour for several weeks and save the data in a database. It may take about 5-10 minutes to run the script.
I've just started off with Google Cloud and it looks like there are several ways of implementing this with either Compute Engine or App Engine. So far, I get stuck at a certain point with each of the three approaches I've found (e.g. getting the scheduled task to call a URL of my backend instance and getting that instance to kick off the script). I've tried to:
Execute the script via Compute Engine and use datastore or cloud sql. Unclear if crontab can easily be set up.
Use Task Queues and Scheduled Tasks on App Engine.
Use backend instance and Scheduled Tasks on App Engine.
I'd be curious to hear from others what they would recommend as the easiest and most appropriate way given that this is truly a backend script that does not need a user front end.
App Engine is feasible, but only if you limit your use of Selenium to a remote WebDriver connection out to a site such as http://crossbrowsertesting.com/ -- feasible but messy.
I'd use Compute Engine -- and cron is trivial to use on any Linux image; see e.g. http://www.thegeekstuff.com/2009/06/15-practical-crontab-examples/ for worked examples.
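As a concrete example, a crontab entry on the Compute Engine instance along these lines would run the scraper once per hour; the interpreter and file paths are hypothetical.

    # Hypothetical crontab entry (edit with crontab -e on the instance):
    # run the scraper at minute 0 of every hour and append its output to a log.
    0 * * * * /usr/bin/python3 /home/me/scraper.py >> /home/me/scraper.log 2>&1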

Celery deployment strategy

I have a task to build a task server, and I decided to use Celery.
My idea is this:
Build a Celery worker server on Machine 1.
A web cluster consists of some web servers running Django.
There are some tasks that the Django website has to tell the Celery server on Machine 1 to run.
For example:
When a new user registers, the Django code will call the Celery worker to send an email somehow.
I have read documents about Celery, but I cannot find any that show me how to send a "send email" task to Machine 1 and ask Machine 1 to send the email.
Any idea?
Thank you very much
When you use Django and Celery together, take a look at django-celery:
http://docs.celeryproject.org:8000/en/latest/django/
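A minimal sketch of the calling side, assuming a RabbitMQ broker reachable from both the web servers and Machine 1; the broker host name and the Django view are hypothetical.

    # Minimal sketch: the web servers and the Machine 1 worker only have to
    # share the same broker URL. The host name and the view are hypothetical.
    from celery import Celery

    app = Celery("myproject", broker="amqp://guest@machine1.example.com//")


    @app.task
    def send_welcome_email(user_email):
        """Executed by the worker process on Machine 1, not by the web servers."""
        print(f"sending welcome email to {user_email}")


    # In a Django view (hypothetical), enqueue the task after registration;
    # the broker delivers it to whichever worker is consuming the queue.
    def register_user(request):
        # ... create the user ...
        send_welcome_email.delay("new.user@example.com")

The worker on Machine 1 is then started against the same broker with a command like celery -A myproject worker, and any web server that calls .delay() simply enqueues a message for it.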
