Celery + Django on startup - python

I have written an application with Django and Celery; it is for a machine that runs some tasks and needs to be accessible over the network. So I need Django and Celery to start automatically when the built-in PC boots. Can anyone please tell me the best and easiest way to do this? Can servers like Apache handle such a thing?

OK, if anyone else has this problem: I wrote a .sh file for each one and then called them from the startup settings in Ubuntu.
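For reference, the two scripts were along these lines (the paths and project names here are placeholders, not the real ones):

    #!/bin/bash
    # start_django.sh -- start the Django app (hypothetical paths)
    cd /home/user/myproject
    source venv/bin/activate
    exec python manage.py runserver 0.0.0.0:8000

    #!/bin/bash
    # start_celery.sh -- start the Celery worker for the same project
    cd /home/user/myproject
    source venv/bin/activate
    exec celery -A myproject worker --loglevel=info

Both scripts are then registered in Ubuntu's Startup Applications (an @reboot line in crontab works too).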

Related

How to schedule a job with Flask on Windows?

It seems that flask-crontab cannot run on my machine because there is no such thing as a Windows cron.
This is the error I get: ModuleNotFoundError: No module named 'fcntl'
If this is the case, how can I write a scheduled job in a Flask application?
Specifically, I need to be able to deploy the application, so it won't be running on Windows once it goes into production; but in the meantime, I can't debug if I can't test the cron job on my own computer.
Any help is appreciated, e.g. pointing me to documentation that is useful, suggesting other extensions, etc. Thanks!
Pete
I would recommend you start using Celery. It is a great library for job scheduling, whether you want the job to be periodic or run on demand.
Follow this guide
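For a rough sketch of what that looks like with a periodic task (the broker URL and task name below are made up):

    # tasks.py -- minimal Celery periodic-task sketch; broker URL is a placeholder
    from celery import Celery
    from celery.schedules import crontab

    app = Celery('scheduler', broker='redis://localhost:6379/0')

    @app.task
    def nightly_job():
        print('running the scheduled work')

    # run nightly_job every day at 02:00
    app.conf.beat_schedule = {
        'nightly': {
            'task': 'tasks.nightly_job',
            'schedule': crontab(hour=2, minute=0),
        },
    }

Start it with "celery -A tasks worker --beat --loglevel=info"; on Windows the worker may additionally need --pool=solo.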
You can use Apache Airflow. It is written in Python. Its documentation is here.
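For comparison, a minimal Airflow DAG looks roughly like this (the DAG id and schedule are placeholders; Airflow 2.x is assumed). Note that Airflow itself does not run natively on Windows either, so local testing would still need WSL or Docker:

    # my_job_dag.py -- minimal Airflow DAG sketch (Airflow 2.x assumed)
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def my_job():
        print('running the scheduled work')

    with DAG(dag_id='my_job_dag',
             start_date=datetime(2021, 1, 1),
             schedule_interval='@daily',
             catchup=False) as dag:
        PythonOperator(task_id='run_my_job', python_callable=my_job)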

Parallel Processing in Django

I get a file from the user. Once the file has been uploaded and saved, it has to be analysed.
Since it is a huge file and the analysis takes at least an hour (say), I have a field in the model recording the status of the analysis as either Analysing or Analysis Done.
The analysis script is a separate Python file, and the analysis has to be done there.
How do I go about doing this? I want the script to run in the background, and I have to deploy on an Apache server. How should I proceed?
- Should I use threads? How do I go about running external Python scripts in threads?
- I came across cron tabs, but I don't know how to apply them in this situation.
- I can't use Celery, since Celery dropped support for Windows.
- I came across Django management commands, but since I deploy with an Apache server, I don't know whether I can use them.
I can think of a few ways to solve this problem.
If you can batch the processing of the files, you can run a cron job that invokes a Django management command or a script at set intervals to process them.
If you can't batch the processing, look at queuing systems like django-rq (sketched below), or build a simple queuing system using an event-dispatch library.
If you really want to use Celery, you can run your whole project inside a Docker container, so that you can use Celery 4 despite being on Windows.
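To illustrate the django-rq option: enqueue the analysis from the upload view and update the status field inside the job, so the request returns immediately. The model and analysis function below are hypothetical:

    # jobs.py -- hypothetical django-rq background job
    import django_rq
    from myapp.models import UploadedFile    # placeholder model with a status field
    from myapp.analysis import run_analysis  # your separate analysis script

    def analyse_file(file_id):
        obj = UploadedFile.objects.get(pk=file_id)
        obj.status = 'Analysing'
        obj.save()
        run_analysis(obj.file.path)
        obj.status = 'Analysis Done'
        obj.save()

    # in the upload view, hand off instead of blocking the request:
    # django_rq.get_queue('default').enqueue(analyse_file, uploaded.pk)

A separate worker process (python manage.py rqworker) then picks the job up outside the Apache request cycle.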

Deploying multiple Gunicorn applications. Best process manager for easy one-click setup?

I am a Python programmer, and server administration has always been a bit hard for me to get into. I always read tutorials and, in practice, just repeated the steps each time I set up a new project. I have always used uWSGI, but realized that Gunicorn is easier for simple projects like mine.
Recently, I successfully set up my first single gunicorn application with the help of this article: https://www.digitalocean.com/community/tutorials/how-to-set-up-django-with-postgres-nginx-and-gunicorn-on-ubuntu-16-04
But what should I do if I want to launch another app with Gunicorn? Should I just make another systemd service file, like myproject.service? I'm looking for a convenient 'one-click' setup, so I can easily transfer my project to another machine or add more Gunicorn applications without much configuration. Or maybe I should use another process manager, like Supervisor? What is the best solution for a newbie like me?
Sorry if my question's too dumb, but I'm really trying.
Thank you!
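In case it helps anyone else: one unit file per application is the usual pattern with that tutorial's setup. A second service file would look roughly like this (every path and name below is a placeholder):

    # /etc/systemd/system/otherproject.service -- hypothetical second Gunicorn app
    [Unit]
    Description=gunicorn daemon for otherproject
    After=network.target

    [Service]
    User=myuser
    Group=www-data
    WorkingDirectory=/home/myuser/otherproject
    ExecStart=/home/myuser/otherproject/venv/bin/gunicorn --workers 3 --bind unix:/home/myuser/otherproject/otherproject.sock otherproject.wsgi:application

    [Install]
    WantedBy=multi-user.target

Enable it with "sudo systemctl enable --now otherproject" and point a new Nginx server block at the socket.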

How to run a system command from a django web application?

Does anyone know of a proven and simple way of running a system command from a Django application?
Maybe using Celery? ...
From my research it looks like a problematic task, since it involves permissions and potentially insecure approaches. Am I right?
EDIT: Use case: delete some files on a remote machine.
Thanks
Here is one approach: in your Django web application, write a message to a queue (e.g., RabbitMQ) containing the information that you need. In a separate system, read the message from the queue and perform any file actions. You can indeed use Celery for setting up this system.
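A minimal sketch of such a task (the broker URL, host, and paths below are made up; passing an argument list instead of a shell string avoids one class of injection problems):

    # tasks.py -- hypothetical Celery task that deletes files on a remote machine
    import subprocess
    from celery import Celery

    app = Celery('ops', broker='amqp://localhost//')  # RabbitMQ broker

    @app.task
    def delete_remote_files(host, paths):
        # argument list (no shell=True) avoids local shell injection;
        # the paths still reach the remote shell, so validate them first
        subprocess.run(['ssh', host, 'rm', '--'] + list(paths), check=True)

The web view then only calls delete_remote_files.delay(host, paths); the worker, which can run as a more restricted user, does the actual deletion.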

Starting Tornado Web

I'm quite new to using Tornado Web as a web server, and am having a little difficulty keeping it running. I normally use Django and Nginx, and am used to starting/stopping/restarting the server. With Tornado, however, I'm having trouble telling it to "run" without directly executing the main Python file for the site, i.e. "python ~/path/to/server.py".
I'm sure I'm getting this completely wrong - is there a way of 'bootstrapping' my script so that when Nginx starts, Tornado starts?
Any help would be appreciated!
A better way to do it is to use supervisord, which is also written in Python.
No, there is no way to have Nginx spawn your Tornado instance.
Typically you would use an external tool like daemontools or a system init script to run the Tornado process.
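For example, a supervisord entry for the script in the question might look like this (paths are placeholders):

    ; /etc/supervisor/conf.d/tornado.conf -- hypothetical supervisord entry
    [program:tornado]
    command=/usr/bin/python /home/user/path/to/server.py
    directory=/home/user/path/to
    autostart=true
    autorestart=true
    stdout_logfile=/var/log/tornado.out.log
    stderr_logfile=/var/log/tornado.err.log

Nginx then simply proxies requests to the port the Tornado process listens on.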
