How to run a system command from a django web application? - python

Does anyone know of a proven and simple way of running a system command from a Django application?
Maybe using Celery? ...
From my research it seems to be a problematic task, since it involves permissions and potentially insecure approaches. Am I right?
EDIT: Use case: delete some files on a remote machine.
Thanks

Here is one approach: in your Django web application, write a message to a queue (e.g., RabbitMQ) containing the information that you need. In a separate system, read the message from the queue and perform any file actions. You can indeed use Celery for setting up this system.
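As a minimal sketch of that architecture (the broker URL, task module name, and file paths below are assumptions, not taken from the question), the Django side only enqueues a message, and a Celery worker running on the remote machine does the actual deletion:

# tasks.py -- shared by the Django project and the worker on the remote machine
import os
from celery import Celery

app = Celery("fileops", broker="amqp://guest@your-rabbitmq-host//")  # hypothetical broker URL

@app.task
def delete_files(paths):
    # Runs on whichever machine the worker is started on.
    for path in paths:
        try:
            os.remove(path)
        except OSError as exc:
            print("could not delete %s: %s" % (path, exc))

From a Django view you would only call delete_files.delay(["/srv/uploads/old.dat"]), and on the remote machine you would start the worker with "celery -A tasks worker". The web process itself never needs permission to touch the files.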

Related

Advice on running flask app ONLY locally forever

I want to create a web form that stays up forever on a single computer. Users can come to the computer, fill out the form, and submit it. After submitting, it will record the responses in an Excel file and send emails. The next user can then come and fill out a new form automatically. I was planning on using Flask for this task since it is simple to create, but since I am not doing this on a production server, I will just have it running locally in development on the single computer.
I have never seen anyone do something like this with Flask, so I was wondering if my idea is possible or if I should avoid it. I am also new to web development, so I was wondering what problems there could be with keeping a Flask application running 24/7 on a local development computer.
Thanks
There is nothing wrong with doing this in principle; however, it is likely not the best solution in terms of time-to-reward payoff.
First, to answer your question: this could easily be done, even by a beginner with minimal Python and HTML experience, in a few hours. Your app could crash in the background for many reasons (running out of space, bad memory addresses, etc.), but most likely you will be fine.
As for actually building it, it is all possible: there are libraries you can use to write the results to an Excel file, or you can simply append to a CSV (which is what I would recommend). Creating and sending an email is similarly relatively simple, but again, doing this without Python would be much easier.
If you are not set on flask/python, you could check out Google Forms but if you are set on python, or want to use it as a learning experience, it can definitely be done.
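If you do stay with Flask, a minimal sketch of the form handler (the field names, CSV path, and SMTP details below are assumptions) could look like this:

# app.py -- minimal Flask form handler; field names, CSV path and SMTP details are assumptions
import csv
import smtplib
from email.message import EmailMessage
from flask import Flask, request

app = Flask(__name__)

@app.route("/submit", methods=["POST"])
def submit():
    name = request.form.get("name", "")
    answer = request.form.get("answer", "")

    # Append the response to a CSV file instead of a real Excel sheet.
    with open("responses.csv", "a", newline="") as f:
        csv.writer(f).writerow([name, answer])

    # Send a notification email (assumes an SMTP server reachable at localhost).
    msg = EmailMessage()
    msg["Subject"] = "New form submission"
    msg["From"] = "forms@example.com"
    msg["To"] = "admin@example.com"
    msg.set_content("%s submitted: %s" % (name, answer))
    with smtplib.SMTP("localhost") as server:
        server.send_message(msg)

    return "Thanks, your response was recorded."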
Your idea is possible and while there are many ways to do this kind of thing, what you are suggesting is not necessarily to be avoided.
All apps that run on a computer over a long period of time start a process and keep it going until closed. That is essentially what you are doing.
Having done this myself (and still currently doing it) at my business, I can say that it works great.
The only caveat is that, to ensure it will always be available, you need to have the process monitored by some tool that restarts it if it ever stops for any of a variety of reasons.
On Linux, supervisor is a great tool for doing that. On Windows, you could register it as a service. But you could also just provide an easy way to restart the app so the user can do so themselves if it is down when they need it.
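For example, a minimal supervisor program entry (all paths and the program name here are hypothetical) could look like:

; /etc/supervisor/conf.d/formapp.conf
[program:formapp]
command=/home/kiosk/venv/bin/python /home/kiosk/app.py
directory=/home/kiosk
autostart=true
autorestart=true
stdout_logfile=/var/log/formapp.out.log
stderr_logfile=/var/log/formapp.err.log

supervisor will then start the app on boot and restart it if it ever exits.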
Yes, this could be done. It's very similar to the applications that run on the servers in data centers.
To keep the application running forever, or to restart it after the system boots, you'll need a service manager similar to systemd on Unix. On Windows you could use NSSM (the Non-Sucking Service Manager) or Service Control to monitor your application and restart it if it crashes; the service will also have to be enabled on startup.
Other than this, you could use Waitress to serve your Flask application. Waitress is a WSGI server for which you can easily configure the number of threads so it can serve multiple users at the same time.
In a production environment, it's always suggested to use a proper WSGI server like Gunicorn or Waitress rather than the development server.
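As a sketch, serving the app with Waitress takes only a couple of lines (the module name, host, port, and thread count are assumptions):

# serve.py -- assumes your Flask app object is importable as app from app.py
from waitress import serve
from app import app

serve(app, host="0.0.0.0", port=8080, threads=4)  # threads controls concurrent requests

You can then point the service manager (systemd, NSSM, supervisor, etc.) at this script instead of the Flask development server.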

Parallel Processing in Django

I get a file from the user. Once the file has been uploaded and saved, it has to be analysed.
Since it is a huge file and the analysis takes at least an hour (say), I have a field on the model that records the status of the analysis as "Analysing" or "Analysis Done".
The script that does the analysis is a separate Python file, and the analysis has to happen there.
How do I go about doing this? I want the script to run in the background, and I have to deploy on an Apache server. How should I proceed?
Should I use threads? How do I go about running external Python scripts in threads?
I came across crontabs, but I don't know how I could apply them in this situation.
I can't use Celery, since Celery has dropped support for Windows.
I came across Django management commands, but since I deploy using an Apache server, I don't know whether I can use them.
I can think of a few ways to solve this problem.
If you can batch the processing of the files, then you can run a cron job which runs a Django management command or a script at certain intervals to process the files.
If you can't batch the processing, you should look at other queuing systems like django-rq, or you can build a simple queuing system using an event dispatch library.
If you really want to use Celery, you can run your whole project inside a Docker container so that you can use Celery 4, since that is your requirement.
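As a sketch of the first, cron-driven option (the command, model, and paths below are assumptions): wrap the analysis in a management command that picks up pending uploads, and let cron run it periodically outside the Apache worker processes.

# myapp/management/commands/analyse_pending.py -- hypothetical command and model names
from django.core.management.base import BaseCommand
from myapp.models import Upload
from myapp.analysis import analyse_file  # your separate analysis script wrapped as a function

class Command(BaseCommand):
    help = "Analyse uploads that have not been processed yet"

    def handle(self, *args, **options):
        for upload in Upload.objects.filter(status="pending"):
            upload.status = "analysing"
            upload.save(update_fields=["status"])
            analyse_file(upload.file.path)
            upload.status = "done"
            upload.save(update_fields=["status"])

A crontab entry such as "*/10 * * * * /path/to/venv/bin/python /path/to/project/manage.py analyse_pending" would then run it every ten minutes.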

Best way to schedule task in Django, without Celery

I'm looking for a relatively simple and lightweight way to set up primitive DB maintenance tasks for a Django-based web site. Celery seems like overkill to me.
Right now my plan is to write a custom Django management command and put it in cron. Can anyone suggest a better method?
django-extensions has a jobs-scheduling function that would work well for DB maintenance tasks. You still would rely on cron entries to actually run them though.
But then again, just doing a management command from cron is perfectly reasonable.
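As a sketch of the django-extensions approach (the app name, job name, and schedule are assumptions), a job is just a module under <app>/jobs/daily/ that cron triggers through a single runjobs entry:

# myapp/jobs/daily/cleanup.py -- hypothetical app and job names
from django_extensions.management.jobs import DailyJob

class Job(DailyJob):
    help = "Purge expired sessions"

    def execute(self):
        from django.core import management
        management.call_command("clearsessions")

Then one cron line like "0 3 * * * /path/to/venv/bin/python manage.py runjobs daily" runs every daily job you have defined.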
Django Chronograph is a Django app with a very nice admin interface for managing cron jobs and setting up multiple tasks. That way you don't have to fiddle with your server's cron file yourself; the app manages it for you.
You can also do it the Django way by writing Custom Management Commands as also mentioned here.

Scheduling a regular event: Cron/Cron alternatives (including Celery)

Something I've been interested in is running a certain set of actions at regular time intervals. Obviously, this is a task for cron, right?
Unfortunately, the Internet seems to be in a bit of disagreement there.
Let me elaborate a little about my setup. First, my development environment is in Windows, while my production environment is hosted on Webfaction (Linux). There is no real cron on Windows, right? Also, I use Django! And what's suggested for Django?
Celery of course! Unfortunately, setting up Celery has been more or less a literal nightmare for me - please see Error message 'No handlers could be found for logger “multiprocessing”' using Celery. And this is only ONE of the problems I've had with Celery. Others include a socket error that, apparently, I'm the only one ever to have run into.
Don't get me wrong, Celery seems REALLY cool. Unfortunately, there seems to be a lack of support, and some odd limitations built into its preferred backend, RabbitMQ. Unfortunately, no matter how cool a program is, if it doesn't work, well, it doesn't work!
That's where I hope all of you can come in. I'd like to know about cron or a cron-equivalent, which can be set up similarly (preferably identically) in both a Windows and a Linux environment.
(I've been struggling with Celery for about two weeks now and unfortunately I think it's time to toss in the towel and give up on it, at least for now.)
I had the same problem, and held off trying to solve it with celery (too complicated) or cron (external to application) and ended up finding Advanced Python Scheduler. Only just started using it but it seems reasonably mature and stable, has decent documentation and will take a number of scheduling formats (e.g. cron style).
From the documentation, here is how to run a function at a regular interval:

from apscheduler.scheduler import Scheduler  # old APScheduler 2.x API

sched = Scheduler()
sched.start()  # start() is non-blocking

def hello_world():
    print("hello world")

# run hello_world() every 10 seconds
sched.add_interval_job(hello_world, seconds=10)
This is non-blocking, and I run something pretty identical by simply importing the module from my urls.py. Hope this helps.
A simple, non-Celery way to approach things would be to create custom django-admin commands to perform your asynchronous or scheduled tasks.
Then, on Windows, you use the at command to schedule these tasks. On Linux, you use cron.
I'd also strongly recommend ditching Windows as a development environment if you can. Your life will be so much better on Linux or even Mac OS X. Re-purpose a spare or old machine with Ubuntu, for example, or run Ubuntu in a VM on your Windows box.
https://github.com/andybak/django-cron
Triggered by a single cron task but all the scheduling and configuration is done in Python.
Django Chronograph is a great alternative. You only need to set up one cron entry and then do everything else from the Django admin. You can schedule tasks/commands from Django management.

Modify system configuration files and use system commands through web interface

I received a project recently and I am wondering how to do something in a correct and secure manner.
The situation is the following:
There are classes to manage Linux users, MySQL users and databases, and Apache virtual hosts. They're used to automate the addition of users in a small shared-hosting environment. These classes are then used in command-line scripts to offer a nice interface for the system administrator.
I am now asked to build a simple web interface to offer a GUI to the administrator and then offer some features directly to the users (change their unix password and other daily procedures).
I don't know how to implement the web application. It will run in Apache (as the apache user), but the classes need to access files and commands that are only usable by the root user to make the necessary changes (e.g. useradd and the virtual host configuration files). When using the command-line scripts, this is not a problem as they are run under the correct user. Giving those permissions to the apache user would probably be dangerous.
What would be the best technique to allow this through the web application? I would like to use the classes directly if possible (it would be handier than calling the command-line scripts as external processes and parsing their output), but I can't see how to do this in a secure manner.
I saw existing products doing similar things (webmin, eBox, ...) but I don't know how it works.
PS: The classes I received are simple but really badly programmed and barely commented. They are actually in PHP, but I'm planning to port them to Python. Then I'd like to use the Django framework to build the web admin interface.
Thanks and sorry if the question is not clear enough.
EDIT: I read a little bit about webmin and saw that it uses its own mini web server (called miniserv.pl). It seems like a good solution. The user running this server should then have permissions to modify the files and use the commands. How could I do something similar with Django? Use the development server? Would it be better to use something like CherryPy?
You can easily create web applications in Python using WSGI-compliant web frameworks such as CherryPy2 and templating engines such as Genshi. You can use the 'subprocess' module to manage external commands...
You can use sudo to give the apache user root permission for only the commands/scripts you need for your web app.
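A common pattern for that (the user names, paths, and command below are illustrative assumptions) is to whitelist only the exact binaries in sudoers and call them with subprocess from the web code:

# Added with visudo: allow the web server user to run useradd, and nothing else, without a password
#   www-data ALL=(root) NOPASSWD: /usr/sbin/useradd

# helper used by the web application
import subprocess

def create_unix_user(username):
    # arguments are passed as a list so nothing goes through a shell
    subprocess.check_call(["sudo", "/usr/sbin/useradd", "--create-home", username])

This keeps the privileged surface limited to a handful of audited commands instead of giving the apache user blanket root access.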
