Running scripts on a Raspberry Pi from the internet - Python

At the moment I am running Python scripts on the Pi, triggered by another voice recognition Python script. I now also want to run these scripts from the internet. From a little research, one approach would be to set up a small web server on the Pi, such as lighttpd, and create a database on it. Then a separate small script would periodically check a value in the database, and that value could be modified over the internet. Depending on the value, I would either use the voice recognition script or use the other values in the database to run the Python scripts.
My question is: is this method efficient, or is there a simpler way to do this? I am fairly competent with Python, but I am totally new to web servers and databases. However, I do not mind spending time learning how to use them.
Thanks in advance!

One route that I personally chose was to configure the Pi for use as a LAMP stack (Linux, Apache, MySQL, Python). Some great instructions can be found here: http://www.wikihow.com/Make-a-Raspberry-Pi-Web-Server
If this is overkill, have you considered using cron jobs to automate your Python scripts? You could then set up times at which your two scripts would run, and with a little inter-process communication you would have two entities that are aware of each other. http://www.thesitewizard.com/general/set-cron-job.shtml
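For a feel of what the polling approach described in the question could look like, here is a minimal sketch assuming an SQLite database with a single commands table; the database file, table, and column names are illustrative assumptions, not part of any existing setup.

    # Minimal sketch of the polling idea: a web page on the Pi writes a value
    # into an SQLite database, and this loop reads it every few seconds.
    # The database file, table and column names are assumptions.
    import sqlite3
    import time

    DB_PATH = "commands.db"

    def read_command():
        conn = sqlite3.connect(DB_PATH)
        try:
            row = conn.execute("SELECT value FROM commands WHERE id = 1").fetchone()
            return row[0] if row else None
        finally:
            conn.close()

    while True:
        command = read_command()
        if command == "voice":
            pass  # hand control to the voice recognition script here
        elif command:
            pass  # run whichever script the stored value refers to
        time.sleep(5)  # poll the database every few seconds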

Related

Advice on running a Flask app ONLY locally, forever

I want to create a web form that stays up forever on a single computer. Users can come to the computer, fill out the form, and submit it. After submitting, it will record the responses in an Excel file and send emails. The next user can then come and fill out a new form automatically. I was planning on using Flask for this task since it is simple to set up, but since I am not doing this on a production server, I will just have it running locally in development mode on a single computer.
I have never seen anyone do something like this with Flask, so I was wondering if my idea is possible or if I should avoid it. I am also new to web development, so I was wondering what problems there could be with keeping a Flask application up 24/7 on a local development computer.
Thanks
There is nothing wrong with doing this in principle; however, it is likely not the best solution in terms of time-to-reward payoff.
First, to answer your question: this could easily be done, even by a beginner. Completing it in a few hours with minimal Python and HTML experience is definitely possible. Your app could crash in the background for many reasons (running out of space, memory errors, etc.), but most likely you will be fine.
As for actually building it, it is all possible. There are libraries you can use to write the results to an Excel file, or you can simply append to a CSV (which is what I would recommend). Creating and sending an email is similarly straightforward, but again, doing it without Python would be much easier.
If you are not set on Flask/Python, you could check out Google Forms, but if you are set on Python, or want to use this as a learning experience, it can definitely be done.
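As a rough illustration of the append-to-CSV idea above, here is a minimal Flask sketch; the form fields, CSV file name, and port are assumptions chosen just for this example.

    # Minimal sketch: a Flask form whose submissions are appended to a CSV file.
    # Field names, the CSV file name and the port are arbitrary example choices.
    import csv
    from flask import Flask, request

    app = Flask(__name__)

    FORM = """
    <form method="post">
      Name: <input name="name"><br>
      Response: <input name="response"><br>
      <input type="submit" value="Submit">
    </form>
    """

    @app.route("/", methods=["GET", "POST"])
    def form():
        if request.method == "POST":
            # Append the submitted values as one new CSV row.
            with open("responses.csv", "a", newline="") as f:
                csv.writer(f).writerow([request.form.get("name"),
                                        request.form.get("response")])
            return "Thanks! " + FORM
        return FORM

    if __name__ == "__main__":
        app.run(host="127.0.0.1", port=5000)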
Your idea is possible, and while there are many ways to do this kind of thing, what you are suggesting is not necessarily something to be avoided.
All apps that run on a computer over a long period of time start a process and keep it going until it is closed. That is essentially what you would be doing.
Having done this myself (and still currently doing it) at my business, I can say that it works great.
The only caveat is that, to ensure the app is always available, you need to have the process monitored by some tool that restarts it if it ever goes down for any of a variety of reasons.
On Linux, Supervisor is a great tool for doing that. On Windows you could register it as a service. But you could also just create an easy way to restart the app, so the user can bring it back up if it is down when they need it.
Yes, this could be done. It's very similar to the applications that run on the servers in data centers.
To keep the application running forever, or to restart it after your system boots, you'll need a service manager similar to systemd on Linux. On Windows you could use NSSM (the Non-Sucking Service Manager) or Service Control to monitor your application and restart it if it crashes. The service will also have to be enabled on startup.
Other than this, you could use Waitress to serve your Flask application. Waitress is a WSGI server in which you can easily configure the number of threads, so it can serve multiple users at the same time.
In a production environment, it is always suggested to use a WSGI server like Gunicorn or Waitress rather than the Flask development server.
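For reference, serving a Flask app with Waitress is essentially a one-liner; in this sketch, "app" is assumed to be the Flask application object from your own module, and the module name, host, port, and thread count are arbitrary choices.

    # Minimal sketch of serving an existing Flask app with Waitress instead of
    # the development server; the module name "myapp" is hypothetical.
    from waitress import serve
    from myapp import app

    serve(app, host="0.0.0.0", port=8080, threads=4)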

Deploying a Python Script to Windows and Linux

I have a Python server that I need to run in both Linux and Windows environments, and my question is about deployment. What is the best method for deploying the solution, instead of just double-clicking the file and running it?
Since I use serve_forever() on the server, I can just run the script from the command line, but this keeps the Python window open. If I log off the machine, the process will naturally stop. So what is the best method for deploying a Python script that needs to keep running whether the user is logged in or not?
Since I am going to be using multiple environments, Linux and Windows, can you please be specific about which OS you are talking about?
For Windows, I was thinking of running the script 'At Startup' using the Windows Task Scheduler, but I wanted to see if anyone had a better option. For Linux, I really don't know what to use. I am assuming a cron job?
Deployment does relate to the code as well: using serve_forever() on a multiprocessing job manager keeps the Python window open upon execution. Is there a way to hide this window through code? Would you recommend using a conversion tool like py2exe instead?
This is the subject matter of a whole library of books, so I will just give an introduction here :-)
You can basically start scripts directly, and then there are several options for keeping them running in the background.
If you have certain functionality that needs to run at regular intervals, you would do this by scheduling it:
Windows: Windows Scheduler or specific scheduling tools
Linux: Cron
If your problem is that you want to start a script without it stopping when your SSH session to the Linux box ends, look into the "screen" or "tmux" tools.
If you want it started automatically, this could be done using 'At Startup' as you point out, and Linux has similar functionality, but the preferred and more robust way is to set up a service that is better integrated with the OS:
Windows: Windows Service
Linux: Daemon
Even more capabilities can be gained by using an application server such as Django.
Tomcat (see comment) is an option, but definitely not a standard one; you'll have a hard time finding support, both from Tomcat people running Python and from Python people running their stuff on Tomcat. That being said, I imagine you could enable CGI and have it run a Python command with your script.
Still, instead of just starting a Python script, I would strongly encourage you to have a look at the different Python options that are probably available for your specific use case, from a lightweight web solution like Flask, through a versatile networking engine like Twisted, to a full-blown web framework like Django.
They all have rather well-thought-out deployment solutions available. Look up WSGI for more background.
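To give a sense of what WSGI itself is, here is a minimal sketch using only the standard library; the greeting text and port are arbitrary, and any WSGI server (Gunicorn, Waitress, mod_wsgi) could serve the same callable.

    # Minimal WSGI application, served here with the standard library's wsgiref
    # for demonstration; in production a server such as Gunicorn or Waitress
    # would be pointed at the `application` callable instead.
    from wsgiref.simple_server import make_server

    def application(environ, start_response):
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"Hello from a WSGI app\n"]

    if __name__ == "__main__":
        with make_server("127.0.0.1", 8000, application) as httpd:
            httpd.serve_forever()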

Deploying a Python Script on a Server (CentOS): Where to start?

I'm new to Python (and relatively new to programming in general) and I have created a small Python script that scrapes some data off a site once a week and stores it in a local database (I'm trying to do some statistical analysis on downloaded music). I've tested it on my Mac and would like to put it up on my server (a VPS with WiredTree running CentOS 5), but I have no idea where to start.
I tried Googling for it, but apparently I'm using the wrong terms, as "deploying" seems to mean creating an executable file. The only thing that seems to make sense is to set it up inside Django, but I think that might be overkill. I don't know...
EDIT: More clarity
You should look into cron for this; it will allow you to schedule the execution of your Python script.
If you aren't sure how to make your Python script executable, add a shebang line to the top of the script and then give it execute permissions with chmod.
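For example, a minimal sketch (the file name scrape.py used in the comment is just a placeholder):

    #!/usr/bin/env python
    # The shebang line above tells the shell which interpreter to use, so the
    # script can be run directly once it is made executable, e.g. with
    # "chmod +x scrape.py" (the file name here is only a placeholder).

    def main():
        pass  # the weekly scraping and database code would go here

    if __name__ == "__main__":
        main()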
Copy the script to the server.
Test the script manually on the server.
Set up cron ("crontab -e") with a schedule that will run it soon, so you can test it.
Once you've debugged any issues, set the cron schedule to the appropriate time.
Sounds like a job for Cron?
Cron is a scheduler that provides a way to run certain scripts (apps, etc.) at certain times.
Here is a short tutorial that explains how to set up cron.
See this for more general cron information.
Edit:
Also, since you are using CentOS: if you end up having issues with your script later on, they could partly be caused by SELinux. There are ways to disable SELinux on your server (if you have sufficient access permissions), but there are arguments against disabling SELinux as well.

How to run a Python script on a Rails server?

I have a Rails server which will need to run a Python script in the background. I know that I can run it the way I run terminal commands in Ruby, but what is the performance like? Is it better to use a Python framework instead of Rails? Are there better ways (optimization-wise) to run Python scripts on a Rails server?
If you mean that it needs to run periodically, just set it up as a cron job; there are no special performance characteristics to worry about there.
If you mean that it needs to run when pages are requested from your Ruby website, then simply running the script each time won't perform well, as it needs to fire up the Python interpreter over and over again.
If the Python script is large but is only called from a relatively small number of page requests, you might be able to get away with this, sometimes it's not worth the time to optimise a slow operation that isn't called often.
If the bulk of your website is based around the functionality of the Python script, then yes, you are probably better off switching to a Python web framework and loading it as a module.
If the Python script isn't very big, then you are probably better off rewriting it in Ruby.
The worst-case scenario is that the script is big and used often, but doesn't make up enough of your website to justify switching to Python. In that case, I'd consider wrapping the Python in a daemon that Ruby can talk to in the background.
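As a rough sketch of that daemon idea, the Python side could be a small long-running socket server that the Rails app connects to; the port, the line-based protocol, and the do_work function here are assumptions for illustration only.

    # Minimal sketch of a long-running Python worker that a Rails app could talk
    # to over a local TCP socket, avoiding interpreter start-up on every request.
    # The port number, line-based protocol and do_work() are illustrative only.
    import socketserver

    def do_work(text):
        # Placeholder for whatever the existing Python script actually does.
        return text.upper()

    class ScriptHandler(socketserver.StreamRequestHandler):
        def handle(self):
            request = self.rfile.readline().decode().strip()
            result = do_work(request)
            self.wfile.write((result + "\n").encode())

    if __name__ == "__main__":
        with socketserver.TCPServer(("127.0.0.1", 9000), ScriptHandler) as server:
            server.serve_forever()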
You will incur the cost of starting Python each time you run it from Ruby. The cost would be the same in a Python framework, unless you could use the Python script as a library instead.
You could set up a daemon in Rails to execute the Python script.
http://railscasts.com/episodes/129-custom-daemon - Tutorial for setting up daemons in rails

A black-box testing framework with a test management system

Is there a testing framework (preferably Python) that executes tests, monitors progress (failed/passed/timeout), and controls VMware? Thanks
I am trying to do some automated functional testing in VMware using AutoIt scripts. The VMs are controlled by a little Python script on the host machine (it deploys test files into the VMs, executes them, and collects the result data). But now it seems like a lot of work if I want this script to be able to manage and execute a series of test cases.
Thanks a lot!
Cheers,
Zhe
There are lots of continuous integration tools that may do what you want.
One implemented in Python that may fit your needs is Buildbot - it can manage running builds and tests across multiple machines and consolidate the results.
