Running a script on server start in Google App Engine, in Python - python

Is it possible to run a script each time the dev server starts? And also at each deploy to Google?
I want the application to fill the database based on what some methods return.
Is there any way to do this?
..fredrik

I use App Engine Python with the Django helper. As far as I know you cannot hook anything into the deploy itself, but you could add a check in the main() function of main.py to see whether your setup still needs to run. This is how the helper initializes itself on the first request. I haven't looked at webapp in a while, but I assume main.py acts in a similar fashion for that framework.
Be aware that main() runs on the first request, not when you first deploy. It will also run if App Engine starts up a new instance to handle load, or if all instances were stopped because of inactivity. So make sure you check whether your initialization has already been done and only run it when needed.
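For example, a minimal sketch of such a check in a classic webapp-style main.py. The seed_database() helper and the InitMarker model are hypothetical names; the point is simply to record in the datastore that seeding has already happened so it only runs once.

# main.py -- a sketch; seed_database() and InitMarker are placeholders
from google.appengine.ext import db
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class InitMarker(db.Model):
    """Single record used to remember that seeding already ran."""
    done = db.BooleanProperty(default=False)

def seed_database():
    # ... call your methods and store their results here ...
    pass

def initialize_if_needed():
    marker = InitMarker.get_or_insert('init-marker')
    if not marker.done:
        seed_database()
        marker.done = True
        marker.put()

class MainPage(webapp.RequestHandler):
    def get(self):
        self.response.out.write('Hello')

application = webapp.WSGIApplication([('/', MainPage)])

def main():
    # Runs on the first request to each new instance, not at deploy time,
    # so the marker check keeps the seeding from running more than once.
    initialize_if_needed()
    run_wsgi_app(application)

if __name__ == '__main__':
    main()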

You can do this by writing a script in your favorite scripting language that performs the actions that you desire and then runs the dev server or runs appcfg.py update.
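For instance, a rough sketch of such a wrapper in Python; the fill_database() step is a placeholder for whatever setup you need, and it assumes the SDK's dev_appserver.py and appcfg.py are on your PATH:

#!/usr/bin/env python
# deploy.py -- wrapper sketch: do the setup, then run or deploy the app
import subprocess
import sys

def fill_database():
    # ... whatever pre-run / pre-deploy work you need ...
    pass

def main():
    fill_database()
    if len(sys.argv) > 1 and sys.argv[1] == 'deploy':
        subprocess.check_call(['appcfg.py', 'update', '.'])
    else:
        subprocess.check_call(['dev_appserver.py', '.'])

if __name__ == '__main__':
    main()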

Try making a wrapper around the server runner and the deployment script, so you can run custom code whenever you need to.

Warmup Requests in combination with min_idle_instances will probably work for your deploy use case.
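A sketch of what the warmup side might look like, assuming warmup requests are enabled in app.yaml (inbound_services: warmup); App Engine then issues a GET to /_ah/warmup before a new instance serves user traffic, which is a reasonable place to hang your setup:

# warmup sketch -- assumes "inbound_services: - warmup" in app.yaml
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app

class WarmupHandler(webapp.RequestHandler):
    def get(self):
        # ... run your one-time setup / database seeding here ...
        self.response.out.write('warmed up')

application = webapp.WSGIApplication([('/_ah/warmup', WarmupHandler)])

def main():
    run_wsgi_app(application)

if __name__ == '__main__':
    main()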

Related

How to run Flask inside a Jupyter notebook block for easy testing?

I want to run a Flask server inside a Jupyter notebook for specific test and QA scenarios. I do understand that it is not wise to run a server inside a notebook (as mentioned in the comments of this question).
However, I want to test a specific function that requires both a Flask app context and a running server. It is a third-party API webhook handler, and the third party does not have a way to generate fake webhooks. While this might be a very specific case, I think this question is worth asking for edge cases like mine.
You can run a thread inside a block, with a target call of app.run. This will run the app in a separate thread.
from threading import Thread
# assumes `app` is the Flask application you have already created
Thread(target=app.run).start()
Restart the ipykernel as soon as you are done, and DO NOT under any circumstances use this in production.
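A slightly fuller sketch of the same idea for the webhook-testing case: start the dev server in a daemon thread, then post a fake event to it from the next notebook cell. The /webhook route, the port, and the payload are just placeholders for your actual handler.

# Start the dev server in a background thread and exercise a handler.
import json
import time
from threading import Thread
from urllib.request import Request, urlopen

from flask import Flask, request

app = Flask(__name__)

@app.route('/webhook', methods=['POST'])
def webhook():
    payload = request.get_json()
    # ... call the handler you actually want to test here ...
    return {'received': payload}, 200

Thread(target=lambda: app.run(port=5001), daemon=True).start()
time.sleep(1)  # give the dev server a moment to bind the port

fake_event = json.dumps({'event': 'test'}).encode()
req = Request('http://127.0.0.1:5001/webhook', data=fake_event,
              headers={'Content-Type': 'application/json'})
print(urlopen(req).read())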

How to limit python script so that it can't access local resources?

I am working on a project that allows users to upload a Python script to an API and run it on a schedule. Currently, I'm trying to figure out a way to limit the functionality of the script so that it cannot access local files, mess with the Flask server running the API, etc. Do you have any ideas on how I can achieve this? Is there any way to make it so only specific libraries are available for importing?
Running other people's scripts on your server is a serious security issue. If you are trying to expose a Python interpreter from your web application, you could try something like judge0 (on GitHub). It is free if you deploy it yourself, and it runs scripts safely inside containers.
The simplest way is to ensure the user running the script is not root, but a user created specifically for this task (e.g. part of a group that can only read, not write or execute). This means at minimum you should ensure all files have the appropriate mode. Then you can run the script in a subprocess and read its output through a pipe.
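A sketch of that approach, assuming a hypothetical low-privilege account called sandboxuser and a sudo rule that allows exactly this command; note this only limits privileges, it is not a full sandbox like the container-based options above:

import subprocess

def run_untrusted(script_path, timeout=30):
    # Execute the uploaded script as the locked-down user, capture its
    # output through pipes, and kill it if it exceeds the time limit.
    result = subprocess.run(
        ['sudo', '-u', 'sandboxuser', 'python3', script_path],
        capture_output=True,
        text=True,
        timeout=timeout,
    )
    return result.returncode, result.stdout, result.stderr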
Alternatively, you could use a runtime that’s not “local”, like a VM or compute service (AWS lambda, etc). The latter would be simplest, and there’s lots of vendors who offer compute service with programmatic api.

Advice on running flask app ONLY locally forever

I want to create a web form that stays up forever on a single computer. Users can come to the computer, fill out the form, and submit it. After submitting, it will record the responses in an Excel file and send emails. The next user can then come and fill out a new form. I was planning on using Flask for this task since it is simple to set up, but since I am not doing this on a production server, I will just have it running locally in development mode on the single computer.
I have never seen anyone do something like this with Flask, so I was wondering if my idea is possible or if I should avoid it. I am also new to web development, so I was wondering what problems there could be with keeping a Flask application running 24/7 on a local development computer.
Thanks
There is nothing wrong with doing this in principle; however, it is likely not the best solution in terms of time-to-reward payoff.
First, to answer your question: this could easily be done, even by a beginner, in a few hours with minimal Python and HTML experience. Your app could crash in the background for many reasons (running out of disk space, memory problems, etc.), but most likely you will be fine.
As for building it, it is all possible. There are libraries you can use to add the results to an Excel file, or you can simply append to a CSV (which is what I would recommend). Creating and sending an email is similarly straightforward, but again, doing it without Python would be even easier.
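A rough sketch of the CSV-plus-email approach; the form fields, the addresses, and the local SMTP relay are all assumptions to adapt to your setup:

# Minimal form that appends each submission to a CSV and sends an email.
import csv
import smtplib
from email.message import EmailMessage

from flask import Flask, request, render_template_string

app = Flask(__name__)

FORM = """
<form method="post">
  Name: <input name="name">
  Answer: <input name="answer">
  <button type="submit">Submit</button>
</form>
"""

@app.route('/', methods=['GET', 'POST'])
def form():
    if request.method == 'POST':
        row = [request.form['name'], request.form['answer']]
        with open('responses.csv', 'a', newline='') as f:
            csv.writer(f).writerow(row)           # record the response
        msg = EmailMessage()
        msg['Subject'] = 'New form response'
        msg['From'] = 'forms@example.com'
        msg['To'] = 'owner@example.com'
        msg.set_content('New response: %s' % row)
        with smtplib.SMTP('localhost') as smtp:   # assumes a local relay
            smtp.send_message(msg)
        return 'Thanks! Your response has been recorded.'
    return render_template_string(FORM)

if __name__ == '__main__':
    app.run()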
If you are not set on Flask/Python, you could check out Google Forms, but if you are set on Python, or want to use it as a learning experience, it can definitely be done.
Your idea is possible and while there are many ways to do this kind of thing, what you are suggesting is not necessarily to be avoided.
All apps that run on a computer over a long period of time start a process and keep it going until closed. That is essentially what you are doing.
Having done this myself (and still currently doing it) at my business, I can say that it works great.
The only caveat is that to ensure it is always available, you need to have the process monitored by some tool that restarts it if it ever stops for any reason.
On Linux, supervisor is a great tool for doing that. On Windows, you could register it as a service. But you could also just create an easy way to restart the app and make it easy for the user to do so if it is down when they need it.
Yes, this could be done. It's very similar to the applications that run on the servers in data centers.
To keep the application running forever, or to restart it after your system reboots, you'll need a service manager similar to systemd on Unix. On Windows you could use NSSM (the Non-Sucking Service Manager) or Service Control to monitor your application and restart it if it crashes. This will also need to be enabled to run on startup.
Other than this, you could use Waitress to serve your Flask application. Waitress is a WSGI web server that lets you easily configure the number of threads, so you can serve multiple users at the same time.
In a production environment, it's always suggested to use a proper WSGI server like Gunicorn or Waitress rather than the development server.
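For example, a minimal sketch of serving the app with Waitress instead of the development server (the module name and thread count here are just assumptions):

# serve.py -- run the Flask app under Waitress
from waitress import serve

from app import app  # assumes your Flask app object lives in app.py

serve(app, host='127.0.0.1', port=8080, threads=4)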

Long running daemon process on django

I need to run a python script (which is listening to Twitter) which will call various methods on my django app when it gets tweets that match a particular hashtag.
At the moment, I just run the script by hand on the command line but I'd like it to run inside django if possible so that I can control it from there and so it doesn't have to perform HTTP POSTs when it gets new data.
I've looked at Celery (briefly), but to me it seems to be aimed at performing small tasks at regular intervals.
Is there a way to use celery (or anything else) to be able to control this long-running "listen to Twitter" script that I've got?
You should use Supervisord to run your Django application and your script. Making the script part of the Django project will let you use Django signals: you can define a custom signal and emit it every time your Twitter logic finds a matching tweet. Signals are blocking; if you want them to be asynchronous, use Celery with Django.
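A sketch of that custom-signal idea; the tweet_matched signal and its receiver are hypothetical names, and the Twitter-listening loop itself is elided:

import django.dispatch
from django.dispatch import receiver

# Defined somewhere importable, e.g. myapp/signals.py
tweet_matched = django.dispatch.Signal()

@receiver(tweet_matched)
def handle_matching_tweet(sender, tweet, **kwargs):
    # ... call the Django-side methods you want to run for each tweet ...
    pass

# Inside the long-running listener, whenever a tweet matches the hashtag:
# tweet_matched.send(sender=None, tweet=tweet)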
An alternative would be to run your Django application and the Twitter script via Supervisord, and then expose a REST API on the Django application that the script can POST new data to. You can use Tastypie for that.

Starting Tornado Web

I'm quite new to using Tornado Web as a web server, and am having a little difficulty keeping it running. I normally use Django and Nginx, and am used to starting, stopping, and restarting the server. However, with Tornado I'm having trouble telling it to "run" without directly executing my main Python file for the site, i.e. "python ~/path/to/server.py".
I'm sure I'm getting this completely wrong - is there a way of 'bootstrapping' my script so that when Nginx starts, Tornado starts?
Any help would be appreciated!
A better way to do it is to use supervisord, which is also written in Python.
No, there is no way to have nginx spawn your Tornado instance.
Typically you would use an external tool like daemontools or a system init script to run the Tornado process.
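For reference, the process those tools would keep alive is just your server.py; a minimal sketch (handler and port are placeholders), with nginx configured to proxy to that port:

# server.py -- minimal Tornado app that a supervisor/init script would run
import tornado.ioloop
import tornado.web

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.write("Hello from Tornado")

def make_app():
    return tornado.web.Application([(r"/", MainHandler)])

if __name__ == "__main__":
    app = make_app()
    app.listen(8888)
    tornado.ioloop.IOLoop.current().start()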
