How do I keep a scheduled task in a Flask app running constantly? - python

I feel pretty stupid asking this, but I am still kind of a newbie when it comes to server-related tasks.
I have a Flask app that scrapes data from a website and checks a Postgres database to see whether the scraped data needs to be updated. I would like this task to run constantly, because the data is going to be visualized on a website of mine.
I found the Flask-APScheduler library and successfully set up a scheduled task that runs every 60 minutes. Now my short-minded question:
I run this task through an SSH connection to my server from my work PC. At the end of the day, I usually deactivate my virtual environment, close my console, and turn off my PC. Doesn't this also shut down my script, so it will no longer update my database?
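For orientation, a minimal sketch of the kind of setup described above might look like this (the function name scrape_and_update and its body are placeholders, not the asker's actual code):

from flask import Flask
from flask_apscheduler import APScheduler

app = Flask(__name__)
scheduler = APScheduler()

# Placeholder job: runs every 60 minutes once the scheduler has been started.
@scheduler.task('interval', id='scrape_and_update', minutes=60)
def scrape_and_update():
    # Scrape the site and write any changes to Postgres here.
    pass

scheduler.init_app(app)
scheduler.start()

if __name__ == '__main__':
    app.run()

Note that the job only runs while this Python process is alive, which is exactly why closing the session matters.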

If you run the program inside a terminal multiplexer such as screen or tmux (https://github.com/tmux/tmux/wiki), you can detach the tmux session from your SSH session and it will keep running even after you close the SSH connection. When you want to get back to it, SSH into the server again and attach to the background tmux session you left running.
You can find the details for doing attach and detach here:
https://github.com/tmux/tmux/wiki/Getting-Started#attaching-and-detaching
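As a rough sketch (the session name scraper is just an example):

tmux new -s scraper      # start a named session, then launch your script inside it
# press Ctrl-b then d to detach, leaving the session (and your script) running
tmux attach -t scraper   # reattach after you SSH back in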

Related

Software to run a local python script on a server

I regularly have Python scripts that take 8+ hours to complete, and I want to run them on a remote server. However, I don't want to go through the hassle of setting up a server, setting up an environment, running the script, and shutting down the server afterwards every single time.
Ideally, I'm looking for a CLI product like Heroku that spins up a server, runs the script in an environment and shuts down the server after the script is done.
AWS Lambda functions sound close to what I'm looking for, but they have a runtime limit. Are there other solutions that would fit these criteria?
Thanks!

Trying to remotely start a process with a visible window using Python on Windows machines

I tried to do this with WMI, but interactive processes cannot be started with it (as stated in the Microsoft documentation). I see the processes in Task Manager, but no windows appear.
I tried with Paramiko; same thing. The process is visible in Task Manager, but no window appears (Notepad, for example).
I tried with PsExec, but the only case in which a window appears on the remote machine is when you specify -i, and even then it does not show normally, only through a message box saying something like "a message arrived, do you want to see it".
Do you know a way to start a program remotely and have its interface behave as it would if you had started it manually?
Thanks.
SSH servers on Windows normally run as a Windows service.
Windows services run in a separate Windows session (google "Session 0 isolation"). They cannot access interactive (user) Windows sessions.
Also note that there can be multiple user sessions (multiple logged-in users) on Windows. How would the SSH server know which user session to display the GUI on (even if it could)?
The message you are getting comes from the "Interactive Services Detection" service, which detects that a service is trying to show a GUI on the invisible Session 0 and allows you to replicate the GUI on the user session.
You can run the SSH server in an interactive Windows session instead of as a service. It has its limitations though.
In general, all of this (running a GUI application on Windows remotely through SSH) does not look like a good idea to me.
Also, this question is more about a specific SSH server than about the SSH client you are using. So if you include details about your SSH server, you can get better answers.
OK, I found a way: using subprocess with schtasks (the Windows Task Scheduler). For whatever reason, when I launch a remote process with it, it starts as if I had double-clicked the exe myself. For it to start with no delay, create the task with an old start date like 2012 using schtasks /Create /F, then run the named task with schtasks /Run; that will do the trick.
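As a rough sketch of that approach from Python (the task name, executable path, and ONCE schedule are illustrative, the /SD date format depends on the system locale, and /S, /U and /P would be needed to target a remote host):

import subprocess

task_name = "RemoteNotepad"                     # illustrative task name
exe_path = r"C:\Windows\System32\notepad.exe"   # illustrative program to launch

# Create (or overwrite) the task with a start date far in the past so it never fires on its own.
subprocess.run(["schtasks", "/Create", "/F", "/TN", task_name, "/TR", exe_path,
                "/SC", "ONCE", "/SD", "01/01/2012", "/ST", "00:00"], check=True)

# Run it immediately; the process starts as if the exe had been double-clicked.
subprocess.run(["schtasks", "/Run", "/TN", task_name], check=True)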

Running a Flask app on DigitalOcean: should I keep the SSH console open all the time?

I created a droplet that runs a Flask application. My question is: when I SSH into the droplet and restart the apache2 server, do I have to keep the console open all the time (that is, I should not shut down my computer) for the application to stay live?
What if I have a dynamic application that runs scripts in the background, do I have to keep the console open all the time for the dynamic parts to work?
P.S:
there's a similar question on SO about a Node.js app, but some parts of the answer provided there are irrelevant to my Flask app.
You can use the "screen" command to keep the session open.
Please see https://www.rackaid.com/blog/linux-screen-tutorial-and-how-to/
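As a quick sketch (the session name is arbitrary):

screen -S flaskapp   # start a named session and launch the app inside it
# press Ctrl-a then d to detach
screen -r flaskapp   # reattach after logging back in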
In my opinion it is not good practice to use remote computers for the development stage unless you have no other option. If you want to keep your application available after logging out of the SSH console, screen works, but it is still a workaround.
I would suggest taking a look at this great tutorial on how to daemonize Flask applications with Gunicorn+Nginx.
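One common way to daemonize it, whether or not that is exactly what the tutorial does, is a systemd unit that keeps Gunicorn running while Nginx proxies to it. The paths, user, port, and the wsgi:app entry point below are placeholders:

[Unit]
Description=Gunicorn instance serving the Flask app
After=network.target

[Service]
User=your_linux_user_name
WorkingDirectory=/path/to/app
ExecStart=/path/to/venv/bin/gunicorn --workers 3 --bind 127.0.0.1:8000 wsgi:app
Restart=always

[Install]
WantedBy=multi-user.target

Save it under /etc/systemd/system/ and enable it with sudo systemctl enable --now <unit name>; the app then survives logouts and reboots.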
You needn't keep the console open; the app will still be running after you close the console on your computer. But you may want to set up a log to monitor it.

Python process suspends on SSH logout after nohup/screen

I have a remote server through Blue Host that's intended to run a server based on Twisted for Python. The only access I have to it is over SSH, so to keep Python running after I log out I tried using nohup python server.py & and screen -dm python server.py, getting the same results for each. Everything works fine until I log out of SSH - even though Python is running in the background as expected, once I've logged out, my client can no longer communicate with the server. The strange part is that if I log back in over SSH and check the running processes with ps aux, I see Python running and my client can successfully communicate with the server again. Even if I don't type anything at all once I log back in, everything works as expected. But, of course, as soon as I log back out, it's as if the server is gone.
I've contacted support for the hosting service in case this is some oddity on their end, but hopefully this is something that can be resolved on my end instead.
Edit: Looks like Blue Host doesn't want me doing server-y stuff without buying the VPS upgrade, so that seems to be the big problem.
Edit 2: Okay, so in case anybody ends up having a similar problem, here's what the main issue turned out to be. I was mistaken in my original description; I was able to connect to the server, but I was getting kicked off immediately because of what turned out to be a MySQL error. I guess trying to connect to a localhost database with no active connection somehow causes problems, so I changed the MySQL connection command to connect to my site's IP address instead, even though it is the same IP as the server. That seemed to do the trick for my main issue.
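In code terms, the fix amounted to something like the following (the driver and credentials are illustrative; the asker doesn't say which MySQL library was used):

import MySQLdb

# Before: connecting to "localhost" was causing the immediate kick-off described above.
# db = MySQLdb.connect(host="localhost", user="me", passwd="secret", db="mydb")

# After: connect to the site's public IP (placeholder below), even though it is the same machine.
db = MySQLdb.connect(host="203.0.113.10", user="me", passwd="secret", db="mydb")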
Don't rely on nohup/screen to keep the server process running. Instead, try using supervisor (apt-get install supervisor). It allows you to daemonize your process and gives you the ability to stop/restart it, etc.
Here's a sample config entry (/etc/supervisor/supervisord.conf):
[program:my_server]
command=python /path/to/server/server.py
directory=/path/to/server/
autostart=true
autorestart=true
stdout_logfile=/var/log/server.log
stderr_logfile=/var/log/server_error.log
user=your_linux_user_name
After you edit your config, do
sudo service supervisor stop
sudo service supervisor start #need to do this - doing a `restart` doesn't reload the config file!
Your server should now be running properly. You can manage its lifecycle via sudo supervisorctl.
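For example, using the program name from the config above:

sudo supervisorctl status my_server    # check whether it is running
sudo supervisorctl restart my_server   # restart just this program
sudo supervisorctl tail my_server      # view its stdout log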

Google Compute Engine connection keeps disconnecting

I have an instance on Google Compute Engine that I connect to from a terminal with gcutil ssh; on it I have several Django services. I run the server using python manage.py runserver 0.0.0.0:8000. The services are being called from an iPhone application (iOS 6.1).
The problem I'm facing is that every few minutes (between 10 and 15) I get disconnected and have to reconnect and run the server again.
Why is my server being disconnected, and how can I keep it running?
Try using supervisord. It sounds like, for what you're trying to do, supervisor can keep your process up and running. http://supervisord.org/
Here's an example conf:
[program:app]
process_name = app-%(process_num)s
command =python /home/ubuntu/production/current/app/src/app.py --port=%(process_num)s
# Increase numprocs to run multiple processes on different ports.
# Note that the chat demo won't actually work in that configuration
# because it assumes all listeners are in one process.
numprocs = 4
numprocs_start = 8000
This is for running multiple processes of the same program. Just change around the args and it should work for you.
SSH normally times out after a period of inactivity, and that may be what is happening here. If so, this article might be useful to help configure SSH to send a regular message so connections are less likely to be dropped.
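For example, a client-side keep-alive can be set in ~/.ssh/config (the 60-second interval is just a common default):

Host *
    ServerAliveInterval 60
    ServerAliveCountMax 3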
However, the core issue is that you'd like software you started at the terminal to keep running even when you're logged out. Consider using screen or tmux to host your shell sessions. This will allow your shell software to run even when you are not connected, and for you to pick up right where you left off when you reconnect. Here is a nice getting started post about tmux.
Once you're ready for production, take a look at the Django deployment docs.
