Linux Service running Python script does not update SQL database - python

I have a Python script that scans a stock market for transactions and saves them in an SQL database. The script works on its own: if I run it directly with python3 fetchTradesLuno24Hours.py, it updates the database. However, when I run it as a service, it stops updating the database. systemctl status lunoDatabase.service shows that the service ran successfully. The service is triggered by lunoDatabase.timer, which runs it every few hours. Both systemctl status lunoDatabase.timer and systemctl list-timers show that the timer works and the script is triggered on schedule. The service reports that the script's run time is as expected (around 6 minutes) and that it exited successfully.
Before this, I ran the Python script in an infinite loop, and that worked fine: the database was updated correctly from the service. When I added the timer, it stopped updating the database. I would like the service to update the SQL database and to be triggered by the timer. How can I fix this?

The problem was in the Python script. Since the service runs the script from the root directory, I should have specified the absolute path to the database in database.py:
db = sqlite3.connect('/home/user/bot/transactions.db')
and not
db = sqlite3.connect('transactions.db')
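An alternative to hard-coding the absolute path (a sketch, not from the original answer) is to resolve the database location relative to the script file itself with pathlib, so the connection works no matter which working directory the service or shell starts in:

```python
import sqlite3
from pathlib import Path

# Resolve the database path relative to this script file, so the
# script behaves the same under systemd and under a manual run.
DB_PATH = Path(__file__).resolve().parent / "transactions.db"

db = sqlite3.connect(DB_PATH)
db.execute("CREATE TABLE IF NOT EXISTS transactions (id INTEGER PRIMARY KEY)")
db.close()
```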
Thank you, everyone!

Related

How to run a scheduled task to run a Flask app constantly?

I feel pretty stupid asking this, but I am still kind of a newbie when it comes to server-related tasks.
I have a Flask app that scrapes data from a website and checks a Postgres database to see whether updates to the scraped data need to be made. I would like this task to run constantly, because the data is going to be visualized on a website of mine.
I found the Flask-APScheduler library, with which I successfully run a scheduled task every 60 minutes. Now my short-minded question:
I run this task through an SSH connection to my server from my work PC. At the end of the day, I usually deactivate my virtual environment, close my console, and turn off my PC. Doesn't this also shut down my script, so it will no longer update my database?
If you run the program inside a terminal multiplexer such as screen or tmux (https://github.com/tmux/tmux/wiki), you can detach the tmux session from your SSH session and it will keep running even after you close the SSH connection. When you want to reconnect, SSH back into the server and attach to the background tmux session you left running.
You can find the details for doing attach and detach here:
https://github.com/tmux/tmux/wiki/Getting-Started#attaching-and-detaching
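As a quick reference, the attach/detach cycle looks like this (the session name is an arbitrary choice, and the commands assume tmux is installed on the server):

```
tmux new -s scraper       # start a named session
python app.py             # run the app inside it
# press Ctrl-b, then d, to detach; the session keeps running
tmux ls                   # after you SSH back in, list running sessions
tmux attach -t scraper    # reattach to the running session
```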

re-run Python script in started in docker-compose with MySQL server running

Using Docker Compose, I've created a container for my MySQL database and one for my Python script. When I run docker compose up, the images are built for my Python app container and my MySQL DB container, and the Python script runs. After it finishes, the shell just hangs, since the MySQL server is still running while my script has completed. How can I either stop the MySQL server after my script runs, or re-run my script while the MySQL server continues to run?
When you run docker-compose up, your current shell displays the logs from all the containers defined in the docker-compose.yaml file, and if you terminate the command with Ctrl+C, all the containers stop running.
As a result, it looks as though the shell hangs, while everything is in fact still running normally.
What you want in this case is to let the containers continue to run in the background (detached mode):
docker-compose up --detach
The MySQL server will now continue to run until you stop it with docker-compose down.
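To also re-run the script on demand while MySQL keeps running, one option (a sketch; the service names and image tag are assumptions, not from the original question) is to split the two into separate Compose services:

```yaml
# docker-compose.yaml sketch -- service names and image are assumptions
services:
  db:
    image: mysql:8
    environment:
      MYSQL_ROOT_PASSWORD: example
  app:
    build: .
    depends_on:
      - db
```

Then docker-compose up --detach db starts only the database in the background, and docker-compose run --rm app re-runs the script as many times as needed against the running server; docker-compose down stops everything.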

Windows task scheduler issue python script automation

I'm running a Python script using the Task Scheduler. The script gets data from a database using pandas and stores it in csv.gz files.
It runs properly, but there is a difference in data size between when Task Scheduler runs it automatically and when it is run manually via the Task Scheduler's Run button. The manual run gets all the data, whereas the automated run gets less.
It is the same script; I've checked multiple times but am unable to identify the issue.
PS:
Using Windows Server 2008; pymssql to connect to the database
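No answer was given here, but one way to narrow such a discrepancy down is to log the run context (user, working directory) and the row count on every run, so the automatic and manual runs can be compared side by side. A diagnostic sketch (the function and file names are hypothetical):

```python
import csv
import gzip
import logging
import os

# Log each export run so scheduled and manual runs can be compared.
logging.basicConfig(
    filename="export_runs.log",
    level=logging.INFO,
    format="%(asctime)s %(message)s",
)

def export_rows(rows, out_path):
    """Write rows to a gzipped CSV and log how many were written."""
    logging.info("user=%s cwd=%s", os.environ.get("USERNAME", "?"), os.getcwd())
    with gzip.open(out_path, "wt", newline="") as f:
        writer = csv.writer(f)
        writer.writerows(rows)
    logging.info("wrote %d rows to %s", len(rows), out_path)
    return len(rows)
```

If the logged user or working directory differs between the two kinds of run, that points at a permissions or relative-path problem in the scheduled task's configuration.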

Is it Possible to Run a Python Code Forever?

I have written a Python script for Twitter automation using Tweepy. When I run it on my own Linux machine as python file.py, it runs successfully and keeps running, because I have scheduled repeated tasks inside the script and don't want it to stop. But since it is on my local machine, the script may stop when my internet connection drops or at night, so I can't keep it running on my PC all day.
Is there any way, website, or method by which I could deploy my script and have it execute forever? I have heard of cron jobs in cPanel, which can help with repeated tasks, but in my case I want my script to keep running on the machine until I close it myself.
Are there any such solutions? Most Twitter bots I see run forever, meaning their scripts are being executed somewhere 24x7. This is what I want to know: how is that possible?
As mentioned by Jon and Vincent, it's better to run the code on a cloud service. But either way, I think what you're looking for is how to keep the code running even after you close the terminal. This is what worked for me:
nohup python code.py &
Alternatively, you can add a systemd .service file, which has added benefits such as:
- logging (compressed logs in a central place, or over the network to a log server)
- disallowing access to /tmp and home directories
- restarting the service if it fails
- starting the service at boot
- setting capabilities (see setcap/getcap), e.g. disallowing file access if the process only needs network access
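A minimal sketch of such a unit file (the unit name, paths, and option values are assumptions, not from the original answer):

```ini
# /etc/systemd/system/twitter-bot.service
[Unit]
Description=Twitter automation bot
After=network-online.target
Wants=network-online.target

[Service]
ExecStart=/usr/bin/python3 /opt/twitter-bot/file.py
Restart=on-failure      # restart the service if it fails
RestartSec=10
PrivateTmp=yes          # give the service its own private /tmp

[Install]
WantedBy=multi-user.target
```

Enable and start it with systemctl enable --now twitter-bot.service, and read its logs with journalctl -u twitter-bot.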

Running python code as a service on windows server 2012 R2

I have a Python script which collects data from Twitter and stores it in MongoDB. I need to run this code on the server as a service.
However, only 2 users can remain logged into the server at any given time, so if I log out of my account, the Python script would stop working and data collection would stop.
My question is: how can I run this script as a service on the server such that it keeps running irrespective of which user is logged in?
Task Scheduler is the easiest solution I know of. You can use it to run the code at startup as the NT AUTHORITY\SYSTEM user and automatically restart it on failure. In case you need it, there's a basic overview of Task Scheduler here.
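A sketch of the corresponding schtasks command (the task name and paths are assumptions; restart-on-failure is configured separately in the task's Settings tab or via an XML definition):

```
schtasks /Create /TN "TwitterCollector" /TR "C:\Python39\python.exe C:\bot\collect.py" /SC ONSTART /RU SYSTEM /F
```

Running as /RU SYSTEM means the task does not depend on any interactive user staying logged in.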
