Scheduled Python script in Task Scheduler not working

I have a Python script that I am trying to schedule in Task Scheduler on my VM, but it doesn't seem to run; the last run result is (0x2). I can run the script manually and it works. I even created a batch file to execute the script, which also works when run by hand, but scheduling that batch file in Task Scheduler gives the same error. My only guess is that it fails because the script uses the Google Sheets API and reads its credentials from a JSON file in the project folder, but I'm still unsure why it won't run when scheduled. Any ideas would be greatly appreciated. In Task Scheduler I am using the path Z:\Python\PythonGSAPI\executePy.bat to execute the batch file. The content of the batch file is:
@echo off
"C:\Python27\python.exe" "Z:\Python\PythonGSAPI\TF_Invoice.py"
pause

This occurs due to the PATH environment variable. For example, if you use Anaconda Python, you need to select the option that adds Python to PATH during installation, or configure this afterwards.
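A related pitfall with the credentials JSON: when Task Scheduler starts the batch file, the working directory is usually not the project folder, so a relative path to the credentials file fails even though a manual run works. A minimal sketch that sidesteps this by resolving the file relative to the script itself (the filename credentials.json is an assumption, adjust to your project):

```python
import os

# Directory containing this script, regardless of the scheduler's working directory
SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))

# Hypothetical credentials file name -- replace with your actual JSON file
CREDENTIALS_PATH = os.path.join(SCRIPT_DIR, "credentials.json")

def get_credentials_path():
    """Return an absolute path to the credentials file next to the script."""
    return CREDENTIALS_PATH
```

With this, the script finds its credentials no matter which directory Task Scheduler starts it in.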

Related

Windows Task Scheduler stuck by consent window

I am using Windows Task Scheduler to extract data from the Bing Ads API.
On the first run this process requires consent: it automatically opens a browser, and you have to copy a code from the URL back into the running program. After that, the process runs successfully without any input.
I set up a .bat file that runs the Python script (python C:\abc.py), then used Task Scheduler to open the .bat file and trigger the script.
I tested it by opening the .bat file directly. It worked.
However, it does not work when triggered by Task Scheduler. The process is stuck in the Running status. I suspect it is stuck waiting for the consent-window input, as if it were running for the first time.
Is there any way to verify that?
If it's true, is there a way to solve my problem?
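One way to verify this without attaching a debugger is to check for the cached token before the auth flow starts, and log and exit instead of blocking on interactive input when the task runs unattended. A sketch, where the token-file location is an assumption about how your auth flow caches credentials:

```python
import os
import sys

TOKEN_PATH = r"C:\abc_refresh_token.txt"  # hypothetical cached-token location

def ensure_token(token_path=TOKEN_PATH):
    """Fail fast with a clear message instead of hanging on interactive consent."""
    if not os.path.exists(token_path):
        # Under Task Scheduler there is no one to paste the consent code,
        # so exit and leave a trace instead of sitting in 'Running' forever.
        sys.stderr.write("No cached token at %s; run once interactively.\n" % token_path)
        return False
    return True
```

If the scheduled run exits immediately with this message, you have confirmed it was stuck at the consent prompt; running the script once interactively to cache the token should then fix the scheduled runs.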

Windows Task Scheduler Last Run Result (0xF6)

What does Last Run Result (0xF6) mean?
I scheduled the task to run pythonw.exe with the argument pointing to the .py file.
The user account is set to an admin account.
Run whether user is logged on or not
Run with the highest privileges
Configured for Windows 10
I can't find a reference to this code in a Google search. I can't tell what is happening via Task Scheduler, because the Python script does not finish, meaning I can't read the log file to check how far it got or whether it ran into any errors. If I run the script without Task Scheduler, it completes successfully.

pynetdicom not working correctly with Windows Task Scheduler

I am using a modified version of this pynetdicom script (the second example on this page) to download DICOM images to an office computer. Here is what the script does:
Opens a connection with PACS
Searches for DICOM images that match the current date for a given patient's medical record number and accession number.
If DICOM images are found that match the given criteria, then an SCP server connection is started to initiate the downloading of images to a folder on the local computer.
The script works when it is run from the Spyder IDE. I have created a scheduled task with Windows Task Scheduler, and it works correctly, but only if the script has first been run from Spyder, Spyder remains open, and the variables have not been cleared.
However, if Spyder is closed or the Spyder kernel is restarted, then when the script is run through Task Scheduler, it runs correctly until it reaches the point where the SCP server should call the handle_store function that downloads images from PACS. The handle_store function is never called, and the connection eventually times out.
I thought the solution would be changing the default working directory in Task Scheduler, but that did not work. Any ideas what is going on and how to fix this?
Okay, I did some more digging and found the source of the problem. In order for images to download to my computer, Python needs to be allowed through our corporate firewall. I had already allowed pythonw.exe through the firewall, but not python.exe. Once both of these executables were allowed through the firewall, the script ran as expected when started by Windows Task Scheduler.
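For anyone debugging a similar symptom, a quick sanity check is whether the process can actually open and accept on the SCP listen port at all. A sketch using only the standard library (note this only proves the port is free and bindable on the local machine; an inbound firewall rule, the actual culprit above, can still block remote peers, so also test from another machine):

```python
import socket

def can_listen(port, host="0.0.0.0"):
    """Return True if we can bind and listen on the given TCP port."""
    try:
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.bind((host, port))
        s.listen(1)
        s.close()
        return True
    except OSError:
        return False
```

If this returns False, something else already holds the port or you lack the permission to bind it; if it returns True but remote associations still time out, the firewall is the likely remaining suspect.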

Script runs fine, but when run by cron can't do the Dropbox upload

I'm running a Python script on my Raspberry Pi which makes some modifications in a SQL database, writes a log, and uploads everything to Dropbox.
When I launch it from the command line, everything works fine.
UPDATE: When I launch it using cron, everything works except the Dropbox upload. There are no error messages in the log; the file simply doesn't appear in my Dropbox.
Here is the code I am using:
from subprocess import call
data = "/home/pi/scripts/Dropbox-Uploader/dropbox_uploader.sh upload /home/pi/scripts/database.db /"
call(data, shell=True)
How can this be fixed?
"It works from an interactive terminal but not from cron" is almost always evidence of a PATH or environment problem. In an interactive session, the profile and possibly .bashrc files are used to set a number of environment variables, including PATH. None of them are used by cron. So good practices are:
always use absolute paths in scripts that can be launched from cron
explicitly set the Python-related environment variables in your crontab, or use a minimal shell to set them first and then start Python
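Applied to the snippet above, that means calling the uploader with absolute paths and an explicit environment rather than relying on whatever cron provides. A sketch (the paths come from the question; the PATH and HOME values are assumptions to adapt):

```python
import subprocess

# Absolute paths: cron provides no profile, so nothing relative can be trusted.
UPLOADER = "/home/pi/scripts/Dropbox-Uploader/dropbox_uploader.sh"
DB_FILE = "/home/pi/scripts/database.db"

def upload(uploader=UPLOADER, db_file=DB_FILE):
    # Pass an explicit minimal environment instead of inheriting cron's sparse one.
    env = {"PATH": "/usr/local/bin:/usr/bin:/bin", "HOME": "/home/pi"}
    result = subprocess.run([uploader, "upload", db_file, "/"],
                            env=env, capture_output=True, text=True)
    return result.returncode
```

Capturing stdout/stderr here is also useful: dropbox_uploader.sh may be printing its own error (e.g. a missing config file under the wrong HOME) that never reaches the script's log.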

Is it Possible to Run a Python Code Forever?

I have coded a Python script for Twitter automation using Tweepy. When I run it on my own Linux machine as python file.py, it runs successfully and keeps running, because I have specified repeated tasks inside the script, and I don't want to stop it. But since it is on my local machine, the script might get stopped when my internet connection drops or at night, so I can't keep it running on my PC all day.
So is there any way, website, or method by which I could deploy my script and have it execute forever? I have heard about cron jobs in cPanel, which can help with repeated tasks, but in my case I want to keep the script running on the machine until I close it myself.
Are there any such solutions? Most of the Twitter bots I see run forever, meaning their script is executing somewhere 24x7. How is that possible?
As mentioned by Jon and Vincent, it's better to run the code from a cloud service. But either way, I think what you're looking for is what to type into the terminal to keep the code running even after you close the terminal. This is what worked for me:
nohup python code.py &
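The equivalent from within Python (a standard-library sketch, POSIX only) is to start the long-running job in its own session, so that closing the terminal does not deliver SIGHUP to it:

```python
import subprocess
import sys

def launch_detached(script_path, log_path="bot.log"):
    """Start a script in its own session so it survives terminal close (POSIX)."""
    with open(log_path, "ab") as log:
        proc = subprocess.Popen(
            [sys.executable, script_path],
            stdout=log, stderr=log,       # what nohup does with nohup.out
            stdin=subprocess.DEVNULL,
            start_new_session=True,       # detach from the controlling terminal
        )
    return proc.pid
```

This is the same effect as nohup plus &, but gives you control over where the output goes.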
You can also add a systemd .service file, which has the added benefits of:
logging (compressed logs in a central place, or over the network to a log server)
disallowing access to /tmp and /home directories
restarting the service if it fails
starting the service at boot
setting capabilities (see setcap/getcap), e.g. disallowing file access if the process only needs network access
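A minimal unit sketch tying those points together (the service name, user paths, and script name are assumptions; the directives shown are standard systemd options):

```ini
# /etc/systemd/system/twitterbot.service  (hypothetical name and paths)
[Unit]
Description=Twitter bot
After=network-online.target

[Service]
ExecStart=/usr/bin/python3 /home/user/bot/file.py
# Restart the service if it fails
Restart=on-failure
# Private /tmp and no access to home directories
PrivateTmp=yes
ProtectHome=yes
# Drop all capabilities; the bot only needs plain network access
CapabilityBoundingSet=

[Install]
WantedBy=multi-user.target
```

Enable it with systemctl enable --now twitterbot; stdout/stderr then go to journald (view with journalctl -u twitterbot), which provides the centralized, compressed logging mentioned above.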
