Python Subprocess Keeps Stopping Itself

I'm running a Python script where I'm using subprocess to call a series of rclone copy operations. Since rclone isn't a native command, I'm defining it in a shell script run automatically by my .bashrc file. I can confirm that works, since subprocess.run("rclone") properly pulls up the rclone menu.
The issue is that when I run my script, I don't get any errors or exceptions. Instead, my terminal window shows the following:
I understand the issue is related to the Linux subprocess being backgrounded. However, this solution didn't seem to fix my issue, and I can't find anything about how to prevent the process from pausing. I can confirm the behavior is distro-independent, as I have run it on Red Hat and on Amazon EC2.
One last key piece of info: I am calling the subprocess with bash rather than sh so that the alias is loaded from my .bashrc file. Here is the minimal reproducible code:
import subprocess
from datetime import datetime, timedelta, timezone

start_date = datetime.strptime(datetime.now(timezone.utc).strftime("%Y%m%d"), "%Y%m%d")
# For good measure, double-check the day before for more files if the date just changed
time = datetime.utcnow().strftime("%H")
if int(time) <= 3:
    start_date = start_date - timedelta(days=1)
    end_date = start_date + timedelta(days=2)
else:
    # End tomorrow
    end_date = start_date + timedelta(days=1)

# Force Python to use the bash shell
def bash_command(cmd):
    subprocess.Popen(['/bin/bash', '-i', '-c', cmd])

# daterange() is a helper defined elsewhere in the full script
for dt in daterange(start_date, end_date):
    cmd = 'rclone copy "/home/test.png" "AWS test:"'
    bash_command(cmd)
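For reference, the kernel stops a backgrounded process (SIGTTIN/SIGTTOU) when it tries to read from or take control of the terminal, which is exactly what an interactive bash (-i) does. A sketch of the same call without the interactive flag, calling rclone by its full path instead of the alias (the path below is a placeholder):

import subprocess

def bash_command(cmd):
    # A non-interactive shell does not try to grab the controlling terminal,
    # so the child is not stopped when the script runs in the background
    return subprocess.Popen(['/bin/bash', '-c', cmd])

# Call rclone by its full path rather than relying on an alias from .bashrc
bash_command('/usr/local/bin/rclone copy "/home/test.png" "AWS test:"')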


How do I run multiple commands with python subprocess() without waiting for the end of each command? [duplicate]

There are two Python scripts involved in this task.
My current task requires me to run a long process (about a day or two each; this is the first Python script) in each of the 29 available regions on GCP instances. To finish the task as quickly as possible, I'm trying to run every process at once, one per instance, after spinning up 29 VMs at once.
Since manually running the first script by SSHing into each instance is cumbersome, I wrote a Python script (the second script) that SSHs into each region's VM and runs the first script mentioned above.
The issue with the second script is that it doesn't start the first script on the second region's VM until the run on the first region's VM has finished, whereas I need it to start the first script in every region without waiting for any of the earlier runs to end.
I use subprocess.run() in the second script to run the first script on each VM.
The following code is the second script:
for zone, instance in zipped_zone_instance:
    command = "gcloud compute ssh --zone " + zone + " " + instance + " --project cloud-000000 --command"
    command_lst = command.split(" ")
    command_lst.append("python3 /home/first_script.py")
    subprocess.run(command_lst)
I need the subprocess.run(command_lst) to run for every 29 zones at once rather than it running for the second zone only after the first zone's process ends.
The following code is the first script:
for idx, bucket in enumerate(bucket_lst):
    start = time.time()
    sync_src = '/home/' + 'benchmark-' + var_
    subprocess.run(['gsutil', '-m', '-o', 'GSUtil:parallel_composite_upload_threshold=40M', 'rsync', '-r', sync_src, bucket])
    end = time.time() - start
    time_lst.append(end)
    tput_lst.append(tf_record_disk_usage / end)
What can I fix in the second script or the first script to achieve what I want?
Switch out your subprocess.run(command_lst) with Popen(command_lst) in each of your scripts, and loop through the command list as in the example below to run the processes in parallel. Unlike run(), Popen returns immediately instead of waiting for the command to finish. (Note that passing a list together with shell=True is unreliable; use a plain list, or a single command string with shell=True as below.)
This is how you can implement Popen to run processes in parallel, using arbitrary commands for simplicity:
from subprocess import Popen
commands = ['ls -l', 'date', 'which python']
processes = [Popen(cmd, shell=True) for cmd in commands]
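Applied to the second script above, a minimal sketch (reusing zipped_zone_instance and the project ID from the question) might look like this:

import subprocess

processes = []
for zone, instance in zipped_zone_instance:
    command_lst = [
        "gcloud", "compute", "ssh",
        "--zone", zone, instance,
        "--project", "cloud-000000",
        "--command", "python3 /home/first_script.py",
    ]
    # Popen returns immediately, so all 29 SSH sessions start in parallel
    processes.append(subprocess.Popen(command_lst))

# Optionally block until every region has finished
for p in processes:
    p.wait()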

How to restart a python script using another?

('The first script' takes input from the user and 'the second script' delivers the notification.)
I have been trying to restart a Python script from another one, but I couldn't manage it after trying a few methods. I developed a reminder app that notifies the user when a time previously set by the user arrives. The app works on Linux and consists of two Python scripts. The first one takes the input given by the user to schedule a task, for example, "Call the boss at 12:30 pm"; Linux will then show the notification at 12:30 pm. The second one checks the inputs and shows the notifications when the time comes.
In the first script, I am trying to restart the second script whenever the user gives a new task, because the script needs to read the new task in order to notify about it. I also want the first script to terminate once it has launched the second one, while the second script must keep working. In the first script, I tried these commands to do that:
os.system(f"pkill -f {path2}")
os.system(f"python {path2}")
These don't work.
I also want to run the second script at the startup of my OS.
Summary:
1. I want to restart a Python script from another one, and the first one should be terminated once the second one is running.
2. I want to run the second script at the startup of my OS.
The repository for my reminder app is here.
About 1:
Assuming the name of the other script is 2.py (changeable in the code below), this worked for me pretty well:
1.py:
import subprocess
import os
import time
OTHER_SCRIPT_NAME = "2.py"
process_outputs = subprocess.getoutput("ps aux | grep " + OTHER_SCRIPT_NAME)  # Search for the process running 2.py
wanted_process_info = process_outputs.split("\n")[0]  # Keep the first line only
splitted_process_info = wanted_process_info.split(" ")  # Split the string on spaces
splitted_process_info = [x for x in splitted_process_info if x != '']  # Remove empty items
pid = splitted_process_info[1]  # The PID is the second field in the ps output
os.system("kill -9 " + str(pid))  # Kill the other process
exit()
time.sleep(1000)  # Never reached, because exit() was called above
2.py:
import time
time.sleep(100)
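The snippet above only kills the old copy of 2.py; to complete the restart described in the question, one could relaunch it just before exiting. A minimal sketch (the path is a placeholder):

import subprocess

# Start a fresh copy of 2.py; it keeps running after this script exits
subprocess.Popen(["python", "/path/to/2.py"])
exit()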
About 2:
In Linux, you can execute scripts on startup by adding them to the /etc/rc.local file.
Just run your scripts from the rc.local file and you are good to go:
/etc/rc.local:
python '/path/to/your/scripts'
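Note that on many distributions /etc/rc.local must be executable and is expected to end with exit 0; a fuller sketch (the path is a placeholder):

#!/bin/sh -e
# Start the reminder script in the background so boot is not blocked
python /path/to/your/scripts &
exit 0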

Python script doesn't run background processes when called by cron

I have a Python script run by cron:
*/5 * * * * python /home/alex/scripts/checker > /dev/null &
It has several purposes; one of them is to check for certain programs in the ps list and run them if they are not there. The problem is that when the script is run by cron, it doesn't execute the programs in the background correctly; in the ps list they all look like:
/usr/bin/python /home/alex/exec/runnable
So they look like Python scripts. When I launch my Python script manually, it seems to execute the runnable in the background correctly, but with cron nothing works.
Here's an example of the code:
def exec(file):  # note: 'exec' is a reserved word in Python 2, so this name only parses on Python 3
    file = os.path.abspath(file)
    os.system("chmod +x " + file)
    cmd = file
    #os.system(cmd)
    #subprocess.Popen([cmd])
    subprocess.call([cmd])
I tried different approaches but nothing seems to work right.
Some code update:
pids = get_pids(program)
if pids == None:
    exec(program)
    print 'Restarted'
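For reference, a launch that fully detaches the child from cron's session would typically redirect the standard streams and put the child in its own session. A sketch of that pattern (an illustration, not the original code):

import os
import subprocess

def launch_detached(path):
    # Give the child its own session and detach its streams, so it no
    # longer depends on cron's terminal-less environment
    # (on Python 3 you can pass start_new_session=True instead of preexec_fn)
    with open(os.devnull, 'rb') as dev_in, open(os.devnull, 'wb') as dev_out:
        subprocess.Popen([path],
                         stdin=dev_in,
                         stdout=dev_out,
                         stderr=dev_out,
                         preexec_fn=os.setsid)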

How to use batch file to run multiple python scripts simultaneously

I have many Python scripts, and it is a pain to run each one of them individually by clicking them. How do I make a batch file to run them all at once?
Just make a script like this, backgrounding each task. On Windows:
start /B python script1.py
start /B python script2.py
start /B python script3.py
On *nix:
python script1.py &
python script2.py &
python script3.py &
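If the wrapper should instead block until all of the scripts finish, POSIX shells accept a plain wait as the last line of the script above:

wait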
This assumes none of your scripts requires human interaction to run.
Use the start command to initiate a process.
@echo off
start "" foo.py
start "" bar.py
start "" baz.py
Re comment: “Is there a way to start these minimized?”
You can always ask about how a command works by typing the command name followed by a /?. In this case, start /? tells us its command-line options include:
MIN Start window minimized.
Hence, to start the application minimized, use:
start "" /MIN quux.py
Multiprocessing .py files simultaneously
Run as many .py files simultaneously as you want. Create a .bat file for each .py to start the Python file, and define all the .bat files in the list of lists parallel_tasks below. The second parameter in each inner list is a delay in seconds before that .bat file is started; don't use zero for the delay. It works fine. This way you leave the parallelism to the operating system, which is very fast and stable. Every .bat you start opens a command window where the user can interact with that task.
from apscheduler.schedulers.background import BackgroundScheduler
import datetime as dt
from os import system
from time import sleep

parallel_tasks = [["Drive:\YourPath\First.bat", 1], ["Drive:\YourPath\Second.bat", 3]]

def DatTijd():
    Nu = dt.datetime.now()
    return Nu

def GetStartTime(Nu, seconds):
    StartTime = (Nu + dt.timedelta(seconds=seconds)).strftime("%Y-%m-%d %H:%M:%S")
    return StartTime

len_li = len(parallel_tasks)
sleepTime = parallel_tasks[len_li - 1][1] + 3
Nu = DatTijd()
for x in range(0, len_li):
    parallel_tasks[x][0] = 'start cmd /C ' + parallel_tasks[x][0]
    # If you want the command window to stay open after the task finishes, use cmd /K in the line above
    delta = parallel_tasks[x][1]
    parallel_tasks[x][1] = GetStartTime(Nu, delta)

JobShedul = BackgroundScheduler()
JobShedul.start()
for x in range(0, len_li):
    JobShedul.add_job(system, 'date', run_date=parallel_tasks[x][1], misfire_grace_time=3, args=[parallel_tasks[x][0]])

sleep(sleepTime)
JobShedul.shutdown()
exit()
Example.bat
@echo off
Title Python is running [Your Python Name]
cls
echo "[Your Python Name] is starting up ..."
cd Drive:\YourPathToPythonFile
python YourPyFile.py

Rebooting a server after Windows application is installed using Python

I am developing some Python (version 3.6.1) code to install an application on Windows 7. The code used is this:
winCMD = r'"C:\PowerBuild\setup.exe" /v"/qr /l C:\PowerBuild\TUmsi.log"'
output = subprocess.check_call(winCMD, shell=True)
The application is installed successfully. The problem is that it always requires a reboot after it finishes (a popup with the message "You must restart your system for the configuration changes to take effect. Click Yes to restart now or No if you plan to restart later.").
I tried to insert the parameter /forcerestart (source here) into the installation command, but it still stops and requests the reboot:
def installApp():
    winCMD = r'"C:\PowerBuild\setup.exe" /v"/qr /forcerestart /l C:\PowerBuild\TUmsi.log"'
    output = subprocess.check_call(winCMD, shell=True)
Another attempt was to issue a follow-up command like the one below, although I realized it will never be called, since the previous command never returns (as per my understanding):
rebootSystem = 'shutdown -t 0 /r /f'
subprocess.Popen(rebootSystem, stdout=subprocess.PIPE, shell=True)
Has anyone had such an issue and managed to solve it?
As an ugly workaround, if you're not time-critical but you want to emphasise the "automatic" aspect, why not:
run the installCMD in a thread
wait sufficiently long to be sure that the command has completed
perform the shutdown
like this:
import threading, time
import subprocess

def installApp():
    winCMD = r'"C:\PowerBuild\setup.exe" /v"/qr /l C:\PowerBuild\TUmsi.log"'
    output = subprocess.check_call(winCMD, shell=True)

t = threading.Thread(target=installApp)
t.start()
time.sleep(1800)  # half an hour should be enough

rebootSystem = 'shutdown -t 0 /r /f'
subprocess.Popen(rebootSystem, stdout=subprocess.PIPE, shell=True)
Another (safer) way would be to find out which file is created last in the installation and monitor for its existence in a loop, like this:
import os, time

while not os.path.isfile("somefile"):
    time.sleep(60)
time.sleep(60)  # another minute for safety
# perform the reboot
To be clean, you'd have to use subprocess.Popen for the installation process, keep the Popen object as a global, and call terminate() on it in the main process; but since you're calling a shutdown anyway, that's not strictly necessary.
(To be really clean, we wouldn't have to do this hack in the first place.)
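A sketch of that cleaner variant, reusing the names from this answer (the sentinel file name is a placeholder):

import os
import time
import subprocess

winCMD = r'"C:\PowerBuild\setup.exe" /v"/qr /l C:\PowerBuild\TUmsi.log"'
installer = subprocess.Popen(winCMD, shell=True)  # keep a handle to the installer

while not os.path.isfile("somefile"):  # wait for the last file the installer writes
    time.sleep(60)
time.sleep(60)  # another minute for safety

installer.terminate()  # stop the installer if it is still waiting on the reboot popup
subprocess.Popen('shutdown -t 0 /r /f', shell=True)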
