('The first script' takes input from the user and 'the second script' delivers the notification.)
I have been trying to restart a Python script from another one, but I couldn't get it to work after trying a few methods. I developed a reminder app for Linux that notifies the user when a time previously set by the user arrives; it consists of two Python scripts. The first one takes the user's input to schedule a task, for example "Call the boss at 12:30 pm"; Linux will then show the notification at 12:30 pm. The second one checks the saved inputs and sends the notifications when the time comes.
In the first script, I am trying to restart the second script whenever the user enters a new task, because the second script needs to re-read the tasks in order to notify them. I also want the first script to terminate itself once it has launched the second one, while the second script keeps running. In the first script, I tried these commands:
os.system(f"pkill -f {path2}")
os.system(f"python {path2}")
Neither of these worked.
I also want to run the second script when the OS starts up.
Summary:
1- I want to restart a Python script from another one, and the first script should terminate once the second one is running.
2- I want to run the second script at OS startup.
The repository for my reminder app is here.
About 1:
Assuming the name of the other script is 2.py (changeable in the code below), this worked for me pretty well:
1.py:
import subprocess
import os
import time
OTHER_SCRIPT_NAME = "2.py"
process_outputs = subprocess.getoutput("ps aux | grep " + OTHER_SCRIPT_NAME) # Searching for the process running 2.py
process_lines = [line for line in process_outputs.split("\n") if "grep" not in line] # Dropping the grep process itself, which also matches
wanted_process_info = process_lines[0] # Getting the first matching line only
splitted_process_info = [x for x in wanted_process_info.split(" ") if x != ''] # Splitting the string and removing empty items
pid = splitted_process_info[1] # PID is the second column in the ps output
os.system("kill -9 " + str(pid)) # Killing the other process
exit()
time.sleep(1000) # Never reached, because exit() was called above
2.py:
import time
time.sleep(100)
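The ps aux | grep approach is fragile because the grep call (or an intermediate shell) can match its own command line. Below is a sketch of a sturdier variant using pgrep -f, the same full-command-line matching that the question's pkill -f attempt relied on. The script path is a placeholder, and a Linux system with pgrep installed is assumed:

```python
import os
import signal
import subprocess
import sys

def restart_script(script_path):
    """Kill any running instance of script_path, then start a fresh one.

    Passing the pattern as an argument list (no shell) keeps an intermediate
    shell's own command line from matching, a common pitfall of ps | grep.
    Assumes Linux with pgrep available.
    """
    out = subprocess.run(["pgrep", "-f", script_path],
                         capture_output=True, text=True).stdout
    for token in out.split():
        pid = int(token)
        if pid != os.getpid():          # never kill ourselves
            os.kill(pid, signal.SIGTERM)
    # The new instance keeps running even after this script exits
    return subprocess.Popen([sys.executable, script_path])
```

After calling restart_script("2.py"), the first script can simply exit; the child it just spawned is unaffected.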
About 2:
In Linux, you can execute scripts on startup by adding them to the /etc/rc.local file (where it exists; many newer systemd-based distributions no longer use it).
Just run your script from the rc.local file and you are good to go:
/etc/rc.local:
python '/path/to/your/scripts'
This question already has an answer here:
Blocking and Non Blocking subprocess calls
(1 answer)
Closed 12 months ago.
There are two Python scripts involved in this task.
My current task requires me to run a long process (about a day or two each; this is the first Python script) in each of the 29 available regions on GCP instances. To finish the task as quickly as possible, I'm trying to run all the processes at once, one per instance, after spinning up the 29 VMs at once.
Since manually running the first script by SSH-ing into each instance is cumbersome, I wrote a Python script (the second script) that SSHs into each region's VM and runs the first script mentioned above.
The issue with the second script is that it doesn't start the first script in the second region's VM until the run in the first region's VM has finished, whereas I need it to launch the first script in every region without waiting for any previously started run to end.
I use subprocess.run() in the second script to run the first script in each VM.
The following code is the second script:
for zone, instance in zipped_zone_instance:
    command = "gcloud compute ssh --zone " + zone + " " + instance + " --project cloud-000000 --command"
    command_lst = command.split(" ")
    command_lst.append("python3 /home/first_script.py")
    subprocess.run(command_lst)
I need the subprocess.run(command_lst) to run for every 29 zones at once rather than it running for the second zone only after the first zone's process ends.
The following code is the first script:
for idx, bucket in enumerate(bucket_lst):
    start = time.time()
    sync_src = '/home/' + 'benchmark-' + var_
    subprocess.run(['gsutil', '-m', '-o', 'GSUtil:parallel_composite_upload_threshold=40M', 'rsync', '-r', sync_src, bucket])
    end = time.time() - start
    time_lst.append(end)
    tput_lst.append(tf_record_disk_usage / end)
What can I fix in the second script or the first script to achieve what I want?
Switch out your subprocess.run(command_lst) with Popen(command_lst) in your second script (when you pass an argument list, leave out shell=True; the string commands below use shell=True instead), and collect the processes in a list as in the example below so they run in parallel.
This is how you implement Popen to run processes in parallel, using arbitrary commands for simplicity.
from subprocess import Popen
commands = ['ls -l', 'date', 'which python']
processes = [Popen(cmd, shell=True) for cmd in commands]
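Applied to the question's loop, the pattern might look like the sketch below: each Popen returns immediately, so all 29 SSH sessions start concurrently, and the wait() calls afterwards only block until the remote runs finish. The gcloud invocation in the comment is the one from the question:

```python
import subprocess

def run_in_parallel(command_lists):
    """Start every command at once with Popen, then wait for all of them."""
    procs = [subprocess.Popen(cmd) for cmd in command_lists]  # all start immediately
    return [proc.wait() for proc in procs]  # block until each one finishes

# For the question, each entry would be one gcloud invocation, e.g.:
# ["gcloud", "compute", "ssh", "--zone", zone, instance,
#  "--project", "cloud-000000", "--command", "python3 /home/first_script.py"]
```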
I've got a long-running Python script that I want to be able to end from another Python script. Ideally what I'm looking for is some way of attaching a process ID to the first script and being able to check, via that ID from the second script, whether it is running or not. Additionally, I'd like to be able to terminate that long-running process.
Any cool shortcuts exist to make this happen?
Also, I'm working in a Windows environment.
I just recently found an alternative answer here: Check to see if python script is running
You could get your own PID (Process Identifier) through
import os
os.getpid()
and to kill a process in Unix
import os, signal
os.kill(5383, signal.SIGKILL)
to kill in Windows use
import subprocess as s
def killProcess(pid):
    s.Popen('taskkill /F /PID {0}'.format(pid), shell=True)
You can send the PID to the other program, or you can search the process list for the name of the other script and kill it with the code above.
I hope that helps you.
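The Unix and Windows snippets above can be folded into one helper; this is a sketch, not part of the original answer:

```python
import os
import signal
import subprocess

def kill_process(pid):
    """Forcefully kill a process by PID on either Windows or Unix."""
    if os.name == "nt":
        # /F forces termination, matching the taskkill call above
        subprocess.call(["taskkill", "/F", "/PID", str(pid)])
    else:
        os.kill(pid, signal.SIGKILL)
```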
You're looking for the subprocess module.
import subprocess as sp
extProc = sp.Popen(['python', 'myPyScript.py'])  # runs myPyScript.py
status = extProc.poll()  # status should be 'None' while the process is running
extProc.terminate()  # closes the process
status = extProc.poll()  # status should now be something other than 'None' ('1' in my testing)
subprocess.Popen starts the external python script, equivalent to typing 'python myPyScript.py' in a console or terminal.
The status from subprocess.Popen.poll(extProc) will be 'None' if the process is still running, and (for me) 1 if it has been closed from within this script. Not sure about what the status is if it has been closed another way.
This worked for me under Windows 11 and PyQt5:
proc = subprocess.Popen('python3 MySecondApp.py')
proc.terminate()
where MyFirstApp.py is the caller script and MySecondApp.py is the called script: the caller keeps the Popen handle (proc) so it can terminate the child later.
I would like to have several scripts running on PythonAnywhere. In order to make sure that the scripts are not killed I would like to check for their status in an interval of five minutes (based on https://help.pythonanywhere.com/pages/LongRunningTasks/).
Two questions arise:
1. In the script which runs every five minutes I would like to check whether the other scripts (script2, script3) are still alive or not. If not, I would obviously like to run them. But how do I run several scripts from one script (script1) without script1 getting "stuck"? I.e. how do I start two scripts at the same time from one script?
If I just try to run the script using "import script2" I get an error
ImportError: No module named script2
How do I tell Python that the script is in a different folder (because that has to be the issue)?
Thanks in advance!
Try this:
import subprocess
import time
def check_process(proc, path):
    if proc.poll() is None:  # poll() returns None while the child is still running
        print('%s still running' % path)
        return proc
    else:  # otherwise poll() returns the exit code of the finished (or killed) child
        print('%s is dead. Re-running' % path)
        return subprocess.Popen(['python.exe', path])
script1 = subprocess.Popen(['python.exe', pathscript1])
script2 = subprocess.Popen(['python.exe', pathscript2])
while True:
    script1 = check_process(script1, pathscript1)
    script2 = check_process(script2, pathscript2)
    time.sleep(300)
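Regarding the ImportError in the question: the Popen calls above sidestep it, because they take a full file path rather than a module name. If you do want import script2 to work from another folder, that folder has to be on Python's module search path. A minimal sketch, with a hypothetical path:

```python
import sys

# Hypothetical folder that contains script2.py; substitute your real path
sys.path.append('/home/yourusername/scripts')

# A plain `import script2` will now search that folder as well
```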
I am developing some Python (version 3.6.1) code to install an application in Windows 7. The code used is this:
import subprocess
winCMD = r'"C:\PowerBuild\setup.exe" /v"/qr /l C:\PowerBuild\TUmsi.log"'
output = subprocess.check_call(winCMD, shell=True)
The application is installed successfully. The problem is that it always requires a reboot when it finishes (a popup with the message "You must restart your system for the configuration changes made to take effect. Click Yes to restart now or No if you plan to restart later.").
I tried to insert the parameter "/forcerestart" (source here) into the installation command, but it still stops to ask for the reboot:
def installApp():
    winCMD = r'"C:\PowerBuild\setup.exe" /v"/qr /forcerestart /l C:\PowerBuild\TUmsi.log"'
    output = subprocess.check_call(winCMD, shell=True)
Another attempt was to issue a follow-up command like the one below, but since the previous command never finishes (as per my understanding), I realized it would never be called:
rebootSystem = 'shutdown -t 0 /r /f'
subprocess.Popen(rebootSystem, stdout=subprocess.PIPE, shell=True)
Has anyone had such an issue and managed to solve it?
As an ugly workaround, if you're not time-critical but you want to emphasise the "automatic" aspect, why not
run the installCMD in a thread
wait sufficiently long to be sure that the command has completed
perform the shutdown
like this:
import threading, time
import subprocess
def installApp():
    winCMD = r'"C:\PowerBuild\setup.exe" /v"/qr /l C:\PowerBuild\TUmsi.log"'
    output = subprocess.check_call(winCMD, shell=True)
t = threading.Thread(target=installApp)
t.start()
time.sleep(1800)  # half an hour should be enough
rebootSystem = 'shutdown -t 0 /r /f'
subprocess.Popen(rebootSystem, stdout=subprocess.PIPE, shell=True)
Another (safer) way would be to find out which file is created last in the installation, and monitor for its existence in a loop like this:
while not os.path.isfile("somefile"):
    time.sleep(60)
time.sleep(60)  # another minute for safety
# perform the reboot
To be clean, you'd have to use subprocess.Popen for the installation process, keep its handle in a global variable, and call terminate() on it from the main process, but since you're calling a shutdown anyway, that's not necessary.
(to be clean, we wouldn't have to do that hack in the first place)
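On Python 3 the fixed sleep can be replaced with a timeout on wait(): subprocess.TimeoutExpired fires if the installer is still stuck (for example at the reboot prompt), and only then is it terminated. A sketch, with the question's installer and shutdown commands kept as comment placeholders:

```python
import subprocess

def run_installer_then_reboot(install_cmd, timeout=1800):
    """Run the installer; if it hangs past `timeout` seconds, kill it."""
    proc = subprocess.Popen(install_cmd, shell=True)
    try:
        proc.wait(timeout=timeout)  # returns as soon as the installer exits
    except subprocess.TimeoutExpired:
        proc.terminate()            # stuck (e.g. at the reboot prompt): kill it
        proc.wait()
    # then reboot, e.g. on Windows:
    # subprocess.Popen('shutdown -t 0 /r /f', shell=True)
    return proc.returncode

# For the question this would be:
# run_installer_then_reboot(r'"C:\PowerBuild\setup.exe" /v"/qr /l C:\PowerBuild\TUmsi.log"')
```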
I am attempting to launch a Python script from within another Python script, in a minimized console, and then return control to the original shell.
I am able to open the required script in a new console as shown below, but it's not minimized:
#!/usr/bin/env python
import os
import sys
import subprocess
pyTivoPath="c:\pyTivo\pyTivo.py"
print "Testing: Open New Console"
subprocess.Popen([sys.executable, pyTivoPath], creationflags = subprocess.CREATE_NEW_CONSOLE)
print
raw_input("Press Enter to continue...")
Further, I will need to be able to later remotely KILL this shell from the original script, so I suspect I'll need to be explicit in naming the new process. Correct?
Looking for pointers, please. Thanks!
Note: python27 is mandatory for this application. Eventually it will also need to work on Mac and Linux.
Do you need to have the other console open? If you know the commands to be sent, then I'd recommend using Popen.communicate(input="Shell commands") and it will automate the process for you.
So you could write something along the lines of:
import subprocess
# Commands to pass into the subprocess (each command is separated by a newline)
commands = (
    "command1\n" +
    "command2\n"
)
# Your process
py_process = subprocess.Popen(*yourprocess_here*, stdin=subprocess.PIPE, shell=True)
# Feed the process the needed input
py_process.communicate(input=commands)
# Terminate when finished
py_process.terminate()
The code above will execute the process you specify and even send commands but it won't open a new console.
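The minimized-console part of the question isn't covered above. On Windows it can be done with a STARTUPINFO structure; this is a sketch (the launch_minimized helper and the SW_MINIMIZE value 6 come from the Win32 ShowWindow API, not from the original post), written for Python 3 but the STARTUPINFO fields exist in Python 2.7's subprocess as well:

```python
import os
import subprocess
import sys

def launch_minimized(script_path):
    """Launch a Python script in a new console window, minimized (Windows only).

    On other platforms this falls back to a plain background process,
    since there is no console window to minimize.
    """
    if os.name == "nt":
        si = subprocess.STARTUPINFO()
        si.dwFlags |= subprocess.STARTF_USESHOWWINDOW
        si.wShowWindow = 6  # SW_MINIMIZE from the Win32 ShowWindow API
        return subprocess.Popen([sys.executable, script_path],
                                creationflags=subprocess.CREATE_NEW_CONSOLE,
                                startupinfo=si)
    return subprocess.Popen([sys.executable, script_path])
```

Keeping the returned Popen handle also answers the "remotely KILL this shell" part of the question: call terminate() on it later.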