I have a bash script that invokes a python script in a loop. It works fine (in-order execution) when I run it normally in the foreground, or if I run it in the background using '&' and don't close the terminal window. I am running this on a Mac.
My script looks like this:
for file in */List*.bin; do
    newFile="${file%.bin}.txt"
    ./Prog1 "$file" > "$newFile"
    wait
    python PyProg.py "$newFile" >> Report.txt
    wait
done
This is how I run the script:
(sudo ./Script > log.txt) &
But if I close the terminal and check with the 'top' command from a different terminal, it only shows 'Prog1' and not python.
I don't know whether the python script is not being run at all or the system is doing some out-of-order execution.
Either way, I cannot see any new lines being added to Report.txt.
I tried disown -h %1 after running this to detach the script from the terminal. But the result is the same.
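For reference, the pattern usually suggested for surviving a closed terminal is nohup plus output redirection rather than a plain '&'. A minimal sketch, with sleep standing in for ./Script so it is self-contained (the log name is taken from the question):

```shell
# nohup makes the job ignore the HUP signal sent when the terminal
# closes, and the redirection keeps its output after logout;
# "sleep 5" is a stand-in for ./Script here.
nohup sleep 5 > log.txt 2>&1 &
pid=$!
echo "started as PID $pid"
```

Unlike disown after the fact, nohup sets this up before the job starts, so nothing is lost if the terminal goes away early.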
I need to start a python program when the system boots. It must run in the background (forever) such that opening a terminal session and closing it does not affect the program.
I have demonstrated that by using tmux this can be done manually from a terminal session. Can the equivalent be done from a script that is run at bootup?
Then where does one put that script so that it will be run on bootup?
Create a startup script that runs on boot and launches the desired Python program in the background.
Here are the steps:
Create a shell script that launches the Python program in the background:
#!/bin/sh
python /path/to/your/python/program.py &
Make the shell script executable:
chmod +x /path/to/your/script.sh
Add the script to the startup applications:
On Ubuntu, this can be done by going to the Startup Applications program and adding the script.
On other systems, you may need to add the script to the appropriate startup folder, such as /etc/rc.d/ or /etc/init.d/.
After these steps, the Python program should start automatically on boot and run in the background.
It appears that in addition to putting a script that starts the program in /etc/init.d, one also has to put a link in /etc/rc2.d. From inside /etc/rc2.d:
sudo ln -s /etc/init.d/scriptname.sh
sudo mv scriptname.sh S01scriptname.sh
The S01 was just copied from all the other files in /etc/rc2.d
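Put together, the two commands above amount to creating the link under its S01 name in one step. A sketch in a throwaway directory standing in for /etc/init.d and /etc/rc2.d, so no root is needed (the scriptname is a placeholder):

```shell
# sandbox directories standing in for /etc/init.d and /etc/rc2.d
mkdir -p demo/init.d demo/rc2.d
printf '#!/bin/sh\necho started\n' > demo/init.d/scriptname.sh
chmod +x demo/init.d/scriptname.sh
# equivalent of the "ln -s" followed by "mv" inside rc2.d:
ln -s ../init.d/scriptname.sh demo/rc2.d/S01scriptname.sh
ls -l demo/rc2.d
```

On the real system this would be a single `sudo ln -s /etc/init.d/scriptname.sh /etc/rc2.d/S01scriptname.sh`.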
My question seems quite easy, but for some reason I did not find a quick answer to it. I have a python script that I want to run from the terminal command line (Ubuntu linux server), which works for me. But then I can't use the command line until the script ends. The script takes a long time to run, and I would like to continue using the command line to perform other tasks. How can I keep the script running, without its progress shown on the command line, while keeping its output? And how can I see the active processes that are running on the server, to check whether a process is running?
Run the script command:
python script.py
Then add & and echo "123" after it:
The script takes a long time to run, and I would like to continue
using the command line to perform other tasks.
It seems that you want to run said process in the background; please try the following:
python script.py &
echo "123"
It should start your script.py and then output 123 (without waiting for script.py to end).
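To see that the shell really does not wait, try the same thing with sleep standing in for a long-running script.py:

```shell
sleep 2 &      # long-running job started in the background
echo "123"     # printed immediately, before sleep finishes
wait           # optional: block here until the background job completes
```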
how can I see the active processes that are running on the server to
see if a process is running?
Using the ps command:
ps -ef
will list all processes, which you will probably want to filter to find the one you are interested in.
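For example, to check for one specific process rather than reading the whole listing, pipe ps into grep (the [s] bracket trick keeps grep from matching its own entry; sleep stands in for your script here):

```shell
sleep 3 &                   # stand-in for the long-running python script
ps -ef | grep "[s]leep"     # lists matching processes, but not the grep itself
```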
I want to run a simple python script 100 times in parallel using bash. If I run the python scripts serially everything is fine, but if I run them in parallel I get the error
stdin is not a tty
Presumably because the same python file gets opened multiple times?
Here's the bash file
#!/bin/bash
for i in {1..5}; do
    winpty python test.py &
done
If I remove the & sign everything works fine (serial), but not if I run it in parallel. The Python file is literally just 1+1.
PS: I run python with winpty python and not the usual python because I run it from the git-bash console and that thing has issues... But again, I don't think this is where the issue comes from, because running everything serially works fine...
I don't have winpty to test the following script, but try:
#!/bin/bash
for i in {1..5}; do
    winpty python test.py < /dev/tty > /dev/tty 2>&1 &
done
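If winpty turns out to be the culprit, another common workaround is to detach stdin entirely instead of wiring it to a tty. A sketch, assuming plain python works from your console (the per-job log files are my own addition so the output doesn't interleave):

```shell
#!/bin/bash
for i in {1..5}; do
    # /dev/null as stdin avoids the "not a tty" complaint;
    # each job writes to its own log file
    python test.py < /dev/null > "out_$i.log" 2>&1 &
done
wait    # block until all five background jobs have finished
```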
I have a debian instance in the compute engine, and I run a python script on it.
At the end of the script I have the following code to append to a txt file.
with open("Optimisation.txt", "a+") as myfile:
    myfile.write(str(Results))
    myfile.write("\n")
I run my python script with python MyScript.py &
When I leave the ssh console open, it correctly appends to the file, but when I close the console, the script continues working yet the file is not appended anymore.
When I reopen the console and enter sudo -s & ps -fux I can see the script correctly running.
On my computer, and on debian while the console is open, it works, but not after I close the console.
python MyScript.py & would run the process in the background using a subshell. If the current shell is terminated (say by logout), all subshells are also terminated, so the background process would be terminated too. The nohup command makes the process ignore the HUP signal, and thus even if the current shell is terminated, the background process continues to run, with its output going to the log file.
Ideally, you should invoke your script with the command below:
nohup python MyScript.py > MyOutput.log 2>&1 &
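To double-check from a later SSH session that the job survived, it helps to record the PID at launch. A self-contained sketch (MyScript.pid is my own name, and sleep stands in for the python script):

```shell
nohup sleep 5 > MyOutput.log 2>&1 &    # stand-in for: nohup python MyScript.py ...
echo $! > MyScript.pid                 # remember the PID for later
# later, from a fresh session:
ps -p "$(cat MyScript.pid)" > /dev/null && echo "still running"
```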
I have ten python scripts in the same directory. How do I run all of these from the command line so that they work in the background?
I use SSH terminal to connect to server CentOS and run Python script as:
python index.py
But when I close the SSH client terminal, the process dies.
You can use & to make things run in the background, and nohup so they continue after logout, such as:
nohup python index.py &
If you want to run multiple things this way, it's probably easiest to just make a script to start them all (with a shell of your choice):
#!/bin/bash
nohup python index1.py &
nohup python index2.py &
...
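If the ten scripts follow a common naming pattern, the same launcher can be written as a loop. A self-contained sketch using a sandbox directory with two stand-in scripts (the jobs directory, the file names, and python3 are my assumptions):

```shell
#!/bin/bash
# sandbox with two stand-in scripts so the sketch runs anywhere
mkdir -p jobs
printf 'print("ok")\n' > jobs/index1.py
printf 'print("ok")\n' > jobs/index2.py
for f in jobs/*.py; do
    nohup python3 "$f" > "${f%.py}.log" 2>&1 &   # one log file per script
done
wait    # only for the demo; omit it if the jobs should outlive the launcher
```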
As long as you don't need to interact with the scripts once they are started (and don't need any stdout printing) this could be pretty easily automated with another python script using the subprocess module:
import subprocess

for script in listofscripts:
    # use subprocess.run() for Python 3.x (this blocks until each script terminates)
    subprocess.call(["python", script], *args)  # use Popen if you want non-blocking
*args is a link (its coloring got overwritten by the code highlighting).
Also of note: stdout/stderr printing is possible, just more work.