I want to run a simple Python script 100 times in parallel using bash. If I run the Python scripts serially everything is fine, but if I run them in parallel I get the error
stdin is not a tty
Presumably because the same python file gets opened multiple times?
Here's the bash file
#!/bin/bash
for i in {1..5}; do
    winpty python test.py &
done
If I remove the & sign everything works fine (serial), but not if I run them in parallel. The Python file is literally just 1+1.
PS: I run Python with winpty python and not the usual python because I run it from the Git Bash console and that thing has issues... But again, I don't think this is where the issue comes from, because running everything serially works fine...
I don't have winpty to test the following script, but try:
#!/bin/bash
for i in {1..5}; do
    winpty python test.py < /dev/tty > /dev/tty 2>&1 &
done
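If the calling script should also block until all five jobs have finished, bash's built-in wait can be appended after the loop (my addition, not part of the suggestion above):

#!/bin/bash
for i in {1..5}; do
    winpty python test.py < /dev/tty > /dev/tty 2>&1 &
done
wait   # blocks until every background job has exited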
I have a bash script that invokes a Python script in a loop. It is working fine (in-order execution) when I run it normally in the foreground, or if I run it in the background using '&' and don't close the terminal window. I am running this on a Mac.
My script looks like below:
for file in */List*.bin; do
    newFile=${file%.bin}.txt
    ./Prog1 "$file" > "$newFile"
    wait
    python PyProg.py "$newFile" >> Report.txt
    wait
done
This is how I run the script:
(sudo ./Script > log.txt) &
But if I close the terminal and check with the 'top' command from a different terminal, it only shows 'Prog1' and not python.
I don't know if the Python script is not being run at all or if the system does some out-of-order execution.
Anyway, I cannot see any new lines being added to Report.txt.
I tried disown -h %1 after running this to detach the script from the terminal. But the result is the same.
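What usually fixes this is starting the script fully detached from the terminal, so the kernel does not deliver SIGHUP to it and its children when the window closes. A minimal sketch (the nohup approach also appears in an answer further down; the exact redirections are my assumption, and the filenames are the poster's own):

# start the script immune to hangups, with stdin detached and all
# output (stdout and stderr) captured in log.txt
sudo nohup ./Script > log.txt 2>&1 < /dev/null &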
I have this small piece of Bash script which writes a character to a serial device 12 times, once every 100 milliseconds (it is run immediately after a reboot of the device). When I try to run this, it works fine from bash. If I try to do the same from Python, with the same logic, it does nothing. I also tried to simply call this bash script from Python, and it still does not work. Here is my bash script:
for i in {1..12}; do
    echo 's' > /dev/ttyUSB2
    sleep 0.1
done
And the python code I tried to run is:
import os
import time

# reboot here
for i in range(1, 12):  # note: range(1, 12) iterates 11 times, not 12
    os.popen("echo s > /dev/ttyUSB2")
    time.sleep(0.1)
Also I tried to just invoke the bash script from python
os.popen("source gotomode.sh")
where gotomode.sh is the bash script.
What could I possibly be doing wrong?
Is the timing different when using the Python calls os.popen, subprocess.Popen, popen2, etc.?
Thanks in advance
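One possible culprit, offered as an assumption rather than a confirmed diagnosis: source is a shell builtin, and os.popen() runs its command through /bin/sh, which on many systems does not provide source. Making the script executable and invoking it by path sidesteps that:

# one-time setup: mark the script executable (gotomode.sh is the
# poster's own filename; make sure it starts with #!/bin/bash),
# then invoke it directly by path
chmod +x gotomode.sh
./gotomode.sh

From Python that becomes os.popen("./gotomode.sh").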
I have 2 scripts that I need to run at the same time. One script collects data and the other plots the data live.
On a PC, I can simply open two IDLE shells and they will run concurrently, but on a Mac that isn't possible.
I wrote the following bash file as suggested in this post (Run multiple python scripts concurrently):
python script1.py &
python script2.py &
But this only runs my scripts one at a time. Is there any way on a Mac that I can get both scripts running at the same time?
You can do it all from within python by using subprocess.Popen()
import subprocess
import sys

# start both scripts with the current interpreter; Popen returns immediately
s1 = subprocess.Popen([sys.executable, 'script1.py'])
s2 = subprocess.Popen([sys.executable, 'script2.py'])

# block until both have finished
s1.wait()
s2.wait()
For my purposes, I was able to find a workaround that's slightly more tedious. I have 2 separate bash scripts now, each containing one of the lines from the above script I initially posted. Running both the bash scripts will run both my scripts simultaneously in different shells.
As a side note, does anybody know how I can do a similar thing, where I use a single bash script to call both of the new bash scripts?
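A minimal sketch of such a master script, assuming the two wrapper scripts are named run1.sh and run2.sh (hypothetical names; substitute your own):

#!/bin/bash
# launch both wrapper scripts in the background so they run concurrently
./run1.sh &
./run2.sh &
# optionally block until both have finished
wait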
That's not true; on OS X (Mac) it works as expected.
script1.py
#! /usr/bin/env python
import time
time.sleep(1)
print "script1.py"
script2.py
#! /usr/bin/env python
print "script2.py"
run
set executable permission and run in shell
./script1.py &
./script2.py &
and the output will be
script2.py
script1.py
proving that both were run concurrently (as the output from the second script is displayed first)
I have ten Python scripts in the same directory. How can I run all of them from the command line so that they keep working in the background?
I use an SSH terminal to connect to a CentOS server and run a Python script as:
python index.py
But when I close the SSH client terminal, the process dies.
You can use & to make things run in the background, and nohup so they continue after logout, such as:
nohup python index.py &
If you want to run multiple things this way, it's probably easiest to just make a script to start them all (with a shell of your choice):
#!/bin/bash
nohup python index1.py &
nohup python index2.py &
...
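By default nohup appends output to nohup.out; a variant of the script above (my assumption, not part of the original answer) keeps each script's output in its own log file:

#!/bin/bash
nohup python index1.py > index1.log 2>&1 &
nohup python index2.py > index2.log 2>&1 &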
As long as you don't need to interact with the scripts once they are started (and don't need any stdout printing), this could be pretty easily automated with another Python script using the subprocess module:
for script in listofscripts:
    # use subprocess.run() for Python 3.x (this blocks until each script terminates)
    subprocess.call(["python", script], *args)  # use Popen if you want non-blocking
(*args is a link; its coloring got overwritten by the code highlighting.)
Also of note: stdout/stderr printing is possible, just more work.
I can't seem to figure out how to get this bash script working.
#!/bin/bash
export WORKON_HOME=~/.envs
source /usr/local/bin/virtualenvwrapper.sh
workon staging_env
It is using virtualenv and virtualenvwrapper in order to use a Python virtual environment.
Typing these commands in the shell works perfectly fine, but running them as a bash script does not.
Any ideas?
When you run a script, it creates its own instance of the shell (bash, in this case). Because of this, the changes are lost when the script ends and the script's shell is closed.
To make the changes stick, you'll have to source the script instead of running it.
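For example (the script filename is a placeholder):

# run the script in the current shell so the environment changes persist
source ./myscript.sh
# or, equivalently, with the POSIX dot builtin:
. ./myscript.sh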