I have a small Bash script that writes a character to a serial device 12 times, once every 100 milliseconds (it is run immediately after the device reboots). When I run it from bash, it works fine. When I try the same logic from Python, it does nothing. I also tried simply calling the bash script from Python, and that does not work either. Here is my bash script:
for i in {1..12}
do echo 's' > /dev/ttyUSB2
sleep 0.1
done
And the python code I tried to run is:
import os
import time

# reboot here
for i in range(12):
    os.popen("echo s > /dev/ttyUSB2")
    time.sleep(0.1)
I also tried just invoking the bash script from Python:
os.popen("source gotomode.sh")
where gotomode.sh is the bash script.
What could I possibly be doing wrong?
Is the timing different when using Python's os.popen, subprocess.Popen, popen2, etc.?
Thanks in advance
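One thing worth checking: spawning a shell for every write (as os.popen does) adds overhead and hides errors. A minimal sketch that writes to the device directly from Python instead, with the device path and timing taken from the question (send_chars is a hypothetical helper name, and the sketch assumes the device is already configured, e.g. via stty):

```python
import time

def send_chars(dev_path, char="s", count=12, delay=0.1):
    """Write `char` to the device `count` times, `delay` seconds apart."""
    with open(dev_path, "w") as tty:
        for _ in range(count):
            tty.write(char + "\n")
            tty.flush()          # push the byte out immediately
            time.sleep(delay)

# send_chars("/dev/ttyUSB2")     # usage on the real device
```

This also makes any permission or device error show up as a Python exception instead of silently disappearing inside a spawned shell.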
My question seems simple, but for some reason I couldn't find a quick answer. I have a Python script that I run from the terminal command line (Ubuntu Linux server), and it works for me. But then I can't use the command line until the script ends. The script takes a long time to run, and I would like to keep using the command line for other tasks while it works. How can I run the script so that it keeps working without occupying the command line? And how can I see the active processes running on the server, to check whether a given process is running?
I run the script with:
python script.py
Then I want to add the next command, e.g. & and echo "123":
The script takes a long time to run, and I would like to continue
using the command line to perform other tasks.
It seems that you want to run said process in the background; please try the following:
python script.py &
echo "123"
It should start your script.py and then print 123 immediately (without waiting for script.py to end).
how can I see the active processes that are running on the server to
see if a process is running?
Using the ps command:
ps -ef
will list all processes; you will probably want to filter the output to find the one you are interested in.
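For example, a common idiom for the filtering step is to pipe ps into grep with one character of the pattern in brackets, which keeps the grep process itself out of the results. A sketch, using a throwaway background job as a stand-in for script.py:

```shell
# start a long-running job in the background, keeping its PID
python3 -c "import time; time.sleep(30)" &   # stand-in for: python script.py &
pid=$!

# list processes and filter; the [t] bracket trick stops the
# grep command itself from matching its own pattern
ps -ef | grep "[t]ime.sleep"

kill "$pid"                                  # clean up the demo job
```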
I want to run a simple Python script 100 times in parallel using bash. If I run the Python scripts serially everything is fine, but if I run them in parallel I get the error
stdin is not a tty
Presumably because the same Python file gets opened multiple times?
Here's the bash file
#!/bin/bash
for i in {1..5}; do
winpty python test.py &
done
If I remove the & sign everything works fine (serially), but not when I run it in parallel. The Python file is literally just 1+1.
PS: I run Python with winpty python rather than plain python because I run it from the git-bash console, and that thing has issues... But again, I don't think that's where the problem comes from, because running everything serially works fine...
I don't have winpty to test the following script, but try:
#!/bin/bash
for i in {1..5}; do
winpty python test.py < /dev/tty > /dev/tty 2>&1 &
done
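Outside of git-bash/winpty, the plain-bash version of this pattern is worth knowing: background each run with &, then wait so the script doesn't exit while jobs are still running. A sketch with an inline stand-in command in place of python test.py:

```shell
#!/bin/bash
# run five copies in parallel, then block until all have finished
for i in {1..5}; do
    python3 -c "print(1 + 1)" &   # stand-in for: winpty python test.py
done
wait   # without this, the script could exit while jobs still run
```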
I am running Python 2 code which is triggered by a dial plan. In order to process a saved recording I need to run a Python 3 script. Is there any way to do this? If I switch the code to Python 3, it does not work.
This is the extension
same=>n,AGI(code.py)
In code.py, with the header
#!/usr/bin/env python2
I am able to run the function
import sys

def run_cmd(cmd):
    # This runs the general command
    sys.stderr.write(cmd)
    sys.stderr.flush()
    sys.stdout.write(cmd)
    sys.stdout.flush()
    result = sys.stdin.readline().strip()
    checkresult(result)  # checkresult is defined elsewhere in the script
which is able to process various AGI commands, but after switching the header to #!/usr/bin/env python3 the code won't run.
Now I need to use Google Cloud Engine to process something that's written in Python 3.
Is there a way to make it run?
I have tried:
def run_sys_command(command):
    result = subprocess.call(command, shell=True)
    checkresult(result)

command = "sudo python3 /root/Downloads/check2.py"
run_sys_command(command)
Is there any way to run the Python 3 script, or to run a Python 3 script directly with AGI?
I have checked permissions and everything.
Sure, you can run threads inside AGI, but they must finish before the AGI script ends.
The simplest way to do what you want is to set up some kind of queue (RabbitMQ, or a simple task list in MySQL?) and process it outside the Asterisk process scope.
There is no problem with running Python 3 as an AGI script; I have plenty of such scripts in my projects. Just check your code.
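If the AGI script itself stays on Python 2, one workable pattern is to have it call the Python 3 helper as a subprocess and capture its exit status and output, rather than shelling out with shell=True and discarding the result. A sketch (run_py3 is a hypothetical helper name; the check2.py path is the one from the question):

```python
import subprocess

def run_py3(script_path, *args):
    """Run a Python 3 script as a subprocess; return (exit code, stdout)."""
    proc = subprocess.Popen(
        ["python3", script_path] + list(args),
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
    )
    out, _err = proc.communicate()
    return proc.returncode, out

# rc, out = run_py3("/root/Downloads/check2.py")
```

Passing an argument list instead of a shell string also avoids quoting problems and works the same under both Python 2 and Python 3.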
I have ten Python scripts in the same directory. How can I run all of them from the command line so that they work in the background?
I use SSH terminal to connect to server CentOS and run Python script as:
python index.py
But when I close the SSH client terminal, the process dies.
You can use & to make things run in the background, and nohup so they continue after logout, such as:
nohup python index.py &
If you want to run multiple things this way, it's probably easiest to just make a script to start them all (with a shell of your choice):
#!/bin/bash
nohup python index1.py &
nohup python index2.py &
...
As long as you don't need to interact with the scripts once they are started (and don't need any stdout printing), this can be automated fairly easily with another Python script using the subprocess module:
import subprocess

for script in list_of_scripts:
    # subprocess.call() (or subprocess.run() on Python 3.x) blocks until
    # each script terminates; use subprocess.Popen for non-blocking launches
    subprocess.call(["python", script])
Also of note: stdout/stderr printing is possible, just more work.
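If you do want the non-blocking variant from Python, a sketch using subprocess.Popen (launch_all is a hypothetical helper name): start every script first, then collect the exit codes afterwards, so the scripts run in parallel rather than one after another.

```python
import subprocess

def launch_all(scripts):
    """Start every script without waiting, then collect all exit codes."""
    procs = [subprocess.Popen(["python3", s]) for s in scripts]
    return [p.wait() for p in procs]

# launch_all(["index1.py", "index2.py"])   # script names from the answer above
```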
I just set up my first AWS server for a personal project I'm doing. I'm running Ubuntu Linux, and I have a Python script that accesses an SQLite database file in order to send email. I have these same files on my own Ubuntu machine, where the script works fine. I'm having trouble, however, figuring out how to run the script from the terminal in my AWS VM. I normally use IDLE to run my Python scripts on my Linux machine, so running it from the terminal is giving me some trouble.
I tried
python script.py
which did nothing, so I made it executable, ran it, and got
./script.py: line 1: import: command not found
...and so on
I realized that I had to add
#!/usr/bin/env python3
to my script, so I did that, made it executable again, and ran it by entering
./script.py
which also did nothing. If the program had run, it would have delivered an email to my inbox. Is there any way to tell whether it's actually trying to run my script? Or am I running it incorrectly?
You can modify the script to add verbose output that prints its status to the console. Or, if you just want to know whether your script is running in the background, you can check whether the process is active using ps (the process name will be the name of the script):
ps aux | grep "script.py"
Anyway, the former is better practice, since then you know exactly where the script is in its execution.