I have an .sh file that produces a picture on a Raspberry Pi, and inside this file I have the following:
Config.sh:
#!/bin/bash
suffix=$(date +%H%M%S)
cd /home/pi/photobooth_images/
sudo cp image1.jpg /usb/photobooth_images/image-${suffix}-1.jpg
sudo convert -size 1800x1200 xc:white \
image1.jpg -geometry 1536x1152+240+24 -composite \
/home/pi/template/logo.png -geometry 192x1152+24+24 -composite \
PB_${suffix}.jpg
sudo cp PB_${suffix}.jpg /usb/photobooth_montage/PB_${suffix}.jpg
sudo rm /home/pi/photobooth_images/*
returnvalue=PB_${suffix}.jpg
echo "$returnvalue"
What I am trying to do here is get the PB_${suffix}.jpg "returnvalue" value (the file name) it generated into Python. My Python program has this line, which runs the .sh file above.
Main.py:
return_value = subprocess.call("sudo ./" + config.sh, shell=True)
print "The Value is: " + str(return_value) + " This value from Python"
The output I get is this
[08:33:02 04-10-2016] [PHOTO] Assembling pictures according to 1a template.
PB_083302.jpg
The Value is: 0 This value from Python
The output I expected should be something like "PB_070638.jpg"
Any help is greatly appreciated.
That is because subprocess.call only returns the return code of executing the script (see the documentation). You want the actual output of your script, so you should use check_output, and avoid using shell=True:
subprocess.check_output(["sudo", "./config.sh"])
You might also want to reconsider running your script with root privileges via sudo. It does not seem like something that should need to run as root.
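As a minimal sketch of the check_output approach (here echo stands in for the actual ./config.sh invocation, so the snippet is self-contained):

```python
import subprocess

# check_output returns the command's stdout as bytes; decode and strip
# the trailing newline to get the bare filename. "echo" stands in for
# the real "./config.sh" here.
output = subprocess.check_output(["echo", "PB_083302.jpg"])
filename = output.decode().strip()
print("The Value is: " + filename)
```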
Try using the Popen constructor with the stdout argument:
subprocess.Popen(["sudo", "./config.sh"], stdout=subprocess.PIPE)
See also:
Get return value from shell command in python
Also, here's more info on Popen in the Python docs.
https://docs.python.org/2/library/subprocess.html#popen-constructor
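A minimal sketch of the Popen approach (echo stands in for the real "sudo ./config.sh" command so the snippet runs on its own):

```python
import subprocess

# stdout=subprocess.PIPE captures the script's output; communicate()
# waits for the process to finish and returns (stdout, stderr).
proc = subprocess.Popen(["echo", "PB_083302.jpg"], stdout=subprocess.PIPE)
stdout, _ = proc.communicate()
print(stdout.decode().strip())
```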
I have a python script (not created by me), let's call it myscript, which I call with several parameters.
So I run the script like this in Windows cmd:
Code:
/wherever/myscript --username=whoever /some/other/path/parameter
And then a prompt appears and I can pass arguments to the python script:
Process started successfully, blabla
Python 2.7.2 blabla
(LoggingConsole)
>>>
And I write my stuff, then quit to be back into cmd:
>>> command1()
>>> command2()
>>> quit()
I suspect some errors occur in this part, but only about once in a hundred trials. So I want to do it with a script.
I want to pipe to this script the internal command1 command2, so that I can test this function thousand times and see when it breaks. I have the following piece of code:
echo 'command1()' | py -i /wherever/myscript --username=whoever /some/other/path/parameter
Unfortunately, this doesn't produce the same behaviour as if the commands were entered manually.
Can I simulate this behaviour with pipes/redirecting output? Why doesn't it work? I expect that the 'command1()' text will be entered when the script waits for the commands, but it seems I'm wrong.
Thanks!
EDIT 16/02/2021 3:33PM :
I was looking for the cmd shell way to solve this, no python stuff
The piece of script
echo 'command1()' | py -i /wherever/myscript --username=whoever /some/other/path/parameter
is almost correct, just remove the '' :
echo command1() | py -i /wherever/myscript --username=whoever /some/other/path/parameter
my issues were coming from myscript. Once I fixed the weird things on that side, this part was all ok. You can even put all the commands together:
echo command1();command2();quit(); | py -i /wherever/myscript --username=whoever /some/other/path/parameter
This question is adapted from a question asked by gplayersv on 23/08/2012 on unix.com, but the original purpose left the question unanswered.
Pipes are easy to handle.
If you want to read the standard input:
import sys
data = sys.stdin.read()
print(f'the standard input was\n{data}')
sys.stderr.write('This is an error message that will not be captured when piping stdout\n')
If you want to use the standard input as argument:
echo param | xargs myprogram.py
Python's built-in fileinput module makes this simple and concise:
#!/usr/bin/env python3
import fileinput
with fileinput.input() as f:
    for line in f:
        print(line, end='')
Then you can accept input through whichever mechanism is easiest for you:
$ ls | ./filein.py
$ ./filein.py /etc/passwd
$ ./filein.py <(uname -r)
I have this shell command:
$ docker run -it --env-file=.env -e "CONFIG=$(cat /path/to/your/config.json | jq -r tostring)" algolia/docsearch-scraper
And I want to run it as a python subprocess.
I thought I'd only need an equivalent of jq -r tostring, but if I use config.json as a normal string the " characters don't get escaped. I also tried escaping them by using json.load(config.json).
With the original jq command the " characters don't get escaped either; it just returns the JSON string.
When I use the returned JSON as a string in the Python subprocess, I always get a FileNotFoundError on the subprocess line.
@main.command()
def algolia_scrape():
    with open(f"{WORKING_DIR}/conf_dev.json") as conf:
        CONFIG = json.load(conf)
        subprocess.Popen(f'/usr/local/bin/docker -it --env-file={WORKING_DIR}/algolia.env -e "CONFIG={json.dumps(CONFIG)}" algolia/docsearch-scraper')
You get "file not found" because (without shell=True) you are trying to run a command whose name is /usr/local/bin/docker -it ... when you want to run /usr/local/bin/docker with some arguments. And of course it would be pretty much a nightmare to try to pass the JSON through the shell because you need to escape any shell metacharacters from the string; but just break up the command string into a list of strings, like the shell would.
def algolia_scrape():
    with open(f"{WORKING_DIR}/conf_dev.json") as conf:
        CONFIG = json.load(conf)
        p = subprocess.Popen(['/usr/local/bin/docker', 'run', '-it',
                              f'--env-file={WORKING_DIR}/algolia.env',
                              '-e', f'CONFIG={json.dumps(CONFIG)}',
                              'algolia/docsearch-scraper'])
You generally want to save the result of subprocess.Popen() because you will need to wait for the process when it terminates.
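For example, a sketch of waiting on the saved Popen object (the harmless true command stands in for the docker invocation here):

```python
import subprocess

# wait() blocks until the child exits and returns its exit status;
# "true" stands in for the docker command so this runs anywhere.
p = subprocess.Popen(["true"])
status = p.wait()
print("exit status:", status)
```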
I have some code here trying to open up the cmd.exe from Python and input some lines for the command to use.
Here it is:
PDF= "myPDF"
output= "my output TIF"
def my_subprocess(command, c='C:\\here'):
    process = subprocess.Popen(command, stdout=subprocess.PIPE, shell=True, cwd=c)
    communicate = process.communicate()[0].strip()
my_subprocess('"cmd.exe" && "C:\\here\\myinfamous.bat" && "C:\\my directory and lines telling cmd to do stuff"'+ PDF + " " + output)
When run with the rest of my script, the command prompt does not even open up, and there seems to be no output or errors at all. My guess is that it has not even run the cmd.exe command, so none of this code runs to create the final output.
Is there something I am not doing properly?
Thank you.
You need to replace subprocess.Popen with subprocess.call
Here is working code on Windows 8 that opens a text file using Notepad. The first field is the command itself and the second field is the argument.
You can modify these and test with your files.
import subprocess
subprocess.call(['C:\\Windows\\System32\\Notepad.exe', 'C:\\openThisfile.txt'])
I need to call matlab with a defined MATLABPATH from a python script and I try to do this with the following code (snippet) in python:
addMatlabPath = os.path.join(<validPath>,'src') + ":" + \
os.path.join(<someOtherValidPaht>,'src') + ":"
matlabPathCommand = "export MATLABPATH="+addMatlabPath+"$MATLABPATH"
commandLine = matlabPathCommand+" && echo $MATLABPATH && "+"/Applications/MATLAB_R2015a.app/bin/matlab -nodisplay -nosplash -r \"my_matlab_script\"".format(os.path.realpath(os.path.dirname(__file__)),output_dir)
I try to execute the commandLine through subprocess:
process = subprocess.check_call(commandLine, stdout=out_buffer, stderr=subprocess.STDOUT,shell=True)
with which I can call matlab perfectly without the matlabPathCommand and the echo part in front.
The paths I use for the variable addMatlabPath are valid. I tested the command matlabPathCommand + " && echo $MATLABPATH" and this works correctly.
So, both parts of the command work individually as expected, but not together. Python seems to hang in the check_call command and doesn't return even after many times the duration a call to matlab normally takes.
Does anyone have a hint where my error could be?
The python code snippet above is correct. There was a problem in the matlab code which I didn't recognise while testing. This question and answer can be deleted.
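For reference, an alternative to building an "export MATLABPATH=... &&" shell prefix is to pass a modified environment through subprocess's env argument. In this sketch the path is a placeholder and printenv stands in for the matlab invocation:

```python
import os
import subprocess

# Copy the current environment and prepend an extra entry to
# MATLABPATH, then hand the whole mapping to the child process.
env = dict(os.environ)
env["MATLABPATH"] = "/some/valid/src:" + env.get("MATLABPATH", "")
rc = subprocess.check_call(["printenv", "MATLABPATH"], env=env)
```

This avoids quoting issues entirely, since no shell is involved in building the variable.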
I'm trying to implement my own version of the 'cd' command that presents the user with a list of hard-coded directories to choose from, and the user has to enter a number corresponding to an entry in the list. The program, named my_cd.py for now, should then effectively 'cd' the user to the chosen directory. Example of how this should work:
/some/directory
$ my_cd.py
1) ~
2) /bin/
3) /usr
Enter menu selection, or q to quit: 2
/bin
$
Currently, I'm trying to 'cd' using os.chdir('dir'). However, this doesn't work, probably because my_cd.py is kicked off in its own child process. I tried wrapping the call to my_cd.py in a sourced bash script named my_cd.sh:
#! /bin/bash
function my_cd() {
/path/to/my_cd.py
}
/some/directory
$ . my_cd.sh
$ my_cd
... shows list of dirs, but doesn't 'cd' in the interactive shell
Any ideas on how I can get this to work? Is it possible to change my interactive shell's current directory from a python script?
Change your sourced bash code to:
#! /bin/bash
function my_cd() {
cd `/path/to/my_cd.py`
}
and your Python code to do all of its cosmetic output (messages to the users, menus, etc) on sys.stderr, and, at the end, instead of os.chdir, just print (to sys.stdout) the path to which the directory should be changed.
my_cd.py:
#!/usr/bin/env python
import sys
dirs = ['/usr/bin', '/bin', '~']
for n, dir in enumerate(dirs):
    sys.stderr.write('%d) %s\n' % (n+1, dir))
sys.stderr.write('Choice: ')
n = int(raw_input())
print dirs[n-1]
Usage:
nosklo:/tmp$ alias mcd="cd \$(/path/to/my_cd.py)"
nosklo:/tmp$ mcd
1) /usr/bin
2) /bin
3) ~
Choice: 1
nosklo:/usr/bin$
This can't be done. Changes to the working directory are not visible to parent processes. At best you could have the Python script print the directory to change to, then have the sourced script actually change to that directory.
For what its worth, since this question is also tagged "bash", here is a simple bash-only solution:
$ cat select_cd
#!/bin/bash
PS3="Number: "
dir_choices="/home/klittle /local_home/oracle"
select CHOICE in $dir_choices; do
break
done
[[ "$CHOICE" != "" ]] && eval 'cd '$CHOICE
Now, this script must be source'd, not executed:
$ pwd
/home/klittle/bin
$ source select_cd
1) /home/klittle
2) /local_home/oracle
Number: 2
$ pwd
/local_home/oracle
So,
$ alias mycd='source /home/klittle/bin/select_cd'
$ mycd
1) /home/klittle
2) /local_home/oracle
Number:
To solve your case, you could have the command the user runs be an alias that sources a bash script, which does the dir selection first, then dives into a python program after the cd has been done.
Contrary to what was said, you can do this by replacing the process image, twice.
In bash, replace your my_cd function with:
function my_cd() {
exec /path/to/my_cd.py "$BASH" "$0"
}
Then your python script has to finish with:
os.execl(sys.argv[1], sys.argv[2])
Remember to import os, sys at the beginning of the script.
But note that this is borderline hack. Your shell dies, replacing itself with the python script, running in the same process. The python script makes changes to the environment and replaces itself with the shell, back again, still in the same process. This means that if you have some other local unsaved and unexported data or environment in the previous shell session, it will not persist to the new one. It also means that rc and profile scripts will run again (not usually a problem).
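To see the round trip non-interactively, this sketch performs the chdir and then execs a shell that prints its working directory; it runs the steps in a child interpreter so the current process is not itself replaced:

```python
import subprocess
import sys

# The child process chdirs, then replaces itself with /bin/sh, which
# inherits the new working directory -- the same mechanism the
# my_cd function above relies on.
child_code = "import os; os.chdir('/'); os.execl('/bin/sh', 'sh', '-c', 'pwd')"
out = subprocess.check_output([sys.executable, '-c', child_code])
print(out.decode().strip())
```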