Passing Python variable as argument to shell script - python

I have Python code in which I need to run a shell script by passing arguments. Below is my code:
#!/usr/bin/env python3
import subprocess
import os
import sys
filename=sys.argv[1]
subprocess.call(['bash','./shell.sh',filename])
print("Successfully completed")
Here I am trying to take an input filename as an argument while running ./file.py (filename). But when I try to pass that on to the shell script, I am not able to execute the program.
Also I tried the code below
#!/usr/bin/env python3
import subprocess
import shlex
import sys
filename=sys.argv[1]
subprocess.call(shlex.split('./shell2.sh filename'))
For the above code I am getting error like this:
awk: fatal: cannot open file `filename' for reading (No such file or directory)
My script code:
file=$1
awk '{
if($0 ~ /LogType/){
if(hold ~ /LogType:stderr/){
print hold;
}
hold=$0
}else{
if($0 ~ /ERROR/ || $0 ~/WARN/){hold=hold "\n" $0 ": line " NR}else{hold=hold "\n" $0}
}
}END{
if(hold ~ /LogType:stderr/){
print hold
}
}' "$file" | sed -n -e 's/^.*LogType:\(stderr\)$/\1/p' \
-e 's/^.*Log Upload Time :\(.*\)/\1/p' \
-e 's/^.*LogLength:\(.*\)$/\1/p' \
-e 's/^.*\(ERROR\|WARN\).*$/\0/p'
How should I pass an input value to a shell script? Kindly help me solve this issue. Thanks a lot!
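The first, list-based call is already the right pattern; the second attempt fails because the word filename inside the string is passed literally instead of the variable's value, which is why awk complains it cannot open a file called `filename'. A minimal sketch of the fix (assuming shell2.sh is executable and takes the file as its first argument):
#!/usr/bin/env python3
import shlex
import subprocess
import sys

filename = sys.argv[1]
# Substitute the *value* of filename into the command string, quoted in case it contains spaces
subprocess.call(shlex.split('./shell2.sh ' + shlex.quote(filename)))
# or skip the string round-trip entirely and pass the argument list directly:
subprocess.call(['bash', './shell.sh', filename])
print("Successfully completed")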

Related

Can't assign bash variable in python subprocess

I am trying to assign to a variable the fingerprint of a pgp key in a bash subprocess of a python script.
Here's a snippet:
import subprocess
subprocess.run(
'''
export KEYFINGERPRINT="$(gpg --with-colons --fingerprint --list-secret-keys | sed -n 's/^fpr:::::::::\([[:alnum:]]\+\):/\1/p')"
echo "KEY FINGERPRINT IS: ${KEYFINGERPRINT}"
''',
shell=True, check=True,
executable='/bin/bash')
The code runs but echo shows an empty variable:
KEY FINGERPRINT IS:
and if I try to use that variable for other commands I get the following error:
gpg: key "" not found: Not found
HOWEVER, if I run the same exact two lines of bash code in a bash script, everything works perfectly, and the variable is correctly assigned.
What is my python script missing?
Thank you all in advance.
The problem is the backslashes in your sed command. When you paste those into a regular Python string, Python treats the backslash sequences as escape sequences. To fix this, simply add an r in front of your string to make it a raw string:
import subprocess
subprocess.run(
r'''
export KEYFINGERPRINT="$(gpg --with-colons --fingerprint --list-secret-keys | sed -n 's/^fpr:::::::::\([[:alnum:]]\+\):/\1/p')"
echo "KEY FINGERPRINT IS: ${KEYFINGERPRINT}"
''',
shell=True, check=True,
executable='/bin/bash')
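A quick way to see what the raw string changes (a small illustration, not part of the original answer):
# In a normal string, \1 is an octal escape and becomes the control character chr(1);
# in a raw string it stays as the two characters backslash and 1, which is what sed needs.
print(len('\1'), repr('\1'))     # 1 '\x01'
print(len(r'\1'), repr(r'\1'))   # 2 '\\1'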
In order to run the two commands in a single subprocess call, you can join them with ; (note the raw string here as well):
import subprocess
ret = subprocess.run(r'''export KEYFINGERPRINT="$(gpg --with-colons --fingerprint --list-secret-keys | sed -n 's/^fpr:::::::::\([[:alnum:]]\+\):/\1/p')"; echo "KEY FINGERPRINT IS: ${KEYFINGERPRINT}"''', capture_output=True, shell=True)
print(ret.stdout.decode())
You can also use Popen:
import subprocess

commands = r'''
export KEYFINGERPRINT="$(gpg --with-colons --fingerprint --list-secret-keys | sed -n 's/^fpr:::::::::\([[:alnum:]]\+\):/\1/p')"
echo "KEY FINGERPRINT IS: ${KEYFINGERPRINT}"
'''
# text=True lets us pass the commands as a str and read the output back as a str
process = subprocess.Popen('/bin/bash', stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
out, err = process.communicate(commands)
print(out)

Get a string in Shell/Python using sys.argv

I'm beginning with bash and I'm executing a script:
$ ./readtext.sh ./InputFiles/applications.txt
Here is my readtext.sh code :
#!/bin/bash
filename="$1"
counter=1
while IFS=: true; do
line=''
read -r line
if [ -z "$line" ]; then
break
fi
echo "$line"
python3 ./download.py \
-c ./credentials.json \
--blobs \
"$line"
done < "$filename"
I want to print the string ("./InputFiles/applications.txt") in a Python file. I used sys.argv[1], but that gives me -c. How can I get this string? Thank you.
It is easier for you to pass the parameter "$1" on to the inner python3 command as well; a sketch of that is shown below.
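For example (only a sketch; appending "$filename" as an extra trailing argument is an assumption, not part of the asker's real download.py interface):
# In readtext.sh the python3 call would gain one extra argument (hypothetical):
#   python3 ./download.py -c ./credentials.json --blobs "$line" "$filename"
import sys

# With the call above, the last element of sys.argv is the file name that was
# passed to readtext.sh as "$1"
input_file = sys.argv[-1]
print(input_file)        # ./InputFiles/applications.txt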
If you don't want to do that, you can still get the outer script's command line parameters with the /proc trick, for example:
$ cat parent.sh
#!/usr/bin/bash
python3 child.py
$ cat child.py
import os
ext = os.popen("cat /proc/" + str(os.getppid()) + "/cmdline").read()
print(ext.split('\0')[2:-1])
$ ./parent.sh aaaa bbbb
['aaaa', 'bbbb']
Note:
The shebang line in parent.sh is important (or you should run ./parent.sh with bash explicitly); otherwise you will get no command line parameters in ps or /proc/$PPID/cmdline.
The reason for [2:-1]: ext.split('\0') is ['bash', './parent.sh', 'aaaa', 'bbbb', ''], so the real parameters of ./parent.sh begin at index 2 and end before the trailing empty string at -1.
Update: Thanks to the comment from @cdarke that "/proc is not portable", here is another way of getting the command line, though I am not sure it is actually more portable:
$ cat child.py
import os
ext = os.popen("ps " + str(os.getppid()) + " | awk ' { out = \"\"; for(i = 6; i <= NF; i++) out = out$i\" \" } END { print out } ' ").read()
print(ext.split(" ")[1 : -1])
which still has the same output.
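The same /proc trick can also be done from Python directly, without shelling out to cat (a sketch, Linux-only like the original):
import os

# /proc/<ppid>/cmdline holds the parent's argv as NUL-separated fields (Linux only)
with open("/proc/%d/cmdline" % os.getppid(), "rb") as f:
    fields = f.read().split(b"\0")
# fields is [b'bash', b'./parent.sh', b'aaaa', b'bbbb', b''], so the real
# parameters again start at index 2 and end before the trailing empty entry
print([arg.decode() for arg in fields[2:-1]])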
This is the Python file that you can use in your case:
import sys
file_name = sys.argv[1]
with open(file_name,"r") as f:
    data = f.read().split("\n")
print("\n".join(data))
See also: How to use sys.argv, and how to use the join method inside your Python code.

Why does this valid shell command throw an error in python via subprocess?

The line awk -F'[][]' '/dB/ { print $2 }' <(amixer sget Master) in bash returns my system's current volume (e.g. "97%").
I tried to incorporate this in Python 3
#!/usr/bin/env python3
import subprocess
command = "awk -F'[][]' '/dB/ { print $2 }' <(amixer sget Master)"
output = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE).stdout.read()
print(output)
However the output from the shell returns
/bin/sh: 1: Syntax error: "(" unexpected
b''
Why does this fail and how do I fix my code?
As already pointed out, the syntax you are using is bash syntax (a.k.a. a bashism). The default shell used by subprocess.Popen is /bin/sh, and it does not support process substitution.
You can specify the shell to be used via executable argument.
Try this:
output = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE, executable="/bin/bash").stdout.read()
Because you are using a bashism in the form of a process substitution, and your /bin/sh doesn't support it:
<(...)
Changing this to a pipe should solve your problem:
command = "amixer sget Master | awk -F'[][]' '/dB/ { print $2 }'"
Alternative you can start bash from within sh:
command = "bash -c 'amixer sget Master | awk -F'\\''[][]'\\'' '\\''/dB/ { print $2 }'\\'''"
But as you will soon realize, quoting and escaping quickly become a nightmare.
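An alternative that sidesteps the quoting problem entirely is to run amixer without any shell and reproduce the awk field-splitting in Python. This is only a sketch, not from the answers above, and it assumes the usual amixer output where the dB line contains bracketed fields:
import re
import subprocess

# No shell involved: run amixer directly and capture its output as text
out = subprocess.run(['amixer', 'sget', 'Master'],
                     capture_output=True, text=True, check=True).stdout
for line in out.splitlines():
    if 'dB' in line:
        # Mimic awk -F'[][]' '{ print $2 }': split on [ and ] and take the second field
        print(re.split(r'[][]', line)[1])    # e.g. 97%
        break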

return value from python script to shell script

I am new to Python. I am creating a Python script that returns the string "hello world", and a shell script that calls the Python script.
I need to pass arguments from the shell to Python.
I need to print the value returned from Python in the shell script.
This is my code:
shellscript1.sh
#!/bin/bash
# script for testing
clear
echo "............script started............"
sleep 1
python python/pythonScript1.py
exit
pythonScript1.py
#!/usr/bin/python
import sys
print "Starting python script!"
try:
sys.exit('helloWorld1')
except:
sys.exit('helloWorld2')
You can't return a message as the exit code, only numbers. In bash the exit code is accessible via $?. You can also use sys.argv to access the command line parameters:
import sys
if sys.argv[1]=='hi':
    print 'Salaam'
    sys.exit(0)
in shell:
#!/bin/bash
# script for testing
clear
echo "............script started............"
sleep 1
result=`python python/pythonScript1.py "hi"`
if [ "$result" == "Salaam" ]; then
echo "script return correct response"
fi
Pass the shell script's command line arguments on to Python like this:
python script.py $1 $2 $3
Print the return code like this:
echo $?
You can also use exit() without sys; one less thing to import. Here's an example:
$ python
>>> exit(1)
$ echo $?
1
$ python
>>> exit(0)
$ echo $?
0

Passing arguments to an embedded python script in bash

I'm having difficulties passing arguments to a Python script embedded in a bash script.
#!/bin/bash
function my_function() {
MYPARSER="$1" python - <<END
<<Some Python Code>>
class MyParser(OptionParser):
def format_epilog(self, formatter):
return self.epilog
parser=MyParser(version=VER, usage=USAGE, epilog=DESC)
parser.add_option("-s", "--Startdir", dest="StartDir",
metavar="StartDir"
)
parser.add_option("-r", "--report", dest="ReportDir",
metavar="ReportDir"
)
<<More Python Code>>
END
}
foo="-s /mnt/folder -r /storagefolder/"
my_function "$foo"
I've read Steve's Blog: Embedding python in bash scripts, which helped, but I'm still unable to pass the argument. I've tried both parser and myparser as environment variables.
Is it as simple as defining $2 and passing them individually?
Thanks
You're overcomplicating this rather a lot. Why mess with a parser when you can do this:
value="hello" python -c 'import os; print os.environ["value"]'
Or, for a longer script:
value="hello" python <<'EOF'
import os
print os.environ["value"]
EOF
If you need to set sys.argv for compatibility with existing code:
python - first second <<<'import sys; print sys.argv'
Thus:
args=( -s /mnt/folder -r /storagefolder/ )
python - "${args[#]}" <<'EOF'
import sys
print sys.argv # this is what an OptionParser will be looking at by default.
EOF
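Applied back to the original function, the same idea would look roughly like this (a sketch assuming Python 3; the option names come from the question, the rest is filled in):
#!/bin/bash
function my_function() {
    python3 - "$@" <<'END'
from optparse import OptionParser

parser = OptionParser()
parser.add_option("-s", "--Startdir", dest="StartDir", metavar="StartDir")
parser.add_option("-r", "--report", dest="ReportDir", metavar="ReportDir")
options, args = parser.parse_args()      # parses sys.argv[1:], i.e. the "$@" above
print(options.StartDir, options.ReportDir)
END
}

my_function -s /mnt/folder -r /storagefolder/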

Categories

Resources