I'm going nuts trying to figure out how to correctly pass arguments from a shell script to python when backticks are involved.
This is my ./foo.sh
#!/bin/bash
EXEC_SCRIPT="./foo.py -c $1"
result=`${EXEC_SCRIPT}`
echo ${result}
This is my ./foo.py
#!/usr/bin/python
import argparse
ap = argparse.ArgumentParser()
ap.add_argument('-c', required=True)
args,u = ap.parse_known_args()
args = vars(args)
print ("I GOT {}".format(args['c']))
I do:
./foo.sh ABC
and I get :
I GOT ABC
Perfect.
But then I do:
./foo.sh "Hello World"
And I get:
I GOT Hello
Trying to change the bash script to:
EXEC_SCRIPT="./foo.py -c \"$1\""
Produces:
I GOT "Hello
None of this is an issue if I don't use backticks. Escaped quotes work great.
What am I missing?
What I really want is for the Python script to get my argument, whether it's one word or two, without quotes.
Further clarification: Thanks to Gordon and AK47 for making the same suggestion. It looks like the root cause is that I am stuffing the command into a variable called EXEC_SCRIPT; invoking the command directly inside the backticks works. My real script is more complex, and EXEC_SCRIPT points to different values under different conditions. What's a good way to write clean code that figures out the right command and then invokes it at the end? Using a variable, as I did, seems logical, but it apparently doesn't work.
I have @that other guy to thank, and @Gordon Davisson for the comment clarifying the suggestion.
Modifying foo.sh to execute an array instead of a string like so:
#!/bin/bash
EXEC_SCRIPT=(./foo.py -c "$1")
result=$("${EXEC_SCRIPT[@]}")
echo ${result}
Works perfectly well!! Since the real script chooses between different commands, each conditional branch can simply assign a different array, and the final "${EXEC_SCRIPT[@]}" expansion keeps every argument (including the multi-word one) intact.
foo.sh
#!/bin/bash
result=`./foo.py -c "$1"`
echo ${result}
foo.py
#!/usr/bin/python
import argparse
ap = argparse.ArgumentParser()
ap.add_argument('-c', required=True)
args,u = ap.parse_known_args()
args = vars(args)
print ("I GOT {}".format(args['c']))
Testing:
MacBook-Pro:~ ak$ sh foo.sh "Hello World"
I GOT Hello World
Related
I have written two Python scripts, A.py and B.py. B.py gets called from A.py like this:
config_object = {}
with open(config_filename) as data:
    config_object = json.load(data, object_pairs_hook=OrderedDict)
command = './scripts/B.py --config-file={} --token-a={} --token-b={}'.format(promote_config_filename, config_object['username'], config_object['password'])
os.system(command)
Here config_object['password'] contains an & in it. Say it is something like S01S0lQb1T3&BRn2^Qt3.
Now when this value gets passed to B.py, it receives the password as S01S0lQb1T3, so everything after the & is being ignored.
How to solve this?
os.system runs a shell. You can escape arbitrary strings for the shell with shlex.quote() (a sketch of that route follows below), but a much superior solution is to use subprocess instead, as the os.system documentation also recommends.
import subprocess

subprocess.run(
    ['./scripts/B.py',
     '--config-file={}'.format(promote_config_filename),
     '--token-a={}'.format(config_object['username']),
     '--token-b={}'.format(config_object['password'])])
Because there is no shell=True, the strings are now passed to the subprocess verbatim.
Perhaps see also Actual meaning of shell=True in subprocess
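If you really must keep os.system, here is a minimal sketch of the shlex.quote() route mentioned above (promote_config_filename and config_object are carried over from the question, so treat them as assumptions):

import os
import shlex

# promote_config_filename and config_object come from the question's surrounding code
# shlex.quote() wraps each value in safe shell quoting, so characters like & and ^
# reach B.py literally instead of being interpreted by the shell
command = './scripts/B.py --config-file={} --token-a={} --token-b={}'.format(
    shlex.quote(promote_config_filename),
    shlex.quote(config_object['username']),
    shlex.quote(config_object['password']))
os.system(command)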
@tripleee has good suggestions. In terms of why this is happening: on Linux/Unix at least, the & starts a background process. You can search "linux job control" for more info on that. The shortest (but not best) solution is to wrap the value containing special characters in single or double quotes in the final command.
See this bash session for a simple example:
$ echo foo&bar
[1] 20054
foo
Command 'bar' not found, but can be installed with:
sudo apt install bar
[1]+ Done echo foo
$ echo "foo&bar"
foo&bar
I have a python script (not created by me), let's call it myscript, which I call with several parameters.
So I run the script like this in Windows cmd:
Code:
/wherever/myscript --username=whoever /some/other/path/parameter
And then a prompt appears and I can pass arguments to the python script:
Process started successfully, blabla
Python 2.7.2 blabla
(LoggingConsole)
>>>
And I write my stuff, then quit to be back into cmd:
>>> command1()
>>> command2()
>>> quit()
I suspect some errors occur in this part, but only about once in a hundred trials, so I want to do it with a script.
I want to pipe the internal command1 and command2 to this script, so that I can test this function a thousand times and see when it breaks. I have the following piece of code:
echo 'command1()' | py -i /wherever/myscript --username=whoever /some/other/path/parameter
Unfortunately this doesn't produce the same behaviour as entering the commands manually.
Can I simulate this behaviour with pipes/redirection? Why doesn't it work? I expected the 'command1()' text to be entered when the script waits for commands, but it seems I'm wrong.
Thanks!
EDIT 16/02/2021 3:33PM :
I was looking for the cmd shell way to solve this, no python stuff
The piece of script
echo 'command1()' | py -i /wherever/myscript --username=whoever /some/other/path/parameter
is almost correct; just remove the quotes:
echo command1() | py -i /wherever/myscript --username=whoever /some/other/path/parameter
My remaining issues were coming from myscript itself. Once I fixed the odd things on that side, this part worked fine. You can even put all the commands together:
echo command1();command2();quit(); | py -i /wherever/myscript --username=whoever /some/other/path/parameter
This question is adapted from a question by gplayersv posted on 23/08/2012 on unix.com, but its original framing left it unanswered.
Pipes are easy to handle.
If you want to read the standard input:
import sys

text = sys.stdin.read()
print(f'the standard input was\n{text}')
sys.stderr.write('This is an error message that will be ignored by piping\n')
If you want to use the standard input as an argument:
echo param | xargs myprogram.py
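For completeness, a minimal sketch of a hypothetical myprogram.py that accepts the value either as an argument (which is how xargs supplies it) or, when no argument is given, from standard input:

import sys

# prefer the first command line argument (e.g. provided by xargs);
# otherwise fall back to reading standard input
value = sys.argv[1] if len(sys.argv) > 1 else sys.stdin.read().strip()
print(f'got: {value}')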
Python's built-in fileinput module makes this simple and concise:
#!/usr/bin/env python3
import fileinput
with fileinput.input() as f:
    for line in f:
        print(line, end='')
Then you can accept input through whatever mechanism is easier for you:
$ ls | ./filein.py
$ ./filein.py /etc/passwd
$ ./filein.py <<< "$(uname -r)"
I have a python script that looks like this:
a='{}'
From my cmdline i want to be able to pass a parameter for that value in the brackets. So:
python my_script.py Hello World
The output would then be:
'Hello World'
Any ideas or suggestions on how to pass a parameter to a Python script on the command line when running it? I am new to Python, so any tips would help!
I'm also fairly new to Python, so I'm not 100% positive this works, but assuming it does, here's how to recreate what you're searching for.
#!/usr/bin/python
import sys
print ('Number of arguments:', len(sys.argv), 'arguments.')
print ('Argument List:', str(sys.argv))
After adding import sys, given your command python my_script.py 'Hello World':
Note that:
my_script.py equals sys.argv[0]
'Hello World' equals sys.argv[1]
Therefore setting a = sys.argv[1] should work!
To test it, try print(a).
Also note that this will not work if your cmdline is
python my_script.py Hello World
This will treat Hello and World as two separate arguments.
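If you do want the unquoted form to work, one hedged option (not part of the original answer) is to join everything after the script name back into a single string:

import sys

# join all words on the command line into one string, so quoting isn't required
a = ' '.join(sys.argv[1:])
print(a)   # python my_script.py Hello World  ->  Hello World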
I'm writing a CLI tool using argparse.
Below is my code snippet:
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('--username')
parser.add_argument('--password')
args = parser.parse_args()
print(args.password)
I run the script on my Mac:
>>> prog.py --username xyz --password abc$xyz
Output:
>>> abc
I know it's the bash shell that interprets $xyz as a variable and tries to substitute it.
Is there are way I can get around this without enclosing the password in quotes?
You can put a backslash in front of the $: prog.py --username xyz --password abc\$xyz.
Otherwise, no. You correctly observed that it is bash that is doing this so, by the time Python/argparse receives the command, there is nothing that can be done.
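A minimal diagnostic sketch (hypothetical file name show_args.py) that makes this visible by printing exactly what the shell handed to Python, before argparse is even involved:

#!/usr/bin/python
import sys

# hypothetical show_args.py, run as: ./show_args.py --password abc$xyz
# with xyz unset, bash expands $xyz to an empty string, so Python only ever sees 'abc'
print(sys.argv)   # ['./show_args.py', '--password', 'abc']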
I'm using similar approach to call python function from my shell script:
python -c 'import foo; print foo.hello()'
But I don't know how in this case I can pass arguments to python script and also is it possible to call function with parameters from command line?
python -c 'import foo, sys; print foo.hello(); print(sys.argv[1])' "This is a test"
or
echo "Wham" | python -c 'print(raw_input(""));'
There's also argparse (py3 link) that could be used to capture arguments; note that with -c, the string '-c' itself is what shows up as sys.argv[0].
A second library, getopt, also exists but is discouraged.
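For example, a minimal sketch of a hypothetical foo.py whose function takes a parameter, so the -c one-liner can forward a command-line argument to it:

# foo.py - hypothetical module for the one-liner:
#   python -c 'import foo, sys; print(foo.hello(sys.argv[1]))' "This is a test"
def hello(name):
    return "Hello, {}".format(name)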
You don't want to do that in a shell script.
Try this. Create a file named "hello.py" and put the following code in the file (assuming you are on unix system):
#!/usr/bin/env python
print "Hello World"
and in your shell script, write something like this
#!/bin/sh
python hello.py
and you should see Hello World in the terminal.
That's how you should invoke a script in shell/bash.
To the main question: how do you pass arguments?
Take this simple example:
#!/usr/bin/env python
import sys

def hello(name):
    print("Hello, " + name)

if __name__ == "__main__":
    if len(sys.argv) > 1:
        hello(sys.argv[1])
    else:
        raise SystemExit("usage: python hello.py <name>")
We expect the length of sys.argv to be at least two. As in shell programming, the first element (index 0) is always the script's file name.
Now modify the shell script to pass along the second argument (the name) and see what happens.
I haven't tested my code yet, but conceptually that's how you should go about it.
edit:
If you just have a line or two of simple Python code, sure, -c works fine and is neat. But if you need more complex logic, please put the code into a module (a .py file).
You need to create a .py file.
Then you call it this way:
python file.py argv1 argv2
Then, inside your file, sys.argv gives you the list of arguments.
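A minimal sketch of such a file.py, just to illustrate what sys.argv contains:

# file.py
import sys

# sys.argv[0] is the script name; the remaining entries are the arguments
print(sys.argv)   # python file.py argv1 argv2  ->  ['file.py', 'argv1', 'argv2']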