I have written two Python scripts, A.py and B.py. B.py gets called from A.py like this:
config_object = {}
with open(config_filename) as data:
    config_object = json.load(data, object_pairs_hook=OrderedDict)
command = './scripts/B.py --config-file={} --token-a={} --token-b={}'.format(promote_config_filename, config_object['username'], config_object['password'])
os.system(command)
Here config_object['password'] contains an & in it. Say it is something like S01S0lQb1T3&BRn2^Qt3.
Now when this value gets passed to B.py, it receives the password as S01S0lQb1T3, so everything after the & gets ignored.
How can I solve this?
os.system runs a shell. You can escape arbitrary strings for the shell with shlex.quote(), but a much better solution is to use subprocess instead, as the os.system documentation itself recommends.
subprocess.run(
    ['./scripts/B.py',
     '--config-file={}'.format(promote_config_filename),
     '--token-a={}'.format(config_object['username']),
     '--token-b={}'.format(config_object['password'])])
Because there is no shell=True, the strings are now passed to the subprocess verbatim.
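If os.system() has to stay for some reason, a minimal sketch of the shlex.quote() route mentioned above might look like this, reusing the question's variables (an illustration, not the preferred fix):
import os
import shlex

# quote each interpolated value so characters like & and ^ survive the shell
command = './scripts/B.py --config-file={} --token-a={} --token-b={}'.format(
    shlex.quote(promote_config_filename),
    shlex.quote(config_object['username']),
    shlex.quote(config_object['password']))
os.system(command)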
Perhaps see also Actual meaning of shell=True in subprocess
@tripleee has good suggestions. In terms of why this is happening: if you are running Linux/Unix, at least, the & starts a background process. You can search "linux job control" for more info on that. The shortest (but not best) solution is to wrap your special characters in single or double quotes in the final command (see the sketch after the example below).
See this bash session for a simple example:
$ echo foo&bar
[1] 20054
foo
Command 'bar' not found, but can be installed with:
sudo apt install bar
[1]+ Done echo foo
$ echo "foo&bar"
foo&bar
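Applied to the question's code, that quick fix is just a matter of single-quoting the interpolated values in the format string. A rough sketch; it still breaks if a value itself contains a single quote, which is why the subprocess approach above is preferable:
# quick-and-dirty: wrap the values in single quotes so & is not treated as job control
command = "./scripts/B.py --config-file={} --token-a='{}' --token-b='{}'".format(
    promote_config_filename, config_object['username'], config_object['password'])
os.system(command)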
I know similar questions have been asked before, but they all seem to have been resolved by reworking how arguments are passed (i.e. using a list, etc).
However, I have a problem here in that I don't have that option. There is a particular command line program (I am using a Bash shell) to which I must pass a quoted string. It cannot be unquoted, it cannot have a replicated argument, it just has to be either single or double quoted.
command -flag 'foo foo1'
I cannot use command -flag foo foo1, nor can I use command -flag foo -flag foo1. I believe this is an oversight in how the command was programmed to receive input, but I have no control over it.
I am passing arguments as follows:
self.commands = [
    self.path,
    '-flag1', quoted_argument,
    '-flag2', 'test',
    ...etc...
]
process = subprocess.Popen(self.commands, stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
results = process.communicate(input)
Where quoted_argument is something like 'foo foo1 foo2'.
I have tried escaping the single quote ("\'foo foo1 foo2\'"), but I get no output.
I know this is considered bad practice because it is ambiguous to interpret, but I don't have another option. Any ideas?
The shell breaks command strings into lists. The quotes tell the shell to put multiple words into a single list item. Since you are building the list yourself, you add the words as a single item without the quotes.
These two Popen commands are equivalent:
Popen("command -flag 'foo foo1'", shell=True)
Popen(["command", "-flag", "foo foo1"])
EDIT
This answer deals with escaping characters in the shell. If you don't use the shell, you don't add any quotes or escapes; just pass in the string itself. There are other issues with skipping the shell, like piping commands, running background jobs, using shell variables, and so on. These can all be done in Python instead of the shell.
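A sketch of how this applies to the question's setup (here path and input stand in for the question's self.path and communicate input): the multi-word value goes in as a single list item, with no quotes and no escaping.
import subprocess

quoted_argument = 'foo foo1 foo2'   # a single list item, no surrounding quotes
commands = [path, '-flag1', quoted_argument, '-flag2', 'test']
process = subprocess.Popen(commands, stdin=subprocess.PIPE,
                           stdout=subprocess.PIPE, stderr=subprocess.PIPE)
results = process.communicate(input)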
A mental model of processes and shells that I found very helpful:
This mental model has helped me a lot through the years.
Processes in your operating system receive an array of strings representing the arguments. In Python, this array can be accessed from sys.argv. In C, this is the argv array passed to the main function. And so on.
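For example, a two-line script (show_args.py is a hypothetical name) makes that argument array visible:
# show_args.py -- running `python show_args.py one 'two three'` in a shell
# prints ['show_args.py', 'one', 'two three']; the quoted words arrive as one argument
import sys
print(sys.argv)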
When you open a terminal, you are running a shell inside that terminal, for example bash or zsh. What happens if you run a command like this one?
$ /usr/bin/touch one two
What happens is that the shell interprets the command that you wrote and splits it by whitespace to create the array ["/usr/bin/touch", "one", "two"]. It then launches a new process using that list of arguments, in this case creating two files named one and two.
What if you wanted one file named one two, with a space? You can't pass the shell a list of arguments as you might like; you can only pass it a string. Shells like Bash and Zsh use single quotes to work around this:
$ /usr/bin/touch 'one two'
The shell will create a new process with the arguments ["/usr/bin/touch", "one two"], which in this case creates a file named one two.
Shells have special features like piping. With a shell, you can do something like this:
$ /usr/bin/echo 'This is an example' | /usr/bin/tr a-z A-Z
THIS IS AN EXAMPLE
In this case, the shell interprets the | character differently. It creates a process with the arguments ["/usr/bin/echo", "This is an example"] and another process with the arguments ["/usr/bin/tr", "a-z", "A-Z"], and pipes the output of the former to the input of the latter.
How this applies to subprocess in Python
Now, in Python, you can use subprocess with shell=False (which is the default) or with shell=True. If you use the default behaviour, shell=False, then subprocess expects you to pass it a list of arguments. You cannot use special shell features like piping. On the plus side, you don't have to worry about escaping special characters for the shell:
import subprocess
# create a file named "one two"
subprocess.call(["/usr/bin/touch", "one two"])
If you do want to use shell features, you can do something like:
subprocess.call(
    "/usr/bin/echo 'This is an example' | /usr/bin/tr a-z A-Z",
    shell=True,
)
If you are using variables with no particular guarantees, remember to escape the command:
import shlex
import subprocess
subprocess.call(
    "/usr/bin/echo " + shlex.quote(variable) + " | /usr/bin/tr a-z A-Z",
    shell=True,
)
(Note that shlex.quote is only designed for UNIX shells, and not for DOS on Windows.)
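For reference, a quick look at what shlex.quote() actually produces (CPython 3 behaviour; safe strings pass through unchanged, everything else gets single-quoted):
import shlex

print(shlex.quote("plain"))      # plain
print(shlex.quote("one two"))    # 'one two'
print(shlex.quote("it's"))       # 'it'"'"'s'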
I need to execute the following in bash, but called from a Python script:
<command> --searchBody="{\"query\":{\"range\":{\"#timestamp\":{\"gte\": \"2020-10-16T00:00:00\",\"lte\": \"2020-10-17T00:00:00\"}}}}"
The double quotes must be escaped except for the ones that bound the --searchBody section
I have the following code in Python
execution = cmd + ' --searchBody="{\\"query\\":{\\"range\\":{\\"#timestamp\\":{\\"gte\\": \\"'+startQuery+'\\",\\"lte\\": \\"'+endQuery+'\\"}}}}"'
print(execution)
os.system(execution)
cmd is the rest of the command, already pre-populated; startQuery and endQuery are date strings.
The print statement prints the command exactly as it needs to be, but when os.system is run, all the backslashes are removed from what is sent to the CLI.
I have tried all manner of escaping with multiple quotes but cannot get it to work. Any ideas?
Thanks
It's better to use the subprocess module.
Try something like:
arg = '--searchBody="{\\"query\\":{\\"range\\":{\\"#timestamp\\":{\\"gte\\": \\"'+startQuery+'\\",\\"lte\\": \\"'+endQuery+'\\"}}}}"'
cmd = 'foobar'
subprocess.call([cmd, arg])
You should carefully read the info about the shell parameter and the shlex module:
https://docs.python.org/3/library/subprocess.html#security-considerations
Using os.system() is like "I don't care about security, and all this crap, just run it"
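If the shell really is needed (for example because cmd contains pipes), a hedged sketch combining shlex.quote() with the question's variables could look like this; the target program then receives plain JSON after --searchBody=, with no manual backslash-escaping:
import shlex
import subprocess

body = ('{"query":{"range":{"#timestamp":{"gte": "' + startQuery +
        '","lte": "' + endQuery + '"}}}}')
subprocess.call(cmd + ' --searchBody=' + shlex.quote(body), shell=True)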
I'm using a radio sender on my RPi to control some light-devices at home. I'm trying to implement a time control and had successfully used the program "at" in the past.
#!/usr/bin/python
import subprocess as sp
##### some code #####
sp.call(['at', varTime, '<<<', '\"sudo', './codesend', '111111\"'])
When I execute the program, I receive this error message:
syntax error. Last token seen: <
Garbled time
This code snippet works fine with every command by itself (as long as every parameter is of type string).
It's necessary to call "at" this way: at 18:25 <<< "sudo ./codesend 111111" to hold the command in the queue (viewable with "atq"),
because sudo ./codesend 111111 | at 18:25 just executes the command directly and records the execution in "/var/mail/user".
My question is: how can I avoid the syntax error?
I'm using a lot of other packages in this program, so I have to stay with Python.
I hope someone has a solution for this problem or can help to find my mistake.
Many thanks in advance
Preface: Shared Code
Consider the following context to be part of both branches of this answer.
import subprocess as sp
try:
from shlex import quote # Python 3
except ImportError:
from pipes import quote # Python 2
# given the command you want to schedule, as an array...
cmd = ['sudo', './codesend', '111111']
# ...generate a safely shell-escaped string.
cmd_str = ' '.join(quote(x) for x in cmd)
Solution A: Feed Stdin In Python
<<< is shell syntax. It has no meaning to at, and it's completely normal and expected for at to reject it if given as a literal argument.
You don't need to invoke a shell, though -- you can do the same thing directly from native Python:
p = sp.Popen(['at', vartime], stdin=sp.PIPE)
p.communicate(cmd_str.encode())  # stdin expects bytes on Python 3
Solution B: Explicitly Invoke A Shell
Moreover, <<< isn't /bin/sh syntax -- it's an extension honored in bash, ksh, and others; so you can't reliably get it just by adding the shell=True flag (which uses /bin/sh and so guarantees only POSIX-baseline features). If you want it, you need to explicitly invoke a shell with the feature, like so:
bash_script = '''
at "$1" <<<"$2"
'''
sp.call(['bash', '-c', bash_script,
         '_',       # this is $0 for that script
         vartime,   # this is its $1
         cmd_str,   # this is its $2
         ])
In either case, note that we're using shlex.quote() or pipes.quote() (as appropriate for our Python release) when generating a shell command from an argument list; this is critical to avoid creating shell injection vulnerabilities in our software.
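An end-to-end sketch of Solution A, assuming vartime holds a time string such as '18:25' (the sample values mirror the question):
import subprocess as sp
from shlex import quote   # use pipes.quote on Python 2

vartime = '18:25'
cmd = ['sudo', './codesend', '111111']
cmd_str = ' '.join(quote(x) for x in cmd)

# at reads the command to schedule from its stdin
p = sp.Popen(['at', vartime], stdin=sp.PIPE)
p.communicate(cmd_str.encode())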
I am writing a bash script with a small Python script embedded in it. I want to pass a variable from Python to bash. After a bit of searching I only found methods based on os.environ.
I just cannot make it work. Here is my simple test.
#!/bin/bash
export myvar='first'
python - <<EOF
import os
os.environ["myvar"] = "second"
EOF
echo $myvar
I expected it to output second; however, it still outputs first. What is wrong with my script? Also, is there any way to pass the variable without export?
Summary
Thanks for all the answers. Here is my summary.
A Python script embedded inside bash runs as a child process, which by definition cannot affect the parent bash environment.
The solution is to pass assignment strings out from Python and eval them subsequently in bash.
An example is
#!/bin/bash
a=0
b=0
assignment_string=$(python -<<EOF
var1=1
var2=2
print('a={};b={}'.format(var1,var2))
EOF
)
eval $assignment_string
echo $a
echo $b
Unless Python is used to do some kind of operation on the original data, there's no need to import anything. The answer could be as lame as:
myvar=$(python - <<< "print 'second'") ; echo "$myvar"
Suppose for some reason Python is needed to spit out a bunch of bash variables and assignments, or (cautiously) compose code on-the-fly. An eval method:
myvar=first
eval "$(python - <<< "print('myvar=second')" )"
echo "$myvar"
Complementing Cyrus's useful comment on the question: you just can't do it. Here is why.
Setting an environment variable sets it only for the current process and any child processes it launches. os.environ will set it only for the Python process that is executing your script; when that process finishes, it goes away, and so does the environment variable.
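A small demonstration of that (a hypothetical snippet, not from the question): the change is visible to children of the Python process, but the bash shell that launched Python never sees it.
import os
import subprocess

os.environ["myvar"] = "second"
# a child of the Python process does see the new value ...
subprocess.call('echo "child sees: $myvar"', shell=True)
# ... but the parent shell that started this script still has its old myvar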
You can pretty much do that with a shell script itself and just source it so the assignments take effect in the current shell.
There are a few "dirty" ways of getting something like this done. Here is an example:
#!/bin/bash
myvar=$(python - <<EOF
print "second"
EOF
)
echo "$myvar"
The output of the python process is stored in a bash variable. It gets a bit messy if you want to return more complex stuff, though.
You can make Python print a value and capture it in bash:
pfile.py
print(100)
bfile.sh
var=$(python pfile.py)
echo "$var"
output: 100
Well, this may not be what you want, but one option could be running the other bash commands from Python using subprocess:
import subprocess
x = 400
subprocess.call(["echo", str(x)])
But this is more of a temporary work-around. The other solutions are more along the lines of what you are looking for.
Hope I was able to help!
I would like to write a unit test for a (rather complex) Bash completion script, preferably with Python - just something that gets the values of a Bash completion programmatically.
The test should look like this:
def test_completion():
    # trigger_completion should return what a user should get on triggering
    # Bash completion like this: 'pbt createkvm<TAB>'
    assert trigger_completion('pbt createkvm') == "module1 module2 module3"
How can I simulate Bash completion programmatically to check the completion values inside a testsuite for my tool?
Say you have a bash-completion script in a file called asdf-completion, containing:
_asdf() {
    COMPREPLY=()
    local cur prev
    cur=$(_get_cword)
    COMPREPLY=( $( compgen -W "one two three four five six" -- "$cur") )
    return 0
}
complete -F _asdf asdf
This uses the shell function _asdf to provide completions for the fictional asdf command. If we set the right environment variables (from the bash man page), then we can get the same result, which is the placement of the potential expansions into the COMPREPLY variable. Here's an example of doing that in a unittest:
import subprocess
import unittest
class BashTestCase(unittest.TestCase):
    def test_complete(self):
        completion_file = "asdf-completion"
        partial_word = "f"
        cmd = ["asdf", "other", "arguments", partial_word]
        cmdline = ' '.join(cmd)
        out = subprocess.Popen(['bash', '-i', '-c',
            r'source {compfile}; COMP_LINE="{cmdline}" COMP_WORDS=({cmdline}) COMP_CWORD={cword} COMP_POINT={cmdlen} $(complete -p {cmd} | sed "s/.*-F \\([^ ]*\\) .*/\\1/") && echo ${{COMPREPLY[*]}}'.format(
                compfile=completion_file, cmdline=cmdline, cmdlen=len(cmdline), cmd=cmd[0], cword=cmd.index(partial_word))],
            stdout=subprocess.PIPE)
        stdout, stderr = out.communicate()
        self.assertEqual(stdout, "four five\n")

if (__name__ == '__main__'):
    unittest.main()
This should work for any completions that use -F, but may work for others as well.
je4d's comment to use expect is a good one for a more complete test.
bonsaiviking's solution almost worked for me. I had to change the bash script string: I added an extra ';' separator to the executed bash script, otherwise the execution wouldn't work on Mac OS X. Not really sure why.
I also generalized the initialization of the various COMP_ arguments a bit to handle the various cases I ended up with.
The final solution is a helper class to test bash completion from python so that the above test would be written as:
from completion import BashCompletionTest
import unittest  # needed for unittest.main() below

class AdsfTestCase(BashCompletionTest):
    def test_orig(self):
        self.run_complete("other arguments f", "four five")

    def run_complete(self, command, expected):
        completion_file = "adsf-completion"
        program = "asdf"
        super(AdsfTestCase, self).run_complete(completion_file, program, command, expected)

if (__name__ == '__main__'):
    unittest.main()
The completion lib is located under https://github.com/lacostej/unity3d-bash-completion/blob/master/lib/completion.py