I have a simple shell script where I want to be able to pass variables to some inline Python I will write. For example, like this:
funny=879
echo $funny
python -c "
print(f"hello {$funny}")
"
However this prints
879
File "<string>", line 2
print(fhello
^
SyntaxError: unexpected EOF while parsing
(pipeline) $
Any thoughts on what I could be doing wrong? I know the variable is set correctly, because echo prints its value, but for some reason the Python script is not able to use it.
It's because you're using outer double quotes.
python -c "print(f"hello {$funny}")"
Gets turned into:
python -c print(fhello {879})
So Python is passed two separate arguments.
The inner double quotes would need to be escaped in order to get passed through to python.
$ funny=879; python3 -c "print(f\"hello $funny\")"
hello 879
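If the nesting gets any deeper, escaping every inner quote becomes error-prone. One alternative (my sketch, not from the answer above) is to feed the script on stdin with a quoted heredoc and pass the shell value as a command-line argument, so no quote juggling is needed:

```shell
funny=879
python3 - "$funny" <<'EOF'
import sys
# the quoted 'EOF' delimiter stops the shell from touching the script body;
# the value arrives as an ordinary argument instead
print(f"hello {sys.argv[1]}")
EOF
```

This prints hello 879 without any escaped quotes.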
Instead of messing around with quoting - if you export your variables you can access them from python using the os.environ dict.
$ export funny=879; python -c 'import os; print(os.environ["funny"])'
879
You can use the var=value command syntax and omit the export (note the lack of a semicolon)
$ funny=879 fonny=978 python3 -c 'import os; print(os.environ["funny"], os.environ["fonny"])'
879 978
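To see why the export matters, here is a minimal check (my own sketch; the unset first guards against funny already being exported in your session):

```shell
unset funny
funny=879        # plain shell variable, not exported
python3 -c 'import os; print(os.environ.get("funny", "unset"))'   # -> unset
export funny     # now it is in the environment of child processes
python3 -c 'import os; print(os.environ.get("funny", "unset"))'   # -> 879
```

Using os.environ.get with a default also avoids a KeyError when the variable is missing.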
Related
I wrote a Python function called plot_ts_ex that takes two arguments, ts_file and ex_file (the function lives in the file pism_plot_routine). I want to run this function from a bash script in the terminal.
When I don't use variables in the bash script and pass the function arguments (in this case ts_file = ts_g10km_10ka_hy.nc and ex_file = ex_g10km_10ka_hy.nc) directly, like this:
#!/bin/sh
python -c 'import pism_plot_routine; pism_plot_routine.plot_ts_ex("ts_g10km_10ka_hy.nc", "ex_g10km_10ka_hy.nc")'
which is similar to Run function from the command line, then it works.
But when I define variables for the input arguments, it doesn't work:
#!/bin/sh
ts_name="ts_g10km_10ka_hy.nc"
ex_name="ex_g10km_10ka_hy.nc"
python -c 'import pism_plot_routine; pism_plot_routine.plot_ts_ex("$ts_name", "$ex_name")'
It gives the error:
FileNotFoundError: [Errno 2] No such file or directory: b'$ts_name'
Then I found a similar question, passing an argument to a python function from bash, about a Python function with only one argument, and I tried:
#!/bin/sh
python -c 'import sys, pism_plot_routine; pism_plot_routine.plot_ts_ex(sys.argv[1])' "$ts_name" "$ex_name"
but that doesn't work.
So how can I pass 2 arguments for a python function in a bash script using variables?
When you use single quotes the variables aren't going to be expanded; you should use double quotes instead:
#!/bin/sh
ts_name="ts_g10km_10ka_hy.nc"
ex_name="ex_g10km_10ka_hy.nc"
python -c "import pism_plot_routine; pism_plot_routine.plot_ts_ex('$ts_name', '$ex_name')"
You can also use sys.argv; the arguments are stored in a list, so ts_name is sys.argv[1] and ex_name is sys.argv[2]:
#!/bin/sh
ts_name="ts_g10km_10ka_hy.nc"
ex_name="ex_g10km_10ka_hy.nc"
python -c 'import sys, pism_plot_routine; pism_plot_routine.plot_ts_ex(sys.argv[1], sys.argv[2])' "$ts_name" "$ex_name"
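If the number of files varies, the same pattern generalizes: put every file name after the -c code and read sys.argv[1:]. A sketch with a stub print standing in for pism_plot_routine (which is not available here):

```shell
ts_name="ts_g10km_10ka_hy.nc"
ex_name="ex_g10km_10ka_hy.nc"
# sys.argv[0] is '-c'; every quoted shell variable after the code
# lands in sys.argv[1:] in order
python3 -c 'import sys; print(*sys.argv[1:], sep=", ")' "$ts_name" "$ex_name"
```

Quoting each variable keeps file names with spaces intact as single arguments.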
You are giving the literal value $ts_name to Python as a string; bash does not do anything with it. You need to close the ' so that it becomes a string in bash, and then open it again for it to become a string in Python.
The result will be something like this:
#!/bin/sh
ts_name="ts_g10km_10ka_hy.nc"
ex_name="ex_g10km_10ka_hy.nc"
python -c 'import pism_plot_routine; pism_plot_routine.plot_ts_ex("'$ts_name'", "'$ex_name'")'
For issues like this it is often nice to use a smaller piece of code to test things. I used python3 -c 'print("${test}")' to figure out what was being passed to Python, without the bother of pism_plot.
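One caveat worth adding (my note, not part of the answer above): the spliced $ts_name is unquoted, so a value containing spaces would be word-split by the shell and the snippet would break. Double-quoting the splice keeps it one argument:

```shell
ts_name="ts_g10km_10ka_hy.nc"
# '...' closes before the variable, "$ts_name" is a quoted shell expansion,
# then '...' reopens; the three pieces glue into one argument for python
python3 -c 'print("'"$ts_name"'")'
```

This prints ts_g10km_10ka_hy.nc, and would still be one argument even if the value contained spaces.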
Assume that python refers to C:\Program\python.exe by default and that I have a program which should be run with C:\Program\python_2.exe.
If I do
#!/bin/bash
python=C:\Program\python_2.exe
python -c "print('Hello world!')" >log.txt 2>&1
it still uses the standard python and not python_2
first, let's check what your script actually does:
the first line, assigns a value C:\Program\python_2.exe to a variable named python:
python=C:\Program\python_2.exe
however, the next line doesn't use this variable at all.
it will simply run a program python:
python -c "print('Hello world!')" >log.txt 2>&1
funnily the program has the same name as one of the many variables, but that doesn't really matter.
for the shell, variables are totally unrelated to program-names (which are literals, searched for in ${PATH}).
if you want to use a variable for the program, you must make this explicit:
${python} -c "print('Hello world!')" >log.txt 2>&1
(
this still might not work, as backslashes on un*x systems (and bash comes from that realm) are considered special, so the ${python} variable might not actually hold what you think it does:
$ echo ${python}
C:Programpython_2.exe
so you probably need to escape the backslashes:
python="C:\\Program\\python_2.exe"
)
if you don't want to use a variable for calling your program but the literal python, you could define a function:
#!/bin/sh
# this defines a *shell-function* named `python`, which can be used as if it were a program:
python() {
# call a program (python_2.exe) with all the arguments that were given to the function:
C:\\Program\\python_2.exe "$@"
}
# call the 'python' function with some args:
python -c "print('Hello world!')" >log.txt 2>&1
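On a Unix box the wrapper-function idea can be tried out with an ordinary interpreter standing in for C:\Program\python_2.exe (an assumption purely for illustration):

```shell
#!/bin/sh
# shell function shadowing the name `python`; forwards all arguments
python() {
    # `command` bypasses the function itself, avoiding infinite recursion;
    # in the question this line would be C:\\Program\\python_2.exe "$@"
    command python3 "$@"
}
python -c 'print("Hello world!")'
```

"$@" (not "$#", which is the argument count) forwards each argument as a separate, correctly quoted word.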
You can try using scl enable to switch to Python version 2.7:
cd /var/www/python/scripts/
scl enable python27 "python runAllUpserts.py >/dev/null 2>&1"
Then you can verify the version with:
python -V
I am running a python3 script which performs the following snippet on Debian 9:
os.environ["PA_DIR"] = "/home/myname/some_folder"
command_template = ("sudo java -Dconfig.file=$PA_DIR/google.conf "
"-jar ~/big/cromwell-42.jar run $PA_DIR/WholeGenomeGermlineSingleSample.wdl "
"-i {} -o $PA_DIR/au_options.json > FDP{}.log 2>&1")
command = command_template.format("test.json", "1")
os.system("screen -dm -S S{} bash -c '{}'".format("1", command))
The use of PA_DIR works as intended. When I tried it on the command line:
PA_DIR="/home/myname/some_folder"
screen -dm -S S1 bash -c 'sudo java -Dconfig.file=$PA_DIR/google.conf -jar ~/big/cromwell-42.jar run $PA_DIR/WholeGenomeGermlineSingleSample.wdl -i test.json -o $PA_DIR/au_options.json > FDP1.log 2>&1'
it doesn't do variable substitution due to the single quotes, and I had to replace them with double quotes (it complains that it cannot find the file /google.conf).
What is different when python runs it?
Thanks!
The Python os.system() invokes the underlying system function of the C library, which on POSIX systems is equivalent to running
sh -c "your_command and all its arguments"
so there is one extra sh in between, but the quoting rules are the same as on the command line: the single quotes still stop the outer shells from expanding $PA_DIR, and it is the innermost bash that substitutes it. The difference is that assigning to os.environ exports the variable, so that innermost bash can actually see it.
You can test it easily. In a shell do something like
$ foo="bar"
$ echo "foo is '$foo'" # Will print foo is 'bar'
$ echo 'foo is "$foo"' # Will print foo is "$foo"
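The whole interaction can be reproduced in a few lines; this sketch sets the variable through os.environ (which exports it to child processes) and then runs the single-quoted command shape from the question:

```shell
python3 - <<'EOF'
import os
os.environ["PA_DIR"] = "/tmp/demo"   # assignment to os.environ exports it
# the single quotes reach the child bash intact; that bash does the expansion
os.system("bash -c 'echo $PA_DIR'")
EOF
```

This prints /tmp/demo, even though no shell along the way expanded $PA_DIR before the innermost bash.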
Waiting for your answer to daltonfury42, I'd bet the problem is that, when running on the command line, you are not exporting the PA_DIR environment variable, so it is not present in the second bash interpreter. And it behaves differently because of what Mihir answered.
If you run
PA_DIR=foo
you only declare a bash variable but it is not an environment variable. Then
bash -c "echo $PA_DIR"
this will output foo because your current interpreter interpolates $PA_DIR and then spawns a second bash process with the command echo foo. But
bash -c 'echo $PA_DIR'
this prevents your bash interpreter from interpolating it, so it spawns a second bash process with the command echo $PA_DIR. But in this second process the variable PA_DIR does not exist.
If you start your journey running
export PA_DIR=foo
this will become an environment variable that will be accessible to children processes, thus
bash -c 'echo $PA_DIR'
will output foo because the nested bash interpreter has access to the variable even if the parent bash interpreter did not interpolate it.
The same is true for any kind of children process. Try running
PA_DIR=foo
python3 -c 'import os; print(os.environ.get("PA_DIR"))'
python3 -c "import os; print(os.environ.get('PA_DIR'))"
export PA_DIR=foo
python3 -c 'import os; print(os.environ.get("PA_DIR"))'
python3 -c "import os; print(os.environ.get('PA_DIR'))"
in your shell. No quotes are involved here!
When you use the os.environ dictionary in a Python script, Python will export the variables for you. That's why you will see the variable interpolated by either
os.system("bash -c 'echo $PA_DIR'")
or
os.system('bash -c "echo $PA_DIR"')
But beware that in each case it is either the parent or the child shell process that interpolates the variable.
You must understand your process tree here:
/bin/bash                # but it could be zsh, fish, sh, ...
`- /usr/bin/python3      # presumably
   `- /bin/sh            # because os.system uses that
      `- /bin/bash
If you want an environment variable to exist in the most nested process, you must export it somewhere higher in the tree, or in that process itself.
I aimed to open multiple files (one by one, using a for loop in a bash terminal) and modify them using PLINK (a programme) and, later on, a Python function. This is the code:
for i in {1..10}; do
plink --cow --noweb --lfile $i --extract extract1.snp --recode --out 1$i
python -c 'import file_convert;file_convert.convert_tree_mix("1$i.map","tmp$i")'
done
But, as expected, Python could not open the file: inside the single quotes the shell did not replace "$i" with 1, so Python saw the literal name "1$i.map" instead of "11.map". How can I modify the code so that the Python function, in combination with the for loop, opens a different file each time based on the value of "i"?
Have you tried calling Python like this:
python -c 'import sys; import file_convert;file_convert.convert_tree_mix(sys.argv[1],sys.argv[2])' "1$i.map" "tmp$i";
?
You need to include the whole Python code inside double quotes, so that the $i inside it is expanded by the shell; the inner double quotes then have to be escaped:
python -c "import file_convert;file_convert.convert_tree_mix(\"1$i.map\",\"tmp$i\")"
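Putting the loop and the argument passing together, with a stub print in place of file_convert (which only exists in the question's setup):

```shell
for i in 1 2 3; do
    # each quoted shell argument is expanded before Python starts,
    # so sys.argv[1] is "11.map", "12.map", ... on successive iterations
    python3 -c 'import sys; print("map:", sys.argv[1], "out:", sys.argv[2])' "1$i.map" "tmp$i"
done
```

The sys.argv route avoids any escaping, since the Python code itself never contains a shell variable.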
Is it possible to have Python evaluate a Python statement within a Bash shell script? I am thinking about something along the lines of perl's -e option.
The problem at hand is that I'd like to use Python's split function on a Bash string. I know it's doable in Bash alone, but I am curious.
The equivalent of perl -e is python -c:
$ python3 -c "import sys; print(sys.argv[1].split(','))" "foo,bar,baz"
['foo', 'bar', 'baz']
One way is to pass the statement to the python executable as the argument of -c, as below; this is the equivalent of perl -e:
python3 -c 'print("ABC DEF HIJ".split())'
Another possible method is to put your Python statements in a file, say xxx.py, and run it with:
python xxx.py
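The result of such a one-liner can also be captured back into the shell, which is usually the point of doing the split there in the first place (a sketch):

```shell
csv="foo,bar,baz"
# print one field per line so the shell can iterate over the result
fields=$(python3 -c 'import sys; print("\n".join(sys.argv[1].split(",")))' "$csv")
for f in $fields; do
    echo "field: $f"
done
```

Printing one field per line (rather than a Python list) makes the output easy to consume with a plain shell loop.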