Is it possible to have Python evaluate a Python statement within a Bash shell script? I am thinking about something along the lines of perl's -e option.
The problem at hand is that I'd like to use Python's split function on a Bash string. I know it's doable in Bash alone, but I am curious.
The equivalent of perl -e is python -c:
$ python -c "import sys;print sys.argv[1].split(',')" "foo,bar,baz"
['foo', 'bar', 'baz']
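If the goal is to get the pieces back into the shell rather than a Python list literal, a small variation (just a sketch, assuming Python 3 and fields without embedded newlines) is to print one field per line:
$ python3 -c 'import sys; print("\n".join(sys.argv[1].split(",")))' "foo,bar,baz"
foo
bar
baz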
One way is to pass the statement to the python executable as the argument of -c, as below; this is the equivalent of perl -e:
python -c 'print "ABC DEF HIJ".split()'
Another possible method is to put your Python statements in a Python file (xxx.py here) and run that file with the interpreter:
python xxx.py
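If the snippet is longer than fits comfortably on one line, you can also feed the program to Python on standard input with a here-document (a sketch; the quoted EOF stops the shell from expanding anything inside, so values are passed as arguments):
python - "foo,bar,baz" <<'EOF'
import sys
print(sys.argv[1].split(","))
EOF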
Related
I wrote a Python function called plot_ts_ex that takes two arguments, ts_file and ex_file (the module containing this function is pism_plot_routine). I want to run this function from a bash script in the terminal.
When I don't use variables in the bash script and pass the function arguments (in this case ts_file = ts_g10km_10ka_hy.nc and ex_file = ex_g10km_10ka_hy.nc) directly, like this:
#!/bin/sh
python -c 'import pism_plot_routine; pism_plot_routine.plot_ts_ex("ts_g10km_10ka_hy.nc", "ex_g10km_10ka_hy.nc")'
which is similar to Run function from the command line, and that works.
But when I define variables for the input arguments, it doesn't work:
#!/bin/sh
ts_name="ts_g10km_10ka_hy.nc"
ex_name="ex_g10km_10ka_hy.nc"
python -c 'import pism_plot_routine; pism_plot_routine.plot_ts_ex("$ts_name", "$ex_name")'
It gives the error:
FileNotFoundError: [Errno 2] No such file or directory: b'$ts_name'
Then I found a similar question, passing an argument to a python function from bash, about a Python function with only one argument, and I tried:
#!/bin/sh
python -c 'import sys, pism_plot_routine; pism_plot_routine.plot_ts_ex(sys.argv[1])' "$ts_name" "$ex_name"
but that doesn't work.
So how can I pass 2 arguments for a python function in a bash script using variables?
When you use single quotes the variables aren't expanded; use double quotes instead:
#!/bin/sh
ts_name="ts_g10km_10ka_hy.nc"
ex_name="ex_g10km_10ka_hy.nc"
python -c "import pism_plot_routine; pism_plot_routine.plot_ts_ex('$ts_name', '$ex_name')"
You can also use sys.argv; the arguments are stored in a list, so ts_name is sys.argv[1] and ex_name is sys.argv[2]:
#!/bin/sh
ts_name="ts_g10km_10ka_hy.nc"
ex_name="ex_g10km_10ka_hy.nc"
python -c 'import sys, pism_plot_routine; pism_plot_routine.plot_ts_ex(sys.argv[1], sys.argv[2])' "$ts_name" "$ex_name"
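If you want to check the argument passing without the real plotting code, a throwaway stand-in works (this stub is purely hypothetical; the actual pism_plot_routine module presumably does real plotting):
# pism_plot_routine.py -- hypothetical stub, only for checking argument passing
def plot_ts_ex(ts_file, ex_file):
    # Print the file names the function actually received.
    print("ts_file:", ts_file)
    print("ex_file:", ex_file)
Running either script above should then simply print the two .nc file names.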
You are giving the literal value $ts_name to Python as a string; bash does not touch anything inside single quotes. You need to close the ', so that it becomes a string in bash, and then open it again for it to become a string in Python.
The result will be something like this:
#!/bin/sh
ts_name="ts_g10km_10ka_hy.nc"
ex_name="ex_g10km_10ka_hy.nc"
python -c 'import pism_plot_routine; pism_plot_routine.plot_ts_ex("'$ts_name'", "'$ex_name'")'
For issues like this it is often nice to use a smaller piece of code to test with. I used python3 -c 'print("${test}")' to figure out what was being passed to Python, without the bother of the pism_plot.
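For example (a sketch of that kind of probe; test is just a throwaway variable):
test="hello"
python3 -c 'print("${test}")'    # single quotes: Python prints the literal ${test}
python3 -c "print('${test}')"    # double quotes: the shell expands it, Python prints hello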
I have a simple shell script where I want to be able to pass variables to some inline Python I will write, for example like this:
funny=879
echo $funny
python -c "
print(f"hello {$funny}")
"
However this prints
879
File "<string>", line 2
print(fhello
^
SyntaxError: unexpected EOF while parsing
(pipeline) $
Any thoughts on what I could be doing wrong? I know I am setting the variable correctly, because echo prints it out, but for some reason the Python script is not able to use it.
It's because you're using outer double quotes.
python -c "print(f"hello {$funny}")"
Gets turned into:
python -c print(fhello {879})
So python is passed 2 separate strings.
The inner double quotes would need to be escaped in order to get passed through to python.
$ funny=879; python3 -c "print(f\"hello $funny\")"
hello 879
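An alternative that avoids escaping the inner quotes altogether (a sketch along the same lines, assuming Python 3.6+ for the f-string) is to pass the value as an argument and read it from sys.argv:
$ funny=879; python3 -c 'import sys; print(f"hello {sys.argv[1]}")' "$funny"
hello 879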
Instead of messing around with quoting, you can export your variables and access them from Python through the os.environ dict.
$ export funny=879; python -c 'import os; print(os.environ["funny"])'
879
You can use the var=value command syntax and omit the export (note the lack of a semicolon)
$ funny=879 fonny=978 python3 -c 'import os; print(os.environ["funny"], os.environ["fonny"])'
879 978
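If the variable might not be in the environment at all, os.environ.get with a default avoids a KeyError (a small sketch; no_such_var is just a deliberately unset name):
$ python3 -c 'import os; print(os.environ.get("no_such_var", "not set"))'
not set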
Assume that python refers to C:\Program\python.exe by default and I have a program which should be run with C:\Program\python_2.exe.
If I do
#!/bin/bash/
python=C:\Program\python_2.exe
python -c "print('Hello world!')" >log.txt 2>&1
it still uses the standard python and not python_2
First, let's check what your script actually does.
The first line assigns the value C:\Program\python_2.exe to a variable named python:
python=C:\Program\python_2.exe
However, the next line doesn't use this variable at all;
it simply runs a program named python:
python -c "print('Hello world!')" >log.txt 2>&1
Funnily enough, the program has the same name as one of your variables, but that doesn't really matter:
for the shell, variables are totally unrelated to program names (which are literals, searched for in ${PATH}).
If you want to use a variable for the program, you must make this explicit:
${python} -c "print('Hello world!')" >log.txt 2>&1
(
This still might not work: backslashes on un*x systems (and bash comes from that realm) are considered special, so the ${python} variable might not actually hold what you think it does:
$ echo ${python}
C:Programpython_2.exe
So you probably need to escape the backslashes:
python="C:\\Program\\python_2.exe"
)
If you don't want to use a variable to call your program, but the literal name python, you can define a shell function instead:
#!/bin/sh
# this defines a *shell-function* named `python`, which can be used as if it were a program:
python() {
# call a program (python_2.exe) with all the arguments that were given to the function:
C:\\Program\\python_2.exe "$@"
}
# call the 'python' function with some args:
python -c "print('Hello world!')" >log.txt 2>&1
You can try to use scl enable to enable Python 2.7:
cd /var/www/python/scripts/
scl enable python27 "python runAllUpserts.py >/dev/null 2>&1"
Then you can check with:
python -V
It looks like when python is invoked from PowerShell, later arguments of the form "-foo.bar" are split at the dot and seen by Python as two separate arguments. Is this a bug? A feature of PowerShell? Something else?
powershell:
> python -c "import sys; print(sys.argv)" foo.bar -foo.bar
['-c', 'foo.bar', '-foo', '.bar']
cmd or linux:
> python -c "import sys; print(sys.argv)" foo.bar -foo.bar
['-c', 'foo.bar', '-foo.bar']
This also happens when doing e.g. python -mfoo.bar, which is normally equivalent to python -m foo.bar but, from PowerShell, becomes python -mfoo .bar.
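One workaround that is usually suggested (hedged: this is just the general way to keep PowerShell's own parsing from re-tokenising an argument, not something verified for every PowerShell version) is to quote the offending token so it reaches python untouched:
powershell:
> python -c "import sys; print(sys.argv)" foo.bar '-foo.bar'
['-c', 'foo.bar', '-foo.bar']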
I want to open multiple files (one by one, using a for loop in the bash terminal) and modify them using PLINK (a programme) and, later on, a Python function. The code follows:
for i in {1..10}; do
plink --cow --noweb --lfile $i --extract extract1.snp --recode --out 1$i
python -c 'import file_convert;file_convert.convert_tree_mix("1$i.map","tmp$i")'
done
But, as expected, Python could not open "11.map": the single quotes prevented the shell from replacing "$i" with 1. How can I modify the code so that the Python function, in combination with the for loop, opens a different file each time based on the value of "i"?
Have you tried calling python like this:
python -c 'import sys; import file_convert;file_convert.convert_tree_mix(sys.argv[1],sys.argv[2])' "1$i.map" "tmp$i";
?
You need to include the whole Python code inside double quotes, so that the $i inside the Python code is expanded by the shell; the inner double quotes then have to be escaped:
python -c "import file_convert;file_convert.convert_tree_mix(\"1$i.map\",\"tmp$i\")"