I know that I can run a python script from my bash script using the following:
python python_script.py
But what if I want to pass a variable or argument to my Python script from my bash script? How can I do that?
Basically bash will work out a filename and then python will upload it, but I need to send the filename from bash to python when I call it.
To execute a Python script from a bash script, you call the same command that you would within a terminal. For instance:
> python python_script.py var1 var2
To access these variables within Python, you will need:
import sys
print(sys.argv[0]) # prints python_script.py
print(sys.argv[1]) # prints var1
print(sys.argv[2]) # prints var2
Besides sys.argv, also take a look at the argparse module, which helps you define options and arguments for scripts.
The argparse module makes it easy to write user-friendly command-line interfaces.
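A minimal argparse sketch (the argument names here are made up for illustration):
import argparse

# define one positional argument and one optional flag
parser = argparse.ArgumentParser(description="Upload a file.")
parser.add_argument("filename", help="file to upload")
parser.add_argument("--verbose", action="store_true", help="print progress")
args = parser.parse_args()
print(args.filename)
Called as python python_script.py myfile.txt --verbose, this prints myfile.txt and sets args.verbose to True.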
Use
python python_script.py filename
and in your Python script
import sys
print(sys.argv[1])
Embedded option:
Wrap python code in a bash function.
#!/bin/bash
function current_datetime {
python - <<END
import datetime
print(datetime.datetime.now())
END
}
# Call it
current_datetime
# Call it and capture the output
DT=$(current_datetime)
echo "Current date and time: $DT"
Use environment variables to pass data into your embedded Python script.
#!/bin/bash
function line {
PYTHON_ARG="$1" python - <<END
import os
line_len = int(os.environ['PYTHON_ARG'])
print('-' * line_len)
END
}
# Do it one way
line 80
# Do it another way
echo $(line 80)
http://bhfsteve.blogspot.se/2014/07/embedding-python-in-bash-scripts.html
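The same pattern extends to several values by setting more than one environment variable; a sketch along the same lines (the variable names here are made up):
#!/bin/bash
function repeat_word {
WORD="$1" COUNT="$2" python - <<END
import os
# both values come from the environment set up by bash
word = os.environ['WORD']
count = int(os.environ['COUNT'])
print(' '.join([word] * count))
END
}
repeat_word hello 3   # prints: hello hello hello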
Use in the script:
echo $(python python_script.py arg1 arg2) > /dev/null
or
python python_script.py "string arg" > /dev/null
The script will be executed without output.
I have a bash script that calls a small Python routine to display a message window. As I need to use killall to stop the Python script, I can't use the above method, as it would then mean running killall python, which could take out other Python programs. So I use:
pythonprog.py "$argument" & # the & returns control straight to the bash script, so it must sit outside any backticks
As long as the Python script runs from the CLI by name (rather than via python pythonprog.py), this works within the script, as sketched below. If you need more than one argument, just use a space between each one within the quotes.
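A minimal sketch of that setup, assuming pythonprog.py starts with a #!/usr/bin/env python shebang line:
chmod +x pythonprog.py        # one-time: make the script runnable by name
./pythonprog.py "$argument" & # runs in the background, without invoking "python" directly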
Also take a look at the getopt module; it works quite well for me.
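A minimal getopt sketch, assuming the script takes a -f/--filename option (the option names are illustrative):
import getopt
import sys

# parse a short -f and a long --filename option, both taking a value
opts, args = getopt.getopt(sys.argv[1:], "f:", ["filename="])
for opt, val in opts:
    if opt in ("-f", "--filename"):
        print(val)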
Print all args without the script filename:
import sys

for arg in sys.argv[1:]:
    print(arg)
Related
I currently have the following piece of code in bash; now I want to do this in Python as well. However, the Python script being called is very long, and changing it to a function would be a very tedious task. How can I do this in Python without modifying the script being called?
gfs15_to_am10.py $LAT $LON $ALT $GFS_CYCLE $FORECAST_HOUR \
> layers.amc 2>layers.err
You can use the os module:
import os
os.system("bash commands")
Two options for passing parameters:
option A:
import os
LAT = ''
os.system(f"echo {LAT}")
option B:
use the argparse module to read parameters passed as script arguments.
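For the exact bash line in the question, including the redirections, a closer translation uses the subprocess module. A sketch, with placeholder values standing in for the real ones:
import subprocess

# placeholder values; in practice these come from your own logic
LAT, LON, ALT = "40.0", "-105.3", "1600"
GFS_CYCLE, FORECAST_HOUR = "00", "06"

# mirrors: gfs15_to_am10.py $LAT $LON $ALT $GFS_CYCLE $FORECAST_HOUR > layers.amc 2>layers.err
with open("layers.amc", "w") as out, open("layers.err", "w") as err:
    subprocess.call(
        ["./gfs15_to_am10.py", LAT, LON, ALT, GFS_CYCLE, FORECAST_HOUR],
        stdout=out, stderr=err)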
My perl script is at path:
a/perl/perlScript.pl
my python script is at path:
a/python/pythonScript.py
pythonScript.py gets an argument from stdin and returns its result to stdout. From perlScript.pl, I want to run pythonScript.py with the argument hi on stdin, and save the result in some variable. That's what I tried:
my $ret = `../python/pythonScript.py < hi`;
but I got the following error:
The system cannot find the path specified.
Can you explain the path can't be found?
The qx operator (backticks) starts a shell (sh), in which prog < input syntax expects a file named input from which it will read lines and feed them to the program prog. But you want the python script to receive on its STDIN the string hi instead, not lines of a file named hi.
One way is to do exactly that: my $ret = qx(echo "hi" | python_script).
But I'd suggest considering a module for this. Here is a simple example with IPC::Run3:
use warnings;
use strict;
use feature 'say';
use IPC::Run3;
my @cmd = ('program', 'arg1', 'arg2');
my $in = "hi";
run3 \@cmd, \$in, \my $out;
say "script's stdout: $out";
The program is the path to your script if it is executable, or perhaps python script.py. It will be run by system, so the output is obtained once that completes, which is consistent with the attempt in the question. See the documentation for the module's operation.
This module is intended to be simple while it "satisfies 99% of the need for using system, qx, and open3 [...]". For far more power and control, see IPC::Run.
You're getting this error because you're using shell redirection instead of just passing an argument
../python/pythonScript.py < hi
tells your shell to read input from a file called hi in the current directory, rather than using it as an argument. What you mean to do is
my $ret = `../python/pythonScript.py hi`;
Which correctly executes your python script with the hi argument, and returns the result to the variable $ret.
Some of the other answers assume that hi must be passed as a command-line parameter to the Python script, but the asker says it comes from stdin.
Thus:
my $ret = `echo "hi" | ../python/pythonScript.py`;
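For reference, the Python side only needs to read its standard input; a minimal sketch of what pythonScript.py might look like (the real script is not shown in the question):
import sys

# read whatever was piped in on stdin and write a result to stdout
data = sys.stdin.read().strip()
print("got: " + data)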
To launch your external script you can do
system "python ../python/pythonScript.py hi";
and then in your python script
import sys

def yourFct(a):
    # ... your logic goes here
    print(a)

if __name__ == "__main__":
    yourFct(sys.argv[1])
You can find more information on the Python part here.
I am writing a bash script in which a small Python script is embedded. I want to pass a variable from Python to bash. After some searching, I only found methods based on os.environ.
I just cannot make it work. Here is my simple test.
#!/bin/bash
export myvar='first'
python - <<EOF
import os
os.environ["myvar"] = "second"
EOF
echo $myvar
I expected it to output second; however, it still outputs first. What is wrong with my script? Also, is there any way to pass the variable without export?
summary
Thanks for all answers. Here is my summary.
A Python script embedded inside bash will run as a child process, which by definition is not able to affect the parent bash environment.
The solution is to pass assignment strings out from Python and eval them subsequently in bash.
An example is
#!/bin/bash
a=0
b=0
assignment_string=$(python -<<EOF
var1=1
var2=2
print('a={};b={}'.format(var1,var2))
EOF
)
eval $assignment_string
echo $a
echo $b
Unless Python is used to do some kind of operation on the original data, there's no need to import anything. The answer could be as lame as:
myvar=$(python - <<< "print('second')") ; echo "$myvar"
Suppose for some reason Python is needed to spit out a bunch of bash variables and assignments, or (cautiously) compose code on-the-fly. An eval method:
myvar=first
eval "$(python - <<< "print('myvar=second')" )"
echo "$myvar"
Complementing Cyrus's useful comment on the question: you just can't do it. Here is why.
Setting an environment variable sets it only for the current process and any child processes it launches. os.environ sets it only for the Python process that runs your embedded code. When that process finishes, it goes away, and so does the environment variable.
What you can do instead is write the assignments to a shell script and source it, so they take effect in the current shell.
There are a few "dirty" ways of getting something like this done. Here is an example:
#!/bin/bash
myvar=$(python - <<EOF
print "second"
EOF
)
echo "$myvar"
The output of the python process is stored in a bash variable. It gets a bit messy if you want to return more complex stuff, though.
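For more complex stuff, one option is to have Python print JSON and pull fields back out on the bash side; a sketch using only the standard library (the field names are made up):
#!/bin/bash
json=$(python - <<EOF
import json
# bundle several values into one machine-readable line
print(json.dumps({"name": "second", "count": 3}))
EOF
)
# extract a single field with another small python call
name=$(echo "$json" | python -c 'import json, sys; print(json.load(sys.stdin)["name"])')
echo "$name"   # prints: second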
You can make Python return a value and pass it to bash:
pfile.py
print(100)
bfile.sh
var=$(python pfile.py)
echo "$var"
output: 100
Well, this may not be what you want, but one option could be running the other bash commands in Python using subprocess:
import subprocess
x = 400
subprocess.call(["echo", str(x)])
But this is more of a temporary workaround. The other solutions are closer to what you are looking for.
Hope I was able to help!
I have seen plenty of examples of running a Python script from inside a bash script, either passing in variables as arguments or using export to give the child shell access. I am trying to do the opposite here, though.
I am running a Python script and have a separate file, let's call it myGlobalVariables.bash
myGlobalVariables.bash:
foo_1="var1"
foo_2="var2"
foo_3="var3"
My python script needs to use these variables.
For a very simple example:
myPythonScript.py:
print "foo_1: {}".format(foo_1)
Is there a way I can import them directly? Also, I do not want to alter the bash script if possible since it is a common file referenced many times elsewhere.
If your .bash file is formatted as you indicated, you might be able to import it directly as a Python module via the imp module.
import imp
bash_module = imp.load_source("bash_module", "/path/to/myGlobalVariables.bash")
print(bash_module.foo_1)
You can also use os.environ:
Bash:
#!/bin/bash
# export makes the variable visible to the python child process
# (see the inline-assignment sketch below for a no-export variant)
export testtest=one
Python:
#!/usr/bin/python
import os
print(os.environ['testtest'])  # prints: one
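If you only need the variable for a single command, you can also set it inline without export; a small sketch:
# the assignment applies only to this one python invocation
testtest=one python -c 'import os; print(os.environ["testtest"])'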
I am very new to Python, so I would welcome suggestions for more idiomatic ways to do this, but the following code uses bash itself to tell us which values get set.
It first calls bash with an empty environment (env -i bash) to establish which variables are set as a baseline, then calls it again, telling bash to source your "variables" file, and reports which variables are now set. After removing some false positives and an apparently blank line, I loop through the "additional" output, looking for variables that were not in the baseline. Newly seen variables get split (carefully) and put into the bash dictionary.
I've left here (but commented out) my previous idea of using exec to set the variables natively in Python, but I ran into quoting/escaping issues, so I switched gears to using a dict.
If the exact call (path, etc.) to your "variables" file differs from mine, you'll need to change all instances of that value, both in the subprocess.check_output() call and in the list.remove() calls.
Here's the sample variable file I was using, just to demonstrate some of the things that could happen:
foo_1="var1"
foo_2="var2"
foo_3="var3"
if [[ -z $foo_3 ]]; then
foo_4="test"
else
foo_4="testing"
fi
foo_5="O'Neil"
foo_6='I love" quotes'
foo_7="embedded
newline"
... and here's the python script:
#!/usr/bin/env python
import subprocess

output = subprocess.check_output(['env', '-i', 'bash', '-c', 'set']).decode()
baseline = output.split("\n")

output = subprocess.check_output(['env', '-i', 'bash', '-c', '. myGlobalVariables.bash; set']).decode()
additional = output.split("\n")

# these get set when ". myGlobal..." runs and so are false positives
additional.remove("BASH_EXECUTION_STRING='. myGlobalVariables.bash; set'")
additional.remove('PIPESTATUS=([0]="0")')
additional.remove('_=myGlobalVariables.bash')

# I get an empty item at the end (blank line from subprocess?)
additional.remove('')

bash = {}
for assign in additional:
    if assign not in baseline:
        name, value = assign.split("=", 1)
        bash[name] = value
        #exec(name + '="' + value + '"')

print("New values:")
for key in bash:
    print("Key: ", key, " = ", bash[key])
Another way to do it:
Inspired by Marat's answer, I came up with this two-stage hack. Start with a python program, let's call it "stage 1", which uses subprocess to call bash to source the variable file, as my above answer does, but it then tells bash to export all of the variables, and then exec the rest of your python program, which is in "stage 2".
Stage 1 python program:
#!/usr/bin/env python
import subprocess
status = subprocess.call(
    ['bash', '-c',
     '. myGlobalVariables.bash; export $(compgen -v); exec ./stage2.py'])
Stage 2 python program:
#!/usr/bin/env python
# anything you want! for example,
import os
for key in os.environ:
    print(key, " = ", os.environ[key])
As stated in @theorifice's answer, the trick here is that such a file may be interpreted both as bash and as Python code. But that answer is outdated: the imp module is deprecated in favour of importlib.
As your file has extension other than ".py", you can use the following approach:
from importlib.util import spec_from_loader, module_from_spec
from importlib.machinery import SourceFileLoader
spec = spec_from_loader("foobar", SourceFileLoader("foobar", "myGlobalVariables.bash"))
foobar = module_from_spec(spec)
spec.loader.exec_module(foobar)
I do not completely understand how this code works (what the "foobar" parameters are for); however, it worked for me. I found it here.
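For what it's worth, once the module is loaded this way, the assignments in the .bash file become plain attributes; continuing from the snippet above:
# continuing from the spec/foobar snippet above
print(foobar.foo_1)  # prints: var1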
I'm using similar approach to call python function from my shell script:
python -c 'import foo; print(foo.hello())'
But I don't know how, in this case, I can pass arguments to the Python script. Also, is it possible to call a function with parameters from the command line?
python -c 'import foo, sys; print(foo.hello()); print(sys.argv[1])' "This is a test"
or
echo "Wham" | python -c 'print(raw_input(""));'
There's also argparse, which can be used to capture arguments; note that with python -c, sys.argv[0] is set to '-c'. A second module, getopt, also exists, but it is discouraged.
You don't want to do that in a shell script.
Try this. Create a file named "hello.py" and put the following code in the file (assuming you are on unix system):
#!/usr/bin/env python
print "Hello World"
and in your shell script, write something like this:
#!/bin/sh
python hello.py
and you should see Hello World in the terminal.
That's how you should invoke a script in shell/bash.
To the main question: how do you pass arguments?
Take this simple example:
#!/usr/bin/env python
import sys
def hello(name):
    print("Hello, " + name)

if __name__ == "__main__":
    if len(sys.argv) > 1:
        hello(sys.argv[1])
    else:
        raise SystemExit("usage: python hello.py <name>")
We expect the length of sys.argv to be at least two. As in shell programming, the first element (index 0) is always the file name.
Now modify the shell script to include the second argument (the name) and see what happens.
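For example, with a name of your choosing:
#!/bin/sh
python hello.py World
# prints: Hello, World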
I haven't tested this code, but conceptually that's how you should go about it.
edit:
If you just have a line or two of simple Python code, sure, -c works fine and is neat. But if you need more complex logic, put the code into a module (a .py file).
You need to create a .py file.
Then call it this way:
python file.py argv1 argv2
Inside your file, you then have the sys.argv list, which gives you the list of arguments.
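A minimal file.py to illustrate (hypothetical contents):
import sys

# sys.argv[0] is the script name; everything after it is your arguments
print(sys.argv)     # e.g. ['file.py', 'argv1', 'argv2']
print(sys.argv[1])  # prints: argv1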