I'm trying to figure out whether it is possible to pass values stored in Bash variables as argument values to a Python script.
We have a Python script that we use to create DNS records in BIND, and I've been tasked with cleaning up our outdated DNS database. So far I have something like this in Bash:
HOSTNAMES=$(</Users/test/test.txt)
ZONE="zone name here"
IP=$(</Users/test/iptest.txt)
for host in $HOSTNAMES
do
    python pytest.py --host "$host" --zone "$ZONE" --ip "$IP"
done
Unfortunately, I don't have a test environment where I can try this before running it on prod, and I don't have any experience with Python or Bash scripting. I've mainly done PowerShell scripting, but I was wondering whether something like what I have above would work.
I've looked around on the forums here but haven't found anything I could make sense of. This post seems to answer my question somewhat, but I'm still not completely sure how it works: How to pass a Bash variable to Python?
Yes, that seems to work just fine to me.
To test, I threw together a very quick python script called test.py:
#!/usr/bin/python
import sys
print('number of arguments:', len(sys.argv))
# we know sys.argv[0] is the script name; let's look at sys.argv[1]
print(sys.argv[1])
Then, in your terminal, set a variable:
$ testvar="TESTING"
And try your script, passing the variable into the script:
$ python test.py $testvar
number of arguments: 2
TESTING
Admittedly I'm not very well-versed in Python, but this did seem to accomplish what you wanted. That said, this assumes the Python script you referenced is already set up to parse the names of the parameters being passed to it; you'll notice my test script does NOT do that, it simply prints whatever you pass into it.
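For illustration, if the pytest.py script from the question uses argparse (an assumption on my part; I haven't seen that script), its argument handling would look something like this sketch:
import argparse

# hypothetical sketch of how pytest.py might parse its named arguments
parser = argparse.ArgumentParser()
parser.add_argument('--host')
parser.add_argument('--zone')
parser.add_argument('--ip')
args = parser.parse_args()
print(args.host, args.zone, args.ip)
With something like that in place, the Bash loop in the question passes each value straight into args.host, args.zone, and args.ip.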
As long as the variables are exported, they're accessible from Python.
$ export VAR=val
$ python -c "import os; print(os.environ['VAR'])"
val
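Conversely, a variable that is not exported stays invisible to the child process; continuing the same session:
$ VAR2=val2
$ python -c "import os; print(os.environ.get('VAR2'))"
None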
Here is a minimal working example:
I have a python script test.py that contains:
print("Hello")
and I have a bash script test.sh that calls that Python script:
#!/usr/bin/bash
python test.py
and when I run test.sh from the terminal there is no output.
Based on a few similar questions, I have tried adding sys.stdout.flush() and calling python -u instead, but there is still no output.
How do I get the output of print to show up?
Edit
In more complicated examples, how do I ensure that Python print statements appear when called within a bash script, and that those statements can be appropriately redirected with, e.g., the &> operator?
(Also, I tried searching for a while before asking, but couldn't find a question that addressed this exactly. Any links to more thorough explanations would be greatly appreciated!)
My Python output went missing when I assigned it to a bash variable. I can't replicate your exact issue either, but I think this could help:
#!/usr/bin/bash
script_return=$(python test.py)
echo "$script_return"
I have a python script that I like to run with python -i script.py, which runs the script and then enters interactive mode so that I can play around with the results.
Is it possible to have the script itself invoke this option, such that I can just run python script.py and the script will enter interactive mode after running?
Of course, I can simply add the -i, or if that is too much effort, I can write a shell script to invoke this.
From within script.py, set the PYTHONINSPECT environment variable to any nonempty string. Python will recheck this environment variable at the end of the program and enter interactive mode.
import os
# This can be placed at top or bottom of the script, unlike code.interact
os.environ['PYTHONINSPECT'] = 'TRUE'
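Because the variable is only checked when the program ends, you can also set it conditionally. A small sketch that drops into the REPL only when something goes wrong:
import os

try:
    result = 1 / 0  # stand-in for the real work that might fail
except ZeroDivisionError:
    os.environ['PYTHONINSPECT'] = '1'  # enter interactive mode on failure
    raise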
In addition to all the above answers, you can run the script as simply ./script.py by making the file executable and setting the shebang line, e.g.
#!/usr/bin/python -i
this = "A really boring program"
If you want to use this with the env command in order to get the system default python, then you can try a shebang like @donkopotamus suggested in the comments
#!/usr/bin/env PYTHONINSPECT=1 python
The success of this may depend on the version of env installed on your platform however.
You could use an instance of code.InteractiveConsole to get this to work:
from code import InteractiveConsole
i = 20
d = 30
InteractiveConsole(locals=locals()).interact()
Running this with python script.py will launch an interactive interpreter as the final statement and make the locally defined names visible via locals=locals().
>>> i
20
Similarly, a convenience function named code.interact can be used:
from code import interact
i = 20
d = 30
interact(local=locals())
This creates the instance for you, the only caveat being that the parameter is named local instead of locals.
In addition to this, as @Blender stated in the comments, you could also embed the IPython REPL by using:
import IPython
IPython.embed()
which has the added benefit of picking up the namespace populated by your script without you having to pass it in with locals.
I think you're looking for this?
import code
foo = 'bar'
print(foo)
code.interact(local=locals())
I would simply accompany the script with a shell script that invokes it.
exec python -i "$(dirname "$0")/script.py"
I am implementing a bash script that will call a Python script's function/method. I want to collect the return value of this function into a local variable in the calling bash script.
try1.sh contains:
#!/bin/sh
RETURN_VALUE=`python -c 'import try3; try3.printTry()'`
echo $RETURN_VALUE
Now the python script:
#!/usr/bin/python
def printTry():
    print('Hello World')
    return 'true'
On executing the bash script:
$ ./try1.sh
Hello World
there is no 'true', or anything else in its place, echoed to stdout as desired.
Another thing I would want to be able to do: my actual Python code will have around 20-30 functions returning various state values of my software state machine, and I would call these functions from a bash script. In the bash script, I have to store these return values in local variables, which are then used further down the state machine logic implemented in the calling bash script.
For each value, I would have to do python -c 'import python_module; python_module.method_name', which would re-enumerate the defined states of the state machine again and again, which I do not want. I want to avoid running the entire Python script just to call a single function. Is that possible?
What possible solutions/suggestions/ideas can be thought of here?
I would appreciate the replies.
To clarify my intent: the task is to have a part of the bash script replaced by the Python script to improve readability. The bash script is really very large (~15000 lines), and hence cannot be replaced by a single Python script entirely. So parts that can be identified as candidates for improvement can be replaced by Python.
Also, I had thought of replacing the entire bash script with a Python script, as suggested by Victor in the comment below, but that wouldn't be feasible in my situation. Hence, I would have to have the state machine divided between bash and Python, where Python would have some required methods returning state values needed by the bash script.
Regards,
Yusuf Husainy.
If you don't care about what the python function prints to stdout, you could do this:
$ py_ret_val=$(python -c '
from __future__ import print_function
import sys, try3
print(try3.printTry(), file=sys.stderr)
' 2>&1 1>/dev/null)
$ echo $py_ret_val
true
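To address the other concern, avoiding one python invocation per function, one possible pattern is to have a single Python run print one shell assignment per state value. This is only a sketch: get_state_a and get_state_b are hypothetical stand-ins for your real functions, and it assumes they return their values without printing anything else.
# dump_states.py (hypothetical helper)
import try3

print("STATE_A=%s" % try3.get_state_a())  # hypothetical function
print("STATE_B=%s" % try3.get_state_b())  # hypothetical function
The bash script can then collect all the values with a single interpreter start-up via eval "$(python dump_states.py)".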
I am trying to set an environment variable using Python. This variable is then used in another script.
My code is:
#!/usr/bin/env python
import os
os.environ['VAR'] = '/current_working_directory'
After executing the above Python script, I execute a second script which uses the same variable VAR, but it is not working.
But when I do export VAR='/current_working_directory' and run the second script, it works fine. I tried putenv() also.
This depends on how the second Python script gets called.
If you have a shell, and the shell first runs the first Python script and then the second, it won't work. The reason is that the first Python script inherits its environment from the shell, but modifying os.environ[] or calling putenv() only modifies that inherited copy: the first Python script's own environment, not the shell's.
When the shell then runs the second Python script, that script again inherits the environment from the shell, and because the shell's environment was never modified, the second script cannot see the modification the first Python script made.
One way to achieve your goal is using a helper file:
#!/bin/bash
rm -f envfile
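# first_pythonscript is expected to write shell assignments like VAR=value into ./envfile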
./first_pythonscript
test -f envfile && . envfile
rm -f envfile
./second_pythonscript
That code is crude; it won't work if two instances of the shell script run at once, so don't use it as-is. But I hope you get the idea.
Yet another way is to make your second_pythonscript not a program but a Python module that first_pythonscript can import. You can also make it a hybrid: a library when imported, a program when run, via the if __name__ == "__main__": construct, as sketched below.
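A minimal sketch of that hybrid layout:
# second_pythonscript: importable as a library, runnable as a program
def main():
    print('running as a program')

if __name__ == "__main__":
    main()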
And finally you can use one of the os spawn functions, e.g. os.spawnvpe, which lets you pass an explicit environment to the child.
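A minimal sketch of that approach, reusing the script name from this thread:
import os

# run the second script with a modified copy of the current environment
env = dict(os.environ, VAR='/current_working_directory')
os.spawnvpe(os.P_WAIT, 'python', ['python', 'second_pythonscript.py'], env)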
This code should provide the required environment to your 2nd script:
#!/usr/bin/env python
import os
os.environ['VAR'] = '/current_working_directory'
execfile("/path/to/your/second/script.py")
A child process cannot change the environment of its parent process.
The general shell programming solution to this is to make your first script print out the value you need, and assign it in the calling process. Then you can do
#!/bin/sh
# Assign VAR to the output from first.py
VAR="$(first.py)"
export VAR
exec second.py
or even more succinctly
#!/bin/sh
VAR="$(first.py)" second.py
Obviously, if the output from first.py is trivial to obtain without invoking Python, that's often a better approach; but when, e.g., two scripts call different functions from a library and/or communicate with a common back end, this is a common pattern.
Using Python for the communication between two pieces of Python code is often more elegant and Pythonic, though.
#!/usr/bin/env python
from yourlib import first, second
value = first()
# maybe putenv VAR here if that's really, really necessary
second(value)
I just want to integrate these two commands from cmd into a Python script, but I'm a novice in both programming and Python. How can I achieve this?
cd C:\
python dumpimages.py http://google.com C:\images\
Use the subprocess module.
Note that os.system() will also work, but subprocess is recommended in its place.
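A minimal sketch using the command from the question; the cwd argument takes the place of the cd step:
import subprocess

subprocess.run(
    ["python", "dumpimages.py", "http://google.com", "C:\\images\\"],
    cwd="C:\\",
    check=True,  # raise if dumpimages.py exits with an error
)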
Further to my comment above, if all you want to do is retrieve images from a webpage, use GNU wget with the -A flag:
wget -r -P /save/location -A jpeg,jpg,bmp,gif,png http://www.domain.com
I suspect the cd C:\ is not necessary, and once that's gone, all you seem to be asking for is a way to launch a Python script.
Perhaps all you want is to modify dumpimages.py so that it has the hard-coded arguments you want? I don't know what that script is; maybe you can tell us if it's important, or whether you aren't allowed to modify it.
I think you are looking for sys.argv
Try:
import sys
print(sys.argv)
at the top of your dumpimages.py.
It should print something like:
['dumpimages.py', 'http://google.com', 'C:\\images\\']
You see, sys.argv is a list of strings. The first item is the name of the script itself, the others are the parameters.
Since lists are 0-based, you can access the i-th parameter to the script with sys.argv[i]
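So dumpimages.py could read its two parameters along these lines (a sketch; I don't know the real script's internals):
import sys

url = sys.argv[1]          # e.g. http://google.com
destination = sys.argv[2]  # e.g. C:\images\
print(url, destination)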