Python print output not appearing when called from bash - python

Here is a minimal working example:
I have a python script test.py that contains:
print("Hello")
and I have a bash script test.sh that calls that python script:
#!/usr/bin/bash
python test.py
and when I run test.sh from terminal there is no output.
Based on a few similar questions, I have tried adding sys.stdout.flush() and running python -u instead, but there is still no output.
How do I get the output of print to show up?
Edit
In more complicated examples, how do I ensure that python print statements appear when called within a bash script? And how do I ensure that those statements can be appropriately redirected with, e.g., the &> operator?
(Also, I tried searching for a while before asking, but couldn't find a question that addressed this exactly. Any links to more thorough explanations would be greatly appreciated!)

My python output was missing when I assigned it to a bash variable. I can't replicate your exact issue either, but I think this could help:
#!/usr/bin/bash
script_return=$(python test.py)
echo "$script_return"
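To rule out buffering on the python side, forcing a flush is the simplest check. A minimal sketch (the script is recreated under /tmp so the paths are self-contained, and the &> redirection from the question's edit is included):

```shell
# recreate the example script; flush=True forces the line out even if stdout is block-buffered
cat > /tmp/flush_test.py <<'EOF'
print("Hello", flush=True)
EOF
# run it and redirect both stdout and stderr, as in the question's edit
python3 /tmp/flush_test.py &> /tmp/flush_out.txt
cat /tmp/flush_out.txt
```

If "Hello" lands in the file here, the print itself is fine and the problem is in how the surrounding bash script captures or discards the output.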

Related

Edit shell script and python script while it's running

Recently, I have found the article about editing shell script while it's running.
Edit shell script while it's running
I prepared the code below to reproduce the phenomenon, with an additional python script call. I found that echo in bash was NOT affected by the edit, while the python script was affected.
Could someone explain this phenomenon? I had expected that all of the output would be "modified".
test.sh
#!/bin/bash
sleep 30
echo "not modified"
python my_python.py
echo "not modified"
my_python.py
print("not modified")
Output result
$ bash test.sh   # while it was sleeping, I edited test.sh and my_python.py to say "modified"
not modified
modified
not modified
The bash script is already loaded in memory and executing, so its output will not change until the next run. The python script has not been loaded yet; it is only read when bash reaches that line, which is after you modify it.
If you do the reverse and launch the bash script from an equivalent python script, you will get the same behavior in reverse.
EDIT 05/10/2020
As pointed out by Gordon Davisson:
"Different versions of bash do different things. Some read the file byte-by-byte as they execute it, some I think load it in 8KB blocks (not necessarily the whole file), some do even more complicated things (AIUI it can also depend on the OS they're running under). See my answer Overwrite executing bash script files. Net result: do not count on any particular behavior."
That said, the behavior on the OP's system seems to indicate a complete load of the script, which explains the observed output, although nothing guarantees it.
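A quick way to poke at this on your own system is a script that appends a command to itself while running. As the quote above warns, whether the appended line executes depends on the bash version, so only the first line of output is predictable:

```shell
# write a script that appends a command to itself while running
cat > /tmp/selfmod.sh <<'EOF'
echo first
echo 'echo appended' >> /tmp/selfmod.sh
EOF
bash /tmp/selfmod.sh
# whether "appended" also prints depends on how much of the file bash buffered
```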

How to pass Bash variables as Python arguments

I'm trying to figure out if it is possible to pass values stored in a variable in Bash in as argument values in a python script.
We have a python script that we use to create DNS records in BIND, and I've been tasked with cleaning up our outdated DNS database. So far I have something like this in bash:
HOSTNAMES=$(</Users/test/test.txt)
ZONE="zone name here"
IP=$(</Users/test/iptest.txt)
for host in $HOSTNAMES
do
python pytest.py --host "$host" --zone "$ZONE" --ip "$IP"
done
Unfortunately, I don't have a test environment where I can try this before running it in prod, and I don't have any experience with Python or Bash scripting. I've mainly only done Powershell scripting, but was wondering if something like what I have above would work.
I've looked around on the forums here but have not found anything that I could make sense of. This post seems to answer my question somewhat, but I'm still not completely sure how this works. How to pass a Bash variable to Python?
Yes, seems to work just fine to me.
To test, I threw together a very quick python script called test.py:
#!/usr/bin/env python3
import sys
print('number of arguments:', len(sys.argv))
# we know sys.argv[0] is the script name; let's look at sys.argv[1]
print(sys.argv[1])
Then, in your terminal, set a variable:
>testvar="TESTING"
And try your script, passing the variable into the script:
>python test.py $testvar
>number of arguments: 2
>TESTING
Admittedly I'm not very well-versed in Python, but this did seem to accomplish what you wanted. That said, this assumes the Python script you referenced is already set up to parse the names of the parameters being passed to it - you'll notice my test script does NOT do that, it simply spits out whatever you pass into it.
As long as the variables are exported, they're accessible from Python.
$ export VAR=val
$ python3 -c "import os; print(os.environ['VAR'])"
val
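To confirm the named-flag case from the question, here is a throwaway stand-in for pytest.py (args.py is hypothetical, parsing --host/--zone/--ip with argparse), driven from a bash loop. Note that inside the loop it is the loop variable $host that should be passed, not the whole $HOSTNAMES list:

```shell
# stand-in for the real pytest.py: parse the three named flags and echo them back
cat > /tmp/args.py <<'EOF'
import argparse
p = argparse.ArgumentParser()
p.add_argument('--host')
p.add_argument('--zone')
p.add_argument('--ip')
a = p.parse_args()
print(a.host, a.zone, a.ip)
EOF
ZONE="example.zone"
IP="10.0.0.1"
for host in web1 web2; do
    python3 /tmp/args.py --host "$host" --zone "$ZONE" --ip "$IP"
done
```

Quoting each expansion ("$host", "$ZONE", "$IP") keeps values with spaces intact as single arguments.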

Collecting return values from python functions in bash

I am implementing a bash script that will call a python script's function/method. I want to collect the return value of this function into a local variable in the calling bash script.
try1.sh contains:
#!/bin/sh
RETURN_VALUE=`python -c 'import try3; try3.printTry()'`
echo $RETURN_VALUE
Now the python script:
#!/usr/bin/python
def printTry():
    print('Hello World')
    return 'true'
On executing the bash script:
$ ./try1.sh
Hello World
the desired 'true' (or any other value in its place) is not echoed to stdout.
Another thing I want to be able to do: my actual python code will have around 20-30 functions returning various state values of my software state machine, and I would call these functions from a bash script. In the bash script, I have to store these return values in local variables which are used further down the state machine logic implemented in the calling bash script.
For each value, I would have to do python -c 'import python_module; python_module.method_name()', which would re-enumerate the defined states of the state machine again and again, which I do not want. I want to avoid running the entire python script just to call a single function. Is that possible?
What possible solutions/suggestions/ideas can be thought of here?
I would appreciate the replies.
To clarify my intent, the task is to have part of the bash script replaced by the python script to improve readability. The bash script is very large (~15000 lines) and hence cannot be replaced by a single python script entirely, so the parts that can be identified as candidates for improvement can be replaced by python.
Also, I had thought of replacing the entire bash script with a python script as suggested by Victor in the comment below, but that wouldn't be feasible in my situation. Hence, I would have to have the state machine divided between bash and python, where python would have some required methods returning state values needed by the bash script.
Regards,
Yusuf Husainy.
If you don't care about what the python function prints to stdout, you could do this:
$ py_ret_val=$(python -c '
from __future__ import print_function
import sys, try3
print(try3.printTry(), file=sys.stderr)
' 2>&1 1>/dev/null)
$ echo $py_ret_val
true
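For the 20-30 state functions, one alternative (not from the answer above, and the state dict here is a stand-in for the real module) is to run the interpreter once, print one key=value line per state, and read them all into bash variables in a single pass:

```shell
# run python once; each state function's result becomes a key=value line
vars_out=$(python3 -c '
states = {"power": "on", "mode": "idle"}  # stand-ins for the real state functions
for k, v in states.items():
    print(f"{k}={v}")
')
# read every line into a bash variable of the same name
while IFS='=' read -r key val; do
    declare "$key=$val"
done <<< "$vars_out"
echo "$power $mode"
```

The here-string (<<<) keeps the while loop in the current shell, so the declared variables remain visible after the loop, and the python interpreter starts only once per batch of values.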

How to mix bash with python

I enjoy using unix commands very much, but I came to the point, where I would find embedded python parts useful. This is my code:
#!/bin/bash -
echo "hello!";
exec python <<END_OF_PYTHON
#!/usr/bin/env python
import sys
print ("xyzzy")
sys.exit(0)
END_OF_PYTHON
echo "goodbye!";
However, "goodbye!" never gets printed.
$ ./script.sh
hello!
xyzzy
How can I modify the bash script to fully embed python? And would it then be possible to pass values from python variables into bash variables? Thanks a lot.
On the exec python ... line, you're exec()ing the Python interpreter on your PATH, so the python image will replace the bash image, and there is absolutely no hope of the echo "goodbye!" ever being executed. If that's what you want, that's fine, but otherwise, just omit the exec.
The shebang (“#!”) line in the python code is completely unnecessary. When you try to run an ordinary file, the kernel sees the “#!”, runs whatever follows it (/usr/bin/env python), and feeds the rest of the file to the stdin of whatever has been run. This is a general facility used to invoke interpreters. Since you are invoking the python interpreter yourself, not asking the kernel to do it, this is neither needed nor useful.
The sys.exit(0) is also unnecessary, since the Python interpreter will naturally exit when it gets to the end of its input (at END_OF_PYTHON) anyway. This means that the import sys is also unnecessary.
In summary, the following is what I would write to achieve what you appear to want to achieve:
#!/bin/bash
echo "hello!";
python <<END_OF_PYTHON
print ("xyzzy")
END_OF_PYTHON
echo "goodbye!";
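On the second part of the question (moving values across the boundary), a minimal sketch: an exported variable carries a bash value into python via the environment, and command substitution carries the python result back out via stdout.

```shell
greeting="hello from bash"
export GREETING="$greeting"            # bash -> python via the environment
result=$(python3 - <<'END_OF_PYTHON'
import os
print(os.environ["GREETING"].upper())  # python -> bash via stdout
END_OF_PYTHON
)
echo "$result"
```

Quoting the heredoc delimiter ('END_OF_PYTHON') prevents bash from expanding anything inside the python code, which is usually what you want.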
Don't use exec. That replaces the shell process with the program you're running, so the rest of the script doesn't execute.
#!/bin/bash -
echo "hello!";
python <<END_OF_PYTHON
#!/usr/bin/env python
import sys
print ("xyzzy")
sys.exit(0)
END_OF_PYTHON
echo "goodbye!";
Don't use exec python, just use python.
The exec tells the shell to replace itself with the Python interpreter, so it's no longer running after that point.
Others have answered your specific issue, but in answer to the general question "How to mix bash with python", Xonsh may be useful to you. It's a special shell that allows you to use python and bash side-by-side. There's also sultan if you want to be able to easily call bash from python.
Or maybe utilize the commenting and quoting features of both languages:
''':'
# bash code below
echo 'hello world (I am bash) !'
python $0
exit 0 # 'exit' is necessary.
#'''
# python code below
import os, sys
print("hello world (I am python) !")
Output:
bash-3.1$ ./bash-with-python
hello world (I am bash) !
hello world (I am python) !

Embedding a Bash script in python without using a subprocess call

I have been able to use subprocess to embed bash script in python. I happened to be navigating through some python code today and stumbled across the lines below, which also embed bash script in python, using a construct analogous to a docstring.
#!/bin/bash -
''''echo -n
if [[ $0 == "file" ]]; then
..
fi
'''
Can someone throw light on this approach? What is it called, and what are the benefits associated with it? I can obviously see the simplicity, but I think there's more to this than that.
This is a somewhat clever way to make the file both a valid Python script and a valid bash script. Note that it does not cause a subprocess to magically be spawned. Rather, if the file is evaluated by bash, the bash script will be run, and if it is evaluated by Python, the bash script will be ignored.
It's clever, but probably not a good software engineering practice in general. It usually makes more sense to have separate scripts.
To give a more concrete example (say this file is called "polyglot"):
''''echo hello from bash
exit
'''
print('hello from python')
As you note, bash will ignore the initial quotes, and print "hello from bash", and then exit before reaching the triple quote. And Python will treat the bash script as a string, and ignore it, running the Python script below.
$ python polyglot
hello from python
$ bash polyglot
hello from bash
But naturally, this can usually (and more clearly) be refactored into two scripts, one in each language.
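The two runs above can be reproduced verbatim by writing the snippet to a file and invoking each interpreter on it:

```shell
cat > /tmp/polyglot <<'EOF'
''''echo hello from bash
exit
'''
print('hello from python')
EOF
bash /tmp/polyglot      # to bash, the four quotes collapse to empty strings, so echo runs, then exit
python3 /tmp/polyglot   # to python, everything up to the closing ''' is one string literal
```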
No, that's not embedded in python; the shebang says it's a bash script.
To bash, the '''' is '' twice, which is just an empty string and has no effect.
The trailing ''' is then invalid to bash, as the last ' is not closed.
