How to mix bash with python

I enjoy using unix commands very much, but I have reached the point where I would find embedded Python parts useful. This is my code:
#!/bin/bash -
echo "hello!";
exec python <<END_OF_PYTHON
#!/usr/bin/env python
import sys
print ("xyzzy")
sys.exit(0)
END_OF_PYTHON
echo "goodbye!";
However, "goodbye!" never gets printed:
$ ./script.sh
hello!
xyzzy
How can I modify the bash script to fully embed the Python part? And would it then be possible to pass values from Python variables into bash variables? Thanks a lot.

On the exec python ... line, you're exec()ing the Python interpreter on your PATH, so the python image will replace the bash image, and there is absolutely no hope of the echo "goodbye!" ever being executed. If that's what you want, that's fine, but otherwise, just omit the exec.
The shebang ("#!") line in the Python code is completely unnecessary. When you run an ordinary executable file, the kernel sees the "#!", runs whatever follows it (/usr/bin/env python), and passes the path of the file to it as an argument; the interpreter then reads and executes the file itself. This is the general facility used to invoke interpreters. Since you are invoking the Python interpreter yourself, not asking the kernel to do it, the line is neither needed nor useful (to Python it is just a comment).
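As a small illustration of that mechanism (the file name show_shebang is just for this sketch), point the interpreter line at /bin/cat and the kernel simply runs cat with the file's own path as its argument, so the file prints itself:
$ cat show_shebang
#!/bin/cat
hello from the file body
$ chmod +x show_shebang && ./show_shebang
#!/bin/cat
hello from the file body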
The sys.exit(0) is also unnecessary, since the Python interpreter exits naturally when it reaches the end of its input (at END_OF_PYTHON), which in turn makes the import sys unnecessary.
In summary, the following is what I would write to achieve what you appear to want to achieve:
#!/bin/bash
echo "hello!";
python <<END_OF_PYTHON
print ("xyzzy")
END_OF_PYTHON
echo "goodbye!";

Don't use exec. That replaces the shell process with the program you're running, so the rest of the script doesn't execute.
#!/bin/bash -
echo "hello!";
python <<END_OF_PYTHON
#!/usr/bin/env python
import sys
print ("xyzzy")
sys.exit(0)
END_OF_PYTHON
echo "goodbye!";

Don't use exec python, just use python.
The exec tells the shell to replace itself with the Python interpreter, so it's no longer running after that point.
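A quick way to see the difference (demo.sh is a hypothetical name for this sketch):
#!/bin/bash
echo "before"
exec python -c 'print("python has replaced bash")'
echo "after"   # never reached: exec replaced the shell process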

Others have answered your specific issue, but in answer to the general question "How to mix bash with python", Xonsh may be useful to you. It's a special shell that allows you to use python and bash side-by-side. There's also sultan if you want to be able to easily call bash from python.

Or you can exploit the comment and quoting features of both languages:
''':'
# bash code below
echo 'hello world (I am bash) !'
python "$0"
exit 0 # 'exit' is necessary, so bash never reads the python code below.
#'''
# python code below
import os, sys
print("hello world (I am python) !")
Output:
bash-3.1$ ./bash-with-python
hello world (I am bash) !
hello world (I am python) !
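The same trick can also pass a value from the Python half back to the bash half, because the bash half can capture the output of re-running the file under python. A sketch along the same lines (the variable name is just for illustration):
''':'
# bash code below: re-run this same file under python and capture its output
py_value=$(python "$0")
echo "bash received: $py_value"
exit 0 # 'exit' is still necessary.
#'''
# python code below
print(40 + 2)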

Related

Forcing -i from a command line script [duplicate]

I have a python script that I like to run with python -i script.py, which runs the script and then enters interactive mode so that I can play around with the results.
Is it possible to have the script itself invoke this option, such that I can just run python script.py and the script will enter interactive mode after running?
Of course, I can simply add the -i, or if that is too much effort, I can write a shell script to invoke this.
From within script.py, set the PYTHONINSPECT environment variable to any nonempty string. Python will recheck this environment variable at the end of the program and enter interactive mode.
import os
# This can be placed at top or bottom of the script, unlike code.interact
os.environ['PYTHONINSPECT'] = 'TRUE'
In addition to all the above answers, you can run the script as simply ./script.py by making the file executable and setting the shebang line, e.g.
#!/usr/bin/python -i
this = "A really boring program"
If you want to use this with the env command in order to get the system default python, then you can try a shebang like the one #donkopotamus suggested in the comments:
#!/usr/bin/env PYTHONINSPECT=1 python
The success of this may depend on the version of env installed on your platform however.
You could use an instance of code.InteractiveConsole to get this to work:
from code import InteractiveConsole
i = 20
d = 30
InteractiveConsole(locals=locals()).interact()
Running this with python script.py launches an interactive interpreter as the final statement and makes the locally defined names visible via locals=locals().
>>> i
20
Similarly, a convenience function named code.interact can be used:
from code import interact
i = 20
d = 30
interact(local=locals())
This creates the instance for you, with the only caveat that locals is named local instead.
In addition to this, as #Blender stated in the comments, you could also embed the IPython REPL by using:
import IPython
IPython.embed()
which has the added benefit of not requiring the namespace that has been populated in your script to be passed with locals.
I think you're looking for this?
import code
foo = 'bar'
print(foo)
code.interact(local=locals())
I would simply accompany the script with a shell script that invokes it.
exec python -i "$(dirname "$0")/script.py"

Does Ruby have a version of `python -i`?

I've been looking for a while, but I haven't found anything in Ruby like python's -i flag.
A common behaviour for me when testing something is to run the unfinished Python script with the -i flag so that I can see and play around with the values in each variable.
If I try irb <file>, it still terminates at EOF, and obviously ruby <file> doesn't work either. Is there a command-line flag that I'm missing, or some other way this functionality can be achieved?
Edit: Added an explanation of what kind of functionality I'm talking about.
Current Behaviour in Python
file.py
a = 1
Command Prompt
$ python -i file.py
>>> a
1
As you can see, the value of the variable a is available in the console too.
You can use irb -r ./filename.rb (-r for "require"), which should basically do the same as python -i ./filename.py.
Edit to better answer the refined question:
Actually, irb -r ./filename.rb does the equivalent of running irb and subsequently running
irb(main):001:0> require './filename.rb'. Thus, local variables from filename.rb do not end up in scope for inspection.
python -i ./filename.py seems to do the equivalent of adding binding.irb to the last line of the file and then running it with ruby ./filename.rb. There seems to be no one-liner equivalent to achieve this exact behaviour for ruby.
Is there a command-line flag that I'm missing, or some other way this functionality can be achieved?
Yes, there are both. I'll cover an "other way".
Starting with Ruby 2.5, you can put a binding.irb at any point in your code, and the program will drop into an interactive console when it reaches it.
% cat stop.rb
puts 'hello'
binding.irb
Then
% ruby stop.rb
hello
From: stop.rb # line 3 :
1: puts 'hello'
2:
=> 3: binding.irb
irb(main):001:0>
This was possible long before with the pry gem, but now it is part of the standard library.
You can use the irb command. Once it has started, you can load and execute any Ruby file with load './filename.rb'.

Collecting return values from python functions in bash

I am implementing a bash script that will call a python script's function/method. I want to collect the return value of this function into a local variable in the calling bash script.
try1.sh contains:
#!/bin/sh
RETURN_VALUE=`python -c 'import try3; try3.printTry()'`
echo $RETURN_VALUE
Now the python script:
#!/usr/bin/python
def printTry():
    print 'Hello World'
    return 'true'
On executing the bash script:
$ ./try1.sh
Hello World
there is no 'true' (or any other returned value) echoed to stdout, as I would like.
Another thing I want to be able to do: my actual Python code will have around 20-30 functions returning various state values of my software's state machine, and I would call these functions from a bash script. In the bash script, I have to store these return values in local variables, which are used further down in the state machine logic implemented in the calling bash script.
For each value, I would have to do python -c 'import python_module; python_module.method_name()', which would re-enumerate the defined states of the state machine again and again, which I do not want. I want to avoid running the entire Python script just to call a single function. Is that possible?
What possible solutions/suggestions/ideas can be thought of here?
I would appreciate the replies.
To clarify my intent: the task is to replace parts of the bash script with Python to improve readability. The bash script is very large (~15000 lines) and hence cannot be replaced entirely by a single Python script, so the parts identified as candidates for improvement can be replaced with Python.
Also, I had thought of replacing the entire bash script with a Python script, as suggested by Victor in the comment below, but that is not feasible in my situation. Hence, the state machine would have to be divided between bash and Python, with Python providing the methods that return the state values required by the bash script.
Regards,
Yusuf Husainy.
If you don't care about what the python function prints to stdout, you could do this:
$ py_ret_val=$(python -c '
from __future__ import print_function
import sys, try3
print(try3.printTry(), file=sys.stderr)
' 2>&1 1>/dev/null)
$ echo $py_ret_val
true
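The 2>&1 1>/dev/null swap routes the value printed to stderr into the captured output while discarding whatever the function prints to stdout. As for the concern about starting the interpreter once per function, you can call several functions in a single python invocation and have it emit shell assignments. A sketch, assuming the module exposes functions that return their values rather than printing them (get_state_a and get_state_b are hypothetical names):
#!/bin/sh
# one interpreter start-up prints several NAME=value lines, which eval then loads
eval "$(python -c '
import try3
print("STATE_A=%s" % try3.get_state_a())
print("STATE_B=%s" % try3.get_state_b())
')"
echo "$STATE_A $STATE_B"
Only use eval like this if you trust what the functions return, since every line they print is executed by the shell.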

embedding Bash Script in python without using subprocess call

I have been able to use subprocess to embed bash scripts in Python. Navigating through some Python code today, I stumbled across the lines below, which also embed a bash script in Python, using a construct analogous to a docstring.
#!/bin/bash -
''''echo -n
if [[ $0 == "file" ]]; then
..
fi
'''
Can someone throw light on this approach? What is it called, and what are its benefits? I can obviously see the simplicity, but I think there is more to it than that.
This is a somewhat clever way to make the file both a valid Python script and a valid bash script. Note that it does not cause a subprocess to magically be spawned. Rather, if the file is evaluated by bash, the bash script will be run, and if it is evaluated by Python, the bash script will be ignored.
It's clever, but probably not a good software engineering practice in general. It usually makes more sense to have separate scripts.
To give a more concrete example (say this file is called "polyglot"):
''''echo hello from bash
exit
'''
print('hello from python')
As you note, bash will ignore the initial quotes, and print "hello from bash", and then exit before reaching the triple quote. And Python will treat the bash script as a string, and ignore it, running the Python script below.
$ python polyglot
hello from python
$ bash polyglot
hello from bash
But naturally, this can usually (and more clearly) be refactored into two scripts, one in each language.
No, that's not embedded in Python; the shebang says it's a bash script.
The '''' is '' twice, which is just an empty string, so it has no effect.
The ''' is invalid, as the last ' is not closed.
