Calling a Python function from a subshell

I am writing a Python program which runs some bash scripts in sub-processes.
One might say that I am using bash as a scripting language for my Python program.
Is there a way to inject Python functions into my bash script without having to reload the Python code from bash?
Strawman example: if this were Python running JavaScript, I could bind my Python functions into the js2py VM and call them from within JavaScript.
I thought of calling python some_file_of_mine.py from bash, but this would launch a new Python process without access to my Python program's data.
I also thought of calling python -c $SOME_INJECTED_PYTHON_CODE. This could use Python's inspect.getsource() to pre-inject some simple Python code, along with some bound data from the parent process, into the child bash shell. However, this would be very quote (` / ") sensitive, and would cause some problems with imports.
What I would really like is a simple way of calling the parent process from the bash subprocess and getting some data back (short of using a Flask + curl combination).

You can send the result of your functions to the standard output by asking the Python interpreter to print the result:
python -c 'import test; print(test.get_foo())'
The -c option simply asks Python to execute some Python commands.
In order to store the result in a variable, you can therefore do:
RESULT_FOO=`python -c 'import test; print(test.get_foo())'`
or, equivalently
RESULT_FOO=$(python -c 'import test; print(test.get_foo())')
since backticks and $(…) evaluate a command and replace it by its output.
PS: With this approach, getting the result of each function requires launching a new interpreter and re-importing the module (re-doing any parsing it performs) each time. This can be optimized by returning all the results in one go, with something like:
ALL_RESULTS=$(python -c 'import test; print(test.get_foo(), test.get_bar())')
The results can then be split and put in different variables with
RESULT_BAR=$(echo "$ALL_RESULTS" | cut -d' ' -f2)
which takes the second result and puts it in RESULT_BAR, for example (and similarly: -fn for result #n).
PS2: it would probably be easier to do everything in a single interpreter (Python, or maybe the shell), if possible, instead of calculating variables in one program and using them in another.
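For completeness, a minimal sketch of that inverted, single-interpreter approach; get_foo() here is a stand-in for whatever your module really provides, and bash only reads values handed to it through the environment:
#!/usr/bin/env python3
import os
import subprocess

def get_foo():
    # Stand-in for your real function.
    return "foo-value"

# Pass the value to bash via the environment instead of re-launching Python.
env = dict(os.environ, FOO=get_foo())
subprocess.run(["bash", "-c", 'echo "foo is $FOO"'], env=env)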

Related

how to chain 2 commands on terminal

I am trying to use subprocess in my python script to open Julia and then run a script.
To run on my machine, I enter this in terminal:
$ julia
julia> include("test.jl"); func("in.csv", "out.csv")
How do I replicate this process and chain both of these commands so that I can run from subprocess in a single call?
I've tried julia; include("test.jl"); func("in.csv", "out.csv") and julia && include("test.jl") && func("in.csv", "out.csv")
but both result in
-bash: syntax error near unexpected token `"test.jl"`
The key here is that you're not really chaining two commands from the standpoint of Python's subprocess. There's just one command: julia. You want to pass a somewhat complicated argument to Julia that will execute multiple Julia expressions.
In short, you just want to do:
subprocess.run(['julia','-e','include("test.jl"); func("in.csv", "out.csv")'])
What's happening here is that you're just executing one subprocess, julia, started up with the -e command line flag, which runs whatever code comes next in Julia. You can optionally use the capitalized -E flag instead, which will also print out whatever func (your last expression there) returns.
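If you also want that printed result back in Python, a minimal sketch (capture_output needs Python 3.7+; the script and function names are taken from the question):
import subprocess

# -E prints the value of the last expression, which we capture from stdout.
result = subprocess.run(
    ["julia", "-E", 'include("test.jl"); func("in.csv", "out.csv")'],
    capture_output=True, text=True,
)
print(result.stdout)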
It's worth pointing out, though, that there are better ways of getting Julia and Python interoperating — especially if you need to transfer data back and forth.

Is it possible to set bash/linux terminal environment Variables from Python Script? [duplicate]

I'm calling a Python script inside my bash script and I was wondering if there is a simple way to set my bash variables within my Python script.
Example:
My bash script:
#!/bin/bash
someVar=""
python3 /some/folder/pythonScript.py
My python script:
anotherVar="HelloWorld"
Is there a way I can set my someVar to the value of anotherVar? I was thinking of printing properties to a file inside the Python script and then reading them from my bash script, but maybe there is another way. Also, I don't know if it makes any difference, but I could name both variables with the same name (someVar/someVar instead of someVar/anotherVar).
No: when you execute python, you start a new process, and every process has access only to its own memory. Imagine what would happen if one process could influence another process's memory! Even for parent/child processes like this, it would be a huge security problem.
You can make python print() something and use that, though:
#!/usr/bin/env python3
print('Hello!')
And in your shell script:
#!/usr/bin/env bash
someVar=$(python3 myscript.py)
echo "$someVar"
There are, of course, many other IPC techniques you could use, such as sockets, pipes, shared memory, etc. But without more context, it's difficult to make a specific recommendation.
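For instance, a minimal named-pipe sketch (the FIFO path and message are made up for illustration):
#!/usr/bin/env bash
fifo=/tmp/demo_fifo
mkfifo "$fifo"
# Python writes one line into the FIFO in the background...
python3 -c "open('$fifo', 'w').write('HelloWorld\n')" &
# ...and bash reads it into a variable.
read -r someVar < "$fifo"
echo "$someVar"
rm "$fifo"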
shlex.quote() in Python 3, or pipes.quote() in Python 2, can be used to generate code which can be evaled by the calling shell. Thus, if the following script:
#!/usr/bin/env python3
import sys, shlex
print('export foobar=%s' % (shlex.quote(sys.argv[1].upper())))
...is named setFoobar and invoked as:
eval "$(setFoobar argOne)"
...then the calling shell will have an environment variable set with the name foobar and the value ARGONE (the script upper-cases its argument).

How to set an environment variable in the current shell with Python?

I want to set an environment variable with a Python script, influencing the shell I am starting the script in. Here is what I mean
python -c "import os;os.system('export TESTW=1')"
But the command
echo ${TESTW}
returns nothing. Also with the expression
python -c "import os;os.environ['TEST']='1'"
it does not work.
Is there another way to do this in the direct sense? Or is it better to write the variables to a file which I execute from 'outside' of the Python script?
You can modify the environment via putenv, but it will not influence the caller's environment, only the environment of forked children.
It's really much better to setup environment before launching the python script.
I can propose this variant: you create a bash script and a Python script. In the bash script, you call the Python script with a parameter naming the variable you want, one parameter per environment variable. E.g.:
#!/bin/bash
export TESTV1=$(python you_program.py testv1)
export TESTV2=$(python you_program.py testv2)
and you_program.py testv1 prints the value for just that one environment variable.
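A minimal sketch of what you_program.py might look like (the lookup table here is hypothetical; in practice the values would come from your real logic):
#!/usr/bin/env python
import sys

# Hypothetical values, one entry per environment variable the bash script exports.
VALUES = {'testv1': 'one', 'testv2': 'two'}

if __name__ == '__main__':
    print(VALUES[sys.argv[1]])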
I would strongly suggest using the solution proposed by chepner and Maxym (where the Python script provides the values and your shell exports the variables). If that is not an option for you, you could still use eval to execute what the python script writes in your current Bash process:
eval "$(python -c "print('export TESTW=1')")"
Caution: eval is usually read "evil" in Bash programming. As a general rule of thumb, one should avoid "blindly" executing code that is not fully under one's control. That includes being generated by another program at runtime as in this case. See also Stack Overflow question Why should eval be avoided in Bash, and what should I use instead?.

Collecting return values from python functions in bash

I am implementing a bash script that will call a Python script's functions/methods. I want to collect the return value of each function into a local variable in the calling bash script.
try1.sh contains:
#!/bin/sh
RETURN_VALUE=`python -c 'import try3; try3.printTry()'`
echo $RETURN_VALUE
Now the python script:
#!/usr/bin/python
def printTry():
    print 'Hello World'
    return 'true'
On executing the bash script:
$ ./try1.sh
Hello World
The desired return value 'true' is not echoed to stdout.
Another thing I would want to be able to do: my actual Python code will have around 20-30 functions returning various state values of my software state machine, and I would call these functions from a bash script. In the bash script, I have to store these return values in local variables, which are used further down in the state machine logic implemented in the calling bash script.
For each value, I would have to do python -c 'import python_module; python_module.method_name()', which would re-enumerate the defined states of the state machine again and again, which I do not want. I want to avoid running the entire Python script just to call a single function. Is that possible?
What possible solutions/suggestions/ideas can be thought of here?
I would appreciate the replies.
To clarify my intent: the task is to have part of the bash script replaced by the Python script to improve readability. The bash script is really very large (~15000 lines), and hence cannot be replaced by a single Python script entirely. So the parts which can be identified as candidates for improvement can be replaced by Python.
Also, I had thought of replacing the entire bash script with a Python script, as suggested by Victor in the comment below, but that wouldn't be feasible in my situation. Hence, the state machine has to be divided between bash and Python, where Python provides the methods returning the state values required by the bash script.
Regards,
Yusuf Husainy.
If you don't care about what the python function prints to stdout, you could do this:
$ py_ret_val=$(python -c '
from __future__ import print_function
import sys, try3
print(try3.printTry(), file=sys.stderr)
' 2>&1 1>/dev/null)
$ echo $py_ret_val
true
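Building on the batching idea from the first answer above, one interpreter launch can also return several state values at once, one per line. A sketch, assuming hypothetical get_state_a/get_state_b functions in try3, and bash 4+ for mapfile:
#!/usr/bin/env bash
mapfile -t STATES < <(python -c '
from __future__ import print_function
import try3
for f in (try3.get_state_a, try3.get_state_b):
    print(f())
')
echo "first state: ${STATES[0]}, second state: ${STATES[1]}"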

embedding Bash Script in python without using subprocess call

I have been able to use subprocess to embed bash scripts in Python. I happened to be navigating through some Python code today and stumbled across the lines below, which also embed a bash script in Python, using a construct analogous to a docstring.
#!/bin/bash -
''''echo -n
if [[ $0 == "file" ]]; then
..
fi
'''
Can someone throw light on this approach? What is it called, and what are the benefits? I can obviously see the simplicity, but I think there's more to it than that.
This is a somewhat clever way to make the file both a valid Python script and a valid bash script. Note that it does not cause a subprocess to magically be spawned. Rather, if the file is evaluated by bash, the bash script will be run, and if it is evaluated by Python, the bash script will be ignored.
It's clever, but probably not a good software engineering practice in general. It usually makes more sense to have separate scripts.
To give a more concrete example (say this file is called "polyglot"):
''''echo hello from bash
exit
'''
print('hello from python')
As noted, bash will ignore the initial quotes and print "hello from bash", then exit before reaching the triple quote. Python, in turn, will treat the bash script as a string and ignore it, running the Python code below it.
$ python polyglot
hello from python
$ bash polyglot
hello from bash
But naturally, this can usually (and more clearly) be refactored into two scripts, one in each language.
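For comparison, the refactored version could be as simple as a two-line driver plus a plain Python file (names here are made up):
#!/usr/bin/env bash
# driver.sh: same behavior as the polyglot, split into two ordinary scripts.
echo hello from bash
python hello.py   # hello.py contains just: print('hello from python')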
No, that's not embedded in Python; the shebang says it's a bash script.
From bash's point of view, the '''' is '' twice, which is just an empty string and has no effect.
The trailing ''' is invalid in bash, as the last ' is never closed.
