I have a Python script (using a pseudo-terminal) that passes an environment variable called "CDP" to a bash script:
def download(self, dPluzz, donnees=None):  # calls the bash script
    self.child_pid = self.v.fork_command(None, ['/bin/bash', 'dPluzz-cli', '-f', dest, '-u', adresse])
    os.environ["CDP"] = "False"  # set cancel flag to "False"

def cancel(self, dPluzz, donnees=None):
    if self.annul == 0:
        if self.time > 10 and self.percent != 100:
            os.environ["CDP"] = "True"
            print os.environ["CDP"]  # prints True
            self.child_pid = str(self.child_pid)
            cmd = 'kill -TERM ' + self.child_pid
            subprocess.Popen(cmd, shell=True)
def __init__(self):  # pseudo-terminal displayed in a notebook in a GTK window
    self.v = vte.Terminal()
    self.v.connect("child-exited", lambda term: self.verif(self, a))
    self.v.connect('contents-changed', self.term)
    self.v.set_size(70, 20)
    self.v.set_encoding("UTF-8")
    self.v.set_cursor_blinks(False)
    self.v.show()
    self.page.add(self.v)
The bash script is:
kill_jobs()
{
    pkill -TERM -P "$BASHPID"
    echo -e "$CDP"  # prints False, should be True
    if [ "$CDP" == "True" ]; then
        echo -e "OPERATIONS CANCELED"
    elif [ "$CDP" == "False" ]; then
        echo -e "OPERATIONS FINISHED"
    fi
}
The problem is that $CDP is "False" inside the bash script, so the wrong message is displayed.
What is the reason?
Thanks
After setting the environment via
os.environ["CDP"] = "True"
you can see this value in bash only if the bash process is started afterwards, e.g. via os.system(), os.popen(), or os.fork() and os.execv(). Your vte child was forked before you changed CDP, so it only ever sees the snapshot of the environment it inherited at fork time.
So if you add
os.system('/bin/bash script.sh')
you will be able to use the value of CDP in the bash script normally.
Please also read about os.putenv(); I guess os.environ and os.putenv() are closely related.
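A minimal standalone sketch of that rule (not the vte code above): a child receives a copy of the parent's environment when it is spawned, and changes the parent makes afterwards never reach it.
import os
import subprocess

os.environ["CDP"] = "False"
# The child gets a snapshot of the environment at spawn time.
child = subprocess.Popen(["bash", "-c", "sleep 1; echo CDP=$CDP"])

# This change happens in the parent only; the running child never sees it.
os.environ["CDP"] = "True"
child.wait()  # the child prints: CDP=False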
Related
I am using subprocess.call to execute a shell script from another application I am integrating with. This script sets environment variables with export MY_VAR=foo. Next, I need to execute more commands over subprocess with the environment that the shell script set.
How can I extract the environment state from the child process? subprocess.call only returns the exit code.
I.e. I want to run:
subprocess.call(["export", "MY_VAR=foo"])
subprocess.call(["echo", "$MY_VAR"])  # should print 'foo'
I know that I can set the environment with the env keyword, but the point of my question is how to get the environment variables that a subprocess sets. In a shell you can source any script to get its declared environment variables. What's the alternative in Python?
I ran into this issue just recently. It seems that this is a difficult problem for reasons upstream of Python: posix_spawn doesn't give a way to read the environment variables of the spawned process, nor is there any easy way to read the environment of a running process.
Bash's source is specific to running bash code in the bash interpreter: it just evals the file in the current bash interpreter rather than starting a subprocess. This mechanism can't work if you are running bash code from Python.
It is possible to make a separate mechanism specific to running bash code from Python. The following is the best I could manage; it would be nice to have a less flimsy solution.
import json
import os
import subprocess
import sys
from contextlib import AbstractContextManager
from typing import Dict

class BashRunnerWithSharedEnvironment(AbstractContextManager):
    """Run multiple bash scripts with a persistent environment.

    The environment is stored in the "env" member between runs. It can be
    updated directly to adjust the environment, or read to get variables.
    """

    def __init__(self, env=None):
        if env is None:
            env = dict(os.environ)
        self.env: Dict[str, str] = env
        self._fd_read, self._fd_write = os.pipe()

    def run(self, cmd, **opts):
        if self._fd_read is None:
            raise RuntimeError("BashRunner is already closed")
        # Append a Python one-liner that dumps the child's final
        # environment as JSON into the inherited pipe.
        write_env_pycode = ";".join(
            [
                "import os",
                "import json",
                f"os.write({self._fd_write}, json.dumps(dict(os.environ)).encode())",
            ]
        )
        write_env_shell_cmd = f"{sys.executable} -c '{write_env_pycode}'"
        cmd += "\n" + write_env_shell_cmd
        result = subprocess.run(
            ["bash", "-ce", cmd], pass_fds=[self._fd_write], env=self.env, **opts
        )
        self.env = json.loads(os.read(self._fd_read, 5000).decode())
        return result

    def __exit__(self, exc_type, exc_value, traceback):
        if self._fd_read:
            os.close(self._fd_read)
            os.close(self._fd_write)
            self._fd_read = None
            self._fd_write = None

    def __del__(self):
        self.__exit__(None, None, None)
Example:
with BashRunnerWithSharedEnvironment() as bash_runner:
    bash_runner.env.pop("A", None)

    # A plain assignment is not exported, so it does not reach the
    # captured environment...
    res = bash_runner.run("A=6; echo $A", stdout=subprocess.PIPE)
    assert res.stdout == b'6\n'
    assert bash_runner.env.get("A", None) is None

    bash_runner.run("export A=2")
    assert bash_runner.env["A"] == "2"
    res = bash_runner.run("echo $A", stdout=subprocess.PIPE)
    assert res.stdout == b'2\n'

    # ...but once A is inherited from the environment, bash keeps it
    # exported, so reassigning it is visible again.
    res = bash_runner.run("A=6; echo $A", stdout=subprocess.PIPE)
    assert res.stdout == b'6\n'
    assert bash_runner.env.get("A", None) == "6"

    bash_runner.env["A"] = "7"
    res = bash_runner.run("echo $A", stdout=subprocess.PIPE)
    assert res.stdout == b'7\n'
    assert bash_runner.env["A"] == "7"
It is not possible, because the environment is changed only in the child process. You could have the child return it as output on STDOUT or STDERR, but as soon as the subprocess has terminated you cannot access anything from it.
# this is process #1
subprocess.call(["export", "MY_VAR=foo"])
# this is process #2 - it cannot see the environment of process #1
subprocess.call(["echo", "$MY_VAR"])  # should print 'foo'
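What you can do instead is have the child print its environment to stdout before it exits, and parse that in the parent. A minimal sketch, assuming the script that does export MY_VAR=foo is saved as script.sh (a placeholder name):
import subprocess

# Source the script in one bash child, then dump the resulting environment.
out = subprocess.check_output(["bash", "-c", ". ./script.sh; env"])
env = dict(
    line.split("=", 1) for line in out.decode().splitlines() if "=" in line
)
print(env["MY_VAR"])  # -> foo
(Parsing env output line by line breaks on values containing newlines; env -0 with null separators is more robust.)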
Not sure I see the problem here. You just need to remember the following:
each subprocess that gets started is independent of any setup done in previous subprocesses
if you want to set up some variables and use them, do both things in ONE process
So make setupVars.sh like this:
export vHello="hello"
export vDate=$(date)
export vRandom=$RANDOM
And make printVars.sh like this:
#!/bin/bash
echo $vHello, $vDate, $vRandom
And make that executable with:
chmod +x printVars.sh
Now your Python looks like this:
import subprocess
subprocess.call(["bash","-c","source setupVars.sh; ./printVars.sh"])
Output
hello, Mon Jul 12 00:32:29 BST 2021, 8615
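If you also need one of those values back in Python rather than just printed, the same one-process rule applies. A small sketch reusing the setupVars.sh above:
import subprocess

# Source the setup script and emit a single variable, all in one bash
# process, then capture it in Python.
vhello = subprocess.check_output(
    ["bash", "-c", "source setupVars.sh; printf '%s' \"$vHello\""]
).decode()
print(vhello)  # -> hello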
I tried to use the Python practice if __name__ == "__main__": in a shell script.
Sample scripts are the following:
a.sh:
#!/bin/bash
filename="a.sh"

function main() {
    echo "start from $0"
    echo "a.sh is called"
    source b.sh
    bfunc
}

[[ "$0" == "${filename}" ]] && main
b.sh:
#!/bin/bash
filename="b.sh"

function main() {
    echo "start from $0"
    echo "b.sh is called"
}

function bfunc() {
    echo "hello bfunc"
}

[[ "$0" == "${filename}" ]] && main
If you call it with bash a.sh, you'll get the following:
start from a.sh
a.sh is called
hello bfunc
Here are my questions.
How can I get the file name itself without using $0? I don't want to write the file name out literally, i.e. I want ${filename} to be filled in automatically.
(See this link if you don't know the Python practice above: What does if __name__ == "__main__": do?)
How can I check whether b.sh was started from the command line or was sourced from a.sh?
You may use the variable $BASH_SOURCE to get the name of the current script file.
if [[ "$0" == "$BASH_SOURCE" ]]
then
: "Execute only if started from current script"
else
: "Execute when included in another script"
fi
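To watch both branches fire from Python (a sketch; it assumes the snippet above is saved as check.sh and the : placeholders are replaced with echo commands):
import subprocess

# Run directly: $0 and $BASH_SOURCE are both "check.sh", so the
# "started from current script" branch runs.
subprocess.call(["bash", "check.sh"])

# Sourced from another bash: $0 is "bash" but $BASH_SOURCE is
# "./check.sh", so the "included in another script" branch runs.
subprocess.call(["bash", "-c", "source ./check.sh"])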
I am new to Python. I am creating a Python script that returns the string "hello world", and a shell script that calls the Python script.
I need to pass arguments from the shell to Python.
I need to print the value returned from Python in the shell script.
This is my code:
shellscript1.sh
#!/bin/bash
# script for testing
clear
echo "............script started............"
sleep 1
python python/pythonScript1.py
exit
pythonScript1.py
#!/usr/bin/python
import sys

print "Starting python script!"
try:
    sys.exit('helloWorld1')
except:
    sys.exit('helloWorld2')
You can't return a message as the exit code, only numbers. In bash it is accessible via $?. You can also use sys.argv to access the command-line parameters:
import sys

if sys.argv[1] == 'hi':
    print 'Salaam'
sys.exit(0)
In the shell:
#!/bin/bash
# script for testing
clear
echo "............script started............"
sleep 1
result=`python python/pythonScript1.py "hi"`
if [ "$result" == "Salaam" ]; then
    echo "script returned the correct response"
fi
Pass command-line arguments from the shell script to Python like this:
python script.py $1 $2 $3
Print the return code like this:
echo $?
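On the Python side those forwarded arguments arrive in sys.argv; a minimal sketch of a script.py that inspects them:
import sys

# sys.argv[0] is the script name; the rest are the shell's $1 $2 $3.
print(sys.argv[1:])  # python script.py a b c  ->  ['a', 'b', 'c']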
You can also use exit() without sys; one less thing to import. Here's an example:
$ python
>>> exit(1)
$ echo $?
1
$ python
>>> exit(0)
$ echo $?
0
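The same status is also visible from Python itself via the return code of a subprocess call; a small sketch:
import subprocess

# exit(3) in the child becomes the numeric status the parent reads,
# just like $? in a shell.
rc = subprocess.call(["python", "-c", "exit(3)"])
print(rc)  # -> 3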
I'm calling a bash script which exports a few variables. I found a way to get those variables, and it works, but once I try to add arguments to my bash script it fails.
Here is part of my Python script:
bash_script = "./testBash.sh"
script_execution = Popen(["bash", "-c", "trap 'env' exit; source \"$1\" > /dev/null 2>&1",
                          "_", bash_script], shell=False, stdout=PIPE)
err_code = script_execution.wait()
variables = script_execution.communicate()[0]
This is my sample bash script:
export var1="test1"
export var2=$var1/test2
echo "this is the first var: var1=$var1"
echo "this is the second var: var2=$var2"
Once I change bash_script = "./testBash.sh" to bash_script = "./testBash.sh test test", I no longer get back the exported variables from the bash script into the variables variable in the Python script.
The above is only a sample; of course my real scripts are much bigger.
If you change bash_script = "./testBash.sh" to bash_script = "./testBash.sh test test", then the name of the bash_script changes to "./testBash.sh test test". The "test test" part is not interpreted as separate arguments.
Instead, add the extra arguments to the list being passed to Popen:
bash_script = "./testBash.sh"
script_execution = Popen(
    ["bash", "-c", "trap 'env' exit; source \"$1\" > /dev/null 2>&1",
     "_", bash_script, 'test', 'test'], shell=False, stdout=PIPE)
Then err_code will be 0 (indicating success) instead of 1. It's not clear from your posted code, however, what you want to happen; your current testBash.sh ignores the extra test arguments.
The extra arguments are received by the bash script, however. If instead you put
export var1="$2"
in testBash.sh, then variables (in the Python script) would contain
var1=test
You might also find it more convenient to use
import subprocess
import os

def source(script, update=True):
    """
    http://pythonwise.blogspot.fr/2010/04/sourcing-shell-script.html (Miki Tebeka)
    http://stackoverflow.com/a/20669683/190597 (unutbu)
    """
    proc = subprocess.Popen(
        ". %s; env -0" % script, stdout=subprocess.PIPE, shell=True)
    # decode() so the null-separated output can be split as text (Python 3)
    output = proc.communicate()[0].decode()
    env = dict(line.split("=", 1) for line in output.split('\x00') if line)
    if update:
        os.environ.update(env)
    return env

bash_script = "./testBash.sh"
variables = source(bash_script)
print(variables)
which yields the environment variables
{ 'var1': 'test1', 'var2': 'test1/test2', ... }
in a dict.
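Since update=True also merges the result into os.environ, any subprocess started afterwards inherits those variables automatically; for example:
import subprocess

source("./testBash.sh")  # also updates os.environ
# Children spawned from now on see var1 and var2 without further work.
subprocess.call(["bash", "-c", "echo $var2"])  # prints: test1/test2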
I have two scripts:
Python:
if canceled == True:
    os.environ["c"] = "0"
if canceled == False:
    os.environ["c"] = "1"
Bash:
kill_jobs()
{
    pkill -TERM -P "$BASHPID"
    echo $c
    if [ $c == "0" ]
    then
        echo -e "OPERATIONS CANCELED"
    elif [ $c == "1" ]
    then
        echo -e "OPERATIONS FINISHED"
    fi
}
trap kill_jobs EXIT
How can I pass a Python variable to the bash script?
(My level in bash is near to 0...)
Thanks
Edit: Now I have this error (message translated from French):
[: ==: unary operator expected
Or you can try:
os.environ["c"] = "value"
You can set it this way, I guess. Refer to the os.environ documentation.
The Python script should end with:
print c
Then you use it in bash with:
c=$(python_script)
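A minimal sketch of that approach (cancel_flag.py is a hypothetical file name):
# cancel_flag.py: print the flag on stdout instead of setting os.environ
canceled = False
print("0" if canceled else "1")
Then c=$(python cancel_flag.py) in bash; quoting the variable in the test ([ "$c" == "0" ]) also avoids the "unary operator expected" error when c is empty.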