Load environment variables from a shell script - python

I have a file with some environment variables that I want to use in a python script. The following works from the command line:
$ source myFile.sh
$ python ./myScript.py
and from inside the python script I can access the variables like
import os
os.getenv('myvariable')
How can I source the shell script, and then access the variables, from within the python script?

If you are asking about backward environment propagation (the sourced variables showing up in the shell that launched python), sorry, you can't: a child process cannot modify its parent's environment. However, sourcing an environment file directly from python is definitely valid. But it's more or less a manual process.
import subprocess as sp

SOURCE = 'your_file_path'
proc = sp.Popen(['bash', '-c', 'source {} && env'.format(SOURCE)], stdout=sp.PIPE)
source_env = dict(line.decode().strip().split('=', 1)
                  for line in proc.stdout if b'=' in line)
Then you have everything you need in source_env.
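One caveat: env separates entries with newlines, so a value that itself contains a newline will break the line-by-line parse. A minimal sketch of a more robust variant, assuming GNU env's -0 flag (null-separated records) and reusing SOURCE and sp from above:
proc = sp.Popen(['bash', '-c', 'source {} && env -0'.format(SOURCE)], stdout=sp.PIPE)
source_env = dict(
    chunk.split('=', 1)                      # NAME=VALUE, even if VALUE spans lines
    for chunk in proc.stdout.read().decode().split('\0')
    if '=' in chunk                          # drops the trailing empty chunk
)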
If you need to write it back to your local environment (which is not recommended, since source_env keeps you clean):
import os

for k, v in source_env.items():
    os.environ[k] = v
One more small detail needs attention here: since bash does the sourcing, bash's rules apply. So if you want your variables to be seen by env, you will need to export them.
export VAR1='see me'
VAR2='but not me'
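A quick check of that distinction, reusing source_env from the parsing code above (assuming the two lines above are what's in your_file_path):
print(source_env.get('VAR1'))  # 'see me'
print('VAR2' in source_env)    # False: unexported variables never reach `env`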

You cannot load environment variables directly from a bash or shell script in general; it is a different language. You will have to use bash to evaluate the file, print out the variables, and then read them from python. See: Forcing bash to expand variables in a string loaded from a file
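For the question's concrete case, the round trip can be a single bash -c call; a minimal sketch, assuming myFile.sh exports myvariable as in the question:
import subprocess

# let bash source the file and print the expanded value back to us
value = subprocess.check_output(
    ['bash', '-c', 'source myFile.sh && printf "%s" "$myvariable"']
).decode()
print(value)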

Related

How can you create an os.environ object with a modified environment, e.g. after loading many different modules with "module load"?

I have a python script that calls an application using subprocess. I am calling this application many times, currently I am doing something along the lines of
out, err = subprocess.Popen(f"module load {' '.join(my_module_list)} && ./my_binary", stdout=subprocess.PIPE, stderr=subprocess.STDOUT, shell=True).communicate()
to run my program. Ideally I would like to first generate a modified os.environ object that already contains all the paths to the modules I am loading, and then pass it to subprocess.Popen under the env argument. However, since the printenv command doesn't output a python dictionary format, I'm not sure how to access all the modifications that modules load makes to the environment variables. Is there a good, clean way to create the required modified os.environ object?
I'd be tempted to call python in the subprocess and dump os.environ from inside it:
python -c 'import os; print(os.environ)'
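If you want output that the outer process can parse reliably, one option (a sketch, not the only way) is to dump the environment as JSON instead of the dict repr, and json.loads() it on the other side:
python -c 'import os, json; print(json.dumps(dict(os.environ)))'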
Once you know what you're after, you can pass a dict directly to subprocess's env arg to set custom environment variables, which could be something like:
custom_env = os.environ.copy()
custom_env["foo"] = "bar"
subprocess.Popen(
    ...
    env=custom_env,
)
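Putting the two halves together for the module load case: capture the environment after loading the modules, parse it into a dict, and pass that dict via env. This is a sketch, not a definitive implementation: it assumes the module command is available to a login bash (hence -l), uses env -0 so values containing newlines survive, and the module names are placeholders:
import subprocess

my_module_list = ["gcc", "openmpi"]  # placeholder module names
out = subprocess.check_output(
    ['bash', '-lc', f"module load {' '.join(my_module_list)} && env -0"])
# each null-separated entry is NAME=VALUE
module_env = dict(
    entry.split('=', 1)
    for entry in out.decode().split('\0')
    if '=' in entry)
subprocess.Popen(["./my_binary"], env=module_env)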

How to call . /home/test.sh file in python script

I have a file called . /home/test.sh (the space between the first . and / is intentional) which contains some environment variables. I need to load this file and then run the .py. If I run the command manually first on the Linux server and then run the python script, it generates the required output. However, I want to call . /home/test.sh from within python to load the profile and run the rest of the code. If this profile is not loaded, the python script runs and gives 0 as output.
The call
subprocess.call('. /home/test.sh', shell=True)
runs fine, but the profile is not loaded into the shell that runs the rest of the python code, so the desired output is not produced.
Can someone help?
Environment variables set in a child process are not propagated back to the parent process, which is why your simple approach does not work.
If you are trying to pick up environment variables that have been set in your test.sh, then one thing you could do instead is to use env in a sub-shell to write them to stdout after sourcing the script, and then in Python you can parse these and set them locally.
The code below will work provided that test.sh does not write any output itself. (If it does, then you could work around it by echoing some separator string after sourcing it, and before running env; in the Python code, strip off the separator string and everything before it, as shown after the example below.)
import subprocess
import os

p = subprocess.Popen(". /home/test.sh; env -0", shell=True,
                     stdout=subprocess.PIPE,
                     stderr=subprocess.PIPE)
out, _ = p.communicate()

for varspec in out.decode().split("\x00")[:-1]:
    pos = varspec.index("=")
    name = varspec[:pos]
    value = varspec[pos + 1:]
    os.environ[name] = value
# just to test whether it works - output of the following should include
# the variables that were set
os.system("env")
It is also worth considering that if all you want to do is set some environment variables every time before you run any python code, then one option is simply to source your test.sh from a shell-script wrapper, and not try to set them inside python at all:
#!/bin/sh
. /home/test.sh
exec "/path/to/your/python/script $#"
Then when you want to run the Python code, you run the wrapper instead.

Set environment variables from a file using python

I have a file that contains a set of environment variables.
env_script.env:
export a=hjk
export b=jkjk
export c=kjjhh
export i=jkkl
..........
I want to set these environment variables by reading them from the file.
How can I do this in python?
Tried sample code:
pipe = subprocess.Popen(". %s; env" % "/home/user/env_script.env",
                        stdout=subprocess.PIPE, shell=True)
output = pipe.communicate()[0].decode()
env = dict(line.split("=", 1) for line in output.splitlines() if "=" in line)
os.environ.update(env)
Please give some suggestions.
There's a great python library, python-dotenv, that allows you to export variables into your environment from a .env file, or any file you want, which you can keep out of source control (e.g. add it to .gitignore):
# to install
pip install -U python-dotenv
# your .env file
export MY_VAR_A=super-secret-value
export MY_VAR_B=other-very-secret-value
...
And you just load it in python when you start, like:
# settings.py
from dotenv import load_dotenv
load_dotenv()
Then, you can access any variable later in your code:
from os import environ
my_var_a = environ.get('MY_VAR_A')
print(my_var_a)  # 'super-secret-value'
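load_dotenv also accepts an explicit path and an override flag (both part of python-dotenv's documented signature), which is handy when the file isn't named .env:
from dotenv import load_dotenv

load_dotenv(dotenv_path='/home/user/env_script.env', override=True)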
You don't need to use subprocess.
Read the lines, split each into an environment variable name and value, and assign them to os.environ:
import os
with open('/home/user/env_script.env') as f:
    for line in f:
        if 'export' not in line:
            continue
        if line.startswith('#'):
            continue
        # Remove leading `export `,
        # then split the name / value pair
        key, value = line.replace('export ', '', 1).strip().split('=', 1)
        os.environ[key] = value
or using dict.update and a generator expression:
with open('env_script.env') as f:
    os.environ.update(
        line.replace('export ', '', 1).strip().split('=', 1)
        for line in f if 'export' in line
    )
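If the file ever contains quoted values such as a="hello world", the naive split keeps the quotes. A sketch using the standard shlex module to handle shell quoting, under the same one-assignment-per-line assumption:
import os
import shlex

with open('/home/user/env_script.env') as f:
    for line in f:
        # tokenize like a shell would; drops comments and strips quotes
        tokens = shlex.split(line, comments=True)
        if tokens and tokens[0] == 'export':
            tokens = tokens[1:]
        for token in tokens:
            if '=' in token:
                key, value = token.split('=', 1)
                os.environ[key] = value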
Alternatively, you can make a wrapper shell script which sources env_script.env and then executes the original python file:
#!/bin/bash
source /home/user/env_script.env
python /path/to/original_script.py
Modern operating systems do not allow a child process to change the environment of its parent. The environment can only be changed for the current process and its descendants. And a Python interpreter is a child of the calling shell.
That's the reason why source is not an external command but is interpreted directly by the shell to allow a change in its environment.
It used to be possible in the good old MS/DOS days with the .COM executable format. A .com executable file had a 256-byte (0x100) preamble which included a pointer to COMMAND.COM's environment string! So with low-level memory functions, and after ensuring nothing past the environment would be overwritten, a command could directly change its parent's environment.
It may still be possible in modern OSes, but it requires cooperation from the system. For example, Windows can allow a process to get read/write access to the memory of another process, provided the appropriate permissions are set. But this is a really hacky way, and I would not dare to do it in Python.
TL/DR: if your requirement is to change the environment of the calling shell from a Python script, you have misunderstood your requirement.
But what is easy is to start a new shell with a modified environment:
import os
import subprocess
env = os.environ.copy()  # get a copy of the current environment
# modify the copy of the environment at will, using for example falsetru's answer
# here is just an example
env['AAA'] = 'BBB'
# and open a subshell with the modified environment
p = subprocess.Popen("/bin/sh", env=env)
p.wait()

Import bash variables from a python script

I have seen plenty of examples of running a python script from inside a bash script, either passing in variables as arguments or using export to give the child shell access. I am trying to do the opposite here, though.
I am running a python script and have a separate file, let's call it myGlobalVariables.bash
myGlobalVariables.bash:
foo_1="var1"
foo_2="var2"
foo_3="var3"
My python script needs to use these variables.
For a very simple example:
myPythonScript.py:
print "foo_1: {}".format(foo_1)
Is there a way I can import them directly? Also, I do not want to alter the bash script if possible since it is a common file referenced many times elsewhere.
If your .bash file is formatted as you indicated, you might be able to just import it directly as a Python module via the imp module.
import imp
bash_module = imp.load_source("bash_module", "/path/to/myGlobalVariables.bash")
print bash_module.foo_1
You can also use os.environ:
Bash:
#!/bin/bash
# the variable must be exported to be visible to child processes
export testtest=one
Python:
#!/usr/bin/python
import os
os.environ['testtest'] # 'one'
I am very new to python, so I would welcome suggestions for more idiomatic ways to do this, but the following code uses bash itself to tell us which values get set. It first calls bash with an empty environment (env -i bash) to establish which variables are set as a baseline; it then calls bash again, telling it to source your "variables" file, and reports which variables are now set. After removing some false positives and an apparently blank line, I loop through the "additional" output, looking for variables that were not in the baseline. Newly seen variables get split (carefully) and put into the bash dictionary.

I've left here (but commented out) my previous idea of using exec to set the variables natively in python, but I ran into quoting/escaping issues, so I switched gears to using a dict.
If the exact call (path, etc.) to your "variables" file is different from mine, then you'll need to change all instances of that value: in the subprocess.check_output() call and in the list.remove() calls.
Here's the sample variable file I was using, just to demonstrate some of the things that could happen:
foo_1="var1"
foo_2="var2"
foo_3="var3"
if [[ -z $foo_3 ]]; then
    foo_4="test"
else
    foo_4="testing"
fi
foo_5="O'Neil"
foo_6='I love" quotes'
foo_7="embedded
newline"
... and here's the python script:
#!/usr/bin/env python
import subprocess
output = subprocess.check_output(['env', '-i', 'bash', '-c', 'set'])
baseline = output.split("\n")
output = subprocess.check_output(['env', '-i', 'bash', '-c', '. myGlobalVariables.bash; set'])
additional = output.split("\n")
# these get set when ". myGlobal..." runs and so are false positives
additional.remove("BASH_EXECUTION_STRING='. myGlobalVariables.bash; set'")
additional.remove('PIPESTATUS=([0]="0")')
additional.remove('_=myGlobalVariables.bash')
# I get an empty item at the end (blank line from subprocess?)
additional.remove('')
bash = {}
for assign in additional:
    if assign not in baseline:
        name, value = assign.split("=", 1)
        bash[name] = value
        #exec(name + '="' + value + '"')

print "New values:"
for key in bash:
    print "Key: ", key, " = ", bash[key]
Another way to do it:
Inspired by Marat's answer, I came up with this two-stage hack. Start with a python program, let's call it "stage 1", which uses subprocess to call bash to source the variable file, as my above answer does, but it then tells bash to export all of the variables, and then exec the rest of your python program, which is in "stage 2".
Stage 1 python program:
#!/usr/bin/env python
import subprocess
status = subprocess.call(
    ['bash', '-c',
     '. myGlobalVariables.bash; export $(compgen -v); exec ./stage2.py'
    ])
Stage 2 python program:
#!/usr/bin/env python
# anything you want! for example,
import os
for key in os.environ:
    print key, " = ", os.environ[key]
As stated in theorifice's answer above, the trick here is that such a file may be interpreted both as bash and as python code. But that answer is outdated: the imp module is deprecated in favour of importlib.
As your file has extension other than ".py", you can use the following approach:
from importlib.util import spec_from_loader, module_from_spec
from importlib.machinery import SourceFileLoader
spec = spec_from_loader("foobar", SourceFileLoader("foobar", "myGlobalVariables.bash"))
foobar = module_from_spec(spec)
spec.loader.exec_module(foobar)
I do not completely understand how this code works (what the two "foobar" parameters are for); however, it worked for me. Found it here.
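A quick check, assuming the myGlobalVariables.bash contents from the question:
print(foobar.foo_1)  # prints 'var1'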

From Python execute shell command and incorporate environment changes (without subprocess)?

I'm exploring using iPython as a shell replacement for a workflow that requires good logging and reproducibility of actions.
I have a few non-python binary programs and bash shell commands to run during my common workflow that manipulate the environment variables affecting subsequent work. i.e. when run from bash, the environment changes.
How can I incorporate these cases into the Python / iPython interactive shell and modify the environment going forward in the session?
Let's focus on the most critical case.
From bash, I would do:
> sysmanager initialize foo
where sysmanager is a function:
> type sysmanager
sysmanager is a function
sysmanager ()
{
    eval `/usr/bin/sysmanagercmd bash $*`
}
I don't control the binary sysmanagercmd and it generally makes non-trivial manipulations of the environment variables. Use of the eval built-in means these manipulations affect the shell process going forward -- that's critical to the design.
How can I call this command from Python / iPython with the same effect? Does python have something equivalent to bash's eval built-in for non-python commands?
Having not come across any built-in capability to do this, I wrote the following function, which accomplishes the broad intent. Environment variable modifications and changes of working directory are reflected in the python shell after the function returns. Modifications of shell aliases or functions are not retained, but that could be done too with enhancement of this function.
#!/usr/bin/env python3
"""
Some functionality useful when working with IPython as a shell replacement.
"""
import subprocess
import tempfile
import os


def ShellEval(command_str):
    """
    Evaluate the supplied command string in the system shell.
    Operates like the shell eval command:
      - Environment variable changes are pulled into the Python environment
      - Changes in working directory remain in effect
    """
    temp_stdout = tempfile.SpooledTemporaryFile()
    temp_stderr = tempfile.SpooledTemporaryFile()
    # in broader use, this string insertion into the shell command should be
    # given more security consideration
    subprocess.call("""trap 'printf "\\0`pwd`\\0" 1>&2; env -0 1>&2' exit; %s""" % (command_str,),
                    stdout=temp_stdout, stderr=temp_stderr, shell=True)
    temp_stdout.seek(0)
    temp_stderr.seek(0)
    all_err_output = temp_stderr.read()
    allByteStrings = all_err_output.split(b'\x00')
    command_error_output = allByteStrings[0]
    # some risk in assuming index 1: what if the command sent a null char to the output?
    new_working_dir_str = allByteStrings[1].decode('utf-8')
    variables_to_ignore = ['SHLVL', 'COLUMNS', 'LINES', 'OPENSSL_NO_DEFAULT_ZLIB', '_']
    newdict = dict([tuple(bs.decode('utf-8').split('=', 1)) for bs in allByteStrings[2:-1]])
    for (varname, varvalue) in newdict.items():
        if varname not in variables_to_ignore:
            if varname not in os.environ:
                #print("New Variable: %s=%s"%(varname,varvalue))
                os.environ[varname] = varvalue
            elif os.environ[varname] != varvalue:
                #print("Updated Variable: %s=%s"%(varname,varvalue))
                os.environ[varname] = varvalue
    deletedVars = []
    for oldvarname in os.environ.keys():
        if oldvarname not in newdict.keys():
            deletedVars.append(oldvarname)
    for oldvarname in deletedVars:
        #print("Deleted environment Variable: %s"%(oldvarname,))
        del os.environ[oldvarname]
    if os.getcwd() != os.path.normpath(new_working_dir_str):
        #print("Working directory changed to %s"%(os.path.normpath(new_working_dir_str),))
        os.chdir(new_working_dir_str)
    # Display output of user's command_str. Standard output and error streams are not interleaved.
    print(temp_stdout.read().decode('utf-8'))
    print(command_error_output.decode('utf-8'))
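A quick usage check of ShellEval, assuming a POSIX shell: both the exported variable and the directory change are visible in the interpreter after the call returns.
ShellEval('export FOO=bar; cd /tmp')
print(os.environ.get('FOO'))  # 'bar'
print(os.getcwd())            # '/tmp'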
