Get environment variables set by a shell script from Python - python

I have a Python script that runs a set_env.sh file.
Afterwards, I want to read the environment variables that set_env.sh sets.
Is that possible?
set_env.sh
var1="value1"
var2="value2"
var3="value3"
get_env.py
import os
os.system("set_env.sh")
print os.environ['var1']
Result: KeyError: 'var1'
I know that I can set a variable with os.environ['var1'] = 'value' inside the Python file.
But this information is only available from the shell file.
Can you help me?

My temp solution was
(?P<variable>\w+)\s*=\s*["']?(?P<value>\w+)["']?
import os
import re
environment_regex = r"(?P<variable>\w+)\s*=\s*[\"']?(?P<value>\w+)[\"']?"
# re.finditer picks up every assignment in the file, not just the first one
for match_object in re.finditer(environment_regex, open("set_env.sh", "r").read()):
    os.environ[match_object.group("variable")] = match_object.group("value")

In order to set an environment variable from python use:
os.environ["var1"] = "value1"
Note that Python only modifies its own copy of the environment, which means var1 will be present only while you are inside the Python program (and in any child processes it starts). If you type env in your shell after the program has finished, there will be no trace of var1.
For more info: https://www.inkling.com/read/programming-python-mark-lutz-4th/chapter-3/shell-environment-variables
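If the values really only live in the shell file, another option (a rough sketch, not from the original answer; it assumes bash is available and that set_env.sh contains only plain KEY=value assignments) is to source the script in a subshell, dump its environment, and read it back into Python:
import os
import subprocess

# 'set -a' exports every assignment made while sourcing, so plain var=value
# lines show up in the output of `env` as well.
output = subprocess.check_output(
    ['bash', '-c', 'set -a && source ./set_env.sh && env'],
    universal_newlines=True,
)
for line in output.splitlines():
    if '=' not in line:
        continue  # part of a multi-line value; ignored in this sketch
    key, _, value = line.partition('=')
    os.environ[key] = value

print(os.environ['var1'])  # 'value1'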

Related

How to make environment variable in Python

I need help making a variable an environment variable in Python, so that I can see it with the 'export' command in Linux. I tested the short script below, and I can see the variable using the export command. But the problem is that the following two lines didn't work:
var1 = os.environ['LINE']
print(var1)
Can you guide me on how to solve this?
import os
import json
import sys

Name = "a1"

def func():
    var = 'My name is ' + '' + Name
    os.putenv('LINE', var)
    os.system('bash')

func()
var1 = os.environ['LINE']
print(var1)
Output:
export | grep LINE
declare -x LINE="My name is a1"
Try with
os.environ['LINE'] = var
instead of using putenv. Using putenv "bypasses" os.environ, that is, it doesn't update os.environ.
In fact, from the documentation for os.putenv:
Assignments to items in os.environ are automatically translated into corresponding calls to putenv(); however, calls to putenv() don’t update os.environ, so it is actually preferable to assign to items of os.environ. This also applies to getenv() and getenvb(), which respectively use os.environ and os.environb in their implementations.
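For completeness, here is the question's script with just that one change applied (a sketch; the only difference is the os.environ assignment):
import os

Name = "a1"

def func():
    var = 'My name is ' + Name
    # Assigning through os.environ also calls putenv() under the hood,
    # so both this process and the spawned bash can see LINE.
    os.environ['LINE'] = var
    os.system('bash')

func()
var1 = os.environ['LINE']
print(var1)  # 'My name is a1'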

set environment variables by file using python

I have a file that contains a set of environment variables.
env_script.env:
export a=hjk
export b=jkjk
export c=kjjhh
export i=jkkl
..........
I want to set these environment variables by reading them from the file.
How can I do this in Python?
Tried sample code:
pipe = subprocess.Popen([".%s;env", "/home/user/env_script.env"], stdout=subprocess.PIPE, shell=True)
output = pipe.communicate()[0]
env = dict((line.split("=", 1) for line in output.splitlines()))
os.environ.update(env)
Please give me some suggestions.
There's a great python library python-dotenv that allows you to have your variables exported to your environment from a .env file, or any file you want, which you can keep out of source control (i.e. add to .gitignore):
# to install
pip install -U python-dotenv
# your .env file
export MY_VAR_A=super-secret-value
export MY_VAR_B=other-very-secret-value
...
And you just load it in Python when you start, like:
# settings.py
from dotenv import load_dotenv
load_dotenv()
Then, you can access any variable later in your code:
from os import environ
my_var_a = environ.get('MY_VAR_A')
print(my_var_a)  # 'super-secret-value'
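If you would rather not modify os.environ at all, python-dotenv also provides dotenv_values, which simply returns the parsed file as a dict (a small sketch; the file name is just an example):
from dotenv import dotenv_values

# Parse the file without touching os.environ
config = dotenv_values('.env')
print(config.get('MY_VAR_A'))  # 'super-secret-value'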
You don't need to use subprocess.
Read lines and split environment variable name, value and assign it to os.environ:
import os

with open('/home/user/env_script.env') as f:
    for line in f:
        if 'export' not in line:
            continue
        if line.startswith('#'):
            continue
        # Remove leading `export `,
        # then split the name / value pair
        key, value = line.replace('export ', '', 1).strip().split('=', 1)
        os.environ[key] = value
or using dict.update and generator expression:
with open('env_script.env') as f:
    os.environ.update(
        line.replace('export ', '', 1).strip().split('=', 1) for line in f
        if 'export' in line
    )
Alternatively, you can make a wrapper shell script, which sources the env_script.env, then execute the original python file.
#!/bin/bash
source /home/user/env_script.env
python /path/to/original_script.py
Modern operating systems do not allow a child process to change the environment of its parent. The environment can only be changed for the current process and its descendants. And a Python interpreter is a child of the calling shell.
That's the reason why source is not an external command but is interpreted directly by the shell to allow a change in its environment.
It used to be possible in the good old MS/DOS system with the .COM executable format. A .com executable file had a preamble of 256 (0x100) bytes among which was a pointer to the COMMAND.COM's environment string! So with low level memory functions, and after ensuring not overwriting anything past the environment, a command could change directly its parent environment.
It may still be possible on modern operating systems, but it requires cooperation from the system. For example, Windows can allow a process read/write access to the memory of another process, provided the appropriate permissions are set. But this is a really hacky approach, and I would not dare to do it in Python.
TL/DR: if your requirement is to change the environment of the calling shell from a Python script, you have misunderstood your requirement.
But what is easy is to start a new shell with a modified environment:
import os
import subprocess
env = os.environ.copy() # get a copy of current environment
# modify the copy of environment at will using for example falsetru's answer
# here is just an example
env['AAA'] = 'BBB'
# and open a subshell with the modified environment
p = subprocess.Popen("/bin/sh", env = env)
p.wait()

Load environment variables from a shell script

I have a file with some environment variables that I want to use in a python script
The following works from the command line
$ source myFile.sh
$ python ./myScript.py
and from inside the python script I can access the variables like
import os
os.getenv('myvariable')
How can I source the shell script, then access the variables, from within the python script?
If you mean propagating the environment backwards, to the calling shell, sorry, you can't: a child process cannot modify its parent's environment. However, sourcing the environment from within Python is definitely valid, although it is more or less a manual process.
import subprocess as sp

SOURCE = 'your_file_path'
# universal_newlines=True makes proc.stdout yield text lines (Python 3)
proc = sp.Popen(['bash', '-c', 'source {} && env'.format(SOURCE)],
                stdout=sp.PIPE, universal_newlines=True)
source_env = {tup[0].strip(): tup[1].strip()
              for tup in (line.strip().split('=', 1) for line in proc.stdout)}
Then you have everything you need in source_env.
If you need to write it back to your local environment (which is not recommended, since source_env keeps you clean):
import os

for k, v in source_env.items():
    os.environ[k] = v
One more small thing to pay attention to: since bash is invoked here, bash's rules apply. So if you want your variables to be seen by env, you need to export them:
export VAR1='see me'
VAR2='but not me'
You cannot load environment variables from a bash or shell script in general; it is a different language. You will have to use bash to evaluate the file, print out the resulting variables, and then read them. See Forcing bash to expand variables in a string loaded from a file.
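For simple files that contain nothing but static assignments, a pure-Python alternative (a sketch using the standard shlex module, not part of the original answers) avoids spawning a shell at all:
import os
import shlex

# shlex understands shell quoting, so NAME="some value" parses correctly;
# anything dynamic ($VARS, command substitution) still needs a real shell.
with open('myFile.sh') as f:
    for token in shlex.split(f.read(), comments=True):
        if token == 'export':
            continue
        key, sep, value = token.partition('=')
        if sep:
            os.environ[key] = value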

how to set environmental variables permanently in posix(unix/linux) machine using python script

I am trying to set an environment variable permanently, but it only works temporarily.
If I run the program below, I get the variable's value; but after closing it and opening a new terminal, printenv LD_LIBRARY_PATH prints nothing.
#!/usr/bin/python
import os
import subprocess

def setenv_var():
    env_var = "LD_LIBRARY_PATH"
    env_path = "/usr/local/lib"
    os.environ[env_var] = env_path
    process = subprocess.Popen('printenv ' + env_var, stdout=subprocess.PIPE, shell=True)
    result = process.communicate()[0]
    return result

if __name__ == '__main__':
    print setenv_var()
please help me.
Here is what I use to set environment variables:
import os

def setenv_var(env_file, set_this_env=True):
    env_var = "LD_LIBRARY_PATH"
    env_path = "/usr/local/lib"
    # set environments opened later by appending to the `source`-d file
    with open(env_file, 'a') as f:
        f.write(os.linesep + ("%s=%s" % (env_var, env_path)))
    if set_this_env:
        # also set it for this process
        os.environ[env_var] = env_path
Now you only have to choose where to set it; that is the first argument to the function. I recommend a profile-specific file such as ~/.profile or, if you are using bash (which is pretty common), ~/.bashrc.
You can also set it globally by using a file like /etc/environment, but you will need the appropriate permissions when you run this script (sudo python script.py).
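For example, to append to your own bash profile and also update the current process (a usage sketch for the function above; the path is just an example):
import os

setenv_var(os.path.expanduser('~/.bashrc'), set_this_env=True)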
Remember that environments are inherited from the parent process, and you can't have a child set up a parent process' environment.
When you set an environment variable, it only affects the currently running process (and, by extension, any children that are forked after the variable is set). If you are attempting to set an environment variable in your shell and you want that environment variable to always be set for your interactive shells, you need to set it in the startup scripts (eg .login, .bashrc, .profile) for your shell. Commands that you run are (initially) children of the shell from which you run them, so although they inherit the environment of the shell and can change their own environment, they cannot change the environment of your shell.
Whether you do an export from bash or set os.environ from Python, the change only lasts for the session or the process's lifetime. If you want to set variables permanently, you will have to add them to the respective shell's profile file.
For ex. If you are on bash, you could do:
import os

with open(os.path.expanduser("~/.bashrc"), "a") as outfile:  # 'a' stands for "append"
    outfile.write("\nexport LD_LIBRARY_PATH=/usr/local/lib\n")
Check this out for some insight as to which file to add this depending on the target shell. https://unix.stackexchange.com/questions/117467/how-to-permanently-set-environmental-variables

Linux shell source command equivalent in python [duplicate]

This question already has answers here:
Emulating Bash 'source' in Python
(6 answers)
Closed 5 years ago.
I am converting a shell script to Python and trying to find the best way to perform the following. I need this because the file contains environment variables I need to read.
if [ -e "/etc/rc.platform" ];
then
. "/etc/rc.platform"
fi
I have the 'if' converted but not sure how to handle the . "/etc/rc.platform" as source is a shell command. So far I have the following
if os.path.isfile("/etc/rc.platform"):
    print "exists"  # just to verify the if is working
    <what goes here to replace "source /etc/rc.platform"?>
I've looked at subprocess and execfile without success.
The python script will need to access the environment variables set by rc.platform
A somewhat hackish solution is to parse the env output:
import os

newenv = {}
for line in os.popen('. /etc/rc.platform >&/dev/null; env'):
    try:
        k, v = line.strip().split('=', 1)
    except:
        continue  # bad line format, skip it
    newenv[k] = v

os.environ.update(newenv)
Edit: fixed split argument, thanks to #l4mpi
(Here's a demonstration of the solution crayzeewulf described in his comment.)
If /etc/rc.platform only contains environment variables, you can read them and set them as env vars for your Python process.
Given this file:
$ cat /etc/rc.platform
FOO=bar
BAZ=123
Read and set environment variables:
>>> import os
>>> with open('/etc/rc.platform') as f:
... for line in f:
... k, v = line.split('=')
... os.environ[k] = v.strip()
...
>>> os.environ['FOO']
'bar'
>>> os.environ['BAZ']
'123'
Too much work for the return. Going to keep a small shell script to get all the env vars that we need and forget reading them into python.
Try this:
if os.path.exists("/etc/rc.platform"):
    os.system("/etc/rc.platform")
Since source is a shell builtin, you need to set shell=True when you invoke subprocess.call
>>> import os
>>> import subprocess
>>> if os.path.isfile("/etc/rc.platform"):
... subprocess.call("source /etc/rc.platform", shell=True)
I'm not sure what you're trying to do here, but I still wanted to mention this: /etc/rc.platform might export some shell functions to be used by other scripts in rc.d. Since these are shell functions, they are exported only to the shell instance invoked by subprocess.call(), and if you invoke another subprocess.call(), those functions will not be available, because you are spawning a fresh new shell for the new script.
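If what you actually need are the variables (not the functions) inside Python, a slightly more robust variant of the env-parsing approach shown earlier uses GNU env's -0 option, so values that contain newlines do not break the parsing (a sketch, assuming bash and GNU coreutils):
import os
import subprocess

out = subprocess.check_output(
    ['bash', '-c', 'source /etc/rc.platform && env -0'],
)
# Entries are NUL-separated KEY=VALUE pairs; VALUE may contain newlines.
for entry in out.decode().split('\0'):
    key, sep, value = entry.partition('=')
    if sep:
        os.environ[key] = value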
