Python subprocess: Get environment after subprocess completed

We have a tool that comes with a shell script which sets up the environment variables necessary for running the tool. It's a fairly convoluted chain of different scripts that determine a bunch of stuff and export/set the env.
We then need that environment every time we want to call the tool itself.
Ideally we would be able to do something like this:
completed_script = subprocess.run("the_settings_script.bat")
[...]
subprocess.run(["some", "other", "call"], env=completed_script.env)
That obviously doesn't work. Is there another nice way to get the environment back after running a subprocess? We could of course run the script in every subprocess.run() call before the actual tool call, but that is rather inefficient.

No, there is no portable way. On any modern OS the parent environment is passed to child processes, but there is no way for a child to change its parent's environment. It used to be possible in good old MS-DOS, and only with .com-type programs, because the address of the parent environment was stored at a well-known address in the child process, but I know of no such tricks for Windows or any Unix-like system.
Here the best way is to set up the environment before starting the Python interpreter. That way the changed environment will be passed to all the subprocesses.
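That said, a common (non-portable) workaround for the Windows case in the question is to run the batch file and cmd's built-in set command in the same shell, then parse the dumped environment back into a dict. A minimal sketch, assuming cmd.exe and that the script's own output contains no = signs:

import subprocess

def capture_env(script="the_settings_script.bat"):
    # Run the settings script, then dump the resulting environment with `set`.
    output = subprocess.check_output(
        'cmd /c "{} && set"'.format(script), text=True)
    env = {}
    for line in output.splitlines():
        name, sep, value = line.partition("=")
        if sep:
            env[name] = value
    return env

env = capture_env()
subprocess.run(["some", "other", "call"], env=env)

You pay the cost of running the script once; the captured dict can then be reused for every subsequent tool call.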

Related

How do I add a directory to the system environment variables using a Python script? [duplicate]

From what I've read, any changes to the environment variables in a Python instance are only available within that instance, and disappear once the instance is closed. Is there any way to make them stick by committing them to the system?
The reason I need to do this is because at the studio where I work, tools like Maya rely heavily on environment variables to configure paths across multiple platforms.
My test code is
import os
os.environ['FAKE'] = 'C:\\'
Opening another instance of Python and requesting os.environ['FAKE'] yields a KeyError.
NOTE: Portability will be an issue, but the small API I'm writing will be able to check OS version and trigger different commands if necessary.
That said, I've gone the route of using the Windows registry technique and will simply write alternative methods that will call shell scripts on other platforms as they become requirements.
You can use SETX at the command line.
By default these actions apply to the USER environment variables.
To set and modify SYSTEM variables, use the /M flag:
import os

env_var = "BUILD_NUMBER"
env_val = "3.1.3.3.7"
# /M targets the SYSTEM variables and requires an elevated (admin) prompt
os.system("SETX {0} {1} /M".format(env_var, env_val))
"make them stick by committing them to the system?"
I think you are a bit confused here. There is no 'system' environment. Each process has its own environment as part of its memory. A process can only change its own environment, and it can set the initial environment for processes it creates.
If you really do think you need to set environment variables for the system, you will need to look at changing them in the location they get initially loaded from, like the registry on Windows or your shell configuration file on Linux.
Under Windows it's possible for you to make changes to environment variables persistent via the registry with this recipe, though it seems like overkill.
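The linked recipe is not reproduced here, but the core of the registry approach looks roughly like this. A sketch, assuming the per-user Environment key (the constants are standard Windows ones):

import ctypes
import winreg

# Persist the variable in the per-user Environment registry key.
with winreg.OpenKey(winreg.HKEY_CURRENT_USER, "Environment", 0,
                    winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "FAKE", 0, winreg.REG_SZ, "C:\\")

# Broadcast WM_SETTINGCHANGE so running programs (e.g. Explorer) re-read
# the environment; processes started afterwards will see the variable.
HWND_BROADCAST = 0xFFFF
WM_SETTINGCHANGE = 0x001A
ctypes.windll.user32.SendMessageW(HWND_BROADCAST, WM_SETTINGCHANGE,
                                  0, "Environment")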
To echo Brian's question, what are you trying to accomplish? There is probably an easier way.
It seems there is a simpler solution for Windows:
import subprocess
subprocess.call(['setx', 'Hello', 'World!'])  # shell=True is unnecessary with an argument list
I don't believe you can do this; there are two work-arounds I can think of.
The os.putenv function sets the environment for processes you start from this one, e.g. via os.system, popen, etc. Depending on what you're trying to do, perhaps you could have one master Python instance that sets the variable and then spawns new instances.
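A minimal sketch of that master-instance idea (it assumes python is on your PATH):

import os
import subprocess

# Changes to os.environ affect this process and are inherited by
# any children spawned afterwards.
os.environ["FAKE"] = "C:\\"

# The child sees FAKE because subprocess passes os.environ along
# whenever env= is not given.
subprocess.run(["python", "-c", "import os; print(os.environ['FAKE'])"])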
You could run a shell script or batch file to set it for you, but that becomes much less portable. See this article:
http://code.activestate.com/recipes/159462/
Think about it this way.
You're not setting shell environment variables.
You're spawning a subshell with some given environment variable settings; this subshell runs your application with the modified environment.
According to this discussion, you cannot do it. What are you trying to accomplish?
You are forking a new process and cannot change the environment of the parent process, just as you cannot do so when you start a new shell process from the shell.
You might want to try the Python Win32 Extensions, developed by Mark Hammond, which are included in ActivePython (or can be installed separately). You can learn how to perform many Windows-related tasks in Hammond's and Robinson's book.
Using PyWin32 to access Windows COM objects, a Python program can use the Environment property (a collection of environment variables) of the WScript.Shell object.
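A minimal read-side sketch of that collection, assuming pywin32 is installed:

import win32com.client

shell = win32com.client.Dispatch("WScript.Shell")

# Environment is a collection of variables; "SYSTEM", "USER",
# "PROCESS" and "VOLATILE" are the available groups.
system_env = shell.Environment("SYSTEM")
print(system_env("PATH"))

# ExpandEnvironmentStrings resolves %VAR% references.
print(shell.ExpandEnvironmentStrings("%TEMP%"))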
Try py-setenv, which will allow you to set variables via the registry:
python -m pip install py-setenv
From within Python? No, it can't be done!
If you are not bound to Python, you should consider using shell scripts (sh, bash, etc.). The "source" command allows you to run a script that modifies the environment, and the changes will "stick" like you want in the shell you "sourced" the script in. What's going on here is that the shell executes the script directly rather than creating a sub-process to execute it.
This will be quite portable - you can use Cygwin on Windows to do this.
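For example, a sketch (set_vars.sh is a hypothetical script):

$ cat set_vars.sh
export ENV_VAR=/some/path
$ source set_vars.sh    # executed in the current shell, no subprocess
$ echo $ENV_VAR
/some/path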
In case someone might need this info: I realize this was asked 7 years ago, but even I forget how sometimes.
Yes, there is a way to make them "stick" in Windows. Go to Control Panel, System, Advanced system settings; when the System Properties window opens you should see a button for Environment Variables. The exact path to this is a little different depending on which version of Windows you're using (google it).
Click that button and the Environment Variables window will open. It has two panes; the top one should be "User variables for yourusername". Choose "New", then simply set the variable. For instance, one of mine is "Database_Password = mypassword".
Then in your app you can access it with import os and os.environ.get('Database_Password'). Since pass is a reserved word in Python, assign it to something like password = os.environ.get('Database_Password').

Set shell environment variable via Python script

I have an instrument that requires an environment variable, which I want to set automatically from Python code. I tried several ways to make it happen, but none of them were successful.
Here are some examples:
I inserted the following code in my Python script:
import os
os.system("export ENV_VAR=/some_path")
I created a bash script (env.sh) and ran it from Python:
#!/bin/bash
export ENV_VAR=some_path
# call it from Python
os.system("source env.sh")
I also tried os.putenv() and os.environ["ENV_VAR"] = "some_path".
Is it possible to set (export) an environment variable using Python, i.e. without directly exporting it to the shell?
Setting an environment variable sets it only for the current process and any child processes it launches. So using os.system will set it only for the shell that is running to execute the command you provided. When that command finishes, the shell goes away, and so does the environment variable. Setting it using os.putenv or os.environ has a similar effect; the environment variables are set for the Python process and any children of it.
I assume you are trying to have those variables set for the shell that you launch the script from, or globally. That can't work because the shell (or other process) is not a child of the Python script in which you are setting the variable.
You'll have better luck setting the variables in a shell script. If you then source that script (so that it runs in the current instance of the shell, rather than in a subshell) then they will remain set after the script ends.
As long as you start the "instrument" (a script I suppose) from the very same process it should work:
In [1]: os.putenv("VARIABLE", "123")
In [2]: os.system("echo $VARIABLE")
123
You can't change an environment variable of a different process or a parent process.
A shell function may do this. You need to print your export statement and eval that.
set_shell_env() {
    output=$(python print_export_env.py "$@")
    eval "$output"
}
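The print_export_env.py side only has to write export statements to stdout. A minimal sketch (the values dict stands in for whatever your program actually computes):

# print_export_env.py
import shlex

values = {"ENV_VAR": "/some_path"}
for name, value in values.items():
    print("export {}={}".format(name, shlex.quote(value)))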
Depending on how you execute your instrument, you might be able to change the environment specifically for the child process without affecting the parent. See the documentation for os.spawn*e or subprocess.Popen, which accept a separate argument denoting the child environment. For example, the "Replacing the os.spawn family" section of the subprocess module documentation provides both usages:
Environment example:
os.spawnlpe(os.P_NOWAIT, "/bin/mycmd", "mycmd", "myarg", env)
==>
Popen(["/bin/mycmd", "myarg"], env={"PATH": "/usr/bin"})

How to get the environment of a child process in Python

I need to source the environment of a child process. I have a C-shell script (really complicated) that sets many environment variables, and I want to use them in the parent process. I am doing something like this:
subprocess.call('set_env_vars.csh; env > crazy_vars.log', shell=True)
In this way I am trying to get the environment of the child process, but this method is not working; I think the commands after the semicolon are treated as a separate process.
A possible solution is to create another C-shell script containing those two commands and then call that script from Python, but that's a dirty way.
Is there a way to make the two commands part of the same process?
Thanks
On my system (as with many others) the shell is bash, not csh, so explicitly invoking csh is a good idea. Also, you need to source, not execute, set_env_vars.csh:
subprocess.call(['/bin/csh', '-c', 'source set_env_vars.csh; env > crazy_vars.log'])
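Back in the parent you can then parse the dump into a dict. A sketch, assuming no multi-line values:

env = {}
with open("crazy_vars.log") as f:
    for line in f:
        name, sep, value = line.rstrip("\n").partition("=")
        if sep:
            env[name] = value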

Setting env variable from a Python script

In my build (I'm using Linux) I need to call a Python script and set some env variables. I need these variables to be set even after I exit the script. I am able to set it using os.environ within the script but whenever I exit the script and try to see if the env variable is set from the terminal (echo $myenv) - I get nothing.
I am new to Python and did quite a bit of googling to figure this out. However, I am not quite sure if it's possible. I tried using subprocess:
subprocess.call('setenv myenv 4s3', shell=True)
I also tried using os.system:
os.system("setenv myenv 4s3")
So far, I didn't succeed.
You cannot set environment variables from a child process and have them be visible in the parent process. Every process gets its own copy of the environment, and changes do not propagate upwards.
What you could do is have the Python script print the settings it wants to change and have the outside shell execute the appropriate commands.
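For example, a sketch (set_vars.py is hypothetical and prints lines such as export myenv=4s3):

$ eval "$(python set_vars.py)"
$ echo $myenv
4s3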
Maybe you can find some equivalent of C's vfork for Python.
When you vfork, both processes share the memory space, so you might be able to overwrite environment variables in the parent process from the child process.
Warning: vfork has many security issues and is therefore not recommended. Only use it if you are desperate.

How can I make a fake "active session" for gconf?

I've automated my Ubuntu installation - I've got Python code that runs automatically (after a clean install, but before the first user login - it's in a temporary /etc/init.d/ script) that sets up everything from Apache & its configuration to my personal Gnome preferences. It's the latter that's giving me trouble.
This worked fine in Ubuntu 8.04 (Hardy), but when I use this with 8.10 (Intrepid), the first time I try to access gconf, I get this exception:
Failed to contact configuration server; some possible causes are that you need to enable TCP/IP networking for ORBit, or you have stale NFS locks due to a system crash. See http://www.gnome.org/projects/gconf/ for information. (Details - 1: Not running within active session)
Yes, right, there's no Gnome session when this is running, because the user hasn't logged in yet - however, this worked before; this appears to be new with Intrepid's Gnome (2.24?).
Short of modifying the gconf's XML files directly, is there a way to make some sort of proxy Gnome session? Or, any other suggestions?
(More details: this is python code that runs as root, but setuid's & setgid's to be me before setting my preferences using the "gconf" module from the python-gconf package.)
I can reproduce this by installing GConf 2.24 on my machine. GConf 2.22 works fine, but 2.24 breaks it.
GConf is failing to launch because D-Bus is not running. Manually spawning D-Bus and the GConf daemon makes this work again.
I tried to spawn the D-Bus session bus by doing the following:
import dbus
dummy_bus = dbus.SessionBus()
...but got this:
dbus.exceptions.DBusException: org.freedesktop.DBus.Error.Spawn.ExecFailed: dbus-launch failed to autolaunch D-Bus session: Autolaunch error: X11 initialization failed.
Weird. Looks like it doesn't like to come up if X isn't running. To work around that, start dbus-launch manually (IIRC use the os.system() call):
$ dbus-launch
DBUS_SESSION_BUS_ADDRESS=unix:abstract=/tmp/dbus-eAmT3q94u0,guid=c250f62d3c4739dcc9a12d48490fc268
DBUS_SESSION_BUS_PID=15836
You'll need to parse the output somehow and inject the values into environment variables (you'll probably want to use os.putenv). For my testing, I just used the shell and set the environment variables manually with export DBUS_SESSION_BUS_ADDRESS=blahblah..., etc.
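In Python, that parsing step might look like this (a sketch; os.environ is used rather than os.putenv so the values are also visible within the script itself):

import os
import subprocess

output = subprocess.check_output(["dbus-launch"], universal_newlines=True)
for line in output.splitlines():
    name, sep, value = line.partition("=")
    if sep:
        os.environ[name] = value  # inherited by children spawned afterwards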
Next, you need to launch gconftool-2 --spawn with those environment variables you received from dbus-launch. This will launch the GConf daemon. If the D-Bus environment vars are not set, the daemon will not launch.
Then, run your GConf code. Provided you set the D-Bus session bus environment variables for your own script, you will now be able to communicate with the GConf daemon.
I know it's complicated.
gconftool-2 provides a --direct option that enables you to set GConf variables without needing to communicate with the server, but I haven't been able to find an equivalent option for the Python bindings (short of outputting XML manually).
Edit: For future reference, if anybody wants to run dbus-launch from within a normal bash script (as opposed to a Python script, as this thread is discussing), it is quite easy to retrieve the session bus address for use within the script:
#!/bin/bash
eval "$(dbus-launch --sh-syntax)"
export DBUS_SESSION_BUS_ADDRESS
export DBUS_SESSION_BUS_PID
do_other_stuff_here
Well, I think I understand the question. It looks like your script just needs to start the D-Bus daemon, or make sure it's started. I believe "session" here refers to a D-Bus session (here is some evidence), not a Gnome session. D-Bus and gconf both run fine without Gnome.
Either way, faking an "active session" sounds like a pretty bad idea. It would only look for it if it needed it.
Perhaps we could see the script in a pastebin? I should have really seen it before making any comment.
Thanks, Ali & Jeremy - both your answers were a big help. I'm still working on this (though I've stopped for the evening).
First, I took the hint from Ali and was trying part of Jeremy's suggestion: I was using dbus-launch to run "gconftool-2 --spawn". It didn't work for me; I now understand why (thx, Jeremy) -- I was trying to use gconf from within the same python program that was launching dbus & gconftool, but its environment didn't have the environment variables - duh.
I set that strategy aside when I noticed gconftool-2's --direct option; internally, gconftool-2 is using an API that isn't exposed by the gconf Python bindings. So, I modified python-gconf to expose the extra method, and once that builds (I had some unrelated problems getting this to work), we'll see if that fixes things - if it doesn't (and maybe if it does, because building those bindings seems to build all of Gnome!), I'll find a better way to manage the environment variables in that first strategy.
(I'll add another answer here tomorrow either way)
And it's the next day: I ran into a little trouble with my modified python-gconf, which inspired me to try Jeremy's simpler idea, which worked fine - before doing the first gconf operation, I simply ran "dbus-launch", parsed the resulting name-value pairs, and added them directly to python's environment. Having done that, I ran "gconftool-2 --spawn". Problem solved.
