Passing modified environment using subprocess - python

I have a script that adds variables to the environment. That script is called through
subprocess.call('. myscript.sh', shell=True)
Is there a way I can get the modified environment and use it in my next subprocess call? This question shows you can get the output of one call and chain it into another: Python subprocess: chaining commands with subprocess.run. Is there something similar for passing the environment?

You'll have to output the variables' contents somehow. You're spawning a new process, and a child process never propagates its environment variables back to the parent, so your Python app will not see those values.
You could make the script echo those variables to some file, or to standard output if possible.
(Technically, it would be possible to stop the process and extract the values if you really wanted to hack that, but it's a bad idea.)
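For example, a minimal sketch of the standard-output approach: source the script in one shell, dump the resulting environment with env, parse it, and pass it to the next call (this assumes a POSIX sh and that none of the values contain newlines; MY_VAR is a hypothetical variable set by myscript.sh):

import subprocess

# Source the script, then dump the resulting environment to stdout.
out = subprocess.check_output(". ./myscript.sh && env", shell=True, text=True)

# Parse "NAME=value" lines into a dict (breaks if a value contains a newline).
env = dict(line.split("=", 1) for line in out.splitlines() if "=" in line)

# Reuse the captured environment for the next call.
subprocess.call("echo $MY_VAR", shell=True, env=env)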

Related

Python - run another Python script with the current environment, passing the arguments over and capturing the printed output

A little bit of an ugly question, but I didn't find any existing SO posts that cover it.
Right now I need to use an existing Python tool available on this GitHub repo.
This is a rather big piece of code with a lot of dependencies which I don't want to mess with. In a nutshell, one can run its module by passing command line arguments, for example:
timesearch.py timesearch -r "subreddit1" -l "1466812800" -up "1498348800"
Now, I need to run this tool a bunch of times using a for loop, passing different argument values each time. The tool also prints some output to the command line when you run it, and I would like to intercept it and print it from my Python script as well. Finally, I need to ensure that the current execution of the timesearch tool has completed before I move on in my loop and run it again.
One side note: I do need to ensure that timesearch is executed using the same environment in which I run my main script with the for loop.
I am trying to understand the best way to do this.
If I just go for this it doesn't work:
import os
#for loop will go here
os.system('python timesearch.py timesearch -r "ethereum" -l "1466812800" -up "1498348800"')
It fails for several reasons: it doesn't use the environment in which I am running my looping script, and it also doesn't capture the printed output of timesearch.
Any advice on how to achieve it?
Just to highlight: I can't simply pull out the function I need from timesearch, since it calls __init__ to set some things up based on the arguments you pass.
I wouldn't call a Python script with os.system. There is basically one function you need to use: main(sys.argv[1:])
https://github.com/voussoir/timesearch/blob/master/timesearch/__init__.py#L435.
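Something along these lines should work (a sketch; it assumes the timesearch package is importable from your script's directory and that main(argv) has the signature shown at the link above):

import timesearch

for sub in ["ethereum", "bitcoin"]:  # hypothetical loop values
    argv = ["timesearch", "-r", sub, "-l", "1466812800", "-up", "1498348800"]
    # Runs in the current process: it inherits your environment, its print
    # output appears directly in your console, and the call blocks until
    # that run is finished.
    timesearch.main(argv)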

Retrieve environment variables from popen

I am working on converting an older system of batch files to Python. I have encountered a batch file that sets environment variables for the batch file that called it. Those values are then used again in future calls to batch files. My environment variables seem to flow down the chain of calls, but I am unable to bring them back up. I have read two different threads on here that talk about something similar, but it's not exactly the same. Is it possible to retrieve these environment variables from the subprocess?
Python subprocess/Popen with a modified environment
Set Environmental Variables in Python with Popen
What I am doing now:
p = Popen(process_, cwd=wd, stdout=PIPE, shell=True,
          universal_newlines=True, stderr=STDOUT, env=env)
The old flow of bat files:
foo.bat calls foo-setparam.bat
foo-setparam.bat sets some variables
foo.bat uses these variables
foo.bat calls bar.bat
bar.bat uses variables set by both foo.bat and foo-setparam.bat
Update:
I have tried adding "call " in front of my Popen parameter. I got the same behavior.
It's generally not possible for a child process to set parent environment variables, unless you want to go down a mildly evil path of creating a remote thread in the parent process and using that thread to set variables on your behalf.
It's much easier to have the Python script write a temp file that contains a sequence of set commands like:
set VAR1=newvalue1
set VAR2=newvalue2
Then modify the calling batch file to invoke your Python script like:
python MyScript.py
call MyScript_Env.cmd
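On the Python side, that could look something like this (MyScript_Env.cmd and the variable names are placeholders):

# MyScript.py: compute the values, then write a batch file the calling
# batch file can "call" to pull the variables into its own environment.
new_vars = {"VAR1": "newvalue1", "VAR2": "newvalue2"}

with open("MyScript_Env.cmd", "w") as f:
    for name, value in new_vars.items():
        f.write("set {}={}\n".format(name, value))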

Determine how environment variable was set

In Python I can access an environment variable as:
os.environ['FOO']
I would like to know whether the variable was set previously via export or whether it was set only for the current Python script, like so:
FOO=BAR python some-script.py
Basically, I want to use FOO only if it was set as in the line above and not permanently defined via export.
Arguments to the python script itself unfortunately are no option here. This is a plugin and the parent application does not allow passing custom arguments it does not understand itself.
I was hoping I could somehow access the exact and full command (FOO=BAR python some-script.py) that started Python, but it appears there is nothing like that. I guess if there were a feature like this, it would be somewhere in the os or sys modules.
The environment is simply an array of C strings; there is no metainformation there to tell you whether the invoking shell had the variable marked for export.
On Linux, you could examine /proc/(pid)/environ of the parent PID (if you have suitable permissions) to see what's in the parent's permanent environment, but this is decidedly nonportable and brittle.
Spending time on this seems misdirected anyway; let the user pass the environment variable in any way they see fit.
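For completeness, a rough Linux-only sketch of that /proc inspection (brittle: /proc/(pid)/environ reflects the parent's environment as of its exec, so variables exported later in the shell session won't appear there):

import os

def parent_environ():
    # The parent's environment is a NUL-separated list of NAME=value pairs.
    with open("/proc/%d/environ" % os.getppid(), "rb") as f:
        raw = f.read()
    return dict(entry.split(b"=", 1) for entry in raw.split(b"\0") if b"=" in entry)

foo = os.environ.get("FOO")
# If FOO is in our environment but not in the parent shell's, it was most
# likely supplied per-invocation, as in: FOO=BAR python some-script.py
if foo is not None and b"FOO" not in parent_environ():
    print("FOO was set for this invocation:", foo)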

How to return a value from Python script as a Bash variable?

This is a summary of my code:
# import whatever
def createFolder():
    # someCode
    var1 = Gdrive.createFolder(name)
    return var1

def main():
    # someCode
    var2 = createFolder()
    return var2

if __name__ == "__main__":
    print(main())
One way I managed to return a value to a bash variable was by printing what was returned from main(). Another way is just to print the variable anywhere in the script.
Is there any way to return it in a more Pythonic way?
The script is called this way:
folder=$(python create_folder.py "string_as_arg")
A more pythonic way would be to avoid bash and write the whole lot in python.
You can't expect bash to have a Pythonic way of getting values from another process; its way is the bash way.
bash and Python run in different processes, and inter-process communication (IPC) must go via the kernel. There are many IPC mechanisms, but bash does not support them all (shared memory, for example). The lowest common denominator here is bash, so you must use what bash supports, not what Python has (Python has everything).
Without shared memory, it is not a simple thing to write to variables of another process - let alone another language. Debuggers do it, but they are written specifically for the host language.
The mechanism you use from bash is to capture the stdout of the child process, so Python must print. Under the covers this uses an anonymous pipe. You could use a named pipe (also known as a fifo) instead, which Python would open as a normal file and write to. But it wouldn't buy you much.
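As a purely illustrative sketch of that fifo variant (/tmp/result is an assumed path that bash would create first):

# Python side: open the fifo like a normal file and write the result.
# Assumed bash side:  mkfifo /tmp/result
#                     python create_folder.py "string_as_arg" &
#                     read folder < /tmp/result
with open("/tmp/result", "w") as fifo:  # blocks until bash opens the read end
    fifo.write("my-folder-id\n")        # placeholder for the real value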
If you were working in bash then you could simply do:
export var="value"
However, there is no such equivalent in Python. If you set values in os.environ, they persist only for the rest of the current process and modify nothing once the program finishes. Your best bet is to do exactly what you are already doing.
You can try to set an environment variable from within the Python code and read it outside, in the bash script. This approach looks elegant to me, but it is definitely not the "perfect solution" or the only solution. If you like this approach, this thread might be useful: How to set environment variables in Python
There are other ways, very similar to what you have done. Check also this thread: store return value of a Python script in a bash script
Just use sys.exit(), i.e.:
import sys
[...]
if __name__ == "__main__":
    sys.exit(main())

Running scripts whose purpose is to change shell variables with subprocess.popen

I have a series of scripts whose calls I am automating in Python with subprocess.Popen. Basically, I call script A, then script B, then script C, and so forth.
Script A sets a bunch of local shell variables, with commands such as set SOME_VARIABLE=SCRIPT_A, set PATH=%SCRIPT_A:/=\;%PATH%.
Scripts B and C then need to see the effects of this. In Unix, you would call script A with "source script_a.sh", and the effect lasts in the current command window. However, subprocess.Popen effectively launches a new window (kind of).
Apparently subprocess.Popen is not the command I want for this. How would I do it?
Edit: I have tried parsing the file (which is all 'set' statements) and passing the results as a dictionary to env in subprocess.Popen, but it doesn't seem to have worked.
You can use the env argument to Popen with a Python dictionary containing the variables; then you don't need to run the script that merely sets them.
If 'Script A' is generated by another process, you will either need to change that process so the output file is in a format you can import into your Python script, or write a parser in Python that digests the variables out of 'Script A' and sets them within your Python script.
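A sketch of that parsing approach (it assumes Script A contains only plain "set NAME=value" lines, and the %VAR% expansion here is a naive approximation of what cmd.exe does):

import os
import subprocess

def env_from_script(path):
    # Build an environment dict from a file of "set NAME=value" lines.
    env = os.environ.copy()
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line.lower().startswith("set ") and "=" in line:
                name, _, value = line[4:].partition("=")
                # Naively expand %VAR% references such as %PATH%.
                for key, val in list(env.items()):
                    value = value.replace("%{}%".format(key), val)
                env[name] = value
    return env

env = env_from_script("script_a.bat")
subprocess.Popen("script_b.bat", shell=True, env=env).wait()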
If all you want is to call a series of batch files from Python, you could create a helper batch file which calls them in sequence:
call scriptA
call scriptB
call scriptC
and run this helper batch file using subprocess.Popen
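For example (helper.bat being whatever you name the file above):

import subprocess

# One Popen call runs the whole sequence; the "call" lines execute in the
# same cmd.exe instance, so variables set by scriptA are visible to B and C.
subprocess.Popen("helper.bat", shell=True).wait()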
