There are quite a few questions on this topic, but unfortunately they didn't solve my problem.
I have a shell script whose only purpose is to set environment variables (let's call it env.sh). A second shell script is the main program; it sources env.sh and uses the variables defined there. This works fine when both are bash scripts.
Now I am trying to replace the main shell script with a Python program. This Python program also depends on the environment variables that are set by env.sh.
What can I do to source env.sh within Python before starting any routine that uses those environment variables?
I could run bash -c 'source env.sh' with the subprocess module. But if I understand correctly, this does not work because the variables are set in the child process of the calling Python program and are therefore not available in the parent process. Is this correct?
A similar solution would have been to run bash -c 'source env.sh && env', read the output of env in Python, iterate over the list, and write into os.environ. But this approach would write every variable again, even if it was already defined. What if there are a lot of variables?
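For the record, that source-and-dump variant might look like this in Python (a sketch; env.sh is assumed to be in the current directory, and values containing newlines would confuse the simple line-based parse):

import os
import subprocess

# Source env.sh in a child bash, then print the resulting environment
output = subprocess.check_output(['bash', '-c', 'source env.sh && env'])
for line in output.decode().splitlines():
    key, _, value = line.partition('=')
    if key:
        os.environ[key] = value

Note that re-assigning a variable to the value it already has is harmless, so writing every variable again costs little beyond the loop itself.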
The last solution I could think of was to read and parse the env.sh file myself and set the variables I find. One problem is that some exports in env.sh are nested, meaning:
export SOMETHING=$FOO/Bar
This could become complicated to parse correctly.
Is there another way that I haven't found yet or didn't think of?
Some options:
https://github.com/mattseymour/python-env
https://github.com/rconradharris/envparse
Env variable sourcing is very common; I would not recommend rolling your own.
Related
I'm new to Python and enjoying learning the language. I like using the interpreter in real time, but I still don't completely understand how it works. I would like to be able to define my environment with variables, imports, functions and all the rest, then run the interpreter with those already prepared. When I run my files (using PyCharm, Python 3.6), they just execute and exit.
Is there some line to put in my .py files, like a main function, that will invoke the interpreter? Is there a way to run my .py files from the interpreter so that I can continue to call functions and declare variables?
I understand this is a total newbie question, but please explain how to do this or why I'm completely not getting it.
I think you're asking three separate things, but we'll count it as one question since it's not obvious that they are different things:
1. Customize the interactive interpreter
I would like to be able to define my environment with variables, imports, functions and all the rest then run the interpreter with those already prepared.
To customize the environment of your interactive interpreter, define the environment variable PYTHONSTARTUP. How you do that depends on your OS. Set it to the (absolute) pathname of a file whose commands will be executed before you get your prompt. This answer (found by Tobias) shows you how. This is suitable if there is a fixed set of initializations you would always like to do.
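For example, with bash you could put this in your ~/.bashrc (the file name after the = is arbitrary):

export PYTHONSTARTUP=~/.pythonstartup

and collect your initializations in that file:

# ~/.pythonstartup -- executed before every interactive prompt
import os
import sys
from pprint import pprint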
2. Drop to the interactive prompt after running a script
When I run my files (using PyCharm, Python 3.6) they just execute and exit.
From the command line, you can execute a python script with python -i scriptname.py and you'll get an interactive prompt after the script is finished. Note that in this case, PYTHONSTARTUP is ignored: It is not a good idea for scripts to run in a customized environment without explicit action.
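A small illustration (the file and function names are made up). Given this file:

# greet.py
def greet(name):
    return 'Hello, ' + name

you can run it and keep interacting:

$ python -i greet.py
>>> greet('world')
'Hello, world'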
3. Call your scripts from the interpreter, or from another script.
Is there a way to run my .py files from the interpreter where I can continue to call functions and declare variables?
If you have a file myscript.py, you can type import myscript in the interactive Python prompt, or put the same in another script, and your script will be executed. Your environment will then have a new module, myscript. You could use the following variant to import your custom definitions on demand (assuming a file myconfig.py where Python can find it):
from myconfig import *
Again, this is not generally a good idea; your programs should explicitly declare all their dependencies by using specific imports at the top.
You can achieve the result you intend by doing this:
Write a Python file with all the imports you want.
Call your script as python -i myscript.py.
Calling with -i runs the script then drops you into the interpreter session with all of those imports, etc. already executed.
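myscript.py itself can be nothing more than the imports you always want at hand (these particular modules are just examples):

# myscript.py -- preload the modules you use interactively
import os
import sys
import math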
If you want to save yourself the effort of calling Python that way every time, add this to your .bashrc file:
alias python='python -i /Users/yourname/whatever/the/path/is/myscript.py'
Set the environment variable PYTHONSTARTUP, as suggested in this answer:
https://stackoverflow.com/a/11124610/1781434
I need to source the environment of a child process. I have a C-shell script (really complicated) that sets many environment variables, and I want to use them in the parent process. I am doing something like this:
subprocess.call('set_env_vars.csh; env > crazy_vars.log', shell=True)
In this way I am trying to get the environment of the child process, but this method is not working; I think the commands after the semicolon are treated as separate processes.
A possible solution would be to create another C-shell script containing those two commands and call that script from Python, but that's a dirty way.
Is there a way to make the two commands part of the same process?
Thanks
On my system (as on many others) the shell is bash, not csh, so explicitly invoking csh is a good idea. Also, you need to source set_env_vars.csh, not execute it:
subprocess.call(['/bin/csh', '-c', 'source set_env_vars.csh; env > crazy_vars.log'])
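Back in the parent Python process you can then load the dump into os.environ (a sketch; it assumes one KEY=value pair per line, so values containing newlines would confuse it):

import os

with open('crazy_vars.log') as f:
    for line in f:
        key, _, value = line.rstrip('\n').partition('=')
        if key:
            os.environ[key] = value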
I want to set an environment variable with a Python script, influencing the shell I am starting the script in. Here is what I mean:
python -c "import os;os.system('export TESTW=1')"
But the command
echo ${TESTW}
returns nothing. Also with the expression
python -c "import os;os.environ['TEST']='1'"
it does not work.
Is there another way to do this in the direct sense? Or is it better to write the variables in a file which I execute from 'outside' of the Python script?
You can modify the environment via putenv, but it will not influence the caller's environment, only the environment of forked children.
It's really much better to set up the environment before launching the Python script.
Let me propose a variant: create a bash script and a Python script. In the bash script, call the Python script with a parameter, one parameter per environment variable. E.g.:
#!/bin/bash
export TESTV1=$(python you_program.py testv1)
export TESTV2=$(python you_program.py testv2)
where you_program.py testv1 prints the value for just that one environment variable.
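you_program.py could then look roughly like this (a sketch; the values are placeholders for whatever you actually compute):

# you_program.py -- print the value of exactly one variable
import sys

values = {'testv1': 'value one', 'testv2': 'value two'}
print(values[sys.argv[1]])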
I would strongly suggest using the solution proposed by chepner and Maxym (where the Python script provides the values and your shell exports the variables). If that is not an option for you, you could still use eval to execute what the python script writes in your current Bash process:
eval $( python -c "print('export TESTW=1')" )
Caution: eval is usually read "evil" in Bash programming. As a general rule of thumb, one should avoid "blindly" executing code that is not fully under one's control. That includes being generated by another program at runtime as in this case. See also Stack Overflow question Why should eval be avoided in Bash, and what should I use instead?.
In my build (I'm using Linux) I need to call a Python script and set some env variables. I need these variables to be set even after I exit the script. I am able to set it using os.environ within the script but whenever I exit the script and try to see if the env variable is set from the terminal (echo $myenv) - I get nothing.
I am new to Python and did quite a bit of googling to figure this out. However, I am not quite sure if it's possible. I tried using subprocess:
subprocess.call('setenv myenv 4s3', shell=True)
Also tried using os.system:
os.system("setenv myenv 4s3")
So far, I didn't succeed.
You cannot set environment variables from a child process and have them be visible in the parent process. Every process gets its own copy of the environment, and changes do not propagate upwards.
What you could do is have the Python script print the settings it wants to change and have the outside shell execute the appropriate commands.
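For example (a sketch in bash syntax; print_env.py is a made-up name, the variable comes from the question):

# print_env.py -- print shell commands instead of modifying os.environ
print('export myenv=4s3')

and in the calling shell:

eval "$(python print_env.py)"
echo $myenv    # prints 4s3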
Maybe you can find some Python equivalent of C's vfork.
When you vfork, both processes share the memory space, so you might be able to overwrite environment variables in the parent process from the child process.
Warning: vfork has many security issues and is therefore not recommended. Use it only if you are desperate.
I am trying to set an environment variable using Python, and this variable is then used in another script.
My code is:
#!/usr/bin/env python
import os
os.environ['VAR'] = '/current_working_directory'
After executing the above Python script, I execute a second script which uses the same variable VAR, but it is not working.
But when I do export VAR='/current_working_directory' in the shell and run the second script, it works fine. I tried putenv() as well.
This depends on how the second Python script gets called.
If you have a shell, and the shell first runs the first Python script and then the second, it won't work. The reason is that the first Python script inherits its environment from the shell, and modifying os.environ[] or calling putenv() only modifies that inherited copy: the first Python script's own environment, not the shell's.
When the shell then runs the second Python script, it again inherits the environment from the shell ... and because the shell's environment is unmodified, the second script cannot see the modifications the first Python script made.
One way to achieve your goal is to use a helper file:
#!/bin/bash
rm -f envfile                   # start from a clean slate
./first_pythonscript            # expected to write envfile (see the sketch below)
test -f envfile && . envfile    # source the variables if the file appeared
rm -f envfile
./second_pythonscript           # runs with the modified environment
That code is crude: it won't work if two instances of the shell script run at the same time, so don't use it as-is. But I hope you get the idea.
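For completeness, first_pythonscript could write envfile in shell syntax like this (a sketch; VAR and its value are borrowed from the question):

#!/usr/bin/env python
# first_pythonscript -- write the variables in a form the shell can source
with open('envfile', 'w') as f:
    f.write("export VAR='/current_working_directory'\n")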
Another way is to make your second_pythonscript not a program but a Python module that first_pythonscript can import. You can also make it a hybrid: a library when imported, a program when run, via the if __name__ == "__main__": construct.
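A minimal sketch of that hybrid (assuming the file is renamed second_pythonscript.py so it is importable):

# second_pythonscript.py -- library when imported, program when run
import os

def main(var):
    print('VAR is', var)

if __name__ == "__main__":
    main(os.environ.get('VAR'))

first_pythonscript can then import second_pythonscript and call second_pythonscript.main(value) directly, passing the value along instead of going through the environment.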
And finally, you can use one of the os functions, e.g. os.spawnvpe.
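A sketch of that last option (os.P_WAIT makes the call block until the child exits; VAR and the script name are borrowed from this thread):

import os

# Build the child's environment explicitly, then run the second script with it
env = dict(os.environ, VAR='/current_working_directory')
os.spawnvpe(os.P_WAIT, './second_pythonscript', ['./second_pythonscript'], env)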
This code should provide the required environment to your 2nd script:
#!/usr/bin/env python
import os
os.environ['VAR'] = '/current_working_directory'
execfile("/path/to/your/second/script.py")  # Python 2 only; in Python 3, use exec(open(...).read())
A child process cannot change the environment of its parent process.
The general shell programming solution to this is to make your first script print out the value you need, and assign it in the calling process. Then you can do
#!/bin/sh
# Assign VAR to the output from first.py
VAR="$(first.py)"
export VAR
exec second.py
or even more succinctly
#!/bin/sh
VAR="$(first.py)" second.py
Obviously, if the output from first.py is trivial to obtain without invoking Python, that's often a better approach; but when, for example, two scripts call different functions from a library and/or communicate with a common back end, this is a common pattern.
Using Python for the communication between two pieces of Python code is often more elegant and Pythonic, though.
#!/usr/bin/env python
from yourlib import first, second

value = first()
# maybe putenv VAR here if that's really, really necessary
second(value)