$ cat test.py
#!/usr/bin/env python
# coding=utf-8
import os
print (os.environ.get('test'))
$ test=4 python test.py
4
$ test=4; python test.py
None
While in the shell I get something different from Python:
$ test=4; echo $test
4
But :
$ test=2
$ test=4 echo $test
2
So I am confused about how Python and bash handle this situation. Can someone explain?
That's the difference between a shell variable and an environment variable.
Here,
test=4 python test.py
passes test=4 to python's environment, so you will get the variable test inside the script.
Whereas
test=4; python test.py
creates a shell variable that is available only in the current shell session (that is why you can still read it from the shell just fine); it is not propagated to the environment.
To turn it into an environment variable, so that all subprocesses inherit it, the usual way in any POSIX shell is to export the variable:
export test=4; python test.py
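You can reproduce the difference from Python itself; a minimal sketch, assuming python3 and a POSIX sh are on the PATH:

```python
import subprocess

code = "import os; print(os.environ.get('test'))"

# plain assignment: shell variable only, not inherited by the child
out1 = subprocess.check_output(
    ["sh", "-c", 'unset test; test=4; python3 -c "%s"' % code])

# prefixed assignment: placed in the child's environment for this command
out2 = subprocess.check_output(
    ["sh", "-c", 'unset test; test=4 python3 -c "%s"' % code])

print(out1.decode().strip())  # None
print(out2.decode().strip())  # 4
```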
In your last case:
$ test=2
$ test=4 echo $test
2
the expansion of the variable test happens before the echo built-in runs, so echo sees the old value.
You need some way to defer the expansion until the new environment is in place:
$ test=2
$ test=4 sh -c 'echo $test'
4
You need to export the variable for Python.
$ export test=4
Then execute your Python script:
$ ./test.py
This ...
test=4 python test.py
... is a single python command, with variable test explicitly set in its environment, whereas this ...
test=4; python test.py
... is two separate commands. The first tells bash to set variable test (without marking it for export) in the current shell, and the second is the python command. Naturally, Python will not see the variable in its environment. But if you afterward do
echo $test
then the shell (not the echo command) expands the variable reference to its value as it processes the command line. The resulting expanded command is
echo 4
which does what you would expect.
Related
I have this script named test.py:
import os
print(os.environ['LOL'])
That I run as follow :
(LOL=HAHAHA; python3 test.py)
But it raises a KeyError because it can't find the variable LOL.
I also tried with :
os.getenv('LOL')
But it just returns None.
How can I access the variable LOL in this context?
You need to supply the environment variable on the same command line with which you invoke python3 test.py, as follows:
$ LOL=HAHAHA python3 test.py
HAHAHA
$ env LOL=HAHAHA python3 test.py
HAHAHA
$ echo $LOL
<empty string>
or you can export the variable for the current session as:
$ export LOL=HAHAHA
$ python3 test.py
HAHAHA
$ echo $LOL
HAHAHA
Simply doing LOL=HAHAHA; python3 test.py doesn't work because that only sets LOL as a shell variable in the shell process.
Another thing to note: the first approach only sets the environment variable for that specific command; it does not put it in the shell's own environment. Doing it with export instead sets it for the rest of the session. You can see the difference in the values of $LOL above.
You are trying to access an environment variable, so if you are on Windows, you need to set it with something like:
set LOL=HAHAHAHA
Then you should be able to access it. To verify that it was set correctly, you can also just run:
set
to get a full list of environment variables.
Just export the variable, i.e. run the command with:
export LOL=HAHA; python3 test.py
or set LOL in the same command, i.e. without the semicolon:
LOL=HAHA python3 test.py
Content of test.py:
import os
print(os.environ['LOL'])
I am running a python3 script which performs the following snippet on Debian 9:
os.environ["PA_DIR"] = "/home/myname/some_folder"
command_template = ("sudo java -Dconfig.file=$PA_DIR/google.conf "
"-jar ~/big/cromwell-42.jar run $PA_DIR/WholeGenomeGermlineSingleSample.wdl "
"-i {} -o $PA_DIR/au_options.json > FDP{}.log 2>&1")
command = command_template.format("test.json", "1")
os.system("screen -dm -S S{} bash -c '{}'".format("1", command))
The use of PA_DIR works as intended. When I tried it on command line:
PA_DIR="/home/myname/some_folder"
screen -dm -S S1 bash -c 'sudo java -Dconfig.file=$PA_DIR/google.conf -jar ~/big/cromwell-42.jar run $PA_DIR/WholeGenomeGermlineSingleSample.wdl -i test.json -o $PA_DIR/au_options.json > FDP1.log 2>&1'
it doesn't do variable substitution due to single quotes and I had to replace them with double quotes (it complains it cannot find the file /google.conf).
What is different when python runs it?
Thanks!
Python's os.system() invokes the underlying system() function of the C library, which on POSIX systems is equivalent to running something like
sh -c "your_command and all its arguments"
So the command and all its arguments are handed to a shell, which performs environment variable substitution. Any single quotes inside the string are irrelevant to that substitution.
You can test it easily. In a shell do something like
$ foo="bar"
$ echo "foo is '$foo'" # Will print foo is 'bar'
$ echo 'foo is "$foo"' # Will print foo is "$foo"
Waiting for your answer to daltonfury42, I'd bet the problem is that, when running on the command line, you are not exporting the PA_DIR environment variable, so it is not present in the second bash interpreter. And it behaves differently because of what Mihir answered.
If you run
PA_DIR=foo
you only declare a bash variable, but it is not an environment variable. Then
bash -c "echo $PA_DIR"
this will output foo because your current interpreter interpolates $PA_DIR and then spawns a second bash process running the command echo foo. But
bash -c 'echo $PA_DIR'
this prevents your bash interpreter from interpolating it, so it spawns a second bash process with the command echo $PA_DIR. In this second process the variable PA_DIR does not exist.
If you start your journey running
export PA_DIR=foo
this will become an environment variable that will be accessible to children processes, thus
bash -c 'echo $PA_DIR'
will output foo because the nested bash interpreter has access to the variable even if the parent bash interpreter did not interpolate it.
The same is true for any kind of children process. Try running
PA_DIR=foo
python3 -c 'import os; print(os.environ.get("PA_DIR"))'
python3 -c "import os; print(os.environ.get('PA_DIR'))"
export PA_DIR=foo
python3 -c 'import os; print(os.environ.get("PA_DIR"))'
python3 -c "import os; print(os.environ.get('PA_DIR'))"
in your shell. No quotes are involved here!
When you assign to the os.environ dictionary in a Python script, Python exports the variable for you (the assignment also calls putenv() under the hood). That's why you will see the variable interpolated by either
os.system("bash -c 'echo $PA_DIR'")
or
os.system('bash -c "echo $PA_DIR"')
But beware that in each case it is either the parent or the child shell process that interpolates the variable.
You must understand your process tree here:
/bin/bash # but it could be a zsh, fish, sh, ...
|- /usr/bin/python3 # presumably
|- /bin/sh # because os.system uses that
|- /bin/bash
If you want an environment variable to exist in the most nested process, you must export it anywhere in the upper tree. Or in that very process.
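That export-for-free behavior of os.environ can be checked directly; a small sketch (the path is just an illustrative value):

```python
import os
import subprocess

os.environ["PA_DIR"] = "/tmp/demo"  # assignment also calls putenv()

# a child sh sees the variable although we never typed `export`
out = subprocess.check_output(["sh", "-c", "echo $PA_DIR"])
print(out.decode().strip())  # /tmp/demo
```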
When a Python script is supposed to be run from a pyenv virtualenv, what is the correct shebang for the file?
As an example test case, the default Python on my system (OS X) does not have pandas installed. The pyenv virtualenv venv_name does. I tried getting the path of the Python executable from the virtualenv.
pyenv activate venv_name
which python
Output:
/Users/username/.pyenv/shims/python
So I made my example script.py:
#!/Users/username/.pyenv/shims/python
import pandas as pd
print 'success'
But when I tried running the script (from within 'venv_name'), I got an error:
./script.py
Output:
./script.py: line 2: import: command not found
./script.py: line 3: print: command not found
Although running that path directly on the command line (from within 'venv_name') works fine:
/Users/username/.pyenv/shims/python script.py
Output:
success
And:
python script.py # Also works
Output:
success
What is the proper shebang for this? Ideally, I want something generic so that it will point at the Python of whatever my current venv is.
I don't really know why calling the interpreter with the full path wouldn't work for you. I use it all the time. But if you want to use the Python interpreter that is in your environment, you should do:
#!/usr/bin/env python
That way env searches your PATH for the python interpreter to use.
As you expected, you should be able to use the full path to the virtual environment's Python executable in the shebang to choose/control the environment the script runs in regardless of the environment of the controlling script.
In the comments on your question, VPfB & you find that the /Users/username/.pyenv/shims/python is a shell script that does an exec $pyenv_python. You should be able to echo $pyenv_python to determine the real python and use that as your shebang.
See also: https://unix.stackexchange.com/questions/209646/how-to-activate-virtualenv-when-a-python-script-starts
Try pyenv virtualenvs to find a list of virtual environment directories.
And then you might find that using a shebang something like this:
#!/Users/username/.pyenv/python/versions/venv_name/bin/python
import pandas as pd
print 'success'
... will enable the script to work using the chosen virtual environment in other (virtual or not) environments:
(venv_name) $ ./script.py
success
(venv_name) $ pyenv activate non_pandas_venv
(non_pandas_venv) $ ./script.py
success
(non_pandas_venv) $ . deactivate
$ ./script.py
success
The trick is that if you call out the virtual environment's Python binary specifically, the Python interpreter looks around that binary's path location for the supporting files and ends up using the surrounding virtual environment. (See How does virtualenv work?)
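A quick, hedged way to check which environment a given interpreter bound to (sys.base_prefix exists on Python 3.3+; inside a virtual environment it differs from sys.prefix):

```python
import sys

print(sys.executable)  # the binary named in the shebang
print(sys.prefix)      # the environment that binary resolved to

# True when running inside a virtual environment
in_venv = sys.prefix != getattr(sys, "base_prefix", sys.prefix)
print(in_venv)
```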
If you need to use more shell than you can put in the #! shebang line, you can start the file with a simple shell script which launches Python on the same file.
#!/bin/bash
"exec" "pyenv" "exec" "python" "$0" "$@"
# the rest of your Python script can be written below
Because of the quoting, Python doesn't execute the first line, and instead joins the strings together for the module docstring... which effectively ignores it.
You can see more here.
To expand this to an answer, yes, in 99% of the cases if you have a Python executable in your environment, you can just use:
#!/usr/bin/env python
However, for a custom venv on Linux following the same syntax did not work for me since the venv created a link to the Python interpreter which the venv was created from, so I had to do the following:
#!/path/to/the/venv/bin/python
Essentially, however, you are able to call the Python interpreter in your terminal. This is what you would put after #!.
It's not exactly answering the question, but this suggestion by ephiement I think is a much better way to do what you want. I've elaborated a bit and added some more of an explanation as to how this works and how you can dynamically select the Python executable to use:
#!/bin/sh
#
# Choose the Python executable we need. Explanation:
# a) '''\' translates to \ in shell, and starts a Python multi-line string
# b) "" strings are treated as string concatenation by Python; the shell ignores them
# c) the "true" command ignores its arguments
# d) we exit before the closing ''' so the shell reads no further
# e) __doc__ is re-assigned afterward so the multi-line string does not
#    linger as the module docstring
#
"true" '''\'
PREFERRED_PYTHON=/Library/Frameworks/Python.framework/Versions/2.7/bin/python
ALTERNATIVE_PYTHON=/Library/Frameworks/Python.framework/Versions/3.6/bin/python3
FALLBACK_PYTHON=python3
if [ -x $PREFERRED_PYTHON ]; then
echo Using preferred python $PREFERRED_PYTHON
exec $PREFERRED_PYTHON "$0" "$@"
elif [ -x $ALTERNATIVE_PYTHON ]; then
echo Using alternative python $ALTERNATIVE_PYTHON
exec $ALTERNATIVE_PYTHON "$0" "$@"
else
echo Using fallback python $FALLBACK_PYTHON
exec $FALLBACK_PYTHON "$0" "$@"
fi
exit 127
'''
__doc__ = """What this file does"""
print(__doc__)
import platform
print(platform.python_version())
If you want just a single script with a simple selection of your pyenv virtualenv, you may use a Bash script with your source as a heredoc as follows:
#!/bin/bash
PYENV_VERSION=<your_pyenv_virtualenv_name> python - "$@" <<EOF
import sys
print(sys.argv)
exit
EOF
I did some additional testing. The following works too:
#!/usr/bin/env -S PYENV_VERSION=<virtual_env_name> python
/usr/bin/env python won't work, since it doesn't know about the virtual environment.
Assuming that you have main.py living next to a ./venv directory, you need to use Python from the venv directory. Or in other words, use this shebang:
#!venv/bin/python
Now you can do:
./main.py
Maybe you need to check the file permissions:
chmod +x script.py
I am on Ubuntu 13.04, bash, Python 2.7.4.
The interpreter doesn't see variables I set.
Here is an example:
$ echo $A
5
$ python -c 'import os; print os.getenv( "A" )'
None
$ python -c 'import os; print os.environ[ "A" ]'
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/usr/lib/python2.7/UserDict.py", line 23, in __getitem__
raise KeyError(key)
KeyError: 'A'
But everything works fine with the PATH variable:
$ echo $PATH
/usr/lib/lightdm/lightdm:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
$ python -c 'import os; print os.getenv("PATH")'
/usr/lib/lightdm/lightdm:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
And it notices changes in PATH:
$ PATH="/home/alex/tests/:$PATH"
$ echo $PATH
/home/alex/tests/:/usr/lib/lightdm/lightdm:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
$ python -c 'import os; print os.getenv("PATH")'
/home/alex/tests/:/usr/lib/lightdm/lightdm:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
What could be wrong?
PS the problem comes when using $PYTHONPATH:
$ python -c 'import os; print os.getenv("PYTHONPATH")'
None
Aha! The solution is simple!
I was setting variables with a plain A=5 command; when you use export B="foo" instead, everything is fine.
That is because export makes the variable available to sub-processes:
it creates a variable in the shell
and exports it into the environment of the shell
the environment is passed to sub-processes of the shell.
Plain $ A="foo" just creates variables in the shell and doesn't do anything with the environment.
The interpreter called from the shell obtains its environment from its parent, the shell. So the variable must be exported into the environment before the interpreter is started.
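The whole distinction fits in one runnable check (assuming python3 and sh are on the PATH):

```python
import subprocess

# A is assigned but never exported; B is exported
script = ('unset A; A=5; export B=foo; '
          'python3 -c "import os; print(os.getenv(\'A\'), os.getenv(\'B\'))"')
out = subprocess.check_output(["sh", "-c", script])
print(out.decode().strip())  # None foo
```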
Those variables (parameters in bash terminology) are not environment variables. You want to export them into the environment, using export or declare -x. See the bash documentation on environment.
Adding this as I do not see an answer covering the exact issue I had. If you use multiple shells, e.g. Bash and Z shell, ensure that you have exported the variable in the correct shell and that it is visible to Python.
If you are using VSCode and set the default shell to Z shell, then understandably, variables in .bashrc will not be visible to the python interpreter if they do not also exist in .zshrc. The solution then is to export the variable in both shells or change the default shell to the one with the necessary variables.
I was hoping to write a python script to create some appropriate environmental variables by running the script in whatever directory I'll be executing some simulation code, and I've read that I can't write a script to make these env vars persist in the mac os terminal. So two things:
Is this true?
and
It seems like it would be a useful things to do; why isn't it possible in general?
You can't do it from python, but some clever bash tricks can do something similar. The basic reasoning is this: environment variables exist in a per-process memory space. When a new process is created with fork() it inherits its parent's environment variables. When you set an environment variable in your shell (e.g. bash) like this:
export VAR="foo"
What you're doing is telling bash to set the variable VAR in its process space to "foo". When you run a program, bash uses fork() and then exec() to run the program, so anything you run from bash inherits the bash environment variables.
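That one-way inheritance is easy to verify from Python (a sketch; sh assumed present):

```python
import os
import subprocess

os.environ["VAR"] = "foo"  # set in this (parent) process

# a child inherits a copy of our environment...
child = subprocess.check_output(["sh", "-c", "echo $VAR"])
print(child.decode().strip())  # foo

# ...but nothing a child exports can reach back into the parent
subprocess.call(["sh", "-c", "export VAR=bar"])
print(os.environ["VAR"])  # still foo
```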
Now, suppose you want to create a bash command that sets some environment variable DATA with content from a file in your current directory called ".data". First, you need to have a command to get the data out of the file:
cat .data
That prints the data. Now, we want to create a bash command to set that data in an environment variable:
export DATA=`cat .data`
That command takes the contents of .data and puts it in the environment variable DATA. Now, if you put that inside an alias command, you have a bash command that sets your environment variable:
alias set-data='export DATA=$(cat .data)'
(Note the single quotes: they defer the command substitution until the alias is used, rather than running it once when the alias is defined.)
You can put that alias command inside the .bashrc or .bash_profile files in your home directory to have that command available in any new bash shell you start.
One workaround is to output export commands and have the parent shell evaluate them:
thescript.py:
import random
try:
    from shlex import quote  # Python 3
except ImportError:
    from pipes import quote  # Python 2

r = random.randint(1, 100)
print("export BLAHBLAH=%s" % quote(str(r)))
..and the bash alias (the same can be done in most shells.. even tcsh!):
alias setblahblahenv="eval $(python thescript.py)"
Usage:
$ echo $BLAHBLAH
$ setblahblahenv
$ echo $BLAHBLAH
72
You can output any arbitrary shell code, including multiple commands like:
export BLAHBLAH=23 SECONDENVVAR='something else' && echo 'everything worked'
Just remember to be careful about escaping any dynamically created output (pipes.quote, or shlex.quote on Python 3, is good for this).
If you set environment variables within a python script (or any other script or program), it won't affect the parent shell.
Edit clarification:
So the answer to your question is yes, it is true.
You can however export from within a shell script and source it by using the dot invocation
in fooexport.sh
export FOO="bar"
at the command prompt
$ . ./fooexport.sh
$ echo $FOO
bar
It's not generally possible. The new process created for Python cannot affect its parent process's environment. Neither can the parent affect a running child, but the parent gets to set up the child's environment as part of new process creation.
Perhaps you can set them in .bashrc, .profile or the equivalent "runs on login" or "runs on every new terminal session" script in MacOS.
You can also have python start the simulation program with the desired environment. (use the env parameter to subprocess.Popen (http://docs.python.org/library/subprocess.html) )
import subprocess, os

os.chdir('/home/you/desired/directory')
# merge with the current environment; env=dict(SOMEVAR=...) alone
# would replace the child's entire environment
subprocess.Popen(['desired_program_cmd', 'args', ...],
                 env=dict(os.environ, SOMEVAR='a_value'))
Or you could have python write out a shell script like this to a file with a .sh extension:
export SOMEVAR=a_value
cd /home/you/desired/directory
./desired_program_cmd
and then chmod +x it and run it from anywhere.
What I like to do is use /usr/bin/env in a shell script to "wrap" my command line when I find myself in similar situations:
#!/bin/bash
/usr/bin/env NAME1="VALUE1" NAME2="VALUE2" "$@"
So let's call this script "myappenv". I put it in my $HOME/bin directory which I have in my $PATH.
Now I can invoke any command using that environment by simply prepending "myappenv" as such:
myappenv dosometask -xyz
Other posted solutions work too, but this is my personal preference. One advantage is that the environment is transient, so if I'm working in the shell only the command I invoke is affected by the altered environment.
Modified version based on new comments
#!/bin/bash
/usr/bin/env G4WORKDIR=$PWD "$@"
You could wrap this all up in an alias too. I prefer the wrapper script approach since I tend to have other environment prep in there too, which makes it easier for me to maintain.
Building on Benson's answer, the best workaround is to create a simple bash function that preserves the arguments:
upsert-env-var () { eval "$(python upsert_env_var.py "$@")"; }
You can do whatever you want in your Python script with the arguments. To simply add a variable, use something like:
import sys, os

var = sys.argv[1]
val = sys.argv[2]
if os.environ.get(var):
    print "export %s=%s:%s" % (var, val, os.environ[var])
else:
    print "export %s=%s" % (var, val)
Usage:
upsert-env-var VAR VAL
As others have pointed out, the reason this doesn't work is that environment variables live in a per-process memory space and thus die when the Python process exits.
They point out that a solution to this is to define an alias in .bashrc to do what you want such as this:
alias export_my_program='export MY_VAR=$(my_program)'
However, there's another (a tad hacky) method which does not require you to modify .bashrc, nor requires you to have my_program in $PATH (or specify the full path to it in the alias). The idea is to run the program in Python if it is invoked normally (./my_program), but in Bash if it is sourced (source my_program). (Using source on a script does not spawn a new process and thus does not kill environment variables created within.) You can do that as follows:
my_program.py:
#!/usr/bin/env python3
_UNUSED_VAR=0
_UNUSED_VAR=0 \
<< _UNUSED_VAR
#=======================
# Bash code starts here
#=======================
'''
_UNUSED_VAR
export MY_VAR=`$(dirname $0)/my_program.py`
echo $MY_VAR
return
'''
#=========================
# Python code starts here
#=========================
print('Hello environment!')
Running this in Python (./my_program.py), the first 3 lines will not do anything useful and the triple-quotes will comment out the Bash code, allowing Python to run normally without any syntax errors from Bash.
Sourcing this in bash (source my_program.py), the heredoc (<< _UNUSED_VAR) is a hack used to "comment out" the first-triple quote, which would otherwise be a syntax error. The script returns before reaching the second triple-quote, avoiding another syntax error. The export assigns the result of running my_program.py in Python from the correct directory (given by $(dirname $0)) to the environment variable MY_VAR. echo $MY_VAR prints the result on the command-line.
Example usage:
$ source my_program.py
Hello environment!
$ echo $MY_VAR
Hello environment!
However, if run normally, the script will still do everything it did before, except exporting the environment variable:
$ ./my_program.py
Hello environment!
$ echo $MY_VAR
<-- Empty line
As noted by other authors, the memory is thrown away when the Python process exits. But during the python process, you can edit the running environment. For example:
>>> os.environ["foo"] = "bar"
>>> import subprocess
>>> subprocess.call(["printenv", "foo"])
bar
0
>>> os.environ["foo"] = "foo"
>>> subprocess.call(["printenv", "foo"])
foo
0