I am having a problem with environment variables in Python. How do I get Python to export variables to the parent shell?
I am using Ubuntu with Python 2.7.4.
I get this:
$ python
>>> import os
>>> os.environ
{'HOME':'~'}
>>> os.environ['foo']='bar'
>>> os.environ
{'HOME':'~','foo':'bar'}
>>> quit()
$ echo $foo
# Place #1
$ python
>>> import os
>>> os.environ
{'HOME':'~'} # Place #2
>>>
My expected output is:
At Place #1: bar
At Place #2: {'HOME':'~','foo':'bar'}
Thanks
Environment variables set in a child process (e.g. python) do not affect the parent process.
It's a one-way street; if this could be done it would be very easy to exploit shells! The environment variables must be set in the parent process itself. This restriction is enforced by the operating system and is not specific to Python.
Note that sourcing a file in a shell (e.g. . script.sh) doesn't create a new process, which is why it can modify the current shell's environment; but there is no way to "source" Python files.
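A common workaround, for completeness (a sketch; the script name and variables here are hypothetical): have the Python script print shell commands and let the parent shell eval its output, e.g. eval "$(python export_vars.py)". It is still the shell, not Python, that performs the export:
export_vars.py
import pipes  # on Python 3, use shlex.quote instead

variables = {'foo': 'bar'}  # hypothetical values to hand back to the shell
for name, value in variables.items():
    print('export %s=%s' % (name, pipes.quote(value)))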
Related
In order to declare a number of environment variables and then call some Python scripts that use them, I create a myfile.sh file, which is the one to run with bash myfile.sh.
However, I have plenty of scripts that should read these environment variables, and I cannot create a myfile.sh for each one!
So my idea is to create an environment variable file and have each of my Python scripts access it.
So my question is: how do I access such a file with Python 2.7?
A closely related question is the following:
Where does os.environ read the environment variables from?
It should be noted that I cannot install additional libraries such as dotenv. So a solution, if possible, should be based on standard libraries.
Any help would be mostly appreciated!
I have to create a new environment variables file that should be accessed by my script.
That's not how it works. What you need is to set the environment variables from the file BEFORE calling your script:
myenv.sh
export DB_USER="foo"
export DB_PWD="bar"
export DB_NAME="mydb"
# etc
myscript.py
import os
DB_USER = os.getenv("DB_USER")
DB_PWD = os.getenv("DB_PWD")
DB_NAME = os.getenv("DB_NAME")
# etc
Then
$ source /path/to/myenv.sh
$ python /path/to/myscript.py
Most often you will want to wrap the above two lines in a shell script to make your sysadmin's life easier.
EDIT: if you want those env vars to be "automagically" set, you can always source them from .bashrc or some similar place - but then this is a sysadmin question, not a programming one.
Now the question is: do you really need to use environment variables here? You could just as well use a Python settings file - a plain Python module that just defines those variables, i.e.:
mysettings.py
DB_USER = "foo"
DB_PWD = "bar"
# etc
and make sure the path to the directory containing this script is in your $PYTHONPATH. Then your scripts only have to import it (like they import any other module), and you're done.
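A minimal usage sketch (the script name usedb.py is hypothetical):
usedb.py
import mysettings

print(mysettings.DB_USER)  # -> "foo"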
Icing on the cake: you can even mix both solutions, by having your settings module look up environment variables and provide a default, i.e.:
mysettings.py
import os
DB_USER = os.getenv("DB_USER", "foo")
DB_PWD = os.getenv("DB_PWD", "bar")
# etc
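And if the values really must live in a plain KEY=VALUE file rather than a Python module (since you cannot install dotenv), a stdlib-only reader takes only a few lines - a sketch, with the file path and format assumed:
import os

def load_env(path):
    # Parse KEY=VALUE lines, skipping blanks and comments, into os.environ.
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith('#'):
                continue
            key, _, value = line.partition('=')
            os.environ[key.strip()] = value.strip()

load_env('/path/to/myenv.conf')  # hypothetical path
print(os.getenv('DB_USER'))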
Showing the environment from the command line
Linux: export
Windows: set
Setting an environment variable
Linux: export foo=bar
Windows: set foo=bar
Printing an environment variable
Linux: echo $foo
Windows: echo %foo%
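From Python itself, a rough equivalent of listing the environment (a sketch):
import os

# Print every environment variable, like `export` or `set` with no arguments.
for name in sorted(os.environ):
    print(name + '=' + os.environ[name])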
I am going to try to say this right, but it's a bit outside my area of expertise.
I am using the xgboost library on Windows with Python 2.7, which requires all kinds of nasty compiling and installation.
That done, the instructions I'm following tell me I need to modify the PATH environment variable in an IPython notebook before I actually import the library for use.
The instructions tell me to run the following:
import os
mingw_path = 'C:\\Program Files\\mingw-w64\\x86_64-5.3.0-posix-seh-rt_v4-rev0\\mingw64\\bin'
os.environ['PATH'] = mingw_path + ';' + os.environ['PATH']
then I can import
import xgboost as xgb
import numpy as np
....
This works. My question: does the os.environ modification make a permanent change to the PATH variable, or do I need to modify the PATH this way each time I want to use xgboost?
Thanks in advance.
EDIT
Here is a link to the instructions I'm following. The part I'm referencing is toward the end.
The os.environ mapping only exists within the scope of the Python/Jupyter process:
Here's evidence of this in my bash shell:
$ export A=1
$ echo $A
1
$ python -c "import os; print(os.environ['A']); os.environ['A'] = '2'; print(os.environ['A'])"
1
2
$ echo $A
1
The python line above prints the environment variable A, then changes its value and prints it again.
So, as you can see, os.environ can be changed within the Python script, but once the script exits, the environment of the bash shell is unchanged.
Another way of doing this is to modify your User or System PATH variable. But this may break other things, because what you're doing may replace the default compiler with mingw, and complications may arise. I'm not a Windows expert, so I'm not sure about that part.
In a nutshell:
The os.environ manipulations are local to the Python process
They won't affect any other program
They have to be done every time you want to import xgboost
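If retyping the snippet bothers you, you can at least make it idempotent and keep it at the top of the notebook - a sketch reusing the mingw path from the question:
import os

mingw_path = 'C:\\Program Files\\mingw-w64\\x86_64-5.3.0-posix-seh-rt_v4-rev0\\mingw64\\bin'
# Prepend only if missing, so re-running the cell does no harm.
if mingw_path not in os.environ['PATH'].split(os.pathsep):
    os.environ['PATH'] = mingw_path + os.pathsep + os.environ['PATH']

import xgboost as xgb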
How can I run a bunch of imports and path appends in the interpreter with one command/import? If I import another module that runs the commands for me, the imports are not available in the main namespace. It's similar to running a bash script that modifies/adds commands and variables to the current session.
ex.
import os, ...
sys.path.append(...)
If I understand you correctly, you're just looking for the from … import … statement. For example:
lotsostuff.py:
import json
def foo(): pass
Now:
$ python3.3
>>> from lotsostuff import *
>>> json
<module 'json' from '/usr/local/lib/python3.3/json/__init__.py'>
>>> foo
<function lotsostuff.foo>
However, you might want to consider a different alternative. If you're just trying to control the startup of your interpreter session, you can do this:
$ export PYTHONSTARTUP=lotsostuff.py
$ python3.3
>>> json
<module 'json' from '/usr/local/lib/python3.3/json/__init__.py'>
>>> foo
<function __main__.foo>
Notice the difference in the last line. You're now running lotsostuff in the __main__ namespace, rather than running in a separate namespace and grabbing all of its members.
Similarly:
$ python3.3 -i lotsostuff.py
>>> json
<module 'json' from '/usr/local/lib/python3.3/json/__init__.py'>
You'd normally use PYTHONSTARTUP if you want to do this every time in your session, -i if you want to do it just this once.
If you want to do the same thing in the middle of a session instead of at startup… well, you can't do it directly, but you can come pretty close with exec (or execfile in Python 2.x).
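For example (a sketch of that mid-session approach):
# Python 3: execute a file's contents in the current namespace.
exec(open('lotsostuff.py').read())

# Python 2 equivalent:
# execfile('lotsostuff.py')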
If you really want to do exactly what you described—importing a module, as a normal import, except merged into your namespace instead of in its own—you'll need to customize the import process. This isn't that hard with importlib; if you're not in Python 3.2 or later, you'll have a lot more work to do it with imp.
That's pretty much the difference between . ./foo and just ./foo in a bash script, which I think is what you were looking for.
If you're using ipython, there are even cooler options. (And if you're not using ipython, you might want to check it out.)
After having installed a new virtualenv, for example called ENV, if I type
. /path/to/ENV/bin/activate
python
import os
print os.environ['VIRTUAL_ENV']
Then I see /path/to/ENV/
However, if I type
/path/to/ENV/bin/python
And then
import os
print os.environ['VIRTUAL_ENV']
I get a KeyError
So what is the fundamental difference between these two methods?
Thanks,
Inside the script at bin/activate, there's a line that looks like this:
VIRTUAL_ENV="/Users/me/.envs/myenv"
export VIRTUAL_ENV
That line is what's responsible for setting your VIRTUAL_ENV environment variable. When you don't source activate, that variable never gets exported - so it's not present in os.environ.
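Incidentally, if you want to detect from inside Python whether you are in a virtualenv without relying on that variable, you can inspect sys instead - a sketch:
import sys

# Classic virtualenv sets sys.real_prefix; the stdlib venv (3.3+) makes
# sys.prefix differ from sys.base_prefix.
in_venv = (hasattr(sys, 'real_prefix') or
           getattr(sys, 'base_prefix', sys.prefix) != sys.prefix)
print(in_venv)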
I want to implement a userland command that will take one of its arguments (path) and change the directory to that dir. After the program completion I would like the shell to be in that directory. So I want to implement cd command, but with external program.
Can this be done in a Python script, or do I have to write a bash wrapper?
Example:
tdi#bayes:/home/$>python cd.py tdi
tdi#bayes:/home/tdi$>
Others have pointed out that you can't change the working directory of a parent from a child.
But there is a way you can achieve your goal -- if you cd from a shell function, it can change the working dir. Add this to your ~/.bashrc:
go() {
cd "$(python /path/to/cd.py "$1")"
}
Your script should print the path to the directory that you want to change to. For example, this could be your cd.py:
#!/usr/bin/python
import sys, os.path
if sys.argv[1] == 'tdi':
    print(os.path.expanduser('~/long/tedious/path/to/tdi'))
elif sys.argv[1] == 'xyz':
    print(os.path.expanduser('~/long/tedious/path/to/xyz'))
Then you can do:
tdi#bayes:/home/$> go tdi
tdi#bayes:/home/tdi$>
That is not going to be possible.
Your script runs in a sub-shell spawned by the parent shell where the command was issued.
Any cding done in the sub-shell does not affect the parent shell.
cd is (almost) exclusively implemented as a shell built-in command, because an external program cannot change the parent shell's CWD.
As codaddict writes, what happens in your sub-shell does not affect the parent shell. However, if your goal is to present the user with a shell in a different directory, you could always have Python use os.chdir to change the sub-shell's working directory and then launch a new shell from Python. This will not change the working directory of the original shell, but will leave the user with one in a different directory.
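A minimal sketch of that approach (how the target path is chosen is up to you):
import os
import subprocess
import sys

# Change this process's working directory, then hand the user a new
# shell that inherits it; the original shell is left untouched.
target = sys.argv[1]
os.chdir(os.path.expanduser(target))
subprocess.call([os.environ.get('SHELL', '/bin/bash')])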
As explained by mrdiskodave in Equivalent of shell 'cd' command to change the working directory?, there is a hack to achieve the desired behavior in pure Python.
I made some modifications to the answer from mrdiskodave to make it work in Python 3:
The pipes.quote() function has moved to shlex.quote().
To mitigate the issue of user input during execution, you can delete any previous user input with the backspace character "\x08".
So my adaptation looks like the following:
import fcntl
import shlex
import termios
from pathlib import Path

def change_directory(path: Path):
    quoted_path = shlex.quote(str(path))
    # Remove up to 32 characters previously entered by the user.
    backspace = "\x08" * 32
    cmd = f"{backspace}cd {quoted_path}\n"
    # TIOCSTI pushes each byte into the terminal's input queue, as if the
    # user had typed it; ioctl needs bytes, not str, hence the encode().
    for char in cmd:
        fcntl.ioctl(1, termios.TIOCSTI, char.encode())
I shall try to show how to set a Bash terminal's working directory to whatever path a Python program wants in a fairly easy way.
Only Bash can set its working directory, so routines are needed for Python and Bash. The Python program has a routine defined as:
# "somefile" is a path both programs agree on; "dd" is the desired
# directory path - both are defined elsewhere in your program (see below).
fob = open(somefile, "w")
fob.write(dd)
fob.close()
"Somefile" could for convenience be a RAM disk file. Bash "mount" would show tmpfs mounted somewhere like "/run/user/1000", so somefile might be "/run/user/1000/pythonwkdir". "dd" is the full directory path name desired.
The Bash file would look like:
#!/bin/bash
# pysync --- the command ". pysync" will set the bash dir to what Python recorded
cd "$(cat /run/user/1000/pythonwkdir)"