Python Modify OS Path Variable

I am going to try to say this right, but it's a bit outside my area of expertise.
I am using the xgboost library in a Windows environment with Python 2.7, which requires all kinds of nasty compiling and installation.
That done, the instructions I'm following tell me I need to modify the OS PATH variable in an IPython notebook before I actually import the library for use.
The instructions tell me to run the following:
import os
mingw_path = 'C:\\Program Files\\mingw-w64\\x86_64-5.3.0-posix-seh-rt_v4-rev0\\mingw64\\bin'
os.environ['PATH'] = mingw_path + ';' + os.environ['PATH']
then I can import
import xgboost as xgb
import numpy as np
....
This works. My question: does this os.environ modification make a permanent change to the PATH variable, or do I need to modify it each time I want to use xgboost, as above?
Thanks in advance.
EDIT
Here is a link to the instructions I'm following. The part I'm referencing is toward the end.

Changes made through os.environ are visible only inside the Python/Jupyter process:
Here's evidence of this in my bash shell:
$ export A=1
$ echo $A
1
$ python -c "import os; print(os.environ['A']); os.environ['A'] = '2'; print(os.environ['A'])"
1
2
$ echo $A
1
The python line above prints the environment variable A, then changes its value and prints it again.
So, as you can see, an os.environ variable can be changed within the Python script, but once the script exits, the environment of the bash shell is unchanged.
Another way of doing this is to modify your User or System PATH variable. But this may break other things, because prepending mingw may replace the default compiler and complications may arise. I'm not a Windows expert, so I'm not sure about that part.
In a nutshell:
- The os.environ manipulations are local to the Python process
- They won't affect any other program
- They have to be done every time you want to import xgboost (see the sketch below)
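To avoid retyping those lines, one option is a tiny helper module that every notebook imports first. This is only a sketch, not part of the original instructions; the mingw path is the one from the question and will differ on other machines:
xgb_setup.py
import os

# Path taken from the question; adjust to your mingw-w64 install.
MINGW_PATH = 'C:\\Program Files\\mingw-w64\\x86_64-5.3.0-posix-seh-rt_v4-rev0\\mingw64\\bin'

# Prepend only if missing, so repeated imports stay harmless.
# os.pathsep is ';' on Windows and ':' elsewhere.
if MINGW_PATH not in os.environ['PATH'].split(os.pathsep):
    os.environ['PATH'] = MINGW_PATH + os.pathsep + os.environ['PATH']
A notebook then starts with import xgb_setup followed by import xgboost as xgb.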

Related

Execute custom command using subprocess that is in PATH variable

I want to execute another program, which I compiled earlier, from Python using the subprocess.call(command) function.
However, Python states that it cannot find the command. I suspect that subprocess cannot find my custom command because it does not know the PATH variable of my Ubuntu system. Is it possible to somehow execute the following code, where command is part of my PATH?
import subprocess
subprocess.run("command -args")
Running this code leads to the error command not found.
You can either provide the explicit path to your command:
subprocess.run('/full/path/to/command.sh')
or else modify your PATH variable in your Python code:
import os
os.environ['PATH'] += ':'+'/full/path/to/'
subprocess.run('command.sh')
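Note also that, without shell=True, a single string like "command -args" is treated as the literal executable name rather than a command line. A small sketch of the two working forms (the command name is a placeholder):
import subprocess

# Preferred: pass the program and its arguments as a list (no shell involved).
subprocess.run(["command", "-args"])

# Alternative: keep the single string, but let a shell parse it.
subprocess.run("command -args", shell=True)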
You can also modify the environment variables, but be careful how you pass the arguments.
Try something like this:
import os
import subprocess

# Copy the parent environment and prepend the extra directory to PATH.
my_env = os.environ.copy()
my_env["PATH"] = "/usr/test/path:" + my_env["PATH"]

# The child process receives the modified environment via env=.
subprocess.run(["command", "-args"], env=my_env)

How to read environment variables from specific file with python 2.7

In order to declare a number of environment variables and then call some Python scripts that use them, I create a myfile.sh file, which is the one to run with bash myfile.sh.
I have, however, plenty of scripts that should read these environment variables, and I cannot create a myfile.sh for each one!
So my idea is to create an environment variable file and access it from each of my Python scripts.
So my question is: how do I access such a file with Python 2.7?
A most relevant question is the following:
Where does os.environ read the environment variables from?
It should be noted that I cannot install additional libraries such as dotenv. So a solution, if possible, should be based on standard libraries.
Any help would be mostly appreciated!
"I have to create a new environment variables file that should be accessed by my script."
That's not how it works. What you need is to set the environment variables from the file BEFORE calling your script:
myenv.sh
export DB_USER="foo"
export DB_PWD="bar"
export DB_NAME="mydb"
# etc
myscript.py
import os
DB_USER = os.getenv("DB_USER")
DB_PWD = os.getenv("DB_PWD")
DB_NAME = os.getenv("DB_NAME")
# etc
Then
$ source /path/to/myenv.sh
$ python /path/to/myscript.py
Most often you will want to wrap the above two lines in a shell script to make your sysadmin's life easier.
EDIT: if you want those env vars to be "automagically" set, you can always source them from .bashrc or some similar place - but then this is a sysadmin question, not a programming one.
Now the question is: do you really need environment variables here? You could just as well use a Python settings file: a plain Python module that defines those variables, i.e.:
mysettings.py
DB_USER = "foo"
DB_PWD = "bar"
# etc
and make sure the directory containing this module is on your $PYTHONPATH. Then your scripts only have to import it (like any other module), and you're done.
Icing on the cake: you can even mix both solutions, by having your settings module look up environment variables and provide a default, i.e.:
mysettings.py
import os
DB_USER = os.getenv("DB_USER", "foo")
DB_PWD = os.getenv("DB_PWD", "bar")
# etc
Show the environment from the command line:
linux: export
windows: set
Set an environment variable:
linux: export foo=bar
windows: set foo=bar
Print an environment variable:
linux: echo $foo
windows: echo %foo%
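And the same operations from Python, which work on both platforms (a minimal sketch):
import os

os.environ["foo"] = "bar"      # set, for this process and its children
print(os.environ.get("foo"))   # read it back -> bar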

Not getting LD_LIBRARY_PATH

I am amending an existing script in which I want to compare the set of libraries used in an executable against the shared libraries called at run time. I have the list of libraries that I need to compare with the shared libraries. To get the shared libraries, I am trying to read LD_LIBRARY_PATH with the code below, but I have had no luck. I checked the variable on the command line by giving
echo $LD_LIBRARY_PATH
and it returned /opt/cray/csa/3.0.0-1_2.0501.47112.1.91.ari/lib64:/opt/cray/job/1.5.5-0.1_2.0501.48066.2.43.ari/lib64
The things that I have already tried (this is a Python script):
#! /usr/bin/python -E
import os
ld_lib_path = os.environ.get('LD_LIBRARY_PATH')
#ld_lib_path = os.environ["LD_LIBRARY_PATH"]
I think you are just missing a print in your script? This works for me from the command line:
python -c 'import os; temp=os.environ.get("LD_LIBRARY_PATH"); print temp'
script:
#! /usr/bin/python -E
import os
ld_lib_path = os.environ.get('LD_LIBRARY_PATH')
print ld_lib_path
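If the next step is comparing against your list of libraries, a small follow-on sketch (Python 2, like the script above) splits the variable into its individual directories:
import os

# Default to '' so the script also works when the variable is unset.
ld_lib_path = os.environ.get('LD_LIBRARY_PATH', '')
# os.pathsep is ':' on Linux.
lib_dirs = [d for d in ld_lib_path.split(os.pathsep) if d]
for d in lib_dirs:
    print d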

python scripts issue (no module named ...) when starting in rc.local

I'm facing a strange issue, and after a couple of hours of research I'm looking for help / an explanation.
It's quite simple: I wrote a CGI server in Python and I'm working with some libs, including pynetlinux for instance.
When I start the script from a terminal as any user, it works fine: no bugs, no dependency issues. But when I try to start it from a script in rc.local, the following code produces an error.
import sys, cgi, pynetlinux, logging
It produces the following error:
Traceback (most recent call last):
File "/var/simkiosk/cgi-bin/load_config.py", line 3, in
import cgi, sys, json, pynetlinux, loggin
ImportError: No module named pynetlinux
Other dependencies produce similar issues. I suspect a few things, such as the user executing the script in rc.local (normally root), and I have tried some suggestions found on the web without success.
Can somebody help me?
Thanks in advance.
Regards.
Ollie314
First of all, you need to make sure the module you want to import is installed properly. You can check whether its name appears in pip list.
Then, in a Python shell, check the paths where Python looks for modules:
import sys
sys.path
In my case, the output is:
['', '/usr/lib/python3.4', '/usr/lib/python3.4/plat-x86_64-linux-gnu', '/usr/lib/python3.4/lib-dynload', '/usr/local/lib/python3.4/dist-packages', '/usr/lib/python3/dist-packages']
Finally, append those paths to the $PYTHONPATH variable in /etc/rc.local (PATH only controls where executables are found; PYTHONPATH is what Python consults for imports). Here is an example of my rc.local:
#!/bin/sh -e
#
# rc.local
#
# This script is executed at the end of each multiuser runlevel.
# Make sure that the script will "exit 0" on success or any other
# value on error.
#
# In order to enable or disable this script just change the execution
# bits.
#
# By default this script does nothing
export PYTHONPATH="$PYTHONPATH:/usr/lib/python3.4:/usr/lib/python3.4/plat-x86_64-linux-gnu:/usr/lib/python3.4/lib-dynload:/usr/local/lib/python3.4/dist-packages:/usr/lib/python3/dist-packages"
# Do stuff
exit 0
The path where your modules are installed is probably normally set up by .bashrc or something similar, but .bashrc doesn't get sourced when the shell is not interactive. /etc/profile is one place you can put system-wide path changes. Depending on the Linux version/distro, it may use /etc/profile.d/, in which case /etc/profile runs all the scripts in /etc/profile.d; add a new shell script there with execute permissions and a .sh extension.
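Another option that avoids environment setup entirely is to extend sys.path at the top of the script itself, before the failing imports. A sketch; the dist-packages path is an example and should match wherever pip actually installed pynetlinux:
import sys

# Example location; check "pip show pynetlinux" for the real one.
sys.path.append('/usr/local/lib/python2.7/dist-packages')

import pynetlinux  # now resolvable even when launched from rc.local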

How to find a module in a virtualenv without activating said virtualenv?

Suppose I have the following setup:
mkdir test && cd test
virtualenv .venv
source .venv/bin/activate
pip install django
mkdir mod1
touch mod1/__init__.py
echo "a = 1" > mod1/mod2.py
Which gives me:
test/.venv
test/mod1/__init__.py
test/mod1/mod2.py
How would I write this function:
def get_module(module_name, root_path, virtualenv_path=None)
In order for this to work:
project_root_path = "./test"
project_virtualenv_path = "./test/.venv"
get_module("mod1.mod2", project_root_path, project_virtualenv_path)
get_module("django.contrib.auth", project_root_path, project_virtualenv_path)
Assuming I don't have ./test/.venv activated.
The reason I want to do this, is because I'm working on a vim plugin which would implement gf functionality in a python file on an import statement. I'm trying to support virtualenvs as well.
EDIT:
Also, the script should not alter the current runtime by adding or appending to sys.path. This should run inside vim, via the vim Python bindings, and I don't think altering the vim Python runtime would be a good idea.
get_module could either return a module object, or the path to the module, which is what I'm basically looking for.
You can add your virtualenv to the Python path like this:
import site
site.addsitedir('/home/user/.virtualenvs/myapp1/lib/python2.7/site-packages')
and then the import should work.
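That said, site.addsitedir modifies sys.path, which the EDIT rules out. Here is a standard-library sketch that resolves a module's file path without importing it or touching sys.path, using Python 2's imp module (the lib/python2.7/site-packages layout is an assumption about the virtualenv):
import imp
import os

def get_module(module_name, root_path, virtualenv_path=None):
    # Return the filesystem path of a dotted module without importing it.
    search_paths = [root_path]
    if virtualenv_path:
        # Typical virtualenv layout; adjust the interpreter version if needed.
        search_paths.append(os.path.join(
            virtualenv_path, 'lib', 'python2.7', 'site-packages'))
    path = search_paths
    for part in module_name.split('.'):
        # imp.find_module searches only the given directories,
        # so sys.path is never consulted or modified.
        file_obj, path_name, _ = imp.find_module(part, path)
        if file_obj:
            file_obj.close()
        path = [path_name]
    return path_name

# Usage, matching the question's setup:
# get_module("mod1.mod2", "./test", "./test/.venv")
# get_module("django.contrib.auth", "./test", "./test/.venv")
This does not handle eggs or namespace packages, but it covers plain modules and packages like the two examples in the question.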
The only practical solution I could find here is to run the virtualenv's activate_this.py script, look for what I need, then remove its changes from sys.path.
import sys
import os
old_sys_path = list(sys.path)
virtualenv_path = "/path/to/venv"
activate_this_path = os.path.join(virtualenv_path, "bin", "activate_this.py")
execfile(activate_this_path, dict(__file__=activate_this_path))
# get my module here
# restore sys.path
sys.path = old_sys_path
If you have a better answer, please add it, and I'll change the accepted answer gladly.
