I need to set an environment variable in Python, and I tried the commands below:
import os
basepath = os.putenv('CPATH','/Users/cat/doc')
basepath = os.getenv('CPATH','/Users/cat/doc')
And when I print the variable, it is not set:
print basepath
None
What am I doing wrong?
Rephrasing the question: I would like to create a base path from an environment variable. I'm testing this:
os.environ["CPATH"] = "/Users/cat/doc"
print os.environ["CPATH"]
base_path=os.getenv('C_PATH')
And when I try to print the basepath:
print basepath
it always returns
None
Try this one.
os.environ["CPATH"] = "/Users/cat/doc"
The Python documentation is quite clear about it:
Calling putenv() directly does not change os.environ, so it’s better to modify os.environ.
According to the documentation, os.getenv() is available on most flavors of Unix and Windows (OS X is not listed).
Use the snippet below instead to retrieve the value:
value = os.environ.get('CPATH')
Use os.environ:
os.environ['CPATH'] = '/Users/cat/doc'
print os.environ['CPATH'] # /Users/cat/doc
print os.environ.get('CPATH') # /Users/cat/doc
See the above link for more details:
If the platform supports the putenv() function, this mapping may be
used to modify the environment as well as query the environment.
putenv() will be called automatically when the mapping is modified.
Note Calling putenv() directly does not change os.environ, so it’s
better to modify os.environ.
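For example, a minimal check of that difference (assuming OTHER_PATH is not already set in your shell; the names are just illustrative):
import os

# Assigning through os.environ updates the mapping and calls putenv() for you,
# so both lookups see the new value.
os.environ['CPATH'] = '/Users/cat/doc'
print(os.environ.get('CPATH'))       # /Users/cat/doc
print(os.getenv('CPATH'))            # /Users/cat/doc

# os.putenv() only changes the process environment, not os.environ,
# so the mapping (and therefore os.getenv()) still reports nothing.
os.putenv('OTHER_PATH', '/tmp/example')
print(os.environ.get('OTHER_PATH'))  # None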
It's good practice to restore the environment variables when the function completes.
You may need something like the modified_environ context manager described in this question to restore the environment variables.
Usage example:
with modified_environ(CPATH='/Users/cat/doc'):
    call_my_function()
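A minimal sketch of such a context manager (an assumption of what the linked answer provides; this version only sets string values passed as keyword arguments and restores them afterwards):
import os
from contextlib import contextmanager

@contextmanager
def modified_environ(**overrides):
    # Remember the original values so they can be restored afterwards.
    saved = {key: os.environ.get(key) for key in overrides}
    os.environ.update(overrides)
    try:
        yield
    finally:
        for key, value in saved.items():
            if value is None:
                os.environ.pop(key, None)
            else:
                os.environ[key] = value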
Edit: My first attempt at asking this might have been a bit unfocused/poorly worded; here's a better explanation of what I'm trying to do:
I'm trying to modify the default behavior of the print function for the entire environment Python is running in, without having to modify each file that's being run.
I'm attempting to decorate the print function (I know there are many ways to do this, such as overriding it, but that's not really the question I'm asking) so I can have it print out some debugging information and force it to always flush. I did that like so:
def modify_print(func):
    # I made this so that output always gets flushed, as it won't by default
    # within the environment I'm using. I also wanted it to print out some
    # debugging information; that doesn't really matter much in the context
    # of this question.
    def modified_print(*args, **kwargs):
        return func("some debug prefix: ", flush=True, *args, **kwargs)
    return modified_print

print = modify_print(print)

print("Hello world")  # Prints "some debug prefix:  Hello world"
However, what I'm trying to do is modify this behavior throughout my entire application. I know I can manually decorate/override/import the print function in each file, but I'm wondering if there is some way to globally configure my Python environment to decorate this function everywhere. The only way I can think of to do this would be to edit the Python source code and build the modified version.
EDIT:
Here's the behavior I wanted, implemented; thank you Match for your help.
It prints out the line number and file name everywhere you call a print function within your Python environment, which means you don't have to import or override anything manually in all of your files.
https://gist.github.com/MichaelScript/444cbe5b74dce2c01a151d60b714ac3a
import site
import os
import pathlib
# Big thanks to Match on StackOverflow for helping me with this
# see https://stackoverflow.com/a/48713998/5614280
# This is some cool hackery to overwrite the default functionality of
# the builtin print function within your entire python environment
# to display the file name and the line number as well as always flush
# the output. It works by creating a custom user script and placing it
# within the user's site-packages directory and then overwriting the builtin.
# You can disable this behavior by running python with the '-s' flag.
# We could probably swap this out by reading the text from a python file
# which would make it easier to maintain larger modifications to builtins
# or a set of files to make this more portable or to modify the behavior
# of more builtins for debugging purposes.
customize_script = """
from inspect import getframeinfo, stack

def debug_printer(func):
    # I made this so that output always gets flushed as it won't by default
    # within the environment I'm running it in. Also it will print the
    # file name and line number of where the print occurs
    def debug_print(*args, **kwargs):
        frame = getframeinfo(stack()[1][0])
        return func(f"{frame.filename} : {frame.lineno} ", flush=True, *args, **kwargs)
    return debug_print

__builtins__['print'] = debug_printer(print)
"""
# Creating the user site dir if it doesn't already exist and writing our
# custom behavior modifications
pathlib.Path(site.USER_SITE).mkdir(parents=True, exist_ok=True)
custom_file = os.path.join(site.USER_SITE,"usercustomize.py")
with open(custom_file, 'w+') as f:
    f.write(customize_script)
You can use the usercustomize script from the site module to achieve something like this.
First, find out where your user site-packages directory is:
python3 -c "import site; print(site.USER_SITE)"
/home/foo/.local/lib/python3.6/site-packages
Next, in that directory, create a script called usercustomize.py - this script will now be run first whenever python is run.
One* way to replace print is to override the entry in the __builtins__ dict with a new function, something like:
from functools import partial
old_print = __builtins__['print']
__builtins__['print'] = partial(old_print, "Debug prefix: ", flush=True)
Drop this into the usercustomize.py script and you should see all python scripts from then on being overridden. You can temporarily disable calling this script by calling python with the -s flag.
*(Not sure if this is the correct way of doing this - there may be a better way - but the main point is that you can use usercustomize to deliver whatever method you choose).
There's no real reason to define a decorator here, because you are only intending to apply it to a single, predetermined function. Just define your modified print function directly, wrapping it around __builtins__.print to avoid recursion.
def print(*args, **kwargs):
    __builtins__.print("some debug prefix: ", flush=True, *args, **kwargs)

print("Hello world")  # Prints "some debug prefix:  Hello world"
You can use functools.partial to simplify this.
import functools
print = functools.partial(__builtins__.print, "some debug prefix: ", flush=True)
I am writing a script in Python 2.7.
It needs to be able to go to whatever the current user's profile is in Windows.
This is the variable and function I currently have:
import os
desired_paths = os.path.expanduser('HOME'\"My Documents")
I do have doubts that this expanduser will work, though. I tried looking for a list of Windows environment variables in Python so I would know what to convert it to. Either such a tool doesn't exist or I'm just not using the right search terms, since I'm still pretty new and learning.
You can access environment variables via the os.environ mapping:
import os
print(os.environ['USERPROFILE'])
This will work in Windows. For another OS, you'd need the appropriate environment variable.
Also, the way to concatenate strings in Python is with + signs, so this:
os.path.expanduser('HOME'\"My Documents")
^^^^^^^^^^^^^^^^^^^^^
should probably be something else. But to concatenate paths you should be more careful, and probably want to use something like:
os.sep.join(<your path parts>)
# or
os.path.join(<your path parts>)
(There is a slight distinction between the two)
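For example (run on Windows; the paths are just illustrative), os.path.join understands drive letters and absolute components, while os.sep.join is a plain string join:
import os

parts = ["C:\\Users\\cat", "My Documents"]
print(os.sep.join(parts))     # C:\Users\cat\My Documents
print(os.path.join(*parts))   # C:\Users\cat\My Documents

# The difference shows up when a later part is itself absolute:
print(os.sep.join(["ignored", "C:\\fresh"]))  # ignored\C:\fresh
print(os.path.join("ignored", "C:\\fresh"))   # C:\fresh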
If you want the My Documents directory of the current user, you might try something like:
docs = os.path.join(os.environ['USERPROFILE'], "My Documents")
Alternatively, using expanduser:
docs = os.path.expanduser(os.sep.join(["~","My Documents"]))
Lastly, to see what environment variables are set, you can do something like:
print(os.environ.keys())
(In reference to finding a list of what environment vars are set)
Going by os.path.expanduser, using ~ would seem more reliable than using 'HOME'.
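A quick illustration of that (the output is for a hypothetical Windows user named cat):
import os

print(os.path.expanduser('~'))     # e.g. C:\Users\cat
print(os.path.expanduser('HOME'))  # 'HOME' is returned unchanged; only a leading ~ is expanded
print(os.path.join(os.path.expanduser('~'), "My Documents"))  # e.g. C:\Users\cat\My Documents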
I believe that running an external command with a slightly modified environment is a very common case. That's how I tend to do it:
import subprocess, os
my_env = os.environ
my_env["PATH"] = "/usr/sbin:/sbin:" + my_env["PATH"]
subprocess.Popen(my_command, env=my_env)
I've got a gut feeling that there's a better way; does it look alright?
I think os.environ.copy() is better if you don't intend to modify the os.environ for the current process:
import subprocess, os
my_env = os.environ.copy()
my_env["PATH"] = "/usr/sbin:/sbin:" + my_env["PATH"]
subprocess.Popen(my_command, env=my_env)
That depends on what the issue is. If it's to clone and modify the environment one solution could be:
subprocess.Popen(my_command, env=dict(os.environ, PATH="path"))
But that somewhat depends on the replaced variables being valid Python identifiers, which they most often are (how often do you run into environment variable names that are not alphanumeric plus underscore, or variables that start with a number?).
Otherwise you could write something like:
subprocess.Popen(my_command, env=dict(os.environ,
                                      **{"Not valid python name": "value"}))
In the very odd case (how often do you use control codes or non-ASCII characters in environment variable names?) that the keys of the environment are bytes, you can't (on Python 3) even use that construct.
As you can see, the techniques used here (especially the first) rely on the environment keys normally being valid Python identifiers that are known in advance (at coding time); the second approach has issues otherwise. In cases where that isn't true, you should probably look for another approach.
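One such approach is to copy the environment and update the copy, which works for any key, whether or not it is a valid Python identifier (a minimal sketch; my_command stands in for your command as in the examples above):
import os
import subprocess

env = os.environ.copy()
env["not-a-valid-python-name"] = "value"  # arbitrary key, no identifier restriction
subprocess.Popen(my_command, env=env)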
With Python 3.5 you could do it this way:
import os
import subprocess
my_env = {**os.environ, 'PATH': '/usr/sbin:/sbin:' + os.environ['PATH']}
subprocess.Popen(my_command, env=my_env)
Here we end up with a copy of os.environ with an overridden PATH value.
It was made possible by PEP 448 (Additional Unpacking Generalizations).
Another example. If you have a default environment (i.e. os.environ), and a dict you want to override defaults with, you can express it like this:
my_env = {**os.environ, **dict_with_env_variables}
You might use my_env.get("PATH", '') instead of my_env["PATH"] in case PATH is somehow not defined in the original environment, but other than that it looks fine.
To temporarily set an environment variable without having to copy the os.environ object etc., I do this:
process = subprocess.Popen(['env', 'RSYNC_PASSWORD=foobar', 'rsync',
                            'rsync://username@foobar.com::'], stdout=subprocess.PIPE)
The env parameter accepts a dictionary. You can simply take os.environ, add a key (your desired variable) (to a copy of the dict if you must) to that and use it as a parameter to Popen.
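If you prefer not to rely on the external env command, the same thing can be done purely with the env parameter (a sketch reusing the rsync details from the example above):
import os
import subprocess

env = os.environ.copy()
env['RSYNC_PASSWORD'] = 'foobar'  # visible only to the child process
process = subprocess.Popen(['rsync', 'rsync://username@foobar.com::'],
                           stdout=subprocess.PIPE, env=env)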
I know this has been answered for some time, but there are some points that some may want to know about using PYTHONPATH instead of PATH in their environment variables. I have outlined an explanation of running Python scripts with cron jobs that deals with the modified environment in a different way (found here). I thought it would be of some use for those who, like me, needed just a bit more than this answer provided.
In certain circumstances you may want to only pass down the environment variables your subprocess needs, but I think you've got the right idea in general (that's how I do it too).
This could be a solution:
# Here `env` is assumed to be a dict of values to prepend to the corresponding
# existing variables; everything else in os.environ is passed through unchanged.
new_env = dict([(k, (':'.join([env[k], v]) if k in env else v)) for k, v in os.environ.items()])
Django uses real Python files for settings, Trac uses a .ini file, and some other pieces of software use XML files to hold this information.
Is any one of these approaches blessed by Guido and/or the Python community more than the others?
Depends on the predominant intended audience.
If it is programmers who change the file anyway, just use Python files like settings.py.
If it is end users, then think about .ini files.
As many have said, there is no "offical" way. There are, however, many choices. There was a talk at PyCon this year about many of the available options.
I don't know if this can be considered "official", but it is in the standard library: 14.2. ConfigParser — Configuration file parser.
This is, obviously, not a universal solution, though. Just use whatever feels most appropriate to the task, without any unnecessary complexity (and, especially, without Turing-completeness! Think about automatic or GUI configurators).
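A minimal sketch of reading such a file (using the Python 3 spelling, configparser; in Python 2 the module is named ConfigParser, and the file contents here are made up):
import configparser

# settings.ini might contain:
# [database]
# host = 10.0.0.1
# port = 5432
config = configparser.ConfigParser()
config.read('settings.ini')

host = config.get('database', 'host')
port = config.getint('database', 'port')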
I use a shelf (http://docs.python.org/library/shelve.html):
import shelve

shelf = shelve.open(filename)
shelf["users"] = ["David", "Abraham"]
shelf.sync()  # Save
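Reading the values back later works the same way (reusing the same filename):
import shelve

shelf = shelve.open(filename)
print(shelf["users"])  # ['David', 'Abraham']
shelf.close()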
Just one more option: PyQt. Qt has a platform-independent way of storing settings with the QSettings class. Under the hood, on Windows it uses the registry, and on Linux it stores the settings in a hidden conf file. QSettings works very well and is pretty seamless.
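A minimal sketch with PyQt5 (assuming PyQt5 is installed; the organization and application names are placeholders):
from PyQt5.QtCore import QSettings

settings = QSettings("MyCompany", "MyApp")
settings.setValue("window/width", 800)            # registry on Windows, conf file on Linux
width = int(settings.value("window/width", 640))  # second argument is the default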
There is no blessed solution as far as I know. There is no right or wrong way to store app settings either; .ini, XML, JSON or any other type of file is fine as long as you are comfortable with it. For Python I personally use pypref; it's very easy, cross-platform and straightforward.
pypref is very useful, as one can store static and dynamic settings and preferences ...
from pypref import Preferences
# create singleton preferences instance
pref = Preferences(filename="preferences_test.py")
# create preferences dict
pdict = {'preference 1': 1, 12345: 'I am a number'}
# set preferences. This would automatically create preferences_test.py
# in your home directory. Go and check it.
pref.set_preferences(pdict)
# let's update the preferences. This would automatically update the
# preferences_test.py file; you can verify that.
pref.update_preferences({'preference 1': 2})
# let's get some preferences. This would return the value of the preference
# if it is defined, or the default value if it is not.
print pref.get('preference 1')
# In some cases we must use raw strings. This is most likely needed when
# working with paths on a Windows system or when a preference includes
# special characters. That's how to do it ...
pref.update_preferences({'my path': " r'C:\Users\Me\Desktop' "})
# Sometimes preferences need to change dynamically or to be evaluated in real
# time. This can be done using the dynamic property. In this example a
# password generator preference is set using the uuid module. The dynamic
# dictionary must include the names of all modules that must be imported
# upon evaluating a dynamic preference.
pre = {'password generator': "str(uuid.uuid1())"}
dyn = {'password generator': ['uuid',]}
pref.update_preferences(preferences=pre, dynamic=dyn)
# let's pull the 'password generator' preference twice and notice how the
# passwords are different at every pull
print pref.get('password generator')
print pref.get('password generator')
# those preferences can be accessed later. Let's simulate that by creating
# another Preferences instance, which will automatically detect the
# existence of a preferences file and connect to it
newPref = Preferences(filename="preferences_test.py")
# let's print the 'my path' preference
print newPref.get('my path')
I am not sure that there is an 'official' way (it is not mentioned in the Zen of Python :) ). I tend to use the ConfigParser module myself, and I think you will find that pretty common. I prefer it over the Python file approach because you can write back to it and dynamically reload if you want.
One of the easiest ways is to use the json module.
Save the file as config.json with the details shown below.
Saving data in the json file:
{
    "john" : {
        "number" : "948075049",
        "password" : "thisisit"
    }
}
Reading from json file:
import json

# open the config.json file
with open('config.json') as f:
    mydata = json.load(f)

# Now mydata is a python dictionary
print("username is", mydata.get('john').get('number'), "password is", mydata.get('john').get('password'))
It depends largely on how complicated your configuration is. If you're doing a simple key-value mapping and you want the capability to edit the settings with a text editor, I think ConfigParser is the way to go.
If your settings are complicated and include lists and nested data structures, I'd use XML or JSON and create a configuration editor.
For really complicated things where the end user isn't expected to change the settings much, or is more trusted, just create a set of Python classes and evaluate a Python script to get the configuration.
For web applications I like using OS environment variables: os.environ.get('CONFIG_OPTION')
This works especially well for settings that vary between deploys. You can read more about the rationale behind using env vars here: http://www.12factor.net/config
Of course, this only works for read-only values because changes to the environment are usually not persistent. But if you don't need write access they are a very good solution.
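A small sketch of that pattern, with defaults so the code also runs outside a configured deployment (the variable names are just examples):
import os

DATABASE_URL = os.environ.get('DATABASE_URL', 'sqlite:///local.db')
DEBUG = os.environ.get('DEBUG', 'false').lower() == 'true'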
It is more a matter of convenience. There is no official way per se. But using XML files would make sense, as they can be manipulated by various other applications/libraries.
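For example, a minimal sketch of reading such a file with the standard library's xml.etree.ElementTree (the file layout here is made up):
import xml.etree.ElementTree as ET

# config.xml might look like:
# <config>
#     <database host="10.0.0.1" port="5432"/>
# </config>
tree = ET.parse('config.xml')
db = tree.getroot().find('database')
host = db.get('host')
port = int(db.get('port'))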
Not an official one but this way works well for all my Python projects.
pip install python-settings
Docs here: https://github.com/charlsagente/python-settings
You need a settings.py file with all your defined constants like:
# settings.py
DATABASE_HOST = '10.0.0.1'
Then you need to either set an env variable (export SETTINGS_MODULE=settings) or manually call the configure method:
# something_else.py
from python_settings import settings
from . import settings as my_local_settings
settings.configure(my_local_settings) # configure() receives a python module
The utility also supports lazy initialization for heavy-to-load objects, so when you run your Python project it loads faster, since it only evaluates the settings variable when it's needed.
# settings.py
from python_settings import LazySetting
from my_awesome_library import HeavyInitializationClass # Heavy to initialize object
LAZY_INITIALIZATION = LazySetting(HeavyInitializationClass, "127.0.0.1:4222")
# LazySetting(Class, *args, **kwargs)
Just configure once, and then call your variables wherever needed:
# my_awesome_file.py
from python_settings import settings
print(settings.DATABASE_HOST) # Will print '10.0.0.1'
Why would Guido bless something that is outside his scope? No, there is nothing in particular that is blessed.