I have a configuration of os.environ with default values (covering 90% of my needs). I also have a special application framework package, for example called SALOME, that does not install itself into the system environment and tries to be self-contained. It relies on some older technologies that depend on environment variables, so sys.path and PYTHONPATH are not the only things it needs. I can capture all the variables it needs by reading os.environ inside the environment it creates, and I can then serialize that os.environ dictionary.
How can I merge the os.environ of my currently running system with the one I obtained by serializing?
Let's assume you have done something like the following to serialize the environment:
import json
import os
with open('environ.json', 'w') as f:
    json.dump(dict(**os.environ), f)
You can now read those back like this (in another program):
import json
import os
with open('environ.json', 'r') as f:
    os.environ.update(json.load(f))
This will add any saved variables that are missing and overwrite current values with the saved ones; variables that exist only in the current environment are left untouched.
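To make those merge semantics concrete, here is a minimal sketch; the variable names (KEEP_ME, SALOME_EXTRA) are invented for illustration:

```python
import os

os.environ["KEEP_ME"] = "local-only"          # exists only in the current environment
saved = {"SALOME_EXTRA": "/opt/salome/bin"}   # stands in for the deserialized snapshot

os.environ.update(saved)                      # merge: saved values win on key conflicts

print(os.environ["KEEP_ME"])       # local-only  (untouched variables survive)
print(os.environ["SALOME_EXTRA"])  # /opt/salome/bin  (saved variables are added)
```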
If you want to update only specific variables by appending to them (for instance, to add extra paths), you can do that explicitly:
with open('environ.json', 'r') as f:
    loadedenv = json.load(f)

pathvars = ['PATH', 'PYTHONPATH']
for p in pathvars:
    # guard with a membership test so a variable missing from the
    # current environment does not raise a KeyError
    os.environ[p] = (os.environ[p] + ':' if p in os.environ else '') + loadedenv[p]
You can use the environs package to export the os.environ dictionary. It has a built-in dump() for exporting the parsed environment variables.
from environs import Env
env = Env()
# reading an environment variable
gh_user = env('GITHUB_USER') # => 'sloria'
secret = env('SECRET') # => raises error if not set
# casting
api_key = env.str('API_KEY') # => '123abc'
date = env.date('SHIP_DATE') # => datetime.date(1984, 6, 25)
# serialize to a dictionary of simple types (numbers and strings)
env.dump()
# { 'API_KEY': '123abc',
# 'GITHUB_USER': 'sloria',
# 'SECRET': 'AASJI93WSJD93DWW3X0912NS2',
#   'SHIP_DATE': '1984-06-25'}
If you want multiple values for a single key, which the standard Python dictionary does not offer, you can use
werkzeug.datastructures.MultiDict
os.environ = MultiDict([('Key1', 'First Value'), ('Key1', 'Second Value')])
(Note that replacing os.environ wholesale like this means later changes are no longer synchronized to the real process environment, since the special os._Environ mapping is what calls putenv for you.)
The update will work the same way as shown below.
If you do not need to preserve the old values when merging in the new dictionary, you can do the following.
Since os.environ is a dictionary you already have in memory, the other dict (the one you are reading from a file) needs to be parsed from JSON. I generally use ujson since it is really fast.
os.environ.update(new_dict)
If you want to save the JSON, you can dump it to a file:
import ujson

with open('data.json', 'w') as file:
    ujson.dump(dict(**os.environ), file)
If you want to read the file back and update the os.environ dictionary, you can use:
with open('data.json', 'r') as f:
    os.environ.update(ujson.load(f))
I'm trying to make a thing that can detect whether or not a user wants their password stored for the next time they run the program. The reason I'm using a boolean for this is so the files that need to use the password can check whether storepass is True and, if so, get the user/pass from a .env file; if not, they get the password from the storepasswordquestion file and use that, and it won't get stored when the user closes the program.
I have a file called booleans that I'm using to store booleans; in it is this:
storepass = False
In another file called storepasswordquestion I have:
import booleans

username = 'username'
password = 'password'

question = input('Would you like your password to be stored for use next time?')  # enter y
if question == 'y':
    booleans.storepass = True
    # store password/username in .env
As I understand it, import booleans loads the booleans file. Then I think booleans.storepass is like a variable, but a copy of the one in booleans? Because when I look at the booleans file again it's still False.
I need to change storepass to True in the booleans file and then save it.
Then I think booleans.storepass is like a variable, but a copy of the one in booleans? Because when I look at the booleans file again it's still False.
That's correct: you can't change the values inside a .py file just by importing it and then assigning from another .py file. The standard way to manipulate files is some variation of
with open('boolean.py', 'w') as f:
    f.write('storepass = False')
Personally, I really dislike writing over .py files like this; I usually save as JSON. So "boolean.json" can contain just
{"storepass": false}
and then in your Python code you can (instead of importing) load it with
import json

with open('boolean.json', 'r') as f:
    boolean = json.load(f)
and set storepass with
boolean["storepass"] = True  # boolean is a dict, so use key access, not attribute access

with open('boolean.json', 'w') as f:
    json.dump(boolean, f, indent=4)
    # NOT f.write(boolean)
and this way, if there are more values in boolean, they'll also be preserved (as long as you don't use the variable for anything else in between...)
You're expecting a change to a variable's value to change the source code. Luckily, that's not how this works!
You need to make a mental distinction between the source code that the python interpreter reads, and the values of variables.
What you need is indeed, as you say,
I need to change storepass to True in the booleans file and then save it.
So, you would need to open that booleans.py as a text file, not as a Python program, read and interpret it as pairs of keys and values, modify the right one (storepass in this case), and write the result back to the file. Then, the next time someone imports it, they would see the different setting.
This is a bad approach you've chosen:
Reading and parsing Python files is very hard in general, because, well, Python is a programming language and can do much more than just store settings.
The change only takes effect the next time the program is run and the settings are imported anew; it has no effect on other parts of an already running program. So, that's bad.
Things get a lot easier if you do two things:
Get rid of the idea to store settings in a Python source code file. Instead, use one of the multiple ways that python has to read structured data from a file. For your use case, usage of the configparser module might be easiest – a ConfigParser has a read and a write method, with which you can, you guessed it, read and write your config file. There's multiple other modules that Python brings that can do similar things. The json module, for example, is a sensible way to store especially logically hierarchically structured settings. And of course, the place where you store passwords might also be a place where you could store settings – depending on what that is.
The approach of having a single module that you import (your booleans) is a good one, as it ensures that there's a single booleans that is the "source of truth" about these settings. I propose you put the configuration loading into such a single module:
# this is settings.py
import json

SETTINGSFILE = "settings.json"

# when this is loaded the first time, load things from the settings file
try:
    with open(SETTINGSFILE) as sfile:
        _settings = json.load(sfile)
except FileNotFoundError:
    # no settings file yet. We'll just have empty settings
    # Saving these will create a file
    _settings = {}

def __getattr__(settingname):
    try:
        return _settings[settingname]
    except KeyError as e:
        # you could implement "default settings" by checking
        # in a default settings dictionary that you create above
        # but here, we simply say: nay, that setting doesn't yet
        # exist. Which is as good as your code would have done!
        raise AttributeError(*e.args) from e

def save():
    with open(SETTINGSFILE, "w") as sfile:
        json.dump(_settings, sfile)

def update_setting(settingname, value):
    _settings[settingname] = value
    save()
which you can then use like this:
import settings
…
if settings.storepass:
    "Do what you want here"
…
# User said they want to change the setting:
should_we_store = (input("should we store the password? ").upper() == "Y")

# this updates the setting, i.e. it's instantly visible
# in other parts of the program, and also stores the changed config
# to the settings.json file
settings.update_setting("storepass", should_we_store)
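As an alternative to a JSON-backed settings file, the configparser route mentioned earlier could look roughly like this; the section name ("auth") and file name are assumptions for the demo:

```python
import configparser

# hypothetical section/option names for the demo
config = configparser.ConfigParser()
config["auth"] = {"storepass": "yes"}

# ConfigParser.write() saves the settings as an INI file
with open("settings.ini", "w") as f:
    config.write(f)

# ConfigParser.read() loads it back; getboolean() understands
# yes/no, true/false, on/off and 1/0
config2 = configparser.ConfigParser()
config2.read("settings.ini")
print(config2.getboolean("auth", "storepass"))  # True
```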
You are changing the value inside the runtime, meaning the change really does happen in memory. If you print the value inside the if condition, like this:
booleans.storepass = True
print(booleans.storepass)  # this should print True
But your problem is that you are not actually changing the file. To change the file you can do it like this (this is the general way to write to a file with Python; from my research it should work in your case as well):
f = open("booleans.py", "w")
f.write("storepass = True")  # write the plain name, since the file defines storepass directly
f.close()
In short, what you are doing is true for the runtime, but if you want to store the result in a file, this is the way to do it.
I am currently programming a soundboard. For that I need to store the user-given name of each sound, its path, and a hotkey assigned to it. I also want the user to be able to add sounds whenever they like.
In a YAML document it would look something like this:
sounds:
  sound1:
    path: ...
    name: ...
    hotkey: ...
  sound2:
    path: ...
    name: ...
    hotkey: ...
Sadly I haven't found an easy way to append keys (sound3, sound4, ...) to a YAML file like this.
Is there an easy way to accomplish my goal?
Although it is theoretically possible to append to a YAML document, it is not practical in all but the most controlled environments (length of entries that might cause wrapping, data structure, indentation, etc.). If you are not adding elements to a root-level sequence or keys to a root-level mapping, I would not even try it.
What you should do is load the document into a Python data structure and dump the whole structure back.
If you care about preserving the format, quoting, and comments of the document, you can do this using:
from pathlib import Path
import ruamel.yaml

file_path = Path('sound.yaml')
new_sound = dict(path='/path/to/sound', name='sndname', hotkey='Alt+S')

yaml = ruamel.yaml.YAML()
yaml.indent(mapping=4, sequence=4, offset=2)
yaml.preserve_quotes = True
data = yaml.load(file_path)
sounds = data['sounds']
idx = 0
while True:
    idx += 1
    if (key := f'sound{idx}') not in sounds:
        sounds[key] = new_sound
        break
yaml.dump(data, file_path)
print(file_path.read_text())
which gives:
sounds:
    sound1:
        path: ..
        name: ..
        hotkey: ..
    sound2:
        path: ..
        name: ..
        hotkey: ..
    sound3:
        path: /path/to/sound
        name: sndname
        hotkey: Alt+S
The above searches for the first free key of the form soundNNN and inserts the new data in the structure. If the input would have had quotes around scalar values (i.e. those loaded as Python strings), they would be preserved, as would EOL comments in the YAML document.
You load it into memory, edit it as a Python data structure, then dump it back out to the YAML file:
import yaml

with open("t.yaml") as f:
    data = yaml.load(f, Loader=yaml.SafeLoader)

data['sounds']['sound3'] = {'path': '...', 'name': '...', 'hotkey': '...'}

with open("t.yaml", "w") as f:
    yaml.dump(data, f)
This seems like it should be simple enough, but I haven't been able to find a working example of how to approach it. Simply put, I am generating a JSON file based on a list that a script generates. What I would like to do is use some variables in the dump() call so that the JSON file is produced in a specific folder. By default it is of course dumped into the same place as the .py file, but I can't seem to find a way to run the .py file and have the JSON file produced in a new folder of my choice:
import json
name = 'Best'
season = '2019-2020'
blah = ['steve','martin']
with open(season + '.json', 'w') as json_file:
    json.dump(blah, json_file)
Take for example the above. What I'd want to do is the following:
Take the variable 'name' and use it to create a folder of the same name inside the folder the .py file itself is in. The JSON file would then be placed in that folder, where I can manipulate it.
Right now my issue is that I can't find a way to produce the file in a specific folder. Any suggestions? This does seem simple enough, but nothing I've found has a method to do this. Thanks!
Python's pathlib is quite convenient to use for this task:
import json
from pathlib import Path
data = ['steve','martin']
season = '2019-2020'
Paths of the new directory and json file:
base = Path('Best')
jsonpath = base / (season + ".json")
Create the directory if it does not exist and write json file:
base.mkdir(exist_ok=True)
jsonpath.write_text(json.dumps(data))
This will create the directory relative to the directory you started the script in. If you want an absolute path, you can use Path('/somewhere/Best').
If you want to start the script while being in some other directory and still create the new directory next to the script itself, use: Path(__file__).resolve().parent / 'Best'.
First of all, instead of doing everything in one place, have a separate function that creates the folder (if not already present) and dumps the JSON data, as below:
import json
import os

def write_json(target_path, target_file, data):
    if not os.path.exists(target_path):
        try:
            os.makedirs(target_path)
        except Exception as e:
            print(e)
            raise
    with open(os.path.join(target_path, target_file), 'w') as f:
        json.dump(data, f)
Then call your function like:
write_json('/usr/home/target', 'my_json.json', my_json_data)
Use string formatting:
import json
import os
name = 'Best'
season = '2019-2020'
blah = ['steve','martin']
try:
    os.mkdir(name)
except OSError as error:
    print(error)

with open("{}/{}.json".format(name, season), 'w') as json_file:
    json.dump(blah, json_file)
Use os.path.join():
with open(os.path.join(name, season + '.json'), 'w') as json_file:
    json.dump(blah, json_file)
The advantage over writing a literal slash is that os.path.join automatically picks the right separator for the operating system you are on (slash on Linux, backslash on Windows).
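For instance, with the name and season variables from the question:

```python
import os.path

name = 'Best'
season = '2019-2020'

# os.path.join inserts the separator that matches the current OS
path = os.path.join(name, season + '.json')
print(path)  # e.g. Best/2019-2020.json on Linux, Best\2019-2020.json on Windows
```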
Does anyone know of a way to name a list in Python using a string? I am writing a script that iterates through a directory, parses each file, and generates lists with the contents of each file. I would like to use the filename to name each list. I was wondering if there was a way to do it similar to the exec() method, but using lists instead of just a normal variable.
If you really want to do it this way, then for instance:
import os
directory = os.getcwd() # current directory or any other you would like to specify
for name in os.listdir(directory):
    globals()[name] = []
Each of the lists can now be referenced with the name of the file. Of course, this is a suboptimal approach; normally you should use other data structures, such as dictionaries, to perform your task.
You would be better off using a dictionary. Store the file name as the key and place the file's contents in the corresponding value.
It's like
my_dict = {'file1.txt':'This is the contents of file1','file2.txt':'This is the content of file2'}
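A fuller sketch of that idea; the file names and contents are invented for the demo:

```python
import os

# create two demo files, then gather every .txt file in the current
# directory into a dict keyed by filename
for fname, text in [('file1.txt', 'This is the contents of file1'),
                    ('file2.txt', 'This is the content of file2')]:
    with open(fname, 'w') as f:
        f.write(text)

my_dict = {fname: open(fname).read()
           for fname in os.listdir('.') if fname.endswith('.txt')}

print(my_dict['file1.txt'])  # This is the contents of file1
```

Looking up a file's contents is then just a dictionary access, with no dynamically named variables involved.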
I need to create an unknown number of Python variables, based on a list of files in a folder.
I found that I could use the global dictionary to create and initialize those variables:
# libraries import
import os.path
import glob
import numpy as np
# list of all the text files in the folder
list = glob.glob("*.txt")
# creation of the variables based on the name of each file
for file in list:
    shortname = os.path.splitext(file)[0]
    globals()[shortname] = np.loadtxt(file)
However, I was wondering whether accessing the globals dictionary for variable assignment is good practice in Python (when we do not know the number and names of the variables in advance), or whether there is a preferable alternative.
You should use a dedicated dictionary for this:
files = {f: np.loadtxt(f) for f in glob.glob("*.txt")}
Generally, you should not mix data with variable or attribute names. Your code could shadow just about any Python built-in if a file with the same name exists.
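A quick demonstration of that shadowing hazard, simulating what would happen if the folder contained a file named list.txt:

```python
# globals()['list'] = np.loadtxt('list.txt') would do exactly this:
globals()['list'] = [1.0, 2.0, 3.0]

try:
    list('abc')          # the built-in is gone: a list object is not callable
    shadowed = False
except TypeError:
    shadowed = True

print(shadowed)  # True
```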
No, you probably shouldn't be using globals for this. Instead, create a dictionary or class and store the values in that.