My conf directory looks like:
conf
- hydra
  - run
    - dir
      - job_timestamp.yaml
- main.yaml
in main.yaml I am trying to overwrite the hydra output directory with a custom structure as defined in job_timestamp.yaml:
dir: ./outputs/${status}/${now:%Y-%m-%d}/${now:%H-%M-%S}
However, writing:
hydra:
  run:
    dir: job_timestamp
in main.yaml doesn't reference the hydra/run/dir/job_timestamp.yaml file I have - it actually just makes the output folder name "job_timestamp"
I was wondering how I would reference my job_timestamp.yaml file in main.yaml to overwrite hydra's output directory config.
EDIT:
I originally had hydra/run/dir: job_timestamp under defaults in main.yaml; however, it failed with an error saying hydra/run/dir was not found in the defaults list. I am planning to have different versions of main.yaml (e.g. main2.yaml, evaluate.yaml, etc.) which would all need to override the output directory with the same format. Is there a way to do this so that it stays DRY?
You seem to be confusing basic concepts in Hydra like config values and config groups.
You do not "reference" a config file from a config value, only from the defaults list.
Go back to the basic tutorial and make sure you understand config groups.
Hydra-specific config groups are defined inside the Hydra package. You may want to override Hydra configs via config groups, but I only suggest you attempt that after you have a better understanding of config groups.
In the meantime, you can do it by simply overriding it directly in your main.yaml file:
main.yaml:
hydra:
  run:
    dir: outputs/${status}/${now:%Y-%m-%d}/${now:%H-%M-%S}
I'm forced to keep my .env file in a non-standard path outside the root of my project (in a separate directory altogether).
Let's say I have my Django project in /var/projects/my_project, though I have my .env file in /opt/envs/my-project/.env where my SECRET_KEY is stored. In my settings.py file, I'd like to explicitly use the .env file at that path so that I can still do this:
from decouple import config
secret_key = config('SECRET_KEY')
I figured it out.
Instead of importing decouple.config and doing the usual config('FOOBAR'), create a new decouple.Config object using RepositoryEnv('/path/to/env-file').
from decouple import Config, RepositoryEnv
DOTENV_FILE = '/opt/envs/my-project/.env'
env_config = Config(RepositoryEnv(DOTENV_FILE))
# use env_config.get() the way you would normally use config(), since
# decouple.config calls the same Config.get() internally,
# i.e. config('SECRET_KEY') becomes env_config.get('SECRET_KEY')
SECRET_KEY = env_config.get('SECRET_KEY')
Hopefully this helps someone.
If you look at the decouple implementation, config is just a pre-instantiated AutoConfig:
config = AutoConfig()
But AutoConfig takes an optional search_path argument, so we can do the following:
from decouple import AutoConfig
config = AutoConfig(search_path='/opt/envs/my-project')
Then you can do as usual:
secret_key = config('SECRET_KEY')
Now, django-decouple==2.1 supports having settings.ini and .env files in any parent directory of the project dir.
(The old methods no longer work: from decouple import Config, RepositoryEnv does not work, and AutoConfig does not take search_path as a parameter.)
This is convenient because you would want to keep the settings.ini in the project folder on your local machine, but have clean checkouts on the staging/prod server, so there the settings.ini is better located outside the project folder.
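With that behavior, a minimal sketch of the settings file (assuming a recent decouple version and reusing the SECRET_KEY name from the question; the file just has to sit in some parent directory of the project) looks like:

# settings.py - a sketch; decouple's config helper searches parent
# directories for settings.ini/.env, so no explicit path is needed
from decouple import config

SECRET_KEY = config('SECRET_KEY')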
I have the following directory structure for a program I'm writing in python:
\code\
    main.py
    config.py
    \module_folder1\
        script1.1.py
\data\
    data_file1
    data_file2
My config.py is a set of global variables that are set by the user, or generally fixed all the time. In particular config.py defines path variables to the 2 data files, something like path1 = os.path.abspath("../data/data_file1"). The primary use is to run main.py which imports config (and the other modules I wrote) and all is good.
But sometimes I need to run script1.1.py by itself. Ok, no problem. I can add to script1.1 the usual if __name__ == '__main__': and I can import config. But then I get path1 = "../code/data/data_file1" which doesn't exist. I thought that since the path is created in config.py the path would be relative to where config.py lives, but it's not.
So the question is, how can I have a central config file which defines relative paths, so I can import the config file to scripts in different directories and have the paths still be correct?
I should mention that the code repo will be shared among multiple machines, so hardcoding an absolute path is not an option.
- You know the correct relative path to the file from the directory where config.py is located.
- You know the correct relative path to the directory where config.py is located (in your case, ..).

Both of these things are system-independent and do not change unless you change the structure of your project. Just add them together using os.path.join('..', config.path_relative_to_config).
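A minimal sketch of that suggestion from inside script1.1.py (the attribute name path_relative_to_config is hypothetical, and it only works if the script is launched with module_folder1 as the current working directory, since the joined path is still resolved against the working directory):

# script1.1.py - a sketch of the join-two-relative-paths idea above;
# it assumes the current working directory is module_folder1
import os
import sys

sys.path.insert(0, '..')   # make config.py importable from the code directory
import config

# config.path_relative_to_config is a hypothetical attribute holding the
# path relative to config.py's directory, e.g. '../data/data_file1';
# '..' gets us from here back to that directory first
path1 = os.path.join('..', config.path_relative_to_config)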
(Not sure who posted this as a comment, then deleted it, but it seems to work so I'm posting as an answer.) The trick is to use os.path.dirname(__file__) in the config file, which gives the directory of the config file (/code/) regardless of where the script that imports config is.
Specifically to answer the question, in the config file define
path1 = os.path.abspath(os.path.join(os.path.dirname(__file__), '..', 'data', 'data_file1'))
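The same technique reads a little cleaner with pathlib (a sketch; it anchors the path to config.py's location exactly as above):

# config.py - pathlib variant of the line above
from pathlib import Path

CONFIG_DIR = Path(__file__).resolve().parent             # .../code
path1 = str(CONFIG_DIR.parent / 'data' / 'data_file1')   # .../data/data_file1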
We are working on an add-on that writes to a log file and we need to figure out where the default var/log directory is located (the value of the ${buildout:directory} variable).
Is there an easy way to accomplish this?
In the past I had a similar use case.
I solved it by declaring the path inside the zope.conf:
zope-conf-additional +=
    <product-config pd.prenotazioni>
        logfile ${buildout:directory}/var/log/prenotazioni.log
    </product-config>
See the README of this product:
https://github.com/PloneGov-IT/pd.prenotazioni/
This zope configuration can then be interpreted with this code:
from App.config import getConfiguration
product_config = getattr(getConfiguration(), 'product_config', {})
config = product_config.get('pd.prenotazioni', {})
logfile = config.get('logfile')
See the full example here: https://github.com/PloneGov-IT/pd.prenotazioni/blob/9a32dc6d2863b5bfb5843d441e652101406d9a2c/pd/prenotazioni/init.py#L17
Worth noting is that the initial return avoids logging things multiple times if the init function is mistakenly called more than once.
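A minimal sketch of that pattern (not the linked add-on's actual code; the logger name and format are just illustrative):

# a sketch: wire the configured logfile into a logger while guarding
# against double initialization
import logging

logger = logging.getLogger('pd.prenotazioni')

def init_logging(logfile):
    if logger.handlers:
        # already initialized: returning early avoids adding a second
        # handler and writing every record twice
        return
    handler = logging.FileHandler(logfile)
    handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)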
Anyway, if you do not want to play with buildout and custom zope configuration, you may want to get the default event log location.
It is specified in the zope.conf. You should have something like this:
<eventlog>
  level INFO
  <logfile>
    path /path/to/plone/var/log/instance.log
    level INFO
  </logfile>
</eventlog>
I was able to obtain the path with this code:
from App.config import getConfiguration
import os
eventlog = getConfiguration().eventlog
logpath = eventlog.handler_factories[0].instance.baseFilename
logfolder = os.path.split(logpath)[0]
Probably, by looking at the App module code, you will find a more straightforward way of getting this value.
Another possible (IMHO weaker) solution would be to store (through buildout or your preferred method) the logfile path in an environment variable.
You could let buildout set it in parts/instance/etc/zope.conf in an environment variable:
[instance]
recipe = plone.recipe.zope2instance
environment-vars =
    BUILDOUT_DIRECTORY ${buildout:directory}
Check it in Python code with:
import os
buildout_directory = os.environ.get('BUILDOUT_DIRECTORY', '')
By default you already have the INSTANCE_HOME environment variable, which might be enough.
I have a following directory structure:
parentDirectory
- Task
  - pythonFile1.py
- Configs
  - Config1.py
  - Config2.py
The configuration files have a few configuration constants defined inside a class.
Now, I want to import the configuration files from Configs directory into the python file under Tasks directory and make use of constants defined inside the class in each config file.
I tried adding (after reading through a few answers):
sys.path.insert(0, '/home/MyName/parentDirectory/Tasks')
inside the config files.
Since I am new to python, I don't know to what extent I am correct in adding the above lines.
Please help!
I think you got it backward: if you want your "Tasks" to be able to import your "Configs", you need to add code to the Tasks to insert the Configs path into sys.path.
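A minimal sketch of that, placed in pythonFile1.py (the class and constant names inside Config1.py are assumptions):

# pythonFile1.py - a sketch; put the sibling Configs directory on
# sys.path relative to this file, then import the config module
import os
import sys

CONFIGS_DIR = os.path.abspath(
    os.path.join(os.path.dirname(__file__), '..', 'Configs'))
sys.path.insert(0, CONFIGS_DIR)

import Config1  # resolvable now that Configs is on sys.path

# the constants live inside a class, per the question; names are assumed
# print(Config1.SomeConfigClass.SOME_CONSTANT)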
I've looked around the docs on the pytest website, but haven't found a clear example of working with 'test resources', such as reading in fixed files during unit tests. Something similar to what http://jlorenzen.blogspot.com/2007/06/proper-way-to-access-file-resources-in.html describes for Java.
For example, if I have a yaml file checked in to source control, what is the right way to write a test which loads from that file? I think this boils down to understanding the right way to access a 'resource file' on the python equivalent of the classpath (PYTHONPATH?).
This seems like it should be simple. Is there an easy solution?
Perhaps what you are looking for is pkg_resources or pkgutil. For example, if you have a module within your python source called "resources", you could read your "resourcefile" using:
import pkg_resources

with open(pkg_resources.resource_filename("resources", "resourcefile")) as infile:
    for line in infile:
        print(line)
or:
import pkgutil
import tempfile

with tempfile.TemporaryFile() as outfile:
    outfile.write(pkgutil.get_data("resources", "resourcefile"))
The latter even works when your "script" is an executable zip file. The former works without needing to unpack your resources from an egg.
Note that creating a subdirectory of your source does not make it a module. You need to add a file named __init__.py within the directory for it to be visible as a module for the purposes of pkg_resources and pkgutil. __init__.py can be empty.
I think "resource file" is whatever definition you give to it in python (in Java, resource files can be bundled into jar files with ordinary Java classes, and Java provides library functions to access this information).
An equivalent solution might be to access the PYTHONPATH environment variable, define your "resource file" as a relative path, and then troll the PYTHONPATH looking for it. Here's an example:
import os

def find_resource(file_relative_path):
    # return the full path of the first match found on the PYTHONPATH
    pythonpath = os.environ['PYTHONPATH']
    for directory in pythonpath.split(os.pathsep):
        resource_path = os.path.join(directory, file_relative_path)
        if os.path.exists(resource_path):
            return resource_path

resource_path = find_resource(os.path.join('subdir', 'resourcefile'))  # e.g. subdir/resourcefile
This code snippet returns a full path for the first file that exists on the PYTHONPATH.
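For the original yaml use case, another common pattern (not from the answers above) is to resolve the fixture relative to the test file itself; a sketch, assuming PyYAML is installed and a hypothetical resources/config.yaml stored next to the test:

# test_config.py - a sketch: locate the checked-in yaml fixture relative
# to this test file, so it loads regardless of the working directory
import os
import yaml  # assumes PyYAML

HERE = os.path.dirname(os.path.abspath(__file__))

def test_loads_yaml_fixture():
    fixture_path = os.path.join(HERE, 'resources', 'config.yaml')  # hypothetical location
    with open(fixture_path) as f:
        data = yaml.safe_load(f)
    assert data is not None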