How do I read a conf.py setting in a Sphinx extension node? - python

from docutils.parsers.rst.directives.images import Figure

class MyFigure(Figure):
    def run(self):
        # here I need to read the 'thumbnails_folder' setting
        pass

def setup(app):
    app.add_config_value('thumbnails_folder', '_thumbnails', 'env')
How can I access the config value in .run()? I read the sources of sphinx-contrib extensions, but I didn't see it done my way; they access conf.py in a way I can't. Or should I do it in a different manner?
All I want to do is translate this
.. figure:: image/image.jpg
into this:
.. image:: image/thumbnails/image.jpg
   :target: image/image.jpg
Here's the extension code (the thumbnail is generated with PIL). I also want to put the :target: into the downloadable files (as I see it, only builder instances can do this).

The build environment holds a reference to the Config object. Configuration variables can be retrieved from this object:
def run(self):
    env = self.state.document.settings.env  # sphinx.environment.BuildEnvironment
    config = env.config                     # sphinx.config.Config
    folder = config["thumbnails_folder"]
    ...
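Putting the two together for the original goal (a thumbnail that links to the full-size image), a rough sketch could look like the following. The node manipulation and the image/<thumbnails_folder>/ path layout are my own illustration, not code from the thread, and the PIL thumbnail generation is left out:

import os

from docutils import nodes
from docutils.parsers.rst.directives.images import Figure


class MyFigure(Figure):
    def run(self):
        env = self.state.document.settings.env    # sphinx.environment.BuildEnvironment
        folder = env.config["thumbnails_folder"]  # the value declared in setup()

        result = Figure.run(self)                 # let the stock Figure build its nodes
        for node in result:
            for image in node.traverse(nodes.image):
                original = image['uri']
                # hypothetical rewrite: image/image.jpg -> image/<thumbnails_folder>/image.jpg
                image['uri'] = os.path.join(
                    os.path.dirname(original), folder, os.path.basename(original))
                # wrap the thumbnail in a reference so it links to the original file,
                # i.e. the :target: from the desired output
                reference = nodes.reference('', '', image.deepcopy(), refuri=original)
                image.replace_self(reference)
        return result

Copying the full-size file into the build output so the :target: actually resolves is a separate step; as the question notes, that part belongs to the builder.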

Related

What is the best way to allow users to configure a Python package

I have a situation like this. I am creating a Python package. That Python package needs to use Redis, so I want to allow the user of the package to define the Redis URL.
Here's how I attempted to do it:
bin/main.py

from logging import DEBUG, basicConfig

from my_package.main import run
from my_package.config import config

basicConfig(filename='logs.log', level=DEBUG)

# the user defines the redis url
config['redis_url'] = 'redis://localhost:6379/0'

run()
my_package/config.py

config = {
    "redis_url": None
}
my_package/main.py

from .config import config

def run():
    print(config["redis_url"])  # prints None instead of what I want
Unfortunately, it doesn't work. In main.py the value of config["redis_url"] is None instead of the URL defined in the bin/main.py file. Why is that? How can I make it work?
I could pass the config to the run() function, but then if I run some other function I would need to pass the config to that function as well. Ideally I'd like to set it just once.
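A minimal sketch of the usual fix, assuming both modules really import the same my_package.config module (the configure() helper is my own addition, not part of the question):

# my_package/config.py  (sketch, not the original file contents)
config = {"redis_url": None}


def configure(**overrides):
    # Mutate the shared dict in place. Rebinding `config = {...}` anywhere
    # would create a new object that other modules never see.
    config.update(overrides)

bin/main.py would then call configure(redis_url='redis://localhost:6379/0') before run(); because main.py reads config["redis_url"] at call time rather than at import time, it sees the updated value.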

Use project config variables across different Python scripts

I am working on a project with multiple directories, each containing a number of Python scripts, and it involves the use of certain key parameters that I pass in via a YAML config file.
Currently the method used is naive (I'd say): the YAML is simply parsed into a Python dictionary, which is then imported in other scripts where the values are accessed.
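For reference, the naive setup being described presumably looks something like this (my own reconstruction; the file names are assumptions):

# settings.py -- parse the YAML once at import time into a plain dict
import yaml

with open("config.yaml") as fh:   # hypothetical config file name
    CONFIG = yaml.safe_load(fh)

# every other script then does `from settings import CONFIG` and reads values from it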
From what I could find, there is:
the Abseil library, which can be used for accessing flags across different scripts, but using it is cumbersome;
another approach using a class (preferably a singleton), putting all global variables in it and exposing an instance of that class in other scripts.
I wanted to ask: is there any other library that can be used for this purpose? And what is the most Pythonic methodology to deal with it?
Any help would be really appreciated. Thanks!
To make global values accessible across modules I use the singleton class method.
The code I list below is also in my GitHub Gist: https://gist.github.com/auphofBSF/278206afff675cd30377f4894a5b2b1d
My generic GlobalValues singleton class and its usage are as follows. This class is located in a subdirectory below the main script. In the usage example further down, I place the GlobalValues class in a file globals.py in the folder myClasses.
class GlobalValues:
    """
    A singleton class to serve the GlobalValues.

    USAGE (first time):
        from myClasses.globals import GlobalValues
        global_values = GlobalValues()
        global_values.<new value> = ...
        ... = global_values.<value>

    USAGE (second and n'th time, in the same module or in other modules):
        NB: adjust `from myClasses.globals` depending on the relative path to this module
        from myClasses.globals import GlobalValues
        global_values = GlobalValues.get_instance()
        global_values.<new value> = ...
        ... = global_values.<value>
    """
    __instance = None
    DEFAULT_LOG_LEVEL = "CRITICAL"

    @staticmethod
    def get_instance():
        """Static access method."""
        if GlobalValues.__instance is None:
            GlobalValues()
        return GlobalValues.__instance

    def __init__(self):
        """Virtually private constructor."""
        if GlobalValues.__instance is not None:
            raise Exception("This class is a singleton! Once created, use global_values = GlobalValues.get_instance()")
        else:
            GlobalValues.__instance = self
My example of use is as follows.
Example File layout
<exampleRootDir>
    Example_GlobalValues_Main.py    # THIS is the main
    myClasses                       # a folder
        globals.py                  # for the singleton class GlobalValues
        exampleSubModule.py         # demonstrates use in submodules
Example_GlobalValues_Main.py
print(
    """
    ----------------------------------------------------------
    Example of using a singleton Class as a Global value store
    The files in this example are in these folders
    file structure:
    <exampleRootDir>
        Example_GlobalValues_Main.py    # THIS is the main
        myClasses                       # a folder
            globals.py                  # for the singleton class GlobalValues
            exampleSubModule.py         # demonstrates use in submodules
    -----------------------------------------------------------
    """
)
from myClasses.globals import GlobalValues

globalvalues = GlobalValues()  # the only place an instance of GlobalValues is created

print(f"MAIN: global DEFAULT_LOG_LEVEL is {globalvalues.DEFAULT_LOG_LEVEL}")
globalvalues.DEFAULT_LOG_LEVEL = "DEBUG"
print(f"MAIN: global DEFAULT_LOG_LEVEL is now {globalvalues.DEFAULT_LOG_LEVEL}")

# add a new global value:
globalvalues.NEW_VALUE = "hello"

# demonstrate using global values in another module
from myClasses import exampleSubModule

print(f"MAIN: globalvalues after opening exampleSubModule are now {vars(globalvalues)}")
print("----------------- Completed -------------------------------")
exampleSubModule.py is as follows and is located in the myClasses folder
"""
Example SubModule using the GlobalValues Singleton Class
"""
# observe where the globals module is in relation to this module . = same directory
from .globals import GlobalValues
# get the singleton instance of GlobalValues, cannot instantiate a new instance
exampleSubModule_globalvalues = GlobalValues.get_instance()
print(f"exampleSubModule: values in GlobalValues are: {vars(exampleSubModule_globalvalues)}")
#Change a value
exampleSubModule_globalvalues.NEW_VALUE = "greetings from exampleSubModule"
#add a new value
exampleSubModule_globalvalues.SUBMODULE = "exampleSubModule"

Sharing Variables across new object instances

Background:
I am writing a module in order to set up an embedded system. In this context I need to load some modules and perform some system settings.
Context:
I have a parent class holding some general code (load the config file, build the SSH connection, etc.) used by several child classes. One of them is the module class that sets up the module and therefore uses, among other things, the SSH connection and the configuration file.
My goal is to share the configuration file and the connection with the next module that will be set up. For the connection it's just a waste to build and destroy it all the time, but for the configuration file, changes during setup can lead to undefined behavior.
Research / approaches:
1. I tried using class variables; however, they aren't passed when initiating a new module object.
2. Further, I tried using global variables, but since the parent class and the child classes are in different files, this won't work (yes, I could put them all in one file, but that would be a mess). Using a getter function from the file where I defined the global variable didn't work either.
3. I am aware of the 'builtin' solution from How to make a cross-module variable?, but feel this would be a bit of overkill...
4. Finally, I could keep the config file and the connection in a central script and pass them to each of the instances, but this would lead to loads of dependencies and I don't think it's a good solution.
So here is a bit of code with an example method to get some file paths. The code is set up according to approach 1 (class variables).
An example config file:
Files:
    Core:
        Backend:
            - 'file_1'
            - 'file_2'
Local:
    File_path:
        Backend: '/the/path/to'
The Parent class in setup_class.py
import os
import abc

import yaml


class setup(object):
    __metaclass__ = abc.ABCMeta
    configuration = []

    def check_for_configuration(self, config_file):
        if not self.configuration:
            with open(config_file, "r") as config:
                self.configuration = yaml.safe_load(config)

    def get_configuration(self):
        return self.configuration

    def _make_file_list(self, path, names):
        full_file_path = []
        for file_name in names:
            temp_path = path + '/' + file_name
            temp_path = temp_path.split('/')
            full_file_path.append(os.path.join(*temp_path))
        return full_file_path

    @abc.abstractmethod
    def install(self):
        raise NotImplementedError
The module class in module_class.py
from setup_class import setup


class module(setup):
    def __init__(self, module_name, config_file=''):
        self.name = module_name
        self.check_for_configuration(config_file)

    def install(self):
        self._backend()

    def _backend(self):
        files = self._make_file_list(
            self.configuration['Local']['File_path']['Backend'],
            self.configuration['Files'][self.name]['Backend'])
        if files:
            print(files)
And finally a test script:
from module_class import module

Analysis = module('Analysis', "./example.yml")
Core = module('Core', "./example.yml")
Core.install()
Now, when running the code, the config file is loaded every time a new module object is instantiated. I would like to avoid this. Are there approaches I have not considered? What's the neatest way to achieve this?
Save your global values in a global dict, and refer to that inside your module.
cache = {}


class Cache(object):
    def __init__(self):
        global cache
        self.cache = cache
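Applied to the setup class above, a minimal sketch (my adaptation of that idea, not the answerer's code) caches the parsed YAML in a class-level dict, so the file is read only once no matter how many module instances are created:

# sketch: cache the configuration on the class, not on each instance
import yaml


class setup(object):
    _config_cache = {}                       # shared by all instances and subclasses

    def check_for_configuration(self, config_file):
        if not setup._config_cache:          # only hit the disk on the first call
            with open(config_file, "r") as config:
                setup._config_cache.update(yaml.safe_load(config))
        self.configuration = setup._config_cache

The key difference from the original check_for_configuration() is that the loaded data is stored on the class rather than on the instance, so a second module('Core', ...) finds the cache already populated.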

Django autoreload: add watched file

When source files in my project change, the Django development server reloads. I want to extend this to non-Python source files. I use native SQL queries, which are stored in separate files (e.g. big_select.sql), and I want the server to reload when these files change.
I use Django on Windows.
I have tried adding a .py extension to the files, which didn't work.
Django>=2.2
The autoreloading was given a major overhaul (thanks to @Glenn, who notified about the incoming changes in this comment!), so one no longer has to use undocumented Django internals and append files to _cached_filenames. Instead, register a custom signal listener that listens for the autoreloader start:
# apps.py
from django.apps import AppConfig
from django.utils.autoreload import autoreload_started


def my_watchdog(sender, **kwargs):
    sender.watch_file('/tmp/foo.bar')
    # to listen to multiple files, use watch_dir, e.g.
    # sender.watch_dir('/tmp/', '*.bar')


class EggsConfig(AppConfig):
    name = 'eggs'

    def ready(self):
        autoreload_started.connect(my_watchdog)
Django<2.2
Django stores the watched file paths in the django.utils.autoreload._cached_filenames list, so adding items to it or removing them will force Django to start or stop watching those files.
As for your problem, this is a (kind of hacky) solution. For demo purposes, I adapted apps.py so the file starts being watched right after Django initializes, but feel free to put the code wherever you want. First of all, create the file, as Django can only watch files that already exist:
$ touch /tmp/foo.bar
In your Django app:
# apps.py
from django.apps import AppConfig
...
import django.utils.autoreload


class MyAppConfig(AppConfig):
    name = 'myapp'

    def ready(self):
        ...
        django.utils.autoreload._cached_filenames.append('/tmp/foo.bar')
Now start the server, in another console modify the watched file:
$ echo baz >> /tmp/foo.bar
The server should trigger an autoreload now.
The accepted answer did not work for me in Django 3.0.7, probably due to changes since then. I came up with the following after going through django.utils.autoreload:
import os
from pathlib import Path

from django.utils.autoreload import autoreload_started


# Watch .conf files
def watch_extra_files(sender, *args, **kwargs):
    watch = sender.extra_files.add
    # List of file paths to watch
    watch_list = [
        FILE1,
        FILE2,
        FILE3,
        FILE4,
    ]
    for file in watch_list:
        if os.path.exists(file):  # personal use case
            watch(Path(file))


autoreload_started.connect(watch_extra_files)

How do I access ${buildout:directory} from Python code?

I have a Pyramid web application managed with zc.buildout. In it, I need to read a file on disk, which is located in a sub-directory of the buildout directory.
The problem is determining the path to the file: I do not want to hard-code an absolute path, and just providing a relative path does not work when serving the app in production (presumably because the working directory is different).
So the promising "hooks" I am thinking about are:
the "root" buildout directory, which I can address in buildout.cfg as ${buildout:directory} - however, I can't figure out how I can "export" it so it can be accessed by the Python code
the location of the Paster's .ini file which starts the app
Like @MartijnPieters suggests in a comment on your own answer, I'd use collective.recipe.template to generate an entry in the .ini. I wondered myself how I could then access that data in my project, so I worked it out :-)
Let's work our way backwards to what you need. First in your view code where you want the buildout directory:
def your_view(request):
    buildout_dir = request.registry.settings['buildout_dir']
    ....
request.registry.settings (see documentation) is a "dictionary-like deployment settings object". See deployment settings: that is the **settings that gets passed into your main method, as in def main(global_config, **settings).
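To make that concrete, here is a minimal sketch (names assumed, not code from the answer) of a main() entry point receiving the value from [app:main]:

from pyramid.config import Configurator


def main(global_config, **settings):
    # 'buildout_dir' arrives here because it is a key in the [app:main] section
    buildout_dir = settings['buildout_dir']
    config = Configurator(settings=settings)
    # the same settings dict is what views later see as request.registry.settings
    return config.make_wsgi_app()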
Those settings are what's in the [app:main] part of your deployment.ini or production.ini file. So add the buildout directory there:
[app:main]
use = egg:your_app
buildout_dir = /home/you/wherever/it/is
pyramid.reload_templates = true
pyramid.debug_authorization = false
...
But, and this is the last step, you don't want that hardcoded path in there. So generate the .ini with a template. The template development.ini.in uses a ${partname:variable} expansion language; in your case you need ${buildout:directory}:
[app:main]
use = egg:your_app
buildout_dir = ${buildout:directory}
#              ^^^^^^^^^^^^^^^^^^^^^
pyramid.reload_templates = true
pyramid.debug_authorization = false
...
Add a buildout part in buildout.cfg to generate development.ini from development.ini.in:
[buildout]
...
parts =
    ...
    inifile
...

[inifile]
recipe = collective.recipe.template
input = ${buildout:directory}/development.ini.in
output = ${buildout:directory}/development.ini
Note that you can do all sorts of cool stuff with collective.recipe.template. ${serverconfig:portnumber} to generate a matching port number in your production.ini and in your your_site_name.nginx.conf, for instance. Have fun!
If the path to the file relative to the buildout root or to the location of paster.ini is always the same, which it seems to be from your question, you could set it in paster.ini:
[app:main]
...
config_file = %(here)s/path/to/file.txt
Then access it from the registry as in Reinout's answer:
def your_view(request):
    config_file = request.registry.settings['config_file']
Here's a rather clumsy solution I've devised:
In buildout.cfg I used the extra-paths option of zc.recipe.egg to add the buildout directory to sys.path:
....
[webserver]
recipe = zc.recipe.egg:scripts
eggs = ${buildout:eggs}
extra-paths = ${buildout:directory}
then I put a file called app_config.py into the buildout directory:
# This remembers the root of the installation (similar to ${buildout:directory})
# so we can import it and use it where we need access to the filesystem.
# Note: we could use os.getcwd() for that, but it feels kinda wonky.
# This is not directly related to Celery; we may want to move it somewhere else.
import os.path

INSTALLATION_ROOT = os.path.dirname(__file__)
Now we can import it in our Python code:
import os.path
from app_config import INSTALLATION_ROOT

filename = os.path.join(INSTALLATION_ROOT, "somefile.ext")
do_stuff_with_file(filename)
If anyone knows a nicer solution you're welcome :)
