Share 50+ constants between modules in a Pythonic way

I understand this is not a 'coding' question but I need a coding solution.
I have multiple packages I wrote, all meant to be encapsulated and independent of external parameters other than a few input arguments.
On the other side, I have a general file constants.py with 50+ constants which those packages should use in order to produce an output dictionary without hardcoded names:
A package output:
{
    'sub_name': xyz,
    'sub_type': yzg
}
Here sub_name should be given to the package as input, so the general program will know what to do with the sub_name output.
How should I share constants.py with the packages?
The obvious way is to just import constants.py, which makes the package dependent on an external file somewhere else in the program.
The other way is to keep the constants in some class Keys and pass it as an argument.
Could/should I pass constants.py as an argument?
I find it hard to understand how packages should be written and organized inside a larger project, in a way that lets other devs reuse them independently.

You can store constants in __init__.py and import them in submodules.
Example:
main_module/__init__.py:
# inside this file
CONSTANT_A = 42
CONSTANT_B = 69
then, in main_module/submodule.py:
# inside this file
from main_module import CONSTANT_A
print(CONSTANT_A)
# 42
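The question also asks whether the constants could be passed in as an argument instead of imported. A minimal sketch of that dependency-injection approach, assuming a frozen dataclass named Keys and a hypothetical package function build_output (both names are illustrative, not from the question):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Keys:
    """Key names the package needs, injected by the caller."""
    SUB_NAME: str = "sub_name"
    SUB_TYPE: str = "sub_type"

def build_output(value, type_, keys: Keys) -> dict:
    # The package never hardcodes key strings; it only uses the injected keys.
    return {keys.SUB_NAME: value, keys.SUB_TYPE: type_}

print(build_output("xyz", "yzg", Keys()))
# {'sub_name': 'xyz', 'sub_type': 'yzg'}
```

This keeps the package independent of constants.py: the general program builds one Keys instance from its constants and passes it down.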

Related

Importing a List of Scripts Dynamically - Python

Say I have N python files (each with a main function) in a file structure like so:
tools \
|_ tool_1.py
|_ tool_2.py
...
|_ tool_N.py
Furthermore, I have a data structure like so:
files = [
    {"path": "tools/tool_1.py", "alias": "tools__tool_1"},
    {"path": "tools/tool_2.py", "alias": "tools__tool_2"},
    ...
    {"path": "tools/tool_N.py", "alias": "tools__tool_N"}
]
How can I dynamically import these files into a single python file? The number of tools will increase over time and manually adding a new line for each is not feasible.
So how can I convert this:
from tools.tool_1 import main as tools__tool_1
from tools.tool_2 import main as tools__tool_2
...
from tools.tool_N import main as tools__tool_N
To this?
for file in files:
    from file["path"] import main as file["alias"]
Ok, first a couple of disclaimers: (1) Dynamically importing modules may or may not be what you actually need. I have done so myself, but in my case I had a library with like 100 different models, and a common driver that dynamically loaded one of those models depending on the command-line options I gave it. The main point is that I never needed more than one of them loaded at a time, so it made sense to load that one module dynamically.
And (2) I'm far from an expert at importing modules and packages, but I'm usually able to get it to do what I want.
That having been said, if you believe that dynamically importing modules is what you want, then this should work for you. Note that I tried to create a complete example for you:
import importlib

files = [
    {"path": "tools.tool_1", "name": "tools__tool_1"},
    {"path": "tools.tool_2", "name": "tools__tool_2"},
    {"path": "tools.tool_3", "name": "tools__tool_3"}
]

module_dict = {}
main_dict = {}
for file_desc in files:
    path = file_desc["path"]
    name = file_desc["name"]
    module = importlib.import_module(path)
    module_dict[name] = module
    main_dict[name] = module.main

main_dict["tools__tool_1"]()
In this example, there are three modules that all reside in the directory tools. The modules are tool_1, tool_2, and tool_3. They are imported and stored in dictionaries under the names tools__tool_1, etc. Note: You may be able to simply use tool_1 etc. for these names, unless you need to qualify them with tools__ because you want to load modules from other directories into the same dictionaries.
Note that none of these imports have any effect on your global namespace. The modules are imported as objects, and they (or their main functions) are stored only in the dictionaries.
In terms of what you need, I wasn't entirely sure what you wanted, so I created two dictionaries. The first is module_dict, which imports the entire modules. The second is main_dict, which simply contains the main function from each imported module, as described in the original post.
Note that each module is only imported once. If you only need one of these dictionaries, it's simple enough to just remove the one you don't want.
Anyway, suppose you want to invoke main from tools.tool_1. You can do this from main_dict as follows:
main_dict["tools__tool_1"]()
If you want to invoke it, or any other function, from module_dict, you can do:
module_dict["tools__tool_1"].main()
You can basically access everything in a module from module_dict, but if you only want to access main, then you could just have main_dict.
Again, there is probably more here than you need, but I wasn't entirely certain how you intended to use this. If you only need one of the dictionaries, just get rid of the other.
You can use the exec() function. See the sample below:
exec('from datetime import datetime')
print(datetime.now())
So in your case it would be:
for file in files:
    exec(f'from {file["path"]} import main as {file["alias"]}')
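If you would rather avoid exec(), the same binding can be done with importlib plus an assignment into globals(). In this self-contained sketch the stdlib json module stands in for a tools.tool_N entry, and its dumps function stands in for main:

```python
import importlib

# stand-in for the question's `files` list; json/dumps replace tools.tool_1/main
files = [
    {"path": "json", "alias": "tools__json"},
]

for file in files:
    module = importlib.import_module(file["path"])
    globals()[file["alias"]] = module.dumps  # would be module.main for the tools

print(tools__json([1, 2]))  # [1, 2]
```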

How to approximate "import package.*" in Python

I'm writing a Python program that can parse a binary protocol consisting of messages. Each message has an identifier, followed by type-specific data. I've structured my app so that each message-type is represented by a (sub)class, located in its own module inside my package:
messages/
__init__.py
MessageTypeOne.py
MessageTypeTwo.py
...
From my main file (which is inside the same package, but I don't think it matters) I would like to do the equivalent of
import package.*
That is, I would like all module types to be loaded, but not imported in to the local namespace (i.e. not what from package import * would do). I prefer not to list the message types explicitly (simply adding a file should be enough), but using something similar to the __all__ construct from from bla import * would be acceptable.
I've found a way to accomplish this by looping over os.listdir(__path__), and importlib.import_module()'ing each found file, but this feels overly hacky... Is there a more elegant way to do this?
Update:
Depending on the usage (e.g. decoding for logging or sending a single message), I don't always want to import every message type, so statically importing them in __init__.py is not desirable
I would probably have __init__.py do the importing:
# in __init__.py
from . import MessageTypeOne
from . import MessageTypeTwo
...
If you don't want to do that, you can use __import__:
__import__('package', fromlist=['*'])
This performs the same initialization as from package import * would, but without actually binding any names in the local namespace. Note that it won't initialize any submodules that from package import * wouldn't, so you still need to configure the __all__ list in __init__.py.
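The os.listdir loop the question mentions can also be written with pkgutil, which understands packages rather than raw directories. A sketch, run here against the stdlib email package purely as a stand-in for the messages package:

```python
import importlib
import pkgutil
import email  # stand-in for your `messages` package

loaded = {}
for info in pkgutil.iter_modules(email.__path__):
    # import each submodule by its dotted name, without binding it locally
    full_name = f"{email.__name__}.{info.name}"
    loaded[info.name] = importlib.import_module(full_name)

print(sorted(loaded)[:3])
```

Like the __import__ approach, this keeps the local namespace clean: the modules live only in the loaded dictionary.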

Python : import module once for a whole package

I'm currently coding an app which is basically structured that way :
main.py
+ Package1
+--- Class1.py
+--- Apps
+ Package2
+--- Class1.py
+--- Apps
So I have two questions:
First, inside both packages there are modules needed by all Apps, e.g. re. Is there a way I can import a module once for the whole package, instead of importing it in every file that needs it?
And, as you can see, Class1 is used in both packages. Is there a good way to share it between both packages to avoid code duplication?
I would strongly recommend against doing this: by separating the imports from the module that uses the functionality, you make it more difficult to track dependencies between modules.
If you really want to do it though, one option would be to create a new module called common_imports (for example) and have it do the imports you are after.
Then in your other modules, add the following:
from common_imports import *
This should give you all the public names from that module (including all the imports).
To answer your second question, if your two modules named Class1.py are in fact the same, then you should not copy it to both packages. Place it in a package which will contain only code which is common to both, and then import it. It is absolutely not necessary to copy the file and try to maintain each change in both copies.
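A self-contained sketch of that layout: the shared class lives in a single common package and both consumers import it. The package is written to a temp directory here only so the example runs on its own; in a real project common/ would simply sit next to Package1 and Package2 (all names are illustrative):

```python
import sys
import tempfile
from pathlib import Path

# build common/class1.py on disk; in a real project this already exists
root = Path(tempfile.mkdtemp())
pkg = root / "common"
pkg.mkdir()
(pkg / "__init__.py").write_text("")
(pkg / "class1.py").write_text(
    "class Class1:\n"
    "    def greet(self):\n"
    "        return 'hello from common'\n"
)

sys.path.insert(0, str(root))      # make `common` importable
from common.class1 import Class1   # both Package1 and Package2 do this

print(Class1().greet())  # hello from common
```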
Q1:
You must find a way to import your modules, so you have two choices (please correct me if I'm wrong or not thorough):
1. Follow James' solution: define a common module, put all the imports inside it, and import it from your submodules.
2. Import nothing in your main module; instead, import each module only once, in the submodule that needs it.
For example (inside the A_1 submodule):
import re
def functionThatUseRe(input):
    pass
Then inside your main module, just do:
try:
    from YourPackage import A_1  # Windows
except ImportError:
    import A_1  # Mac OS X
A_1.functionThatUseRe("")
And you completely avoided importing modules multiple times
Q2: put your Class1.py in the same directory as your main module, or move it to another folder, such as Package1(&2)/Apps, then:
import Class1
and start using the code from there.

Scalability of Python Module / Package system to large projects

I would like to have a nice hierarchy of modules for a large
project.. (Python seems to get in the way of this) I am confused about
the distinction of modules and packages and how they relate to the C++
concept of a namespace. For concreteness my project is a compiler and
the code generation phases want to query properties from some set of
abstract representations which are maintained in a different directory
(actually far away in the hierarchy)
The problem can be stated as:
Assume: Let a.py and b.py be two source files somewhere in the project hierarchy.
Then: I want to refer to the functions defined in b.py from
a.py -- ideally with a relative path from the well-defined root
directory of the project (which is /src). We want a general-purpose
solution for this, something which will always work..
Dirty hack: It sounds absurd, but by putting every sub-directory of this project
that contains .py files into PYTHONPATH, we would be able to reference the files
by name alone. With this, however, the reader of the code loses any sense of
hierarchy & relation between the different project classes etc.
Note: The tutorial on Python.org only mentions the special case of referring from a file c.py to a file d.py placed in its parent directory. Where is the generality that makes
Python scale to really large projects here?
I am not sure if this is the question, but let us see.
Suppose I have the following package scheme (__init__.py files excluded for readability):
foo/baz/quux/b.py
foo/baz/quux/quuuux/c.py
foo/bar/a.py
My foo/baz/quux/b.py file contains this:
def func_b():
    print('func b')
and my foo/baz/quux/quuuux/c.py is:
def func_c():
    print('func c')
If the root directory which contains foo (in your case, src*) is in the Python path, our foo/bar/a.py file can import any other module starting from foo:
import foo.baz.quux.b as b
import foo.baz.quux.quuuux.c as c

def func_a():
    b.func_b()
    c.func_c()
And one can use foo/bar/a.py this way:
import foo.bar.a as a
a.func_a()
Have you tried it? Did you get an error?
* When you deploy your project, I do not believe the root will be src, but let us keep it simple by omitting it :)

Python: Importing an "import file"

I am importing a lot of different scripts, so at the top of my file it gets cluttered with import statements, i.e.:
from somewhere.fileA import ...
from somewhere.fileB import ...
from somewhere.fileC import ...
...
Is there a way to move all of these somewhere else and then all I have to do is import that file instead so it's just one clean import?
I strongly advise against what you want to do. You are repeating the "global include file" mistake. Although here only one module imports all the others (as opposed to all modules importing a global one), the point remains: if there's a valid reason for all those modules to be collected under a common name, fine. If there's no reason, they should be kept as separate imports. The reason is documentation. If I open your file and see only one import, I get no information about what is imported or where it comes from. If, on the other hand, I have the list of imports, I know at a glance what is needed and where it comes from.
Also, there's another important error I assume you are making. When you say
from somewhere.fileA import ...
from somewhere.fileB import ...
from somewhere.fileC import ...
I assume you are importing, for example, a class, like this
from somewhere.fileA import MyClass
this is wrong. This alternative is much better:
from somewhere import fileA
# later
a = fileA.MyClass()
Why? Two reasons: first, namespacing. If you have two modules each with a class named MyClass, you would have a clash. Second, documentation. Suppose you use the first option, and I find in your code the following line:
a = MyClass()
Now I have no idea where this MyClass comes from, and I will have to grep through all your files in order to find it. Having it qualified with the module name allows me to immediately understand where it comes from, and to immediately find, via a search, where stuff coming from the fileA module is used in your program.
Final note: when you say "fileA" you are making a mistake. These are modules (or packages), not files. Modules map to files, and packages map to directories, but they may also map to egg files, and you may even create a module having no file at all. This is a matter of naming concepts, and it's a side issue.
Of course there is; just create a file called myimports.py in the same directory where your main file is and put your imports there. Then you can simply use from myimports import * in your main script.
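A runnable sketch of that pattern, with myimports.py written to a temp directory purely so the example is self-contained (in a real project it would sit next to your main script; the imports chosen are arbitrary stand-ins):

```python
import sys
import tempfile
from pathlib import Path

tmp = Path(tempfile.mkdtemp())
(tmp / "myimports.py").write_text(
    "# one place for the shared imports\n"
    "from collections import OrderedDict\n"
    "from json import dumps\n"
    "__all__ = ['OrderedDict', 'dumps']  # controls what * exposes\n"
)

sys.path.insert(0, str(tmp))
from myimports import *  # pulls in OrderedDict and dumps

print(dumps(OrderedDict(a=1)))  # {"a": 1}
```

Note that the previous answer explains why this pattern hurts readability; the sketch only shows that it works mechanically.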
