I have an abstract base class with a number of derived classes. I'm trying to achieve the same behaviour that I would get by placing all the derived classes in the same file as the base class, i.e. if my classes are Base, DerivedA, DerivedB, DerivedC in the file myclass.py I can write in another file
import myclass
a = myclass.DerivedA()
b = myclass.DerivedB()
c = myclass.DerivedC()
but with each derived class in its own file. This has to be dynamic: if I delete derived_c.py, everything should still work, except that I can no longer call myclass.DerivedC; and if I add a derived_d.py, I should be able to use it without touching __init__.py. So simply using from derived_c import DerivedC is not an option.
I've tried placing them all in a subdirectory and, in that directory's __init__.py, using pkgutil.walk_packages() to import all the files dynamically. But I can't get the classes directly into the module's namespace: instead of myclass.DerivedC() I have to call myclass.derived_c.DerivedC(), because I can't figure out how (or whether it's possible) to use importlib to achieve the equivalent of a from xyz import * statement.
Any suggestions for how I could achieve this? Thanks!
Edit: The solutions for Dynamic module import in Python don't provide a method for automatically importing the classes in all modules into the namespace of the package.
I had to make something quite similar a while back, but in my case I had to dynamically create a list with all subclasses from a base class in a specific package, so in case you find it useful:
Create a my_classes package containing all files for your Base class and all subclasses. You should include only one class in each file.
Set __all__ appropriately in __init__.py to import all .py files except for __init__.py (from this answer):
from os import listdir
from os.path import dirname, basename

__all__ = [
    basename(f)[:-3]
    for f in listdir(dirname(__file__))
    if f.endswith(".py") and not f.endswith("__init__.py")
]
Import your classes using from my_classes import *; our custom __all__ makes the star-import load every module inside the my_classes package (and, with them, the classes they define) into the namespace.
However, this does not allow us direct access to the subclasses yet. You have to access them like this in your main script:
from my_classes import *
from my_classes.base import Base
subclasses = Base.__subclasses__()
Now subclasses is a list containing all classes that derive from Base.
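A name-to-class lookup then falls out naturally. Here is a self-contained sketch, with inline classes standing in for the ones the package would import:

```python
# Sketch: build a name -> class registry from Base.__subclasses__().
# Base, DerivedA and DerivedB stand in for the classes your package defines.
class Base:
    pass

class DerivedA(Base):
    pass

class DerivedB(Base):
    pass

# Map each subclass name to its class object, then instantiate by name.
registry = {cls.__name__: cls for cls in Base.__subclasses__()}
a = registry["DerivedA"]()

print(sorted(registry))  # ['DerivedA', 'DerivedB']
```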
Since Python 3.6 there is a hook for initializing subclasses, __init_subclass__. It runs at class definition time, i.e. before the rest of your code executes, and in it you can simply import the module of the subclass being defined.
base.py
class Base:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        __import__(cls.__module__)
sub1.py
from base import Base

class Sub1(Base):
    pass
sub2.py
from base import Base

class Sub2(Base):
    pass
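To see why this hook helps with registration, here is a single-file sketch: __init_subclass__ fires the moment each subclass is defined, so registration happens as a side effect of the class statement itself.

```python
# Single-file sketch: __init_subclass__ runs at class definition time,
# so each subclass registers itself as soon as its module is imported.
registered = []

class Base:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        registered.append(cls.__name__)

class Sub1(Base):
    pass

class Sub2(Base):
    pass

print(registered)  # ['Sub1', 'Sub2']
```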
What I'd like to do
I'd like to import a Python module without adding it to the local namespace.
In other words, I'd like to do this:
import foo
del foo
Is there a cleaner way to do this?
Why I want to do it
The short version is that importing foo has a side effect that I want, but I don't really want it in my namespace afterwards.
The long version is that I have a base class that uses __init_subclass__() to register its subclasses. So base.py looks like this:
class Base:
    _subclasses = {}

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        cls._subclasses[cls.__name__] = cls

    @classmethod
    def get_subclass(cls, class_name):
        return cls._subclasses[class_name]
And its subclasses are defined in separate files, e.g. foo_a.py:
from base import Base
class FooA(Base):
    pass
and so on.
The net effect here is that if I do
from base import Base
print(f"Before import: {Base._subclasses}")
import foo_a
import foo_b
print(f"After import: {Base._subclasses}")
then I would see
Before import: {}
After import: {'FooA': <class 'foo_a.FooA'>, 'FooB': <class 'foo_b.FooB'>}
So I needed to import these modules for the side effect of adding a reference to Base._subclasses, but now that that's done, I don't need them in my namespace anymore because I'm just going to be using Base.get_subclass().
I know I could just leave them there, but this is going into an __init__.py so I'd like to tidy up that namespace.
del works perfectly fine, I'm just wondering if there's a cleaner or more idiomatic way to do this.
If you want to import a module without assigning the module object to a variable, you can use importlib.import_module and ignore the return value:
import importlib
importlib.import_module("foo")
Note that using importlib.import_module is preferable to using the __import__ builtin directly for simple usages. See the builtin's documentation for details.
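A self-contained sketch of the side-effect-only import (the throwaway module name and its contents are invented for the demo):

```python
import importlib
import os
import sys
import tempfile

# Create a throwaway module on disk so the example runs standalone.
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "side_effect_mod.py"), "w") as f:
    f.write("FLAG = 'side effect ran'\n")
sys.path.insert(0, tmpdir)

# Import for the side effect only; the return value is deliberately ignored,
# so no name is bound in the local namespace.
importlib.import_module("side_effect_mod")

print("side_effect_mod" in sys.modules)   # True: the module was imported
print("side_effect_mod" in globals())     # False: nothing was bound locally
```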
I am struggling with what seems to me a very basic and common problem, and the fact that I could not find any answer after hours of Internet searching tells me that I must be doing something very wrong...
I am simply trying to find an elegant way to handle imports in my package.
The background :
My package is structured like this :
mypackage/
    __init__.py
    model/
        __init__.py
        A.py
        B.py
    controllers/
        __init__.py
        A.py
        B.py
# mypackage/model/A.py
class A:
    def __init__(self):
        print("This is A's model.")

# mypackage/model/B.py
from mypackage.model.A import A as AModel

class B:
    def __init__(self):
        self._a_model = AModel()
        print("This is B's model.")

# mypackage/controllers/A.py
class A:
    def __init__(self):
        print("This is A's controller.")

# mypackage/controllers/B.py
from mypackage.controllers.A import A as AController

class B:
    def __init__(self):
        self._a = AController()
        print("This is B's controller.")
The problem :
Two things are really bothering me with this design.
First : I want to use namespaces
I really don't like writing
from mypackage.controllers.A import A as AController
...
self._a = AController()
It feels cumbersome and not very pythonic...
I would prefer using namespaces like in :
from mypackage import controllers
...
self._a = controllers.A.A()
But if I try that, I get an AttributeError: module 'mypackage.controllers' has no attribute 'A'.
Second : I really don't like typing the class's filename
I really don't like writing
from mypackage.controllers.A import A as AController
I would prefer :
from mypackage.controllers import A as AController
What did not work
Putting everything in one file
I understand that I could get what I want by putting all the controller class definitions (A and B) in a single file (controllers.py), and doing the same with the models...
I have read several times that putting several class definitions in a single file is quite common in Python. But for big, separate classes I just can't: if A and B are hundreds of lines each and have nothing to do with each other (other than both being controllers or models), having their definitions in a single file is unusable.
Using imports in the init.py files
Indeed it would solve all my problems...
Except that :
It leads to circular imports. As you can see, I need A's model in B's model, so if all models are imported whenever I need to access one of them, I'm stuck in a vicious circle...
It does not seem very pythonic, if only because it forces the user to load every module.
Here I am...
What is wrong with my reasoning ?
Using imports in the __init__.py file
That is the way to go.
Sorry if it looks like too much boilerplate to you, but if your project is big, that is what is needed.
As for the circular import problem: it will kick in whether you write your imports in the __init__ files or not; it is just a matter of ordering.
If, in your __init__ file, you write the imports in the correct order, there will be no circularity problem.
I.e. in your myproject/models/__init__.py you have:
from .A import A as AModel
from .B import B as BModel
Of course, naming the .py files the same as the classes won't help you. If you at least drop the casing in the filenames, you can write:
from .a import A
from .b import B
Otherwise, you can do just:
import myproject.models.A
import myproject.models.B
A = myproject.models.A.A
B = myproject.models.B.B
To be able to use "myproject.models.A" as the class: the name A inside __init__.py will shadow the module object with the same name.
Writing import myproject.models.A gets you the module, but from myproject.models import A gets you the class.
If that feels confusing... try not to use the same name for the module file as for the class, especially since on case-insensitive file systems, like on Windows, you would have ambiguities anyway. Stick with the convention: module names in snake_case, class names in CamelCase.
Back to the circular-imports matter: the point is that in this design, b.py is only read after a.py has already been imported, so there is no circular-import problem.
That is not always possible: sometimes cross-references between submodules are needed. What you can do in those cases is move the import lines into the functions or methods, instead of leaving them as global statements. That way, when the import is executed, the referred module is already initialised.
In your example that would be:
# mypackage/model/B.py
class B:
    def __init__(self):
        from mypackage.model.A import A as AModel
        self._a_model = AModel()
        print("This is B's model.")
# mypackage/controllers/__init__.py
from .A import A
Then you can make a new file outside of mypackage:
# check.py
from mypackage.controllers import A as AController
from mypackage import controllers
a = controllers.A()
# output: This is A's controller.
let us know if it works for you.
[Off the top of my head, without testing]
I really don't like writing
from mypackage.controllers.A import A as AController
#...
self._a = AController()
It feels cumbersome and not very pythonic... I would prefer using
namespaces like in :
from mypackage import controllers
# ...
self._a = controllers.A.A()
In mypackage/controllers/__init__.py you would need: from . import A.
I really don't like writing
from mypackage.controllers.A import A as AController
I would prefer :
from mypackage.controllers import A as AController
In mypackage/controllers/__init__.py you would need from .A import A.
I have the following (toy) package structure
root/
- package1/
  - __init__.py
  - class_a.py
  - class_b.py
- run.py
In both class_a.py and class_b.py I have a class definition that I want to expose to run.py. If I want to import them this way, I will have to use
from package1.class_a import ClassA # works but doesn't look nice
I don't like that this shows the class_a.py module, and would rather use the import style
from package1 import ClassA # what I want
This is also closer to what I see from larger libraries. I found a way to do this by importing the classes in the __init__.py file like so
from .class_a import ClassA
from .class_b import ClassB
This works fine if it wasn't for one downside: as soon as I import ClassA as I would like (see above), I also immediately 'import' ClassB as, as far as I know, the __init__.py will be run, importing ClassB. In my real scenario, this means I implicitly import a huge class that I use very situationally (which itself imports tensorflow), so I really want to avoid this somehow. Is there a way to create the nice looking imports without automatically importing everything in the package?
It is possible but requires a rather low-level customization: you will have to customize the class of your package's module (possible since Python 3.5). That way, you can declare a __getattr__ member that will be called when a missing attribute is requested. At that moment, you know that you have to import the relevant module and extract the correct attribute.
The __init__.py file should contain (names can of course be changed):
import importlib
import sys
import types
class SpecialModule(types.ModuleType):
    """Customization of a module that is able to dynamically load submodules.

    It is expected to be a plain package (and to be declared in the __init__.py).
    The `special` attribute is a dictionary: attribute name -> relative module name.
    The first time a name is requested, the corresponding module is loaded and
    the attribute is bound into the package.
    """
    special = {'ClassA': '.class_a', 'ClassB': '.class_b'}

    def __getattr__(self, name):
        if name in self.special:
            m = importlib.import_module(self.special[name], __name__)  # import submodule
            o = getattr(m, name)                     # find the required member
            setattr(sys.modules[__name__], name, o)  # bind it into the package
            return o
        else:
            raise AttributeError(f'module {__name__} has no attribute {name}')

sys.modules[__name__].__class__ = SpecialModule  # customize the class of the package
You can now use it that way:
import package1
...
obj = package1.ClassA(...) # dynamically loads class_a on first call
The downside is that clever IDEs that look at the declared members could choke on this and claim that you are accessing a nonexistent member, because ClassA is not statically declared in package1/__init__.py. But all will be fine at run time.
As it is a low-level customization, it is up to you to decide whether it is worth it...
Since Python 3.7 you can also declare a __getattr__(name) function directly at the module level.
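A minimal sketch of that module-level __getattr__ variant (PEP 562). The package name and layout here are invented, and the package is written to a temp directory so the example runs standalone:

```python
import os
import sys
import tempfile

# Build a throwaway package on disk; 'lazypkg' and its contents are invented.
pkg_dir = os.path.join(tempfile.mkdtemp(), "lazypkg")
os.makedirs(pkg_dir)
with open(os.path.join(pkg_dir, "__init__.py"), "w") as f:
    f.write(
        "import importlib\n"
        "_lazy = {'ClassA': '.class_a'}\n"
        "\n"
        "def __getattr__(name):  # PEP 562: module-level __getattr__\n"
        "    if name in _lazy:\n"
        "        module = importlib.import_module(_lazy[name], __name__)\n"
        "        return getattr(module, name)\n"
        "    raise AttributeError(name)\n"
    )
with open(os.path.join(pkg_dir, "class_a.py"), "w") as f:
    f.write("class ClassA:\n    pass\n")

sys.path.insert(0, os.path.dirname(pkg_dir))
import lazypkg

obj = lazypkg.ClassA()     # class_a is imported only on this first access
print(type(obj).__name__)  # ClassA
```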
On Python 3.6
I want to organize my classes in folders.
ClassX in folder Classes
Subclass_of_ClassX in a subfolder
MyPackage/
    __init__.py
    someCode.py
    Classes/
        __init__.py
        ClassX.py
        SubClasses/
            __init__.py
            Subclass_of_ClassX.py
Subclass_of_ClassX overrides elements (functions and variables) of ClassX.
The question is: if I import ClassX in some project, and I get an instance of Subclass_of_ClassX from somewhere, do I need to explicitly import the subclass in order to use any overridden function?
Or can I just import ClassX and abstract away any inherited class, using any subclass instance as if it were ClassX?
If you get an instance of Subclass_of_ClassX from somewhere, you don't need to additionally import the class at all. Importing a class just makes the name available in your current scope (module).
You only need to import the class or subclass when you need to...
create a new instance
write a subclass
use it otherwise like in isinstance(obj, ClassX)
If you got an instance of the class, you don't need to import anything; it was already imported somewhere else.
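The reason is ordinary dynamic dispatch: overridden methods are resolved from the instance's actual class at call time. A quick sketch (the class names are invented):

```python
# Sketch: overridden methods dispatch on the instance's actual class,
# so code holding a subclass instance never needs to import the subclass.
class ClassX:
    def greet(self):
        return "base"

class SubclassOfClassX(ClassX):
    def greet(self):
        return "override"

def use(obj):
    # This function only "knows" ClassX, yet gets the override.
    return obj.greet()

print(use(SubclassOfClassX()))  # override
```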
What you describe is not the Pythonic way to structure packages.
You'll end up with having to do
from mypackage.Classes.ClassX import ClassX
when you could do with
from mypackage.class_x import ClassX
If your package is, uh, for the sake of example, a zoo where you have animals and foods, I would suggest putting the animals in one package and their foods in another.
myzoo/__init__.py (empty)
myzoo/animals/__init__.py (empty)
myzoo/animals/base.py (containing your Animal base class)
myzoo/animals/cats.py (containing Lion and Tiger, for instance)
myzoo/foods/__init__.py (empty)
myzoo/foods/base.py (containing your Food base class)
myzoo/foods/kibbles.py (containing kibbles for the cats, not that they'll like them)
(You could also have one file per animal/food, and probably should if you anticipate them growing large.)
I have a Python package which gathers multiple modules. In those modules, I have multiple classes inheriting from a Component class. I'd like to load those classes dynamically and build some objects dynamically.
ex:
package/module1.py
/module2.py
in module1.py there are multiple classes inheriting from the class Component, and the same in module2.py; of course the number of classes and packages is unknown. The final user defines which objects have to be built in a config file. In order to iterate through the modules, I use pkgutil.iter_modules, which works. In my function in charge of building the components, I do this:
[...]
myPckge = __import__('package.module1', globals(), locals(), ['class1'], -1)
cmpt_object = locals()[component_name](self, component_prefix, *args)
[...]
However, this is not working as the class is not recognized, the following works but is not dynamic:
cmpt_object = myPckge.class1(self, component_prefix, *args)
Thanks for your replies.
You can use execfile() (Python 2 only; it was removed in Python 3) to load modules on the fly and then use exec() to create new objects from them. But I don't understand why you're doing this!
To find the subclasses of a class in a specified module, you can do something like:
import inspect
def find_subclasses(module, parent_cls):
    return [clazz for name, clazz in inspect.getmembers(module)
            if inspect.isclass(clazz) and
            issubclass(clazz, parent_cls) and
            clazz.__module__ == module.__name__ and  # do not keep imported classes
            clazz is not parent_cls]
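For instance, running it against the current module (Component and its subclasses here are invented for the demo; find_subclasses is reproduced so the sketch is self-contained):

```python
import inspect
import sys

def find_subclasses(module, parent_cls):
    # Keep classes defined in `module` that derive from parent_cls.
    return [clazz for name, clazz in inspect.getmembers(module)
            if inspect.isclass(clazz) and
            issubclass(clazz, parent_cls) and
            clazz.__module__ == module.__name__ and
            clazz is not parent_cls]

class Component:
    pass

class Motor(Component):
    pass

class Sensor(Component):
    pass

this_module = sys.modules[__name__]
print([c.__name__ for c in find_subclasses(this_module, Component)])
```

Since inspect.getmembers returns members sorted by name, the result here is ['Motor', 'Sensor'].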
Note that parent_cls doesn't have to be the direct parent of a class for it to be returned.
Then you can dynamically load classes from a module, knowing the module's name and directory, and the parent class of the classes you want.
import imp
def load_classes(module_name, module_dir, parent_cls):
    fle, path, descr = imp.find_module(module_name, [module_dir])
    if fle:
        module = imp.load_module(module_name, fle, path, descr)
        classes = find_subclasses(module, parent_cls)
        return classes
    return []  # module not found
return [] # module not found