My problem is the following: I have an application with this hierarchy:
main.py
package/__init__.py
package/MyClass.py
There is a package named "package", and MyClass.py contains a class definition.
Now, from the main.py file, I have to write:
package.MyClass.MyClass()
in order to create an instance of that class.
I want to be able to write only:
package.MyClass()
to instantiate the class. How do I do this? I have seen many APIs do it this way; is there some trick to it?
Add this to package/__init__.py:
from .MyClass import MyClass
Then, in main.py:
import package
package.MyClass()
Or, alternatively:
from package import MyClass
MyClass()
Or make __init__.py contain:
from .MyClass import MyClass
I name my modules the same as the class names, so MyClass.py contains class MyClass.
(Why? I adopted this naming convention from other languages and would rather not change it.)
I put them in a directory components with an __init__.py like this:
from .MyClass1 import MyClass1
from .MyClass2 import MyClass2
I do exactly the same thing for the classes in the controllers directory.
Then, in my main.py, I try to import them like so:
from components import MyClass1, MyClass2
from controllers import MyClass3, MyClass4
From components the classes import fine; from controllers, Python ignores my __init__.py and imports a module instead of a class, so I have to rewrite my imports like so:
from components import MyClass1, MyClass2
from controllers.MyClass3 import MyClass3
from controllers.MyClass4 import MyClass4
What could be the reason? Is there a way to fix this without changing the naming convention? Thank you.
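For comparison, a controllers/__init__.py mirroring the working components package would look like the sketch below (hypothetical, since the actual contents of controllers/__init__.py were not shown; a missing or stale file there would produce exactly this symptom):

# controllers/__init__.py -- hypothetical sketch mirroring components/__init__.py;
# re-export each class so "from controllers import MyClass3" yields the class,
# not the controllers.MyClass3 module
from .MyClass3 import MyClass3
from .MyClass4 import MyClass4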
I have a base class A in base.py:
import module1

class A:
    def test(self):
        module1.sample("test")
Then in new.py I created a new class B which inherits from A and overrides the test method:
from base import A

class B(A):
    def test(self):
        module1.sample("test")
        print("Testing...")
The problem is that module1 is not available in new.py. Is there any option so that I do not need to import module1 again in new.py?
One way to achieve what you want, though not a recommended one, is to use __builtins__. Add the following line to base.py:
__builtins__['module1'] = module1
Then module1 is no longer undefined in new.py; it is now defined in __builtins__.
Again, this is not recommended, but it is good for understanding how Python works. You would be better off importing module1 in new.py as well:
import module1
from base import A
...
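Another option worth knowing about (a sketch only, and arguably a style smell since it couples new.py to base's internal imports): module1 is bound in base's namespace, so it can be imported from there:

from base import A, module1  # module1 is an attribute of the base module

class B(A):
    def test(self):
        module1.sample("test")
        print("Testing...")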
I have the following (toy) package structure
root/
- package1/
- __init__.py
- class_a.py
- class_b.py
- run.py
In both class_a.py and class_b.py I have a class definition that I want to expose to run.py. To import them, I would have to use:
from package1.class_a import ClassA # works but doesn't look nice
I don't like that this shows the class_a.py module, and would rather use the import style
from package1 import ClassA # what I want
This is also closer to what I see from larger libraries. I found a way to do this by importing the classes in the __init__.py file like so
from .class_a import ClassA
from .class_b import ClassB
This would work fine if it weren't for one downside: as soon as I import ClassA the way I want (see above), I also immediately 'import' ClassB, since as far as I know the __init__.py is run, which imports ClassB. In my real scenario this means I implicitly import a huge class that I only use situationally (and which itself imports tensorflow), so I really want to avoid this. Is there a way to get the nice-looking imports without automatically importing everything in the package?
It is possible, but it requires a rather low-level customization: you have to customize the class of your package's module object (possible since Python 3.5). That way, you can declare a __getattr__ method that is called when a missing attribute is requested; at that moment you know you have to import the relevant module and extract the correct attribute.
The __init__.py file should contain (names can of course be changed):
import importlib
import sys
import types

class SpecialModule(types.ModuleType):
    """Customization of a module that is able to dynamically load submodules.

    It is expected to be a plain package (and to be declared in the __init__.py).
    The special attribute is a dictionary mapping attribute name -> relative module name.
    The first time a name is requested, the corresponding module is loaded and
    the attribute is bound into the package.
    """
    special = {'ClassA': '.class_a', 'ClassB': '.class_b'}

    def __getattr__(self, name):
        if name in self.special:
            m = importlib.import_module(self.special[name], __name__)  # import the submodule
            o = getattr(m, name)  # find the required member
            setattr(sys.modules[__name__], name, o)  # bind it into the package
            return o
        else:
            raise AttributeError(f'module {__name__} has no attribute {name}')

sys.modules[__name__].__class__ = SpecialModule  # customize the class of the package
You can now use it that way:
import package1
...
obj = package1.ClassA(...) # dynamically loads class_a on first call
The downside is that a clever IDE that looks at declared members could choke on this and claim you are accessing a nonexistent member, because ClassA is not statically declared in package1/__init__.py. But everything will be fine at run time.
As it is a low-level customization, it is up to you to decide whether it is worth it...
Since Python 3.7 you can also declare a __getattr__(name) function directly at the module level (PEP 562).
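A minimal sketch of that module-level variant, reusing the submodule names from the question:

# package1/__init__.py -- lazy attribute access via module-level __getattr__ (PEP 562)
import importlib

_lazy = {'ClassA': '.class_a', 'ClassB': '.class_b'}

def __getattr__(name):
    if name in _lazy:
        module = importlib.import_module(_lazy[name], __name__)
        obj = getattr(module, name)
        globals()[name] = obj  # cache it so __getattr__ is not called again
        return obj
    raise AttributeError(f'module {__name__!r} has no attribute {name!r}')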
I have an abstract base class with a number of derived classes. I'm trying to achieve the same behaviour that I would get by placing all the derived classes in the same file as the base class. That is, if my classes are Base, DerivedA, DerivedB, and DerivedC in the file myclass.py, I can write in another file:
import myclass
a = myclass.DerivedA()
b = myclass.DerivedB()
c = myclass.DerivedC()
but with each derived class in its own file. This has to be dynamic: if I delete derived_c.py, everything should still work, except that I can no longer call myclass.DerivedC; and if I add a derived_d.py, I should be able to use it without touching __init__.py. So simply using from derived_c import DerivedC is not an option.
I've tried placing them all in a subdirectory and, in that directory's __init__.py, using pkgutil.walk_packages() to import all the files dynamically, but I can't get the classes to end up directly in the package's namespace: rather than myclass.DerivedC() I have to call myclass.derived_c.DerivedC(), because I can't figure out how (or whether it's possible) to use importlib to achieve the equivalent of a from xyz import * statement.
Any suggestions for how I could achieve this? Thanks!
Edit: The solutions for Dynamic module import in Python don't provide a method for automatically importing the classes in all modules into the namespace of the package.
I had to build something quite similar a while back. In my case I had to dynamically create a list of all subclasses of a base class in a specific package, so in case you find it useful:
Create a my_classes package containing all files for your Base class and all subclasses. You should include only one class in each file.
Set __all__ appropriately in __init__.py to import all .py files except for __init__.py (from this answer):
from os import listdir
from os.path import dirname, basename

# collect every .py module in this package except __init__.py itself
__all__ = [
    basename(f)[:-3]
    for f in listdir(dirname(__file__))
    if f.endswith(".py") and f != "__init__.py"
]
Import the modules using from my_classes import *, since our custom __all__ brings every module inside the my_classes package into the namespace.
However, this does not allow us direct access to the subclasses yet. You have to access them like this in your main script:
from my_classes import *
from my_classes.base import Base
subclasses = Base.__subclasses__()
Now subclasses is a list containing all classes that derive from Base.
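From there, a hypothetical name-to-class map gets back the myclass.DerivedC()-style access the question asked for (DerivedA is assumed to be defined in one of the submodules):

from my_classes import *
from my_classes.base import Base

# build a name -> class map from the discovered subclasses
registry = {cls.__name__: cls for cls in Base.__subclasses__()}
a = registry['DerivedA']()  # assumes derived_a.py defines DerivedA(Base)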
Since Python 3.6 there is a hook for initializing subclasses, __init_subclass__. It runs at class definition time, so before the rest of your code executes. There you can simply import the module of the subclass being initialized.
base.py
class Base:
    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        __import__(cls.__module__)
sub1.py
from base import Base

class Sub1(Base):
    pass
sub2.py
from base import Base

class Sub2(Base):
    pass
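For completeness, a hypothetical variant of the same hook keeps a registry on the base class, so callers never need to know the submodule names (the registry dict is an illustrative addition, not part of the files above):

# base.py -- sketch with a subclass registry
class Base:
    registry = {}

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        Base.registry[cls.__name__] = cls  # record each subclass as it is defined

After import sub1, sub2, Base.registry maps 'Sub1' and 'Sub2' to the classes themselves.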
I have a class like this in package a:
class A:
    @staticmethod
    def method(param1, param2):
        ...
        return something

And in a Python file:
from a import A
print(A.A.method(p1,p2))
What is wrong with my definitions? I don't think it's correct to call a static method as Class.Class.method.
(Screenshots with the errors were attached to the original question; not reproduced here.)
[EDIT after question was updated with picture]
So your top-level package is called a; you can see this in the picture just under "Information" on the left. Then you have a module called A: the file A.py sits just under the folder a. This module A has the class called A.
So when you do from a import A, you are importing the file A.py.
That file has a class A, which has def method(p1, p2).
You say A.A.method() is working... and that is correct.
It now becomes
from a import A
print(A.A.method(p1, p2))
Alternatively
import a
print(a.A.A.method(p1, p2))
Or
from a.A import A
print(A.method(p1, p2))
My advice: start using more descriptive names, not a for the top-level package, A for the module, and A again for the class.
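As a purely hypothetical example of what such a renaming could look like (all names below are made up):

# package shapes/, module circle.py, class Circle -- hypothetical names
from shapes.circle import Circle

print(Circle.method(p1, p2))  # p1, p2 stand in for real arguments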