I have a package in my project containing many *.py source files (each consisting of one class in most cases, and named by the class). I would like it so that when this package is imported, all of the files in the package are also imported, so that I do not have to write
import Package.SomeClass.SomeClass
import Package.SomeOtherClass.SomeOtherClass
import ...
just to import every class in the package. Instead I can just write
import Package
and every class is available in Package, so that later code in the file can be:
my_object = Package.SomeClass()
What's the easiest way of doing this in __init__.py? If different, what is the most general way of doing this?
The usual method is to write, inside package/__init__.py:
from .SomeClass import SomeClass
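For example, with a package laid out like this (using the names from the question):

Package/
    __init__.py
    SomeClass.py          # defines class SomeClass
    SomeOtherClass.py     # defines class SomeOtherClass

the __init__.py can simply list one import per module:

# Package/__init__.py
from .SomeClass import SomeClass
from .SomeOtherClass import SomeOtherClass

__all__ = ["SomeClass", "SomeOtherClass"]

After that, import Package is enough, and Package.SomeClass() works directly.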
As Winston put it, the thing to do is to have an __init__.py file in which all your classes are made available in the package's (global) namespace.
One way to do this is to have a
from .myclasfile import MyClass
line for each class in your package, and that is not bad.
But of course, this being Python, you can "automagic" this by doing something like this in __init__.py:
import glob, importlib, os

# import every module in this package and expose the class named after its file
for path in glob.glob(os.path.join(os.path.dirname(__file__), "*.py")):
    name = os.path.basename(path)[:-3]
    if name != "__init__":
        module = importlib.import_module("." + name, __name__)
        globals()[name] = getattr(module, name)

del glob, importlib, os
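With that in place, assuming Package/SomeClass.py defines a class SomeClass, client code only needs:

import Package
my_object = Package.SomeClass()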
Related
I'd like to make the following line dynamic:
from my_package import my_class as my_custom_name
I know how to dynamically import a module via a string:
import importlib
module_name = "my_package"
my_module = importlib.import_module(module_name)
as suggested here. However, it still doesn't let me specify the class I want to import (my_class) or the alias I want to assign to it (my_custom_name). I'm using Python 3.6.
Two steps. First, you can import a module directly by its dotted name using importlib:
importlib.import_module('my_package.my_module')  # or build the name with '.'.join(('my_package', 'my_module'))
Your class will be contained in the module itself as an attribute, as in any import. As such, just use
my_custom_name = importlib.import_module('my_package.my_module').__dict__['my_class']
or even better
my_custom_name = getattr(importlib.import_module('my_package.my_module'), 'my_class')
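If you want this as a reusable one-liner, here is a small sketch; the helper name import_as is made up for illustration:

import importlib

def import_as(module_name, attr_name):
    # dynamic equivalent of: from <module_name> import <attr_name>
    return getattr(importlib.import_module(module_name), attr_name)

my_custom_name = import_as("my_package", "my_class")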
I have my own repository created on Bitbucket.
In that repository, I have a file named core.py and an __init__.py file.
I tried to import the core module, and I fixed all the requirements that were needed.
Now that I am finally able to import the module (which is just one big class) in IPython, when I make the call:
obj = MyClass()
I get an error:
name 'MyClass()' is not defined
even though it seems the module was imported.
Let me know if more information is needed.
As you stated in your comment, you are importing core.py:
from mintigocloudstorage import core
That means, you also have to tell your script where to find your class:
obj = core.MyClass()
If the import was successful, as you say, Python should now be able to locate your class definition.
Alternatively you can also import your class:
from mintigocloudstorage.core import MyClass
obj = MyClass()
I'm looking for a way to do
from myLib import *
inside my Python code, in order to do an import loop.
The __import__() function does not seem to provide the * feature, as I have to specify every name I want to import.
Is there a way to do the *?
Thanks a lot for your help.
EDIT:
To clarify: the goal is to import a bunch of classes that live inside a bunch of modules in a package, and to access them directly through their class names, not as myPackage.myModule.myClass() or myModule.myClass(), but just myClass().
Imagine you have:
myScript.py
myPackage/
    __init__.py
    myModule_0.py
    myModule_1.py
    myModule_2.py
    myModule_3.py
    myModule_4.py
Each myModule_X.py contains a bunch of classes, and while editing myScript.py you want to have access to all the classes in every myModule_X.py like:
myClass()
myOtherClass()
myOtherOtherClass()
etc., not like myModule_X.myClass() nor myPackage.myModule_X.myClass().
__import__ returns the imported module object. If you want to do import * from it, you can iterate over its namespace and stuff all the module's names into your own module's globals, which is what from modulename import * does. You probably shouldn't, just as you shouldn't use import * (and even more so here, because you don't even know which module you're importing), but you can.
module = __import__("modulename")
if hasattr(module, "__all__"):  # the module declares which names are public
    globals().update((name, getattr(module, name)) for name in module.__all__)
else:  # import all non-private names
    globals().update((name, getattr(module, name))
                     for name in dir(module) if not name.startswith("_"))
You could also write it like so, which is a little safer since it avoids clobbering any global names already defined (at the risk of potentially not having a name you need):
module = __import__("modulename")
if hasattr(module, "__all__"):  # the module declares which names are public
    globals().update((name, getattr(module, name))
                     for name in module.__all__ if name not in globals())
else:  # import all non-private names
    globals().update((name, getattr(module, name))
                     for name in dir(module)
                     if not (name.startswith("_") or name in globals()))
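Since the original goal was an import loop, the same idea can be wrapped in a small helper and run over every module in the package. A sketch using the myPackage/myModule_X names from the question:

import importlib

def import_star(qualified_name):
    # copy the public names of <qualified_name> into this script's globals,
    # emulating "from qualified_name import *"
    module = importlib.import_module(qualified_name)
    names = getattr(module, "__all__",
                    [n for n in dir(module) if not n.startswith("_")])
    globals().update((n, getattr(module, n)) for n in names)

for i in range(5):
    import_star("myPackage.myModule_%d" % i)

# myClass(), myOtherClass(), ... are now available unqualified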
import myLib
will import the package, but I advise against an import-all.
To use it you'd prefix every name with:
myLib.my_module
If you only want to import certain things on the fly, you'd want to do a conditional import, e.g.:
if condition_met:
    import myLib.my_module
import * wreaks havoc with static code checking and debugging, so I don't recommend using it in a script. Assuming you're not trying to do something ill-advised with this, you might consider using the __all__ attribute to get a list of strings of the members of the package.
import my_package
for sub_package in my_package.__all__:
    print("found " + sub_package)
Dir structure:
main.py
my_modules/
    module1.py
    module2.py
module1.py:
class fooBar():
    ....
class pew_pew_FooBarr():
    ....
...
How can I add all classes from module* to main without prefixes (i.e. to use them like foo = fooBar(), not foo = my_modules.module1.fooBar())?
An obvious solution is to write something like this in main.py:
from my_modules.module1 import *
from my_modules.module2 import *
from my_modules.module3 import *
...
But I don't want to change main.py every time I create a new moduleN. Is there a solution for that?
I do know it's not a good idea to import classes like this, but I'm still curious about it.
UPD: This question differs from Loading all modules in a folder in Python, because my problem is to load the modules without namespace prefixes.
In the my_modules folder, add an __init__.py file to make it a proper package. In that file, you can inject the globals of each of those modules into the global scope of __init__.py, which makes them available as soon as your package is imported (after also adding each name to the __all__ list):
__all__ = []
import pkgutil
import inspect

for loader, module_name, is_pkg in pkgutil.walk_packages(__path__):
    module = loader.find_module(module_name).load_module(module_name)

    for name, value in inspect.getmembers(module):
        if name.startswith('__'):
            continue

        globals()[name] = value
        __all__.append(name)
Now, instead of doing:
from my_modules.module1 import Stuff
You can just do:
from my_modules import Stuff
Or to import everything into the global scope, which seems to be what you want to do:
from my_modules import *
The problem with this approach is that classes overwrite one another, so if two modules provide a Foo, you'll only be able to use the one that was imported last.
Take the following code example:
File package1/__init__.py:
from moduleB import foo
print moduleB.__name__
File package1/moduleB.py:
def foo(): pass
Then from the current directory:
>>> import package1
package1.moduleB
This code works in CPython. What surprises me about it is that the from ... import statement in __init__.py makes the moduleB name visible. According to the Python documentation, this should not be the case:
The from form does not bind the module name
Could someone please explain why CPython works that way? Is there any documentation describing this in detail?
The documentation misled you, as it is written to describe the more common case of importing a module from outside the parent package containing it.
For example, using "from example import submodule" in my own code, where "example" is some third-party library completely unconnected to my own code, does not bind the name "example". It does still import both the example/__init__.py and example/submodule.py modules, create two module objects, and assign example.submodule to the second module object.
But a "from ... import" of names from a submodule must set the submodule attribute on the parent package object. Consider what would happen if it didn't:
1. package/__init__.py executes when package is imported.
2. That __init__ does "from submodule import name".
3. At some point later, other completely different code does "import package.submodule".
At step 3, either sys.modules["package.submodule"] doesn't exist, in which case loading it again would give you two different module objects in different scopes; or sys.modules["package.submodule"] exists but "submodule" is not an attribute of the parent package object (sys.modules["package"]), and "import package.submodule" would do nothing. However, if it does nothing, the code using the import cannot access submodule as an attribute of package!
Theoretically, how importing a submodule works could be changed if the rest of the import machinery was changed to match.
If you just need to know what importing a submodule S from package P will do, then in a nutshell:
1. Ensure P is imported, importing it first if necessary. (This step recurses to handle "import A.B.C.D".)
2. Execute S.py to get a module object. (Skipping details of .pyc files, etc.)
3. Store the module object in sys.modules["P.S"].
4. setattr(sys.modules["P"], "S", sys.modules["P.S"])
5. If the import was of the form "import P.S", bind "P" in the local scope.
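A quick way to observe this, assuming the same package1/moduleB files from the question (run under the same interpreter where that code works):

import sys
import package1

# the from-import inside __init__.py imported the submodule and registered it
print("package1.moduleB" in sys.modules)                                    # True
# ...and bound it as an attribute of the parent package object
print(sys.modules["package1"].moduleB is sys.modules["package1.moduleB"])  # True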
This is because __init__.py represents itself as the package1 module object at runtime, so every .py file next to it is treated as a submodule, and rewriting __all__ will not make any difference. You can make another file, e.g. example.py, fill it with the same code as __init__.py, and it will raise a NameError.
I think the CPython runtime uses a special algorithm when __init__.py looks up variables, different from other Python files, maybe like this:
looking for a variable named "moduleB"
if not found:
    if __file__ == '__init__.py':  # don't raise NameError; look for a file named moduleB.py
        if the current dir contains a file named "moduleB.py":
            import moduleB
    else:
        raise NameError