How to import appropriately in Python

I have been wondering for a long time how to structure imports appropriately in Python, in the situation I'm about to describe:
Imagine I've got two modules I built myself, modulea and moduleb, and a third one, modulec, which uses both of them.
Furthermore, modulea uses moduleb.
The modules are all single files, and together they form part of a bigger package which has its own __init__.py file.
myproject/
    __init__.py      # Here I import modulea, b and c.
    modulea.py
    moduleb.py
    modulec.py
    main_module.py   # This is where I do all the heavy work.
I've created a file that imports all the modules I need, instead of importing them one by one in each script; otherwise, to know whether something was already imported, I would have to check file by file.
I could simply declare the required modules in each module, as you have been saying below in the comment box, but I think that's not practical, and it's easier to import them all in only one file.
Tell me whether this is good practice or not.
If I write:
import modulea
import moduleb
import modulec
It won't work (ImportError), because modulea does not itself import moduleb inside its code, and it is imported before the module it requires (moduleb).
If I did:
import moduleb
import modulea
import modulec
I guess this will work, because the running script already imports moduleb before modulea, so moduleb should also be accessible to modulea and not only to the running script; and I guess the same happens with modulec.
Supposing this is true (which I think it is), will it also work if the import is done on only one line? Are the modules imported and made available one by one, or are they all imported first and only enabled for use after the whole statement finishes?
import moduleb, modulea, modulec
And if it does work, does it noticeably change the processing time of the code, or is it basically the same, aesthetics aside?

You are overthinking way too much.
Each module must be "self-contained" in the sense that it satisfies its own dependencies. If modulea needs moduleb, you do:
# modulea.py
import moduleb
Then moduleb has to import its dependencies, if any:
# moduleb.py
import json # for instance
And finally, since modulec needs modulea and moduleb, you just import them:
# modulec.py
import modulea
import moduleb # <-- this import does not reload moduleb: it was already loaded by modulea and cached.
There is nothing more about this.
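If you want to see that caching in action, sys.modules (in the standard library) records every module that has been loaded, so a repeated import is just a dictionary lookup. A minimal sketch:
import sys

import json                         # first import: json is loaded and cached

print('json' in sys.modules)        # True: the module object is now cached

import json                         # second import: no reload, just a lookup
print(sys.modules['json'] is json)  # True: the exact same module object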
And if you just need one function from a module, you can import it directly into the namespace like this:
from moduleb import my_cool_function
As a style convention, you should put each import on its own line:
import json
from moduleb import my_cool_function
from moduleb import another_not_so_cool_function

According to the Python style guide (PEP 8), you should use a separate import statement for each module. The style guide will also tell you everything else you need to know about the conventions around importing modules.
import moduleb
import modulea
import modulec
Also, as some of the commenters have indicated, you don't seem to be importing these modules properly within each module, which is what creates the errors in this script.
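For completeness, PEP 8 also asks you to group imports: standard library first, then third-party packages, then local modules, with a blank line between groups. A sketch using the modules from the question:
# Standard library imports
import json
import os

# Local, first-party modules (one per line)
import moduleb
import modulea
import modulec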

You can put all of your imports into one place if you'd like.
myproject/internal.py
from moduleb import *
from modulea import *
from modulec import *
Now, your other modules can just import it:
myproject/modulec.py
from internal import *
Although it's usually preferable to import everything in the module that uses it, there are situations where that is not desirable. I have a project with many small scripts that test product functionality, and it would be a bother to repeat lots of imports in each of them.
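If you do go this route, one refinement worth considering (a sketch; the function names are the illustrative ones from the answer above) is to declare __all__ in internal.py, so that from internal import * only re-exports the names you intend rather than everything the three modules pulled in themselves:
# myproject/internal.py
from moduleb import *
from modulea import *
from modulec import *

# Only these names will be re-exported by `from internal import *`
__all__ = ['my_cool_function', 'another_not_so_cool_function']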

Related

__init__.py only import defined functions

I have a package that uses a __init__.py:
from .common import (funcA, funcB, funcC)
from .other import (funcD, funcE, funcF)
I am literally having to import every single function manually, because if I try to do import * it drags in all the modules used in common.py and other.py with it, i.e. things like os, re, sys, etc. get dragged into the package's namespace.
Is there a way to import all the functions that are actually defined in the file, rather than importing the entire namespace from the module?
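One way to get that effect, assuming you control common.py and other.py, is to declare __all__ inside each of those modules; from .common import * honours it and leaves os, re, sys and friends out of the package namespace. A minimal sketch with the function names from the question:
# common.py
import os
import re
import sys

# Only these names are picked up by `from .common import *`
__all__ = ['funcA', 'funcB', 'funcC']

def funcA(): ...
def funcB(): ...
def funcC(): ...
With that in place, the package __init__.py can use from .common import * without dragging in the helper modules.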

Python Importing modules in a package

I currently have a module I created that has a number of functions.
It's getting quite large so I figured I should make it into a package and split the functions up to make it more manageable.
I'm just testing out how this all works before I do this for real so apologies if it seems a bit tenuous.
I've created a folder called pack_test and in it I have:
__init__.py
foo.py
bar.py
__init__.py contains:
__all__ = ['foo', 'bar']
from . import *
import subprocess
from os import environ
In the console I can write import pack_test as pt and this is fine, no errors.
Typing pt. and hitting Tab twice shows me that pt.bar, pt.environ, pt.foo and pt.subprocess are all in there.
All good so far.
If I want to reference subprocess or environ in foo.py or bar.py, how do I do it there?
If in bar.py I have a function which just does return subprocess.call('ls'), it errors with NameError: name 'subprocess' is not defined. There must be something I'm missing which enables me to reference subprocess from the level above? Presumably, once I get the syntax for that, I can also call environ in a similar way?
The alternative, as I see it, would be to have import subprocess in both foo.py and bar.py, but it seems odd to me to have it appear across multiple files when I could have it once at a higher level, particularly if I went on to have a large number of files rather than just two as in this example.
TL;DR:
__init__.py:
from . import foo
from . import bar

__all__ = ["foo", "bar"]
foo.py:
import subprocess
from os import environ

# your code here
bar.py:
import subprocess
from os import environ

# your code here
There must be something I'm missing which enables me to reference subprocess from the level above?
Nope, this is the expected behaviour.
import loads a module (if it isn't loaded already), caches it in sys.modules (idem), and binds the imported names in the current namespace. Each Python module has (or "is") its own namespace (there's no real "global" namespace). In other words, you have to import what you need in each module: if foo.py needs subprocess, it must explicitly import it.
This can seem a bit tedious at first, but in the long run it really helps maintainability: you just have to read the imports at the top of your module (PEP 8: always put all imports at the beginning of the module) to know where a name comes from.
Also, you should not use star imports (aka wildcard imports, from xxx import *) anywhere other than your Python shell (and even then...): they're a maintenance time bomb. Not only do you not know where each name comes from, they're also a sure way to rebind an already-imported name. Imagine that your foo module defines a function func. Somewhere you have from foo import *; from bar import *, then later in the code a call to func. Now someone edits bar.py and adds a (distinct) func function, and suddenly your call fails, because you're no longer calling the func you expected. Now enjoy debugging this... And real-life examples are usually a bit more complex than this one.
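A minimal sketch of that failure mode (module and function names invented for the example):
# foo.py
def func():
    return "from foo"

# bar.py -- someone later adds an unrelated func here
def func():
    return "from bar"

# client.py
from foo import *
from bar import *   # silently rebinds `func`

print(func())       # prints "from bar", not the func you meant to call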
So if you value your mental sanity, don't be lazy, and don't try to be smart either; just do the simple, obvious thing: explicitly import the names you're interested in at the top of your modules.
(been here, done that etc)
You could create modules.py containing:
import subprocess
import os
Then in foo.py, or any of your files, just have:
from modules import *
Your import statements in your files then stay static; just update modules.py when you want to make an additional module accessible to them all.
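A variation on this, if you want to keep the single point of maintenance but avoid the star import (just a sketch, keeping the modules.py name from above), is to import the aggregator module itself so that every use stays qualified:
# foo.py
import modules

def list_dir():
    # the call site makes it obvious where subprocess comes from
    return modules.subprocess.call('ls')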

Python imported module dependency

Let's say I have a file main.py:
import math
import mymodule
print(math.ceil(5/3))
and then mymodule.py:
print(math.ceil(10/3))
mymodule.py gives an error that math is not defined, even though the module that imports it has math imported.
Considering that both main.py and mymodule.py need to use the math lib, do I need to import it twice? It just seems non-optimal. What's the most Pythonic way to solve this issue?
I know it's a dumb example, but I'm trying to split code I made into several modules for organization, and this issue has appeared multiple times at several levels.
Since you import mymodule within main, whatever mymodule imports becomes reachable from main through it.
You need to import math within mymodule; it is then accessible from main as mymodule.math.
There is then no need to import it again within main.
mymodule.py
import math
main.py
import mymodule
print(mymodule.math.pow(10, 2))
Result:
>>>
100.0
>>>
This is really very basic. If you have something in a separate file, like mymodule.py, then you can easily import its functions from any Python file in the same directory.
two files:
mymodule.py:
import math
def aFunc():
    return math.ceil(10/3)

# We could also just use this file as a standalone script
if __name__ == "__main__":
    print(aFunc())
main.py:
import mymodule
print(mymodule.aFunc())
You could also specifically call out the function you want to import.
main.py (alternative):
from mymodule import aFunc
print(aFunc())

Lazy loading module imports in an __init__.py file in Python

I was wondering if anyone had any suggestions for lazy loading imports in an __init__.py file? I currently have the following folder structure:
/mypackage
    __init__.py
    /core
        __init__.py
        mymodule.py
        mymodule2.py
The __init__.py file in the core folder has the following imports:
from mymodule import MyModule
from mymodule2 import MyModule2
This way I can just do:
from mypackage.core import MyModule, MyModule2
However, in the package __init__.py file, I have another import:
from core.exc import MyModuleException
This has the effect that whenever I import my package in Python, MyModule and MyModule2 get imported by default, because the core __init__.py file has already been run.
What I want to do, is only import these modules when the following code is run and not before:
from mypackage.core import MyModule, MyModule2
Any ideas?
Many thanks.
Unless I'm mistaking your intentions, this is actually possible, but it requires some magic.
Basically, subclass types.ModuleType and override __getattr__ to import on demand.
Check out Werkzeug's __init__.py for an example.
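A minimal sketch of the technique, for illustration only (this is not Werkzeug's actual code; it assumes Python 3.5+, where the class of a live module object can be swapped):
# mypackage/core/__init__.py
import importlib
import sys
import types

class LazyModule(types.ModuleType):
    # __getattr__ is only called when normal attribute lookup fails
    def __getattr__(self, name):
        if name.startswith('_'):
            raise AttributeError(name)
        module = importlib.import_module('.' + name, self.__name__)
        setattr(self, name, module)  # cache it so __getattr__ isn't hit again
        return module

sys.modules[__name__].__class__ = LazyModule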
You can't. Remember that when Python imports a module, it executes the module's code. The module itself doesn't know how it is being imported, hence it cannot know whether it has to import MyModule(2) or not.
You have to choose: either allow from mypackage.core import A, B and accept that from core.exc import E performs the unneeded imports, or do not import A and B in core/__init__.py, hence not allowing from mypackage.core import A, B.
Note: personally, I would not import MyModule(2) in core/__init__.py; instead I'd add an all.py module that does this, so the user can do from mypackage.core.all import A, B
and still have from mypackage.core.exc import TheException not load the unnecessary classes.
(Actually, the all module could even modify mypackage.core and add the classes to it, so that subsequent imports of the form from mypackage.core import MyModule, MyModule2 would work, but I think this would be quite obscure and should be avoided.)
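For concreteness, a sketch of that all.py idea (file and class names as used in the question):
# mypackage/core/all.py -- opt-in convenience imports
from mypackage.core.mymodule import MyModule
from mypackage.core.mymodule2 import MyModule2
Users who want everything do from mypackage.core.all import MyModule, MyModule2, while from mypackage.core.exc import MyModuleException stays cheap because core/__init__.py itself imports nothing heavy.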
If your module structure is like:
/mypackage
    __init__.py
    /core
        __init__.py
        MyModule.py
        MyModule2.py
or:
/mypackage
    __init__.py
    /core
        __init__.py
        /MyModule
            __init__.py
        /MyModule2
            __init__.py
then feel free to use
from mypackage.core import MyModule, MyModule2
without importing them in __init__.py under mypackage/core
Not sure if it applies here, but in general, lazy loading of modules can be done using the Importing package.
It works like this:
from peak.util.imports import lazyModule
my_module = lazyModule('my_module')
Now my_module is only really imported when you use it for the first time.
You can use the following code in the module's __init__.py:
import apipkg

apipkg.initpkg(__name__, {
    'org': {
        'Class1': "secure._mypkg:Class1",
        'Class2': "secure._mypkg2:Class2",
    }
})
I realize that this question was posted a very long time ago, and since then there have been some helpful updates that solve lazy loading of submodules. So for anyone else looking for how to solve this, we have great options available now.
Specifically, PEP 562.
Here is an excerpt from it:
Another widespread use case for __getattr__ would be lazy submodule imports. Consider a simple example:
# lib/__init__.py
import importlib

__all__ = ['submod', ...]

def __getattr__(name):
    if name in __all__:
        return importlib.import_module("." + name, __name__)
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")

# lib/submod.py
print("Submodule loaded")

class HeavyClass:
    ...

# main.py
import lib

lib.submod.HeavyClass  # prints "Submodule loaded"

How do I make these relative imports work in Python 3?

I have a directory structure that looks like this:
project/
    __init__.py
    foo/
        __init__.py
        first.py
        second.py
        third.py
    plum.py
In project/foo/__init__.py I import classes from first.py, second.py and third.py and put them in __all__.
There's a class in first.py named WonderfulThing which I'd like to use in second.py, and want to import by importing * from foo. (It's outside of the scope of this question why I'd like to do so, assume I have a good reason.)
In second.py I've tried from .foo import *, from foo import * and from . import * and in none of these cases is WonderfulThing imported. I also tried from ..foo import *, which raises an error "Attempted relative import beyond toplevel package".
I've read the docs and the PEP, and I can't work out how to make this work. Any assistance would be appreciated.
Clarification/Edit: It seems like I may have been misunderstanding the way __all__ works in packages. I was using it the same way as in modules:
from .first import WonderfulThing
__all__ = [ "WonderfulThing" ]
but looking at the docs again it seems to suggest that __all__ may only be used in packages to specify the names of modules to be imported by default; there doesn't seem to be any way to include anything that's not a module.
Is this correct?
A non-wildcard import failed (cannot import name WonderfulThing). Trying from . import foo failed, but import foo works. Unfortunately, dir(foo) shows nothing.
Edit: I did misunderstand the question. No, __all__ is not restricted to just modules.
One question is why you want to do a relative import. There is nothing wrong with doing from project.foo import * here. Secondly, the __all__ restriction on foo won't prevent you from doing from project.foo.first import WonderfulThing, or just from .first import WonderfulThing, which is still the best way.
And if you really want to import a lot of things, it's probably best to do from project import foo, and then use the names as foo.WonderfulThing, instead of doing an import * and then using WonderfulThing directly.
However, to answer your direct question: to import from the __init__.py file in second.py, you do this:
from . import WonderfulThing
or
from . import *
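Putting the pieces together, a sketch of the two files involved (names from the question):
# project/foo/__init__.py
from .first import WonderfulThing

__all__ = ['WonderfulThing']

# project/foo/second.py
from . import WonderfulThing   # imports the name from the package __init__

print(WonderfulThing)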
