importing modules best practice - python

I wrote a package with several modules
pkg/
├── __init__.py
├── mod.py
├── mod_calc.py
└── mod_plotting.py
mod.py uses functions from mod_calc.py and mod_plotting.py, so in mod.py I use:
# mod.py:
import pkg.mod_calc as MC
import pkg.mod_plotting as MP
MC.calculate_this()
MP.plot_this()
I reckon this is ok.
There are also scripts (Jupyter notebooks) with the suggested workflow, designed for users with very little Python knowledge, and I'd like them to use the functions as calculate_this() (defined in mod_calc.py) etc. (as opposed to mod_calc.calculate_this() etc.).
So here is what I'm currently doing:
# __init__.py:
from pkg.mod import *
from pkg.mod_calc import *
from pkg.mod_plotting import *
# script:
from pkg import *
do_this() # from mod.py
calculate_this() # from mod_calc.py
plot_this() # from mod_plotting.py
This way the user doesn't need to worry about which function is defined in which module. It works fine, but I understand that from ... import * is not best practice. So what is the pythonic way to do this?
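One common way to keep this pattern while taming the star-imports is to declare __all__ in each module; from ... import * then only re-exports the listed names. A minimal sketch, reusing the names from the question:
```
# pkg/mod_calc.py: list the public names; "from pkg.mod_calc import *"
# will then only pull in these
__all__ = ['calculate_this']

def calculate_this():
    ...
```
With an __all__ in each of mod.py, mod_calc.py and mod_plotting.py, the __init__.py shown above keeps working, but the package namespace (and the user's notebook namespace after from pkg import *) stays limited to the intended public API.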


How to build package like pandas/numpy where pd/np is an object with all the functions

As per the title, I am trying to build a Python package myself. I am already familiar with writing Python packages, having read the notes at https://packaging.python.org/en/latest/tutorials/packaging-projects/ and https://docs.python.org/3/tutorial/modules.html#packages. These gave me an idea of how to write a collection of classes/functions that I can import.
What I want is to write a package like pandas or numpy, where after a single import the package works as an "object": most/all of the functions hang off the package name with dot notation.
E.g. after importing
import pandas as pd
import numpy as np
The pd and np objects would have all the functions available, callable as pd.read_csv() or np.arange(), and running dir(pd) and dir(np) would list the various functions they provide. I tried looking at the pandas source code to try and replicate this behaviour, but I could not manage it; maybe there are some parts I am missing or misunderstanding. Any help or pointer in the right direction would be much appreciated.
As a more general example, I want to write a package and import it to have the functionality dotted after it, e.g. import pypack and then call pypack.FUNCTION(), instead of having to import the function explicitly with from pypack.module import FUNCTION and call FUNCTION(), or import it as a submodule.
I hope my question makes sense, as I have no formal training in writing software.
Let's assume you have a module (package) called my_library.
.
├── main.py
└── my_library/
    └── __init__.py
/my_library/__init__.py
def foo(x):
    return x
In your main.py you can import my_library
import my_library
print(my_library.foo("Hello World"))
The directory with __init__.py will be your package and can be imported.
Now consider an even deeper example.
.
├── main.py
└── my_library/
    ├── __init__.py
    └── inner_module.py
inner_module.py
def bar(x):
    return x
In your /my_library/__init__.py you can add
from .inner_module import bar

def foo(x):
    return x
You can use bar() in your main.py as follows:
import my_library
print(my_library.foo("Hello World"))
print(my_library.bar("Hello World"))
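The pypack.FUNCTION() behaviour asked about above works the same way: re-export the names in the package's __init__.py. A sketch, using the hypothetical pypack/module.py and FUNCTION from the question:
```
# pypack/__init__.py: pull selected names from submodules up to the
# package level, pandas-style
from .module import FUNCTION

# optional: restricts what "from pypack import *" exposes
__all__ = ['FUNCTION']
```
```
# user code
import pypack

pypack.FUNCTION()  # no need for "from pypack.module import FUNCTION"
```
This is essentially what pandas and numpy do in their top-level __init__.py files, just with many more names.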

Import submodule functions into the parent namespace

I have a package of commonly used utility functions which I import into tons of different projects, using different parts. I'm following patterns that I've seen in numpy, by doing things like:
utils/
    __init__.py
    math/
        __init__.py
        core.py
        stats.py
    plot/
        __init__.py
utils/__init__.py:
```
from . import math
from . import plot
```
math/__init__.py:
```
from . import core
from .core import *
from . import stats
from .stats import *

__all__ = []
__all__.extend(core.__all__)
__all__.extend(stats.__all__)
```
math/core.py:
```
__all__ = ['cfunc1', 'cfunc2']

def cfunc1(): ...
def cfunc2(): ...
```
math/stats.py:
```
__all__ = ['sfunc1', 'sfunc2']

def sfunc1(): ...
def sfunc2(): ...
```
The nice thing about this structure is I can just call the submodule functions from the higher level namespaces, e.g. utils.math.cfunc1() and utils.math.sfunc2().
The issue is: both core and stats take a long time to import, so I don't want to automatically import them both from math.__init__.
Is there any way to not import both stats and core by default, and instead use something like import utils.math.core, but still have the functions in core end up in the math namespace? i.e. be able to do:
import utils.math.core
utils.math.cfunc1()
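One possible technique (a sketch, not an official recipe; the function bodies are placeholders) is for core.py to register its public names on the parent package as a side effect of being imported, so import utils.math.core populates utils.math without ever touching stats:
```
# utils/math/core.py
import sys

__all__ = ['cfunc1', 'cfunc2']

def cfunc1(): ...
def cfunc2(): ...

# by the time this module body runs, the parent package (utils.math)
# is guaranteed to be in sys.modules, so we can hang our names on it
_parent = sys.modules[__package__]
for _name in __all__:
    setattr(_parent, _name, globals()[_name])
```
```
# user code: stats.py is never imported
import utils.math.core

utils.math.cfunc1()
```
A module-level __getattr__ (PEP 562, shown in the last answer on this page) is a lazier alternative that avoids the explicit import utils.math.core entirely.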

__init__.py :: make a submodule's routines available at top-level import

Assume the following structure...
root/test.py
root/myapp/__init__.py
root/myapp/myapp.py # contains My_App_Class()
root/myapp/SomeObject.py # contains My_Obj_Class()
If my __init__.py contains the following structure:
from myapp import *
import SomeObject
__all__ = ['SomeObject']
I want to be able to call myapp and have the routines pre-extracted from myapp/myapp.py so that I can do the following in test.py:
import myapp
if 'My_App_Class' in myapp.__dict__.keys():
    print 'success'
else:
    print 'fail'

if 'My_Obj_Class' in myapp.SomeObject.__dict__.keys():
    print 'success'
else:
    print 'fail'
so that I can effectively collapse from myapp.myapp import * into from myapp import *
In [1]: %run test.py
fail
success
The __all__ attribute in your __init__.py file is what is preventing other names (imported from myapp/myapp.py) from being visible when using the myapp package.
Just don't use the __all__, and maybe, to avoid ambiguity, change the import written as:
from myapp import *
to
from .myapp import *
In other words, your __init__.py file should be simply:
from .myapp import *
import SomeObject
and no other line.
Your test.py, on the other hand, is further incorrect: if it starts with import myapp, all of myapp/myapp.py's names will be in the myapp namespace. You should change that first line to from myapp import *, as you put further down in the question.
Mandatory note: you should avoid doing import * and its variants in Python projects that import names from packages. Python programs and projects should always import individual names from packages/modules: with the name written explicitly, anyone reading your code will know exactly where each function or class you use came from. Moreover, IDEs and other programming tools will be able to check for non-existent or inconsistent names and warn you.
However, note it is ok to use from module import * inside a package's __init__.py to get the names exposed (in the __all__ export) in each of the package's modules into the package namespace itself - which is what you want.

What is the most pythonic way to import 'sibling' modules into one another?

By 'sibling' modules, I mean two submodules that exist at the same depth within a parent module.
I'm trying to create a flask project using Flask-Restful, and it recommends structuring the project using this schema:
myapi/
    __init__.py
    app.py          # this file contains your app and routes
    resources/
        __init__.py
        foo.py      # contains logic for /Foo
        bar.py      # contains logic for /Bar
    common/
        __init__.py
        util.py     # just some common infrastructure
I really like this structure, but I'm not sure how to import something from the 'common' module into the 'resources' module. Can anyone help me out?
In common/__init__.py:
from myapi.common.util import A, B
In resources/foo.py:
from myapi.common import A
You can also use relative imports in __init__.py, like from .util import A.
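Put together, a minimal sketch (A and B stand in for whatever util.py actually defines):
```
# myapi/common/util.py
def A(): ...
def B(): ...
```
```
# myapi/resources/foo.py: absolute import of the sibling module
from myapi.common.util import A

# the relative equivalent climbs one package level with "..":
# from ..common.util import A
```
Both forms work as long as myapi is imported as a package (e.g. the app is run from the directory containing myapi/, or the package is installed).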

Lazy loading module imports in an __init__.py file python

I was wondering if anyone had any suggestions for lazy loading imports in an __init__.py file? I currently have the following folder structure:
/mypackage
    __init__.py
    /core
        __init__.py
        mymodule.py
        mymodule2.py
The __init__.py file in the core folder contains the following imports:
from mymodule import MyModule
from mymodule2 import MyModule2
This way I can just do:
from mypackage.core import MyModule, MyModule2
However, in the package's __init__.py file, I have another import:
from core.exc import MyModuleException
This has the effect that whenever I import my package in Python, MyModule and MyModule2 get imported by default, because core's __init__.py has already been run.
What I want to do, is only import these modules when the following code is run and not before:
from mypackage.core import MyModule, MyModule2
Any ideas?
Many thanks.
Unless I'm mistaking your intentions, this is actually possible but requires some magic.
Basically, subclass types.ModuleType and override __getattr__ to import on demand.
Check out the Werkzeug __init__.py for an example.
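A minimal sketch of that trick (the module paths and class names are illustrative, not Werkzeug's actual table): subclass types.ModuleType with a __getattr__ that resolves attributes on first access, then swap the class of the already-imported package module:
```
# mypackage/core/__init__.py
import importlib
import sys
import types

# attribute name -> module that actually defines it
_lazy = {
    'MyModule': 'mypackage.core.mymodule',
    'MyModule2': 'mypackage.core.mymodule2',
}

class _LazyModule(types.ModuleType):
    def __getattr__(self, name):
        # only called when normal attribute lookup fails
        if name in _lazy:
            value = getattr(importlib.import_module(_lazy[name]), name)
            setattr(self, name, value)  # cache: next access skips __getattr__
            return value
        raise AttributeError(name)

# swap in the subclass so attribute misses trigger the import
sys.modules[__name__].__class__ = _LazyModule
```
After this, import mypackage.core is cheap, and mypackage.core.MyModule triggers the real import on first use.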
You can't. Remember that when Python imports a module it executes the code in it. The module itself doesn't know how it is imported, hence it cannot know whether it has to import MyModule(2) or not.
You have to choose: either allow from mypackage.core import A, B and accept that from core.exc import E performs the unneeded imports, or do not import A and B in core/__init__.py, hence not allowing from mypackage.core import A, B.
Note: personally, I would not import MyModule(2) in core/__init__.py; instead, I'd add an all.py module that does this, so the user can do from mypackage.core.all import A, B and still have from mypackage.core.exc import TheException not load the unnecessary classes.
(Actually, the all module could even modify mypackage.core and add the classes to it, so that subsequent imports of the form from mypackage.core import MyModule, MyModule2 work, but I think this would be quite obscure and should be avoided.)
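A sketch of the suggested all.py, with the module names from this question:
```
# mypackage/core/all.py: explicit opt-in that eagerly imports the
# heavy classes
from mypackage.core.mymodule import MyModule
from mypackage.core.mymodule2 import MyModule2
```
```
# user code
from mypackage.core.all import MyModule, MyModule2  # pays the import cost
from mypackage.core.exc import MyModuleException    # stays cheap
```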
If your module structure is like:
/mypackage
    __init__.py
    /core
        __init__.py
        MyModule.py
        MyModule2.py
or:
/mypackage
    __init__.py
    /core
        __init__.py
        /MyModule
            __init__.py
        /MyModule2
            __init__.py
then feel free to use
from mypackage.core import MyModule, MyModule2
without importing them in __init__.py under mypackage/core
Not sure if it applies here, but in general lazy loading of modules can be done using the Importing package.
Works like this:
from peak.util.imports import lazyModule
my_module = lazyModule('my_module')
Now my_module is only really imported the first time you use it.
You can use the following code in the package's __init__.py:
import apipkg

apipkg.initpkg(__name__, {
    'org': {
        'Class1': "secure._mypkg:Class1",
        'Class2': "secure._mypkg2:Class2",
    },
})
I realize that this question was posted a very long time ago, and since then there have been some helpful updates that solve lazy loading of submodules. So, for anyone else looking for how to solve this, we have great options available now.
Specifically PEP 562
Here is an excerpt from the PEP:
Another widespread use case for __getattr__ would be lazy submodule imports. Consider a simple example:
# lib/__init__.py
import importlib

__all__ = ['submod', ...]

def __getattr__(name):
    if name in __all__:
        return importlib.import_module("." + name, __name__)
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
# lib/submod.py
print("Submodule loaded")

class HeavyClass:
    ...
# main.py
import lib
lib.submod.HeavyClass # prints "Submodule loaded"
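PEP 562 also covers a module-level __dir__, so the lazy names still show up in dir(lib) and in tab completion even before they are imported; a short companion sketch (assuming __all__ holds the submodule names as strings):
```
# lib/__init__.py, continued
def __dir__():
    return sorted(set(globals()) | set(__all__))
```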
