I have a package of commonly used utility functions which I import into tons of different projects, using different parts. I'm following patterns that I've seen in numpy, by doing things like:
utils/
__init__.py
```
from . import math
from . import plot
```
math/
__init__.py
```
from . import core
from .core import *
from . import stats
from .stats import *
__all__ = []
__all__.extend(core.__all__)
__all__.extend(stats.__all__)
```
core.py
```
__all__ = ['cfunc1', 'cfunc2']
def cfunc1(): ...
def cfunc2(): ...
```
stats.py
```
__all__ = ['sfunc1', 'sfunc2']
def sfunc1(): ...
def sfunc2(): ...
```
plot/
__init__.py
The nice thing about this structure is that I can just call the submodule functions from the higher-level namespaces, e.g. utils.math.cfunc1() and utils.math.sfunc2().
The issue is: both core and stats take a long time to import, so I don't want to automatically import them both from math.__init__.
Is there any way to not import both stats and core by default, and instead use something like import utils.math.core, but still have the functions in core end up in the math namespace? i.e. be able to do:
import utils.math.core
utils.math.cfunc1()
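One approach that fits this ask is a lazy-loading __init__.py using a module-level __getattr__ (PEP 562, Python 3.7+). The sketch below assumes the core/stats layout above; the name-to-submodule map is hypothetical and would have to mirror core.__all__ and stats.__all__:
```
# utils/math/__init__.py -- a minimal sketch, assuming Python 3.7+ and the
# core/stats layout above; the name-to-submodule map is hypothetical and must
# stay in sync with core.__all__ and stats.__all__.
import importlib

_name_to_module = {
    'cfunc1': 'core', 'cfunc2': 'core',
    'sfunc1': 'stats', 'sfunc2': 'stats',
}

def __getattr__(name):
    # PEP 562 hook: only runs when `name` is not already an attribute of utils.math.
    if name in ('core', 'stats'):
        return importlib.import_module('.' + name, __name__)
    if name in _name_to_module:
        module = importlib.import_module('.' + _name_to_module[name], __name__)
        return getattr(module, name)
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
```
With this, import utils.math stays cheap, and utils.math.cfunc1() triggers the import of core only; stats is untouched until one of its names is requested.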
I wrote a package with several modules
pkg/
├── __init__.py
├── mod.py
├── mod_calc.py
└── mod_plotting.py
mod.py uses functions from mod_calc.py and mod_plotting.py, so in mod.py I use:
# mod.py:
import pkg.mod_calc as MC
import pkg.mod_plotting as MP
MC.calculate_this()
MP.plot_this()
I reckon this is ok.
There are also scripts (Jupyter notebooks) with the suggested workflow, designed for users with very little Python knowledge, and I'd like them to call the functions as calculate_this() (defined in mod_calc.py) etc., as opposed to mod_calc.calculate_this() etc.
So here is what I'm currently doing:
# __init__.py:
from pkg.mod import *
from pkg.mod_calc import *
from pkg.mod_plotting import *
# script:
from pkg import *
do_this() # from mod.py
calculate_this() # from mod_calc.py
plot_this() # from mod_plotting.py
This way the user doesn't need to worry about which function is defined in which module. It works fine, but I understand that from ... import * is not best practice. So what is the pythonic way to do this?
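One commonly recommended alternative (a sketch, reusing the function names from the question) is to re-export the public names explicitly in __init__.py and declare __all__, so nothing relies on a wildcard import:
```
# pkg/__init__.py -- explicit re-exports instead of `from ... import *` (sketch)
from pkg.mod import do_this
from pkg.mod_calc import calculate_this
from pkg.mod_plotting import plot_this

__all__ = ['do_this', 'calculate_this', 'plot_this']
```
Notebook users can then write from pkg import calculate_this, plot_this (or import pkg and call pkg.calculate_this()), and they still never need to know which module defines which function.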
This is a question I have seen asked in many places, but not answered fully. I am using Python 3.7+. I want to understand how to import a class into another file located in any subdirectory. For example, if I have the following:
App/
├── __init__.py
├── Trending/
│   ├── __init__.py
│   ├── Trending.py          (holds the Trending class and get_trending function)
│   └── TrendingUtils/
│       ├── __init__.py
│       └── TrendingUtils.py (holds the TrendingUtils class and get_trend_utils function)
├── Utils/
│   ├── __init__.py
│   └── Utils.py             (holds the Utils class and get_utils function)
└── Top/
    ├── __init__.py
    ├── Top.py               (holds the Top class and get_top function)
    └── TopUtils/
        ├── __init__.py
        └── TopUtils.py      (holds the TopUtils class and get_top_utils function)
How can I import the get_trend_utils function in the Trending.py file?
How can I import the get_trend_utils function in the Utils.py file?
How can I import the get_trend_utils function in the TopUtils.py file?
How can I use Utils.py (and its functions) in both Trending.py and Top.py?
How can I use get_top_utils in Trending.py?
The same questions apply the deeper we go into subdirectories. I have read in different places that using the sys module can achieve this, but it seems clunky and not like best practice. I would like to do something like this, if possible:
In Trending.py:
import TrendingUtils
import Utils
var = TrendingUtils.get_trend_utils()
utils = Utils.get_utils()
also:
In Top.py
import Utils
import TrendingUtils
import TopUtils
var1 = TrendingUtils.get_trend_utils()
var2 = Utils.get_utils()
var3 = TopUtils.get_top_utils()
In TopUtils.py:
from .. import TrendingUtils
from . import Top
var1 = TrendingUtils.get_trend_utils()
var2 = Top.get_top()
The __init__.py files are all empty files. I have also seen that adding '.' will move up one directory, but trying something like
In Top.py:
from . import Utils
from Trending import TrendingUtils
also gives me errors. I have tried all the possible combinations of the above that I can think of. I have also tried adding imports to the __init__.py files, but I keep getting import errors. The only solution that seems to work is something like:
In Trending.py:
import os
import sys

PROJECT_ROOT = os.path.abspath(os.path.join(
    os.path.dirname(__file__),
    os.pardir,
))
sys.path.append(PROJECT_ROOT)

from Utils import get_utils
Again, this seems very clunky and dirty, and more importantly, not portable to other OSs and other systems/computers. There must be a better way and I am just missing it. Thank you so much for the help.
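For reference, the usual fix is to treat App as the top-level package and use absolute imports from its root, without touching sys.path. A minimal sketch, assuming the layout above (Utils as a sibling of Trending and Top under App) and that the code is run from App's parent directory, e.g. with python -m App.Trending.Trending, or with the package installed:
```
# App/Trending/Trending.py -- sketch; the import paths assume the tree shown above
from App.Utils.Utils import get_utils
from App.Trending.TrendingUtils.TrendingUtils import get_trend_utils

def get_trending():
    # Combine helpers from a sibling package (Utils) and a nested sub-package.
    return get_utils(), get_trend_utils()
```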
As per the title, I am trying to build a Python package myself. I am already familiar with writing Python packages from reading https://packaging.python.org/en/latest/tutorials/packaging-projects/ and https://docs.python.org/3/tutorial/modules.html#packages. These gave me an idea of how to write classes and functions that I can import.
What I want is to write a package like pandas or numpy, which I can import and use like an "object", that is to say with most/all of the functions dotted after the package name.
E.g. after importing
import pandas as pd
import numpy as np
The pd and np would then have all the functions available, callable as pd.read_csv() or np.arange(), and running dir(pd) and dir(np) would list all the various functions available from them. I tried looking at the pandas source code to try and replicate this functionality. However, I could not do it. Maybe there are some parts that I am missing or misunderstanding. Any help or a pointer in the right direction would be much appreciated.
In a more general example, I want to write a package and import it so that its functionality is dotted after it, e.g. import pypack and then call pypack.FUNCTION(), instead of having to import the function with from pypack.module import FUNCTION and call FUNCTION(), or importing it as just a submodule.
I hope my question makes sense, as I have no formal training in writing software.
Let's assume you have a module (package) called my_library.
.
├── main.py
└── my_library/
└── __init__.py
/my_library/__init__.py
def foo(x):
return x
In your main.py you can import my_library
import my_library
print(my_library.foo("Hello World"))
The directory with __init__.py will be your package and can be imported.
Now consider an even deeper example.
.
├── main.py
└── my_library/
├── __init__.py
└── inner_module.py
inner_module.py
def bar(x):
return x
In your /my_library/__init__.py you can add
from .inner_module import bar
def foo(x):
return x
You can use bar() in your main.py as follows:
import my_library
print(my_library.foo("Hello World"))
print(my_library.bar("Hello World"))
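Tying this back to the dir() observation in the question: once __init__.py defines or re-exports the names, they show up as attributes of the package itself.
```
# main.py -- both names now live directly on the package, like pd.read_csv on pandas
import my_library

print('foo' in dir(my_library))  # True
print('bar' in dir(my_library))  # True
```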
I'd like to import all my packages in only one file.
Let's assume that I have a main.py file where I use all my classes (from other .py files located in a src folder):
.
├── main.py
└── src/
    ├── package1.py
    └── package2.py
the main.py looks like this:
from src.package1 import *
from src.package2 import *
def main():
    class1 = ClassFromPackage1()
    class2 = ClassFromPackage2()

if __name__ == '__main__':
    main()
In package1.py I import, let's say, numpy, scipy and pandas:
import numpy
import scipy
import pandas
class ClassFromPackage1:
    # Do stuff using numpy, scipy and pandas
    ...
and in package2.py I use numpy and scikit-learn:
import numpy
import sklearn
class ClassFromPackage2:
    # Do stuff using numpy and sklearn
    ...
Is there a way to import all packages in one file Foo.py where I only write:
import numpy
import sklearn
import scipy
import pandas
and import this Foo.py in the src .py files? Like this, for example, with package1.py:
import Foo
class ClassFromPackage1:
    # Do stuff using numpy, scipy and pandas
    ...
Is this a good idea? Does it reduce memory consumption? Will it help Python start main.py faster?
It looks like you want to make the code cleaner? What you can do is create a file like foo.py and put all the imports in it. Then you can import the modules inside foo.py by doing
from foo import *
This will indirectly import all modules.
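Concretely, that would look something like this (a sketch, assuming foo.py sits somewhere importable from package1.py, e.g. the same directory on sys.path):
```
# foo.py -- gathers the shared third-party imports in one place
import numpy
import scipy
import pandas
import sklearn
```
```
# package1.py
from foo import *  # numpy, scipy, pandas, sklearn are now visible in this module

class ClassFromPackage1:
    def __init__(self):
        self.values = numpy.arange(3)  # hypothetical use, just to show the names resolve
```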
The way you have already done it is how it is usually done. Similar to header files in C/C++, you make the dependencies explicit, and that is a good thing.
You asked if it will run faster; the answer is no. All imports are shared: a module is executed only once and then cached in sys.modules, so importing it again elsewhere is essentially free. This sometimes causes unwanted side effects, but that is not the question here.
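To see what "all imports are shared" means in practice: after the first import, the module object is cached in sys.modules, and every later import just reuses it.
```
import sys
import numpy                         # first import: the module is actually loaded and executed

print('numpy' in sys.modules)        # True -- it is now cached
import numpy as np2                  # no reload; the cached module object is reused
print(np2 is sys.modules['numpy'])   # True -- same object everywhere
```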
Assume the following structure...
root/test.py
root/myapp/__init__.py
root/myapp/myapp.py # contains My_App_Class()
root/myapp/SomeObject.py # contains My_Obj_Class()
If my __init__.py contains the following structure:
from myapp import *
import SomeObject
__all__ = ['SomeObject']
I want to be able to call myapp and have the routines pre-extracted from myapp/myapp.py so that I can do the following in test.py:
import myapp
if 'My_App_Class' in myapp.__dict__.keys():
    print('success')
else:
    print('fail')

if 'My_Obj_Class' in myapp.SomeObject.__dict__.keys():
    print('success')
else:
    print('fail')
so that I can effectively collapse from myapp.myapp import * into from myapp import *
In [1]: %run test.py
fail
success
The __all__ attribute in your __init__.py file is what is preventing other names (imported from myapp/myapp.py) from being visible when using the myapp package.
Just don't use the __all__ - and maybe, to avoid ambiguity, change the import written as:
from myapp import *
to
from .myapp import *
In other words, your __init__.py file should be simply:
from .myapp import *
from . import SomeObject
and no other line.
Your test.py, on the other hand, is further incorrect: if it starts with import myapp, all of myapp/myapp.py's names will be available in the myapp namespace. You should change that first line to from myapp import *, as you put further down in the question.
Mandatory note: you should avoid doing import * and its variants in Python projects that import names from packages. Python programs and projects should always import individual names from packages/modules - with the names written explicitly, anyone reading your code will know exactly where each function or class you use came from. Moreover, IDEs and other programming tools will be able to check for non-existing or inconsistent names and warn you.
However, note that it is ok to use from module import * inside a package's __init__.py to get the names exposed (in the __all__ export) of each of the package's modules into the package namespace itself - which is what you want.
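For completeness, a small sketch of how __all__ in the submodule then controls exactly what that * import pulls into the package namespace (the helper name below is hypothetical):
```
# myapp/myapp.py -- __all__ decides what `from .myapp import *` in __init__.py exposes
__all__ = ['My_App_Class']

class My_App_Class:
    pass

def _internal_helper():
    # Not listed in __all__ (and underscore-prefixed), so it stays out of the
    # package namespace when __init__.py does `from .myapp import *`.
    pass
```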