How to write nested __init__.py files - python

I am struggling with nested __init__.py files in a Python package I am writing. The package has the following architecture:
module/
├── __init__.py
├── submodule1
│   ├── __init__.py
│   └── source.py
└── submodule2
    ├── __init__.py
    ├── source.py
    └── subsubmodule2
        ├── __init__.py
        └── source.py
My intent is to be able to access functions defined in submodule2/source.py through module.submodule2.function, and functions in subsubmodule2/source.py through module.submodule2.subsubmodule2.function.
The first thing I tried was to define __init__.py in submodule2 this way:
from .subsubmodule2 import *
But doing so, I get the functions defined in subsubmodule2/source.py through module.submodule2.function (and module.function).
If I do:
from . import subsubmodule2
I get these functions through module.subsubmodule2.function.
I also tried defining __all__ in the __init__.py files, with no more success. If I understand the Python documentation correctly, I could leave the __init__.py files empty and it would work, but from my understanding that is not best practice either.
What would be the best way to access these functions as intended in my module?

In module's __init__.py, import the submodules you want to expose:
from . import submodule1
from . import submodule2
__all__ = ['submodule1', 'submodule2']
Now, in submodule2's __init__.py write:
from . import source
from . import subsubmodule2
# if you want functions from source available directly, import them here;
# or define __all__ in source.py and list the functions you want to expose
__all__ = ['source', 'subsubmodule2']
Finally, in subsubmodule2's __init__.py, re-export the functions or classes you want available:
from .source import *
__all__ = ['source']
# if you use a * import, I suggest defining __all__ in source.py and listing every exposed function there
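With those three __init__.py files in place, usage looks like this (a quick sketch; function stands in for whatever the source.py files actually define):
import module

module.submodule2.source.function()          # source is exposed as an attribute by `from . import source`
module.submodule2.subsubmodule2.function()   # re-exported by `from .source import *`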

The __init__.py file represents its respective package. For example, module/submodule2/__init__.py represents module.submodule2.
In order to pull objects defined in submodules into their package namespace, import them:
# module/submodule2/__init__.py
from .source import *
Since __init__.py is a regular Python module, one can also forgo a separate .source module and define objects directly inside __init__.py:
# module/submodule2/__init__.py
def function():
    ...
Note that subpackages themselves are already available under their respective names. One does not have to (and in fact should not) import them in the parent module; they will be imported when code using the package imports them.
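For example, assuming subsubmodule2/__init__.py contains the same kind of re-export, a consumer pulls in the nested package explicitly (a minimal sketch; function is a hypothetical name):
import module.submodule2.subsubmodule2   # loads module, then submodule2, then subsubmodule2

module.submodule2.function()                 # re-exported by submodule2/__init__.py
module.submodule2.subsubmodule2.function()   # re-exported by subsubmodule2/__init__.py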

Related

Use own modules in beeware

I have a BeeWare project and also want to use my own modules in it, like Models and Controllers, as well as a module which creates some objects I can test with.
But when I want to import the module to create the test objects and use the method, it just throws an error:
ImportError: attempted relative import beyond top-level package
After some research, I know that the directory structure (where I put my modules, and where the package is) is important. But wherever I put the modules I get the same (or similar) errors. I can, however, import my Models to create objects of these classes. I also can't figure out where the entry point of the briefcase app is.
Here is my structure currently:
/Project_Dir (own created)
    /briefcase_project (created by briefcase)
        /src
            /Models (own created)
            /app_directory (created by briefcase)
                here are __main__.py and __init__.py (the entry point, I guess) and app.py (where the BeeWare code is, and also my module import from Test)
            /Test (own created; contains a file with a method I want to call)
Sadly there is not much to find about BeeWare, so I couldn't find a solution.
Please help. Thanks ^^
I did the following to work around the issue. The example, using the BeeWare Tutorial 2 source code, is on GitHub:
.
├── __init__.py
├── __main__.py
├── app.py
├── mylib <--- # my lib.
│   ├── __init__.py
│   └── testlib.py
└── resources
    ├── __init__.py
    ├── beewarecustomlibexample.icns
    ├── beewarecustomlibexample.ico
    └── beewarecustomlibexample.png

2 directories, 9 files
The mylib/testlib.py:
def test(text: str) -> str:
    return f"Hello: {text}"
In the app.py:
import toga
from toga.style import Pack
from toga.style.pack import COLUMN, ROW

from beewarecustomlibexample.mylib.testlib import test  # Import custom lib


class BeewareCustomLibExample(toga.App):
    def startup(self):
        ...

    def say_hello(self, widget):
        # Calling my test method
        result = test(self.name_input.value)
        self.main_window.info_dialog("Test Dialog", result)


def main():
    return BeewareCustomLibExample()
The above is how I got it working. I built it on macOS and it works fine.
Take your project folder name and import from there. For example, if you're tinkering with the tutorial and you've set up a module folder called myModule in the same directory as your app.py, with a file called file.py containing a class called myClass, you would type:
from helloworld.myModule.file import myClass
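For instance, a hypothetical myModule/file.py to go with that import could be as small as:
# myModule/file.py (hypothetical example)
class myClass:
    def greet(self) -> str:
        return "hello from myClass"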

AWS Lambda fails because it can't import a variable outside lambda_handler

I checked all the questions asked here before and didn't find an answer.
I have the following structure in my project:
project/
└── lambdas
    ├── __init__.py
    ├── lambda_handler_1.py    # def main(event, context)
    ├── lambda_handler_2.py
    └── my_lib
        ├── __init__.py        # Here I have `import my_lib.utils` and `import my_lib.exceptions`
        ├── exceptions.py
        ├── utils.py
        └── api
            ├── __init__.py
            └── some_api.py    # Here I do `from my_lib.utils import my_func` and `from my_lib.exceptions import MyException`
When the trigger runs the Lambdas, I get this error:
Runtime.ImportModuleError: Unable to import module 'lambdas/lambda_handler_1': cannot import name 'my_func' from 'my_lib.errors' (/var/task/my_lib/utils.py)
I do not use any function from the utils module in my Lambda. What am I doing wrong?
I archived all my code and requirements into a zip file and deployed it. There aren't any __pycache__ directories or the like.
Thanks
A Lambda function can contain multiple files.
If you use utils.py only in that one function, then include its source as another file of that function.
In that case, put each Lambda function in its own folder (not individual files), add utils.py to the folder of the Lambda that needs it, then use the SAM CLI to build/package/deploy your Lambdas.
If you use it in multiple places, you should package it as a Lambda Layer. Lambda Layers are a pain to update, however.
PS: you don't need to install Docker when installing the SAM CLI.
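As an aside, the traceback in the question points at absolute imports inside my_lib. One way to sidestep mismatches between the package name and the zip layout is to use package-relative imports; a minimal sketch for my_lib/api/some_api.py, reusing the names from the question:
# my_lib/api/some_api.py
from ..utils import my_func            # resolves to my_lib/utils.py
from ..exceptions import MyException   # resolves to my_lib/exceptions.py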

Python imports - why is the main package name required?

I would like to know why I need to include the main directory name in import statements within my project's directory structure.
My project's structure
.
├── main.py
├── myModel
│   ├── __init__.py
│   ├── loader
│   │   ├── __init__.py
│   │   └── dataset_loader.py
│   └── models
│       ├── __init__.py
│       ├── static_model.py
│       └── task_model.py
└── tree.txt
main.py
from myModel import loader
from myModel import models
loader.fun()
loader.dataset_loader.fun()
myModel/__init__.py
import myModel.models
import myModel.loader
myModel/loader/__init__.py
from myModel.loader.dataset_loader import *
myModel/models/__init__.py
from myModel.models.static_model import StaticModel
My first question is why I need to repeat myModel even in subfolders of the myModel directory. I tried removing it, but the imports didn't work, so I think it needs to be there.
Secondly, why can I call fun directly from loader rather than through the fully qualified path?
I read some things on the web, but I still have trouble understanding why this happens.
Absolute imports, like import x.y.z or from x.y import z, require x to be on your path. In your specific case, myModel is on the path because of your working directory. The sub-packages are not on the path, and can therefore only be accessed by reiterating the root package.
A more intuitive approach might be to use relative imports. This is possible because all your files live in proper packages with __init__.py files. Keep in mind that relative imports imply that your modules are designed to live in your package structure and not on their own; otherwise, you may end up with errors when you try to run some of the modules as standalone scripts.
Change myModel/__init__.py to:
from . import models
from . import loader
The . makes the import relative. Notice that I did not suggest changing main.py, since it lives outside your packages. Adding more dots lets you go up more levels in the file hierarchy.
Change myModel/loader/__init__.py to
from .dataset_loader import *
and myModel/models/__init__.py to
from .static_model import StaticModel
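As a hypothetical illustration of the extra dots: a module inside myModel/models could reach the sibling loader package with two dots, going up one package level:
# inside myModel/models/static_model.py (hypothetical addition)
from ..loader import dataset_loader   # two dots: up from myModel.models to myModel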
An import statement binds a name in your local namespace (usually the module you are executing it in). The name that is bound depends on which form of import you used:
import x binds the module described in x.py or x/__init__.py to the name x
import x.y binds the module described in x.py or x/__init__.py to the name x, and ensures that x has an attribute y, either as an attribute defined in x.py/__init__.py, or as a sub-module in x/y.py.
from x import y binds the attribute or sub-module y from x.py/x/__init__.py or x/y.py to the name y. This form loads x, but does not bind the name x.
When you run from myModel import loader, you get a module object loader that has a callable attribute fun.
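A short demo of the three forms against this layout (fun is the function the question calls; it reaches loader via the * import in loader/__init__.py):
import myModel               # binds only the name myModel
import myModel.loader        # also binds myModel; loader is reachable as an attribute
from myModel import loader   # binds loader directly; myModel itself is not bound

loader.fun()                 # works: fun was pulled into loader by the * import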

Import function from submodule in __init__.py without exposing submodule

I'm working on a Python project with a directory structure similar to this:
foo/
├── bar
│   ├── bar1.py
│   ├── bar2.py
│   └── __init__.py
└── __init__.py
Where the module bar1 defines the function function1.
I would like to have users of my code import function1 (and nothing else) directly from foo, i.e. via from foo import function1. Fair enough, that can be achieved with the following foo/__init__.py:
from .bar.bar1 import function1
__all__ = ['function1']
The problem now is that someone running import foo in e.g. the REPL will still be presented with foo.bar alongside foo.function1 when trying to autocomplete foo.. Is there a way to "hide" the existence of bar from users without changing its name to _bar?
I might be going about this the wrong way altogether, so I'm open to suggestions on how to restructure my code, but I would like to avoid renaming modules.
You can hide it by deleting the bar reference in foo/__init__.py:
from .bar.bar1 import function1
__all__ = ['function1']
del bar
The existence of __all__ only affects the from <module> import * behavior.
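A quick sketch of what the deletion buys you, and what it does not:
import foo

foo.function1()       # works: re-exported in foo/__init__.py
hasattr(foo, 'bar')   # False, so completion on foo. no longer offers bar

from foo.bar.bar1 import function1   # the submodule itself remains importable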

Python: accessing modules from a package that is distributed over different directories

I have a question regarding a single package that is distributed over multiple directories.
Let's say I have these two files and directories:
~/lib/python
    xxx
        __init__.py
        util
            __init__.py
            module1.py
            module2.py

~/graphics/python
    xxx
        __init__.py
        misc
            __init__.py
            module3.py
            module4.py
So then in my Python modules, I did this:
import sys
pythonlibpath = '~/lib/python'
if pythonlibpath not in sys.path: sys.path.append(pythonlibpath)
import xxx.util.module1
which works.
Now, the problem is that I need xxx.misc.module3, so I did this:
import sys
graphicslibpath = '~/graphics/python'
if graphicslibpath not in sys.path: sys.path.append(graphicslibpath)
import xxx.misc.module3
but I get this error:
ImportError: No module named misc.module3
It seems like it somehow still remembers that there was an xxx package in ~/lib/python and tries to find misc.module3 there.
How do I get around this issue?
You can't, without an extreme amount of trickery that pulls one package structure into the other. Python requires that all modules in a package live under a single directory. See the os module's source to learn how it handles os.path.
Python does indeed remember that there was an xxx package. This is pretty much necessary to achieve acceptable performance; once modules and packages are loaded, they are cached. You can see which modules are loaded by looking at the dictionary sys.modules.
sys.modules is a normal dictionary, so you can remove a package from it to force it to be reloaded, like below:
import sys
print(sys.modules)
import xml
print(sys.modules)
del sys.modules['xml']
print(sys.modules)
Notice that after importing the xml package it is in the dictionary, and it is possible to remove it from that dictionary too. I make this point for pedagogical purposes only; I would not recommend this approach in a real application. Also, if you need to use your misc and util packages together, this would not work so well. If at all possible, rearrange your source code structure to better fit the normal Python module loading mechanism.
This is addressed by Implicit Namespace Packages in Python 3.3. See PEP 420.
This is an adaptation of an answer to a similar question.
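A minimal sketch of that approach, assuming Python 3.3+ and that both xxx/__init__.py files have been removed:
import os
import sys

# '~' is not expanded automatically in sys.path entries
for p in ('~/lib/python', '~/graphics/python'):
    sys.path.append(os.path.expanduser(p))

import xxx.util.module1
import xxx.misc.module3   # both resolve: xxx is now a namespace package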
Following up on Gary's answer, the PEP 420 page says to use the following code in the shared packages' __init__.py files.
__init__.py:
from pkgutil import extend_path
__path__ = extend_path(__path__, __name__)
This code should be placed inside each xxx directory's __init__.py. See the starred files below:
someroot/
├── graphics
│   └── python
│       └── xxx
│           ├── ****__init__.py****
│           └── misc
│               ├── __init__.py
│               ├── module3.py
│               └── module4.py
└── lib
    └── python
        └── xxx
            ├── ****__init__.py****
            └── util
                ├── __init__.py
                ├── module1.py
                └── module2.py
A setup.sh file to add them to the Python path:
libPath=someroot/lib/python/
graphicsPath=someroot/graphics/python/
export PYTHONPATH=$PYTHONPATH:$libPath:$graphicsPath
Python test code (tested on Python versions 2.7.14 and 3.6.4 using pyenv):
import xxx.util.module1
import xxx.misc.module3 # No errors
