Use own modules in beeware - python

I have a BeeWare project and also want to use my own modules in it, such as Models and Controllers, plus a module that creates some objects I can test with.
But when I try to import that module to create the test objects and call its method, it just throws an error:
ImportError: attempted relative import beyond top-level package
After some research, I know that the directory structure, i.e. where I put my modules relative to the package, matters. But wherever I put the modules I get the same (or a very similar) error. Oddly enough, I can import my Models and create objects of those classes. I also can't work out where Briefcase's entry point is.
Here is my structure currently:
/Project_Dir (own created)
└── /briefcase_project (created by briefcase)
    └── /src
        ├── /Models (own created)
        ├── /app_directory (created by briefcase; contains __main__.py,
        │   __init__.py (the entry point, I guess) and app.py, where the
        │   BeeWare code is, along with my module import from Test)
        └── /Test (own created; contains a file with a method I want to call)
Sadly there isn't much to find about BeeWare, so I couldn't track down a solution.
Please help. Thanks ^^

I did the following to work around the issue. An example using the BeeWare Tutorial 2 source code is on GitHub
.
├── __init__.py
├── __main__.py
├── app.py
├── mylib <--- # my lib.
│   ├── __init__.py
│   └── testlib.py
└── resources
    ├── __init__.py
    ├── beewarecustomlibexample.icns
    ├── beewarecustomlibexample.ico
    └── beewarecustomlibexample.png

2 directories, 9 files
The mylib/testlib.py:
def test(text: str) -> str:
    return f"Hello: {text}"
In the app.py:
import toga
from toga.style import Pack
from toga.style.pack import COLUMN, ROW
from beewarecustomlibexample.mylib.testlib import test # Import custom lib
class BeewareCustomLibExample(toga.App):
    def startup(self):
        ...

    def say_hello(self, widget):
        # Calling my test method
        result = test(self.name_input.value)
        self.main_window.info_dialog("Test Dialog", result)

def main():
    return BeewareCustomLibExample()
The above is how I got it working. I built it on macOS and it works fine.

Take your project folder name and import from there. So if you're tinkering with the tutorial and you've set up a module folder called myModule in the same directory as your app.py, with a file called file.py containing a class called myClass, you would type:
from helloworld.myModule.file import myClass

Related

Python Import function from another file not working

I'm facing a problem with importing functions from another file.
Here is the simplified tree of my folders, located in /var/www/html/opencaptureforinvoices/:
├── custom
│   └── test
│       └── src
│           └── backend
│               └── process_queue.py
└── src
    └── backend
        └── main.py
I run the process_queue.py script with the following command, using Kuyruk (a library to enqueue processes):
cd /var/www/html/opencaptureforinvoices/custom/test || exit
/usr/local/bin/kuyruk --app src.backend.process_queue_verifier.kuyruk worker --queue
The problem is that I need a function from main.py. I import it like this:
from src.backend.main import create_classes_from_custom_id, check_file, timer, str2bool
Before posting I tried adding the root of custom & src to the path using sys.path.append, sys.path.insert, or os.chdir, but none of them worked; the application tells me:
ModuleNotFoundError: No module named 'src.backend.main'
Here are the commands I tried to move to the root folder:
os.chdir('/var/www/html/opencaptureforinvoices/')
sys.path.append('/var/www/html/opencaptureforinvoices/')
sys.path.insert(0, '/var/www/html/opencaptureforinvoices/')
Any ideas ?
Thanks
Have you set up your subdirectories as packages using the __init__.py file?
https://docs.python.org/3/tutorial/modules.html#packages
The __init__.py files are required to make Python treat directories
containing the file as packages. This prevents directories with a
common name, such as string, unintentionally hiding valid modules that
occur later on the module search path. In the simplest case,
__init__.py can just be an empty file, but it can also execute initialization code for the package or set the __all__ variable,
described later.
Users of the package can import individual modules from the package,
for example:
import sound.effects.echo
First solution:
Make sure the folder also contains an __init__.py; this allows it to be included as a package.
Second:
When importing a file, Python only searches the directory that the entry-point script is running from and sys.path which includes locations such as the package installation directory.
But you can add to the Python path at runtime:
some_file.py
import sys
sys.path.append('/path/to/application/app/folder')
import file
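As a runnable sketch of this second approach (all names here are hypothetical, chosen for illustration): build a small package in a temporary directory, append its parent to sys.path, and the import succeeds.

```python
import os
import sys
import tempfile

# Build a throwaway package on disk: pkgroot/mypkg/{__init__.py, helpers.py}
# ("mypkg" and "helpers" are made-up names for this sketch.)
pkgroot = tempfile.mkdtemp()
pkgdir = os.path.join(pkgroot, "mypkg")
os.makedirs(pkgdir)
with open(os.path.join(pkgdir, "__init__.py"), "w") as f:
    f.write("")  # an empty file is enough to mark the directory as a package
with open(os.path.join(pkgdir, "helpers.py"), "w") as f:
    f.write("def greet(name):\n    return 'hello ' + name\n")

# Without this line, 'mypkg' is not on the module search path at all
# and the import below raises ModuleNotFoundError.
sys.path.append(pkgroot)

from mypkg.helpers import greet
print(greet("world"))  # -> hello world
```
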

How to write nested __init__.py files

I am struggling with nested __init__.py files in a Python package I am writing. The package has the following architecture:
module/
├── __init__.py
├── submodule1
│   ├── __init__.py
│   └── source.py
└── submodule2
    ├── __init__.py
    ├── source.py
    └── subsubmodule2
        ├── __init__.py
        └── source.py
My intent is to be able to access functions defined in submodule2/source.py through module.submodule2.function and in subsubmodules2/source.py through module.submodule2.subsubmodule2.function.
The first thing I tried was to define __init__.py in submodule2 this way:
from .subsubmodule2 import *
But doing so, I get the functions defined in subsubmodules2/source.py through module.submodule2.function (and module.function).
If I do:
from . import subsubmodule2
I get these functions through module.subsubmodule2.function.
I also tried to define the __all__ keyword in __init__.py, with no more success. If I understand the Python documentation correctly, I could leave the __init__.py files empty and it would work, but from my understanding that is not best practice either.
What would be the best way to access these functions as intended in my module?
In the module __init__.py file, import the submodules you want to expose:
from . import submodule1
from . import submodule2
__all__ = ['submodule1', 'submodule2']
Now, in the submodule2 __init__.py file write:
from . import source
from . import subsubmodule2
# if you want to import functions from source, import them here, or
# define __all__ in source.py and add the functions you want to expose
__all__ = ['source', 'subsubmodule2']
Now in the subsubmodule2 __init__.py file, pull in the functions or classes you want to expose:
from .source import *
__all__ = ['source']
# if you use * imports, I suggest defining __all__ in source.py and listing all exposed functions there
The __init__.py file represents its respective package. For example, module/submodule2/__init__.py represents module.submodule2.
In order to pull objects defined in submodules into their package namespace, import them:
# module/submodule2/__init__.py
from .source import *
Since __init__.py is a regular Python module, one can also forgo a separate .source module and define objects directly inside __init__.py:
# module/submodule2/__init__.py
def function():
    ...
Note that subpackages themselves are already available as their respective name. One does not have to – and in fact should not – import them in the parent module. They will be imported if code using the package imports them.
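The "from .source import *" pattern can be verified with a throwaway copy of the question's layout (built here in a temporary directory; the file contents are made up for the demonstration):

```python
import os
import sys
import tempfile

# Recreate module/submodule2/{__init__.py, source.py}, with
# submodule2/__init__.py re-exporting everything from source.py.
root = tempfile.mkdtemp()
sub = os.path.join(root, "module", "submodule2")
os.makedirs(sub)
with open(os.path.join(root, "module", "__init__.py"), "w") as f:
    f.write("")
with open(os.path.join(sub, "source.py"), "w") as f:
    f.write("def function():\n    return 'from submodule2'\n")
with open(os.path.join(sub, "__init__.py"), "w") as f:
    f.write("from .source import *\n")

sys.path.insert(0, root)
import module.submodule2

# The function is reachable at the package level, as the question intended:
print(module.submodule2.function())  # -> from submodule2
```
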

Python imports - why main package name is required?

I would like to know why I need to include the main directory name in import statements, given my project's directory structure.
My project's structure
.
├── main.py
├── myModel
│   ├── __init__.py
│   ├── loader
│   │   ├── __init__.py
│   │   └── dataset_loader.py
│   └── models
│       ├── __init__.py
│       ├── static_model.py
│       └── task_model.py
└── tree.txt
main.py
from myModel import loader
from myModel import models
loader.fun()
loader.dataset_loader.fun()
myModel/__init__.py
import myModel.models
import myModel.loader
myModel/loader/__init__.py
from myModel.loader.dataset_loader import *
myModel/models/__init__.py
from myModel.models.static_model import StaticModel
My first question is why I need to put myModel even in subfolders of the myModel directory. I tried to remove it, but import didn't work so I think it needs to be there.
Secondly, why I can call fun directly from loader and not using the full qualified path?
I read something on the web. But I still have trouble understanding why this happens.
Absolute imports, like import x.y.z or from x.y import z require x to be in your path. In your specific case, myModel is on the path because of your working directory. The sub-packages are not on the path, and can therefore only be accessed by reiterating the root package.
A more intuitive approach might be to use relative paths. This is possible because all your files live in proper packages with __init__ files. Keep in mind that relative paths imply that you have modules that are designed to live in your package structure and not on their own. Otherwise, you may end up causing errors when you try to run some of the modules as standalone scripts.
Change myModel/__init__.py to:
from . import models
from . import loader
The . makes the import relative. Notice that I did not suggest changing main.py, since it lives outside your packages. Adding more dots lets you go up more levels in the file hierarchy.
Change myModel/loader/__init__.py to
from .dataset_loader import *
and myModel/models/__init__.py to
from .static_model import StaticModel
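The relative-import layout suggested here can be checked end to end (built in a temporary directory; the fun body is invented for the demonstration):

```python
import os
import sys
import tempfile

# Recreate the answer's suggestion: "from . import loader" in the package
# __init__.py, "from .dataset_loader import *" in the subpackage __init__.py.
root = tempfile.mkdtemp()
loader_dir = os.path.join(root, "myModel", "loader")
os.makedirs(loader_dir)

files = {
    os.path.join(root, "myModel", "__init__.py"): "from . import loader\n",
    os.path.join(loader_dir, "__init__.py"): "from .dataset_loader import *\n",
    os.path.join(loader_dir, "dataset_loader.py"): "def fun():\n    return 'loaded'\n",
}
for path, body in files.items():
    with open(path, "w") as f:
        f.write(body)

sys.path.insert(0, root)
import myModel

# fun is visible both directly on loader (thanks to the * import) and
# via its defining submodule, matching the calls in main.py:
print(myModel.loader.fun())                 # -> loaded
print(myModel.loader.dataset_loader.fun())  # -> loaded
```
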
An import statement binds a name in your local namespace (usually the module you are executing it in). The name that is bound depends on which form of import you used:
import x binds the module described in x.py or x/__init__.py to the name x
import x.y binds the module described in x.py or x/__init__.py to the name x, and ensures that x has an attribute y, either as an attribute defined in x.py/__init__.py, or as a sub-module in x/y.py.
from x import y binds the attribute or sub-module y from x.py/x/__init__.py or x/y.py to the name y. This option loads, but does not give you access to x.
When you run from myModel import loader, you get a module object loader that has a callable attribute fun.
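These binding rules can be checked directly against the standard library:

```python
import sys

# "import x.y" binds only the top-level name x in the local namespace;
# y is reachable as an attribute of x, not as a standalone name.
import os.path
assert hasattr(os, "path")

# "from x import y" binds y directly. The statement does not bind x here,
# but x is still loaded and cached in sys.modules behind the scenes.
from os import path
assert path is os.path

# Both forms leave the involved modules cached:
assert "os" in sys.modules and "os.path" in sys.modules
```
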

Python calling local function in main code breaks imports in tests?

I have had it up to here with Python's import system... thought I'd finally got something reliable and then the inexplicable happens!
This is the directory structure of my application:
/
- my-application/
  - subpackage/
    - __init__.py
    - my_module.py
  - __init__.py
- tests/
  - subpackage/
    - __init__.py
    - test_my_module.py
  - __init__.py
  - conftest.py
  - run.py
- spark.py
I run all my tests through tests/run.py, which looks like the following (in an attempt to resolve all the import problems):
import os
import pytest
import sys
rootdir = os.path.abspath(os.path.join(os.path.dirname(__file__), ".."))
sys.path.insert(0, os.path.abspath(os.path.join(rootdir, "my-application")))
sys.exit(pytest.main([os.path.join(rootdir, "tests")]))
This worked like an absolute charm, until I made one modification to the file /my-application/subpackage/my_module.py: I added a local function call. So e.g. my_module.py:
def foo():
    pass

def run_my_module():
    def bar():
        foo()  # <---- Added this line
    bar()
    print("Ran")
UPDATE: THIS works fine:
def foo():
    pass

def run_my_module():
    def bar():
        pass
    foo()
    bar()
    print("Ran")
As soon as I added that local function call, the tests stop working, with the error No module named "subpackage".
The test_my_module.py looks like this (basically):
from subpackage.my_module import run_my_module
def basic_test():
    run_my_module()
Note that in test_my_module.py I am using subpackage as the first part of my import statement, because I am using the run.py file that sets my-application as a system path. If I change the import to start with my_application I get the same error referring to my_application.py.
I am still learning python, so suggest any change to my application structure you like. I can't believe the hassle of this import system - I do feel like I'm missing something basic here...
Thank you in advance!
Managing import paths manually is difficult.
A setup.py is the best way to manage Python packages.
By convention, package names should use _ not -.
Create a setup.py with this content next to my_application/
from setuptools import find_packages, setup
setup(
    name='my_application',
    version='0.0.1',
    packages=find_packages(),
)
I recommend an application structure like the following:
$ tree
├── my_application
│   ├── __init__.py
│   ├── spark.py
│   └── subpackage
│       ├── __init__.py
│       └── my_module.py
├── setup.py
└── tests
    ├── conftest.py
    └── subpackage
        ├── __init__.py
        └── test_my_module.py
Install package locally
python setup.py develop
This will install your package into the Python package path via a symlink.
Now in any scripts you can use paths as you'd expect e.g.
from my_application.subpackage.my_module import run_my_module
I also recommend you use a virtualenv.
More on setup.py here
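As a side note: python setup.py develop still works but is deprecated in current setuptools; the modern equivalent is an editable install via pip install -e . from the project root, driven by a minimal pyproject.toml. A sketch matching the setup.py above (same name and version, assuming the setuptools build backend):

```toml
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "my_application"
version = "0.0.1"
```
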

Python accessing modules from package that is distributed over different directories

I have a question regarding one single module that is distributed over multiple directories.
Let's say I have these two file and directories:
~/lib/python
    xxx
        __init__.py
        util
            __init__.py
            module1.py
            module2.py
~/graphics/python
    xxx
        __init__.py
        misc
            __init__.py
            module3.py
            module4.py
So then in my Python modules, I did this:
import sys
pythonlibpath = '~/lib/python'
if pythonlibpath not in sys.path: sys.path.append(pythonlibpath)
import xxx.util.module1
which works.
Now, the problem is that I need xxx.misc.module3, so I did this:
import sys
graphicslibpath = '~/graphics/python'
if graphicslibpath not in sys.path: sys.path.append(graphicslibpath)
import xxx.misc.module3
but I get this error:
ImportError: No module named misc.module3
It seems like it somehow still remembers that there was an xxx package in ~/lib/python and then tries to find misc.module3 from there.
How do I get around this issue?
You can't without an extreme amount of trickery that pulls one package structure into the other. Python requires that all modules in a package be under a single subdirectory. See the os source to learn how it handles os.path.
Python does indeed remember that there was an xxx package. This is pretty much necessary to achieve acceptable performance; once modules and packages are loaded, they are cached. You can see which modules are loaded by looking at the dictionary sys.modules.
sys.modules is a normal dictionary, so you can remove a package from it to force it to be reloaded, like below:
import sys
print(sys.modules)
import xml
print(sys.modules)
del sys.modules['xml']
print(sys.modules)
Notice that after importing the xml package it is in the dictionary, and it is possible to remove it from that dictionary too. This is a point I make for pedagogical purposes only; I would not recommend this approach in a real application. Also, if you need to use your misc and util packages together, this would not work so great. If at all possible, rearrange your source code structure to better fit the normal Python module loading mechanism.
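The caching behaviour is easy to observe (shown here with json rather than xml, purely for illustration):

```python
import sys
import json

cached = sys.modules["json"]     # the first import cached the module object

import json as json_again        # a second import is just a cache lookup
assert json_again is cached      # same object; __init__.py was not re-run

del sys.modules["json"]          # evict the package from the cache
import json as json_fresh        # now the module really is re-imported
assert json_fresh is not cached  # a brand-new module object
```
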
This is addressed by Implicit Namespace Packages in Python 3.3. See PEP-420.
This is an adaptation of an answer to a similar question.
Following up on #Gary's answer, the PEP 420 page says to use the following code on shared __init__.py packages.
__init__.py:
from pkgutil import extend_path
__path__ = extend_path(__path__, __name__)
This code should be placed inside the xxx directory's __init__.py.
See the *s below
someroot/
├── graphics
│   └── python
│       └── xxx
│           ├── ****__init__.py****
│           └── misc
│               ├── __init__.py
│               ├── module3.py
│               └── module4.py
└── lib
    └── python
        └── xxx
            ├── ****__init__.py****
            └── util
                ├── __init__.py
                ├── module1.py
                └── module2.py
A setup.sh file to add both roots to the Python path:
libPath=someroot/lib/python/
graphicsPath=someroot/graphics/python/
export PYTHONPATH=$PYTHONPATH:$libPath:$graphicsPath
Python test code (tested on Python versions 2.7.14 and 3.6.4 using pyenv):
import xxx.util.module1
import xxx.misc.module3 # No errors
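The extend_path recipe can also be verified without creating the directories by hand. The sketch below mirrors the layout above, but builds both roots as temporary directories (the module contents are invented for the test):

```python
import os
import sys
import tempfile

# Boilerplate for both copies of xxx/__init__.py, as recommended above.
INIT = "from pkgutil import extend_path\n__path__ = extend_path(__path__, __name__)\n"

roots = []
for sub in ("util", "misc"):
    root = tempfile.mkdtemp()
    subdir = os.path.join(root, "xxx", sub)
    os.makedirs(subdir)
    with open(os.path.join(root, "xxx", "__init__.py"), "w") as f:
        f.write(INIT)
    with open(os.path.join(subdir, "__init__.py"), "w") as f:
        f.write("")
    with open(os.path.join(subdir, "mod.py"), "w") as f:
        f.write("WHERE = %r\n" % sub)
    roots.append(root)

sys.path[:0] = roots  # same effect as the PYTHONPATH lines in setup.sh

# xxx is found in the first root; extend_path stitches the second root's
# xxx directory into __path__, so both halves of the package import fine:
from xxx.util.mod import WHERE as w1
from xxx.misc.mod import WHERE as w2
print(w1, w2)  # -> util misc
```
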
