Python 3.5 vs Python 2.7: Modules importing submodules

I have been googling this for the past few hours and can't find an equivalent question anywhere. Also, the documentation for 2.7 and 3.5 seems identical, so I don't think this behavior is documented.
Here is my directory structure:
project
-- project.py
-- api
   -- __init__.py
   -- subapi
      -- __init__.py
contents of project/project.py: import api
contents of project/api/__init__.py: import subapi
If I execute python project.py (using Python 2.7) from inside the project folder, it returns without an error. If I do the same with Python 3 (python3 project.py), it crashes with
Traceback (most recent call last):
File "project.py", line 1, in <module>
import api
File "/home/me/Documents/project/api/__init__.py", line 1, in <module>
import subapi
ImportError: No module named 'subapi'
If I rewrite the import statement to use a path relative to the project directory (import api.subapi), then it works with Python 2 as well as Python 3. It's not a satisfying solution though, because it requires me to reference parent modules from within submodules, which rather defeats the idea of modularity.
Does anyone have an idea what I can do to get the Python 2 behavior back? The module search algorithm should prioritize the local directory of the file containing the import statement. It should also prioritize those files over built-in modules, by the way. Try importing a module named 'test'.
--EDIT--
I was asked by Stack Overflow to differentiate my question from another called "How to do relative imports". I think this question is different because I am asking specifically about the differences between two versions. Using relative imports is the solution, not the question.

Use an explicit relative import:
from . import subapi
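With that one-line change in project/api/__init__.py, subapi is resolved inside the api package itself, so the same code runs on Python 2 and Python 3. A minimal sketch of the two files after the fix (the absolute form import api.subapi from the question works as well):
# project/api/__init__.py
from . import subapi      # explicit relative import: resolve subapi inside the api package

# project/project.py
import api                 # running api/__init__.py also binds api.subapi
print(api.subapi)          # prints something like <module 'api.subapi' ...>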

Related

Trouble importing a module that imports a module

I'm having trouble with a Python package that uses separate modules to structure code. The package itself works, but when it is imported from another environment it fails with a ModuleNotFoundError.
Here's the structure:
Project-root
|
|--src/
|    __init__.py
|    module_a.py
|    module_b.py
|    module_c.py
|    module_d.py
|--tests
etc.
In module_a.py I have:
from module_a import function_a1,...
from module_b import function_b1,...
from module_c import function_c1,...
In module_c I import module_d like:
from module_d import function_d1,...
As mentioned above, executing module_a or module_c directly from the CLI works as expected, and the unit tests I've created in the tests directory also pass (with the help of sys.path.insert). However, if I create a new environment and import the package, I get the following error:
>>> import module_a
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/<abs_path>/.venv/lib/python3.9/site-packages/module_a.py", line 22, in <module>
from module_c import function_c1, function_c2
File "/<abs_path>/.venv/lib/python3.9/site-packages/module_c.py", line 9, in <module>
import module_d
ModuleNotFoundError: No module named 'module_d'
>>>
I've exhausted all ideas on how to overcome this, besides combining the code of modules c and d into one file, which I'd hate to do, or rethinking the flow so that all modules are imported from module_a.
Any suggestions on how to approach this would be greatly appreciated.
Update: It turned out to be a typo in the name of module_d in setup.py. For whatever reason, python setup.py install was failing silently, or I wasn't reading the logs carefully.
The problem comes down to understanding the basics of the import system and the PYTHONPATH.
When you try to import a module (import module_a), Python searches, in order, every directory listed in sys.path. If a directory matches the name (module_a)1, it runs that directory's __init__.py file if one exists.
When you get an ImportError (https://docs.python.org/3/library/exceptions.html#ImportError), it means that no directory in sys.path contains a directory (or module) with the requested name.
You said that for your tests you did something like sys.path.insert(0, "some/path/"), but that is not a solution, just a fragile workaround.
What you should do is set your PYTHONPATH environment variable to contain the directory where your modules are located, Project-root/src in your case. That way, there is no need to ever use sys.path.insert or to fiddle with relative/absolute paths in import statements.
When you create your new environment, just set your PYTHONPATH environment variable to include Project-root/src and you are done. This is how installing regular Python modules (libraries) works: they are all put into a directory that Python already searches, site-packages.
1: This has changed since older Python versions; the directory used to be required to contain an __init__.py file.
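For example, launching Python as PYTHONPATH=/path/to/Project-root/src python (the absolute path here is hypothetical and must point at your actual checkout) puts the src directory on sys.path, so the sibling modules resolve without any sys.path.insert calls. A minimal sketch of the resulting session:
import sys
print([p for p in sys.path if p.endswith("src")])   # ['/path/to/Project-root/src']
import module_a   # its "from module_c import ..." works, and module_c's "from module_d import ..." does too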

__init__.py does not find modules in same directory [duplicate]

This question already has an answer here:
ImportError on python 3, worked fine on python 2.7
(1 answer)
Closed 5 years ago.
I need assistance on how to organize source code in a Python package. I've already followed several tutorials on the web (especially this one) on how to do so, but it does not work as explained or as I imagined it.
I want to create a python package named binaryio. It should offer two classes named BinaryReader and BinaryWriter which I want users to be able to import with
from binaryio import BinaryReader
from binaryio import BinaryWriter
Thus I have created my repository and package directory structure as follows:
binaryio (repository root)
-- binaryio (package root)
   -- __init__.py (see below)
   -- binaryreader.py (contains the BinaryReader class)
   -- binarywriter.py (contains the BinaryWriter class)
-- setup.py (contains the setuptools.setup call)
-- .gitignore, README.md, LICENSE, ...
As you can see, the classes are in separate files, as I'm used to that (coming from a C# background). I'm not sure if this is a good idea given that modules are the "unit" in Python, but cramming all classes into one huge file did not seem logical to me either.
__init__.py looks as follows to import those classes, making (as I understood it) the from binaryio import BinaryReader imports possible for users later on:
from binaryreader import BinaryReader
from binarywriter import BinaryWriter
However, when I install the package locally (which seems to work fine) and try to import binaryio, I get the following error:
>>> import binaryio
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "D:\Projects\Git\binaryio\binaryio\__init__.py", line 1, in <module>
from binaryreader import BinaryReader
ModuleNotFoundError: No module named 'binaryreader'
Apparently, something is wrong with my __init__.py file. I don't understand this, as a binaryreader.py file (i.e. module) exists in the same folder, as you can see above. Funnily enough, my IDE (PyCharm, with the package root set as source folder) does not complain about the statements in it and can resolve all references.
What am I doing wrong here? According to the tutorial linked above, putting a class named Abc into a file xyz.py and then writing from xyz import Abc in __init__.py should work, but apparently it doesn't for me.
Your code would work on Python 2.x but not on 3.x, because of the different relative import behavior: without a leading dot, Python 2.x looks for modules in the current package as well as on the top-level search path, while Python 3.x looks only on the top-level search path.
Import statements you want to use are these:
from binaryio.binaryreader import BinaryReader
from binaryio.binarywriter import BinaryWriter
This works in both Python 2.x and 3.x without any __future__ imports.
I think you need to add a dot in your import statements:
from .binaryreader import BinaryReader
from .binarywriter import BinaryWriter
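With the leading dots in place (or the fully qualified form from the first answer), reinstalling and importing the package should no longer raise. A minimal sketch of the expected session, mirroring the failing one above:
>>> import binaryio
>>> binaryio.BinaryReader
<class 'binaryio.binaryreader.BinaryReader'>
>>> from binaryio import BinaryWriter   # the import style the author wanted now works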

Python Module Imports - Explicit vs Implicit Relative Imports

Last night, while working on my Mac, I set up some module imports in my __init__.py files:
from MongoProvider import MongoProvider
from Settings import Settings
etc. I'm unsure of what version of Python is on that machine. I'll edit the question later with that info once I have it.
Today, working on a different machine, which runs Windows and Python 3.3.3, my module imports were breaking. By adding an explicit relative import (a leading dot), I was able to fix the issue.
from .MongoProvider import MongoProvider
from .Settings import Settings
The traceback I'm receiving is:
Traceback (most recent call last):
  File "app.py", line 5, in <module>
    from modules.route_handlers import Route_Handlers
  File "C:\Users\willb\bearded-dubstep\modules\route_handlers\Route_Handlers.py", line 9, in <module>
    from modules.backend_providers import Settings
  File "C:\Users\willb\bearded-dubstep\modules\backend_providers\__init__.py", line 1, in <module>
    from MongoProvider import MongoProvider
ImportError: No module named 'MongoProvider'
My project layout is:
root
|_ modules
   |_ api_helpers
      | __init__.py
      | InvalidUsage.py
      | response_utils.py
   |_ backend_providers
      | __init__.py
      | MongoProvider.py
      | Settings.py
   |_ route_handlers
      | __init__.py
      | Route_Handlers
|_ app.py
Any ideas what would cause this? Is there a configuration file I should be looking at?
Well, according to the PEP 8 section on imports:
Implicit relative imports should never be used and have been removed in Python 3.
As Python 3.3 is the one causing you trouble by making you import relative modules explicitly, I assume this explains the situation. On the Mac you probably have Python 2.x, which is why it works there.
Looking at your project layout, Settings.py and MongoProvider are indeed modules being imported relatively. This means the removal of implicit relative imports in Python 3 is what is causing you trouble, because the import system is unable to find the module:
ImportError: No module named 'MongoProvider'
It seems that the Python version on your Mac is 2.x while the one on your Windows machine is 3.x.
I have run into the same problem before with the tkinter module (which is Tkinter in Python 2.x).
It seems that we need to use from XXX.xxx import xxx to import.
I don't know why; maybe it's a deliberate design change in Python.
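Applied to the layout above, that means either qualifying the imports in modules/backend_providers/__init__.py with the full package path or using the leading-dot form the question already found. A minimal sketch, assuming app.py is run from the project root so that modules is importable:
# modules/backend_providers/__init__.py
# absolute form (works when the project root is on sys.path):
from modules.backend_providers.MongoProvider import MongoProvider
from modules.backend_providers.Settings import Settings

# or the explicit relative form already shown in the question:
# from .MongoProvider import MongoProvider
# from .Settings import Settings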

Can modules with a common package hierarchy be mentioned multiple times in my PYTHONPATH?

I have two separate projects that share a package name. They run OK as long as they are not both on the PYTHONPATH, but as soon as both appear, one of them cannot find imports from its own project.
Example, two projects like this:
Project 1:
x/
   __init__.py
   test.py
   foo.py
test.py contains the line:
import x.foo
Project 2:
x/
   __init__.py
   bar.py
If I run
PYTHONPATH=. python x/test.py
there is no error. But if I run
PYTHONPATH='pathtoproject2:.' python x/test.py
I get the error:
Traceback (most recent call last):
File "x/test.py", line 1, in <module>
import x.foo
ImportError: No module named foo
Is there a way to have different Python projects with a common package share the PYTHONPATH? Or will Python always use only the first path where a package is found?
Note: I know that if you modify the import from import x.foo to import foo then it will work. But I want to know if it is possible to do this without modifying either package.
Although not supported natively by the import mechanism, there is a workaround to create namespace packages in Python: just put the following code in both __init__.py files.
try:
    import pkg_resources
    pkg_resources.declare_namespace(__name__)
except ImportError:
    import pkgutil
    __path__ = pkgutil.extend_path(__path__, __name__)
pkg_resources is provided by the setuptools package and has the advantage that it also handles packages contained within egg zip files.
pkgutil is part of Python's standard library, so we fall back on it to handle the namespace extension if setuptools is not installed on the system.
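With that snippet in both __init__.py files and both project directories on the path (as in the question's second command), imports from either project should resolve. A minimal sketch of the expected behavior:
# run as: PYTHONPATH='pathtoproject2:.' python
import x.foo        # found in project 1
import x.bar        # found in project 2; x now spans both directories
print(x.__path__)   # lists both x/ directories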
More information about Python namespace packages can be found here:
http://packages.python.org/distribute/setuptools.html#namespace-packages
http://www.python.org/dev/peps/pep-0382/
Currently, Python does not support packages spread across different directories. A package is a unit, not just a namespace. This is different from Java "packages" or the more appropriately named "namespaces" in .NET.
When importing a package, Python will scan sys.path, sequentially, and use the first match. If there is another module or package with a matching name in a directory that appears later in the path, it won't be found.
Your "note" is not true, by the way. When you use import foo, Python will try a relative import within the directory of test.py, find no match, then attempt an absolute import of module foo, which does not exist either, and then raise an ImportError.
Instead of using package names to group modules under a common prefix, think of packages as smallish, self-contained libraries. In Python, flat is better than nested, and it is preferable to have multiple top-level packages, each fulfilling one distinct purpose, rather than one large monolithic package. Instead of org.example.foo and org.example.bar, just use foo and bar.
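As noted above, Python keeps the first match it finds on sys.path, so a quick way to see which directory actually won is to inspect the imported package. A minimal sketch (the result depends on your PYTHONPATH order):
import x
print(x.__path__)   # the single directory that 'x' was resolved to
import x.foo        # succeeds only if that directory is the one from project 1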
