Running unittest with modules that must import other modules - python

Our Python 3.10 unit tests break when the modules under test need to import other modules. With the packaging techniques recommended by other posts and articles, either the unit tests fail to import modules or the direct calls to run the app do. None of the sources we have read show how to verify that both the application itself and the unit tests can import modules when each is invoked separately, so we created the bare-bones example below and are asking how to structure the packaging correctly.
What specific changes must be made to the code below so that the two Python commands given below run successfully against the bare-bones example app?
PROBLEM DESCRIPTION:
A python 3.10 app must import modules when called either directly as an app or indirectly through unit tests.
Packages must be used to organize the code.
Calls to unit tests are breaking because modules cannot be found.
The two test commands that must run without errors to validate solution of this problem are:
C:\path\to\dir>python repoName\app\first.py
C:\path\to\dir>python -m unittest repoName.unitTests.test_example
This post is different from the other posts on this topic. We have reviewed many articles and posts, but the other sources do not address our use case, so we have created a more explicit example below to test the two commands that must succeed in order to meet its needs.
APP STRUCTURE:
The very simple structure of the app that is failing to import packages during unit tests is:
repoName
    app
        __init__.py
        first.py
        second.py
        third.py
    unitTests
        __init__.py
        test_example.py
    __init__.py
SIMPLE CODE TO REPRODUCE PROBLEM:
The code for a stripped down example to reproduce the problem is as follows:
The contents of repoName\app\__init__.py are:
print('inside app __init__.py')
__all__ = ['first', 'second', 'third']
The contents of first.py are:
import second as second
from third import third
import sys

inputArgs = sys.argv

def runCommands():
    trd = third()
    if second.something == 'platform':
        if second.another == 'on':
            trd.doThree()
    if second.something != 'unittest':
        sys.exit(0)

second.processInputArgs(inputArgs)
runCommands()
The contents of second.py are:
something = ''
another = ''
inputVars = {}

def processInputArgs(inputArgs):
    global something
    global another
    global inputVars
    if ('unittest' in inputArgs[0]):
        something = 'unittest'
    elif ('unittest' not in inputArgs[0]):
        something = 'platform'
        another = 'on'
    jonesy = 'go'
    inputVars = { 'jonesy': jonesy }
The contents of third.py are:
print('inside third.py')
import second as second

class third:
    def __init__(self):
        pass

    ##public
    def doThree(self):
        print("jonesy is: ", second.inputVars.get('jonesy'))
The contents of repoName\unitTests\__init__.py are:
print('inside unit-tests __init__.py')
__all__ = ['test_example']
The contents of test_example.py are:
import unittest

class test_third(unittest.TestCase):
    def test_doThree(self):
        from repoName.app.third import third
        num3 = third()
        num3.doThree()
        self.assertTrue(True)

if __name__ == '__main__':
    unittest.main()
The contents of repoName\__init__.py are:
print('inside repoName __init__.py')
__all__ = ['app', 'unitTests']
ERROR RESULTING FROM RUNNING COMMANDS:
The command line response to the two commands are given below. You can see that the call to the app succeeds, while the call to the unit test fails.
C:\path\to\dir>python repoName\app\first.py
inside third.py
jonesy is: go
C:\path\to\dir>python -m unittest repoName.unitTests.test_example
inside repoName __init__.py
inside unit-tests __init__.py
inside app __init__.py
inside third.py
E
======================================================================
ERROR: test_doThree (repoName.unitTests.test_example.test_third)
----------------------------------------------------------------------
Traceback (most recent call last):
File "C:\path\to\dir\repoName\unitTests\test_example.py", line 15, in test_doThree
from repoName.app.third import third
File "C:\path\to\dir\repoName\app\third.py", line 3, in <module>
import second as second
ModuleNotFoundError: No module named 'second'
----------------------------------------------------------------------
Ran 1 test in 0.002s
FAILED (errors=1)
What specific changes must be made to the code above in order for all the modules to be imported correctly when either of the given commands are run?

Creating an "alias" for modules
Update the contents of repoName\app\__init__.py to:
print('inside app __init__.py')
__all__ = ['first', 'second', 'third']
import sys
import repoName.app.second as second
sys.modules['second'] = second
import repoName.app.third as third
sys.modules['third'] = third
import repoName.app.first as first
sys.modules['first'] = first
How to ensure first.py gets run even when imported
So when the test fixture imports repoName.app.third, Python will recursively import the parent packages so that:
import repoName.app.third

# ...is equivalent to:
import repoName             # runs repoName\__init__.py
import repoName.app         # runs repoName\app\__init__.py
import repoName.app.third   # runs third.py
So running from repoName.app.third import third inside test_doThree executes repoName\app\__init__.py. In that __init__.py, import repoName.app.first as first is called. Importing first executes the following lines at the bottom of first.py:
second.processInputArgs(inputArgs)
runCommands()
In second.processInputArgs, jonesy = 'go' is executed, setting the variable that is printed out when the rest of the test is run.
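As a quick sanity check of the aliasing approach (this is just a sketch, not part of the answer above), you could drop something like the following into test_example.py and run it with the same python -m unittest repoName.unitTests.test_example command; it verifies that the names registered in sys.modules by repoName\app\__init__.py really point at the package's own modules:
import sys

from repoName.app import second   # triggers repoName\app\__init__.py if it has not run yet
import second as bare_second      # resolves via the sys.modules['second'] alias

assert bare_second is sys.modules['repoName.app.second']
assert sys.modules['third'] is sys.modules['repoName.app.third']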

Here is how I have gone about trying to solve this.
I exported the PYTHONPATH to the repo folder repoName (I am using linux)
cd repoName
export PYTHONPATH=`pwd`
then in test_example.py
import unittest

class test_third(unittest.TestCase):
    def test_doThree(self):
        from app.third import third  # changed here
        num3 = third()
        num3.doThree()
        self.assertTrue(True)

if __name__ == '__main__':
    unittest.main()
Then in third.py
print('inside third.py')
import app.second as second  # changed here

class third:
    def __init__(self):
        pass

    ##public
    def doThree(self):
        print("jonesy is: ", second.inputVars.get('jonesy'))
Also it is worth noting that I did not create any __init__.py files
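If you would rather not touch the environment at all, roughly the same effect can be had in code. This is only a sketch, assuming the layout above (repoName/unitTests/test_example.py with app/ next to unitTests/); put it at the very top of test_example.py, before the app imports:
import sys
from pathlib import Path

# repoName/ is the parent of the unitTests/ folder containing this file
REPO_ROOT = Path(__file__).resolve().parents[1]
sys.path.insert(0, str(REPO_ROOT))   # makes "from app.third import third" resolvable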

The code in the question relies on first.py being imported so it calls a function in second.py to set a global that is used by third.py. As the Zen Of Python says:
Explicit is better than implicit
The current structure will be difficult to maintain, test, and debug as your project grows. I have redone the example in the question removing globals and code being executed on import.
first.py
import sys

from app import second
from app.third import Third

def run_commands(input_args):
    trd = Third()
    if input_args.another == "on":
        trd.do_three(input_args)

def main():
    input_args = second.process_input_args(sys.argv)
    run_commands(input_args)

if __name__ == "__main__":
    main()
second.py
from dataclasses import dataclass

@dataclass
class InputArgs:
    something: str
    another: str
    jonesy: str

def process_input_args(input_args):
    something = "platform"
    another = "on"
    jonesy = "go"
    return InputArgs(something, another, jonesy)
third.py
import sys

print("inside third.py")

class Third:
    def __init__(self):
        pass

    # #public
    def do_three(self, input_args):
        print("jonesy is: ", input_args.jonesy)
test_example.py
import io
import unittest
from unittest import mock

from app.second import InputArgs
from app.third import Third

class ThirdTests(unittest.TestCase):
    def test_doThree(self):
        input_args = InputArgs(something="platform",
                               another="on",
                               jonesy="go")
        num3 = Third()
        with unittest.mock.patch('sys.stdout', new=io.StringIO()) as fake_out:
            num3.do_three(input_args)
            self.assertEqual("jonesy is: go\n", fake_out.getvalue())

if __name__ == "__main__":
    unittest.main()
For Python development I would always recommend having a Python Virtual Environment (venv) so that each repo's development is isolated.
In the repoName directory do (for Linux):
python3.10 -m venv venv
Or like the following for windows:
c:\>c:\Python310\python -m venv venv
You will then need to activate the venv.
Linux: . venv/bin/activate
Windows: .\venv\scripts\activate.ps1
I would suggest packaging the app as your module then all your imports will be of the style:
from app.third import third
trd = third()
or
from app import third
trd = third.third()
To package app create a setup.py file in the repoName directory. The file will look something like this:
from setuptools import setup
setup(
name='My App',
version='1.0.0',
url='https://github.com/mypackage.git',
author='Author Name',
author_email='author#gmail.com',
description='Description of my package',
packages=['app'],
install_requires=[],
entry_points={
'console_scripts': ['my-app=app.first:main'],
},
)
I would also rename the unitTests directory to something like tests so that the unittest module can find it automatically as it looks for files and directories starting with test.
So a structure something like this:
repoName/
├── app
│   ├── __init__.py
│   ├── first.py
│   ├── second.py
│   └── third.py
├── setup.py
├── tests
│   ├── __init__.py
│   └── test_example.py
└── venv
You can now do pip install to install from a local src tree in development mode. The great thing about this is that you don't have to mess with the python path or sys.path.
(venv) repoName $ pip install -e .
Obtaining file:///home/user/projects/repoName
Preparing metadata (setup.py) ... done
Installing collected packages: My-App
Running setup.py develop for My-App
Successfully installed My-App-1.0.0
With the install done the app can be launched:
(venv) repoName $ python app/first.py
inside app __init__.py
inside third.py
jonesy is: go
In the setup file we told python that my-app was an entry point so we can use this to launch the same thing:
(venv) repoName $ my-app
inside app __init__.py
inside third.py
jonesy is: go
For the unittests we can use the following command and it will discover all the tests because we have used test to start directory and file names.
(venv) repoName $ python -m unittest
inside app __init__.py
inside unit-tests __init__.py
inside third.py
jonesy is: go
.
----------------------------------------------------------------------
Ran 1 test in 0.000s
OK
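If you ever want to drive the same discovery from a small Python script instead of the command line, unittest exposes it directly. This is just a sketch using the layout above (a tests/ folder holding test_example.py) and an invented helper file name:
# run_tests.py (hypothetical helper, placed in the repoName directory)
import unittest

if __name__ == "__main__":
    # same defaults "python -m unittest" uses: files matching test*.py
    suite = unittest.defaultTestLoader.discover(start_dir="tests", pattern="test*.py")
    unittest.TextTestRunner(verbosity=2).run(suite)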
Now we have this setup it is easy to package up app for distribution. Either directly to users or via a Package Index like https://pypi.org/
Install the build module:
(venv) repoName $ pip install --upgrade build
To build the Python wheel:
(venv) repoName $ python -m build
There should now be a dist folder with a wheel in it which you can send to users. They can install this with pip:
pip install My_App-1.0.0-py3-none-any.whl

Related

Getting desperate with Python file importing [duplicate]

I want to import a function from another file in the same directory.
Usually, one of the following works:
from .mymodule import myfunction
from mymodule import myfunction
...but the other one gives me one of these errors:
ImportError: attempted relative import with no known parent package
ModuleNotFoundError: No module named 'mymodule'
SystemError: Parent module '' not loaded, cannot perform relative import
Why is this?
unfortunately, this module needs to be inside the package, and it also
needs to be runnable as a script, sometimes. Any idea how I could
achieve that?
It's quite common to have a layout like this...
main.py
mypackage/
    __init__.py
    mymodule.py
    myothermodule.py
...with a mymodule.py like this...
#!/usr/bin/env python3

# Exported function
def as_int(a):
    return int(a)

# Test function for module
def _test():
    assert as_int('1') == 1

if __name__ == '__main__':
    _test()
...a myothermodule.py like this...
#!/usr/bin/env python3
from .mymodule import as_int

# Exported function
def add(a, b):
    return as_int(a) + as_int(b)

# Test function for module
def _test():
    assert add('1', '1') == 2

if __name__ == '__main__':
    _test()
...and a main.py like this...
#!/usr/bin/env python3
from mypackage.myothermodule import add

def main():
    print(add('1', '1'))

if __name__ == '__main__':
    main()
...which works fine when you run main.py or mypackage/mymodule.py, but fails with mypackage/myothermodule.py, due to the relative import...
from .mymodule import as_int
The way you're supposed to run it is...
python3 -m mypackage.myothermodule
...but it's somewhat verbose, and doesn't mix well with a shebang line like #!/usr/bin/env python3.
The simplest fix for this case, assuming the name mymodule is globally unique, would be to avoid using relative imports, and just use...
from mymodule import as_int
...although, if it's not unique, or your package structure is more complex, you'll need to include the directory containing your package directory in PYTHONPATH, and do it like this...
from mypackage.mymodule import as_int
...or if you want it to work "out of the box", you can frob the PYTHONPATH in code first with this...
import sys
import os
SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))
sys.path.append(os.path.dirname(SCRIPT_DIR))
from mypackage.mymodule import as_int
It's kind of a pain, but there's a clue as to why in an email written by a certain Guido van Rossum...
I'm -1 on this and on any other proposed twiddlings of the __main__
machinery. The only use case seems to be running scripts that happen
to be living inside a module's directory, which I've always seen as an
antipattern. To make me change my mind you'd have to convince me that
it isn't.
Whether running scripts inside a package is an antipattern or not is subjective, but personally I find it really useful in a package I have which contains some custom wxPython widgets, so I can run the script for any of the source files to display a wx.Frame containing only that widget for testing purposes.
Explanation
From PEP 328
Relative imports use a module's __name__ attribute to determine that
module's position in the package hierarchy. If the module's name does
not contain any package information (e.g. it is set to '__main__')
then relative imports are resolved as if the module were a top level
module, regardless of where the module is actually located on the file
system.
At some point PEP 338 conflicted with PEP 328:
... relative imports rely on __name__ to determine the current
module's position in the package hierarchy. In a main module, the
value of __name__ is always '__main__', so explicit relative imports
will always fail (as they only work for a module inside a package)
and to address the issue, PEP 366 introduced the top level variable __package__:
By adding a new module level attribute, this PEP allows relative
imports to work automatically if the module is executed using the -m
switch. A small amount of boilerplate in the module itself will allow
the relative imports to work when the file is executed by name. [...] When it [the attribute] is present, relative imports will be based on this attribute
rather than the module __name__ attribute. [...] When the main module is specified by its filename, then the __package__ attribute will be set to None. [...] When the import system encounters an explicit relative import in a
module without __package__ set (or with it set to None), it will
calculate and store the correct value (__name__.rpartition('.')[0]
for normal modules and __name__ for package initialisation modules)
(emphasis mine)
If the __name__ is '__main__', __name__.rpartition('.')[0] returns empty string. This is why there's empty string literal in the error description:
SystemError: Parent module '' not loaded, cannot perform relative import
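You can see that behaviour directly in a REPL (a quick check, not quoted from the PEPs themselves):
>>> '__main__'.rpartition('.')[0]            # no dot, so the parent-package part is empty
''
>>> 'package.standalone'.rpartition('.')[0]  # a real submodule keeps its parent package
'package'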
The relevant part of CPython's PyImport_ImportModuleLevelObject function:
if (PyDict_GetItem(interp->modules, package) == NULL) {
    PyErr_Format(PyExc_SystemError,
                 "Parent module %R not loaded, cannot perform relative "
                 "import", package);
    goto error;
}
CPython raises this exception if it was unable to find package (the name of the package) in interp->modules (accessible as sys.modules). Since sys.modules is "a dictionary that maps module names to modules which have already been loaded", it's now clear that the parent module must be explicitly absolute-imported before performing relative import.
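To see that rule in action with the package layout from Solution #1 below (just an illustrative sketch):
import importlib
import sys

importlib.import_module('package')   # absolute import of the parent package first
assert 'package' in sys.modules      # only now can "from . import module" inside package.* be resolved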
Note: The patch from the issue 18018 has added another if block, which will be executed before the code above:
if (PyUnicode_CompareWithASCIIString(package, "") == 0) {
    PyErr_SetString(PyExc_ImportError,
                    "attempted relative import with no known parent package");
    goto error;
} /* else if (PyDict_GetItem(interp->modules, package) == NULL) {
       ...
   */
If package (same as above) is empty string, the error message will be
ImportError: attempted relative import with no known parent package
However, you will only see this in Python 3.6 or newer.
Solution #1: Run your script using -m
Consider a directory (which is a Python package):
.
├── package
│   ├── __init__.py
│   ├── module.py
│   └── standalone.py
All of the files in package begin with the same 2 lines of code:
from pathlib import Path
print('Running' if __name__ == '__main__' else 'Importing', Path(__file__).resolve())
I'm including these two lines only to make the order of operations obvious. We can ignore them completely, since they don't affect the execution.
__init__.py and module.py contain only those two lines (i.e., they are effectively empty).
standalone.py additionally attempts to import module.py via relative import:
from . import module # explicit relative import
We're well aware that /path/to/python/interpreter package/standalone.py will fail. However, we can run the module with the -m command line option that will "search sys.path for the named module and execute its contents as the __main__ module":
vaultah@base:~$ python3 -i -m package.standalone
Importing /home/vaultah/package/__init__.py
Running /home/vaultah/package/standalone.py
Importing /home/vaultah/package/module.py
>>> __file__
'/home/vaultah/package/standalone.py'
>>> __package__
'package'
>>> # The __package__ has been correctly set and module.py has been imported.
... # What's inside sys.modules?
... import sys
>>> sys.modules['__main__']
<module 'package.standalone' from '/home/vaultah/package/standalone.py'>
>>> sys.modules['package.module']
<module 'package.module' from '/home/vaultah/package/module.py'>
>>> sys.modules['package']
<module 'package' from '/home/vaultah/package/__init__.py'>
-m does all the importing stuff for you and automatically sets __package__, but you can do that yourself in the
Solution #2: Set __package__ manually
Please treat it as a proof of concept rather than an actual solution. It isn't well-suited for use in real-world code.
PEP 366 has a workaround to this problem, however, it's incomplete, because setting __package__ alone is not enough. You're going to need to import at least N preceding packages in the module hierarchy, where N is the number of parent directories (relative to the directory of the script) that will be searched for the module being imported.
Thus,
1. Add the parent directory of the Nth predecessor of the current module to sys.path
2. Remove the current file's directory from sys.path
3. Import the parent module of the current module using its fully-qualified name
4. Set __package__ to the fully-qualified name from the previous step
5. Perform the relative import
I'll borrow files from the Solution #1 and add some more subpackages:
package
├── __init__.py
├── module.py
└── subpackage
    ├── __init__.py
    └── subsubpackage
        ├── __init__.py
        └── standalone.py
This time standalone.py will import module.py from the package package using the following relative import
from ... import module # N = 3
We'll need to precede that line with the boilerplate code, to make it work.
import sys
from pathlib import Path

if __name__ == '__main__' and __package__ is None:
    file = Path(__file__).resolve()
    parent, top = file.parent, file.parents[3]

    sys.path.append(str(top))
    try:
        sys.path.remove(str(parent))
    except ValueError:  # Already removed
        pass

    import package.subpackage.subsubpackage
    __package__ = 'package.subpackage.subsubpackage'

from ... import module  # N = 3
It allows us to execute standalone.py by filename:
vaultah@base:~$ python3 package/subpackage/subsubpackage/standalone.py
Running /home/vaultah/package/subpackage/subsubpackage/standalone.py
Importing /home/vaultah/package/__init__.py
Importing /home/vaultah/package/subpackage/__init__.py
Importing /home/vaultah/package/subpackage/subsubpackage/__init__.py
Importing /home/vaultah/package/module.py
A more general solution wrapped in a function can be found here. Example usage:
if __name__ == '__main__' and __package__ is None:
    import_parents(level=3)  # N = 3

from ... import module
from ...module.submodule import thing
Solution #3: Use absolute imports and setuptools
The steps are -
Replace explicit relative imports with equivalent absolute imports
Install package to make it importable
For instance, the directory structure may be as follows
.
├── project
│   ├── package
│   │   ├── __init__.py
│   │   ├── module.py
│   │   └── standalone.py
│   └── setup.py
where setup.py is
from setuptools import setup, find_packages

setup(
    name = 'your_package_name',
    packages = find_packages(),
)
The rest of the files were borrowed from the Solution #1.
Installation will allow you to import the package regardless of your working directory (assuming there'll be no naming issues).
We can modify standalone.py to use this advantage (step 1):
from package import module # absolute import
Change your working directory to project and run /path/to/python/interpreter setup.py install --user (--user installs the package in your site-packages directory) (step 2):
vaultah@base:~$ cd project
vaultah@base:~/project$ python3 setup.py install --user
Let's verify that it's now possible to run standalone.py as a script:
vaultah@base:~/project$ python3 -i package/standalone.py
Running /home/vaultah/project/package/standalone.py
Importing /home/vaultah/.local/lib/python3.6/site-packages/your_package_name-0.0.0-py3.6.egg/package/__init__.py
Importing /home/vaultah/.local/lib/python3.6/site-packages/your_package_name-0.0.0-py3.6.egg/package/module.py
>>> module
<module 'package.module' from '/home/vaultah/.local/lib/python3.6/site-packages/your_package_name-0.0.0-py3.6.egg/package/module.py'>
>>> import sys
>>> sys.modules['package']
<module 'package' from '/home/vaultah/.local/lib/python3.6/site-packages/your_package_name-0.0.0-py3.6.egg/package/__init__.py'>
>>> sys.modules['package.module']
<module 'package.module' from '/home/vaultah/.local/lib/python3.6/site-packages/your_package_name-0.0.0-py3.6.egg/package/module.py'>
Note: If you decide to go down this route, you'd be better off using virtual environments to install packages in isolation.
Solution #4: Use absolute imports and some boilerplate code
Frankly, the installation is not necessary - you could add some boilerplate code to your script to make absolute imports work.
I'm going to borrow files from Solution #1 and change standalone.py:
Add the parent directory of package to sys.path before attempting to import anything from package using absolute imports:
import sys
from pathlib import Path  # if you haven't already done so

file = Path(__file__).resolve()
parent, root = file.parent, file.parents[1]
sys.path.append(str(root))

# Additionally remove the current file's directory from sys.path
try:
    sys.path.remove(str(parent))
except ValueError:  # Already removed
    pass
Replace the relative import by the absolute import:
from package import module # absolute import
standalone.py runs without problems:
vaultah@base:~$ python3 -i package/standalone.py
Running /home/vaultah/package/standalone.py
Importing /home/vaultah/package/__init__.py
Importing /home/vaultah/package/module.py
>>> module
<module 'package.module' from '/home/vaultah/package/module.py'>
>>> import sys
>>> sys.modules['package']
<module 'package' from '/home/vaultah/package/__init__.py'>
>>> sys.modules['package.module']
<module 'package.module' from '/home/vaultah/package/module.py'>
I feel that I should warn you: try not to do this, especially if your project has a complex structure.
As a side note, PEP 8 recommends the use of absolute imports, but states that in some scenarios explicit relative imports are acceptable:
Absolute imports are recommended, as they are usually more readable
and tend to be better behaved (or at least give better error
messages). [...] However, explicit relative imports are an acceptable
alternative to absolute imports, especially when dealing with complex
package layouts where using absolute imports would be unnecessarily
verbose.
Put this inside your package's __init__.py file:
# For relative imports to work in Python 3.6
import os, sys; sys.path.append(os.path.dirname(os.path.realpath(__file__)))
Assuming your package is like this:
├── project
│ ├── package
│ │ ├── __init__.py
│ │ ├── module1.py
│ │ └── module2.py
│ └── setup.py
Now use regular imports in your package, like:
# in module2.py
from module1 import class1
This works in both python 2 and 3.
I ran into this issue. A hack workaround is to import via an if/else block as follows:
#!/usr/bin/env python3
#myothermodule
if __name__ == '__main__':
    from mymodule import as_int
else:
    from .mymodule import as_int

# Exported function
def add(a, b):
    return as_int(a) + as_int(b)

# Test function for module
def _test():
    assert add('1', '1') == 2

if __name__ == '__main__':
    _test()
SystemError: Parent module '' not loaded, cannot perform relative import
This means you are running a module inside the package as a script. Mixing scripts inside packages is tricky and should be avoided if at all possible. Use a wrapper script that imports the package and runs your scripty function instead.
If your top-level directory is called foo, which is on your PYTHONPATH module search path, and you have a package bar there (it is a directory you'd expect an __init__.py file in), scripts should not be placed inside bar, but should live on in foo at best.
Note that scripts differ from modules here in that they are used as a filename argument to the python command, either by using python <filename> or via a #! (shebang) line. It is loaded directly as the __main__ module (this is why if __name__ == "__main__": works in scripts), and there is no package context to build on for relative imports.
Your options
If you can, package your project with setuptools (or poetry or flit, which can help simplify packaging), and create console script entrypoints; installing your project with pip then creates scripts that know how to import your package properly. You can install your package locally with pip install -e ., so it can still be edited in-place.
Otherwise, never, ever, use python path/to/packagename/file.py, always use python path/to/script.py and script.py can use from packagename import ....
As a fallback, you could use the -m command-line switch to tell Python to import a module and use that as the __main__ file instead. This does not work with a shebang line, as there is no script file any more, however.
If you use python -m foo.bar and foo/bar.py is found in a sys.path directory, that is then imported and executed as __main__ with the right package context. If bar is also a package, inside foo/, it must have a __main__.py file (so foo/bar/__main__.py as the path from the sys.path directory).
In extreme circumstances, add the metadata Python uses to resolve relative imports by setting __package__ directly; the file foo/bar/spam.py, importable as foo.bar.spam, is given the global __package__ = "foo.bar". It is just another global, like __file__ and __name__, set by Python when imported.
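For completeness, here is a minimal sketch of what such a foo/bar/__main__.py could look like (the names foo, bar and spam follow the prose above; the file contents are an assumption, not quoted from anywhere):
# foo/bar/__main__.py -- executed by "python -m foo.bar"
from . import spam   # relative import works: __package__ is "foo.bar" here

if __name__ == "__main__":
    print("running", __name__, "in package", __package__)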
On sys.path
The above all requires that your package can be imported, which means it needs to be found in one of the directories (or zipfiles) listed in sys.path. There are several options here too:
The directory where path/to/script.py was found (so path/to) is automatically added to sys.path. Executing python path/to/foo.py adds path/to to sys.path.
If you packaged your project (with setuptools, poetry, flit or another Python packaging tool), and installed it, the package has been added to the right place already.
As a last resort, add the right directory to sys.path yourself. If the package can be located relatively to the script file, use the __file__ variable in the script global namespace (e.g. using the pathlib.Path object, HERE = Path(__file__).resolve().parent is a reference to the directory the file lives in, as absolute path).
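A minimal sketch of that last-resort option, placed at the top of path/to/script.py (packagename is the placeholder name used above; which directory you append depends on where the package actually lives):
import sys
from pathlib import Path

HERE = Path(__file__).resolve().parent   # absolute path of the directory this script lives in
sys.path.insert(0, str(HERE.parent))     # assuming packagename/ sits in the parent directory

from packagename import something        # "something" is a hypothetical module in that package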
For PyCharm users:
I also was getting ImportError: attempted relative import with no known parent package because I was adding the . notation to silence a PyCharm parsing error. PyCharm inaccurately reports not being able to find:
from lib.thing import function
If you change it to:
from .lib.thing import function
it silences the error, but then you get the aforementioned ImportError: attempted relative import with no known parent package. Just ignore PyCharm's parser. It's wrong, and the code runs fine despite what it says.
To obviate this problem, I devised a solution with the repackage package, which has worked for me for some time. It adds the upper directory to the lib path:
import repackage
repackage.up()
from mypackage.mymodule import myfunction
Repackage can make relative imports that work in a wide range of cases, using an intelligent strategy (inspecting the call stack).
TL;DR: the same as @Aya's answer, updated with the pathlib library, and working for Jupyter notebooks where __file__ is not defined:
You want to import my_function defined under ../my_Folder_where_the_package_lives/my_package.py
with respect to where you are writing the code.
Then do:
import os
import sys
import pathlib
PACKAGE_PARENT = pathlib.Path(__file__).parent
#PACKAGE_PARENT = pathlib.Path.cwd().parent # if on jupyter notebook
SCRIPT_DIR = PACKAGE_PARENT / "my_Folder_where_the_package_lives"
sys.path.append(str(SCRIPT_DIR))
from my_package import my_function
Hopefully, this will be of value to someone out there - I went through half a dozen Stack Overflow posts trying to figure out relative imports similar to what's posted above. I set everything up as suggested but I was still hitting ModuleNotFoundError: No module named 'my_module_name'
Since I was just developing locally and playing around, I hadn't created/run a setup.py file. I also apparently hadn't set my PYTHONPATH.
I realized that when I ran my code the way I had been, with the tests in the same directory as the module, it couldn't find my module:
$ python3 test/my_module/module_test.py 2.4.0
Traceback (most recent call last):
File "test/my_module/module_test.py", line 6, in <module>
from my_module.module import *
ModuleNotFoundError: No module named 'my_module'
However, when I explicitly specified the path things started to work:
$ PYTHONPATH=. python3 test/my_module/module_test.py 2.4.0
...........
----------------------------------------------------------------------
Ran 11 tests in 0.001s
OK
So, in the event that anyone has tried a few suggestions, believes their code is structured correctly and still finds themselves in a similar situation as myself try either of the following if you don't export the current directory to your PYTHONPATH:
Run your code and explicitly include the path like so:
$ PYTHONPATH=. python3 test/my_module/module_test.py
To avoid calling PYTHONPATH=., create a setup.py file with contents like the following and run python setup.py develop to add the packages to the path:
# setup.py
from setuptools import setup, find_packages

setup(
    name='sample',
    packages=find_packages()
)
TL;DR
You can only relatively import modules inside another module in the same package.
Concept Clarify
We see a lot of example code in books, docs, and articles that shows how to import a module relatively, but when we do the same, it fails.
The reason, put simply, is that we did not run the code the way the Python module mechanism expects, even though the code itself is written correctly. It is a runtime thing:
module loading depends on how you run the code. That is the source of the confusion.
What is a module?
A Python file is a module only when it is being imported by another file. Given a file mod.py, is it a module? Yes and no: if you run python mod.py, it is not a module, because it is not being imported.
What is a package?
A package is a folder that includes Python module(s).
BTW, __init__.py is not necessary since Python 3.3, if you don't need any package initialization or auto-loading of submodules. You don't need to place a blank __init__.py in a directory.
That shows a package is really just a folder, as long as the files in it are being imported.
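A quick way to convince yourself of that claim (folder and file names here are made up for the example):
#   plainpkg/          <- no __init__.py anywhere
#       greet.py       <- contains: HELLO = "hi"
from plainpkg import greet   # works on Python 3.3+ as a namespace package

print(greet.HELLO)           # -> hi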
Real Answer
Now, this description becomes clearer.
You can only relatively import modules inside another module in the same package.
Given a directory:
. CWD
|-- happy_maker.py # content: print('Sends Happy')
`-- me.py # content: from . import happy_maker
Run python me.py, we got attempted relative import with no known parent package
me.py is run directly, it is not a module, and we can't use relative import in it.
Solution 1
Use import happy_maker instead of from . import happy_maker
Solution 2
Switch our working directory to the parent folder.
. CWD
|-- happy
| |-- happy_maker.py
`-- me.py
Run python -m happy.me.
When we are in the directory that includes happy, happy is a package, me.py, happy_maker.py are modules, we can use relative import now, and we still want to run me.py, so we use -m which means run the module as a script.
Python Idiom
. CWD
|-- happy
| |-- happy_maker.py # content: print('Sends Happy')
| `-- me.py # content: from . import happy_maker
`-- main.py # content: import happy.me
This structure is the python idiom. main is our script, best practice in Python. Finally, we got there.
Siblings or Grandparents
Another common need:
.
|-- happy
| |-- happy_maker.py
| `-- me.py
`-- sad
`-- sad_maker.py
We want to import sad_maker in me.py, How to do that?
First, we need to make happy and sad in the same package, so we have to go up a directory level. And then from ..sad import sad_maker in the me.py.
That is all.
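A sketch of what that looks like after going up a level (the top-level name proj is invented for the example; run it from the directory that contains proj/ with python -m proj.happy.me):
#   proj/
#       __init__.py
#       happy/
#           __init__.py
#           happy_maker.py
#           me.py
#       sad/
#           __init__.py
#           sad_maker.py

# proj/happy/me.py
from . import happy_maker        # sibling module in the same package
from ..sad import sad_maker      # module from the sibling package, as described above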
My boilerplate to make a module with relative imports in a package runnable standalone.
package/module.py
## Standalone boilerplate before relative imports
if __package__ is None:
DIR = Path(__file__).resolve().parent
sys.path.insert(0, str(DIR.parent))
__package__ = DIR.name
from . import variable_in__init__py
from . import other_module_in_package
...
Now you can use your module in any fashion:
Run module as usual: python -m package.module
Use it as a module: python -c 'from package import module'
Run it standalone: python package/module.py
or with shebang (#!/bin/env python) just: package/module.py
NB! Using sys.path.append instead of sys.path.insert will give you a hard-to-trace error if your module has the same name as your package, e.g. my_script/my_script.py.
Of course, if you have relative imports from higher levels in your package hierarchy, then this is not enough, but for most cases it's just okay.
I needed to run python3 from the main project directory to make it work.
For example, if the project has the following structure:
project_demo/
├── main.py
├── some_package/
│ ├── __init__.py
│ └── project_configs.py
└── test/
└── test_project_configs.py
Solution
I would run python3 inside folder project_demo/ and then perform a
from some_package import project_configs
I was getting this ImportError: attempted relative import with no known parent package
In my program I was importing a function from a file in the current directory:
from .filename import function
Then I replaced the leading dot with the package name, which resolved my issue:
from package_name.filename import function
I hope the above answer helps you.
Importing from same directory
Firstly, you can import from the same directory.
Here is the file structure...
Folder
|
├─ Scripts
| ├─ module123.py
|
├─ main.py
├─ script123.py
Here is main.py
from . import script123
from Scripts import module123
As you can see, importing from . imports from current directory.
Note: if running using anything but IDLE, make sure that your terminal is navigated to the same directory as the main.py file before running.
Also, importing from a local folder also works.
Importing from parent directory
As seen in my GitHub gist here, there is the following method.
Take the following file tree...
ParentDirectory
├─ Folder
| |
| ├─ Scripts
| | ├─ module123.py
| |
| ├─ main.py
| ├─ script123.py
|
├─ parentModule.py
Then, just add this code to the top of your main.py file.
import inspect
import os
import sys
current_dir = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))
parent_dir = os.path.dirname(current_dir)
sys.path.insert(0, parent_dir)
from ParentDirectory import Stuff
I tried all of the above to no avail, only to realize I mistakenly had a - in my package name.
In short, don't have - in the directory where __init__.py is. I've never felt elated after finding out such inanity.
if both packages are in your import path (sys.path), and the module/class you want is in example/example.py, then to access the class without relative import try:
from example.example import fkt
If none of the above worked for you, you can specify the module explicitly.
Directory:
├── Project
│ ├── Dir
│ │ ├── __init__.py
│ │ ├── module.py
│ │ └── standalone.py
Solution:
#in standalone.py
from Project.Dir.module import ...
module - the module to be imported
Here is a three-liner for those who disagree with Guido:
import sys
from pathlib import Path
sys.path.append(str(Path(sys.argv[0]).absolute().parent.parent))
Hope it helps.
I think the best solution is to create a package for your module:
Here is more info on how to do it.
Once you have a package you don't need to worry about relative import, you can just do absolute imports.
I encounter this a lot when I am working with Django, since a lot of functionality is performed from the manage.py script but I also want to have some of my modules runnable directly as scripts as well (ideally you would make them manage.py directives but we're not there yet).
This is a mock up of what such a project might look like;
├── dj_app
│   ├── models.py
│   ├── ops
│   │   ├── bar.py
│   │   └── foo.py
│   ├── script.py
│   ├── tests.py
│   ├── utils.py
│   └── views.py
└── manage.py
The important parts here being manage.py, dj_app/script.py, and dj_app/tests.py. We also have submodules dj_app/ops/bar.py and dj_app/ops/foo.py which contain more items we want to use throughout the project.
The source of the issue commonly comes from wanting your dj_app/script.py script methods to have test cases in dj_app/tests.py which get invoked when you run manage.py test.
This is how I set up the project and its imports;
# dj_app/ops/foo.py
# Foo operation methods and classes
foo_val = "foo123"
.
# dj_app/ops/bar.py
# Bar operations methods and classes
bar_val = "bar123"
.
# dj_app/script.py
# script to run app methods from CLI

# if run directly from command line
if __name__ == '__main__':
    from ops.bar import bar_val
    from ops.foo import foo_val
# otherwise
else:
    from .ops.bar import bar_val
    from .ops.foo import foo_val

def script_method1():
    print("this is script_method1")
    print("bar_val: {}".format(bar_val))
    print("foo_val: {}".format(foo_val))

if __name__ == '__main__':
    print("running from the script")
    script_method1()
.
# dj_app/tests.py
# test cases for the app
# do not run this directly from CLI or the imports will break

from .script import script_method1
from .ops.bar import bar_val
from .ops.foo import foo_val

def main():
    print("Running the test case")
    print("testing script method")
    script_method1()

if __name__ == '__main__':
    print("running tests from command line")
    main()
.
# manage.py
# just run the test cases for this example
import dj_app.tests
dj_app.tests.main()
.
Running the test cases from manage.py;
$ python3 manage.py
Running the test case
testing script method
this is script_method1
bar_val: bar123
foo_val: foo123
Running the script on its own;
$ python3 dj_app/script.py
running from the script
this is script_method1
bar_val: bar123
foo_val: foo123
Note that you get an error if you try to run tests.py directly, however, so don't do that;
$ python3 dj_app/tests.py
Traceback (most recent call last):
File "dj_app/tests.py", line 5, in <module>
from .script import script_method1
ModuleNotFoundError: No module named '__main__.script'; '__main__' is not a package
If I run into more complicated situations for imports, I usually end up implementing something like this to hack through it;
import os
import sys
THIS_DIR = os.path.dirname(os.path.realpath(__file__))
sys.path.insert(0, THIS_DIR)
from script import script_method1
sys.path.pop(0)
This is my project structure
├── folder
| |
│ ├── moduleA.py
| | |
| | └--function1()
| | └~~ uses function2()
| |
│ └── moduleB.py
| |
| └--function2()
|
└── main.py
└~~ uses function1()
Here my moduleA imports moduleB and main imports moduleA
I added the snippet below in moduleA to import moduleB
try:
    from .moduleB import function2
except:
    from moduleB import function2
Now I can execute both main.py as well as moduleA.py individually
Is this a solution ?
The below solution is tested on Python3
├── classes
| |
| ├──__init__.py
| |
│ ├── userclass.py
| | |
| | └--viewDetails()
| |
| |
│ └── groupclass.py
| |
| └--viewGroupDetails()
|
└── start.py
└~~ uses function1()
Now, in order to use viewDetails of userclass or viewGroupDetails of groupclass, define them in the __init__.py of the classes directory first.
Ex: In __init__.py
from .userclass import viewDetails
from .groupclass import viewGroupDetails
Step 2: Now, in start.py we can directly import viewDetails
Ex: In start.py
from classes import viewDetails
from classes import viewGroupDetails
I ran into a similar problem when trying to write a python file that can be loaded either as a module or an executable script.
Setup
/path/to/project/
├── __init__.py
├── main.py
└── mylib/
    ├── list_util.py
    └── args_util.py
with:
main.py:
#!/usr/bin/env python3
import sys
import mylib.args_util

if __name__ == '__main__':
    print(f'{mylib.args_util.parseargs(sys.argv[1:])=}')
mylib/list_util.py:
def to_int_list(args):
    return [int(x) for x in args]
mylib/args_util.py:
#!/usr/bin/env python3
import sys
from . import list_util as lu

def parseargs(args):
    return sum(lu.to_int_list(args))

if __name__ == '__main__':
    print(f'{parseargs(sys.argv[1:])=}')
Output
$ ./main.py 1 2 3
mylib.args_util.parseargs(sys.argv[1:])=6
$ mylib/args_util.py 1 2 3
Traceback (most recent call last):
File "/path/to/project/mylib/args_util.py", line 10, in <module>
from . import list_util as lu
ImportError: attempted relative import with no known parent package
Solution
I settled for a Bash/Python polyglot solution. The Bash version of the program just calls python3 -m mylib.args_util then exits.
The Python version ignores the Bash code because it's contained in the docstring.
The Bash version ignores the Python code because it uses exec to stop parsing/running lines.
mylib/args_util.py:
#!/bin/bash
# -*- Mode: python -*-
''''true
exec /usr/bin/env python3 -m mylib.args_util "$@"
'''
import sys
from . import list_util as lu

def parseargs(args):
    return sum(lu.to_int_list(args))

if __name__ == '__main__':
    print(f'{parseargs(sys.argv[1:])=}')
Output
$ ./main.py 1 2 3
mylib.args_util.parseargs(sys.argv[1:])=6
$ mylib/args_util.py 1 2 3
parseargs(sys.argv[1:])=6
Explanation
Line 1: #!/bin/bash; this is the "shebang" line; it tells the interactive shell how to run this script.
Python: ignored (comment)
Bash: ignored (comment)
Line 2: # -*- Mode: python -*- optional; this is called the "mode-line"; it tells Emacs to use Python syntax highlighting instead of guessing that the language is Bash when reading the file.
Python: ignored (comment)
Bash: ignored (comment)
Line 3: ''''true
Python: views this as an unassigned docstring starting with 'true\n
Bash: views this as three strings (of which the first two are empty strings) that expand to true (i.e. '' + '' + 'true' = 'true'); it then runs true (which does nothing) and continues to the next line
Line 4: exec /usr/bin/env python3 -m mylib.args_util "$@"
Python: still views this as part of the docstring from line 3.
Bash: runs python3 -m mylib.args_util then exits (it doesn't parse anything beyond this line)
Line 5: '''
Python: views this as the end of the docstring from line 3.
Bash: doesn't parse this line
Caveats
This doesn't work on Windows:
Workaround: Use WSL or a Batch wrapper script to call python -m mylib.args_util.
This only works if the current working directory is set to /path/to/project/.
Workaround: Set PYTHONPATH when calling /usr/bin/env
#!/bin/bash
# -*- Mode: python -*-
''''true
exec /usr/bin/env PYTHONPATH="$(cd "$(dirname "$0")/.." ; pwd)" \
    python3 -m mylib.args_util "$@"
'''
I've created a new, experimental import library for Python: ultraimport
It gives the programmer more control over imports and makes them unambiguous. Also it gives better error messages when an import fails.
It allows you to do relative, file-system based imports that always work, no matter how you run your code and no matter what is your current working directory. It does not matter if you run a script or module. You also don't have to change sys.path which might have other side effects.
You would then change
from .mymodule import myfunction
to
import ultraimport
myfunction = ultraimport('__dir__/mymodule.py', 'myfunction')
This way the import will always work, even if you run the code as script.
One issue when importing scripts like this is that subsequent relative imports might fail. ultraimport has a builtin preprocessor to automatically rewrite relative imports.
I had a similar problem: I needed a Linux service and a CGI plugin which use common constants to cooperate. The 'natural' way to do this is to place them in the __init__.py of the package, but I cannot start the CGI plugin with the -m parameter.
My final solution was similar to Solution #2 above:
import sys
import pathlib as p
import importlib
pp = p.Path(sys.argv[0])
pack = pp.resolve().parent
pkg = importlib.import_module('__init__', package=str(pack))
The disadvantage is that you must prefix the constants (or common functions) with pkg:
print(pkg.Glob)
TL;DR: Append the script path to the system path by adding the following at the entry point of your Python script.
import os.path
import sys
PACKAGE_PARENT = '..'
SCRIPT_DIR = os.path.dirname(os.path.realpath(os.path.join(os.getcwd(), os.path.expanduser(__file__))))
sys.path.append(os.path.normpath(os.path.join(SCRIPT_DIR, PACKAGE_PARENT)))
That's it, now you can run your project in PyCharm as well as from the terminal!
Moving the file from which you are importing to an outside directory helps.
This is especially useful when your main file generates other files inside its own directory.
Ex:
Before:
Project
|---dir1
|-------main.py
|-------module1.py
After:
Project
|---module1.py
|---dir1
|-------main.py
I was getting the same error and my project structure was like
->project
->vendors
->vendors.py
->main.py
I was trying to call like this
from .vendors.Amazon import Amazom_Purchase
Here it was throwing an error so I fixed it simply by removing the first . from the statement
from vendors.Amazon import Amazom_Purchase
Hope this helps.
It's good to note that sometimes the cache is the cause of it all - after rearranging classes into new directories I tried different things, and relative imports only started to work after I removed the __pycache__ folders.
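If that keeps happening, a small throwaway helper can clear them all from the project root (just a convenience sketch; it deletes every __pycache__ folder below the current directory):
import pathlib
import shutil

for cache_dir in pathlib.Path(".").rglob("__pycache__"):
    shutil.rmtree(cache_dir)
    print("removed", cache_dir)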
If the following import:
from . import something
doesn't work for you, it is because this is a packaging-style import and will not work with a plain script layout. Here is an example showing how to use it:
Folder structure:
.
└── funniest
├── funniest
│ ├── __init__.py
│ └── text.py
├── main.py
└── setup.py
inside __init__.py add:
def available_module():
    return "hello world"
text.py add:
from . import available_module
inside setup.py add
from setuptools import setup

setup(name='funniest',
      version='0.1',
      description='The funniest joke in the world',
      url='http://github.com/storborg/funniest',
      author='Flying Circus',
      author_email='flyingcircus@example.com',
      license='MIT',
      packages=['funniest'],
      zip_safe=False)
Now, this is the most important part you need to install your package:
pip install .
Anywhere else in our system using the same Python, we can do this now:
>>> import funniest.text as fun
>>> fun.available_module()
This should output 'hello world'
you can test this in main.py (this will not require any installation of the Package)
Here is main.py as well
import funniest.text as fun
print(fun.available_module())

Import problem when separating applications and package tests

Maybe my goal and what I am trying to do here are wrong, in the sense of being unpythonic. I am open to any suggestions about that.
My goals
Application (myapp) with its own tests folder.
A package (mypackage) with its own tests folder.
The tests for the package should be runnable from the application folder and from the package folder.
The package has implicit and explicit components. The latter need to be imported explicitly (e.g. via import mypackage.mymoduleB).
The package (folder) can be copied (shipped for reuse in other applications?) to other file system locations without losing its functionality and testability. That is why tests is inside the package folder and not outside.
That is the folder tree, where itest is the name of the project, myapp is the application with an if __name__ == '__main__': in it, and mypackage is the package.
itest
└── myapp
    ├── myapp.py
    ├── mypackage
    │   ├── __init__.py
    │   ├── _mymoduleA.py
    │   ├── mymoduleB.py
    │   └── tests
    │       ├── __init__.py
    │       └── test_all.py
    └── tests
        ├── __init__.py
        └── test_myapp.py
The problem
I can run the unittests from the application directory without problems.
/home/user/tab-cloud/_transfer/itest/myapp $ python3 -m unittest -vvv
test_A (mypackage.tests.test_all.TestAll) ... mymoduleA.foo()
ok
test_B (mypackage.tests.test_all.TestAll) ... mymoduleB.bar()
ok
test_myname (tests.test_myapp.TestMyApp) ... ok
----------------------------------------------------------------------
Ran 3 tests in 0.001s
OK
But when I go inside the package the tests do not run (see goal #3).
/home/user/tab-cloud/_transfer/itest/myapp/mypackage $ python3 -m unittest -vvv
tests.test_all (unittest.loader._FailedTest) ... ERROR
======================================================================
ERROR: tests.test_all (unittest.loader._FailedTest)
----------------------------------------------------------------------
ImportError: Failed to import test module: tests.test_all
Traceback (most recent call last):
File "/usr/lib/python3.9/unittest/loader.py", line 436, in _find_test_path
module = self._get_module_from_name(name)
File "/usr/lib/python3.9/unittest/loader.py", line 377, in _get_module_from_name
__import__(name)
File "/home/user/tab-cloud/_transfer/itest/myapp/mypackage/tests/test_all.py", line 12, in <module>
from . import mypackage
ImportError: cannot import name 'mypackage' from 'tests' (/home/user/tab-cloud/_transfer/itest/myapp/mypackage/tests/__init__.py)
----------------------------------------------------------------------
Ran 1 test in 0.001s
FAILED (errors=1)
The MWE
Now I show you the files. To make sure the tests for the package use the right import when run from the application folder or from the package folder, I use importlib (based on an existing solution).
The three files that form the package
This is myapp/mypackage/__init__.py:
# imported implicitly via 'mypackage'
from ._mymoduleA import *

# 'mymoduleB' needs to be imported explicitly
# via 'mypackage.mymoduleB'
This is myapp/mypackage/_mymoduleA.py:
def foo():
    print('mymoduleA.foo()')
    return 1
This is myapp/mypackage/mymoduleB.py:
def bar():
    print('mymoduleB.bar()')
    return 2
The tests for the package
The myapp/mypackage/tests/__init__.py is empty.
This is myapp/mypackage/tests/test_all.py:
import importlib
import unittest

# The package should be able to be tested by itself (run unittest inside the
# package directory) AND from the using application (run unittest in
# application directory).
# Based on: https://stackoverflow.com/a/14050282/4865723
if importlib.util.find_spec('mypackage'):
    import mypackage
    import mypackage.mymoduleB
else:
    from . import mypackage
    from mypackage import mymoduleB

class TestAll(unittest.TestCase):
    def test_A(self):
        self.assertEqual(1, mypackage.foo())

    def test_B(self):
        self.assertEqual(2, mypackage.mymoduleB.bar())
The application
This is myapp/myapp.py:
#!/usr/bin/env python3
import mypackage

def myname():
    return 'My application!'

if __name__ == '__main__':
    print(myname())
    mypackage.foo()

    try:
        mypackage.mymoduleB.bar()
    except AttributeError:
        # we expecting this
        print('Not imported yet: "mymoduleB.bar()"')

    # this should work
    import mypackage.mymoduleB
    mypackage.mymoduleB.bar()
The test for the application
The myapp/tests/__init__.py is empty.
This is myapp/tests/test_myapp.py:
import unittest
import myapp

class TestMyApp(unittest.TestCase):
    def test_myname(self):
        self.assertEqual(myapp.myname(), 'My application!')
Sidenotes
Please let me explain something more about my goals. mypackage should be reusable in other projects. In practice this means I copy the mypackage folder from one place to another, and while copying that folder I want the tests folder to come with it without explicitly thinking about it, which is why it is not outside the package folder. And if the new project does unit testing, the tests of the package should be involved in that unit testing automatically (via discover).
Your goal is really a bit unpythonic. But sometimes, you have to break the rules to free your heart.
You can solve the problem by checking for the __package__ attribute in myapp/mypackage/__init__.py like this:
# hint from there: https://stackoverflow.com/a/65426846/4865723
import sys
from pathlib import Path
sys.path.insert(0, str(Path(__file__).resolve().parent.parent))
if __package__:
from ._mymoduleA import foo
else:
from _mymoduleA import *
In this case the code in myapp/mypackage/tests/test_all.py gets a little simpler:
import importlib
import unittest

if not importlib.util.find_spec('mypackage'):
    from __init__ import *

import mypackage
from mypackage import mymoduleB

class TestAll(unittest.TestCase):
    def test_A(self):
        self.assertEqual(1, mypackage.foo())

    def test_B(self):
        self.assertEqual(2, mymoduleB.bar())
All other files remain unchanged.
As a result, you get the ability to run tests from both /myapp and /myapp/mypackage folder. At the same time, there is no need to hardcode any absolute paths. The app can be copied to any other file system locations.
I hope it will be useful for you.
I created an import library a couple of years ago. It works on pathing. I used it to create a plugin system where I could essentially install and import multiple versions of any library (with some limitations).
For this we get the current path of the module. Then we import the package using the path. This library will automatically add the proper path to sys.path.
All you need to do is install pylibimp pip install pylibimp and edit myapp/mypackage/tests/test_all.py
import os
import pylibimp
import unittest

path_tests = os.path.join(os.path.dirname(__file__))
path_mypackage = os.path.dirname(path_tests)
path_myapp = os.path.dirname(path_mypackage)

mypackage = pylibimp.import_module(os.path.join(path_myapp, 'mypackage'), reset_modules=False)

class TestAll(unittest.TestCase):
    def test_A(self):
        self.assertEqual(1, mypackage.foo())

    def test_B(self):
        self.assertEqual(2, mypackage.mymoduleB.bar())
I believe the background is fairly simple.
import os
import sys
sys.path.insert(0, os.path.abspath('path/to/myapp'))
# Since path is added we can "import mypackage"
mypackage = __import__('mypackage')
sys.path.pop(0) # remove the added path to not mess with other imports
I hope this is what you are looking for.

Python Pyramid framework: Unable to include another .py file [duplicate]

I want to import a function from another file in the same directory.
Usually, one of the following works:
from .mymodule import myfunction
from mymodule import myfunction
...but the other one gives me one of these errors:
ImportError: attempted relative import with no known parent package
ModuleNotFoundError: No module named 'mymodule'
SystemError: Parent module '' not loaded, cannot perform relative import
Why is this?
unfortunately, this module needs to be inside the package, and it also
needs to be runnable as a script, sometimes. Any idea how I could
achieve that?
It's quite common to have a layout like this...
main.py
mypackage/
__init__.py
mymodule.py
myothermodule.py
...with a mymodule.py like this...
#!/usr/bin/env python3
# Exported function
def as_int(a):
return int(a)
# Test function for module
def _test():
assert as_int('1') == 1
if __name__ == '__main__':
_test()
...a myothermodule.py like this...
#!/usr/bin/env python3
from .mymodule import as_int
# Exported function
def add(a, b):
    return as_int(a) + as_int(b)

# Test function for module
def _test():
    assert add('1', '1') == 2

if __name__ == '__main__':
    _test()
...and a main.py like this...
#!/usr/bin/env python3
from mypackage.myothermodule import add
def main():
    print(add('1', '1'))

if __name__ == '__main__':
    main()
...which works fine when you run main.py or mypackage/mymodule.py, but fails with mypackage/myothermodule.py, due to the relative import...
from .mymodule import as_int
The way you're supposed to run it is...
python3 -m mypackage.myothermodule
...but it's somewhat verbose, and doesn't mix well with a shebang line like #!/usr/bin/env python3.
The simplest fix for this case, assuming the name mymodule is globally unique, would be to avoid using relative imports, and just use...
from mymodule import as_int
...although, if it's not unique, or your package structure is more complex, you'll need to include the directory containing your package directory in PYTHONPATH, and do it like this...
from mypackage.mymodule import as_int
...or if you want it to work "out of the box", you can frob the PYTHONPATH in code first with this...
import sys
import os
SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))
sys.path.append(os.path.dirname(SCRIPT_DIR))
from mypackage.mymodule import as_int
It's kind of a pain, but there's a clue as to why in an email written by a certain Guido van Rossum...
I'm -1 on this and on any other proposed twiddlings of the __main__
machinery. The only use case seems to be running scripts that happen
to be living inside a module's directory, which I've always seen as an
antipattern. To make me change my mind you'd have to convince me that
it isn't.
Whether running scripts inside a package is an antipattern or not is subjective, but personally I find it really useful in a package I have which contains some custom wxPython widgets, so I can run the script for any of the source files to display a wx.Frame containing only that widget for testing purposes.
Explanation
From PEP 328
Relative imports use a module's __name__ attribute to determine that
module's position in the package hierarchy. If the module's name does
not contain any package information (e.g. it is set to '__main__')
then relative imports are resolved as if the module were a top level
module, regardless of where the module is actually located on the file
system.
At some point PEP 338 conflicted with PEP 328:
... relative imports rely on __name__ to determine the current
module's position in the package hierarchy. In a main module, the
value of __name__ is always '__main__', so explicit relative imports
will always fail (as they only work for a module inside a package)
and to address the issue, PEP 366 introduced the top level variable __package__:
By adding a new module level attribute, this PEP allows relative
imports to work automatically if the module is executed using the -m
switch. A small amount of boilerplate in the module itself will allow
the relative imports to work when the file is executed by name. [...] When it [the attribute] is present, relative imports will be based on this attribute
rather than the module __name__ attribute. [...] When the main module is specified by its filename, then the __package__ attribute will be set to None. [...] When the import system encounters an explicit relative import in a
module without __package__ set (or with it set to None), it will
calculate and store the correct value (__name__.rpartition('.')[0]
for normal modules and __name__ for package initialisation modules)
(emphasis mine)
If __name__ is '__main__', __name__.rpartition('.')[0] returns an empty string. This is why there's an empty string literal in the error description:
SystemError: Parent module '' not loaded, cannot perform relative import
The relevant part of the CPython's PyImport_ImportModuleLevelObject function:
if (PyDict_GetItem(interp->modules, package) == NULL) {
    PyErr_Format(PyExc_SystemError,
        "Parent module %R not loaded, cannot perform relative "
        "import", package);
    goto error;
}
CPython raises this exception if it was unable to find package (the name of the package) in interp->modules (accessible as sys.modules). Since sys.modules is "a dictionary that maps module names to modules which have already been loaded", it's now clear that the parent module must be explicitly absolute-imported before performing relative import.
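As a quick, hedged illustration of that mechanism (reusing the package/module.py layout from Solution #1 below, and assuming the directory that contains package is the current working directory): absolute-importing the parent first puts it into sys.modules, and once __package__ is set the relative import can resolve.
import sys

import package                     # absolute import: the parent package gets loaded
print('package' in sys.modules)    # True - the parent is now registered in sys.modules
__package__ = 'package'            # tell the import machinery what "." refers to
from . import module               # the explicit relative import now succeeds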
Note: The patch from the issue 18018 has added another if block, which will be executed before the code above:
if (PyUnicode_CompareWithASCIIString(package, "") == 0) {
    PyErr_SetString(PyExc_ImportError,
        "attempted relative import with no known parent package");
    goto error;
} /* else if (PyDict_GetItem(interp->modules, package) == NULL) {
    ...
*/
If package (same as above) is empty string, the error message will be
ImportError: attempted relative import with no known parent package
However, you will only see this in Python 3.6 or newer.
Solution #1: Run your script using -m
Consider a directory (which is a Python package):
.
├── package
│   ├── __init__.py
│   ├── module.py
│   └── standalone.py
All of the files in package begin with the same 2 lines of code:
from pathlib import Path
print('Running' if __name__ == '__main__' else 'Importing', Path(__file__).resolve())
I'm including these two lines only to make the order of operations obvious. We can ignore them completely, since they don't affect the execution.
__init__.py and module.py contain only those two lines (i.e., they are effectively empty).
standalone.py additionally attempts to import module.py via relative import:
from . import module # explicit relative import
We're well aware that /path/to/python/interpreter package/standalone.py will fail. However, we can run the module with the -m command line option that will "search sys.path for the named module and execute its contents as the __main__ module":
vaultah@base:~$ python3 -i -m package.standalone
Importing /home/vaultah/package/__init__.py
Running /home/vaultah/package/standalone.py
Importing /home/vaultah/package/module.py
>>> __file__
'/home/vaultah/package/standalone.py'
>>> __package__
'package'
>>> # The __package__ has been correctly set and module.py has been imported.
... # What's inside sys.modules?
... import sys
>>> sys.modules['__main__']
<module 'package.standalone' from '/home/vaultah/package/standalone.py'>
>>> sys.modules['package.module']
<module 'package.module' from '/home/vaultah/package/module.py'>
>>> sys.modules['package']
<module 'package' from '/home/vaultah/package/__init__.py'>
-m does all the importing stuff for you and automatically sets __package__, but you can do that yourself in the
Solution #2: Set __package__ manually
Please treat it as a proof of concept rather than an actual solution. It isn't well-suited for use in real-world code.
PEP 366 has a workaround to this problem, however, it's incomplete, because setting __package__ alone is not enough. You're going to need to import at least N preceding packages in the module hierarchy, where N is the number of parent directories (relative to the directory of the script) that will be searched for the module being imported.
Thus,
Add the parent directory of the Nth predecessor of the current module to sys.path
Remove the current file's directory from sys.path
Import the parent module of the current module using its fully-qualified name
Set __package__ to the fully-qualified name from the previous step
Perform the relative import
I'll borrow files from the Solution #1 and add some more subpackages:
package
├── __init__.py
├── module.py
└── subpackage
├── __init__.py
└── subsubpackage
├── __init__.py
└── standalone.py
This time standalone.py will import module.py from the package package using the following relative import
from ... import module # N = 3
We'll need to precede that line with the boilerplate code, to make it work.
import sys
from pathlib import Path
if __name__ == '__main__' and __package__ is None:
    file = Path(__file__).resolve()
    parent, top = file.parent, file.parents[3]
    sys.path.append(str(top))
    try:
        sys.path.remove(str(parent))
    except ValueError: # Already removed
        pass
    import package.subpackage.subsubpackage
    __package__ = 'package.subpackage.subsubpackage'

from ... import module # N = 3
It allows us to execute standalone.py by filename:
vaultah@base:~$ python3 package/subpackage/subsubpackage/standalone.py
Running /home/vaultah/package/subpackage/subsubpackage/standalone.py
Importing /home/vaultah/package/__init__.py
Importing /home/vaultah/package/subpackage/__init__.py
Importing /home/vaultah/package/subpackage/subsubpackage/__init__.py
Importing /home/vaultah/package/module.py
A more general solution wrapped in a function can be found here. Example usage:
if __name__ == '__main__' and __package__ is None:
    import_parents(level=3) # N = 3

from ... import module
from ...module.submodule import thing
Solution #3: Use absolute imports and setuptools
The steps are -
Replace explicit relative imports with equivalent absolute imports
Install package to make it importable
For instance, the directory structure may be as follows
.
├── project
│   ├── package
│   │   ├── __init__.py
│   │   ├── module.py
│   │   └── standalone.py
│   └── setup.py
where setup.py is
from setuptools import setup, find_packages
setup(
    name = 'your_package_name',
    packages = find_packages(),
)
The rest of the files were borrowed from the Solution #1.
Installation will allow you to import the package regardless of your working directory (assuming there'll be no naming issues).
We can modify standalone.py to use this advantage (step 1):
from package import module # absolute import
Change your working directory to project and run /path/to/python/interpreter setup.py install --user (--user installs the package in your site-packages directory) (step 2):
vaultah@base:~$ cd project
vaultah@base:~/project$ python3 setup.py install --user
Let's verify that it's now possible to run standalone.py as a script:
vaultah@base:~/project$ python3 -i package/standalone.py
Running /home/vaultah/project/package/standalone.py
Importing /home/vaultah/.local/lib/python3.6/site-packages/your_package_name-0.0.0-py3.6.egg/package/__init__.py
Importing /home/vaultah/.local/lib/python3.6/site-packages/your_package_name-0.0.0-py3.6.egg/package/module.py
>>> module
<module 'package.module' from '/home/vaultah/.local/lib/python3.6/site-packages/your_package_name-0.0.0-py3.6.egg/package/module.py'>
>>> import sys
>>> sys.modules['package']
<module 'package' from '/home/vaultah/.local/lib/python3.6/site-packages/your_package_name-0.0.0-py3.6.egg/package/__init__.py'>
>>> sys.modules['package.module']
<module 'package.module' from '/home/vaultah/.local/lib/python3.6/site-packages/your_package_name-0.0.0-py3.6.egg/package/module.py'>
Note: If you decide to go down this route, you'd be better off using virtual environments to install packages in isolation.
Solution #4: Use absolute imports and some boilerplate code
Frankly, the installation is not necessary - you could add some boilerplate code to your script to make absolute imports work.
I'm going to borrow files from Solution #1 and change standalone.py:
Add the parent directory of package to sys.path before attempting to import anything from package using absolute imports:
import sys
from pathlib import Path # if you haven't already done so
file = Path(__file__).resolve()
parent, root = file.parent, file.parents[1]
sys.path.append(str(root))
# Additionally remove the current file's directory from sys.path
try:
    sys.path.remove(str(parent))
except ValueError: # Already removed
    pass
Replace the relative import by the absolute import:
from package import module # absolute import
standalone.py runs without problems:
vaultah@base:~$ python3 -i package/standalone.py
Running /home/vaultah/package/standalone.py
Importing /home/vaultah/package/__init__.py
Importing /home/vaultah/package/module.py
>>> module
<module 'package.module' from '/home/vaultah/package/module.py'>
>>> import sys
>>> sys.modules['package']
<module 'package' from '/home/vaultah/package/__init__.py'>
>>> sys.modules['package.module']
<module 'package.module' from '/home/vaultah/package/module.py'>
I feel that I should warn you: try not to do this, especially if your project has a complex structure.
As a side note, PEP 8 recommends the use of absolute imports, but states that in some scenarios explicit relative imports are acceptable:
Absolute imports are recommended, as they are usually more readable
and tend to be better behaved (or at least give better error
messages). [...] However, explicit relative imports are an acceptable
alternative to absolute imports, especially when dealing with complex
package layouts where using absolute imports would be unnecessarily
verbose.
Put this inside your package's __init__.py file:
# For relative imports to work in Python 3.6
import os, sys; sys.path.append(os.path.dirname(os.path.realpath(__file__)))
Assuming your package is like this:
├── project
│ ├── package
│ │ ├── __init__.py
│ │ ├── module1.py
│ │ └── module2.py
│ └── setup.py
Now use regular imports in your package, like:
# in module2.py
from module1 import class1
This works in both python 2 and 3.
I ran into this issue. A hack workaround is importing via an if/else block like follows:
#!/usr/bin/env python3
#myothermodule
if __name__ == '__main__':
    from mymodule import as_int
else:
    from .mymodule import as_int

# Exported function
def add(a, b):
    return as_int(a) + as_int(b)

# Test function for module
def _test():
    assert add('1', '1') == 2

if __name__ == '__main__':
    _test()
SystemError: Parent module '' not loaded, cannot perform relative import
This means you are running a module inside the package as a script. Mixing scripts inside packages is tricky and should be avoided if at all possible. Use a wrapper script that imports the package and runs your scripty function instead.
If your top-level directory is called foo, which is on your PYTHONPATH module search path, and you have a package bar there (a directory you'd expect an __init__.py file in), scripts should not be placed inside bar but should live directly in foo at best.
Note that scripts differ from modules here in that they are used as a filename argument to the python command, either by using python <filename> or via a #! (shebang) line. It is loaded directly as the __main__ module (this is why if __name__ == "__main__": works in scripts), and there is no package context to build on for relative imports.
Your options
If you can, package your project with setuptools (or poetry or flit, which can help simplify packaging), and create console script entrypoints; installing your project with pip then creates scripts that know how to import your package properly (see the sketch after this list). You can install your package locally with pip install -e ., so it can still be edited in-place.
Otherwise, never, ever, use python path/to/packagename/file.py; always use python path/to/script.py, where script.py can use from packagename import ....
As a fallback, you could use the -m command-line switch to tell Python to import a module and use that as the __main__ file instead. This does not work with a shebang line, as there is no script file any more, however.
If you use python -m foo.bar and foo/bar.py is found in a sys.path directory, that is then imported and executed as __main__ with the right package context. If bar is also a package, inside foo/, it must have a __main__.py file (so foo/bar/__main__.py as the path from the sys.path directory).
In extreme circumstances, add the metadata Python uses to resolve relative imports by setting __package__ directly; the file foo/bar/spam.py, importable as foo.bar.spam, is given the global __package__ = "foo.bar". It is just another global, like __file__ and __name__, set by Python when imported.
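A minimal, hedged sketch of the first two options follows; the names packagename, cli and main are placeholders rather than anything from this answer, and the setup.py shown is just one way to declare a console-script entry point.
# setup.py - declares a console-script entry point for an assumed package "packagename"
from setuptools import setup, find_packages

setup(
    name='packagename',
    packages=find_packages(),
    entry_points={
        'console_scripts': [
            # after "pip install -e .", a "packagename-cli" command is generated
            # that imports packagename.cli and calls its main() function
            'packagename-cli = packagename.cli:main',
        ],
    },
)

# script.py - the wrapper-script alternative; it lives next to the packagename/ directory
from packagename.cli import main   # absolute import of the package; no relative-import issues

if __name__ == '__main__':
    main()
Either way, the code that does the real work stays importable inside the package, and only a thin entry point is executed directly.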
On sys.path
The above all requires that your package can be imported, which means it needs to be found in one of the directories (or zipfiles) listed in sys.path. There are several options here too:
The directory where path/to/script.py was found (so path/to) is automatically added to sys.path. Executing python path/to/foo.py adds path/to to sys.path.
If you packaged your project (with setuptools, poetry, flit or another Python packaging tool), and installed it, the package has been added to the right place already.
As a last resort, add the right directory to sys.path yourself, as sketched below. If the package can be located relative to the script file, use the __file__ variable in the script's global namespace (e.g. with a pathlib.Path object, HERE = Path(__file__).resolve().parent is a reference to the directory the file lives in, as an absolute path).
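A minimal sketch of that last-resort option, assuming the script sits one directory below the directory that contains the package (packagename and somemodule are placeholder names, not from this answer):
# path/to/script.py
import sys
from pathlib import Path

HERE = Path(__file__).resolve().parent   # absolute path of the directory this file lives in
sys.path.insert(0, str(HERE.parent))     # assumed: packagename/ lives one level above the script

from packagename import somemodule       # now resolvable via the extended sys.path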
For PyCharm users:
I also was getting ImportError: attempted relative import with no known parent package because I was adding the . notation to silence a PyCharm parsing error. PyCharm inaccurately reports not being able to find:
from lib.thing import function
If you change it to:
from .lib.thing import function
it silences the error, but then you get the aforementioned ImportError: attempted relative import with no known parent package. Just ignore PyCharm's parser; it's wrong, and the code runs fine despite what it says.
To obviate this problem, I devised a solution with the repackage package, which has worked for me for some time. It adds the upper directory to the lib path:
import repackage
repackage.up()
from mypackage.mymodule import myfunction
Repackage can make relative imports that work in a wide range of cases, using an intelligent strategy (inspecting the call stack).
TL;DR: @Aya's answer, updated with the pathlib library and working for Jupyter notebooks, where __file__ is not defined:
You want to import my_function defined under ../my_Folder_where_the_package_lives/my_package.py
relative to where you are writing the code.
Then do:
import os
import sys
import pathlib
PACKAGE_PARENT = pathlib.Path(__file__).parent
#PACKAGE_PARENT = pathlib.Path.cwd().parent # if on jupyter notebook
SCRIPT_DIR = PACKAGE_PARENT / "my_Folder_where_the_package_lives"
sys.path.append(str(SCRIPT_DIR))
from my_package import my_function
Hopefully, this will be of value to someone out there - I went through half a dozen Stack Overflow posts trying to figure out relative imports similar to what's posted above. I set up everything as suggested but I was still hitting ModuleNotFoundError: No module named 'my_module_name'.
Since I was just developing locally and playing around, I hadn't created/run a setup.py file. I also apparently hadn't set my PYTHONPATH.
I realized that when I ran my code the way I had been, with the tests in the same directory as the module, Python couldn't find my module:
$ python3 test/my_module/module_test.py 2.4.0
Traceback (most recent call last):
File "test/my_module/module_test.py", line 6, in <module>
from my_module.module import *
ModuleNotFoundError: No module named 'my_module'
However, when I explicitly specified the path things started to work:
$ PYTHONPATH=. python3 test/my_module/module_test.py 2.4.0
...........
----------------------------------------------------------------------
Ran 11 tests in 0.001s
OK
So, in the event that anyone has tried a few suggestions, believes their code is structured correctly, and still finds themselves in a similar situation to mine, try either of the following if you don't export the current directory to your PYTHONPATH:
Run your code and explicitly include the path like so:
$ PYTHONPATH=. python3 test/my_module/module_test.py
To avoid calling PYTHONPATH=., create a setup.py file with contents like the following and run python setup.py develop to add packages to the path:
# setup.py
from setuptools import setup, find_packages
setup(
    name='sample',
    packages=find_packages()
)
TL;DR
You can only relatively import modules inside another module in the same package.
Concept Clarify
We see a lot of example code in books/docs/articles that shows us how to relatively import a module, but when we do so, it fails.
The reason, put simply, is that we did not run the code the way the Python module mechanism expects, even though the code itself is written correctly. It is a runtime matter.
Module loading depends on how you run the code. That is the source of the confusion.
What is a module?
A module is a Python file if and only if it is being imported by another file. Given a file mod.py, is it a module? Yes and no: if you run python mod.py, it is not a module, because it is not imported.
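A tiny, hedged illustration (mod.py is a hypothetical one-line file, not taken from this answer):
# mod.py
print(__name__)

# python mod.py   -> prints "__main__"  (run as a script, so not a module)
# import mod      -> prints "mod"       (loaded as a module by another file)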
What is a package?
A package is a folder that includes Python module(s).
By the way, __init__.py has not been necessary since Python 3.3 if you don't need any package initialization or auto-loaded submodules; you don't need to place a blank __init__.py in a directory.
That proves a package is just a folder as long as there are files being imported.
Real Answer
Now, this description becomes clearer.
You can only relatively import modules inside another module in the same package.
Given a directory:
. CWD
|-- happy_maker.py # content: print('Sends Happy')
`-- me.py # content: from . import happy_maker
Run python me.py and we get attempted relative import with no known parent package.
me.py is run directly, so it is not a module, and we can't use relative imports in it.
Solution 1
Use import happy_maker instead of from . import happy_maker
Solution 2
Switch our working directory to the parent folder.
. CWD
|-- happy
| |-- happy_maker.py
`-- me.py
Run python -m happy.me.
When we are in the directory that contains happy, happy is a package and me.py and happy_maker.py are modules, so we can use relative imports now. We still want to run me.py, so we use -m, which means run the module as a script.
Python Idiom
. CWD
|-- happy
| |-- happy_maker.py # content: print('Sends Happy')
| `-- me.py # content: from . import happy_maker
`-- main.py # content: import happy.me
This structure is the Python idiom: main.py is our script, which is best practice in Python. Finally, we got there.
Siblings or Grandparents
Another common need:
.
|-- happy
| |-- happy_maker.py
| `-- me.py
`-- sad
`-- sad_maker.py
We want to import sad_maker in me.py. How do we do that?
First, we need to make happy and sad part of the same package, so we have to go up a directory level. Then use from ..sad import sad_maker in me.py.
That is all.
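A hedged sketch of that rearranged layout (the wrapping package name project and the top-level main.py are assumptions, not part of this answer):
. CWD
|-- project
|   |-- happy
|   |   |-- happy_maker.py
|   |   `-- me.py            # content: from ..sad import sad_maker
|   `-- sad
|       `-- sad_maker.py
`-- main.py                  # content: import project.happy.me

# From the CWD, either of these now works:
#   python main.py
#   python -m project.happy.me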
My boilerplate to make a module with relative imports in a package runnable standalone.
package/module.py
## Standalone boilerplate before relative imports
import sys
from pathlib import Path

if __package__ is None:
    DIR = Path(__file__).resolve().parent
    sys.path.insert(0, str(DIR.parent))
    __package__ = DIR.name

from . import variable_in__init__py
from . import other_module_in_package
...
Now you can use your module in any fashion:
Run module as usual: python -m package.module
Use it as a module: python -c 'from package import module'
Run it standalone: python package/module.py
or with shebang (#!/bin/env python) just: package/module.py
NB! Using sys.path.append instead of sys.path.insert will give you a hard-to-trace error if your module has the same name as your package, e.g. my_script/my_script.py.
Of course, if you have relative imports from higher levels in your package hierarchy, then this is not enough, but for most cases it's just okay.
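For deeper nesting, a hedged extension of the same boilerplate might look like this, written for a module two levels down (package/subpackage/module.py; every name here is a placeholder):
## Standalone boilerplate for a hypothetical package/subpackage/module.py
import sys
from pathlib import Path

if __package__ in (None, ''):
    DIR = Path(__file__).resolve().parent            # .../package/subpackage
    sys.path.insert(0, str(DIR.parent.parent))       # directory that contains package/
    __package__ = f'{DIR.parent.name}.{DIR.name}'    # "package.subpackage"

from .. import module_in_package                     # hypothetical import one level up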
I needed to run python3 from the main project directory to make it work.
For example, if the project has the following structure:
project_demo/
├── main.py
├── some_package/
│ ├── __init__.py
│ └── project_configs.py
└── test/
└── test_project_configs.py
Solution
I would run python3 inside folder project_demo/ and then perform a
from some_package import project_configs
I was getting this ImportError: attempted relative import with no known parent package
In my program I was importing from a file in the current path to use its function:
from .filename import function
Then I replaced the current path (the dot) with the package name, which resolved my issue:
from package_name.filename import function
I hope the above answer helps you.
Importing from same directory
Firstly, you can import from the same directory.
Here is the file structure...
Folder
|
├─ Scripts
| ├─ module123.py
|
├─ main.py
├─ script123.py
Here is main.py
from . import script123
from Scripts import module123
As you can see, importing from . imports from current directory.
Note: if running using anything but IDLE, make sure that your terminal is navigated to the same directory as the main.py file before running.
Also, importing from a local folder also works.
Importing from parent directory
As seen in my GitHub gist here, there is the following method.
Take the following file tree...
ParentDirectory
├─ Folder
| |
| ├─ Scripts
| | ├─ module123.py
| |
| ├─ main.py
| ├─ script123.py
|
├─ parentModule.py
Then, just add this code to the top of your main.py file.
import inspect
import os
import sys
current_dir = os.path.dirname(os.path.abspath(inspect.getfile(inspect.currentframe())))
parent_dir = os.path.dirname(current_dir)
sys.path.insert(0, parent_dir)
from parentModule import Stuff
I tried all of the above to no avail, only to realize I mistakenly had a - in my package name.
In short, don't have a - in the name of the directory where __init__.py lives. I've never felt so elated after discovering such an inanity.
If both packages are in your import path (sys.path), and the module/class you want is in example/example.py, then to access the class without a relative import, try:
from example.example import fkt
If none of the above worked for you, you can specify the module explicitly.
Directory:
├── Project
│ ├── Dir
│ │ ├── __init__.py
│ │ ├── module.py
│ │ └── standalone.py
Solution:
#in standalone.py
from Project.Dir.module import ...
module - the module to be imported
Here is a three-liner for those who disagree with Guido:
import sys
from pathlib import Path
sys.path.append(str(Path(sys.argv[0]).absolute().parent.parent))
Hope it helps.
I think the best solution is to create a package for your module:
Here is more info on how to do it.
Once you have a package you don't need to worry about relative import, you can just do absolute imports.
I encounter this a lot when I am working with Django, since a lot of functionality is performed from the manage.py script but I also want to have some of my modules runnable directly as scripts as well (ideally you would make them manage.py directives but we're not there yet).
This is a mock up of what such a project might look like;
├── dj_app
│   ├── models.py
│   ├── ops
│   │   ├── bar.py
│   │   └── foo.py
│   ├── script.py
│   ├── tests.py
│   ├── utils.py
│   └── views.py
└── manage.py
The important parts here being manage.py, dj_app/script.py, and dj_app/tests.py. We also have submodules dj_app/ops/bar.py and dj_app/ops/foo.py which contain more items we want to use throughout the project.
The source of the issue commonly comes from wanting your dj_app/script.py script methods to have test cases in dj_app/tests.py which get invoked when you run manage.py test.
This is how I set up the project and its imports;
# dj_app/ops/foo.py
# Foo operation methods and classes
foo_val = "foo123"
.
# dj_app/ops/bar.py
# Bar operations methods and classes
bar_val = "bar123"
.
# dj_app/script.py
# script to run app methods from CLI
# if run directly from command line
if __name__ == '__main__':
    from ops.bar import bar_val
    from ops.foo import foo_val
# otherwise
else:
    from .ops.bar import bar_val
    from .ops.foo import foo_val

def script_method1():
    print("this is script_method1")
    print("bar_val: {}".format(bar_val))
    print("foo_val: {}".format(foo_val))

if __name__ == '__main__':
    print("running from the script")
    script_method1()
.
# dj_app/tests.py
# test cases for the app
# do not run this directly from CLI or the imports will break
from .script import script_method1
from .ops.bar import bar_val
from .ops.foo import foo_val
def main():
    print("Running the test case")
    print("testing script method")
    script_method1()

if __name__ == '__main__':
    print("running tests from command line")
    main()
.
# manage.py
# just run the test cases for this example
import dj_app.tests
dj_app.tests.main()
.
Running the test cases from manage.py;
$ python3 manage.py
Running the test case
testing script method
this is script_method1
bar_val: bar123
foo_val: foo123
Running the script on its own;
$ python3 dj_app/script.py
running from the script
this is script_method1
bar_val: bar123
foo_val: foo123
Note that you get an error if you try to run tests.py directly, however, so don't do that:
$ python3 dj_app/tests.py
Traceback (most recent call last):
File "dj_app/tests.py", line 5, in <module>
from .script import script_method1
ModuleNotFoundError: No module named '__main__.script'; '__main__' is not a package
If I run into more complicated situations for imports, I usually end up implementing something like this to hack through it;
import os
import sys
THIS_DIR = os.path.dirname(os.path.realpath(__file__))
sys.path.insert(0, THIS_DIR)
from script import script_method1
sys.path.pop(0)
This is my project structure:
├── folder
| |
│ ├── moduleA.py
| | |
| | └--function1()
| | └~~ uses function2()
| |
│ └── moduleB.py
| |
| └--function2()
|
└── main.py
└~~ uses function1()
Here my moduleA imports moduleB and main imports moduleA
I added the snippet below in moduleA to import moduleB
try:
    from .moduleB import function2
except ImportError:
    from moduleB import function2
Now I can execute both main.py and moduleA.py individually.
Is this a solution?
The below solution is tested on Python3
├── classes
| |
| ├──__init__.py
| |
│ ├── userclass.py
| | |
| | └--viewDetails()
| |
| |
│ └── groupclass.py
| |
| └--viewGroupDetails()
|
└── start.py
└~~ uses function1()
Now, in order to use viewDetails of userclass or viewGroupDetails of groupclass, define them in the __init__.py of the classes directory first.
Ex: In __init__.py
from .userclass import viewDetails
from .groupclass import viewGroupDetails
Step 2: Now, in start.py we can directly import viewDetails
Ex: In start.py
from classes import viewDetails
from classes import viewGroupDetails
I ran into a similar problem when trying to write a python file that can be loaded either as a module or an executable script.
Setup
/path/to/project/
├── __init__.py
└── main.py
└── mylib/
├── list_util.py
└── args_util.py
with:
main.py:
#!/usr/bin/env python3
import sys
import mylib.args_util
if __name__ == '__main__':
    print(f'{mylib.args_util.parseargs(sys.argv[1:])=}')
mylib/list_util.py:
def to_int_list(args):
    return [int(x) for x in args]
mylib/args_util.py:
#!/usr/bin/env python3
import sys
from . import list_util as lu
def parseargs(args):
    return sum(lu.to_int_list(args))

if __name__ == '__main__':
    print(f'{parseargs(sys.argv[1:])=}')
Output
$ ./main.py 1 2 3
mylib.args_util.parseargs(sys.argv[1:])=6
$ mylib/args_util.py 1 2 3
Traceback (most recent call last):
File "/path/to/project/mylib/args_util.py", line 10, in <module>
from . import list_util as lu
ImportError: attempted relative import with no known parent package
Solution
I settled for a Bash/Python polyglot solution. The Bash version of the program just calls python3 -m mylib.args_util then exits.
The Python version ignores the Bash code because it's contained in the docstring.
The Bash version ignores the Python code because it uses exec to stop parsing/running lines.
mylib/args_util.py:
#!/bin/bash
# -*- Mode: python -*-
''''true
exec /usr/bin/env python3 -m mylib.args_util "$@"
'''
import sys
from . import list_util as lu
def parseargs(args):
    return sum(lu.to_int_list(args))

if __name__ == '__main__':
    print(f'{parseargs(sys.argv[1:])=}')
Output
$ ./main.py 1 2 3
mylib.args_util.parseargs(sys.argv[1:])=6
$ mylib/args_util.py 1 2 3
parseargs(sys.argv[1:])=6
Explanation
Line 1: #!/bin/bash; this is the "shebang" line; it tells the interactive shell how to run this script.
Python: ignored (comment)
Bash: ignored (comment)
Line 2: # -*- Mode: python -*- optional; this is called the "mode-line"; it tells Emacs to use Python syntax highlighting instead of guessing that the language is Bash when reading the file.
Python: ignored (comment)
Bash: ignored (comment)
Line 3: ''''true
Python: views this as an unassigned docstring starting with 'true\n
Bash: views this as three strings (of which the first two are empty strings) that expand to true (i.e. '' + '' + 'true' = 'true'); it then runs true (which does nothing) and continues to the next line
Line 4: exec /usr/bin/env python3 -m mylib.args_util "$@"
Python: still views this as part of the docstring from line 3.
Bash: runs python3 -m mylib.args_util then exits (it doesn't parse anything beyond this line)
Line 5: '''
Python: views this as the end of the docstring from line 3.
Bash: doesn't parse this line
Caveats
This doesn't work on Windows:
Workaround: Use WSL or a Batch wrapper script to call python -m mylib.args_util.
This only works if the current working directory is set to /path/to/project/.
Workaround: Set PYTHONPATH when calling /usr/bin/env
#!/bin/bash
# -*- Mode: python -*-
''''true
exec /usr/bin/env \
    PYTHONPATH="$(cd "$(dirname "$0")/.." ; pwd)" \
    python3 -m mylib.args_util "$@"
'''
I've created a new, experimental import library for Python: ultraimport
It gives the programmer more control over imports and makes them unambiguous. Also it gives better error messages when an import fails.
It allows you to do relative, file-system-based imports that always work, no matter how you run your code and no matter what your current working directory is. It does not matter whether you run a script or a module. You also don't have to change sys.path, which might have other side effects.
You would then change
from .mymodule import myfunction
to
import ultraimport
myfunction = ultraimport('__dir__/mymodule.py', 'myfunction')
This way the import will always work, even if you run the code as script.
One issue when importing scripts like this is that subsequent relative imports might fail. ultraimport has a builtin preprocessor to automatically rewrite relative imports.
I had a similar problem: I needed a Linux service and a CGI plugin which use common constants to cooperate. The 'natural' way to do this is to place them in the __init__.py of the package, but I cannot start the CGI plugin with the -m parameter.
My final solution was similar to Solution #2 above:
import sys
import pathlib as p
import importlib
pp = p.Path(sys.argv[0])
pack = pp.resolve().parent
pkg = importlib.import_module('__init__', package=str(pack))
The disadvantage is that you must prefix the constants (or common functions) with pkg:
print(pkg.Glob)
TL;DR: Append the script path to the system path by adding the following in the entry point of your Python script.
import os.path
import sys
PACKAGE_PARENT = '..'
SCRIPT_DIR = os.path.dirname(os.path.realpath(os.path.join(os.getcwd(), os.path.expanduser(__file__))))
sys.path.append(os.path.normpath(os.path.join(SCRIPT_DIR, PACKAGE_PARENT)))
That's it, now you can run your project in PyCharm as well as from the terminal!
Moving the file from which you are importing to an outside directory helps.
This is especially useful when your main file creates any other files in its own directory.
Ex:
Before:
Project
|---dir1
|-------main.py
|-------module1.py
After:
Project
|---module1.py
|---dir1
|-------main.py
I was getting the same error and my project structure was like
->project
->vendors
->vendors.py
->main.py
I was trying to import like this:
from .vendors.Amazon import Amazom_Purchase
Here it was throwing an error, so I fixed it simply by removing the first . from the statement:
from vendors.Amazon import Amazom_Purchase
Hope this helps.
It's good to note that sometimes the cache is the cause of it all - I tried different things after rearranging classes into new directories, and relative imports started to work after I removed the __pycache__ directories.
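If you prefer to clear those caches from code rather than by hand, a small sketch (run from the project root; purely illustrative):
# Remove every __pycache__ directory below the current directory.
import pathlib
import shutil

for cache_dir in pathlib.Path('.').rglob('__pycache__'):
    shutil.rmtree(cache_dir, ignore_errors=True)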
If the following import:
from . import something
doesn't work for you, it is because this is a Python packaging-style import and will not work with a plain script layout. Here is an example to show how to use it:
Folder structure:
.
└── funniest
├── funniest
│ ├── __init__.py
│ └── text.py
├── main.py
└── setup.py
inside __init__.py add:
def available_module():
    return "hello world"
text.py add:
from . import available_module
inside setup.py add
from setuptools import setup

setup(name='funniest',
      version='0.1',
      description='The funniest joke in the world',
      url='http://github.com/storborg/funniest',
      author='Flying Circus',
      author_email='flyingcircus@example.com',
      license='MIT',
      packages=['funniest'],
      zip_safe=False)
Now, this is the most important part you need to install your package:
pip install .
Anywhere else in our system using the same Python, we can do this now:
>>> import funniest.text as fun
>>> fun.available_module()
This should output 'hello world'.
You can test this in main.py as well (this will not require any installation of the package).
Here is main.py as well
import funniest.text as fun
print(fun.available_module())

How to correctly make a python package?

This is my first time making a python package, and I am thoroughly confused about __init__.py, __main__.py, and their relation to making a package.
Here is my project structure:
package_name/
setup.py
README.md
LICENSE
package_name/
__init__.py
__main__.py
A.py
B.py
Class A in A.py depends on a class B in B.py.
Should I have both __init__.py and __main__.py?
What should be in the files?
What I have tried:
in A.py:
from B import B and from .B import B
The first allows me to run normally locally, but when I try to upload it to pypi and install it, I get ModuleNotFoundError: No module named 'B'
The second allows me to import it after installing it from pypi, but I can't run it normal locally.
My goal is to import Class A from the package with the following
from package_name import A
and be able to run my package locally.
Edit:
I am using Python 3.
Files named __init__.py are used to mark directories on disk as Python package directories; you can leave them empty most of the time. But let's say you have, for example, this file structure with the following code inside:
Structure
package_name/
├── setup.py
├── package_name
│ ├── __init__.py
│ └── main.py
└── package_name.py
setup.py
#!/usr/bin/env python3
import sys
from setuptools import setup
setup(
    name = "package_name",
    version = "0.1",
    packages=["package_name"],
    package_data = {},
    author="xxx",
    author_email = "xxx@xxx.xx",
    description = "The familiar example program in Python",
    license = "BSD",
    keywords= "example documentation tutorial",
)
package_name.py
#!/usr/bin/env python
import sys
import package_name.main
if __name__ == '__main__':
    sys.exit(package_name.main.main())
main.py
#!/usr/bin/env python3
import sys
def main(argv=None):
    if argv is None:
        argv = sys.argv
    print("Hello, world")
    return 0
In the terminal, go to the package_name folder and type python3 package_name.py to execute the code.
Output
Hello, world
The package_name.py script goes to main.py and executes the code that is in main.py. If you want to import something in your example, try using, for example, from package_name.A import [function name] in the Python files from which you wish to access the functions. It should work.
I would like to hear from you if this helped you out and gave you a better understanding.
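For the question's layout specifically, here is a hedged sketch of what __init__.py and __main__.py could contain; the class names A and B come from the question, everything else is an assumption:
# package_name/A.py
from package_name.B import B      # absolute import: works once the package is importable

# package_name/__init__.py
from package_name.A import A      # lets users write "from package_name import A"

# package_name/__main__.py - entry point for "python -m package_name"
from package_name.A import A

if __name__ == '__main__':
    print(A())                    # assumes A() takes no arguments; adjust as needed
Run it locally from the outer package_name/ directory with python -m package_name, or install it with pip install -e . so the absolute imports resolve from anywhere.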

How do I import files in Python?

I'm trying to use pytest to check a function—here is my code:
# src/return_self.py
def return_self(n):
    return n

# tests/return_self_test.py
import pytest

def test_1():
    value = return_self(1)
    assert value == 1
How do I require in my src file so that I can test it with pytest? I have tried a few things:
1. import return_self
2. from src.return_self import *
3. import sys
sys.path.append('../src')
4. import imp
return_self = imp.load_source('return_self', '/source/return_self.py')
I have also tried them with and without __init__.py files in the root and src directories. But each time, I get some variation on the error E ModuleNotFoundError: No module named 'return_self'. How can I require in my file?
You can try this approach:
# tests/return_self_test.py
import os
import sys
import pytest
sys.path.insert(1, os.path.join(sys.path[0], '..'))
from src.return_self import return_self
def test_1():
    value = return_self(1)
    assert value == 1
First, check that src/ and tests/ are in the same directory. I checked by importing the return_self function in the test file, and this is how:
pytest searches for files with a test_ prefix, so I recommend renaming return_self_test.py to test_return_self.py.
# src/return_self.py
def return_self(n):
    return n

# tests/test_return_self.py
import return_self

def test_1():
    value = return_self.return_self(1)
    assert value == 1
Finally, test in cmd (in the correct path) or the PyCharm terminal with the following command:
py.test -v and voila! It's done (:
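Another option that often helps here, offered as a hedged suggestion for pytest's default import mode: drop an empty conftest.py at the project root, next to src/ and tests/. pytest then puts the project root on sys.path, so the test can import the source package directly without editing sys.path:
project_root/
    conftest.py            # empty file; pytest adds this directory to sys.path
    src/
        __init__.py        # optional on Python 3.3+, but harmless
        return_self.py
    tests/
        test_return_self.py

# tests/test_return_self.py
from src.return_self import return_self

def test_1():
    assert return_self(1) == 1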
I suggest the setuptools approach (which makes your package distributable :D).
Project files' structure:
.
├── sample
│   ├── __init__.py
│   └── return_self.py
├── setup.cfg
├── setup.py
└── tests
└── test_return_self.py
where the sample/ directory matches the package's name and also must contain the source.
Minimal setup.py file content:
from setuptools import setup

setup(
    setup_requires=['pytest-runner'],
    tests_require=['pytest'],
    name='sample'
)
Here we are configuring our test environment (you can extend the tests_require variable to include more testing requirements).
setup.cfg file content:
[aliases]
test=pytest
And here we specify we want to run the command pytest every time the developer does: python setup.py test
tests/test_return_self.py
from pytest import *
from sample.return_self import return_self

def test_return_self():
    assert return_self(4) == 4
sample/return_self.py
def return_self(n):
    return n
So, the next thing to do is to run:
python setup.py develop
to make your package available (while running the tests). If you're having permission-denied issues, append the --user option to the previous command to instruct Python that you want to use the package without root permissions; in short, the package will be installed into user-land directories.
And finally run the tests using:
python setup.py test
Notes:
By using this approach, you'll be able to alter your project code on the fly (python setup.py develop needs to be run just once)
No ugly statements to inject the source directory into the current python path.
References:
Integrating with setuptools / python setup.py test / pytest-runner
Building and Distributing Packages with Setuptools
