I'm working on a project with this folder structure using pytest for tests
project
├── example
│ ├── __init__.py
│ └── example.py
├── setup.py
└── tests
└── test_1.py
I'm trying to import example.py from test_1.py, but every way I've tried has failed.
I've tried sys.path.append('../') followed by import example, and also from ... import example, but neither works. sys.path.append('../') didn't let me import example, and from ... import example errored with ImportError: attempted relative import with no known parent package. I've also tried python setup.py develop; it let me import example, but none of its contents were accessible.
This is the code in test_1.py. It currently imports example, but the module has no content.
import sys
import os
sys.path.append("../example/")
import example
example.test()
Contents of example.py:
class test:
    def __init__(self):
        self.a = 1
        self.b = 2
The setup.py is the template from here
I think your problem is that the path resolution isn't functioning as you expect. In your test_1.py file do this:
import sys
import os
dir_path = os.path.dirname(os.path.realpath(__file__))
sys.path.append(os.path.realpath(f"{dir_path}/../example"))
import example
example.test()
Then, from your root directory, run python tests/test_1.py. This should execute test() from the example directory.
The realpath() function is the key here, as it resolves the actual system path. Without it, a relative path is resolved against the current working directory rather than the file's location (i.e. you end up with something like /example, which is wrong).
Use of sys.path.append results in flake8 reporting E402 module level import not at the top of the file. There are various hacks around for this, which essentially depend on disabling E402, either directly or by deception. There are at least two other ways:
Add __init__.py to the tests directory so that pytest loads the tests from within the parent dir and natively finds the example package.
pip install -e . from within the parent, to have the package loaded in development mode.
For the sake of consistency I started by uninstalling every package (so pip freeze returns nothing) and creating a virtual environment for each of the approaches. I omitted the setup.py file in favor of a minimal pyproject.toml file. Then pip install pytest and pip install flake8 in both. For packages, the second approach then shows an additional example package, i.e.
(.venv) PS C:\xxxx\Stackoverflow\proj_hatch> pip freeze
attrs==22.1.0
colorama==0.4.5
# Editable install with no version control (example==0.0.1)
-e c:\xxxx\stackoverflow\proj_hatch
flake8==5.0.4
iniconfig==1.1.1
mccabe==0.7.0
packaging==21.3
pluggy==1.0.0
py==1.11.0
pycodestyle==2.9.1
pyflakes==2.5.0
pyparsing==3.0.9
pytest==7.1.3
tomli==2.0.1
The net outcome is that both approaches have a working pytest.
(.venv) PS C:\xxxx\Stackoverflow\proj_hatch> pytest
================ test session starts =========================
platform win32 -- Python 3.10.8, pytest-7.1.3, pluggy-1.0.0
rootdir: C:\xxxx\Stackoverflow\proj_hatch
collected 1 item
tests\test_1.py . [100%]
=================== 1 passed in 0.01s ========================
(.venv) PS C:\xxxx\Stackoverflow\proj_hatch>
In both versions, set example/__init__.py as below. This has the additional benefit of allowing the structure (which function lives in which file) to change later while keeping the imports stable for other applications.
from example.example import test
__all__ = [
"test",
]
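For instance (a sketch of a hypothetical later refactor, not part of the current layout): if the class moved from example/example.py to a new example/core.py, only __init__.py would change, and callers would keep writing import example and example.test() unchanged:
from example.core import test

__all__ = [
    "test",
]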
In both versions the test is:
import example

def test_1():
    ex1 = example.test()
    assert ex1.b == 2
With the first approach it is necessary to add .\tests\__init__.py so that pytest loads the test from within the project directory and is able to find the example directory. I expect this has some implications for how you write tests.
With the second approach I started off using hatch; however, it seemed to work with little more than moving the example directory to src\example and creating a pyproject.toml file. I suspect there is some magic in the later versions of Python that helps. I am using 3.10.8 for this exercise.
[project]
name = "example"
version = "0.0.1"
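Note that for pip install -e . to succeed, a build backend normally has to be declared as well; a minimal sketch assuming setuptools as the backend (the answer used hatch, for which hatchling would be the analogous choice):
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "example"
version = "0.0.1"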
You can also simply prepend your project path to sys.path. Example:
sys.path.insert(0, "path_to_your_project")
Our Python 3.10 unit tests break when the modules being tested need to import other modules. When we use the packaging techniques recommended by other posts and articles, either the unit tests fail to import modules, or the direct calls to run the app fail to import modules. The other posts and articles we have read do not show how to validate that both the application itself and the unit tests can each import modules when invoked separately. So we created the bare-bones example below and are asking how to structure the packaging correctly.
What specific changes must be made to the syntax below in order for the two python commands given below to successfully run on the bare bones example app given below?
PROBLEM DESCRIPTION:
A python 3.10 app must import modules when called either directly as an app or indirectly through unit tests.
Packages must be used to organize the code.
Calls to unit tests are breaking because modules cannot be found.
The two test commands that must run without errors to validate solution of this problem are:
C:\path\to\dir>python repoName\app\first.py
C:\path\to\dir>python -m unittest repoName.unitTests.test_example
This post is different from the other posts on this topic: we have reviewed many articles and posts, but they fail to address our use case, so we created the more explicit example below to exercise the two commands that must both succeed.
APP STRUCTURE:
The very simple structure of the app that is failing to import packages during unit tests is:
repoName
├── app
│   ├── __init__.py
│   ├── first.py
│   ├── second.py
│   └── third.py
├── unitTests
│   ├── __init__.py
│   └── test_example.py
└── __init__.py
SIMPLE CODE TO REPRODUCE PROBLEM:
The code for a stripped down example to reproduce the problem is as follows:
The contents of repoName\app\__init__.py are:
print('inside app __init__.py')
__all__ = ['first', 'second', 'third']
The contents of first.py are:
import second as second
from third import third
import sys

inputArgs = sys.argv

def runCommands():
    trd = third()
    if second.something == 'platform':
        if second.another == 'on':
            trd.doThree()
    if second.something != 'unittest':
        sys.exit(0)

second.processInputArgs(inputArgs)
runCommands()
The contents of second.py are:
something = ''
another = ''
inputVars = {}

def processInputArgs(inputArgs):
    global something
    global another
    global inputVars
    if 'unittest' in inputArgs[0]:
        something = 'unittest'
    elif 'unittest' not in inputArgs[0]:
        something = 'platform'
        another = 'on'
    jonesy = 'go'
    inputVars = {'jonesy': jonesy}
The contents of third.py are:
print('inside third.py')
import second as second

class third:
    def __init__(self):
        pass

    ##public
    def doThree(self):
        print("jonesy is: ", second.inputVars.get('jonesy'))
The contents of repoName\unitTests\__init__.py are:
print('inside unit-tests __init__.py')
__all__ = ['test_example']
The contents of test_example.py are:
import unittest

class test_third(unittest.TestCase):
    def test_doThree(self):
        from repoName.app.third import third
        num3 = third()
        num3.doThree()
        self.assertTrue(True)

if __name__ == '__main__':
    unittest.main()
The contents of repoName\__init__.py are:
print('inside repoName __init__.py')
__all__ = ['app', 'unitTests']
ERROR RESULTING FROM RUNNING COMMANDS:
The command-line responses to the two commands are given below. You can see that the call to the app succeeds, while the call to the unit test fails.
C:\path\to\dir>python repoName\app\first.py
inside third.py
jonesy is: go
C:\path\to\dir>python -m unittest repoName.unitTests.test_example
inside repoName __init__.py
inside unit-tests __init__.py
inside app __init__.py
inside third.py
E
======================================================================
ERROR: test_doThree (repoName.unitTests.test_example.test_third)
----------------------------------------------------------------------
Traceback (most recent call last):
File "C:\path\to\dir\repoName\unitTests\test_example.py", line 15, in test_doThree
from repoName.app.third import third
File "C:\path\to\dir\repoName\app\third.py", line 3, in <module>
import second as second
ModuleNotFoundError: No module named 'second'
----------------------------------------------------------------------
Ran 1 test in 0.002s
FAILED (errors=1)
What specific changes must be made to the code above in order for all the modules to be imported correctly when either of the given commands are run?
Creating an "alias" for modules
Update the contents of repoName\app\__init__.py to:
print('inside app __init__.py')
__all__ = ['first', 'second', 'third']
import sys
import repoName.app.second as second
sys.modules['second'] = second
import repoName.app.third as third
sys.modules['third'] = third
import repoName.app.first as first
sys.modules['first'] = first
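The trick works because the import system consults sys.modules before searching sys.path; a minimal standalone illustration of that mechanism (my sketch, using the stdlib json module rather than repoName):
import sys
import json

sys.modules['jason'] = json  # register an alias name for an already-imported module
import jason                 # satisfied straight from sys.modules, no file lookup

assert jason is json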
How to ensure first.py gets run even when imported
So when the test fixture imports repoName.app.third, Python will recursively import the parent packages, so that import repoName.app.third is equivalent to:
import repoName
# inside repoName __init__.py
import app
# inside app __init__.py
import third
# inside third.py
So running from repoName.app.third import third inside test_doThree executes repoName\app\__init__.py. In that __init__.py, import repoName.app.first as first is called. Importing first executes the following lines at the bottom of first.py:
second.processInputArgs(inputArgs)
runCommands()
In second.processInputArgs, jonesy = 'go' is executed, setting the variable that is printed out when the rest of the test is run.
Here is how I have gone about trying to solve this.
I exported the PYTHONPATH to the repo folder repoName (I am using Linux):
cd repoName
export PYTHONPATH=`pwd`
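(For Windows users, my addition, not part of this answer: the PowerShell equivalent would be)
cd repoName
$env:PYTHONPATH = (Get-Location).Path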
then in test_example.py:
import unittest

class test_third(unittest.TestCase):
    def test_doThree(self):
        from app.third import third  # changed here
        num3 = third()
        num3.doThree()
        self.assertTrue(True)

if __name__ == '__main__':
    unittest.main()
Then in third.py:
print('inside third.py')
import app.second as second  # changed here

class third:
    def __init__(self):
        pass

    ##public
    def doThree(self):
        print("jonesy is: ", second.inputVars.get('jonesy'))
Also it is worth noting that I did not create any __init__.py files
The code in the question relies on first.py being imported, so that it calls a function in second.py to set a global that is used by third.py. As the Zen of Python says:
Explicit is better than implicit
The current structure will be difficult to maintain, test, and debug as your project grows. I have redone the example in the question removing globals and code being executed on import.
first.py
import sys

from app import second
from app.third import Third

def run_commands(input_args):
    trd = Third()
    if input_args.another == "on":
        trd.do_three(input_args)

def main():
    input_args = second.process_input_args(sys.argv)
    run_commands(input_args)

if __name__ == "__main__":
    main()
second.py
from dataclasses import dataclass

@dataclass
class InputArgs:
    something: str
    another: str
    jonesy: str

def process_input_args(input_args):
    something = "platform"
    another = "on"
    jonesy = "go"
    return InputArgs(something, another, jonesy)
third.py
print("inside third.py")

class Third:
    def __init__(self):
        pass

    # #public
    def do_three(self, input_args):
        print("jonesy is:", input_args.jonesy)
test_example.py
import io
import unittest
from unittest import mock

from app.second import InputArgs
from app.third import Third

class ThirdTests(unittest.TestCase):
    def test_doThree(self):
        input_args = InputArgs(something="platform",
                               another="on",
                               jonesy="go")
        num3 = Third()
        with mock.patch('sys.stdout', new=io.StringIO()) as fake_out:
            num3.do_three(input_args)
        self.assertEqual("jonesy is: go\n", fake_out.getvalue())

if __name__ == "__main__":
    unittest.main()
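One payoff of removing the globals (my illustration, not part of the original answer, assuming the app package is importable as set up below): process_input_args is now a function returning a value, so it can be tested in isolation:
import unittest

from app.second import process_input_args

class SecondTests(unittest.TestCase):
    def test_process_input_args(self):
        args = process_input_args(["first.py"])
        self.assertEqual("go", args.jonesy)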
For Python development I would always recommend having a Python Virtual Environment (venv) so that each repo's development is isolated.
In the repoName directory do (for Linux):
python3.10 -m venv venv
Or like the following for windows:
c:\>c:\Python310\python -m venv venv
You will then need to activate the venv.
Linux: . venv/bin/activate
Windows: .\venv\scripts\activate.ps1
I would suggest packaging the app as your module; then all your imports will be of the style:
from app.third import Third
trd = Third()
or
from app import third
trd = third.Third()
To package app, create a setup.py file in the repoName directory. The file will look something like this:
from setuptools import setup

setup(
    name='My App',
    version='1.0.0',
    url='https://github.com/mypackage.git',
    author='Author Name',
    author_email='author@gmail.com',
    description='Description of my package',
    packages=['app'],
    install_requires=[],
    entry_points={
        'console_scripts': ['my-app=app.first:main'],
    },
)
I would also rename the unitTests directory to something like tests so that the unittest module can find it automatically as it looks for files and directories starting with test.
So a structure something like this:
repoName/
├── app
│ ├── __init__.py
│ ├── first.py
│ ├── second.py
│ └── third.py
├── setup.py
├── tests
│ ├── __init__.py
│ └── test_example.py
└── venv
You can now use pip to install from the local source tree in development mode. The great thing about this is that you don't have to mess with the Python path or sys.path.
(venv) repoName $ pip install -e .
Obtaining file:///home/user/projects/repoName
Preparing metadata (setup.py) ... done
Installing collected packages: My-App
Running setup.py develop for My-App
Successfully installed My-App-1.0.0
With the install done the app can be launched:
(venv) repoName $ python app/first.py
inside app __init__.py
inside third.py
jonesy is: go
In the setup file we told python that my-app was an entry point so we can use this to launch the same thing:
(venv) repoName $ my-app
inside app __init__.py
inside third.py
jonesy is: go
For the unit tests we can use the following command, and it will discover all the tests because the directory and file names start with test.
(venv) repoName $ python -m unittest
inside app __init__.py
inside unit-tests __init__.py
inside third.py
jonesy is: go
.
----------------------------------------------------------------------
Ran 1 test in 0.000s
OK
Now that we have this setup, it is easy to package up app for distribution, either directly to users or via a package index like https://pypi.org/
Install the build module:
(venv) repoName $ pip install --upgrade build
To build the Python wheel:
(venv) repoName $ python -m build
There should now be a dist folder with a wheel in it which you can send to users. They can install this with pip:
pip install My_App-1.0.0-py3-none-any.whl
Maybe my goal, and what I am trying to do here, is wrong in the sense of being unpythonic. I am open to any suggestions about that.
My goals
Application (myapp) with its own tests folder.
A package (mypackage) with its own tests folder.
The tests for the package should be runnable from the application folder and from the package folder.
The package has implicit and explicit components. The latter need to be imported explicitly (e.g. via import mypackage.mymoduleB).
The package (folder) can be copied (shipped for reuse in other applications?) to other file system locations without losing its functionality and testability. That is why tests is inside the package folder and not outside.
This is the folder tree, where itest is the name of the project, myapp is the application with an if __name__ == '__main__': in it, and mypackage is the package.
itest
└── myapp
├── myapp.py
├── mypackage
│ ├── __init__.py
│ ├── _mymoduleA.py
│ ├── mymoduleB.py
│ └── tests
│ ├── __init__.py
│ └── test_all.py
└── tests
├── __init__.py
└── test_myapp.py
The problem
I can run the unittests from the application directory without problems.
/home/user/tab-cloud/_transfer/itest/myapp $ python3 -m unittest -vvv
test_A (mypackage.tests.test_all.TestAll) ... mymoduleA.foo()
ok
test_B (mypackage.tests.test_all.TestAll) ... mymoduleB.bar()
ok
test_myname (tests.test_myapp.TestMyApp) ... ok
----------------------------------------------------------------------
Ran 3 tests in 0.001s
OK
But when I go inside the package, the tests do not run (see goal #3).
/home/user/tab-cloud/_transfer/itest/myapp/mypackage $ python3 -m unittest -vvv
tests.test_all (unittest.loader._FailedTest) ... ERROR
======================================================================
ERROR: tests.test_all (unittest.loader._FailedTest)
----------------------------------------------------------------------
ImportError: Failed to import test module: tests.test_all
Traceback (most recent call last):
File "/usr/lib/python3.9/unittest/loader.py", line 436, in _find_test_path
module = self._get_module_from_name(name)
File "/usr/lib/python3.9/unittest/loader.py", line 377, in _get_module_from_name
__import__(name)
File "/home/user/tab-cloud/_transfer/itest/myapp/mypackage/tests/test_all.py", line 12, in <module>
from . import mypackage
ImportError: cannot import name 'mypackage' from 'tests' (/home/user/tab-cloud/_transfer/itest/myapp/mypackage/tests/__init__.py)
----------------------------------------------------------------------
Ran 1 test in 0.001s
FAILED (errors=1)
The MWE
Now I show you the files. To make sure the tests for the package use the right imports when run from the application folder or from the package folder, I use importlib (based on a foreign solution).
The three files that form the package
This is myapp/mypackage/__init__.py:
# imported implicitly via 'mypackage'
from ._mymoduleA import *
# 'mymoduleB' needs to be imported explicitly
# via 'mypackage.mymoduleB'
This is myapp/mypackage/_mymoduleA.py:
def foo():
    print('mymoduleA.foo()')
    return 1
This is myapp/mypackage/mymoduleB.py:
def bar():
    print('mymoduleB.bar()')
    return 2
The tests for the package
The myapp/mypackage/tests/__init__.py is empty.
This is myapp/mypackage/tests/test_all.py:
import importlib.util
import unittest

# The package should be able to be tested by itself (run unittest inside the
# package directory) AND from the using application (run unittest in the
# application directory).
# Based on: https://stackoverflow.com/a/14050282/4865723
if importlib.util.find_spec('mypackage'):
    import mypackage
    import mypackage.mymoduleB
else:
    from . import mypackage
    from mypackage import mymoduleB

class TestAll(unittest.TestCase):
    def test_A(self):
        self.assertEqual(1, mypackage.foo())

    def test_B(self):
        self.assertEqual(2, mypackage.mymoduleB.bar())
The application
This is myapp/myapp.py:
#!/usr/bin/env python3
import mypackage

def myname():
    return 'My application!'

if __name__ == '__main__':
    print(myname())
    mypackage.foo()
    try:
        mypackage.mymoduleB.bar()
    except AttributeError:
        # we are expecting this
        print('Not imported yet: "mymoduleB.bar()"')
    # this should work
    import mypackage.mymoduleB
    mypackage.mymoduleB.bar()
The test for the application
The myapp/tests/__init__.py is empty.
This is myapp/tests/test_myapp.py:
import unittest

import myapp

class TestMyApp(unittest.TestCase):
    def test_myname(self):
        self.assertEqual(myapp.myname(), 'My application!')
Sidenotes
Please let me explain a bit more about my goals. The mypackage should be reusable in other projects. In practice this means I copy the mypackage folder from one place to another. While copying that folder, I want the tests folder to come with it without explicitly thinking about it, which would not be the case if it were outside the package folder. And if the new project does unit testing, the tests of the package should be included in that unit testing automatically (via discover).
Your goal is really a bit unpythonic. But sometimes, you have to break the rules to free your heart.
You can solve the problem by checking for the __package__ attribute in myapp/mypackage/__init__.py like this:
# hint from there: https://stackoverflow.com/a/65426846/4865723
import sys
from pathlib import Path
sys.path.insert(0, str(Path(__file__).resolve().parent.parent))

if __package__:
    from ._mymoduleA import foo
else:
    from _mymoduleA import *
In this case the code in myapp/mypackage/tests/test_all.py gets a little simpler:
import importlib.util
import unittest

if not importlib.util.find_spec('mypackage'):
    from __init__ import *

import mypackage
from mypackage import mymoduleB

class TestAll(unittest.TestCase):
    def test_A(self):
        self.assertEqual(1, mypackage.foo())

    def test_B(self):
        self.assertEqual(2, mymoduleB.bar())
All other files remain unchanged.
As a result, you get the ability to run the tests from both the /myapp and the /myapp/mypackage folder. At the same time, there is no need to hardcode any absolute paths, and the app can be copied to any other file system location.
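That is, both invocations from the question now work:
itest/myapp $ python3 -m unittest -vvv
itest/myapp/mypackage $ python3 -m unittest -vvv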
I hope it will be useful for you.
I created an import library a couple of years ago. It works by manipulating paths. I used it to create a plugin system where I could essentially install and import multiple versions of any library (with some limitations).
For this we get the current path of the module, then we import the package using that path. The library automatically adds the proper path to sys.path.
All you need to do is install pylibimp (pip install pylibimp) and edit myapp/mypackage/tests/test_all.py:
import os
import unittest

import pylibimp

path_tests = os.path.dirname(__file__)
path_mypackage = os.path.dirname(path_tests)
path_myapp = os.path.dirname(path_mypackage)

mypackage = pylibimp.import_module(os.path.join(path_myapp, 'mypackage'), reset_modules=False)

class TestAll(unittest.TestCase):
    def test_A(self):
        self.assertEqual(1, mypackage.foo())

    def test_B(self):
        self.assertEqual(2, mypackage.mymoduleB.bar())
I believe the background is fairly simple:
import os
import sys

sys.path.insert(0, os.path.abspath('path/to/myapp'))

# Since the path is added we can "import mypackage"
mypackage = __import__('mypackage')

sys.path.pop(0)  # remove the added path so as not to mess with other imports
I hope this is what you are looking for.
I've tried reading through questions about sibling imports and even the
package documentation, but I've yet to find an answer.
With the following structure:
├── LICENSE.md
├── README.md
├── api
│ ├── __init__.py
│ ├── api.py
│ └── api_key.py
├── examples
│ ├── __init__.py
│ ├── example_one.py
│ └── example_two.py
└── tests
│ ├── __init__.py
│ └── test_one.py
How can the scripts in the examples and tests directories import from the
api module and be run from the commandline?
Also, I'd like to avoid the ugly sys.path.insert hack for every file. Surely
this can be done in Python, right?
Tired of sys.path hacks?
There are plenty of sys.path.append hacks available, but I found an alternative way of solving the problem at hand.
Summary
Wrap the code into one folder (e.g. packaged_stuff)
Create setup.py script where you use setuptools.setup(). (see minimal setup.py below)
Pip install the package in editable state with pip install -e <myproject_folder>
Import using from packaged_stuff.modulename import function_name
Setup
The starting point is the file structure you have provided, wrapped in a folder called myproject.
.
└── myproject
├── api
│ ├── api_key.py
│ ├── api.py
│ └── __init__.py
├── examples
│ ├── example_one.py
│ ├── example_two.py
│ └── __init__.py
├── LICENCE.md
├── README.md
└── tests
├── __init__.py
└── test_one.py
I will call the . the root folder, and in my example case it is located at C:\tmp\test_imports\.
api.py
As a test case, let's use the following ./api/api.py:
def function_from_api():
    return 'I am the return value from api.api!'
test_one.py
from api.api import function_from_api

def test_function():
    print(function_from_api())

if __name__ == '__main__':
    test_function()
Try to run test_one:
PS C:\tmp\test_imports> python .\myproject\tests\test_one.py
Traceback (most recent call last):
File ".\myproject\tests\test_one.py", line 1, in <module>
from api.api import function_from_api
ModuleNotFoundError: No module named 'api'
Also trying relative imports won't work. Using from ..api.api import function_from_api would result in:
PS C:\tmp\test_imports> python .\myproject\tests\test_one.py
Traceback (most recent call last):
File ".\tests\test_one.py", line 1, in <module>
from ..api.api import function_from_api
ValueError: attempted relative import beyond top-level package
Steps
Make a setup.py file in the root level directory
The contents for the setup.py would be*
from setuptools import setup, find_packages
setup(name='myproject', version='1.0', packages=find_packages())
Use a virtual environment
If you are familiar with virtual environments, activate one and skip to the next step. Usage of virtual environments is not absolutely required, but they will really help you out in the long run (when you have more than one project ongoing..). The most basic steps are (run in the root folder)
Create virtual env
python -m venv venv
Activate virtual env
source ./venv/bin/activate (Linux, macOS) or ./venv/Scripts/activate (Win)
To learn more about this, just Google out "python virtual env tutorial" or similar. You probably never need any other commands than creating, activating and deactivating.
Once you have made and activated a virtual environment, your console should show the name of the virtual environment in parentheses
PS C:\tmp\test_imports> python -m venv venv
PS C:\tmp\test_imports> .\venv\Scripts\activate
(venv) PS C:\tmp\test_imports>
and your folder tree should look like this**
.
├── myproject
│ ├── api
│ │ ├── api_key.py
│ │ ├── api.py
│ │ └── __init__.py
│ ├── examples
│ │ ├── example_one.py
│ │ ├── example_two.py
│ │ └── __init__.py
│ ├── LICENCE.md
│ ├── README.md
│ └── tests
│ ├── __init__.py
│ └── test_one.py
├── setup.py
└── venv
├── Include
├── Lib
├── pyvenv.cfg
└── Scripts [87 entries exceeds filelimit, not opening dir]
pip install your project in editable state
Install your top level package myproject using pip. The trick is to use the -e flag when doing the install. This way it is installed in an editable state, and all the edits made to the .py files will be automatically included in the installed package.
In the root directory, run
pip install -e . (note the dot, it stands for "current directory")
You can also see that it is installed by using pip freeze
(venv) PS C:\tmp\test_imports> pip install -e .
Obtaining file:///C:/tmp/test_imports
Installing collected packages: myproject
Running setup.py develop for myproject
Successfully installed myproject
(venv) PS C:\tmp\test_imports> pip freeze
myproject==1.0
Add myproject. into your imports
Note that you will have to add myproject. only to imports that would not work otherwise. Imports that worked without the setup.py & pip install will still work fine. See an example below.
Test the solution
Now, let's test the solution using api.py defined above, and test_one.py defined below.
test_one.py
from myproject.api.api import function_from_api

def test_function():
    print(function_from_api())

if __name__ == '__main__':
    test_function()
running the test
(venv) PS C:\tmp\test_imports> python .\myproject\tests\test_one.py
I am the return value from api.api!
* See the setuptools docs for more verbose setup.py examples.
** In reality, you could put your virtual environment anywhere on your hard disk.
Seven years after
Since I wrote the answer below, modifying sys.path is still a quick-and-dirty trick that works well for private scripts, but there have been several improvements
Installing the package (in a virtualenv or not) will give you what you want, though I would suggest using pip to do it rather than using setuptools directly (and using setup.cfg to store the metadata)
Using the -m flag and running as a package works too (but will turn out a bit awkward if you want to convert your working directory into an installable package).
For the tests, specifically, pytest is able to find the api package in this situation and takes care of the sys.path hacks for you
So it really depends on what you want to do. In your case, though, since it seems that your goal is to make a proper package at some point, installing through pip install -e is probably your best bet, even if it is not perfect yet.
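Regarding the pytest bullet above, one concrete way to lean on it (a well-known trick, my addition rather than part of this answer): drop a conftest.py, even an empty one, at the project root. pytest imports conftest.py before collecting tests and prepends its directory to sys.path, so import api then resolves from inside the tests:
myproject/
├── conftest.py   (may be empty; its directory ends up on sys.path)
├── api/
│   └── api.py
└── tests/
    └── test_one.py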
Old answer
As already stated elsewhere, the awful truth is that you have to do ugly hacks to allow imports from sibling modules or the parent package from a __main__ module. The issue is detailed in PEP 366. PEP 3122 attempted to handle imports in a more rational way, but Guido rejected it on the account of
The only use case seems to be running scripts that happen
to be living inside a module's directory, which I've always seen as an
antipattern.
(here)
Though, I use this pattern on a regular basis with
# Ugly hack to allow absolute import from the root folder
# whatever its name is. Please forgive the heresy.
if __name__ == "__main__" and __package__ is None:
    from sys import path
    from os.path import dirname as dir

    path.append(dir(path[0]))
    __package__ = "examples"

import api
Here path[0] is your running script's parent folder and dir(path[0]) your top level folder.
I have still not been able to get relative imports working with this, but it does allow absolute imports from the top level (in your example, api's parent folder).
Here is another alternative that I insert at top of the Python files in tests folder:
# Path hack.
import sys, os
sys.path.insert(0, os.path.abspath('..'))
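Note that os.path.abspath('..') is resolved against the current working directory, not against the test file's location, so this hack only works when you run the tests from the right folder. A CWD-independent variant of the same hack (my sketch, not part of the answer above) anchors on __file__ instead:
import os
import sys

# Prepend the parent of the directory containing this file, regardless of CWD.
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))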
You don't need and shouldn't hack sys.path unless it is necessary and in this case it is not. Use:
import api.api_key # in tests, examples
Run from the project directory: python -m tests.test_one.
You should probably move tests (if they are api's unittests) inside api and run python -m api.test to run all tests (assuming there is __main__.py) or python -m api.test.test_one to run test_one instead.
You could also remove __init__.py from examples (it is not a Python package) and run the examples in a virtualenv where api is installed, e.g., pip install -e . in a virtualenv would install the api package in place if you have a proper setup.py.
I don't yet have the comprehension of Pythonology necessary to see the intended way of sharing code amongst unrelated projects without a sibling/relative import hack. Until that day, this is my solution. For examples or tests to import stuff from ..\api, it would look like:
import sys
import os.path

# Import from sibling directory ..\api
sys.path.append(os.path.dirname(os.path.abspath(__file__)) + "/..")

import api.api
import api.api_key
For sibling package imports, you can use either the insert or the append method of sys.path:
if __name__ == '__main__' and __package__ is None:
    import sys
    from os import path
    sys.path.append(path.dirname(path.dirname(path.abspath(__file__))))

import api
This will work if you are launching your scripts as follows:
python examples/example_one.py
python tests/test_one.py
On the other hand, you can also use a relative import:
if __name__ == '__main__' and __package__ is not None:
    from ..api import api
In this case you will have to launch your script with the '-m' argument (note that, in this case, you must not give the '.py' extension):
python -m packageName.examples.example_one
python -m packageName.tests.test_one
Of course, you can mix the two approaches, so that your script will work no matter how it is called:
if __name__ == '__main__':
    if __package__ is None:
        import sys
        from os import path
        sys.path.append(path.dirname(path.dirname(path.abspath(__file__))))
        import api
    else:
        from ..api import api
For readers in 2021: If you're not confident with pip install -e :
Consider this hierarchy, as recommended by an answer from Relative imports in Python 3:
MyProject
├── src
│ ├── bot
│ │ ├── __init__.py
│ │ ├── main.py
│ │ └── sib1.py
│ └── mod
│ ├── __init__.py
│ └── module1.py
└── main.py
The content of main.py, which is the starting point; we use an absolute import (no leading dots) here:
from src.bot import main

if __name__ == '__main__':
    main.magic_tricks()
The content of bot/main.py, which takes advantage of explicit relative imports:
from .sib1 import my_drink  # Both are explicit relative imports.
from ..mod.module1 import relative_magic

def magic_tricks():
    # Using sub-magic
    relative_magic(inputs=["newbie", "pain"], advice="cheer_up")
    my_drink()
    # Do your work
    ...
Now here comes the reasoning:
When executing python MyProject/main.py, path/to/MyProject is added to sys.path.
The absolute import import src.bot will use it.
The from ..mod part means it will go up one level to MyProject/src.
Can we see it? YES, since path/to/MyProject is on sys.path.
So the point is:
We should put the main script next to MyProject/src, so that when doing relative referencing we never go above src, and the absolute import src. prefix provides the just-fit scope for us: the src/ scope.
See also: ModuleNotFoundError: No module named 'sib1'
TLDR
This method does not require setuptools, path hacks, additional command line arguments, or specifying the top level of the package in every single file of your project.
Just make a script in the parent directory of whatever you are calling to be your __main__ and run everything from there. For further explanation continue reading.
Explanation
This can be accomplished without hacking a new path together, extra command line args, or adding code to each of your programs to recognize its siblings.
The reason this fails, as I believe was mentioned before, is that the programs being called have their __name__ set to __main__. When this occurs, the script being called assumes itself to be at the top level of the package and refuses to recognize scripts in sibling directories.
However, everything under the top level of the directory will still recognize ANYTHING ELSE under the top level. This means the ONLY thing you have to do to get files in sibling directories to recognize/utilize each other is to call them from a script in their parent directory.
Proof of Concept
In a dir with the following structure:
.
|__Main.py
|
|__Siblings
|
|___sib1
| |
| |__call.py
|
|___sib2
|
|__callsib.py
Main.py contains the following code:
import Siblings.sib1.call as call

def main():
    call.Call()

if __name__ == '__main__':
    main()
sib1/call.py contains:
import Siblings.sib2.callsib as callsib

def Call():
    callsib.CallSib()

if __name__ == '__main__':
    Call()
and sib2/callsib.py contains:
def CallSib():
    print("Got Called")

if __name__ == '__main__':
    CallSib()
If you reproduce this example you will notice that calling Main.py will result in "Got Called" being printed as is defined in sib2/callsib.py even though sib2/callsib.py got called through sib1/call.py. However if one were to directly call sib1/call.py (after making appropriate changes to the imports) it throws an exception. Even though it worked when called by the script in its parent directory, it will not work if it believes itself to be on the top level of the package.
You need to look to see how the import statements are written in the related code. If examples/example_one.py uses the following import statement:
import api.api
...then it expects the root directory of the project to be in the system path.
The easiest way to support this without any hacks (as you put it) would be to run the examples from the top level directory, like this:
PYTHONPATH=$PYTHONPATH:. python examples/example_one.py
Just in case someone using Pydev on Eclipse ends up here: you can add the sibling's parent path (and thus the calling module's parent) as an external library folder using Project->Properties and setting External Libraries under the left menu Pydev-PYTHONPATH. Then you can import from your sibling, e.g. from sibling import some_class.
I wanted to comment on the solution provided by np8 but I don't have enough reputation so I'll just mention that you can create a setup.py file exactly as they suggested, and then do pipenv install --dev -e . from the project root directory to turn it into an editable dependency. Then your absolute imports will work e.g. from api.api import foo and you don't have to mess around with system-wide installations.
Documentation
If you're using pytest then the pytest docs describe a method of how to reference source packages from a separate test package.
The suggested project directory structure is:
setup.py
src/
    mypkg/
        __init__.py
        app.py
        view.py
tests/
    __init__.py
    foo/
        __init__.py
        test_view.py
    bar/
        __init__.py
        test_view.py
Contents of the setup.py file:
from setuptools import setup, find_packages
setup(name="PACKAGENAME", packages=find_packages())
Install the packages in editable mode:
pip install -e .
The pytest article references this blog post by Ionel Cristian Mărieș.
I made a sample project to demonstrate how I handled this, which is indeed another sys.path hack as indicated above. Python Sibling Import Example, which relies on:
if __name__ == '__main__':
    import os
    import sys
    sys.path.append(os.getcwd())
This seems to be pretty effective so long as your working directory remains at the root of the Python project.
in your main file add this:
import sys
import os

mainScriptDepth = "../../"  # the depth of the main file from the root of the project
sys.path.append(os.path.abspath(os.path.join(__file__, mainScriptDepth)))
mainScriptDepth is the depth of the main file from the root of the project; in your case it is "../../". Then you can import by specifying the path from the root of your project (from api.api import *).
The problem:
You simply cannot get import mypackage to work in test.py. You will need either an editable install, a change to the path, or changes to __name__ and path.
demo
├── dev
│ └── test.py
└── src
└── mypackage
├── __init__.py
└── module_of_mypackage.py
--------------------------------------------------------------
ValueError: attempted relative import beyond top-level package
The solution:
import sys; sys.path += [sys.path[0][:-3]+"src"]
Put the above before attempting imports in test.py. That's it. You can now import mypackage.
This will work both on Windows and Linux. It will also not care from which path you run your script. It is short enough to slap it anywhere you might need it.
Why it works:
sys.path contains the places, in order, where Python looks for packages during imports if they are not found among the installed site packages. When you run test.py, the first item in sys.path will be something like /mnt/c/Users/username/Desktop/demo/dev, i.e. the directory of the file you ran. The one-liner simply adds the sibling folder to the path, and everything works. You will not have to worry about Windows vs Linux file paths, since we are only editing the last folder name and nothing else. If your project structure is already set in stone for your repository, we can also reasonably just use the magic number 3 to slice away dev and substitute src.
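A slightly more robust spelling of the same idea (my sketch, not the answer's one-liner), using pathlib so that no folder-name lengths are baked in:
import sys
from pathlib import Path

# dev/test.py -> demo -> demo/src, independent of the current working directory.
sys.path.insert(0, str(Path(__file__).resolve().parent.parent / "src"))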
for the main question:
To import a sibling folder as a module:
from .. import siblingfolder
To import a_file.py from a sibling folder as a module:
from ..siblingfolder import a_file
To import a function inside a file in a sibling folder:
from ..siblingfolder.a_file import func_name_exists_in_a_file
The easiest way:
Go to your lib/site-packages folder.
If an easy_install.pth file exists, just edit it and add the directory containing the script that you want to make importable as a module.
If it does not exist, just create one and put your folder there.
After you add it, Python will automatically treat that folder similarly to site-packages, and you can import every script from that folder or subfolder as a module.
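For illustration (my addition, with a made-up path): any *.pth file in site-packages works this way, not only easy_install.pth. A hypothetical myproject.pth containing one absolute path per line is enough:
C:\Users\you\projects\myproject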
I wrote this on my phone, so it is hard to format it nicely.
First, you should avoid having files with the same name as the module itself; it may break other imports.
When you import a file, the interpreter first checks the current directory and then searches the global directories.
Inside examples or tests you can call:
from ..api import api
Project
├── User
│   ├── about.py
│   └── __init__.py
└── Tech
    ├── info.py
    └── __init__.py
Now, if you want to access the about.py module in the User package from the info.py module in the Tech package, you have to bring the cmd (in Windows) path to Project, i.e. C:\Users\Personal\Desktop\Project>, as per the above package example. From this path you have to enter python -m Package_name.module_name.
For example, for the above package we have to do:
C:\Users\Personal\Desktop\Project>python -m Tech.info
Important points
Don't use the .py extension after the info module, i.e. not python -m Tech.info.py.
Enter this where the sibling packages are at the same level.
-m is the flag; to read about it you can type python --help from the cmd.