I have a very simple test Python 3 project with the following file structure:
test/a.py
test/b.py
test/__init__.py
Everywhere I read, people say that in a.py I should import b.py using an absolute path:
from test.b import *
However, when I try I get the following error:
Traceback (most recent call last):
File "a.py", line 1, in <module>
from test.b import *
ModuleNotFoundError: No module named 'test.b'
I understand that I can import b.py using from b import *; however, this is not what people recommend. They all recommend from test.b import *, but I can't get even this simple example to work.
As Martijn said in the comment, it depends on how you call a.py.
If you call it directly from within the directory by typing python a.py, you will get the error above.
However, if you run it as python -m test.a from one directory above the test directory, the import will work just fine.
The common directory structure is like this:
test/a.py
test/b.py
test/__init__.py
run.py
The main code should go into run.py. When you want to import a.py in run.py, just write from test.a import * or something like that. And if you need to import b.py in a.py, do as you have been told and use from test.b import *. Then running run.py will give the correct result.
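For illustration, here is a minimal sketch of what the files could contain under that layout (the helper names hello and run_a are made up; only the import lines come from the answer above):
# test/b.py
def hello():
    print("hello from b")

# test/a.py -- absolute import, also runnable as: python -m test.a
from test.b import *

def run_a():
    hello()

# run.py -- lives next to the test/ directory
from test.a import *

run_a()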
Related
I am trying to make a package with modules inside that import each other and I'm very confused on how it works. I made three files: test1.py, test2.py and __init__.py which are contained in my package directory testpack.
The contents of the files are as follows:
# test1.py
def demo():
    print("123")
    return

# test2.py
import test1

def demo2():
    print("test 2")
    test1.demo()

if __name__ == "__main__":
    demo2()
The __init__.py file is left empty. When I run test2.py on its own I get the expected output:
test 2
123
I also have a main.py file in the parent directory of testpack:
# main.py
import testpack.test1
import testpack.test2
testpack.test2.demo2()
Running this gives the error:
Traceback (most recent call last):
File "E:/packages/main.py", line 2, in <module>
import testpack.test2
File "E:\packages\testpack\test2.py", line 1, in <module>
import test1
ModuleNotFoundError: No module named 'test1'
Following some tips, I tried running a different version of the code, which yielded the same error:
# main.py alternative
from testpack.test2 import demo2
demo2()
What am I missing? My logic was that when importing testpack.test2, Python goes into test2.py, where it has to import test1. Importing test1 shouldn't be an issue, because we can successfully run test2.py on its own, so it would construct demo2() using the function from test1 and I'd be all done, but I'm obviously wrong. At one point I thought that running import testpack.test1 before import testpack.test2 could be causing problems because I'd be importing test1 twice (once inside test2 as well), but since the alternative version of main.py doesn't run either, I'm completely lost.
SOLVED: As per MisterMiyagi's suggestion, the solution was to use from . import test1 instead of import test1 in test2.py. The single dot means the module is imported from the package containing the current file; a double dot would go one package level further up, and so on. I found this article helpful: https://realpython.com/absolute-vs-relative-python-imports/
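For completeness, a sketch of test2.py after the fix. Note that with the relative import the file can no longer be run directly as python test2.py; run it as python -m testpack.test2 from the parent directory instead, while main.py keeps working unchanged:
# testpack/test2.py
from . import test1

def demo2():
    print("test 2")
    test1.demo()

if __name__ == "__main__":
    demo2()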
I have a project structure like this:
python-modinfo
|--funcs
|  |--__init__.py
|  |--funcs.py
|--modules
|  |--File-Operations
|     |--os_info.py
This is __init__.py:
from funcs.funcs import *
This is os_info.py:
import funcs.funcs
and this is terminal output:
Traceback (most recent call last):
File "os_info.py", line 1, in <module>
import funcs.funcs
ModuleNotFoundError: No module named 'funcs'
What is my mistake in creating this Python package?
If you are just looking for a quick solution, do this in os_info.py:
import sys
sys.path.append('../../funcs')
from funcs import *
os_info.py doesn't know where funcs.py is located, so you have to tell it by putting that location on sys.path. Keep in mind that a relative path such as '../../funcs' is resolved against the current working directory, so this quick fix only works when the script is started from the File-Operations directory.
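A slightly more robust variant (my own sketch, not part of the original answer) computes the path from the file's own location instead of the working directory:
# os_info.py -- sketch, assuming the layout shown in the question
import os
import sys

# python-modinfo/ is two directories above this file
project_root = os.path.abspath(os.path.join(os.path.dirname(__file__), "..", ".."))
sys.path.append(os.path.join(project_root, "funcs"))

from funcs import *  # now resolves to funcs/funcs.py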
Try writing the __init__.py as from . import funcs, since funcs.py is in the same directory as it.
You can also try the same import in os_info.py.
I get an ImportError because of a wrong path when doing multiple imports in Python. For example, with these files:
folder1
    first.py
    folder2
        second.py
        folder3
            third.py
first.py imports the class in second.py
second.py imports the class in third.py
There's no problem with python ./folder2/second.py, but python first.py gives me an ImportError:
Traceback (most recent call last):
File "first.py", line 1, in <module>
from folder2.second import SecondClass
File "home/test/folder2/second.py", line 1, in <module>
from folder3.third import ThirdClass
ImportError: No module named 'folder3'
It seems that when first.py imports SecondClass from second.py, it also executes second.py's from folder3.third import ThirdClass, and because folder3 isn't on its path it raises an error.
If I change the import path in second.py from from folder3.third import ThirdClass to from folder2.folder3.third import ThirdClass, first.py works, but obviously second.py doesn't work anymore.
Is there a way to solve this?
Edit: Adding
import sys
sys.path.append("./folder1")
in first.py solves the problem.
It would have worked as-is with Python 2; in Python 3 you need to use relative imports within the package. In other words, the import line in second.py should read:
from .folder3.third import ThirdClass
Now a slight complication. I am not sure where your package boundary is, i.e. whether first.py is still part of the package or is the main module (meant to be passed directly to the Python interpreter). Relative imports are based on the module name (__main__ for the main one), and the main module must use absolute imports.
This has two implications for you: what the import statements in first.py look like, and what other form the import in second.py could take. Depending on where your package starts, apart from the example given above, you could also write (if first.py is your main module):
from folder2.folder3.third import ThirdClass
or (if folder1 and first.py are already part of the package and imported from a main module elsewhere):
from folder1.folder2.folder3.third import ThirdClass
All the fun details are, as always, in the corresponding PEP 328.
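As a concrete sketch of the first configuration (my assumption about the layout: first.py is the main module, and folder2 and folder3 are packages with empty __init__.py files):
folder1
    first.py
    folder2
        __init__.py
        second.py
        folder3
            __init__.py
            third.py

# first.py -- main module, absolute import
from folder2.second import SecondClass

# second.py -- inside the package, relative import
from .folder3.third import ThirdClass
With this layout, running python first.py from inside folder1 resolves both imports.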
Suppose I have a folder structure that looks like this:
.
├── A
│   ├── a.py
│   └── b.py
└── main.py
The files have the following content:
b.py:
class BClass:
    pass
a.py:
from b import BClass
main.py:
from A import a
If I run python3.3 A/a.py or python3.3 A/b.py, there are no errors. However, if I run python3.3 main.py, the following error occurs:
Traceback (most recent call last):
File "main.py", line 1, in <module>
from A import a
File "/tmp/python_imports/A/a.py", line 1, in <module>
from b import BClass
ImportError: No module named 'b'
Changing the import line in a.py to import A.b works, but obviously python3.3 A/a.py will fail then. I am not actually interested in running python3.3 A/a.py, but I want the module to be importable from multiple locations. Therefore a.py should import b.py correctly regardless of where a.py is imported from.
How can this issue be resolved?
Besides the __init__.py I mentioned in my comment, which is mandatory for packages, you need to import the sibling module relatively:
from .b import BClass
Then it also works in Python 3.
Alternatively you can of course import the full name:
from A.b import BClass
But then your module isn't relocatable as easily within your package tree.
Neither way, though, lets you use a.py as a standalone script. To achieve that you would need to surround the import statement with try/except and fall back to a different version when the first one fails (the exception raised for a failed relative import differs across Python versions, hence the broad except clause):
try:
    from .b import BClass
except (ImportError, SystemError, ValueError):
    from b import BClass
But that is understandable. In a larger system, modules might depend on other modules somewhere in the package; otherwise they probably should not be part of a package but standalone. And if there are such dependencies, using a module as if it were standalone will of course be a problem.
You need an __init__.py file (empty will be just fine) in the A directory. Otherwise, python won't recognize it as a package.
Now that your A is a package, you should use either absolute imports or explicit relative imports. In this case, in A/a.py, use either from A.b import BClass or from .b import BClass.
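Putting the two answers together, a sketch of the fixed layout (the empty __init__.py and the final print check are my additions for illustration):
.
├── A
│   ├── __init__.py
│   ├── a.py
│   └── b.py
└── main.py

# A/a.py
from .b import BClass   # or: from A.b import BClass

# main.py
from A import a

print(a.BClass)
With this in place, python3.3 main.py imports A.a, which in turn resolves .b relative to the package A.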
I have the following directory structure:
test1/
test1/a.py
test1/test2/b.py
b.py needs to import a class in a.py, so I can add the following lines to b.py before importing a:
import os, sys
sys.path.append(os.path.dirname(sys.argv[0]) + "/..")
This works and I can invoke b.py from any directory and it is able to import a.
But this fails when I write a script in another directory to invoke this file using execfile().
I tried relative imports, but I get an "Attempted relative import in non-package" error:
from ..a import someclass as cls
I have __init__.py in both test1 and test2.
Does someone have an idea how to make it work?
Is PYTHONPATH the way to go?
The problem is that execfile evaluates the file you are calling as plain Python code, so every relative import statement inside b.py (and inside any package module it imports) gets resolved in the context of your calling script, not of the package.
One solution is to not use any relative import paths within the package, and to make sure the directory containing the test1 package is on your PYTHONPATH (or sys.path) as well.
b.py
from test1 import a
With the parent directory of test1 on your path, the import of a should succeed under execfile:
>>> import sys
>>> sys.path.append('/path/to/parent/of_test1')
>>> execfile('/path/to/parent/of_test1/test1/test2/b.py')
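As a side note, execfile only exists in Python 2. If you need the same pattern on Python 3, a rough equivalent (my assumption, not part of the original answer) is runpy.run_path:
>>> import sys, runpy
>>> sys.path.append('/path/to/parent/of_test1')
>>> runpy.run_path('/path/to/parent/of_test1/test1/test2/b.py', run_name='__main__')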