Is it OK to reference a module with more than two dots in a path? Like in this example:
# Project structure:
# sound/
#     __init__.py
#     codecs/
#         __init__.py
#     echo/
#         __init__.py
#         nix/
#             __init__.py
#             way1.py
#             way2.py
# way2.py source code
from .way1 import echo_way1
from ...codecs import cool_codec
# Do something with echo_way1 and cool_codec.
UPD: I changed the example. And I know this will work in practice. But is it a common method of importing or not?
Update Nov. 24, 2020
If you want to dig deeper into Python's relative imports, I strongly recommend this answer.
Is it OK to reference a module with more than two dots in a path?
Yes. You can use multiple dots in a relative import path, but it is only feasible with the from xxx import yyy syntax, not the import xxx syntax. Moreover, a single dot, two dots, and three dots mean the current package, the parent package, and the grandparent package respectively, and so on.
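For example, with the sound package layout from the question, the dots in way2.py resolve like this (a minimal sketch; cool_codec is taken from the question's own code):
# sound/echo/nix/way2.py
from .way1 import echo_way1         # one dot: the current package (sound.echo.nix)
from ...codecs import cool_codec    # three dots: the grandparent package (sound)
# import ...codecs                  # SyntaxError: dots only work in the from-form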
And I know this will work in practice. But is it a common method of importing or not?
It depends. If your project has a complex directory structure, using absolute imports would be "disgusting". For example,
from sub1.sub2.sub3.sub4.sub5 import yourmethod
In this case, using relative imports will make your code clean and neat. It might look like:
from ...sub5 import yourmethod
From PEP8:
Absolute imports are recommended, as they are usually more readable and tend to be better behaved (or at least give better error messages) if the import system is incorrectly configured (such as when a directory inside a package ends up on sys.path):
import mypkg.sibling
from mypkg import sibling
from mypkg.sibling import example
However, explicit relative imports are an acceptable alternative to absolute imports, especially when dealing with complex package layouts where using absolute imports would be unnecessarily verbose:
from . import sibling
from .sibling import example
Standard library code should avoid complex package layouts and always use absolute imports.
root
- Module_1
  - utils.py
- Module_2
  - basic.py
I have a couple of functions in utils.py that I want to use in basic.py. I have tried every option I could find and looked at many answers, but I can't get it to work, and I don't want to use sys.path. Is it possible? Thank you in advance (:
If I have to add __init__.py, where should I add it?
In fact, there are multiple ways.
I personally would suggest a way where you do not have to mess with the environment variable PYTHONPATH and where you do not have to change sys.path.
The simplest suggestion, and the preferred one if it is possible in your context:
If you can, then just move Module_2/basic.py one level up, so that it is located in the parent directory of Module_1 and Module_2.
Create an empty file Module_1/__init__.py
and use the following import in basic.py:
from Module_1.utils import function
My second preferred suggestion:
I'd suggest creating two empty files:
Module_1/__init__.py
Module_2/__init__.py
By the way, I hope the real directory names do not contain any uppercase characters; this is strongly discouraged. Compared to Python 2, Python 3 contains no (or almost no) modules with uppercase letters anymore, so I'd try to keep the same spirit.
Then in basic.py just write:
from Module_1.utils import function
!!! But:
Instead of typing
python Module_2/basic.py
you have to be in the parent directory of Module_1 and Module_2 and type
python -m Module_2.basic
or, if you are on a Unix-like operating system, you could also type
PYTHONPATH="/path/to/parent_of_Module_1" python Module_2/basic.py
My not so preferred suggestion:
It looks simpler, but it is a little more magic and might not be desirable in bigger projects: you can get lost, and when you start using coding style checkers like flake8 you will get coding style warnings.
Just add the following lines at the beginning of Module_2/basic.py:
import os
import sys

# put the parent directory of Module_2 (the project root) at the front of sys.path
TOPDIR = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
sys.path.insert(0, TOPDIR)
and then import the same way as in my first solution:
from Module_1.utils import function
If you want to make a custom package importable, put __init__.py in that directory. In your case, __init__.py should be in the Module_1 directory. Similarly, if you want to use the code of basic.py somewhere else, put __init__.py in the Module_2 directory.
Try adding __init__.py in the Module_1 folder and importing in basic.py like this:
from Module_1.utils import func
Let's say I have a folder structure as below.
project/
-> app/
--> __init__.py (has db = SQLAlchemy(app))
--> model.py
I need to import db in model.py. I can either import it using
from app import db
or
from . import db
Is there a difference between the two? Does one method have any advantages over the other method?
Absolute imports are preferred because they are quite clear and straightforward. It is easy to tell exactly where the imported resource is, just by looking at the statement. In fact, PEP 8 explicitly recommends absolute imports.
Sometimes, however, absolute imports can get quite verbose, depending on the complexity of the directory structure. Imagine having a statement like this:
from package1.subpackage2.subpackage3.subpackage4.module5 import function6
This looks ridiculous! Right?
So relative imports come into the picture. A relative import specifies the resource to be imported relative to the current location, that is, the location of the import statement.
The complex import statement above becomes:
from ..subpackage4.module5 import function6
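Applied to the layout in the question above, either form binds the same db object (a minimal sketch; the User model is purely hypothetical, just to show db in use):
# app/model.py
from app import db    # absolute: resolved via the top-level package name
# from . import db    # relative: resolved via this module's parent package (app)

class User(db.Model):                             # hypothetical example model
    id = db.Column(db.Integer, primary_key=True)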
Hope this helps!
I have a Python package called Util. It includes a bunch of files. Here are the import statements at the top of one of the files:
from config_util import ConfigUtil #ConfigUtil is a class inside the config_util module
import error_helper as eh
This works fine when I run my unit tests.
When I install Util in a virtual environment and use it from another package, everything breaks. I need to change the import statements to
from Util.config_util import ConfigUtil
from Util import error_helper as eh
and then everything works as before. So is there any point in using the first form or is it safe to say that it is better practice to always use the second form?
If there is no point in using the first form, then why is it allowed?
Just wrong:
from config_util import ConfigUtil
import error_helper as eh
It will only work if you happen to run Python from inside the Util directory, so that the imports resolve against the current working directory, or if you have messed with sys.path using some bad hack.
Right (using absolute imports):
from Util.config_util import ConfigUtil
import Util.error_helper as eh
Also right (using relative imports):
from .config_util import ConfigUtil
from . import error_helper as eh
There is no particular advantage to using relative imports, only a couple of minor things I can think of:
- Saves a few bytes in the source file (so what / who cares?)
- Enables you to rename the top level without editing import statements in source code, as sketched below (...but how often do you do that?)
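For instance, a hypothetical module inside Util that uses a relative import never mentions the package name at all:
# Util/some_module.py  (hypothetical file, just for illustration)
from .config_util import ConfigUtil   # no reference to the name "Util"
If the top-level directory Util were later renamed, this import would keep working unchanged; only external code doing from Util... import ... would need editing.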
For your practical problems, maybe this answer can help you.
Regarding your direct question: there's not a lot to it, but they let you move files and rename containing directories more easily. You may also prefer relative imports for stylistic reasons; I sure do.
The semantics are the same if the paths are correct. If your module is foo.bar, then from foo.baz import Baz and from .baz import Baz are the same. If they don't do the same, then you're likely calling your Python file as a script (python foo/bar.py), in which case it will be module __main__ instead of foo.bar.
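A minimal sketch of that failure mode, assuming a layout of foo/__init__.py, foo/bar.py and foo/baz.py:
# foo/bar.py
from .baz import Baz   # fine with "python -m foo.bar" (module name is foo.bar);
                        # raises ImportError with "python foo/bar.py", because the
                        # module is then __main__ and has no parent package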
ADVICE REQUEST: Best Practices for Python Imports
I need advice re: how major projects do Python imports and the standard / Pythonic way to set this up.
I have a project Foobar that's a subproject of another, much larger and well-used project Dammit. My project directory looks like this:
/opt/foobar
/opt/foobar/bin/startFile.py
/opt/foobar/bin/tests/test_startFile.py
/opt/foobar/foobarLib/someModule.py
/opt/foobar/foobarLib/tests/test_someModule.py
So, for the Dammit project, I add to PYTHONPATH:
export PYTHONPATH=$PYTHONPATH:/opt/foobar
This lets me add a very clear, easy-to-track-down import of:
from foobarLib.someModule import SomeClass
to all 3 of:
* Dammit - using foobarLib in the import makes it clear to everyone that it's not in the Dammit project; it's in Foobar.
* My startFile.py is an adjunct process that uses the same PYTHONPATH-based import.
* test_someModule.py has an explicit import, too.
PROBLEM: This only works for foobarLib, not bin.
I have tests I want to run in /opt/foobar/bin/tests/test_startFile.py. But how do I set up the path and do the import? Should I set up the path like this:
import os
import sys
PROJ_ROOT = os.path.abspath(os.path.join(os.path.dirname(__file__), ".."))
sys.path.insert(0, PROJ_ROOT)
Or should I rename the bin dir to foobarBin so I can do the import as:
from foobarBin.startFile import StartFile
I'm tempted to do the following:
/opt/foobarProject/foobar
/opt/foobarProject/foobar/bin/startFile.py
/opt/foobarProject/foobar/bin/tests/test_startFile.py
/opt/foobarProject/foobar/foobarLib/someModule.py
/opt/foobarProject/foobar/foobarLib/tests/test_someModule.py
then, I can do all my imports with:
import foobar.bin.startFile
import foobar.foobarLib.someModule
Basically, for large Python projects (AND THEIR ADJUNCTS), it seems to me we have these goals:
* minimize the number of dirs added to PYTHONPATH;
* make it obvious where imports are coming from. That is, people use 'import lib.thing' a lot, and if there's more than one directory on the PYTHONPATH named 'lib', it's troublesome / non-obvious where that is, short of invoking python and searching sys.modules, etc.;
* minimize the number of times we add paths to sys.path, since that happens at runtime and is somewhat non-obvious.
I've used Python a long time and have been part of projects that do this in various ways, some of them more stupid and some less so. I guess I'm wondering whether there is a best practice here, and the reason for it? Is my convention of adding a foobarProject layer okay, good, bad, or just one more way among many?
I'm currently coding an app which is basically structured this way:
main.py
+ Package1
+--- Class1.py
+--- Apps
+ Package2
+--- Class1.py
+--- Apps
So I have two questions:
First, inside both packages there are modules needed by all Apps, e.g. re. Is there a way I can import a module for the whole package at once, instead of importing it in every file that needs it?
And, as you can see, Class1 is used in both packages. Is there a good way to share it between both packages to avoid code duplication?
I would strongly recommend against doing this: by separating the imports from the module that uses the functionality, you make it more difficult to track dependencies between modules.
If you really want to do it though, one option would be to create a new module called common_imports (for example) and have it do the imports you are after.
Then in your other modules, add the following:
from common_imports import *
This should give you all the public names from that module (including all the imports).
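A minimal sketch of what such a common_imports module might contain (re comes from the question; the other modules are just examples):
# common_imports.py
import re
import os
import json

# names re-exported to anyone doing "from common_imports import *"
__all__ = ["re", "os", "json"]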
To answer your second question, if your two modules named Class1.py are in fact the same, then you should not copy it to both packages. Place it in a package which will contain only code which is common to both, and then import it. It is absolutely not necessary to copy the file and try to maintain each change in both copies.
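For example (the package name common is just an illustration, not from the original post): put Class1.py in a new top-level package common/ with an empty common/__init__.py next to main.py, and in the Apps of both packages import it with:
from common.Class1 import Class1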
Q1:
You must find a way to import your package, so you have two choices:
(Please correct me if I'm wrong or not thorough)
1. Look at James' solution above: define a module, put all the imports inside it, and then import that module in your submodules.
2. Import nothing in your main module; instead, import what you need once in each submodule.
For example (inside the A_1 submodule):
import re
def functionThatUseRe(input):
    pass
Then inside your main module, just do:
try:
    from YourPackage import A_1  # Windows
except ImportError:
    import A_1  # macOS
A_1.functionThatUseRe("")
This way you completely avoid importing modules multiple times.
Q2: Put your Class1.py in the same directory as your main module, or move it to another shared folder, and in the Apps of Package1 (and Package2) do:
import Class1
and start using the code from there.