I'm currently coding an app that is basically structured this way:
main.py
+ Package1
+--- Class1.py
+--- Apps
+ Package2
+--- Class1.py
+--- Apps
So I have two questions:
First, inside both packages there are modules needed by all Apps, e.g. re. Is there a way to import a module for the whole package at once, instead of importing it in every file that needs it?
And, as you can see, Class1 is used in both packages. Is there a good way to share it between both packages to avoid code duplication?
I would strongly recommend against doing this: by separating the imports from the module that uses the functionality, you make it more difficult to track dependencies between modules.
If you really want to do it though, one option would be to create a new module called common_imports (for example) and have it do the imports you are after.
Then in your other modules, add the following:
from common_imports import *
This should give you all the public names from that module (including all the imports).
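A minimal sketch of that idea (common_imports is just the example name, and re/os stand in for whatever modules your files actually need):
# common_imports.py
import re
import os
Any module that then does from common_imports import * can use re and os directly, because a star import also pulls in the module objects bound inside common_imports (unless an __all__ there excludes them).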
To answer your second question, if your two modules named Class1.py are in fact the same, then you should not copy it into both packages. Place it in a package that contains only code common to both, and then import it from there. There is no need to copy the file and try to maintain every change in both copies.
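For example, a layout along these lines (the common package name is only illustrative):
main.py
+ common
+--- __init__.py
+--- Class1.py
+ Package1
+--- Apps
+ Package2
+--- Apps
and inside the Apps of both packages something like from common.Class1 import Class1 (assuming Class1.py defines a class of that name).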
Q1:
You must find a way to import your package, so you have two choices (please correct me if I'm wrong or not thorough):
1. Look at James' solution: define a module that does all the common imports, and import it into your submodules.
2. Import nothing in your main module; instead, import each dependency only once, in the submodule that needs it.
For example (inside the A_1 submodule):
import re

def functionThatUseRe(input):
    pass
Then inside your main module, just do:
try:
    from YourPackage import A_1   # when the package is on the import path
except ImportError:
    import A_1                    # fallback when running from inside the package directory
A_1.functionThatUseRe("")
This way you avoid repeating the same import across multiple files.
Q2: put your Class1.py in the same directory as your main module, or move it to another folder, such as Package1(&2).Apps, then
import Class1
and start using the code from there.
The Challenge
I am working on a Python project that will act as a translation layer for the SCPI command line interface on scientific instruments.
It is similar to the concept described in Multi layer package in python; however, my situation is slightly more complex and I can't seem to figure out how to make it work.
The following is a representation of my project structure (I am happy to change it if it is required):
Some things to keep in mind:
- The files in translator_lib.instruments.supplierX.moduleX are named only class1.py, class2.py and class3.py; the rest of the filename shown is just a reference to where they are derived from.
- Many of the modules and classes inside supplier1 and supplier2 have the same names (as in the example).
- Every directory contains an __init__.py file.
The __init__.py files (based on the example layout) look as follows (note: I only have one module for testing):
translator_lib\__init__.py
from .instruments import supplier1
__all__ = ['supplier1']
translator_lib\instruments\__init__.py
from .supplier1 import module1
__all__ = ['module1']
What I'm trying to do
Compile three libraries called my_translator_lib, supplier1_translator_lib and supplier2_translator_lib.
Reason
The development team would import my_translator_lib to do what they need to, but if we want to send sample code to supplier1, we want to send them supplier1_translator_lib, and they should only be able to import supplier1_translator_lib.
Example 1: Developer
from translator_lib.instruments import supplier1
from translator_lib.instruments import supplier2

class DoStuff:
    def __init__(self):
        self.sup1_class1 = supplier1.module1.class1.class1()
        self.sup2_class1 = supplier2.module1.class1.class1()
Example 2: Supplier 1
from supplier1_translator_lib import module1
from supplier1_translator_lib import module2

class DoStuff:
    def __init__(self):
        self.class1 = module1.class1.class1()
        self.class2 = module1.class2.class2()
I've tried multiple combinations and sections from How to create a Python library and Deep dive: Create and publish your first Python library. I managed to create a library and install it, but ultimately I can only import my_translator_lib; nothing else is visible or available.
Any help in this regard would be truly appreciated.
Have you created the __init__.py files?
You need to have them in every subfolder of your module.
Files named __init__.py are used to mark directories on disk as Python package directories. Try to follow this:
mydir/spam/__init__.py
mydir/spam/module.py
If this does not solve your problem, as a last resort you can try sys.path.append('path_to_other_modules').
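A minimal sketch of that last-resort fallback (the directory and module names are just placeholders):
import sys

# make an extra directory importable at runtime
sys.path.append('path_to_other_modules')

import some_module  # hypothetical module that lives in that directory
Prefer the __init__.py approach where you can; path manipulation tends to hide the real dependency structure.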
root
- Module_1
  - utils.py
- Module_2
  - basic.py
I have a couple of functions in utils.py that I want to use in basic.py. I have tried every possible option but am not able to do it, and I have looked at many answers; I don't want to use sys.path. Is it possible? Thank you in advance (:
If I have to add __init__.py, in which directories should I add it?
In fact there are multiple ways.
I personally would suggest a way where you do not have to mess with the environment variable PYTHONPATH and where you do not have to change sys.path.
The simplest suggestion, preferred if it is possible in your context:
If you can, then just move Module_2/basic.py one level up, so that it is located in the parent directory of Module_1 and Module_2.
create an empty file Module_1/__init__.py
and use the following import in basic.py:
from Module_1.utils import function
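The resulting layout would look roughly like this (function is a stand-in for whatever utils.py actually defines):
root
- basic.py
- Module_1
  - __init__.py
  - utils.py
- Module_2
Running python basic.py from root then resolves from Module_1.utils import function without touching sys.path or PYTHONPATH.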
My second preferred suggestion:
I'd suggest creating two empty files:
Module_1/__init__.py
Module_2/__init__.py
By the way, I hope the real directory names do not contain any uppercase characters; that is strongly discouraged. Compared to Python 2, Python 3 contains no (or almost no) modules with uppercase letters in their names anymore, so I'd try to keep the same spirit.
Then in basic.py just write:
from Module_1.utils import function
!!! But:
Instead of typing
python Module_2/basic.py
you have to be in the parent directory of Module_1 and Module_2 and type
python -m Module_2.basic
or, if you are on a Unix-like operating system, you could also type
PYTHONPATH="/path/to/parent_of_Module_1" python Module_2/basic.py
My not so preferred suggestion:
It looks simpler, but it is a little more magic and might not be desirable in bigger projects: you can get lost, and when you start using coding-style checkers like flake8 you will get style warnings.
Just add the following lines at the beginning of Module_2/basic.py:
import os
import sys

# add the parent directory of Module_2 (the project root) to the import path
TOPDIR = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
sys.path.insert(0, TOPDIR)
and then import the same way as in my first solution:
from Module_1.utils import function
If you want to make a custom package importable, put __init__.py in that directory. In your case, __init__.py should be in the Module_1 directory. Similarly, if you want to utilise the code of basic.py somewhere else, then put __init__.py in the Module_2 directory.
Try adding __init__.py to the Module_1 folder and importing in basic.py like this:
from Module_1.utils import func
Is it OK to use a module reference with more than two dots in a path? Like in this example:
# Project structure:
# sound
#     __init__.py
#     codecs
#         __init__.py
#     echo
#         __init__.py
#         nix
#             __init__.py
#             way1.py
#             way2.py
# way2.py source code
from .way1 import echo_way1
from ...codecs import cool_codec
# Do something with echo_way1 and cool_codec.
UPD: Changed the example. And I know this will work in practice. But is it a common method of importing or not?
Update Nov. 24, 2020
If you want to dig deeper into Python's relative imports, I strongly recommend this answer.
Is it OK to use a module reference with more than two dots in a path?
Yes. You can use multiple dots in a relative import path, but this only works with the from xxx import yyy syntax, not the import xxx syntax. A single dot, two dots and three dots refer to the current package, the parent package and the grandparent package respectively, and so on.
And I know this will work in practice. But is it a common method of importing or not?
It depends. If your project has a complex directory structure, using absolute imports would be "disgusting". For example:
from sub1.sub2.sub3.sub4.sub5 import yourmethod
In this case, using relative imports will make your code clean and neat. It might look like:
from ...sub5 import yourmethod
From PEP8:
Absolute imports are recommended, as they are usually more readable and tend to be better behaved (or at least give better error messages) if the import system is incorrectly configured (such as when a directory inside a package ends up on sys.path):
import mypkg.sibling
from mypkg import sibling
from mypkg.sibling import example
However, explicit relative imports are an acceptable alternative to absolute imports, especially when dealing with complex package layouts where using absolute imports would be unnecessarily verbose:
from . import sibling
from .sibling import example
Standard library code should avoid complex package layouts and always use absolute imports.
I have a file structure like this:
dir_a
    __init__.py
    mod_1.py
    mod_2.py
dir_b
    __init__.py
    my_mod.py
I want to dynamically import every module in dir_a in my_mod.py, and in order to do that, I have the following inside the __init__.py of dir_a:
import os
import glob

# collect the name of every .py file in this directory, minus the .py extension
__all__ = [os.path.basename(f)[:-3] for f in glob.glob(os.path.dirname(os.path.abspath(__file__)) + "/*.py")]
But when I do:
from dir_a import *
I get ImportError: No module named dir_a
I wonder what causes this; is it because the parent directory containing dir_a and dir_b has to be in PYTHONPATH? In addition, the above approach is dynamic only in the sense that my_mod.py picks up everything when it is first run; if a new module is added to dir_a while it is running, that new module won't be picked up. So is it possible to implement a truly dynamic importing mechanism?
The answer to the first part of your question is a trivial yes: you cannot import dir_a when the directory containing dir_a isn't in the search path.
For the second part, I don't believe that is possible in general - you could listen for 'file created' OS events in dir_a, check whether the created file is a .py, and add it to dir_a.__all__ - but that is complicated, probably expensive, and doesn't retroactively add the new name to the global namespace, since from foo import * only sees what is in foo.__all__ at import time. And changing that would be error-prone - it allows your namespace to change at any time based on an unpredictable external event. Say you do this:
from dir_a import *
bar = 5
And then a bar.py gets added to dir_a. You can't write a "truly dynamic importer" without considering that situation, and deciding what you want to have happen.
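If a manual refresh is acceptable instead of a truly dynamic mechanism, a small rescan helper is one compromise. A minimal sketch, assuming dir_a itself is importable (its parent directory is on the search path) and that you call it explicitly whenever you want to pick up new files:
import importlib
import pkgutil

import dir_a

def rescan(package):
    # import every module currently present in the package and return them by name
    modules = {}
    for _, name, _ in pkgutil.iter_modules(package.__path__):
        modules[name] = importlib.import_module(f"{package.__name__}.{name}")
    return modules

modules = rescan(dir_a)  # e.g. {'mod_1': <module>, 'mod_2': <module>}; call again after files are added
This sidesteps the from dir_a import * problem entirely, because you look names up in the returned dictionary instead of relying on the global namespace.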
I am importing a lot of different scripts, so the top of my file gets cluttered with import statements, e.g.:
from somewhere.fileA import ...
from somewhere.fileB import ...
from somewhere.fileC import ...
...
Is there a way to move all of these somewhere else, so that all I have to do is import that one file instead and it's just one clean import?
I strongly advise against what you want to do. You are repeating the old global-include-file mistake. Although here only one module imports all the others (as opposed to every module importing a global one), the point remains: if there's a valid reason for all those modules to be collected under a common name, fine; if there's no reason, they should be kept as separate imports. The reason is documentation. If I open your file and see only one import, I get no information about what is imported or where it comes from. If, on the other hand, I see the list of imports, I know at a glance what is needed and what is not.
Also, there's another important mistake I assume you are making. When you write
from somewhere.fileA import ...
from somewhere.fileB import ...
from somewhere.fileC import ...
I assume you are importing, for example, a class, like this
from somewhere.fileA import MyClass
This is wrong. The following alternative is much better:
from somewhere import fileA
<later>
a = fileA.MyClass()
Why? Two reasons: first, namespacing. If you have two modules each defining a class named MyClass, you would have a clash. Second, documentation. Suppose you use the first option and I find the following line in your code:
a = MyClass()
Now I have no idea where this MyClass comes from, and I will have to grep through all your files to find it. Having it qualified with the module name lets me immediately understand where it comes from, and immediately find, via a simple search, where things from the fileA module are used in your program.
Final note: when you say "fileA" you are making a mistake. These are modules (or packages), not files. Modules map to files and packages map to directories, but they may also map to egg files, and you can even create a module with no file at all. This is about naming concepts, and it's a side issue.
Of course there is; just create a file called myimports.py in the same directory where your main file is and put your imports there. Then you can simply use from myimports import * in your main script.
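A minimal sketch, reusing the placeholder names somewhere, fileA/fileB/fileC and MyClass from this thread:
# myimports.py - collects the imports in one place
from somewhere import fileA
from somewhere import fileB
from somewhere import fileC

# main script
from myimports import *

a = fileA.MyClass()  # fileA is visible here because myimports imported it
Keep in mind the caveats from the previous answer, though: a star import hides where names come from, and most linters will flag it.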