Why can I import successfully without __init__.py? - python

What exactly is the use of __init__.py? Yes, I know this file makes a directory into an importable package. However, consider the following example:
project/
    foo/
        __init__.py
        a.py
    bar/
        b.py
If I want to import a into b, I have to add the following statements:
sys.path.append('/path_to_foo')
import foo.a
This runs successfully with or without __init__.py. However, if there is no sys.path.append statement, a "no module" error occurs, again with or without __init__.py. This makes it seem like only the system path matters, and that __init__.py has no effect.
Why would this import work without __init__.py?

__init__.py has nothing to do with whether Python can find your package. You've run your code in such a way that your package isn't on the search path by default, but if you had run it differently or configured your PYTHONPATH differently, the sys.path.append would have been unnecessary.
__init__.py used to be necessary to create a package, and in most cases, you should still provide it. Since Python 3.3, though, a folder without an __init__.py can be considered part of an implicit namespace package, a feature for splitting a package across multiple directories.
During import processing, the import machinery will continue to iterate over each directory in the parent path as it does in Python 3.2. While looking for a module or package named "foo", for each directory in the parent path:
If <directory>/foo/__init__.py is found, a regular package is imported and returned.
If not, but <directory>/foo.{py,pyc,so,pyd} is found, a module is imported and returned. The exact list of extensions varies by platform and whether the -O flag is specified. The list here is representative.
If not, but <directory>/foo is found and is a directory, it is recorded and the scan continues with the next directory in the parent path.
Otherwise the scan continues with the next directory in the parent path.
If the scan completes without returning a module or package, and at least one directory was recorded, then a namespace package is created.

If you really want to avoid __init__.py for some reason, appending to sys.path is not the way to do it. Rather, create a module object and set its __path__ to a list of directories.
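A minimal sketch of that approach, using the /path_to_foo layout from the question (adding an __init__.py is almost always the better choice):
import sys, types

foo = types.ModuleType("foo")          # build an empty module object by hand
foo.__path__ = ["/path_to_foo/foo"]    # directories in which submodules of foo live
sys.modules["foo"] = foo               # register it so the import machinery can see it

import foo.a                           # now resolved by searching foo.__path__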

If I want to import a into b, I have to add the following statements:
No! You'd just say import foo.a. All of this assumes you run the entire package at once using python -m main.module, where main.module is the entry point to your whole application. It imports all the other modules, and the modules that import further modules look for them from the root of the project. For instance, foo/bar/c.py would import its sibling as import foo.bar.b.
Then it seems that only the system path matters and __init__.py does not have any effect.
You need to modify sys.path only when you are importing modules from locations that are not in your project or in the places where Python looks for libraries. __init__.py not only makes a folder look like a package; it also does a few more things, such as "exporting" names to the outside world (__all__).
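For example, a foo/__init__.py for the layout in the question might look like this (a minimal sketch, not something prescribed by the answer above):
# foo/__init__.py
from . import a          # make foo.a available as soon as foo is imported

__all__ = ["a"]          # controls what "from foo import *" exports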

When you import something it has to either:
Retrieve an already loaded module or
Load the module that was imported
When you do import foo and Python finds a folder called foo in a folder on your sys.path, it will look in that folder for an __init__.py for foo to be considered the top-level module.
(Note that if the package is not on your sys.path then you would need to append its location to be able to import it.)
If that is not present, it will look for an __init__.pyc version, possibly in the __pycache__ folder; if that is also missing, then the folder foo is not considered a loadable Python package. If no other options for foo are found, an ImportError is raised.
If you try deleting the __init__.pyc file as well, you will see that the initializer script for a package is indeed necessary.

Related

File not found on import when the same script is imported onto two other python scripts [duplicate]

This is a python newbie question:
I have the following directory structure:
test
-- test_file.py
a
-- b
   -- module.py
where test, a and b are folders. Both test and a are on the same level.
module.py has a class called shape, and I want to instantiate an instance of it in test_file.py. How can I do so?
I have tried:
from a.b import module
but I got:
ImportError: No module named a.b
What you want is a relative import like:
from ..a.b import module
The problem with this is that it doesn't work if you are calling test_file.py as your main module. As stated here:
Note that both explicit and implicit relative imports are based on the name of the current module. Since the name of the main module is always "__main__", modules intended for use as the main module of a Python application should always use absolute imports.
So, if you want to call test_file.py as your main module, then you should consider changing the structure of your modules and using an absolute import, else just use the relative import from above.
The directory a needs to be a package. Add an __init__.py file to make it a package, which is a step up from being a simple directory.
The directory b also needs to be a subpackage of a. Add an __init__.py file.
The directory test should probably also be a package. Hard to say if this is necessary or not. It's usually a good idea for every directory of Python modules to be a formal package.
In order to import, the package needs to be on sys.path; this is built from the PYTHONPATH environment variable. By default the installed site-packages and the current working directory are (effectively) the only two places where a package can be found.
That means that a must either be installed, or, your current working directory must also be a package one level above a.
OR, you need to set your PYTHONPATH environment variable to include a.
http://docs.python.org/tutorial/modules.html#the-module-search-path
http://docs.python.org/using/cmdline.html#envvar-PYTHONPATH
Also, http://docs.python.org/library/site.html for complete information on how sys.path is built.
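Putting that together, here is one sketch of test_file.py that works without setting PYTHONPATH; it assumes you have added the __init__.py files described above and that module.py defines the shape class:
# test/test_file.py
import os, sys

# make the directory that contains a/ importable (here: the parent of test/)
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), "..")))

from a.b import module

s = module.shape()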
The first thing to do would be to quickly browse the official docs on this.
To make a directory a package, you'll have to add a __init__.py file. This means that you'll have such a file in the a and b directories. Then you can directly do an
import a.b.module
But you'll have to refer to it as a.b.module which is tedious so you can use the as form of the import like so
import a.b.module as mod #shorter name
and refer to it as mod.
Then you can instantiate things inside mod using the regular conventions like mod.shape().
There are a few other subtleties. Please go through the docs for details.

Unable to import class even though I already have __init__.py files

I'm trying to import a class in a different directory to another file, but can't seem to get it to work. I know this question has been asked a lot and I have looked through multiple stackoverflow solutions and at https://docs.python.org/3/tutorial/modules.html#packages
1: Importing files from different folder
2: import python file in another directory failed
I want to try to just use the method containing just __init__.py file instead of doing an import sys
My directory structure is as follows:
django_vue/
    __init__.py
    devices/
        __init__.py
        models.py
    lib/
        __init__.py
        my_file.py
I'm trying to import the class Device from /django_vue/devices/models.py to /django_vue/lib/my_file.py by:
from devices.models import Device
However when I do that I still get the error:
from devices.models import Device
ModuleNotFoundError: No module named 'devices'
I'm not sure what I'm doing wrong since I already have the __init__.py file in both directories. Any help is appreciated. Also, I'm running Python 3.6.
This is the folder structure I'm working with.
.
└── django_vue
    ├── devices
    │   └── models.py
    └── lib
        └── file.py
When you run
$ python file.py
python has no way of knowing what's outside the directory.
python can't go back and then into devices/ just like that.
The easiest way to solve this would be to add the folder that contains devices/ (the django_vue directory) to sys.path. When Python imports a module, it searches for it along sys.path, so adding that directory makes devices available for imports.
Here are my files.
# models.py
Device = 'device'
# file.py
import sys
sys.path.append('..') # adds the parent dir (which is django_vue/) to the path
# django-vue dir has devices/ so now this is available for imports
# importing this works now
from devices.models import Device
print(Device)
Output
django_vue/lib$ python3 file.py
device
Think about it: you are inside my_file.py and import something called devices.
How can Python know where the name devices has come from?
It won't search your entire drive for that module/package.
Relative Import
Use a relative import instead: write from ..devices.models import Device. This tells Python to go up one directory to the parent directory, and that's where it will find the devices package. Your lib module should now work as part of the package.
If, however, you run my_file.py directly (as in python C:/django_vue/lib/my_file.py),
you will still get an error. Not the same error; the new one will be something like
ImportError: attempted relative import with no known parent package
This happens because you are actually running my_file.py directly.
If you think about it, why would you want to run my_file.py by itself when it is clearly part of a package? Maybe you are just testing whether the import works when you use your package. The problem is that it makes it seem like your package's relative imports don't work, even though they do.
Create a main.py in django_vue and write from lib import my_file. This will run your my_file.py and you will notice there is no error.
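A minimal sketch of that entry point. Note that I have used a relative import in main.py and launched it with -m from the directory that contains django_vue/; that detail is my assumption rather than something stated above, but it guarantees every module gets a parent package:
# django_vue/main.py  (hypothetical entry point)
from .lib import my_file   # importing my_file runs it as part of the package,
                           # so its "from ..devices.models import Device" can resolve

# run it from the parent of django_vue/:
#   python -m django_vue.main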
What's happening here
Have you heard of __package__?
If you put print(str(__package__)) in your my_file.py and run my_file.py directly, you will see that it prints None.
However, if you run main.py (the one you just created), you will see that by the time execution reaches my_file.py, __package__ is actually set to something.
Now it all makes sense: the error you originally got said something about no known parent package. If __package__ is undefined, there is no parent package for relative imports to resolve against, because the file was run directly instead of as part of a package.
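You can check this yourself with a quick diagnostic (just a check, not a fix):
# temporarily add to the top of my_file.py
print("__package__ =", __package__)
# run directly (python lib/my_file.py): prints None, as described above,
# so the ".." in the relative import has nothing to resolve against
# run via the package entry point: __package__ is set and the relative import can work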
Consider Absolute imports
You might also want to consider using absolute imports: if you reorganize the package's directory structure while developing, you have to keep updating the import references in the affected files.
That said, you can find IDEs with Python extensions that do this automatically as you move files around; I believe VS Code does.
Replace the __init__ files with __main__.

Do I need to add my project directory to the system path in every script to import a function from another directory?

I'm trying to keep a data science project well-organized so I've created a directory inside my src directory called utils that contains a file called helpers.py, which contains some helper functions that will be used in many scripts. What is the best practice for how I should import func_name from src/utils/helpers.py into a file in a totally different directory, such as src/processing/clean_data.py?
I see answers to this question, and I've implemented a solution that works, but this feels ugly:
sys.path.append(os.path.dirname(os.path.dirname(os.path.dirname(os.path.realpath(__file__)))))
Am I doing this right? Do I need to add this to every script that wants to import func_name, like train_model.py?
My current project folder structure:
myproject
    /notebooks
        notebook.ipynb
    /src
        /processing
            clean_data.py
        /utils
            helpers.py
        /models
            train_model.py
        __init__.py
Example files:
# clean_data.py
import os
import sys
sys.path.append(os.path.dirname(os.path.dirname(os.path.dirname(os.path.realpath(__file__)))))
from src.utils.helpers import func_name
func_name()
# helpers.py
def func_name():
print("I'm a helper function.")
The correct way to do it is to use __init__.py, setup.py and the setuptools Python package:
myPackage/
    myPackage/
        __init__.py
    setup.py
This link has all the steps.
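For reference, a minimal sketch of such a setup.py (the name and version are placeholders, not taken from the question):
# setup.py  (sits in the outer myPackage/ directory, next to the inner package)
from setuptools import setup, find_packages

setup(
    name="myPackage",
    version="0.1.0",
    packages=find_packages(),   # discovers the inner myPackage/ via its __init__.py
)
After that, running pip install -e . in the outer directory installs the package in editable mode so it can be imported from anywhere.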
First of all, let me describe the differences between a Python module and a Python package so that both of us are on the same page. ✌
A module is a single .py file that is imported under one import statement and used. ✔
import aModuleName
# Here 'aModuleName' is just a regular .py file.
Whereas a package is a collection of modules in directories that give a package hierarchy. A package contains a distinct __init__.py file. ✔
from aPackageName import aModuleName
# Here 'aPackageName' is a folder with an '__init__.py' file,
# and 'aModuleName' is just a regular .py file inside it.
Therefore, when we have a project directory named proj-dir of the following structure ⤵
proj-dir
--|--__init__.py
--package1
--|--__init__.py
--|--module1.py
--package2
--|--__init__.py
--|--module2.py
🔎 Notice that I've also added an empty __init__.py into the proj-dir itself which makes it a package too.
👍 Now, if you want to import any python object from module2 of package2 into module1 of package1, then the import statement in the file module1.py would be
from package2.module2 import object2
# if you were to import the entire module2 then,
from package2 import module2
I hope this simple explanation clarifies your doubts on Python imports' mechanism and solves the problem. If not then do comment here. 😊
First of all, let me point out that importing an entire module when you are only going to use part of it is not a good idea. Instead, you can use from to import a specific function from a library/package. Doing this keeps your program more efficient in terms of memory and performance.
To know more, refer to these:
'import module' or 'from module import'
difference between import and from
Next, let us look at the solution.
Before starting on the solution, let me clarify the use of the __init__.py file. It simply tells the Python interpreter that the .py files present there are importable, which means they are modules and may be part of a package.
So, if you have N sub-directories, you have to put an __init__.py file in all of those sub-directories so that they can also be imported. Inside the __init__.py file you can also add some additional information, such as which paths should be included, default functions, variables, scope, etc. To learn more, search for information on the __init__.py file, or pick a Python library and read through its __init__.py file. (Here lies the solution.)
More Info:
modules
Be pythonic
So, as stated by @Sushant Chaudhary, your project structure should be like this:
proj-dir
--|--__init__.py
--package1
--|--__init__.py
--|--module1.py
--package2
--|--__init__.py
--|--module2.py
So now, if I put an __init__.py file under my directory like above, will it be importable and work fine?
Yes and no.
Yes:
If you are importing the modules within that project/package directory. For example, in your case, you are importing package1.module1 in package2.module2 as from package1 import module1.
Here you have to import the base dir inside the sub-modules. Why? The project will run fine if you run the module from the same place, i.e. inside package2 as python module2.py, but it will throw ModuleNotFoundError if you run the module from some other directory, i.e. any path other than under package2, for example under proj-dir as python package2/module2.py. This is what is happening in your case: you are running the module from proj-dir.
So how do you fix this?
1- You have to append the base dir path to the system path in module2.py:
import sys
dir_path = "/absolute/path/to/proj-dir"
sys.path.insert(0, dir_path)
So that module2 will be able to find package1 (and module1 inside it).
2- You have to add all the sub module paths in __init__.py file under proj-dir.
For example:
# __init__.py under lxml
# this is a package
def get_include():
    """
    Returns a list of header include paths (for lxml itself, libxml2
    and libxslt) needed to compile C code against lxml if it was built
    with statically linked libraries.
    """
    import os
    lxml_path = __path__[0]
    include_path = os.path.join(lxml_path, 'includes')
    includes = [include_path, lxml_path]
    for name in os.listdir(include_path):
        path = os.path.join(include_path, name)
        if os.path.isdir(path):
            includes.append(path)
    return includes
This is the __init__.py file of lxml (a Python library for parsing HTML/XML data). You can refer to the __init__.py file of any Python library that has sub-modules, e.g. os or sys. I've mentioned lxml here because I thought it would be easy to understand. You can also check the __init__.py files of other libraries/packages; each will have its own way of defining the paths for its submodules.
No:
If you are trying to import the modules from outside the directory, then you have to export the module path via environment variables so that other modules can find it. This can be done by appending the absolute path of the base dir to PYTHONPATH or to PATH.
To know more:
PATH variables in OS
PYTHONPATH variable
So, to solve your problem, include the paths to all the sub-modules in the __init__.py file under proj-dir and add /absolute/path/to/proj-dir either to PYTHONPATH or to PATH.
Hope this answer explains the usage of __init__.py and solves your problem.
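As an alternative sketch for the original question, clean_data.py can avoid the sys.path surgery entirely if the myproject directory is put on PYTHONPATH as described above (this assumes the __init__.py files, or Python 3.3+ namespace packages, are in place):
# clean_data.py -- relies on PYTHONPATH instead of editing sys.path
# run as:  PYTHONPATH=/absolute/path/to/myproject python src/processing/clean_data.py
from src.utils.helpers import func_name

func_name()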
On Linux, you can just add the path to the parent folder of your src directory to ~/.local/lib/python3.6/site-packages/my_modules.pth. See
Using .pth files. You can then import modules in src from anywhere on your system.
NB1: Replace python3.6 by any version of Python you want to use.
NB2: If you use Python2.7 (don't know for other versions), you will need to create __init__.py (empty) files in src/ and src/utils.
NB3: Any name.pth file is ok for my_modules.pth.
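The .pth file itself is just a list of directories, one per line; here is a sketch with a placeholder path:
# ~/.local/lib/python3.6/site-packages/my_modules.pth
# every non-comment line below is appended to sys.path at interpreter startup
/home/you/myproject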
Yes, you can only import code from installed packages or from files in your working directory or its subdirectories.
The way I see it, your problem would be solved if you had your module or package installed, like any other package one installs and then imports (numpy, xml, json, etc.).
I also have a package I constantly use in all my projects, utilities, and I know importing it is a pain.
Here is a description of how to package a Python application to make it pip-installable:
https://marthall.github.io/blog/how-to-package-a-python-app/
Navigate to your python installation folder
Navigate to lib
Navigate to site-packages
Make a new file called any_thing_you_want.pth
Type .../src/utils/helpers.py inside that file with your favorite text editor
Note: the ellipsis before src/utils/helpers.py will look something like C:/Users/blahblahblah/python_folders/src... <- YOU DO NEED THIS!
This is a cheap way out, but it keeps the code clean and is the least complicated. The downside is that every folder your modules live in needs to be listed in the .pth file. Upside: it works on Windows all the way up to Windows 10.

How to import a simple class in working directory with python?

I read various answer (relative import), but none work to import a simple class. I have this structure:
__init__.py (empty)
my_script.py
other_script.py
my_class.py
I want to use my_class.py in both scripts (my_script.py and other_script.py). As all the answers suggest, I just use:
from .my_class import My_class
but I always get
SystemError: Parent module '' not loaded, cannot perform relative import
I am using Python 3.5 and PyCharm 2016.1.2. Do I need to configure the __init__.py? How can I import a simple class?
Edit
All the files are in the working directory. I just use PyCharm to run them, and I wasn't having problems until I tried to import the class.
Ensure that your current working directory is not the one that contains these files. For example, if the path to __init__.py is spam/eggs/__init__.py, make sure you are working in directory spam—or alternatively, that /path/to/spam is in sys.path and you are working in some third place. Either way, do not attempt to work in directory eggs. To test your code, you say import eggs.
The reasoning is as follows. Since you're using __init__.py you clearly want to treat this collection of files as a package. To work as a package, a directory must fulfill both the following criteria:
it contains a file called __init__.py
its parent directory is part of the search path (or the parent directory is your current working directory); in effect, a package directory must masquerade as a file, i.e. be findable in exactly the same way that a single-file module might be found.
If you're working inside the package directory you may not be fulfilling criterion 2. You can do a straightforward import my_class from there, certainly, but the "relative import" thing you're trying is a feature that only a fully-working package will support.
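Concretely, since all three files sit in the same working directory, a plain absolute import is enough (a sketch, assuming my_class.py defines My_class as in the question):
# my_script.py
from my_class import My_class   # no leading dot: plain module name, found next to this script

obj = My_class()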

Importing correctly with pytest

I just got set up to use pytest with Python 2.6. It has worked well so far with the exception of handling "import" statements: I can't seem to get pytest to respond to imports in the same way that my program does.
My directory structure is as follows:
src/
    main.py
    util.py
    test/
        test_util.py
    geom/
        vector.py
        region.py
        test/
            test_vector.py
            test_region.py
To run, I call python main.py from src/.
In main.py, I import both vector and region with
from geom.region import Region
from geom.vector import Vector
In vector.py, I import region with
from geom.region import Region
These all work fine when I run the code in a standard run. However, when I call "py.test" from src/, it consistently exits with import errors.
Some Problems and My Solution Attempts
My first problem was that, when running "test/test_foo.py", py.test could not "import foo.py" directly. I solved this by using the "imp" tool. In "test_util.py":
import imp
util = imp.load_source("util", "util.py")
This works great for many files. It also seems to imply that when pytest is running "path/test/test_foo.py" to test "path/foo.py", it is based in the directory "path".
However, this fails for "test_vector.py". Pytest can find and import the vector module, but it cannot locate any of vector's imports. The following imports (from "vector.py") both fail when using pytest:
from geom.region import *
from region import *
These both give errors of the form
ImportError: No module named [geom.region / region]
I don't know what to do next to solve this problem; my understanding of imports in Python is limited.
What is the proper way to handle imports when using pytest?
Edit: Extremely Hacky Solution
In vector.py, I changed the import statement from
from geom.region import Region
to simply
from region import Region
This makes the import relative to the directory of "vector.py".
Next, in "test/test_vector.py", I add the directory of "vector.py" to the path as follows:
import sys, os
sys.path.append(os.path.realpath(os.path.dirname(__file__)+"/.."))
This enables Python to find "../region.py" from "geom/test/test_vector.py".
This works, but it seems extremely problematic because I am adding a ton of new directories to the path. What I'm looking for is either
1) An import strategy that is compatible with pytest, or
2) An option in pytest that makes it compatible with my import strategy
So I am leaving this question open for answers of these kinds.
The issue here is that Pytest walks the filesystem to discover files that contain tests, but then needs to generate a module name that will cause import to load that file. (Remember, files are not modules.)
Pytest comes up with this test package name by finding the first directory at or above the level of the file that does not include an __init__.py file and declaring that to be the "basedir" for the module tree containing a module generated from this file. It then adds the basedir to sys.path and imports using the module name that will find that file relative to the basedir.
There are some implications of this of which you should beware:
The basepath may not match your intended basepath in which case the module will have a name that doesn't match what you would normally use. E.g., what you think of as geom.test.test_vector will actually be named just test_vector during the Pytest run because it found no __init__.py in src/geom/test/ and so added that directory to sys.path.
You may run into module naming collisions if two files in different directories have the same name. For example, lacking __init__.py files anywhere, adding geom/test/test_util.py will conflict with test/test_util.py because both are loaded as import test_util, with both test/ and geom/test/ in the path.
The system you're using here, without explicit __init__.py modules, is having Python create implicit namespace packages for your directories. (A package is a module with submodules.) Ideally we'd configure Pytest with a path from which it would also generate this, but it doesn't seem to know how to do that.
The easiest solution here is simply to add empty __init__.py files to all of the subdirectories under src/; this will cause Pytest to import everything using package/module names that start with directory names under src/.
The question How do I Pytest a project using PEP 420 namespace packages? discusses other solutions to this.
import looks in the following directories to find a module:
The home directory of the program. This is the directory of your root script. When you are running pytest your home directory is where it is installed (/usr/local/bin probably). No matter that you are running it from your src directory because the location of your pytest determines your home directory. That is the reason why it doesn't find the modules.
PYTHONPATH. This is an environment variable. You can set it from the command line of your operating system. On Linux/Unix systems you can do this by executing export PYTHONPATH=/your/custom/path. If you want Python to find your modules from the test directory, you should include the src path in this variable.
The standard libraries directory. This is the directory where all your libraries are installed.
There is a less common option using a pth file.
sys.path is the result of combining the home directory, PYTHONPATH and the standard libraries directory. What you are doing, modifying sys.path is correct. It is something I do regularly. You could try using PYTHONPATH if you don't like messing with sys.path
If you include an __init__.py file inside your tests directory, then when the program is looking to set a home directory it will walk 'upwards' until it finds one that does not contain an __init__.py file. In this case src/.
From here you can import by saying:
from geom.region import *
You must also make sure that you have an __init__.py file in any other subdirectories, such as the other nested test directory.
I was wondering what to do about this problem too. After reading this post, and playing around a bit, I figured out an elegant solution. I created a file called "test_setup.py" and put the following code in it:
import sys, os
sys.path.append(os.path.dirname(os.path.abspath(__file__)))
I put this file in the top-level directory (such as src). When pytest is run from the top-level directory, it will run all test files, including this one, since the file name is prefixed with "test". There are no tests in the file, but it is still executed because of that prefix.
The code will append the current directory name of the test_setup.py file to the system path within the test environment. This will be done only once, so there are not a bunch of things added to the path.
Then, from within any test function, you can import modules relative to that top-level folder (such as import geom.region) and it knows where to find it since the src directory was added to the path.
If you want to run a single test file (such as test_util.py) instead of all the files, you would use:
pytest test_setup.py test\test_util.py
This runs both the test_setup and test_util code so that the test_setup code can still be used.
I'm late to answer this question, but with Python 3.9 or 3.10 you just need to add an __init__.py file to the test folders.
When you add this file, Python interprets these folders as packages.
It would look like this:
src/
    main.py
    util.py
    test/
        __init__.py
        test_util.py
    geom/
        vector.py
        region.py
        test/
            __init__.py
            test_vector.py
            test_region.py
Then you just run pytest.
Sorry for my poor English.
Not the best solution, but maybe the fastest one:
cd path/python_folder
python -m pytest python_file.py
