I am trying to import a file from the main directory using python. I tried several methods but no luck :(.
My file directory
api/
    __init__.py
    api1.py
    api2.py
    api3.py
    api4.py
    api5.py
    api6.py
__init__.py
api.py
db_ops.py
Challenge
I am trying to import some methods of api.py into api1.py and api2.py.
The solutions I have found were:
1. Add __ini__.py file
I have added them as you can see, but it did not work.
2. import using `import api`
did not work; it throws **ModuleNotFoundError** instead
3. import using 'from . import api'
throws an error "cannot import from unknown module"
4. I even tried creating a parent folder **test** and then importing with import test.apicall, but it still did not work!
Any help?
Thanks
First of all, the file name is __init__.py, not __ini__.py.
By default, you can only import modules from the same level or below the directory where you run the python command.
example:
myproject/
    project/
        __init__.py
        foo.py
        bar.py
If the terminal is in myproject and I run python project/foo.py, foo can import bar.py; but if I am in myproject/project and I run python foo.py, foo cannot import bar.py.
To change this default you can use:
import sys
sys.path.append('path of my dir')
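For instance, here is a minimal sketch of that approach applied to the layout above; config.py is a hypothetical module sitting directly inside myproject/, one level above foo.py:

# project/foo.py -- sketch: make a module one directory up importable
import os
import sys

# add the parent directory (myproject/) to the module search path
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

import bar     # sibling module, found via the script's own directory
import config  # hypothetical myproject/config.py, found via the appended path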
It depends on what you want to do. If you want to run a script that sits in a lower-level folder and import from the package above it, Python will not let you do it directly. On the other hand, let's say for simplicity you have the folder structure
mypackage/
    subpackage/
        __init__.py
        api1.py
    __init__.py
    api.py
If api.py looks like the following
def foo(a):
    print(a)
then in api1.py you can have a relative import like
from ..api import foo as f
You can't, however, run api1.py as a script, but you can use f as a function to extend things in api1.py, or write a separate script with the content
from mypackage.subpackage.api1 import f
so in that case, it will look like the function foo was in api1.py.
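For illustration, a minimal sketch of such a script, placed next to the mypackage/ directory (the argument passed to f is just an example):

# run_api1.py -- lives beside mypackage/, not inside it
from mypackage.subpackage.api1 import f

f("hello")  # ultimately calls foo() defined in mypackage/api.py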
I'll note that this was heavily inspired by the Python documentation on imports... (https://docs.python.org/3/reference/import.html)
I have a directory structure like this.
Chatbot/
    utils/
        abc.py
    projects/
        proj1/
            utils/
                __init__.py
                data_process.py
            components/
                class1.py
I have two utils folders in my structure, one at the top level and one inside my projects folder.
Now I want to import the data_process.py file inside class1.py, so I tried this:
from utils.data_process import DataProcess
but it is referencing the top-level utils folder, and even VSCode does not recognize it. I tried creating an __init__.py file inside the utils folder, but it still did not work.
I tried with empty __init__.py and then placing this content
from . import data_process
__all__ = ['data_process']
then
from .data_process import DataPreprocess
__all__ = ['DataPreprocess']
then I tried
from ..utils.data_process import DataProcess
VSCode is recognizing this but it is not working and throws the error
ValueError: attempted relative import beyond top-level package
I even tried changing the name utils to some other name, but the issue stays the same.
How can I solve this?
A Python package is defined by a special file called __init__.py, and this file should be present in all the subdirectories you import from. So, your file structure would be:
Chatbot/
    __init__.py
    utils/
        __init__.py
        abc.py
    projects/
        __init__.py
        proj1/
            __init__.py
            utils/
                __init__.py
                data_process.py
            components/
                __init__.py
                class1.py
                class2.py
Now, you can do a relative import like:
Use . for importing something within the same directory. Example:
# file Chatbot/projects/proj1/components/class2.py
from .class1 import *
Similarly, use .. for two-level, and ... for three-level, and so on!
For instance:
# file Chatbot/projects/proj1/components/class2.py
from ..utils.data_process import * # import from data_process.py
# four dots: up from components to proj1, projects, then Chatbot
from ....utils.abc import * # import something from abc.py
Coding Convention
When writing a Python module/package, you might want to follow the PEP 8 convention.
Moreover, I believe you are trying to do different projects using your package Chatbot, is that correct? Then it is good practice to set PYTHONPATH for Chatbot and do all the projects, and imports, separately.
EDIT: Uploading Dummy File
I'm able to work on this project seamlessly even with using two utils directories. Project Structure:
main file and its output:
# file main.py
from chatbot.utils.abc import hello as a1
from chatbot.projects.proj1.components.class1 import super_hello
from chatbot.projects.proj1.utils.data_process import hello as b1
print(a1(), b1())
print(super_hello())
Similarly, if you have Chatbot under PYTHONPATH, you can call the project from anywhere on your device.
The code details and files are in my GitHub account.
sys.path is the list of locations Python searches for modules and packages, and it searches them in order.
So if you put ...../proj1/ at the beginning of the list, Python will find the utils folder in that path first when it starts searching!
import sys
sys.path.insert(0, r'...../proj1/')
But this causes another problem: Python will then always find the utils in that folder first, and that's not what you want.
Solution number 1
Absolute import:
from Chatbot.projects.proj1.utils.data_process import DataProcess
Solution number 2
Use relative imports, as mentioned by #Mr.Hobo; a sketch follows below.
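For reference, here is a sketch of what the relative form could look like inside class1.py; it assumes the __init__.py files shown above are in place and that class1 is loaded as part of the Chatbot package (for example via python -m from Chatbot's parent directory), and the class body is only illustrative:

# Chatbot/projects/proj1/components/class1.py
# two dots: up from components/ to proj1/, then into its utils package
from ..utils.data_process import DataProcess

class Class1:
    def __init__(self):
        self.processor = DataProcess()  # proj1's utils, not the top-level one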
Here's my directory structure:
parts.py
machine/
    __init__.py
    parts.py
I have a directory (a package) called machine.
In it there are __init__.py and parts.py.
At the same level as machine, there is a file named parts.py.
in parts.py, the code looks like this:
#parts.py
class Parts(object):
    pass
in machine/parts.py the code looks like this:
#machine.parts
from parts import Parts
class MachineParts(Parts):
    pass
When I try to import machine.parts, I get an import error. I don't want to change my directory structure. How should I fix this and retain good PEP8 style?
You should make it a package by adding a top-level __init__.py and giving a meaningful name to the top-level directory:
mypackage/
    __init__.py
    parts.py
    machine/
        __init__.py
        parts.py
Then, use absolute imports:
#machine.parts
from mypackage.parts import Parts
class MachineParts(Parts):
    pass
Since Python supports relative imports, try:
from ..parts import Parts
Another option is to use an absolute import:
from appname.parts import Parts
As mentioned in How to import a Python class that is in a directory above?
I have the following directory:
mydirectory
├── __init__.py
├── file1.py
└── file2.py
I have a function f defined in file1.py.
If, in file2.py, I do
from .file1 import f
I get the following error:
SystemError: Parent module '' not loaded, cannot perform relative import
Why? And how to make it work?
Launching modules inside a package as executables is a bad practice.
When you develop something, you either build a library, which is intended to be imported by other programs (so it doesn't make much sense to allow executing its submodules directly), or you build an executable, in which case there's no reason to make it part of a package.
This is why in setup.py you distinguish between packages and scripts. The packages will go under site-packages while the scripts will be installed under /usr/bin (or similar location depending on the OS).
My recommendation is thus to use the following layout:
/
├── mydirectory
|   ├── __init__.py
|   └── file1.py
└── file2.py
Where file2.py imports file1.py as any other code that wants to use the library mydirectory, with an absolute import:
from mydirectory.file1 import f
When you write a setup.py script for the project you simply list mydirectory as a package and file2.py as a script and everything will work. No need to fiddle with sys.path.
If you ever, for some reason, really want to actually run a submodule of a package, the proper way to do it is to use the -m switch:
python -m mydirectory.file1
This loads the whole package and then executes the module as a script, allowing the relative import to succeed.
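As a sketch, file2.py could look like this (the __main__ guard is an illustrative addition), and you would launch it from the directory that contains mydirectory/:

# mydirectory/file2.py
from .file1 import f  # succeeds because the package is loaded first

if __name__ == "__main__":
    # python -m mydirectory.file2
    f()  # call f however file1 defines it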
I'd personally avoid doing this, though, partly because a lot of people don't even know you can, and they will end up getting the same error as you and think that the package is broken.
Regarding the currently accepted answer, which says that you should just use an implicit relative import from file1 import f because it will work since they are in the same directory:
This is wrong!
It will not work in Python 3, where implicit relative imports are disallowed, and it will surely break if you happen to have a file1 module installed (since it will be imported instead of your module!).
Even if it works, file1 will not be seen as part of the mydirectory package. This can matter.
For example if file1 uses pickle, the name of the package is important for proper loading/unloading of data.
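As a small sketch of the pickle point (SomeClass is a hypothetical class defined in file1.py):

import pickle

from mydirectory.file1 import SomeClass  # hypothetical class

payload = pickle.dumps(SomeClass())
# pickle records the defining module, here "mydirectory.file1".
# If file1 had been imported as a bare top-level "file1" instead, the payload
# would reference "file1.SomeClass" and unpickling would fail anywhere that
# module is not importable under that exact name.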
When launching a Python source file directly, you cannot use a relative import to bring in another file from the same package.
The documentation says:
Note that relative imports are based on the name of the current module. Since the name of the main module is always "__main__", modules intended for use as the main module of a Python application must always use absolute imports.
So, as #mrKelley said, you need to use an absolute import in such a situation.
Since file1 and file2 are in the same directory, you don't even need to have an __init__.py file. If you're going to be scaling up, then leave it there.
To import something in a file in the same directory, just do like this
from file1 import f
i.e., you don't need to do the relative path .file1 because they are in the same directory.
If your main function, script, or whatever will be running the whole application is in another directory, then you will have to make everything relative to wherever it is being executed.
myproject/
├── mypackage/
│   ├── __init__.py
│   ├── file1.py
│   ├── file2.py
│   └── file3.py
└── mymainscript.py
Example of importing from one file to another:
#file1.py
from mypackage import file2
from mypackage.file3 import MyClass
Example of importing the package into the main script:
#mymainscript.py
import mypackage
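To make the bare import mypackage useful, the package's __init__.py can re-export what the main script needs; a sketch (re-exporting MyClass from file3, as an example):

# mypackage/__init__.py
from .file3 import MyClass

# mymainscript.py
import mypackage

obj = mypackage.MyClass()  # no need to know that MyClass lives in file3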
https://docs.python.org/3/tutorial/modules.html#packages
https://docs.python.org/3/reference/import.html#regular-packages
https://docs.python.org/3/reference/simple_stmts.html#the-import-statement
https://docs.python.org/3/glossary.html#term-import-path
The variable sys.path is a list of strings that determines the interpreter’s search path for modules. It is initialized to a default path taken from the environment variable PYTHONPATH, or from a built-in default if PYTHONPATH is not set. You can modify it using standard list operations:
import sys
sys.path.append('/ufs/guido/lib/python')
sys.path.insert(0, '/ufs/guido/myhaxxlib/python')
Inserting it at the beginning has the benefit of guaranteeing that the path is searched before others (even built-in ones) in the case of naming conflicts.
What is __init__.py for in a Python source directory?
It used to be a required part of a package (old, pre-3.3 "regular package", not newer 3.3+ "namespace package").
Here's the documentation.
Python defines two types of packages, regular packages and namespace packages. Regular packages are traditional packages as they existed in Python 3.2 and earlier. A regular package is typically implemented as a directory containing an __init__.py file. When a regular package is imported, this __init__.py file is implicitly executed, and the objects it defines are bound to names in the package’s namespace. The __init__.py file can contain the same Python code that any other module can contain, and Python will add some additional attributes to the module when it is imported.
But just click the link, it contains an example, more information, and an explanation of namespace packages, the kind of packages without __init__.py.
Files named __init__.py are used to mark directories on disk as Python package directories.
If you have the files
mydir/spam/__init__.py
mydir/spam/module.py
and mydir is on your path, you can import the code in module.py as
import spam.module
or
from spam import module
If you remove the __init__.py file, Python will no longer look for submodules inside that directory, so attempts to import the module will fail.
The __init__.py file is usually empty, but it can be used to export selected portions of the package under a more convenient name, hold convenience functions, etc.
Given the example above, the contents of the init module can be accessed as
import spam
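For example, a small sketch of that convenience, where greet is a hypothetical function defined in module.py:

# mydir/spam/__init__.py
from .module import greet  # re-export at package level

# client code, with mydir on sys.path
import spam

spam.greet("world")  # shorter than spam.module.greet("world")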
Based on this:
In addition to labeling a directory as a Python package and defining __all__, __init__.py allows you to define any variable at the package level. Doing so is often convenient if a package defines something that will be imported frequently, in an API-like fashion. This pattern promotes adherence to the Pythonic "flat is better than nested" philosophy.
An example
Here is an example from one of my projects, in which I frequently import a sessionmaker called Session to interact with my database. I wrote a "database" package with a few modules:
database/
__init__.py
schema.py
insertions.py
queries.py
My __init__.py contains the following code:
import os
from sqlalchemy.orm import sessionmaker
from sqlalchemy import create_engine
engine = create_engine(os.environ['DATABASE_URL'])
Session = sessionmaker(bind=engine)
Since I define Session here, I can start a new session using the syntax below. This code would be the same executed from inside or outside of the "database" package directory.
from database import Session
session = Session()
Of course, this is a small convenience -- the alternative would be to define Session in a new file like "create_session.py" in my database package, and start new sessions using:
from database.create_session import Session
session = Session()
Further reading
There is a pretty interesting reddit thread covering appropriate uses of __init__.py here:
http://www.reddit.com/r/Python/comments/1bbbwk/whats_your_opinion_on_what_to_include_in_init_py/
The majority opinion seems to be that __init__.py files should be very thin to avoid violating the "explicit is better than implicit" philosophy.
There are 2 main reasons for __init__.py
For convenience: other users will not need to know your functions' exact location in your package hierarchy (documentation).
your_package/
__init__.py
file1.py
file2.py
...
fileN.py
# in __init__.py
from .file1 import *
from .file2 import *
...
from .fileN import *
# in file1.py
def add():
    pass
then others can call add() by
from your_package import add
without needing to know that it lives inside file1, instead of
from your_package.file1 import add
If you want something to be initialized, for example logging (which should be put at the top level):
import logging.config
logging.config.dictConfig(Your_logging_config)
The __init__.py file makes Python treat the directory containing it as a package.
Furthermore, this is the first file to be loaded when the package is imported, so you can use it to execute code that you want to run each time the package is loaded, or to specify the submodules to be exported.
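A minimal sketch of both uses together (the package and module names here are illustrative):

# mypkg/__init__.py
print(f"initialising {__name__}")  # runs once, the first time mypkg is imported

# limit what "from mypkg import *" pulls in
__all__ = ["core", "helpers"]

# optionally surface a frequently used name at package level
from .core import run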
Since Python 3.3, __init__.py is no longer required to define directories as importable Python packages.
Check PEP 420: Implicit Namespace Packages:
Native support for package directories that don’t require __init__.py marker files and can automatically span multiple path segments (inspired by various third party approaches to namespace packages, as described in PEP 420)
Here's the test:
$ mkdir -p /tmp/test_init
$ touch /tmp/test_init/module.py /tmp/test_init/__init__.py
$ tree -at /tmp/test_init
/tmp/test_init
├── module.py
└── __init__.py
$ python3
>>> import sys
>>> sys.path.insert(0, '/tmp')
>>> from test_init import module
>>> import test_init.module
$ rm -f /tmp/test_init/__init__.py
$ tree -at /tmp/test_init
/tmp/test_init
└── module.py
$ python3
>>> import sys
>>> sys.path.insert(0, '/tmp')
>>> from test_init import module
>>> import test_init.module
references:
https://docs.python.org/3/whatsnew/3.3.html#pep-420-implicit-namespace-packages
https://www.python.org/dev/peps/pep-0420/
Is __init__.py not required for packages in Python 3?
Although Python works without an __init__.py file, you should still include one.
It specifies that the directory should be treated as a package, so include it (even if it is empty).
There is also a case where you may actually use an __init__.py file:
Imagine you had the following file structure:
main_methods
|- methods.py
And methods.py contained this:
def foo():
    return 'foo'
To use foo() you would need one of the following:
from main_methods.methods import foo # Call with foo()
from main_methods import methods # Call with methods.foo()
import main_methods.methods # Call with main_methods.methods.foo()
Maybe you need (or want) to keep methods.py inside main_methods (for runtimes/dependencies, for example) but you only want to import main_methods.
If you changed the name of methods.py to __init__.py then you could use foo() by just importing main_methods:
import main_methods
print(main_methods.foo()) # Prints 'foo'
This works because __init__.py is treated as part of the package.
Some Python packages actually do this. An example is with JSON, where running import json is actually importing __init__.py from the json package (see the package file structure here):
Source code: Lib/json/__init__.py
In Python the definition of a package is very simple. As in Java, the hierarchical structure and the directory structure are the same, but you have to have an __init__.py in a package. I will explain the __init__.py file with the example below:
package_x/
|-- __init__.py
|-- subPackage_a/
|------ __init__.py
|------ module_m1.py
|-- subPackage_b/
|------ __init__.py
|------ module_n1.py
|------ module_n2.py
|------ module_n3.py
__init__.py can be empty, as long as it exists. It indicates that the directory should be regarded as a package. Of course, __init__.py can also set the appropriate content.
If we add a function in module_n1:
def function_X():
    print("function_X in module_n1")
    return
After running:
>>> from package_x.subPackage_b.module_n1 import function_X
>>> function_X()
function_X in module_n1
So we followed the package hierarchy and called module_n1's function. Now suppose we use __init__.py in subPackage_b like this:
__all__ = ['module_n2', 'module_n3']
After running:
>>> from package_x.subPackage_b import *
>>> module_n1.function_X()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named module_n1
Hence, with * imports, which modules of the package get imported is controlled by the content of __init__.py (here, __all__).
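If you do want the * import to expose module_n1 as well, add it to __all__:

# subPackage_b/__init__.py
__all__ = ['module_n1', 'module_n2', 'module_n3']

# now this works:
#   from package_x.subPackage_b import *
#   module_n1.function_X()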
__init__.py makes the directory it is in loadable as a package.
For people who prefer reading code, I put Two-Bit Alchemist's comment here.
$ find /tmp/mydir/
/tmp/mydir/
/tmp/mydir//spam
/tmp/mydir//spam/__init__.py
/tmp/mydir//spam/module.py
$ cd ~
$ python
>>> import sys
>>> sys.path.insert(0, '/tmp/mydir')
>>> from spam import module
>>> module.myfun(3)
9
>>> exit()
$
$ rm /tmp/mydir/spam/__init__.py*
$
$ python
>>> import sys
>>> sys.path.insert(0, '/tmp/mydir')
>>> from spam import module
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named spam
>>>
It facilitates importing other Python files. When you place this file in a directory (say stuff) containing other .py files, you can do something like import stuff.other.
root\
    stuff\
        other.py
        morestuff\
            another.py
Without this __init__.py inside the directory stuff, you couldn't import other.py, because Python wouldn't know where the source code for stuff is and would be unable to recognize it as a package.
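With empty __init__.py files added to stuff and morestuff, and root on sys.path, the imports look like this (a small sketch):

import stuff.other                   # stuff/other.py
from stuff.morestuff import another  # stuff/morestuff/another.py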
An __init__.py file makes imports easy. When an __init__.py is present within a package, function a() can be imported from file b.py like so:
from b import a
Without it, however, you can't import directly. You have to amend the system path:
import sys
sys.path.insert(0, 'path/to')  # the directory that contains b.py
from b import a
One thing __init__.py allows is converting a module to a package without breaking the API or creating extraneous nested namespaces or private modules*. This helps when I want to extend a namespace.
If I have a file util.py containing
def foo():
...
then users will access foo with
from util import foo
If I then want to add utility functions for database interaction, and I want them to have their own namespace under util, I'll need a new directory**, and to keep API compatibility (so that from util import foo still works), I'll call it util/. I could move util.py into util/ like so,
util/
__init__.py
util.py
db.py
and in util/__init__.py do
from .util import *
but this is redundant. Instead of having a util/util.py file, we can just put the util.py contents in __init__.py and the user can now
from util import foo
from util.db import check_schema
I think this nicely highlights how a util package's __init__.py acts in a similar way to a util module
* this is hinted at in the other answers, but I want to highlight it here
** short of employing import gymnastics. Note it won't work to create a new package with the same name as the file, see this
If you're using Python 2 and want to load siblings of your file, you can simply add the parent folder of your file to the session's system path. It will behave about the same as if your current file were an init file.
import os
import sys
dir_path = os.path.dirname(__file__)
sys.path.insert(0, dir_path)
After that regular imports relative to the file's directory will work just fine. E.g.
import cheese
from vehicle_parts import *
# etc.
Generally you want to use a proper __init__.py file instead, but when dealing with legacy code you might be stuck with, for example, a library hard-coded to load a particular file and nothing else. For those cases this is an alternative.
__init__.py: It is a Python file found in a package directory; it is invoked when the package, or a module in the package, is imported. You can use it to execute package initialization code, i.e. whenever the package is imported these Python statements are executed first, before the other modules in the folder. It is similar to the main function of a C or Java program, but it lives in the Python package (folder) rather than in a core Python file.
It also gives access to the global variables defined in this __init__.py file when the package is imported.
For example:
I have a __init__.py file in a folder called pymodlib, this file contains the following statements:
print(f'Invoking __init__.py for {__name__}')
pystructures = ['for_loop', 'while__loop', 'ifCondition']
When I import this package pymodlib in my solution module or notebook or python console:
These two statements get executed while importing.
So in the log or console you would see the following output:
>>> import pymodlib
Invoking __init__.py for pymodlib
In the next statement in the Python console, I can access the global variable:
>>> pymodlib.pystructures
It gives the following output:
['for_loop', 'while__loop', 'ifCondition']
Now, from Python 3.3 onwards, this file is optional for making a folder a Python package, so you can skip including it in the package folder.