Python-Django: module import syntax conflicting

I hope this is not a duplicate, I couldn't find any other answer.
Going straight to the point, my problem is as follows.
I have a project in Django where django-apps use external custom modules.
This is the structure:
Project_dir/
- core/
  - module_1.py
  - module_2.py
- django_project/
  - __init__.py
  - settings.py
  - urls.py
  - wsgi.py
- django_app_A/
  - views.py
- manage.py
The problem is that I need to import some classes and methods of module_2 in module_1, which I would do in module_1 simply with
from module_2 import foo
When I run module_1 for testing, everything works fine. Nonetheless, I need to import module_1 in django_app_A/views.py, and I would do so by
from core.module_1 import Bar
Here's the problem: if module_1 itself contains another plain import like the one above (as it does), I will get a
ModuleNotFoundError: No module named 'module_2'
UNLESS I use in module_1 the syntax
from .module_2 import foo
By doing so, the Django app works fine and the page loads properly, but at the same time I "break" module_1, as I can't run it stand-alone anymore: I will get a
ModuleNotFoundError: No module named '__main__.module_2'
I have no idea how to fix this conflict and make both import syntax work at the same time.
Any clues? Am I missing something?
Thanks in advance

You should use absolute imports as much as you can.
from core.module_2 import foo
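In this layout, that might look like the following (a minimal sketch; the Bar class body and the __main__ guard are assumptions based on the question, and core/ is assumed to be importable as a package from Project_dir):

# Project_dir/core/module_1.py
from core.module_2 import foo   # absolute import, resolved from Project_dir

class Bar:
    def run(self):
        return foo()

if __name__ == "__main__":
    print(Bar().run())

Django resolves core.module_1 because manage.py puts Project_dir on sys.path. To keep the stand-alone test working too, run the file as a module from Project_dir instead of as a script:

python -m core.module_1

With -m, Project_dir stays on sys.path, so the same absolute import works both under Django and stand-alone.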

I can't be sure, but it sounds like a circular import problem to me.
Do you need the import to be on the "main" level? If you import module 2 inside of a class or function, simply write
def function_in_question():
    import module_1
    return module_1.whatever()
Another thing to look for: are you using it in a way that is actually circular? A function in module_2 using a function in module_1 that uses the function in module_2?
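For illustration, a hypothetical pair of modules that really is circular (all names made up):

# core/module_1.py
from core.module_2 import greet_twice   # module_1 needs module_2 at import time...

def greeting():
    return "hello"

# core/module_2.py
from core.module_1 import greeting      # ...and module_2 needs module_1: a cycle

def greet_twice():
    return greeting() + " " + greeting()

Importing either module fails with an ImportError about a partially initialized module, because each file asks the other for a name before that other file has finished executing. Moving one of the two imports inside a function body, as in the snippet above, breaks the cycle.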

Related

Importing classes defined in the same module

I am having trouble using the classes I've defined in a module. I've looked at this Stack Overflow post, and the answer seems to be "you don't need imports." This is definitely not the behavior I'm experiencing. I'm using Python 3.3. Here is my directory structure:
root/
    __init__.py
    mlp/
        __init__.py
        mlp.py
        layers/
            __init__.py
            hidden_layer.py
            dropout_layer.py
My problem is this: the class defined in dropout_layer.py extends the class in hidden_layer.py, but when I try to import hidden_layer, I sometimes get an error depending on the directory I execute my code from. For instance, from layers.hidden_layer import HiddenLayer works when I run my code from root/mlp, but it does not work if I execute my code from root. This is strange behavior to me. How can I get this working correctly?
My only non-empty __init__.py file is in root/mlp/layers/:
# root/mlp/layers/__init__.py
__all__ = ['hidden_layer', 'dropout_layer']
In Python 3 you can prepend a . for an import relative to the location of the current module:
from .hidden_layer import HiddenLayer
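In this layout that would look roughly like this (a sketch; the DropoutLayer class name is an assumption):

# root/mlp/layers/dropout_layer.py
from .hidden_layer import HiddenLayer   # relative to the mlp.layers package

class DropoutLayer(HiddenLayer):
    ...

Relative imports are resolved against the package the file belongs to, not the working directory, so this keeps working no matter where you launch your code from, as long as the file is loaded as part of the mlp.layers package (e.g. via python -m from root) rather than executed as a loose script.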

Django import module: Unable to import module by name

I have sub-directories inside my Django app folder, and in each I was trying to import a module. The issue I am having is that I am able to import a module by using * but not by name, which produces the error "Exception Value: cannot import name [my module]".
from foo import Bar # throws error
from foo import * # works
I don't know if I am missing anything in my settings.py, but I have definitely included the app directory in my INSTALLED_APPS and I have an __init__.py in each directory. I also checked that my app folder is included in the Python path, and it was.
Any help will be appreciated. Thanks in advance
I expect you are thinking in terms of Java. In Python, you import things by module, not class name. So if a directory foo contains a file bar.py which defines a class Bar, you must do from foo.bar import Bar, not from foo import Bar.
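A minimal illustration of the difference (file and class names taken from the question):

# foo/bar.py
class Bar:
    pass

# somewhere else in the project
from foo.bar import Bar    # works: package.module, then the name defined in it
# from foo import Bar      # fails, unless foo/__init__.py re-exports Bar itself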
After looking over my directories I found out that I had a file with the same name as my app, Foo. After removing it, everything started working.
Foo/
- Bar.py
Process/
- Foo.py # deleted this
- views.py
from Foo import Bar #Foo.py was overriding the app/Foo that I was trying to call
Thanks for all your responses!
If this is in Django, try doing
from . import form
from . import models
from . import views
This should solve the issue.
Apart from that, first make sure you have an __init__.py file in the directory.

Lazy loading module imports in an __init__.py file python

I was wondering if anyone had any suggestions for lazy loading imports in an __init__.py file? I currently have the following folder structure:
/mypackage
    __init__.py
    /core
        __init__.py
        mymodule.py
        mymodule2.py
The __init__.py file in the core folder has the following imports:
from mymodule import MyModule
from mymodule2 import MyModule2
This way I can just do:
from mypackage.core import MyModule, MyModule2
However, in the package __init__.py file, I have another import:
from core.exc import MyModuleException
This has the effect that whenever I import my package in Python, MyModule and MyModule2 get imported by default because the core __init__.py file has already been run.
What I want to do, is only import these modules when the following code is run and not before:
from mypackage.core import MyModule, MyModule2
Any ideas?
Many thanks.
Unless I'm mistaking your intentions, this is actually possible but requires some magic.
Basically, subclass types.ModuleType and override __getattr__ to import on demand.
Check out the Werkzeug __init__.py for an example.
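A minimal sketch of that idea (the mapping and names below are made up, and the real Werkzeug code differs):

# mypackage/core/__init__.py
import importlib
import sys
import types

class LazyCore(types.ModuleType):
    # hypothetical mapping: attribute name -> (submodule, attribute inside it)
    _lazy = {
        'MyModule': ('mypackage.core.mymodule', 'MyModule'),
        'MyModule2': ('mypackage.core.mymodule2', 'MyModule2'),
    }

    def __getattr__(self, name):
        # called only when normal attribute lookup fails
        if name in self._lazy:
            module_name, attr = self._lazy[name]
            value = getattr(importlib.import_module(module_name), attr)
            setattr(self, name, value)   # cache so later lookups skip __getattr__
            return value
        raise AttributeError(name)

# swap the real module object in sys.modules for the lazy wrapper
_lazy_module = LazyCore(__name__)
_lazy_module.__dict__.update(sys.modules[__name__].__dict__)
sys.modules[__name__] = _lazy_module

After this, from mypackage.core import MyModule only imports mypackage.core.mymodule the first time the name is actually looked up.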
You can't. Remember that when Python imports a module it executes its code. The module itself doesn't know how it is being imported, hence it cannot know whether it has to import MyModule(2) or not.
You have to choose: either allow from mypackage.core import A, B, in which case from core.exc import E also triggers the unneeded imports, or do not import A and B in core/__init__.py, which means from mypackage.core import A, B no longer works.
Note: Personally I would not import MyModule(2) in core/__init__.py, but I'd add an all.py module that does this, so the user can do from mypackage.core.all import A, B
and still have from mypackage.core.exc import TheException not loading the unnecessary classes.
(Actually: the all module could even modify mypackage.core and add the classes to it, so that following imports of the kind from mypackage.core import MyModule, MyModule2 work, but I think this would be quite obscure and should be avoided).
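The suggested all.py would just be a thin aggregator, something like:

# mypackage/core/all.py
from mypackage.core.mymodule import MyModule
from mypackage.core.mymodule2 import MyModule2

so that from mypackage.core.all import MyModule, MyModule2 loads everything, while from mypackage.core.exc import MyModuleException stays cheap.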
If your module structure is like:
/mypackage
    __init__.py
    /core
        __init__.py
        MyModule.py
        MyModule2.py
or:
/mypackage
    __init__.py
    /core
        __init__.py
        /MyModule
            __init__.py
        /MyModule2
            __init__.py
then feel free to use
from mypackage.core import MyModule, MyModule2
without importing them in __init__.py under mypackage/core
Not sure if it applies here but in general lazy loading of modules can be done using the Importing package.
Works like this:
from peak.util.imports import lazyModule
my_module = lazyModule('my_module')
Now my module is only really imported when you use it the first time.
You may use the following code in the module's __init__.py:
import apipkg
apipkg.initpkg(__name__, {
    'org': {
        'Class1': "secure._mypkg:Class1",
        'Class2': "secure._mypkg2:Class2",
    }
})
I realize that this question was posted a very long time ago and since then there have been some helpful updates for lazy loading of submodules. So for anyone else looking for how to solve this, we have great options available now.
Specifically PEP 562
Here is an excerpt from that article:
Another widespread use case for __getattr__ would be lazy submodule imports. Consider a simple example:
# lib/__init__.py
import importlib

__all__ = ['submod', ...]

def __getattr__(name):
    if name in __all__:
        return importlib.import_module("." + name, __name__)
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")

# lib/submod.py
print("Submodule loaded")

class HeavyClass:
    ...

# main.py
import lib
lib.submod.HeavyClass  # prints "Submodule loaded"

Relative imports from __init__ in multi-file Django apps

I have a Django project located at /var/django/project/ where /var/django/ is in the PATH
within that project I have:
__init__.py
manage.py
utils/
    __init__.py
    tools.py
utils/__init__.py contains a function named get_preview
utils/tools.py contains a function named get_related
How can utils/__init__.py import get_related from utils/tools.py?
How can utils/tools.py import get_preview from utils/__init__.py?
I have tried relative imports as well as static imports but seem to get an error in tools.py when I try to from project.utils import get_preview
Yeah, this is bad structure. You gotta watch out here with creating a circular import between the two files.
About circular imports.
You can't (and shouldn't). You are structuring your code very poorly if files in your module are referencing code in the __init__.py associated with it. Either move both functions into __init__.py or both of them out of __init__.py or put them into separate modules. Those are your only options.
You can do it, you just need to make one of the imports happen at runtime to avoid the circular import.
For example, __init__.py:
from project.utils.tools import get_related

def get_preview():
    # ...

and tools.py:

def get_related():
    from project.utils import get_preview
    # ...
    get_preview()

How do I make these relative imports work in Python 3?

I have a directory structure that looks like this:
project/
    __init__.py
    foo/
        __init__.py
        first.py
        second.py
        third.py
    plum.py
In project/foo/__init__.py I import classes from first.py, second.py and third.py and put them in __all__.
There's a class in first.py named WonderfulThing which I'd like to use in second.py, and want to import by importing * from foo. (It's outside of the scope of this question why I'd like to do so, assume I have a good reason.)
In second.py I've tried from .foo import *, from foo import * and from . import * and in none of these cases is WonderfulThing imported. I also tried from ..foo import *, which raises an error "Attempted relative import beyond toplevel package".
I've read the docs and the PEP, and I can't work out how to make this work. Any assistance would be appreciated.
Clarification/Edit: It seems like I may have been misunderstanding the way __all__ works in packages. I was using it the same as in modules,
from .first import WonderfulThing
__all__ = [ "WonderfulThing" ]
but looking at the docs again it seems to suggest that __all__ may only be used in packages to specify the names of modules to be imported by default; there doesn't seem to be any way to include anything that's not a module.
Is this correct?
A non-wildcard import failed (cannot import name WonderfulThing). Trying from . import foo failed, but import foo works. Unfortunately, dir(foo) shows nothing.
Edit: I did misunderstand the question. No, __all__ is not restricted to just modules.
One question is why you want to do a relative import. There is nothing wrong with doing from project.foo import *, here. Secondly, the __all__ restriction on foo won't prevent you from doing from project.foo.first import WonderfulThing, or just from .first import WonderfulThing, which still will be the best way.
And if you really want to import a lot of things, it's probably best to do from project import foo, and then use them as foo.WonderfulThing, instead of doing an import * and then using WonderfulThing directly.
However to answer your direct question, to import from the __init__ file in second.py you do this:
from . import WonderfulThing
or
from . import *
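Concretely, with the layout from the question (a sketch; the AnotherThing class is made up):

# project/foo/__init__.py
from .first import WonderfulThing

__all__ = ["WonderfulThing"]

# project/foo/second.py
from . import WonderfulThing    # picks up the name re-exported by foo/__init__.py

class AnotherThing(WonderfulThing):
    pass

If foo/__init__.py also imports names from second.py, keep the from .first import line above that import, so WonderfulThing is already bound by the time second.py runs.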
