I have a project which uses the MVC pattern.
In the "models" folder I have quite a few classes, and each class currently has its own file. But I find this inconvenient, because every time I need a class I have to import it separately. E.g. I have many lines like the following in my app source:
from models.classX import classX
from models.classY import classY
If I want to import everything at once with something like from models import *, I found that I can put all sorts of imports in models/__init__.py. But is that the pythonic way to do it? What is the convention?
Python is not Java; please avoid the one-file-per-class pattern. If you can't change it, you can import all of them from a submodule of your models package:
# all.py: convenient import of all the needed classes
from models.classX import classX
from models.classY import classY
...
Then in your code you can write:
import my.package.models.all as models # or from my.package.models.all import *
and proceed to use models.classX, models.classY, etc.
The most pythonic way is the one you're already using. You can reduce the amount of importing by grouping your classes into modules. For example, in Django all of an application's models usually live in a single file.
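For example, a minimal sketch of one models.py holding several related classes (the class names and bodies here are placeholders, not from the question):
# models.py - one module grouping several related model classes
class ClassX:
    def __init__(self, name):
        self.name = name

class ClassY:
    def __init__(self, value):
        self.value = value

# elsewhere in the app, a single import line covers both:
# from models import ClassX, ClassY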
From the Python docs:
Although certain modules are designed to export only names that follow certain patterns when you use import *, it is still considered bad practice in production code.
Firstly, you should rename your classes and modules so that their names don't clash, following PEP 8:
models/
    classx.py
        class ClassX
    classy.py
        class ClassY
Then, I'd go with this in models/__init__.py:
from models.classx import ClassX
from models.classy import ClassY
Meaning in your main code, you can do any one of:
from models import *
x = ClassX()

from models import ClassX
x = ClassX()

import models
x = models.ClassX()
I would love to use a schema that looks something like the following in FastAPI:
from __future__ import annotations
from typing import List
from pydantic import BaseModel
class Project(BaseModel):
    members: List[User]

class User(BaseModel):
    projects: List[Project]

Project.update_forward_refs()
but in order to keep my project structure clean, I would of course like to define these in separate files. How could I do this without creating a circular reference?
With the code above the schema generation in FastAPI works fine, I just don't know how to separate it out into separate files. In a later step, instead of using attributes, I would use @property getters for these objects in subclasses of them. But for the OpenAPI doc generation, I need this combined - I think.
There are three cases in which a circular dependency may work in Python:
Top of module: import package.module
Bottom of module: from package.module import attribute
Top of a function: both forms work
In your situation, the second case, "bottom of module", will help, because you need to call the update_forward_refs function to resolve pydantic's postponed annotations, like this:
# project.py
from typing import List
from pydantic import BaseModel

class Project(BaseModel):
    members: "List[User]"

from user import User
Project.update_forward_refs()

# user.py
from typing import List
from pydantic import BaseModel

class User(BaseModel):
    projects: "List[Project]"

from project import Project
User.update_forward_refs()
Nonetheless, I would strongly discourage you from intentionally introducing circular dependencies.
Just put all your schema imports at the bottom of the file, after all classes, and call update_forward_refs().
#1/4
from __future__ import annotations  # this is important to have at the top
from pydantic import BaseModel

#2/4
class A(BaseModel):
    my_x: X  # a pydantic schema from another file

class B(BaseModel):
    my_y: Y  # a pydantic schema from another file

class C(BaseModel):
    my_z: int

#3/4
from myapp.schemas.x import X  # related schemas we import after all classes
from myapp.schemas.y import Y

#4/4
A.update_forward_refs()  # tell the system that A has a related pydantic schema
B.update_forward_refs()  # tell the system that B has a related pydantic schema
# for C we don't need it, because C has just an integer field.
NOTE:
Do this in every file that has schema imports.
That will enable you to make any combination without circular import problems.
NOTE 2:
People usually put the imports and update_forward_refs() after every class, and then report that it doesn't work. That is usually because, in a complex app, you do not know which import triggers which class and when. If you put them at the bottom, you can be sure that every class has already been defined and is visible to the others.
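As a hedged sketch of what "every file" means here, a counterpart schema file could follow the same pattern; the module path myapp/schemas/a.py for the file above and the back-reference from X to A are assumptions for illustration:
# myapp/schemas/x.py - hypothetical counterpart file using the same bottom-import pattern
from __future__ import annotations
from typing import List
from pydantic import BaseModel

class X(BaseModel):
    my_a_items: List[A]  # refers to a schema defined in the other file

# imports and update_forward_refs() again at the bottom of this file
from myapp.schemas.a import A  # assumed module path of the file that defines A, B, C
X.update_forward_refs()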
To me, the other answers don't seem to solve this satisfactorily, because they ignore the locals of the modules. Here is a straightforward way to make this work across separate files:
user.py
from typing import TYPE_CHECKING, List
from pydantic import BaseModel
if TYPE_CHECKING:
    from project import Project

class User(BaseModel):
    projects: List['Project']
project.py
from typing import TYPE_CHECKING, List
from pydantic import BaseModel
if TYPE_CHECKING:
    from user import User

class Project(BaseModel):
    members: List['User']
main.py
from project import Project
from user import User
# Update the references that were given as strings
Project.update_forward_refs(User=User)
User.update_forward_refs(Project=Project)

# Example: Projects nested into a User and Users nested into a Project
Project(
    members=[
        User(
            projects=[
                Project(members=[])
            ]
        )
    ]
)
This works if you run main.py. If you are building a package, you may put that content in an __init__.py file that is high enough in the structure not to have the circular import problem.
Note how we passed User=User and Project=Project to update_forward_refs. This is because the module scopes where these classes live don't have references to each other (if they did, there would be a circular import). Therefore we pass them in main.py when updating the references, as there we don't have the circular import problem.
NOTE: About type checking
If the if TYPE_CHECKING: pattern is unfamiliar, these are basically if blocks that are never True at runtime (when running your code) but are used by code analysis tools (IDEs) to resolve the types. The blocks are not needed for the example to work, but they are highly recommended: without them it's hard to read the code, find out where these classes are actually defined, and fully utilize code analysis tools.
If I want to split the models/schemas into separate files, I create extra files for a ProjectBase model and a UserBase model, so that the Project model and the User model can inherit from them. I do it like this:
# project_base.py
from pydantic import BaseModel

class ProjectBase(BaseModel):
    id: int
    title: str

    class Config:
        orm_mode = True

# user_base.py
from pydantic import BaseModel

class UserBase(BaseModel):
    id: int
    title: str

    class Config:
        orm_mode = True

# project.py
from typing import List
from .project_base import ProjectBase
from .user_base import UserBase

class Project(ProjectBase):
    members: List[UserBase] = []

# user.py
from typing import List
from .project_base import ProjectBase
from .user_base import UserBase

class User(UserBase):
    projects: List[ProjectBase] = []
Note: for this method, orm_mode must be set in ProjectBase and UserBase, so that Project and User can be read from an object even if it is not a dict.
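As a hedged illustration of why orm_mode matters here, Project can then be populated from any object with matching attributes via from_orm (the SimpleNamespace stand-in and its values below are assumptions for the example, not part of the original answer):
# sketch: reading a Project from a non-dict object (pydantic v1 orm_mode)
from types import SimpleNamespace
from .project import Project  # the Project model defined above

db_project = SimpleNamespace(id=1, title="Demo", members=[])  # stand-in for an ORM row
project = Project.from_orm(db_project)  # works because orm_mode=True is inherited from ProjectBase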
I am working on a Django project. In views.py I invoked a function from custom.py. The thing is, it works fine, but I didn't write anything like:
from .custom import *
or
import custom
In short, the word 'custom' does not appear anywhere in views.py.
Why is it working? Is there some other way of importing a module that I don't know about? What is it?
Or is it some kind of Django trick?
Addition:
custom.py also does not import views.py.
custom.py does import models.py.
It's possible that your views.py file is importing from another module which imports the function you are using.
models.py
from .custom import my_function
views.py
from .models import *
# my_function will be available
This happens because the import * statement imports all names, including functions and classes that have already been imported into that module. There is some information about it in the Python docs.
You can avoid this by only importing the specific functions and classes you need, e.g.:
views.py
from .models import PersonModel, ProductModel
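Another option, if you do want to keep using import *, is to define __all__ in models.py; __all__ is the standard Python mechanism for controlling which names a star import exposes. A minimal sketch, reusing the hypothetical model names from above:
# models.py
from django.db import models
from .custom import my_function

__all__ = ['PersonModel', 'ProductModel']  # my_function no longer leaks via "from .models import *"

class PersonModel(models.Model):
    name = models.CharField(max_length=100)

class ProductModel(models.Model):
    title = models.CharField(max_length=100)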
The code is not working due to an error in assigning the import path properly. Please help me with that.
Try this in your models:
# Remove the import statement: from blog.models import sighinmodel
# Then, inside your model:
user = models.ForeignKey('blog.sighinmodel', on_delete=None)
Also, I would like to point out that this is not the correct way of importing other modules in your models.py.
You should do it like this:
from appname.models import ModelName
# for importing from another app's models
There is no need for relative path names in import statements in Django. from appname.module import function/class works fine for nearly all cases until a circular import occurs, in which case you have to take one of several approaches. One is the way I mentioned above:
Method 1: Simply put this inside the ModelClass. Don't import anything.
user = models.ForeignKey('blog.sighinmodel' , on_delete = None)
Method 2 (when no circular import occurs):
from blog.models import sighinmodel

class SomeModel(models.Model):
    user = models.ForeignKey(sighinmodel, on_delete=None)
NOTE: The above will only work if no circular import occurs. If a circular import does occur, switch back to the first method of declaration.
Hope this helps. Thanks.
This error occurs because relative imports are not allowed beyond the top-level package. Your blog app is itself a package, so if you import your model from there it will work.
from blog.models import User, sighinmodel
I would also suggest using CamelCase for your model names, since they are classes, to follow naming conventions.
Is it possible to include libraries/packages in only one location?
class Sample(db.Model):
    randomText = db.StringProperty(multiline=True)
--
from google.appengine.ext import db
from project.models import Sample

class SampleHandler():
    def get(self):
        xamp = Sample.all()
Since the handler already imports db from the google.appengine.ext library/package, and then imports the model, I'd assume I don't have to import it again in the model itself. However, it looks like I have to anyway?
Anyone care to explain?
You need to import modules where they are used.
If your models module uses the google.appengine.ext.db module, you need to import it there, not in your handler module.
Importing things creates a reference to that 'thing' in your module namespace, so that the code there can find it when using it. db is the local name by which you get to use the object defined in google.appengine.ext.
If your handler uses the same object, it still needs to import it. If importing models made all the names used by models suddenly available in your handler module too, you'd end up with name conflicts and hard-to-debug errors all over the place.
Vice versa, if importing google.appengine.ext.db only in your handler module and not in your models module were to work, you'd need to import all the dependencies of a given module together with the module itself. This quickly becomes unworkable, as you'd need to document all the things your models module requires just to be able to use it.
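A minimal sketch of the point, reusing the files from the question: each module imports exactly what it itself uses.
# models.py - uses db, so it imports db
from google.appengine.ext import db

class Sample(db.Model):
    randomText = db.StringProperty(multiline=True)

# handler.py - uses only Sample, so it imports only Sample
from project.models import Sample  # db would only be imported here if the handler used db directly

class SampleHandler():
    def get(self):
        xamp = Sample.all()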
I am looking for what type of code I would put in __init__.py files and what the best practices related to this are. Or is it a bad practice in general?
Any reference to known documents that explain this is also very much appreciated.
Libraries and frameworks usually use initialization code in __init__.py files to neatly hide internal structure and provide a uniform interface to the user.
Let's take the example of Django's forms module. Various functions and classes in the forms module are defined in different files based on their classification.
forms/
    __init__.py
    extras/
        ...
    fields.py
    forms.py
    widgets.py
    ...
Now if you were to create a form, you would have to know in which file each function is defined, and your code to create a form would have to look something like this (which is inconvenient and ugly):
class CommentForm(forms.forms.Form):
    name = forms.fields.CharField()
    url = forms.fields.URLField()
    comment = forms.fields.CharField(widget=forms.widgets.Textarea)
Instead, in Django you can just refer to various widgets, forms, fields etc. directly from the forms namespace.
from django import forms

class CommentForm(forms.Form):
    name = forms.CharField()
    url = forms.URLField()
    comment = forms.CharField(widget=forms.Textarea)
How is this possible? To make this possible, Django adds the following statements to the forms/__init__.py file, which import all the widgets, forms, fields etc. into the forms namespace.
from widgets import *
from fields import *
from forms import *
from models import *
As you can see, this simplifies your life when creating forms, because now you don't have to worry about where each function/class is defined and can just use all of them directly from the forms namespace. This is just one example, but you can see examples like these in other frameworks and libraries.
One of the best practices in that area is to import all the needed classes in your library's __init__.py (look at mongoengine, for example). Then a user of your library can do this:
from coollibrary import OneClass, SecondClass
instead of
from coollibrary.package import OneClass
from coollibrary.anotherpackage import SecondClass
Also, it is good practice to include a version constant in __init__.py.
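For example, a minimal __init__.py along these lines (the package and module names reuse the hypothetical ones above):
# coollibrary/__init__.py
__version__ = '1.0.0'

from coollibrary.package import OneClass
from coollibrary.anotherpackage import SecondClass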
For convenience: other users will not need to know your functions' exact locations.
your_package/
    __init__.py
    file1.py
    file2.py
    ...
    fileN.py
# in __init__.py
from .file1 import *
from .file2 import *
...
from .fileN import *

# in file1.py
def add():
    pass
then others can call add() by
from your_package import add
without needing to know about file1, instead of:
from your_package.file1 import add
Put initialization code there. For example, logging configuration (this should go in the top-level __init__.py):
import logging.config
logging.config.dictConfig(Your_logging_config)
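A minimal sketch of what such a config might look like (the formatter, handler, and level values are just assumptions for illustration):
# your_package/__init__.py
import logging.config

LOGGING_CONFIG = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'simple': {'format': '%(asctime)s %(levelname)s %(name)s: %(message)s'},
    },
    'handlers': {
        'console': {'class': 'logging.StreamHandler', 'formatter': 'simple'},
    },
    'root': {'handlers': ['console'], 'level': 'INFO'},
}

logging.config.dictConfig(LOGGING_CONFIG)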