using isinstance on a pydantic model - python

I am expecting multiple data types as input to a function and want to take a specific action if it's a pydantic model (by "pydantic model" I mean a class like class StartReturnModel(BaseModel)).
In the case of a model instance I can check it using isinstance(model, StartReturnModel) or isinstance(model, BaseModel) to identify that it's a pydantic model instance.
Based on the test program below, I can see that type(StartReturnModel) returns ModelMetaclass. Can I use this to identify a pydantic model, or is there a better way to do it?
from pydantic import BaseModel
from pydantic.main import ModelMetaclass
from typing import Optional

class StartReturnModel(BaseModel):
    result: bool
    pid: Optional[int]

print(type(StartReturnModel))
print(f"is base model: {isinstance(StartReturnModel, BaseModel)}")
print(f"is meta model: {isinstance(StartReturnModel, ModelMetaclass)}")

res = StartReturnModel(result=True, pid=500045)
print(f"\n{type(res)}")
print(f"is start model(res): {isinstance(res, StartReturnModel)}")
print(f"is base model(res): {isinstance(res, BaseModel)}")
print(f"is meta model(res): {isinstance(res, ModelMetaclass)}")
Output:
<class 'pydantic.main.ModelMetaclass'>
is base model: False
is meta model: True
<class '__main__.StartReturnModel'>
is start model(res): True
is base model(res): True
is meta model(res): False

Yes, you can use it, but why not just use isinstance and issubclass?

After you expanded a bit in the comment thread, it is clear that you have a fundamental gap in your understanding of Python classes and metaclasses. These topics have been discussed at length on SO, so I'll just refer you to the search function for details, but the short answer to your particular question is this:
from pydantic import BaseModel
from pydantic.main import ModelMetaclass

class MyModel(BaseModel):
    x: int
    y: str

obj = MyModel(x=1, y="a")
cls = MyModel

print(f"{isinstance(obj, BaseModel)=}")
print(f"{isinstance(obj, MyModel)=}")
print(f"{issubclass(cls, BaseModel)=}")
print(f"{issubclass(cls, MyModel)=}")
print(f"{cls is MyModel=}")
print(f"{isinstance(cls, ModelMetaclass)=}")  # just to illustrate
print(f"{isinstance(cls, type)=}")  # just to illustrate
Output:
isinstance(obj, BaseModel)=True
isinstance(obj, MyModel)=True
issubclass(cls, BaseModel)=True
issubclass(cls, MyModel)=True
cls is MyModel=True
isinstance(cls, ModelMetaclass)=True
isinstance(cls, type)=True
You should avoid using pydantic.main.ModelMetaclass because, as you correctly noted, it is currently not (fully) exposed publicly. And as you can see from the code above, there is simply no need for you to deal with it.
If you have a function that is supposed to handle both instances of a model class and specific classes that inherit from BaseModel, that could look like this:
from typing import Union

from pydantic import BaseModel

def do_stuff(obj_or_cls: Union[BaseModel, type[BaseModel]]) -> None:
    if isinstance(obj_or_cls, BaseModel):
        print(f"Got an instance of the model `{obj_or_cls.__class__.__name__}`")
    elif isinstance(obj_or_cls, type) and issubclass(obj_or_cls, BaseModel):
        print(f"Got a model subclass called `{obj_or_cls.__name__}`")
    else:
        raise TypeError

class MyModel(BaseModel):
    x: int
    y: str

obj = MyModel(x=1, y="a")
cls = MyModel

do_stuff(obj)
do_stuff(cls)
Output:
Got an instance of the model `MyModel`
Got a model subclass called `MyModel`

Related

Pydantic Recursive Models

I am referencing the answer on another Stack Overflow post about using typing.Literal to restrict a field to a specific set of strings for validation with Pydantic, but I am running into a problem when calling another class recursively.
This is what my code looks like:
from pydantic import BaseModel, PydanticValueError, ValidationError, validator
from typing import Literal, Optional

ACTION_TYPE_MAPPING = Literal["read", "write", "release"]
OBJECT_TYPE_MAPPING = Literal["multiStateValue", "multiStateInput", "multiStateOutput",
                              "analogValue", "analogInput", "analogOutput",
                              "binaryValue", "binaryInput", "binaryOutput"]
BOOLEAN_ACTION_MAPPING = Literal["active", "inactive"]

# MAIN MODEL
class BacnetRequestModel(BaseModel):
    action_type: ACTION_TYPE_MAPPING
    object_type: OBJECT_TYPE_MAPPING
    object_instance: int
    value: Optional[ValueModel(object_type)]  # <---- MESSED UP HERE, how to call ValueModel?

class ValueModel(BaseModel):
    multiStateValue: Optional[int]
    multiStateInput: Optional[int]
    multiStateOutput: Optional[int]
    analogValue: Optional[int]
    analogInput: Optional[int]
    analogOutput: Optional[int]
    binaryValue: Optional[BOOLEAN_ACTION_MAPPING]
    binaryInput: Optional[BOOLEAN_ACTION_MAPPING]
    binaryOutput: Optional[BOOLEAN_ACTION_MAPPING]

test = BacnetRequestModel(action_type="write",
                          object_type="binaryOutput",
                          object_instance="3",
                          value="active")
How do I call the class ValueModel based on the object_type that was passed in, which in this case is binaryOutput and should only accept a value of BOOLEAN_ACTION_MAPPING? Any tips help, not a lot of wisdom here...
Traceback is:
value = Optional[ValueModel(object_type)]
NameError: name 'ValueModel' is not defined
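No answer is shown here, so the following is a minimal sketch of one way this could be expressed (my own illustration, assuming pydantic v1, not code from the thread): declare the value field loosely as a union and narrow it per object_type in a validator, instead of trying to parameterize the annotation with ValueModel(object_type). The BINARY_OBJECT_TYPES grouping and the validator name are assumptions for illustration.
from typing import Literal, Optional, Union

from pydantic import BaseModel, validator

ACTION_TYPE_MAPPING = Literal["read", "write", "release"]
OBJECT_TYPE_MAPPING = Literal["multiStateValue", "multiStateInput", "multiStateOutput",
                              "analogValue", "analogInput", "analogOutput",
                              "binaryValue", "binaryInput", "binaryOutput"]
BOOLEAN_ACTION_MAPPING = Literal["active", "inactive"]

# Assumed grouping: binary* objects take "active"/"inactive", everything else takes ints.
BINARY_OBJECT_TYPES = frozenset({"binaryValue", "binaryInput", "binaryOutput"})

class BacnetRequestModel(BaseModel):
    action_type: ACTION_TYPE_MAPPING
    object_type: OBJECT_TYPE_MAPPING
    object_instance: int
    # Accept either shape here; the validator below narrows it per object_type.
    value: Optional[Union[int, BOOLEAN_ACTION_MAPPING]]

    @validator("value")
    def value_matches_object_type(cls, v, values):
        # `values` holds the previously validated fields (object_type is declared first).
        object_type = values.get("object_type")
        if object_type in BINARY_OBJECT_TYPES:
            if v not in ("active", "inactive"):
                raise ValueError(f"{object_type} requires 'active' or 'inactive'")
        elif v is not None and not isinstance(v, int):
            raise ValueError(f"{object_type} requires an integer value")
        return v

test = BacnetRequestModel(action_type="write",
                          object_type="binaryOutput",
                          object_instance=3,
                          value="active")  # ok; value=5 here would raise a ValidationError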

Graphene input error for Pydantic models with discriminator while generating Input object schema

I am using pydantic validation for my requirements, and it uses a discriminator. I am writing GraphQL APIs and want to convert those pydantic models into graphene input objects.
Below is my code.
from graphene_pydantic import PydanticInputObjectType, PydanticObjectType
import graphene
from typing import Literal, Union
from pydantic import BaseModel, Field

class Cat(BaseModel):
    pet_type: Literal['cat']
    meows: int

class Dog(BaseModel):
    pet_type: Literal['dog']
    barks: float

class Lizard(BaseModel):
    pet_type: Literal['reptile', 'lizard']
    scales: bool

class Model(BaseModel):
    pet: Union[Cat, Dog, Lizard] = Field(..., discriminator='pet_type')
    n: int

print(Model(pet={'pet_type': 'dog', 'barks': 3.14, 'eats': 'biscuit'}, n=1))

class Input(PydanticInputObjectType):
    class Meta:
        model = Model
        # exclude specified fields
        exclude_fields = ("id",)

class Output(PydanticObjectType):
    class Meta:
        model = Model
        # exclude specified fields
        exclude_fields = ("id",)

class CreateAnimal(graphene.Mutation):
    class Arguments:
        input = Input()

    output = Output

    @staticmethod
    def mutate(parent, info, input):
        print(input)
        # save model here
        return input

class Mutation(graphene.ObjectType):
    createPerson = CreateAnimal.Field()

schema = graphene.Schema(mutation=Mutation)
print(schema)
I tried commenting out the discriminator code and it works fine, but I need those validations for GraphQL as well. If I run the code as-is, it throws the error below.
File "\AppData\Local\Programs\Python\Python310\lib\site-packages\graphql\type\definition.py", line 1338, in fields raise TypeError(f"{self.name} fields cannot be resolved. {error}")
TypeError: Input fields cannot be resolved. The input field type must be a GraphQL input type.
Can someone help me with this?
I am using graphene-pydantic for this.
Hey, I haven't tested your code, but I think this is what you need:
class CreateAnimal(graphene.Mutation):
    class Arguments:
        input = graphene.Argument(Input)

    output = graphene.Field(Output)

Pydantic validate subfields on assignment

I'm trying to make sure one of my objects is always in a correct state. For this I need to validate not only on creation but also on assignment, including assignments to sub-fields. Here is a basic example:
from typing import Optional

from pydantic import BaseModel, root_validator

class SubModel(BaseModel):
    class Config:
        validate_assignment = True

    min: Optional[int]
    max: Optional[int]

class TestModel(BaseModel):
    class Config:
        validate_assignment = True

    field_1: Optional[SubModel]

    @root_validator
    def validate(cls, values):
        field = values.get("field_1")
        if field and field.min and field.max:
            if field.min > field.max:
                raise ValueError("error")
        return values
If I now call
model = TestModel(field_1=SubModel(min=2, max=1))
or
model = TestModel()
model.field_1 = SubModel(min=2, max=1)
the validation is triggered and the ValueError is raised, which is fine.
But if I do the following
model = TestModel()
model.field_1 = SubModel()
model.field_1.min = 2
model.field_1.max = 1
no validation is triggered.
I know that I could do the validation at the SubModel level, but in my case (which is a little more complex than the basic code shows) I don't want every object of type SubModel to satisfy min <= max, only the one used in field_1 of TestModel. Therefore, moving the validator to SubModel is not an option for me.
Does anyone have an idea on how to trigger the validator of TestModel when assigning min and max on field_1?
Thank you in advance!
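No answer appears here, but one workaround consistent with the behaviour described above (assigning to field_1 does trigger TestModel's validation) is to mutate a copy of the sub-model and assign it back, forcing re-validation. A minimal sketch, assuming pydantic v1 and the models defined above:
# Workaround sketch (not from the original thread): re-assigning field_1
# re-runs TestModel's root validator because validate_assignment=True.
model = TestModel(field_1=SubModel())

sub = model.field_1.copy()  # pydantic v1 shallow copy of the sub-model
sub.min = 2
sub.max = 1

model.field_1 = sub  # triggers the root validator; raises ValueError here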

Nested dataclass initialization

I have a JSON object that reads:
j = {"id": 1, "label": "x"}
I have two types:
class BaseModel:
    def __init__(self, uuid):
        self.uuid = uuid

class Entity(BaseModel):
    def __init__(self, id, label):
        super().__init__(id)
        self.name = label
Note how id is stored as uuid in the BaseModel.
I can load Entity from the JSON object as:
entity = Entity(**j)
I want to re-write my model leveraging dataclasses:
@dataclass
class BaseModel:
    uuid = str

@dataclass
class Entity:
    name = str
Since my JSON object does not have a uuid key, entity = Entity(**j) on the dataclass-based model will throw the following error:
TypeError: __init__() got an unexpected keyword argument 'id'
The "ugly" solutions I can think of:
Rename id to uuid in JSON before initialization:
j["uuid"] = j.pop("id")
Define both id and uuid:
@dataclass
class BaseModel:
    uuid = str

@dataclass
class Entity:
    id = str
    name = str
    # either use:
    uuid = id
    # or use this method
    def __post_init__(self):
        super().uuid = id
Is there any cleaner solution for this kind of object initialization in the dataclass realm?
Might be ruining the idea of removing the original __init__, but how about writing a function to initialize the dataclass?
def init_entity(j):
    j["uuid"] = j.pop("id")
    return Entity(**j)
and in your code: entity = init_entity(j)
I think the answer here might be to define a classmethod that acts as an alternative constructor to the dataclass.
from dataclasses import dataclass
from typing import TypeVar, Any

@dataclass
class BaseModel:
    uuid: str

E = TypeVar('E', bound='Entity')

@dataclass
class Entity(BaseModel):
    name: str

    @classmethod
    def from_json(cls: type[E], **kwargs: Any) -> E:
        return cls(kwargs['id'], kwargs['label'])
(For the from_json type annotation, you'll need to use typing.Type[E] instead of type[E] if you're on python <= 3.8.)
Note that you need to use colons for your type-annotations within the main body of a dataclass, rather than the = operator, as you were doing.
Example usage in the interactive REPL:
>>> my_json_dict = {'id': 1, 'label': 'x'}
>>> Entity.from_json(**my_json_dict)
Entity(uuid=1, name='x')
It's again questionable how much boilerplate code this saves, however. If you find yourself doing this much work to replicate the behaviour of a non-dataclass class, it's often better just to use a non-dataclass class. Dataclasses are not the perfect solution to every problem, nor do they try to be.
Simplest solution seems to be to use an efficient JSON serialization library that supports key remappings. There are actually tons of them that support this, but dataclass-wizard is one example of a (newer) library that supports this particular use case.
Here's an approach using an alias to dataclasses.field() which should be IDE friendly enough:
from dataclasses import dataclass

from dataclass_wizard import json_field, fromdict, asdict

@dataclass
class BaseModel:
    uuid: int = json_field('id', all=True)

@dataclass
class Entity(BaseModel):
    name: str = json_field('label', all=True)

j = {"id": 1, "label": "x"}

# De-serialize the dictionary object into an `Entity` instance.
e = fromdict(Entity, j)

print(repr(e))
# Entity(uuid=1, name='x')

# Assert we get the same object when serializing the instance back to a
# JSON-serializable dict.
assert asdict(e) == j

Pydantic enum field does not get converted to string

I am trying to restrict one field in a class to an enum. However, when I try to get a dictionary out of the class, the field doesn't get converted to a string; instead it retains the enum. I checked the pydantic documentation but couldn't find anything relevant to my problem.
This code is representative of what I actually need.
from enum import Enum

from pydantic import BaseModel

class S(str, Enum):
    am = 'am'
    pm = 'pm'

class K(BaseModel):
    k: S
    z: str

a = K(k='am', z='rrrr')
print(a.dict())  # {'k': <S.am: 'am'>, 'z': 'rrrr'}
I'm trying to get the .dict() method to return {'k': 'am', 'z': 'rrrr'}
You need to use the use_enum_values option of the model config:
use_enum_values
whether to populate models with the value property of enums, rather than the raw enum. This may be useful if you want to serialise model.dict() later (default: False)
from enum import Enum

from pydantic import BaseModel

class S(str, Enum):
    am = 'am'
    pm = 'pm'

class K(BaseModel):
    k: S
    z: str

    class Config:
        use_enum_values = True  # <--

a = K(k='am', z='rrrr')
print(a.dict())
You can use FastAPI's jsonable_encoder:
from enum import Enum

from pydantic import BaseModel
from fastapi.encoders import jsonable_encoder

class S(str, Enum):
    am = 'am'
    pm = 'pm'

class K(BaseModel):
    k: S
    z: str

a = K(k='am', z='rrrr')
print(jsonable_encoder(a))  # {'k': 'am', 'z': 'rrrr'}
You can refer to the value using the left-hand side of the enum member. Because they are constants, I tend to uppercase them as well, but of course you don't have to. Using the left-hand side is the expected, normal use pattern:
from enum import Enum

from pydantic import BaseModel

class S(str, Enum):
    am = 'am'
    pm = 'pm'

class K(BaseModel):
    k: S
    z: str

a = K(k=S.am, z='rrrr')
print(a.dict())
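Note that this by itself does not change what .dict() returns; if you just need the plain string for a single field, the enum member's .value attribute gives it directly. A small illustration (my addition, reusing the a instance from above):
# Not from the original answer: pull the raw string out of one enum field.
print(a.k.value)  # 'am'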
