How do I create a dict from a Pydantic model with an Enum field? - python

I have an object defined with a Pydantic model class, and I need to create a dict from it.
from enum import Enum
from pydantic import BaseModel

class StatusId(Enum):
    ACTIVE: int = 1
    PASSIVE: int = 2

class Pamagite(BaseModel):
    status_id: StatusId = StatusId.ACTIVE
    another_field: str = "another_field"
If I try to do it like this:
pamagite = Pamagite().dict()
I will get
pamagite = {'status_id': <StatusId.ACTIVE: 1>, 'another_field': 'another_field'}
I expected pamagite to be equal to
pamagite = {'status_id': 1, 'another_field': 'another_field'}
How can I do this?

Use the use_enum_values = True config option. From the Pydantic docs:
whether to populate models with the value property of enums, rather than the raw enum. This may be useful if you want to serialise model.dict() later (default: False)
from enum import Enum
from pydantic import BaseModel

class StatusId(Enum):
    ACTIVE: int = 1
    PASSIVE: int = 2

class Pamagite(BaseModel):
    status_id: StatusId = StatusId.ACTIVE
    another_field: str = "another_field"

    class Config:
        use_enum_values = True

pamagite = Pamagite().dict()
print(pamagite)  # "{'status_id': 1, 'another_field': 'another_field'}"
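If the end goal is JSON, note that .json() serialises enum members by their value as well; a quick check against the Pamagite model above:

print(Pamagite().json())  # {"status_id": 1, "another_field": "another_field"}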

Alternatively, set the field's default to the enum member's value:
class Pamagite(BaseModel):
    status_id: StatusId = StatusId.ACTIVE.value
    another_field: str = "another_field"
Calling .dict() on an instance of this class will give you
{'status_id': 1, 'another_field': 'another_field'}
StatusId.ACTIVE.value gives you 1 and StatusId.ACTIVE.name gives you ACTIVE, in case you want to use ACTIVE instead of 1.
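A quick illustration of the .value / .name distinction (plain Python, no Pydantic involved):

print(StatusId.ACTIVE.value)  # 1
print(StatusId.ACTIVE.name)   # ACTIVE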

Related

How to subclass a frozen dataclass

I have inherited the Customer dataclass. This identifies a customer in the customer DB table.
Customer is used to produce summary statistics for transactions pertaining to a given customer. It is hashable, hence frozen.
I require a SpecialCustomer (a subclass of Customer); it has an extra property, special_property. Most of the properties inherited from Customer will be set to fixed values. This customer does not exist in the Customer table.
I wish to reuse code which has been written for Customer. Without special_property we would not be able to distinguish special customers from regular ones.
How do I instantiate SpecialCustomer?
Here is what I have. I know why this doesn't work. Is there some way to do this?:
from dataclasses import dataclass

@dataclass(frozen=True, order=True)
class Customer:
    property: str

@dataclass(frozen=True, order=True)
class SpecialCustomer(Customer):
    special_property: str = Field(init=False)

    def __init__(self, special_property):
        super().__init__(property="dummy_value")
        self.special_property = special_property

s = SpecialCustomer(special_property="foo")
Error:
E dataclasses.FrozenInstanceError: cannot assign to field 'special_property'
<string>:4: FrozenInstanceError
Why not like this?
from dataclasses import dataclass
from pydantic.dataclasses import dataclass as pydantic_dataclass

@dataclass(frozen=True, order=True)
class Customer:
    prop: str

@pydantic_dataclass(frozen=True, order=True, kw_only=True)
class SpecialCustomer(Customer):
    special_prop: str
    prop: str = "dummy_value"

print(SpecialCustomer(special_prop="foo"))
Output: SpecialCustomer(prop='dummy_value', special_prop='foo')
The problem is that without kw_only=True we cannot have a non-default field after a default one, and because of the way dataclasses collect fields, prop is still considered before special_prop, even though it is re-declared after it.
Dataclasses are just very restrictive. Basically, if you want anything other than vanilla behaviour, you'll have a bad time. If you were willing/able to switch to something like attrs instead, that is much more flexible and also lightweight. Normal Pydantic models work too, of course, but they are less lightweight.
If the problem with my suggested solution is that it still allows users of the SpecialCustomer class to set arbitrary values for prop, you could prevent that with an additional check in __post_init__. That would of course be annoying if there are many fields that should be fixed, but I fail to see any other way to construct this.
Something like this:
...

@pydantic_dataclass(frozen=True, order=True, kw_only=True)
class SpecialCustomer(Customer):
    special_prop: str
    prop1: str = "dummy_value"
    prop2: int = 123
    prop3: tuple[float, float] = (3.14, 0.)

    def __post_init__(self) -> None:
        assert self.prop1 == "dummy_value"
        assert self.prop2 == 123
        assert self.prop3 == (3.14, 0.)

print(SpecialCustomer(special_prop="foo"))
try:
    SpecialCustomer(prop1="something", special_prop="bar")
except AssertionError as e:
    print("No! Bad user.")
Alternatively, since this is a Pydantic class, you could define validators for the fixed fields that do essentially the same thing.
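A minimal sketch of that validator approach (Pydantic v1 syntax; it assumes a Customer base class with a prop1 field as in the snippet above, and the validator name is just illustrative):

from pydantic import validator
from pydantic.dataclasses import dataclass as pydantic_dataclass

@pydantic_dataclass(frozen=True, order=True, kw_only=True)
class SpecialCustomer(Customer):
    special_prop: str
    prop1: str = "dummy_value"

    @validator("prop1")
    def prop1_is_fixed(cls, value: str) -> str:
        # Reject any attempt to override the fixed value.
        if value != "dummy_value":
            raise ValueError("prop1 is fixed to 'dummy_value' for SpecialCustomer")
        return value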
PS: Possible attrs solution
from dataclasses import dataclass
from attrs import define, field

@dataclass(frozen=True, order=True)
class Customer:
    prop1: str
    prop2: int
    prop3: tuple[float, float]

@define(frozen=True, order=True)
class SpecialCustomer(Customer):
    prop1: str = field(default="dummy_value", init=False)
    prop2: int = field(default=123, init=False)
    prop3: tuple[float, float] = field(default=(3.14, 0.), init=False)
    special_prop: str

if __name__ == "__main__":
    import json
    from attrs import asdict

    s = SpecialCustomer("foo")
    print(json.dumps(asdict(s), indent=4))
    print(isinstance(s, Customer))
    print(hash(s))
    try:
        SpecialCustomer(prop1="abc", special_prop="bar")
    except TypeError as e:
        print(repr(e))
Output:
{
    "prop1": "dummy_value",
    "prop2": 123,
    "prop3": [
        3.14,
        0.0
    ],
    "special_prop": "foo"
}
True
6587449294214520366
TypeError("SpecialCustomer.__init__() got an unexpected keyword argument 'prop1'")

Immutable field on Python's dataclass

I have this dataclass:
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Couso:
    nome: str
    date: str = field(default=datetime.now(), init=False)
    id_: str = field(default=key())
key() is a simple function that returns a str of length 32.
When I create multiple instances of it (without specifying the id_, obviously), they all share the same id_.
Why does it work this way? I can't understand it.
Also, would the same happen with the date attribute?
key() is called once, when the class body is evaluated, before field() creates the field, so every instance gets the same default id_ value. It's the same as if you had written
x = key()

@dataclass
class Couso:
    ...
    id_: str = field(default=x)
If you want to call key each time you create a new instance, use default_factory instead.
id_: str = field(default_factory=key)  # key is passed as the factory and called once per new instance
The same goes for datetime.now:
date: str = field(default_factory=datetime.now, init=False)
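A small sketch of the default_factory behaviour (uuid4().hex stands in here for the questioner's key() helper, which is not shown in the post):

from dataclasses import dataclass, field
from uuid import uuid4

def key() -> str:
    # Stand-in for the 32-character key() helper from the question.
    return uuid4().hex

@dataclass
class Couso:
    nome: str
    id_: str = field(default_factory=key)

a = Couso("a")
b = Couso("b")
print(a.id_ == b.id_)  # False: the factory is called once per instance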

Nested dataclass initialization

I have a JSON object that reads:
j = {"id": 1, "label": "x"}
I have two types:
class BaseModel:
    def __init__(self, uuid):
        self.uuid = uuid

class Entity(BaseModel):
    def __init__(self, id, label):
        super().__init__(id)
        self.name = label
Note how id is stored as uuid in the BaseModel.
I can load Entity from the JSON object as:
entity = Entity(**j)
I want to re-write my model leveraging dataclass:
@dataclass
class BaseModel:
    uuid = str

@dataclass
class Entity:
    name = str
Since my JSON object does not have a uuid key, entity = Entity(**j) on the dataclass-based model will throw the following error:
TypeError: __init__() got an unexpected keyword argument 'id'
The "ugly" solutions I can think of:
Rename id to uuid in JSON before initialization:
j["uuid"] = j.pop("id")
Define both id and uuid:
@dataclass
class BaseModel:
    uuid = str

@dataclass
class Entity:
    id = str
    name = str
    # either use:
    uuid = id
    # or use this method
    def __post_init__(self):
        super().uuid = id
Is there any cleaner solution for this kind of object initialization in the dataclass realm?
This might defeat the purpose of removing the original __init__, but how about writing a function to initialize the dataclass?
def init_entity(j):
    j["uuid"] = j.pop("id")
    return Entity(**j)
and then in your code: entity = init_entity(j)
I think the answer here might be to define a classmethod that acts as an alternative constructor to the dataclass.
from dataclasses import dataclass
from typing import TypeVar, Any

@dataclass
class BaseModel:
    uuid: str

E = TypeVar('E', bound='Entity')

@dataclass
class Entity(BaseModel):
    name: str

    @classmethod
    def from_json(cls: type[E], **kwargs: Any) -> E:
        return cls(kwargs['id'], kwargs['label'])
(For the from_json type annotation, you'll need to use typing.Type[E] instead of type[E] if you're on python <= 3.8.)
Note that you need to use colons for your type-annotations within the main body of a dataclass, rather than the = operator, as you were doing.
Example usage in the interactive REPL:
>>> my_json_dict = {'id': 1, 'label': 'x'}
>>> Entity.from_json(**my_json_dict)
Entity(uuid=1, name='x')
It's again questionable how much boilerplate code this saves, however. If you find yourself doing this much work to replicate the behaviour of a non-dataclass class, it's often better just to use a non-dataclass class. Dataclasses are not the perfect solution to every problem, nor do they try to be.
The simplest solution seems to be to use a JSON serialization library that supports key remapping. There are actually tons of libraries that support this; dataclass-wizard is one example of a (newer) library that supports this particular use case.
Here's an approach using an alias to dataclasses.field() which should be IDE friendly enough:
from dataclasses import dataclass
from dataclass_wizard import json_field, fromdict, asdict

@dataclass
class BaseModel:
    uuid: int = json_field('id', all=True)

@dataclass
class Entity(BaseModel):
    name: str = json_field('label', all=True)

j = {"id": 1, "label": "x"}

# De-serialize the dictionary object into an `Entity` instance.
e = fromdict(Entity, j)

repr(e)
# Entity(uuid=1, name='x')

# Assert we get the same object when serializing the instance back to a
# JSON-serializable dict.
assert asdict(e) == j

Best way to flatten and remap ORM to Pydantic Model

I am using Pydantic with FastAPI to output ORM data as JSON. I would like to flatten and remap the ORM model to eliminate an unnecessary level in the JSON.
Here's a simplified example to illustrate the problem.
original output: {"id": 1, "billing":
[
{"id": 1, "order_id": 1, "first_name": "foo"},
{"id": 2, "order_id": 1, "first_name": "bar"}
]
}
desired output: {"id": 1, "name": ["foo", "bar"]}
"How to map values from nested dict to Pydantic Model?" provides a solution that works for dictionaries by overriding __init__ in the Pydantic model class. This example shows how that works with a dictionary:
from pydantic import BaseModel

# The following approach works with a dictionary as the input
order_dict = {"id": 1, "billing": {"first_name": "foo"}}
# desired output: {"id": 1, "name": "foo"}

class Order_Model_For_Dict(BaseModel):
    id: int
    name: str = None

    class Config:
        orm_mode = True

    def __init__(self, **kwargs):
        print(
            "kwargs for dictionary:", kwargs
        )  # kwargs for dictionary: {'id': 1, 'billing': {'first_name': 'foo'}}
        kwargs["name"] = kwargs["billing"]["first_name"]
        super().__init__(**kwargs)

print(Order_Model_For_Dict.parse_obj(order_dict))  # id=1 name='foo'
(This script is complete, it should run "as is")
However, when working with ORM objects, this approach does not work: it appears that __init__ is not called by from_orm. Here's an example which will not produce the desired output.
from pydantic import BaseModel, root_validator
from pydantic.utils import GetterDict
from typing import List
from sqlalchemy.orm import relationship
from sqlalchemy import Column, Integer, String, ForeignKey
from sqlalchemy.dialects.postgresql import ARRAY
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class BillingOrm(Base):
    __tablename__ = "billing"
    id = Column(Integer, primary_key=True, nullable=False)
    order_id = Column(ForeignKey("orders.id", ondelete="CASCADE"), nullable=False)
    first_name = Column(String(20))

class OrderOrm(Base):
    __tablename__ = "orders"
    id = Column(Integer, primary_key=True, nullable=False)
    billing = relationship("BillingOrm")

class Billing(BaseModel):
    id: int
    order_id: int
    first_name: str

    class Config:
        orm_mode = True

class Order(BaseModel):
    id: int
    name: List[str] = None
    # billing: List[Billing]  # uncomment to verify the relationship is working

    class Config:
        orm_mode = True

    def __init__(self, **kwargs):
        # This __init__ function does not run when using from_orm to parse ORM object
        print("kwargs for orm:", kwargs)
        kwargs["name"] = kwargs["billing"]["first_name"]
        super().__init__(**kwargs)

billing_orm_1 = BillingOrm(id=1, order_id=1, first_name="foo")
billing_orm_2 = BillingOrm(id=2, order_id=1, first_name="bar")
order_orm = OrderOrm(id=1)
order_orm.billing.append(billing_orm_1)
order_orm.billing.append(billing_orm_2)

order_model = Order.from_orm(order_orm)
# Output returns 'None' for name instead of ['foo','bar']
print(order_model)  # id=1 name=None
(This script is complete, it should run "as is")
The output returns name=None instead of the desired list of names.
In the above example, I am using Order.from_orm to create the Pydantic model. This appears to be the same approach FastAPI uses when a response model is specified. The desired solution should support use in the FastAPI response model, as shown in this example:
@router.get("/orders", response_model=List[schemas.Order])
async def list_orders(db: Session = Depends(get_db)):
    return get_orders(db)
Update:
Regarding MatsLindh's comment to try validators, I replaced the __init__ function with a root validator; however, I'm unable to mutate the returned values to include a new attribute. I suspect this is because it is an ORM object and not a true dictionary. The following code will extract the names and print them in the desired list, but I can't see how to include this updated result in the model response:
@root_validator(pre=True)
def flatten(cls, values):
    if isinstance(values, GetterDict):
        names = [
            billing_entry.first_name for billing_entry in values.get("billing")
        ]
        print(names)
        # values["name"] = names  # error: 'GetterDict' object does not support item assignment
    return values
I also found a couple of other discussions on this problem that led me to try this approach:
https://github.com/samuelcolvin/pydantic/issues/717
https://gitmemory.com/issue/samuelcolvin/pydantic/821/744047672
What if you override the from_orm class method?
class Order(BaseModel):
    id: int
    name: List[str] = None
    billing: List[Billing]

    class Config:
        orm_mode = True

    @classmethod
    def from_orm(cls, obj: Any) -> 'Order':
        # `obj` is the orm model instance
        if hasattr(obj, 'billing'):
            # billing is a list of BillingOrm rows, so collect the names from each entry
            obj.name = [billing_entry.first_name for billing_entry in obj.billing]
        return super().from_orm(obj)
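A quick usage sketch, assuming the ORM instances built in the question's script (order_orm with its two BillingOrm rows):

order_model = Order.from_orm(order_orm)
print(order_model.name)  # expected: ['foo', 'bar']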
I really missed the handy Django REST Framework serializers while working with the FastAPI + Pydantic stack... So I wrangled with GetterDict to allow defining field getter functions in the Pydantic model, like this:
class User(FromORM):
    fullname: str

    class Config(FromORM.Config):
        getter_dict = FieldGetter.bind(lambda: User)

    @staticmethod
    def get_fullname(obj: "User") -> str:
        return f'{obj.firstname} {obj.lastname}'
where the magic part FieldGetter is implemented as
from typing import Any, Callable, Optional, Type
from types import new_class
from pydantic import BaseModel
from pydantic.utils import GetterDict

class FieldGetter(GetterDict):
    model_class_forward_ref: Optional[Callable] = None
    model_class: Optional[Type[BaseModel]] = None

    def __new__(cls, *args, **kwargs):
        inst = super().__new__(cls)
        if cls.model_class_forward_ref:
            inst.model_class = cls.model_class_forward_ref()
        return inst

    @classmethod
    def bind(cls, model_class_forward_ref: Callable):
        sub_class = new_class(f'{cls.__name__}FieldGetter', (cls,))
        sub_class.model_class_forward_ref = model_class_forward_ref
        return sub_class

    def get(self, key: str, default):
        if hasattr(self._obj, key):
            return super().get(key, default)
        getter_fun_name = f'get_{key}'
        if not (getter := getattr(self.model_class, getter_fun_name, None)):
            raise AttributeError(f'no field getter function found for {key}')
        return getter(self._obj)

class FromORM(BaseModel):
    class Config:
        orm_mode = True
        getter_dict = FieldGetter
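A hypothetical usage sketch; the SimpleNamespace object below merely stands in for an ORM instance with firstname/lastname attributes and is not part of the original answer:

from types import SimpleNamespace

user_orm = SimpleNamespace(firstname="Ada", lastname="Lovelace")
user = User.from_orm(user_orm)
print(user.fullname)  # expected output: Ada Lovelace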

Pydantic enum field does not get converted to string

I am trying to restrict one field in a class to an enum. However, when I try to get a dictionary out of the class, the enum field doesn't get converted to a string; it retains the enum member instead. I checked the pydantic documentation but couldn't find anything relevant to my problem.
This code is representative of what I actually need.
from enum import Enum
from pydantic import BaseModel

class S(str, Enum):
    am = 'am'
    pm = 'pm'

class K(BaseModel):
    k: S
    z: str

a = K(k='am', z='rrrr')
print(a.dict())  # {'k': <S.am: 'am'>, 'z': 'rrrr'}
I'm trying to get the .dict() method to return {'k': 'am', 'z': 'rrrr'}
You need to use the use_enum_values option of the model config:
use_enum_values
whether to populate models with the value property of enums, rather than the raw enum. This may be useful if you want to serialise model.dict() later (default: False)
from enum import Enum
from pydantic import BaseModel

class S(str, Enum):
    am = 'am'
    pm = 'pm'

class K(BaseModel):
    k: S
    z: str

    class Config:
        use_enum_values = True  # <--

a = K(k='am', z='rrrr')
print(a.dict())
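# Output: {'k': 'am', 'z': 'rrrr'}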
You can use FastAPI's jsonable_encoder:
from enum import Enum
from pydantic import BaseModel
from fastapi.encoders import jsonable_encoder

class S(str, Enum):
    am = 'am'
    pm = 'pm'

class K(BaseModel):
    k: S
    z: str

a = K(k='am', z='rrrr')
print(jsonable_encoder(a))  # {'k': 'am', 'z': 'rrrr'}
You can refer to the member using the left-hand side of the enum definition (e.g. S.am). Because they are constants, I tend to uppercase them as well, but of course you don't have to. Using the left-hand side is the expected, normal usage pattern:
from enum import Enum
from pydantic import BaseModel

class S(str, Enum):
    am = 'am'
    pm = 'pm'

class K(BaseModel):
    k: S
    z: str

a = K(k=S.am, z='rrrr')
print(a.dict())
