I am trying to update only given fields in an Object. In this case the object is a user whose schema looks like this:
class BasicUser(BaseModel):
    timestamp: datetime = Field(default_factory=datetime.now)
    name: str = Field(...)
    surname: str = Field(...)
    email: EmailStr = Field(...)
    phone: constr(
        strip_whitespace=True,
        regex=r"^[\+]?[(]?[0-9]{3}[)]?[-\s\.]?[0-9]{3}[-\s\.]?[0-9]{4,6}$",
    )
    age: PositiveInt = Field(...)
This user is only part of the entire document; the full document nests it under a user field.
I am using FastAPI and I want to receive only the fields that need updating from the frontend,
my endpoint looks like this:
@router.put('/user/{id}', response_description='Updates a user', response_model=DocumentWithoutPassword)
async def updateUser(id: str, user: BasicUserUpdate = Body(...)):
    new_user = {k: v for k, v in user.dict().items() if v is not None}
    new_user = jsonable_encoder(new_user)
    if len(new_user) >= 1:
        update_result = await dbConnection.update_one({"_id": id}, {"$set": {"user": new_user}})
the body of the request can include any number of fields (includes only the fields that need updating)
for example the body could look like:
{
    "email": "abcd123@gmail.com"
}
or
{
    "email": "abcd123@gmail.com",
    "phone": "+123456789"
}
The problem with the above code is that when the request arrives, instead of updating the fields, it overwrites the entire user with only the email (if the email was sent) or the email and phone (if they were both sent).
So my question is how can I update specific values in user without overwriting everything
e.g. if I send {"email": "abcd123@gmail.com"} as the body, I want only the email to be updated and everything else left as-is.
Assuming that the generated new_user object does indeed contain only the fields to change, nested inside of the user field in the document (which sounds like it is the case), then probably the most straightforward option here is to use an aggregation pipeline to describe the update. This approach gives you access to a variety of pipeline operators, notably the $mergeObjects operator. Changing the following line:
update_result = await dbConnection.update_one({"_id": id}, {"$set": { "user" : new_user}})
To something like:
update_result = await dbConnection.update_one({"_id": id}, [{"$set": {"user": {"$mergeObjects": ["$user", new_user]}}}])
Should yield the results that you want. See a demonstration of it at this playground link.
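As a rough illustration of what $mergeObjects does on the server, it behaves like a shallow dictionary merge where the fields in the update dict win and everything else in the stored sub-document is preserved (the stored document below is hypothetical):

```python
# Hypothetical stored user and a partial update, mirroring the example
# request bodies above.
stored_user = {"name": "Jane", "surname": "Doe", "email": "old@example.com", "age": 30}
new_user = {"email": "abcd123@gmail.com"}

# A shallow merge, analogous to $mergeObjects: keys from new_user
# overwrite the stored ones; the rest stay untouched.
merged = {**stored_user, **new_user}
```

Note that this is a shallow merge, which is exactly why only the submitted fields change while the others survive.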
One general question does come to mind. If these documents represent users, then is there any particular value in nesting all of the fields underneath a parent user field? It probably doesn't make a big difference one way or another in the end, but certainly the fact that this question was asked helps demonstrate some minor additional friction that can be encountered by nesting data (especially if it is unnecessary).
Let's say I have a route that allows clients to create a new user
(pseudocode)
@app.route("POST")
def create_user(user: UserScheme, db: Session = Depends(get_db)) -> User:
...
and my UserScheme accepts a field such as an email. I would like to be able to set some settings (for example max_length) globally in a different model Settings. How do I access that inside a scheme? I'd like to access the db inside my scheme.
So basically my scheme should look something like this (the given code does not work):
class UserScheme(BaseModel):
    email: str

    @validator("email")
    def validate_email(cls, value: str) -> str:
        settings = get_settings(db)  # `db` should be set somehow
        if len(value) > settings.email_max_length:
            raise ValueError("Your mail might not be that long")
        return value
I couldn't find a way to pass db to the scheme. I considered validating such fields (the ones that depend on db) inside my route. While that approach works, the error is not raised on the specific field but on the entire form; it should report the error on the correct field so that frontends can display it properly.
One option is to accept arbitrary JSON objects as input, and then construct a UserScheme instance manually inside the route handler:
@app.route(
    "POST",
    response_model=User,
    openapi_extra={
        "requestBody": {
            "content": {
                "application/json": {
                    "schema": UserScheme.schema(ref_template="#/components/schemas/{model}")
                }
            }
        }
    },
)
async def create_user(request: Request, db: Session = Depends(get_db)) -> User:
    settings = get_settings(db)
    user_data = await request.json()
    user_schema = UserScheme(settings=settings, **user_data)
Note that this idea was borrowed from https://stackoverflow.com/a/68815913/2954547, and I have not tested it myself.
In order to facilitate the above, you might want to redesign this class so that the settings object is itself an attribute on the UserScheme model. That way you never need to perform database access or other effectful operations inside the validator, and you are also prevented from instantiating a UserScheme without some kind of sensible settings in place, even if they are fallbacks or defaults.
class SystemSettings(BaseModel):
    ...

def get_settings(db: Session) -> SystemSettings:
    ...

EmailAddress = typing.NewType('EmailAddress', str)

class UserScheme(BaseModel):
    settings: SystemSettings

    if typing.TYPE_CHECKING:
        email: EmailAddress
    else:
        email: str | EmailAddress

    @validator("email")
    def _validate_email(cls, value: str, values: dict[str, typing.Any]) -> EmailAddress:
        if len(value) > values['settings'].max_email_length:
            raise ValueError('...')
        return EmailAddress(value)
The use of typing.NewType isn't necessary here, but I think it's a good tool in situations like this. Note that the typing.TYPE_CHECKING trick is required to make it work, as per https://github.com/pydantic/pydantic/discussions/4823.
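For reference, typing.NewType is erased at runtime: an EmailAddress is just the underlying str, and the distinction exists only for the type checker. A minimal sketch:

```python
import typing

EmailAddress = typing.NewType("EmailAddress", str)

# At runtime the NewType is a plain callable that returns its argument
# unchanged, so isinstance checks see the underlying str.
addr = EmailAddress("user@example.com")
```

This is why the validator can safely return `EmailAddress(value)` without changing runtime behavior.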
I am trying to serialize JSON data through serializers.Serializer:
{
    "data": {
        "phoneNumber": "1234567890",
        "countryCode": "+11",
        "otp": "73146"
    }
}
The serializer class I wrote for it:
class VerifyOtpSerializer(serializers.Serializer):
    phone_number = serializers.CharField(max_length=225, source='phoneNumber', required=True)
    country_code = serializers.CharField(max_length=225, source='countryCode', required=True)
    otp = serializers.CharField(max_length=255, required=True)
I don't know why source is not working; I tried the JSON shown above, but it still says the field is required.
The source value is the name of the attribute on your Model that the field maps to, not the key to read from the request payload. As the DRF docs put it:
The name of the attribute that will be used to populate the field.
What you really want is something that changes a camel-case payload into snake case. Just use djangorestframework-camel-case and remove source from your serializer fields.
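To illustrate the kind of transformation that package performs, here is a rough stdlib sketch (not the library's actual implementation):

```python
import re

def camel_to_snake(name: str) -> str:
    # Insert an underscore before each uppercase letter (except at the
    # start of the string), then lowercase everything.
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()

payload = {"phoneNumber": "1234567890", "countryCode": "+11", "otp": "73146"}
converted = {camel_to_snake(k): v for k, v in payload.items()}
```

The library applies this conversion to incoming request data automatically, so the serializer fields can stay snake_case with no `source` argument.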
Your keys in the request are wrong. As Tom said, source should be an attribute of the model object, so you have to match the keys in the request and the serializer:
change phoneNumber > phone_number
change countryCode > country_code
The request object you are sending to your serializer is incorrect. The keys of your request object should be exactly what you have defined in your serializer. Try sending this to your serializer:
{
    "data": {
        "phone_number": "1234567890",
        "country_code": "+11",
        "otp": "73146"
    }
}
I am trying to make a membership API using Django REST framework. I wrote the code and checked that the function works properly. In my current code, if the email, password, and username fields are empty, the message is given as follows:
{
    "email": [
        "This field is required."
    ],
    "username": [
        "This field is required."
    ]
}
But after talking about this with my team's client developers, they said it would be better to return a unified message like this:
{
"message": "email field is required."
}
How can I customize the value like this? Here's my code.
class customSignUpView(GenericAPIView):
    serializer_class = customRegisterSerializer

    def post(self, request):
        user = request.data
        serializer = self.serializer_class(data=user)
        serializer.is_valid(raise_exception=True)
        serializer.save()
        user = User.objects.get(email=serializer.data['email'])
        token = RefreshToken.for_user(user).access_token
        current_site = get_current_site(request).domain
        relativeLink = reverse('emailVerify')
        absurls = f'http://{current_site}{relativeLink}?token={token}'
        email_body = f'Hi {user.username} Use link below to verify your email \n{absurls}'
        data = {'email_body': email_body, 'to_email': user.email, 'email_subject': 'Verify your email'}
        Util.send_email(data)
        return Response({'message': 'check your email.'}, status=201)
You need to customize customRegisterSerializer further by adding a custom validate method to it. Try something like this:
class YourSerializer(serializers.Serializer):
    field1 = serializers.CharField(args)
    ...
    fieldn = serializers.CharField(args)

    def validate(self, data):
        error = {}
        if 'some_field' in data:
            # test whether the field is valid here
            if data['some_field'] is not valid:
                error['some_field'] = 'your message'
        # ... and so on for the remaining fields
        if error:
            raise serializers.ValidationError(error)
        return data
Pass the incoming request data as the data parameter, and you should be able to customize everything however you want.
First of all, I would like to say that the standard DRF approach to error messages is more universal, as it allows sending several messages for several fields in a unified way: in the returned JSON, each key is a field name and the value is the list of messages for that field. This also lets the frontend display each message next to the appropriate field.
Because with the format you're trying to achieve comes the question of what kind of message to send if both email and username were not provided but are required (e.g., should it be one message string or a list of "{field_name} is required" strings?).
But if you really need the approach you mentioned, let me elaborate on the answer by @vencaslac. In your case the serializer should roughly look like:
class CustomRegisterSerializer(serializers.ModelSerializer):
    ...

    def validate(self, data):
        if not data.get("email"):
            raise serializers.ValidationError({"message": "email field is required."})
        return data
The validate() method is the same for both Serializer and ModelSerializer. You can find more info in the docs. But again, with this approach you need to figure out an answer to the question I mentioned above.
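One plain-Python way to answer the multi-field question raised above is to report only the first missing field, keeping a single unified message (a hypothetical helper, not DRF API):

```python
def first_missing_error(data, required=("email", "username", "password")):
    # Return a unified single-message dict for the first missing
    # required field, or None when all required fields are present.
    for field in required:
        if not data.get(field):
            return {"message": f"{field} field is required."}
    return None

error = first_missing_error({"password": "s3cret"})
```

Inside validate() you would raise serializers.ValidationError with that dict instead of returning it; the trade-off is that the client only learns about one missing field per request.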
I want to run a POST call that creates a "group". Assume that the "persons" all exist; even if they don't, an error is not a problem.
class Group(models.Model):
    title = models.CharField(max_length=70)
    persons = models.ManyToManyField(to=Person, blank=True)
    file = models.FileField(upload_to=file_location, null=True, blank=True)

class GroupSerializer(serializers.ModelSerializer):
    persons = serializers.PrimaryKeyRelatedField(many=True, queryset=Person.objects.all())

    class Meta:
        model = Group
        fields = '__all__'
If I send a JSON like
{
    "title": "Drama Club",
    "persons": [1, 2, 3]
}
it will work. But since I cannot upload a file that way, I use form-data:
title: Drama Club
persons: [1,2,3]
file: <whatever the format is>
Now here comes the problem. IT DOES NOT WORK. It returns this error
{
    "persons": [
        "Incorrect type. Expected pk value, received str."
    ]
}
Even if I remove everything else and just send persons: [1,2,3] as form-data, it returns the same error.
I really cannot understand this behavior. (I am using Postman to check this.)
It turns out that form-data does not take an array. So, instead of
persons : [1,2,3]
I will have to send
persons : 1
persons : 2
persons : 3
Django-rest-framework will do the rest.
Putting this answer here, because it took me a very long time to figure it out.
(if there is a way to send an array in form-data, without special parsing on the backend, I would love to know)
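The repeated-key convention above is the standard way form encodings represent lists; the stdlib shows the same behavior (a minimal sketch, unrelated to DRF itself):

```python
from urllib.parse import urlencode

# With doseq=True, each element of the list becomes its own key=value
# pair: persons=1&persons=2&persons=3, which is what DRF expects for a
# many=True field submitted as form-data.
body = urlencode({"title": "Drama Club", "persons": [1, 2, 3]}, doseq=True)
```

Sending `persons=[1,2,3]` as a single value, by contrast, arrives as the literal string "[1,2,3]", which explains the "Expected pk value, received str." error.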
If you want to upload a file and send a JSON payload as well, take a look at DRF's MultiPartParser.
I'm trying to query for some data using the gcloud api that I just discovered. I'd like to query for a KeyProperty. e.g.:
from google.appengine.ext import ndb

class User(ndb.Model):
    email = ndb.StringProperty()

class Data(ndb.Model):
    user = ndb.KeyProperty('User')
    data = ndb.JsonProperty()
In GAE, I can query this pretty easily assuming I have a user's key:
user = User.query(User.email == 'me@domain.com').get()
data_records = Data.query(Data.user == user.key).fetch()
I'd like to do something similar using gcloud:
from gcloud import datastore

client = datastore.Client(project='my-project-id')

user_qry = client.query(kind='User')
user_qry.add_filter('email', '=', 'me@domain.com')
users = list(user_qry.fetch())
user = users[0]

data_qry = client.query(kind='Data')
data_qry.add_filter('user', '=', user.key)  # This doesn't work ...
results = list(data_qry.fetch())  # results = []
Looking at the documentation for add_filter, it doesn't appear that Entity.key is a supported type:
value (int, str, bool, float, NoneType, datetime.datetime) – The value to filter on.
Is it possible to add filters for key properties?
I've done a bit more sleuthing to try to figure out what is really going on here. I'm not sure that this is helpful for me to understand this issue at the present, but maybe it'll be helpful for someone else.
I've mocked out the underlying calls in the respective libraries to record the protocol buffers that are being serialized and sent to the server. For GAE, it appears to be Batch.create_async in the datastore_query module.
For gcloud, it is the datastore.Client.connection.run_query method. Looking at the resulting protocol buffers (anonymized), I see:
gcloud query pb.
kind {
  name: "Data"
}
filter {
  composite_filter {
    operator: AND
    filter {
      property_filter {
        property {
          name: "user"
        }
        operator: EQUAL
        value {
          key_value {
            partition_id {
              dataset_id: "s~app-id"
            }
            path_element {
              kind: "User"
              name: "user_string_id"
            }
          }
        }
      }
    }
  }
}
GAE query pb.
kind: "Data"
Filter {
  op: 5
  property <
    name: "User"
    value <
      ReferenceValue {
        app: "s~app-id"
        PathElement {
          type: "User"
          name: "user_string_id"
        }
      }
    >
    multiple: false
  >
}
The two libraries are using different versions of the proto as far as I can tell, but the data being passed looks very similar...
This is a subtle bug with your use of the ndb library:
All ndb properties accept a single positional argument that specifies the property's name in Datastore
Looking at your model definition, you'll see user = ndb.KeyProperty('User'). This isn't actually saying that the user property is a key of a User entity, but that it should be stored in Datastore with the property name User. You can verify this in your GAE protocol buffer query, where the property name is (case sensitive) User.
If you want to limit the key to a single kind, you need to specify it using the kind option.
user = ndb.KeyProperty(kind="User")
The KeyProperty also supports:
user = ndb.KeyProperty(User) # User is a class here, not a string
Here is a description of all the magic.
As it stands, your gcloud query is filtering on the wrong-cased property name ('user' instead of 'User') and should be:
data_qry = client.query(kind='Data')
data_qry.add_filter('User', '=', user.key)