MongoDB aggregate query with condition - Python

I need to run an aggregation on MongoDB from Python and am unable to do so.
Below is the structure of the extracted MongoDB document:
{
  'Category': 'Male',
  'details': [
    {'name': 'Sachin', 'height': 6},
    {'name': 'Rohit', 'height': 5.6},
    {'name': 'Virat', 'height': 5}
  ]
}
I want the aggregate function to return the height where the name is Sachin. Basically my idea is to extract the data with $match, apply the condition, and aggregate, all at the same time in one aggregate call. This could easily be done in 3 steps with if statements, but I'm looking to do it in a single aggregation.
Please note: the 'details' array has no fixed length.
Let me know if any more explanation is needed.

You can use $filter to achieve this:
db.collection.aggregate([
  {
    $project: {
      details: {
        $filter: {
          input: "$details",
          cond: {
            $eq: ["$$this.name", "Sachin"]
          }
        }
      }
    }
  }
])
If you use find instead, you need to be aware of the positional operator:
db.collection.find({
  "details.name": "Sachin"
},
{
  "details.$": 1
})
If you need it as an object rather than an array, you can simply use $arrayElemAt with $ifNull.
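For readers less familiar with these operators, what the $filter stage (plus an $arrayElemAt at index 0) computes can be mirrored in plain Python. This is a simulation of the pipeline's logic over the sample document, not a pymongo call:

```python
doc = {
    'Category': 'Male',
    'details': [
        {'name': 'Sachin', 'height': 6},
        {'name': 'Rohit', 'height': 5.6},
        {'name': 'Virat', 'height': 5},
    ],
}

# $filter: keep only the array elements whose name matches
filtered = [d for d in doc['details'] if d['name'] == 'Sachin']

# $arrayElemAt at index 0 (with $ifNull supplying a fallback):
# unwrap the single match, or None when nothing matched
detail = filtered[0] if filtered else None

print(detail)  # {'name': 'Sachin', 'height': 6}
```

Because 'details' has no fixed length, the list comprehension keeps however many elements match; the unwrap step is only safe when names are unique within a document.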

Related

How do I update fields for multiple elements in an array with different values in MongoDB?

I have data of the form:
{
  '_id': asdf123b51234,
  'field2': 0,
  'array': [
    {
      'unique_array_elem_id': id,
      'nested_field': {
        'new_field_i_want_to_add': value
      }
    },
    ...
  ]
}
I have been trying to update like this:
for doc in update_dict:
    collection.find_one_and_update(
        {'_id': doc['_id']},
        {'$set': {
            'array.$[elem].nested_field.new_field_i_want_to_add': doc['new_field_value']
        }},
        array_filters=[{'elem.unique_array_elem_id': doc['unique_array_elem_id']}]
    )
But it is painfully slow. Updating all of my data will take several days running continuously. Is there a way to update this nested field for all array elements of a given document at once?
Thanks a lot
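If the one-round-trip-per-document loop is the bottleneck (an assumption, since no profiling data is shown), pymongo's bulk_write can send many UpdateOne requests, each with its own array_filters, in a single round trip. The batching step itself is generic and can be sketched without a live server; `chunked` and the batch size are illustrative names/values of my own:

```python
def chunked(items, size):
    """Yield successive batches of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

# Hypothetical use with pymongo: build one UpdateOne(filter, update,
# array_filters=...) per document in update_dict, then send them as
#   for ops in chunked(requests, 500):
#       collection.bulk_write(ops, ordered=False)
batches = list(chunked(list(range(7)), 3))
print(batches)  # [[0, 1, 2], [3, 4, 5], [6]]
```

This keeps the same per-element array_filters semantics while amortizing the network latency over each batch.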

How to add values using MongoDB aggregation

Is there any way to add values via aggregation, similar to db.insert_one?
x = db.aggregate([{
  "$addFields": {
    "chat_id": -10013345566
  }
}])
I tried this, but the code returns nothing and the values are not updated. I want to add the values via aggregation, because aggregation is faster than the alternatives.
Sample documents:
{"_id": 123, "chat_id": 125}
{"_id": 234, "chat_id": 1325}
{"_id": 1323, "chat_id": 335}
Expected output: the documents updated with the new value. Essentially, what is the alternative to db.insert_one() in a MongoDB aggregation?
You have to make use of the $merge stage to save the output of the aggregation back to the collection.
Note: Be very, very careful when you use the $merge stage, as you can accidentally replace entire documents in your collection. Go through the complete documentation of this stage before using it.
db.collection.aggregate([
  {
    "$match": {
      "_id": 123
    }
  },
  {
    "$addFields": {
      "chat_id": -10013345566
    }
  },
  {
    "$merge": {
      "into": "collection",   // <- collection name
      "on": "_id",            // <- merge operation match key
      "whenMatched": "merge"  // <- operation to perform when matched
    }
  }
])
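Since $merge semantics can be surprising, here is a pure-Python simulation of what this $match + $addFields + $merge pipeline does to the sample documents (not a pymongo call; with whenMatched: "merge", the new fields are folded into the matched document rather than replacing it):

```python
collection = [
    {'_id': 123, 'chat_id': 125},
    {'_id': 234, 'chat_id': 1325},
    {'_id': 1323, 'chat_id': 335},
]

# $match + $addFields: select _id 123 and set the new chat_id on it
output = [{**doc, 'chat_id': -10013345566}
          for doc in collection if doc['_id'] == 123]

# $merge on "_id" with whenMatched "merge": fold each output document's
# fields into the existing document that has the same _id
by_id = {doc['_id']: doc for doc in collection}
for doc in output:
    by_id[doc['_id']] = {**by_id.get(doc['_id'], {}), **doc}
collection = list(by_id.values())

print(collection[0])  # {'_id': 123, 'chat_id': -10013345566}
```

The other two documents are untouched, which is why the $match stage matters: without it, $addFields would rewrite chat_id on every document in the collection.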

Minimizing and aggregating a multi-level dictionary

Given a multi-level dictionary (the number of levels is unknown beforehand), I want to modify this dictionary to be no more than 3 levels deep.
For example, below you can see a 5-level dictionary as input:
{
  K11: {
    K21: {
      K31: {
        K41: VAL41,
        K42: VAL42
      },
      K32: VAL32,
      K33: {
        K43: {
          K51: V51
        }
      }
    }
  }
}
The desired output is the following 3-level dict:
{
  K11: {
    K21: {
      K31.K41: VAL41,
      K31.K42: VAL42,
      K32: VAL32,
      K33.K43.K51: V51
    }
  }
}
Basically, starting from level 4 and deeper, I want to combine the keys and assign them at level 3 (let's assume the combined keys are always unique).
Any idea how to implement such a method? I'm trying to implement a recursive function that keeps digging to the last level and then somehow rebuilds the dictionary backwards, but so far no success.
I'll appreciate it if you can share your thoughts, thanks!
Try using json_normalize from pandas, like this:
from pandas import json_normalize
d = {
    'K11': {
        'K21': {
            'K31': {'K41': 'VAL41', 'K42': 'VAL42'},
            'K32': 'VAL32',
            'K33': {'K43': {'K51': 'V51'}},
        }
    }
}
print(json_normalize(d['K11']['K21'], max_level=2).to_dict('records'))
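Alternatively, the recursive approach the question describes can be implemented directly, without pandas. This sketch (function names are my own) keeps the top two levels intact and flattens everything deeper into dotted keys:

```python
def flatten(d, prefix=''):
    """Collapse a nested dict into a flat dict with dotted keys."""
    out = {}
    for k, v in d.items():
        key = f'{prefix}.{k}' if prefix else k
        if isinstance(v, dict):
            out.update(flatten(v, key))
        else:
            out[key] = v
    return out

def limit_depth(d, keep=2):
    """Keep the top `keep` levels as-is and flatten anything deeper."""
    if keep == 0:
        return flatten(d)
    return {k: limit_depth(v, keep - 1) if isinstance(v, dict) else v
            for k, v in d.items()}

d = {'K11': {'K21': {'K31': {'K41': 'VAL41', 'K42': 'VAL42'},
                     'K32': 'VAL32',
                     'K33': {'K43': {'K51': 'V51'}}}}}
print(limit_depth(d))
# {'K11': {'K21': {'K31.K41': 'VAL41', 'K31.K42': 'VAL42',
#                  'K32': 'VAL32', 'K33.K43.K51': 'V51'}}}
```

As the question assumes, the combined keys must be unique, otherwise later entries would silently overwrite earlier ones in the flattened dict.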

How to group nested fields in mongodb aggregation

My document is a little complicated; it looks like this:
{
  '_id': 1,
  'words': {
    'a': {
      'positions': [1, 2, 3],
      'count': 3
    },
    'and': {
      'positions': [4, 5],
      'count': 2
    }
  }
}
I have many documents that contain the very same words, and I want to aggregate all the fields in words, to give me something like:
{
  'a': 5,   # a's summed count
  'and': 6  # and's summed count
}
I read this article, MongoDB - group composite key with nested fields, but unluckily the structure there is different: they group an array field, not a nested object field.
Any advice? Hoping for your answer and help, thanks in advance.
You can try the following aggregation:
db.col.aggregate([
  {
    $project: {
      wordsAsKeyValuePairs: {
        $objectToArray: "$words"
      }
    }
  },
  {
    $unwind: "$wordsAsKeyValuePairs"
  },
  {
    $group: {
      _id: "$wordsAsKeyValuePairs.k",
      count: { $sum: "$wordsAsKeyValuePairs.v.count" }
    }
  }
])
To aggregate over your fields you need to use $objectToArray to decompose the object into an array of key-value pairs. Then we just $unwind that array to be able to group by k, which is a single word, and sum all the counts.
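What the three stages compute can be mirrored in plain Python; the second sample document below is made up to show the counts summing across documents to the expected {'a': 5, 'and': 6}:

```python
from collections import Counter

docs = [
    {'_id': 1, 'words': {'a': {'positions': [1, 2, 3], 'count': 3},
                         'and': {'positions': [4, 5], 'count': 2}}},
    {'_id': 2, 'words': {'a': {'positions': [7, 8], 'count': 2},
                         'and': {'positions': [1, 2, 3, 4], 'count': 4}}},
]

totals = Counter()
for doc in docs:
    # $objectToArray turns {'a': {...}} into [{'k': 'a', 'v': {...}}, ...];
    # $unwind then emits one pair per document, and $group sums v.count per k
    for word, info in doc['words'].items():
        totals[word] += info['count']

print(dict(totals))  # {'a': 5, 'and': 6}
```

The key insight is the same in both versions: you cannot group on dynamic field names directly, so the object must first be turned into (key, value) pairs.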

MongoDB: find non-existing or specific value

I know about $exists in MongoDB, but I don't know how to combine it with OR logic within find().
I want to find all transactions where the base_currency field either does not exist or has a specific value. At the same time, trade_currency must have a specific value. Here's what I tried, but it doesn't work:
txs = db.transactions.find({
    'base_currency': {'$or': [{'$exists': True}, {'$eq': base_currency}]},
    'trade_currency': currency
}).sort([('datetime_closed', 1)])
You can use $and combined with $or like this:
db.transactions.find({
    "$and": [
        {"trade_currency": currency},
        {"$or": [
            {"base_currency": {"$exists": False}},
            {"base_currency": base_currency}
        ]}
    ]
})
If you want to check that the field is missing, you have to use $exists: false.
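For clarity, the predicate this filter expresses can be restated in plain Python; the sample currencies are made up for illustration:

```python
def matches(doc, base_currency, trade_currency):
    """Mirror of the $and/$or filter: base_currency is missing OR equal,
    AND trade_currency is equal."""
    base_ok = 'base_currency' not in doc or doc['base_currency'] == base_currency
    return base_ok and doc.get('trade_currency') == trade_currency

docs = [
    {'trade_currency': 'BTC'},                          # base missing -> matches
    {'base_currency': 'USD', 'trade_currency': 'BTC'},  # base equal   -> matches
    {'base_currency': 'EUR', 'trade_currency': 'BTC'},  # base differs -> no match
]
print([matches(d, 'USD', 'BTC') for d in docs])  # [True, True, False]
```

This also shows why the original attempt fails: $or must wrap whole field conditions at the top level, not sit inside a single field's value, and $exists must be false to capture the missing-field case.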
