Add dict into Item dynamodb python - python

I have an item registered in DynamoDB with the following structure:
{
    "OwnerID": "12312wqeq",
    "license": "23423werwegdf",
    "MaintenanceList": {
        "10-11-2018": {
            "garage": "lopcars",
            "city": "NY",
            "country": "USA",
            "location": "1929-1927 Fulton St Brooklyn"
        }
    }
}
I need to add a new maintenance entry to the list, and I tried this:
response = table.update_item(
    Key={
        "OwnerID": "12312wqeq",
        "license": "23423werwegdf",
    },
    UpdateExpression="SET #d1 = :dt",
    ExpressionAttributeValues={
        ":dt": {
            "12-11-2019": {
                "garage": "Crazycars",
                "city": "NY",
                "country": "USA",
                "location": "120 E Suffolk Ave Central Islip"
            }
        }
    },
    ExpressionAttributeNames={
        "#d1": "MaintenanceList"
    },
    ReturnValues="UPDATED_NEW"
)
but it overwrites the attribute MaintenanceList, and I need the item to look like this after the update:
{
    "OwnerID": "12312wqeq",
    "license": "23423werwegdf",
    "MaintenanceList": {
        "10-11-2018": {
            "garage": "lopcars",
            "city": "NY",
            "country": "USA",
            "location": "1929-1927 Fulton St Brooklyn"
        },
        "12-11-2019": {
            "garage": "Crazycars",
            "city": "NY",
            "country": "USA",
            "location": "120 E Suffolk Ave Central Islip"
        }
    }
}

The SET MaintenanceList=:dt expression indeed replaces the value of the attribute called MaintenanceList. If you want the content of this attribute to be a hash table and add entries to it, you need to update it using a nested attribute path, as explained in the DynamoDB documentation. For example, do something like SET #d1.#date=:dt.
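For the item in the question, that nested update might look roughly like the sketch below, using the same boto3 Table resource as your snippet, with the new date key and values copied from the desired result:
response = table.update_item(
    Key={
        "OwnerID": "12312wqeq",
        "license": "23423werwegdf",
    },
    # Write to a nested path inside MaintenanceList instead of replacing the whole map
    UpdateExpression="SET #d1.#date = :dt",
    ExpressionAttributeNames={
        "#d1": "MaintenanceList",
        "#date": "12-11-2019",
    },
    ExpressionAttributeValues={
        ":dt": {
            "garage": "Crazycars",
            "city": "NY",
            "country": "USA",
            "location": "120 E Suffolk Ave Central Islip",
        }
    },
    ReturnValues="UPDATED_NEW",
)
Note that SET on a nested path requires the parent MaintenanceList map to already exist on the item, which it does here.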
However, note that keeping a hash table inside a single attribute's value is problematic: the total item size is strictly limited (to 400 KB), and you'll also pay for the entire item size every time you read or write a small part of it.

Related

How to know the last time a product has been updated in MongoDB?

I am working on a project where I want to filter for the products that haven't been updated in 2 months, or since some given date (that is, products that don't have a new item price in the last 2 months or any other date I choose).
I want to write the script in Python.
To access a collection I do mongo_client[db_name][coll_name] and then I normally use .find() or .aggregate().
All my documents are JSON that follow this structure:
{
    "_id" : ObjectId("6188f511091533324af78fbf"),
    "market" : "x",
    "product" : "apple",
    "item_price_history" : [
        {
            "item_price" : 219.0,
            "date" : ISODate("2021-04-08T15:30:43.000Z")
        },
        {
            "item_price" : 248.0,
            "date" : ISODate("2021-04-22T08:02:28.000Z")
        }
    ]
}
Do you have any idea of how I can do that? I use the latest version of Python and Robo 3T 1.4.
Thanks in advance.
You can look at the data in item_price_history to check when that field was last updated, but you don't seem to have a way to track when the other fields were updated.
Going forward, you could try adding a pre-save hook to store the last-updated datetime if you're using an ODM like MongoEngine.
Refer to the pre_save method here.
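A minimal sketch of that pre-save approach with MongoEngine signals follows; the Product class, the last_updated field, and the database name are assumptions for illustration, not part of your schema:
from datetime import datetime
from mongoengine import Document, StringField, DateTimeField, connect, signals

connect("mydb")  # assumed database name

class Product(Document):
    # Assumed document class mirroring the structure in the question
    market = StringField()
    product = StringField()
    last_updated = DateTimeField()

def update_timestamp(sender, document, **kwargs):
    # Stamp the document just before every save
    document.last_updated = datetime.utcnow()

signals.pre_save.connect(update_timestamp, sender=Product)
The standalone snippet below takes a different approach and simply sorts an already-fetched document's price history by date: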
import dateutil.parser

document = {
    "market" : "x",
    "product" : "apple",
    "item_price_history" : [
        {
            "item_price" : 219.0,
            "date" : "2021-04-08T15:30:43.000Z"
        },
        {
            "item_price" : 248.0,
            "date" : "2021-04-22T08:02:28.000Z"
        }
    ]
}

# Find the list value inside the document (here, item_price_history)
list_to_sort = None
for item in document.values():
    if isinstance(item, list):
        list_to_sort = item

def myfunc(item):
    # Sort key: the entry's date string parsed into a timestamp
    time = dateutil.parser.parse(item["date"])
    return time.timestamp()

list_to_sort.sort(key=myfunc)
print(list_to_sort)
This will sort the item_price_history list in place using the custom key function myfunc.
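To do the filtering on the server side instead, a query along these lines should find the products that have no price entry newer than a cutoff date; the database and collection names are placeholders, and "2 months" is approximated as 60 days:
from datetime import datetime, timedelta
import pymongo

client = pymongo.MongoClient("mongodb://localhost")
coll = client["mydb"]["products"]  # placeholder database/collection names

cutoff = datetime.utcnow() - timedelta(days=60)

# Match documents in which no item_price_history entry has a date on or after the cutoff
stale_products = coll.find({
    "item_price_history": {
        "$not": {"$elemMatch": {"date": {"$gte": cutoff}}}
    }
})
for product in stale_products:
    print(product["market"], product["product"])
Note that this also matches documents that have no item_price_history at all, which may or may not be what you want.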

Extracting and updating a dictionary from an array of dictionaries in MongoDB

I have a structure like this:
{
    "id" : 1,
    "user" : "somebody",
    "players" : [
        {
            "name" : "lala",
            "surname" : "baba",
            "player_place" : "1",
            "start_num" : "123",
            "results" : {
                "1" : { ... },
                "2" : { ... },
                ...
            }
        },
        ...
    ]
}
I am pretty new to MongoDB and I just cannot figure out how to extract results for a specific user (in this case "somebody", but there are many other users and each has an array of players and each player has many results) for a specific player with start_num.
I am using pymongo and this is the code I came up with:
record = collection.find(
    {'user' : name},
    {'players' : {'$elemMatch' : {'start_num' : start_num}}, '_id' : False}
)
This extracts the matching player for a given user. That is good, but now I need to get a specific result from results, something like this:
{ 'results' : { '2' : { ... } } }.
I tried:
record = collection.find(
    {'user' : name},
    {'players' : {'$elemMatch' : {'start_num' : start_num}}, 'results' : result_num, '_id' : False}
)
but that, of course, doesn't work. I could just turn the result into a list in Python and extract what I need, but I would like to do it with a query in Mongo.
Also, what would I need to do to replace a specific result in results for a specific player of a specific user? Let's say I have a new result with key 2 and I want to replace the existing result that has key 2. Can I do it with the same query as for find() (just replacing the method find with replace or find_and_replace)?
You can replace a specific result, and the syntax for that should be something like this, assuming you want to replace the result with key 1:
collection.updateOne(
    {
        "user": name,
        "players.start_num": start_num
    },
    { $set: { "players.$.results.1" : new_result } }
)
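Since the question uses pymongo, the equivalent call there would be roughly the following, with name, start_num and new_result as in the snippets above:
collection.update_one(
    {"user": name, "players.start_num": start_num},
    {"$set": {"players.$.results.1": new_result}}
)
The positional $ operator refers to the first array element matched by the filter, so the filter must keep the players.start_num condition.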

Correctly referencing a JSON in Python. Strings versus Integers and Nested items

Sample JSON file below
{
    "destination_addresses" : [ "New York, NY, USA" ],
    "origin_addresses" : [ "Washington, DC, USA" ],
    "rows" : [
        {
            "elements" : [
                {
                    "distance" : {
                        "text" : "225 mi",
                        "value" : 361715
                    },
                    "duration" : {
                        "text" : "3 hours 49 mins",
                        "value" : 13725
                    },
                    "status" : "OK"
                }
            ]
        }
    ],
    "status" : "OK"
}
I'm looking to reference the text value for distance and duration. I've done some research but I'm still not sure what I'm doing wrong.
I have a workaround using several lines of code, but I'm looking for a clean one-line solution.
Thanks for your help!
If you're using the regular JSON module:
import json
And you're opening your JSON like this:
json_data = open("my_json.json").read()
data = json.loads(json_data)
# Equivalent to:
data = json.load(open("my_json.json"))
# Notice json.load vs. json.loads
Then this should do what you want:
distance_text, duration_text = [data['rows'][0]['elements'][0][key]['text'] for key in ['distance', 'duration']]
Hope this is what you wanted!

How to store facepy data into mongodb?

I am using this approach to get the comments on page data. It's working fine, but I need to dump the data into MongoDB. Using this approach the data is inserted, but as a single document. I want every comment to be stored as a separate document with the information I am getting from the API.
from facepy import GraphAPI
import json
import pymongo

connection = pymongo.MongoClient("mongodb://localhost")
facebook = connection.facebook
commen = facebook.comments
access = ''
#message
graph = GraphAPI(access)
page_id = 'micromaxinfo'
datas = graph.get(page_id + '/posts?fields=comments,created_time', page=True, retry=5)
posts = []
for data in datas:
    print(data)
    commen.insert(data)
    break
Output Stored in MongoDB:
{
    "created_time" : "2015-11-04T08:04:14+0000",
    "id" : "120735417936636_1090909150919253",
    "comments" : {
        "paging" : {
            "cursors" : {
                "after" : "WTI5dGJXVnVkRjlqZFhKemIzSTZNVEE1TVRReE5ESTVOelV6TlRRd05Ub3hORFEyTnpFNU5UTTU=",
                "before" : "WTI5dGJXVnVkRjlqZFhKemIzSTZNVEE1TURrd09UVTRNRGt4T1RJeE1Eb3hORFEyTmpJME16Z3g="
            }
        },
        "data" : [
            {
                "created_time" : "2015-11-04T08:06:21+0000",
                "message" : "my favorite mobiles on canvas silver",
                "from" : {
                    "name" : "Velchamy Alagar",
                    "id" : "828304797279948"
                },
                "id" : "1090909130919255_1090909580919210"
            },
            {
                "created_time" : "2015-11-04T08:10:13+0000",
                "message" : "Micromax mob. मैने कुछ दिन पहले Micromax Bolt D321 mob. खरिद लिया | Bt मेरा मोबा. बहुत गरम होता है Without internate. और internate MB कम समय मेँ ज्यादा खर्च होती है | कोई तो help करो.",
                "from" : {
                    "name" : "Amit Gangurde",
                    "id" : "1637669796485258"
                },
                "id" : "1090909130919255_1090910364252465"
            },
            {
                "created_time" : "2015-11-04T08:10:27+0000",
                "message" : "Nice phones.",
                "from" : {
                    "name" : "Nayan Chavda",
                    "id" : "1678393592373659"
                },
                "id" : "1090909130919255_1090910400919128"
            },
            {
                "created_time" : "2015-11-04T08:10:54+0000",
                "message" : "sir micromax bolt a089 mobile ki battery price kitna. #micromax mobile",
                "from" : {
                    "name" : "Arit Singha Roy",
                    "id" : "848776351903695"
                },
So technically I want to store only the information coming in the data field:
{
    "created_time" : "2015-11-04T08:10:54+0000",
    "message" : "sir micromax bolt a089 mobile ki battery price kitna. #micromax mobile",
    "from" : {
        "name" : "Arit Singha Roy",
        "id" : "848776351903695"
    }
}
How to get this into my database?
You can use the Pentaho Data Integration open-source ETL tool for this. I use it to store specific fields from the JSON output for tweets.
Select the fields you want to parse from the JSON and select an output such as CSV or a table output in Oracle, etc.
Hope this helps.
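If you would rather stay in Python than add an ETL tool, a sketch along these lines should do it, reusing datas and commen from the snippet above: pull the list out of each post's comments.data field and insert every comment as its own document. The post_id field is something I'm adding to keep a link back to the parent post, and insert_one assumes pymongo 3+ (the older insert also works):
for post in datas:
    post_id = post.get('id')
    comments = post.get('comments', {}).get('data', [])
    for comment in comments:
        comment['post_id'] = post_id   # keep a reference to the parent post
        commen.insert_one(comment)     # one MongoDB document per comment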

How to Query this in MongoDB?

My items are stored in MongoDB like this:
{"ProductName":"XXXX",
"Catalogs" : [
{
"50008064" : "Apple"
},
{
"50010566" : "Box"
},
{
"50016422" : "Water"
}
]}
Now I want to query all the items that belong to catalog 50008064. How can I do that?
(The catalog id is "50008064", the catalog name is "Apple".)
You cannot query this in an efficient manner and performance will decrease as your data grows. As such I would consider it a schema bug, and you should refactor/migrate to the following model, which does allow for indexing:
{"ProductName":"XXXX",
"Catalogs" : [
{
id : "50008064",
value : "Apple"
},
{
id : "50010566",
value : "Box"
},
{
id : "50016422",
value : "Water"
}
]}
And then index:
ensureIndex({'Catalogs.id':1})
Again, I strongly suggest you change your schema as this is a potential performance bottleneck you cannot fix any other way.
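With the refactored schema the lookup becomes a plain indexed equality match. In pymongo it would look something like the sketch below; the client URL and the database and collection names are assumptions, not something from the question:
import pymongo

client = pymongo.MongoClient("mongodb://localhost")
products = client["mydb"]["products"]  # assumed database/collection names

# Index the embedded id field, then query by it
products.create_index([("Catalogs.id", 1)])
matching = products.find({"Catalogs.id": "50008064"})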
This should probably work according to the entry here, although it won't be very fast, as stated in the link.
db.products.find({ "Catalogs.50008064" : { $exists: true } } )
