How to scrape the page inside a result card using BeautifulSoup? - python

<a href="https://magicpin.in/New-Delhi/Laxmi-Nagar/Restaurant/Bangalore-Ki-Famous-Biriyani/store/317630/?utm_source=search" data-type="around-merchant-card" onclick="sendEvent('web_searchlandingpage', 'click_search_merchant_card');">
<div class="merchant-logo-holder lazyloaded" data-bg="https://lh3.googleusercontent.com/UE8HJOOmV0hZsCbxbwQyha8UNa2ETY3-3zExps_mxUi-nbxVkgEBXo0kxghMHOUp87pge5tqIGrsU9Pu-iJ3h0_IETE=w64" style="background-image: url("https://lh3.googleusercontent.com/UE8HJOOmV0hZsCbxbwQyha8UNa2ETY3-3zExps_mxUi-nbxVkgEBXo0kxghMHOUp87pge5tqIGrsU9Pu-iJ3h0_IETE=w64");" title=", Laxmi Nagar, New Delhi"></div>
</a>
<div class="merchant-name-address">
<a href="https://magicpin.in/New-Delhi/Laxmi-Nagar/Restaurant/Bangalore-Ki-Famous-Biriyani/store/317630/?utm_source=search" data-type="around-merchant-card" onclick="sendEvent('web_searchlandingpage', 'click_search_merchant_card');">
<h3 class="merchant-name">
Bangalore Ki Famous Biriyani
</h3>
<h4 class="merchant-location">
Laxmi Nagar
, New Delhi
</h4>
</a>
<!--
<a href="" data-type="merchant_card_links" data-target="subcategory">
<h5 class="subcategory-tag"> </h5>
</a>
-->
</div>
<div class="rating" style="background-color: #8bcc00; border-color: #8bcc00;">
<img src="https://static.magicpin.com/samara/static/images/merchant/star-white.svg" class="star" onerror="this.onerror=null;this.alt='';recordBrokenImages(this,false,4);">
<span class="rating-value">3.7</span>
</div>
<section class="merchant-details">
<div class="cft-timing">
<article class="detail-heading cft-heading">Average Spent: </article>
<span class="detail-value">₹500</span>
</div>
<div class="merchant-attributes">
<div class="cover-holder">
<a href="https://magicpin.in/New-Delhi/Laxmi-Nagar/Restaurant/Bangalore-Ki-Famous-Biriyani/store/317630/?utm_source=search" data-type="around-merchant-card" onclick="sendEvent('web_searchlandingpage', 'click_search_merchant_card');">
<div class="merchant-cover lazyloaded" data-bg="https://lh3.googleusercontent.com/W70wpIcovlvssmLSpcyub4RjHABennVNRWxznclaxD7PEcZhdJrgygOwn5qJ3XrlYq9Yv90k-w2Ld0lTfaylBxIGmw=w512" style="background-image: url("https://lh3.googleusercontent.com/W70wpIcovlvssmLSpcyub4RjHABennVNRWxznclaxD7PEcZhdJrgygOwn5qJ3XrlYq9Yv90k-w2Ld0lTfaylBxIGmw=w512");" title=", Laxmi Nagar, New Delhi">
</div>
</a>
</div>
<div class="details">
<article class="detail-heading">Highlights: </article>
<span class="detail-value">
<span class="comma-separator">
Magic Weekend 14 16
</span>
<span class="comma-separator">
Mw 20200123
</span>
</span>
</div>
</div>
</section>
<div class="merchant-card-actions">
<div class="action claim-deal-button-react show-mb" data-sku="food_1158872_other" data-merchantname="Bangalore Ki Famous Biriyani" data-dealid="1164110">
<p class="cashback">
CASHBACK<span>upto 10.0% OFF</span>
</p>
</div>
<a class="cashback hide-mb action" target="_blank" href="https://magicpin.in/deal/?dealId=1164110&userId=5290338">
CASHBACK<span>upto 10.0% OFF</span>
</a>
</div>
</div>
page URL - https://magicpin.in/Delhi/search/?dist=10&query=biriyani&rt=3
This page contains some restaurant cards. While scraping the page in a loop, I want to go inside each restaurant card and scrape the number of reviews from it, but I don't know how to do that. I used this code to scrape the front page:
import requests
from bs4 import BeautifulSoup
import pandas as pd

url = "https://magicpin.in/Delhi/search/?dist=10&query=biriyani&rt=3" # URL of the website
header = {'User-Agent':'Mozilla/5.0 (X11; CrOS x86_64 8172.45.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.64 Safari/537.36'} # Temporary user agent
r = requests.get(url, headers=header)
soup = BeautifulSoup(r.content, 'html.parser')

divs = soup.find_all('div', class_="merchant-card-single")
for item in divs:
    title = item.find('h3').text.strip() # restaurant name
    loc = item.find('h4', class_="merchant-location").text.strip() # restaurant location
    try: # used try/except because some restaurants are unrated and scraping those would raise an error
        rating = item.find('div', class_="rating").text
    except:
        rating = None
    pricce = item.find('div', class_="cft-timing").text.strip() # price for biriyani
    biry_del = {
        'name': title,
        'location': loc,
        'rating': rating,
        'price': price
    }
    rest_list.append(biry_del)
I hope that makes sense; please ask in the comments if anything is unclear.

Note: The code in your question is not valid - variables are not defined or have typos, and there is no expected result - so this is just a hint in the right direction.
What happens?
Take a look into your soup - there is no <div> that contains the rating.
How to fix?
Select more specifically - the rating is stored in a <span> with class rating-value:
rating = item.find('span', class_="rating-value").text
EDIT
Based on your comment you want to switch to the details page and do things there - just grab the URL from the <a> and perform another request:
url = item.find('a').get('href')
detailsSoup = BeautifulSoup(requests.get(url).text, 'html.parser')
### look into your detailsSoup to find what you are searching for ...
To grab the rating details it would be better to use the existing API - extract the merchant id from the URL and grab the JSON data:
url = item.find('a').get('href')
mid = url.split('/')[-2]
ratings = requests.get(f'https://magicpin.in/sam-api/merchants/get_merchant_reviews/?merchantUserId={mid}').json()['data']
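If only the number of reviews is needed rather than the full review objects, the length of that list can be used - a small sketch, assuming the endpoint returns all reviews for the merchant in one response:

review_count = len(ratings)  # number of review entries returned for this merchant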
Example
import requests
from bs4 import BeautifulSoup
import pandas as pd

url = "https://magicpin.in/Delhi/search/?dist=10&query=biriyani&rt=3" # URL of the website
header = {'User-Agent':'Mozilla/5.0 (X11; CrOS x86_64 8172.45.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.64 Safari/537.36'} # Temporary user agent
r = requests.get(url, headers=header)
soup = BeautifulSoup(r.content, 'html.parser')

rest_list = []
divs = soup.find_all('div', class_="merchant-card-single")
for item in divs:
    title = item.find('h3').text.strip() # restaurant name
    loc = item.find('h4', class_="merchant-location").text.strip() # restaurant location
    try: # used try/except because some restaurants are unrated and scraping those would raise an error
        rating = item.find('span', class_="rating-value").text
    except:
        rating = None
    price = item.find('div', class_="cft-timing").text.strip() # price for biriyani
    url = item.find('a').get('href')
    mid = url.split('/')[-2]
    ratings = requests.get(f'https://magicpin.in/sam-api/merchants/get_merchant_reviews/?merchantUserId={mid}').json()['data']
    biry_del = {
        'name': title,
        'location': loc,
        'rating': rating,
        'ratings': ratings,
        'price': price
    }
    rest_list.append(biry_del)

print(rest_list)
Output
[{'name': 'BBC, Best Biriyani & Chicken', 'location': 'Satyaniketan\n , New Delhi', 'rating': '4.4', 'ratings': [{'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2021-12-02T23:17:47+05:30', 'post_id': 945109867, 'review': '', 'user': {'name': 'Gulzar', 'vanity': '2 Followers', 'badge': None, 'total_visits': 13, 'total_spent': 17641, 'user_id': '10663391', 'description': '', 'deeplink': 'magicpin://profileuser?userId=10663391', 'profile_pic': 'https://lh3.googleusercontent.com/V-R22m0t7GAJGn-bOhzjWXywVWk-GgwT75aBGh5Bq3ZXku8npkIGbQaBCKgnkCwuDFXRlrmbOBhEOTNu3-tXRwjdJOqfw_D_7OSzqDk8=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2021-08-03T14:49:12+05:30', 'post_id': 940277857, 'review': '', 'user': {'name': 'Dattatray Aadhav', 'vanity': '31 Followers', 'badge': None, 'total_visits': 318, 'total_spent': 636061, 'user_id': '6194299', 'description': '', 'deeplink': 'magicpin://profileuser?userId=6194299', 'profile_pic': 'https://lh3.googleusercontent.com/5eA-UBjZ6PBwDD2Pgb15xREya5D9Mma4PEU2Ka4G9MDMR1qwNKGrJO5Z6FMT63SplMo_o8V7xWYTg_oMdh3PeiC1bXUr=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2021-08-03T14:46:52+05:30', 'post_id': 940277730, 'review': "Didn't like the promotions & discounts at this place", 'user': {'name': 'Anshika', 'vanity': '9 Followers', 'badge': None, 'total_visits': 56, 'total_spent': 28936, 'user_id': '9555088', 'description': '', 'deeplink': 'magicpin://profileuser?userId=9555088', 'profile_pic': 'https://lh3.googleusercontent.com/YXgpyigxHNH0OMNnOsYerRxfmHAxxDYd9nsxzSveKwEAV8wrJUYmwabrgu0Ah_GMd7Mup26WjbWtPu-kkTcby6ddCoPQZ1aOCioi1wY6=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 2, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2021-08-03T14:48:56+05:30', 'post_id': 940277460, 'review': 'Loved the quality of products, safety precautions, variety of options, and location of store at this place', 'user': {'name': 'Priyank Jain', 'vanity': '118 Followers', 'badge': None, 'total_visits': 389, 'total_spent': 143081, 'user_id': '1140253', 'description': '', 'deeplink': 'magicpin://profileuser?userId=1140253', 'profile_pic': 'https://lh3.googleusercontent.com/nL4eRFh8Hi-b59CP4cL2cO57olZLxnL7yTiZ6xpoNLG67FQsGtEm-eTrKE9EaZ_cRtcR2TbldWxtqU2oG8Ob0x7Aag=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2021-08-03T14:46:56+05:30', 'post_id': 940277155, 'review': 'Loved the quality of products, safety precautions, variety of options, location of store, and promotions & discounts at this place', 'user': {'name': 'Naman', 'vanity': '5 Followers', 'badge': None, 'total_visits': 105, 'total_spent': 65821, 'user_id': '6148749', 'description': '19MaleChill', 'deeplink': 'magicpin://profileuser?userId=6148749', 'profile_pic': 
'https://lh3.googleusercontent.com/C5UMTYVWkz1mNFUKWDfYKtCKDHoSUIrHsifJ7kfWrmJUBxjjyLW1lS0gOs4H31dekHdAXESRwN8cnqKVvxbY4P8GpRihVnZflujJfAjS=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2021-01-28T11:56:12+05:30', 'post_id': 932109603, 'review': 'Loved the range of products, delivery time, and promotions & discounts at this place', 'user': {'name': 'sunil', 'vanity': '', 'badge': None, 'total_visits': 2, 'total_spent': 586, 'user_id': '10156497', 'description': '', 'deeplink': 'magicpin://profileuser?userId=10156497', 'profile_pic': '', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-18T16:31:55+05:30', 'post_id': 929075104, 'review': '', 'user': {'name': 'Vrj', 'vanity': '', 'badge': None, 'total_visits': 11, 'total_spent': 34353, 'user_id': '7455242', 'description': '', 'deeplink': 'magicpin://profileuser?userId=7455242', 'profile_pic': '', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 4, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-18T16:32:42+05:30', 'post_id': 929074944, 'review': '', 'user': {'name': 'Vaibhav Deshatavar', 'vanity': '2 Followers', 'badge': None, 'total_visits': 26, 'total_spent': 7571, 'user_id': '4271349', 'description': '', 'deeplink': 'magicpin://profileuser?userId=4271349', 'profile_pic': 'https://lh3.googleusercontent.com/LbU80bZNxLLQ_bhc2VqWc6CJXHyqcNCLpBg5YBKMFWogWepbIyen3rdQIZx6WRZjT5l1OR63OLpZOuSVnz2TGXpP4pY=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': 'Niceeee ', 'review_date': '2020-12-18T12:38:07+05:30', 'post_id': 929056653, 'review': 'Loved the range of products, quality of products, delivery time, safe packaging, and promotions & discounts at this place', 'user': {'name': 'JAGGU RAJ', 'vanity': '', 'badge': None, 'total_visits': 3, 'total_spent': 3877, 'user_id': '9567167', 'description': '', 'deeplink': 'magicpin://profileuser?userId=9567167', 'profile_pic': '', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-18T09:44:06+05:30', 'post_id': 929047869, 'review': '', 'user': {'name': 'Ramya', 'vanity': '', 'badge': None, 'total_visits': 40, 'total_spent': 307540, 'user_id': '7769558', 'description': '', 'deeplink': 'magicpin://profileuser?userId=7769558', 'profile_pic': '', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-18T09:43:13+05:30', 'post_id': 929047672, 'review': '', 'user': {'name': 'Hemraj', 'vanity': '', 'badge': None, 'total_visits': 9, 'total_spent': 7636, 'user_id': '9840082', 'description': '', 'deeplink': 'magicpin://profileuser?userId=9840082', 'profile_pic': '', 'is_following': 
False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-18T09:41:35+05:30', 'post_id': 929046779, 'review': 'Loved the quality of service, ambience, safety precautions, and taste at this place', 'user': {'name': 'Rahul', 'vanity': '1 Followers', 'badge': None, 'total_visits': 17, 'total_spent': 6594, 'user_id': '8723539', 'description': '', 'deeplink': 'magicpin://profileuser?userId=8723539', 'profile_pic': '', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-17T20:44:06+05:30', 'post_id': 929030684, 'review': 'Loved the quality of products and delivery time at this place', 'user': {'name': 'Mohit', 'vanity': '', 'badge': None, 'total_visits': 4, 'total_spent': 1086, 'user_id': '9878699', 'description': '', 'deeplink': 'magicpin://profileuser?userId=9878699', 'profile_pic': '', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T22:12:24+05:30', 'post_id': 928845159, 'review': 'Loved the quality of products, safety precautions, variety of options, location of store, and promotions & discounts at this place', 'user': {'name': 'Vishal Sorap', 'vanity': '11 Followers', 'badge': None, 'total_visits': 39, 'total_spent': 44997, 'user_id': '2379680', 'description': '', 'deeplink': 'magicpin://profileuser?userId=2379680', 'profile_pic': '', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T22:06:28+05:30', 'post_id': 928844059, 'review': '', 'user': {'name': 'Harshal Rane', 'vanity': '58 Followers', 'badge': None, 'total_visits': 286, 'total_spent': 1263268, 'user_id': '1615250', 'description': '', 'deeplink': 'magicpin://profileuser?userId=1615250', 'profile_pic': 'https://lh3.googleusercontent.com/9eADr5gEJeWHS17inAcfrbke3k-aYBzm-3f6JoP2Kzkljf2sP6-fHE5hxbMw7EI7Hk3q8eZa-gGE1Zes_bXLNDcrpuc=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 3, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': 'Everything \n', 'review_date': '2020-12-14T22:12:22+05:30', 'post_id': 928843503, 'review': 'Loved the quality, service, safety precautions, and promotions at this place', 'user': {'name': 'Aman Bhagat', 'vanity': '28 Followers', 'badge': None, 'total_visits': 48, 'total_spent': 33812, 'user_id': '6589071', 'description': '', 'deeplink': 'magicpin://profileuser?userId=6589071', 'profile_pic': 'https://lh3.googleusercontent.com/jtEV_8v2CjIcrJTPLhL_me_flmHxOwrCesP6Z7J24Y-jvYWWfVvOjUcCCUZ2cvVD55wtUGtn9YupoqSuFyttXGaC6aw=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T22:08:00+05:30', 'post_id': 928842848, 'review': 'Loved the range of products, quality of products, and delivery time 
at this place', 'user': {'name': 'Ashok Kumar Baghel', 'vanity': '', 'badge': None, 'total_visits': 79, 'total_spent': 78554, 'user_id': '9374043', 'description': '', 'deeplink': 'magicpin://profileuser?userId=9374043', 'profile_pic': '', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T22:05:44+05:30', 'post_id': 928841967, 'review': '', 'user': {'name': 'Akash dutt', 'vanity': '5 Followers', 'badge': None, 'total_visits': 196, 'total_spent': 711777, 'user_id': '7182223', 'description': '', 'deeplink': 'magicpin://profileuser?userId=7182223', 'profile_pic': '', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T22:07:11+05:30', 'post_id': 928841156, 'review': 'Loved the quality of service, ambience, safety precautions, taste, and price, promotions, & discounts at this place', 'user': {'name': 'Raunak', 'vanity': '44 Followers', 'badge': None, 'total_visits': 250, 'total_spent': 81010, 'user_id': '9210632', 'description': 'I am no one interesting', 'deeplink': 'magicpin://profileuser?userId=9210632', 'profile_pic': 'https://lh3.googleusercontent.com/-RVyHuI5_SABCC0LtLtV2oSCYmTyAJWJ-bv9ZX0lQY7tv6cdJIK0sW7BYTni2Pd6lv7Bb-Te1COJcQa9-2uWf3nkTcY=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T21:57:57+05:30', 'post_id': 928840295, 'review': 'Loved the variety of options, safety precautions, helpful staff, and price, promotions, & discounts at this place', 'user': {'name': 'Prashanth', 'vanity': '8 Followers', 'badge': None, 'total_visits': 75, 'total_spent': 374044, 'user_id': '9821253', 'description': '', 'deeplink': 'magicpin://profileuser?userId=9821253', 'profile_pic': '', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T22:02:35+05:30', 'post_id': 928840224, 'review': 'Loved the range of products at this place', 'user': {'name': 'Sushant P', 'vanity': '39 Followers', 'badge': None, 'total_visits': 407, 'total_spent': 358857, 'user_id': '4789975', 'description': '', 'deeplink': 'magicpin://profileuser?userId=4789975', 'profile_pic': 'https://lh3.googleusercontent.com/_AWRlRrTsNUdZDKyB4mTeFK99T2gM29ScDGVt8j1C5NemAprx_gw3OofqS3_cm4cELfpWWzIie55YYHYHTpLEQ2rpqs=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 4, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': 'Amazon is best ', 'review_date': '2020-12-14T21:55:10+05:30', 'post_id': 928839174, 'review': 'Loved the delivery time and promotions & discounts at this place', 'user': {'name': 'Sunny', 'vanity': '6 Followers', 'badge': None, 'total_visits': 24, 'total_spent': 40934, 'user_id': '9768929', 'description': '', 'deeplink': 'magicpin://profileuser?userId=9768929', 'profile_pic': 
'https://lh3.googleusercontent.com/dDT5_ubc0MRR6y541WX6kGbqBBcm7xqeFxBPRUZgTO-x8hLF7fEzEkKzgwhq1r8wPDck-xtZCj99PyrO_i8xQgCA_-KyVV00kS31wBfS=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T22:01:07+05:30', 'post_id': 928838737, 'review': 'Loved the quality of service, ambience, and taste at this place', 'user': {'name': 'Sohil Merchant', 'vanity': '5 Followers', 'badge': None, 'total_visits': 199, 'total_spent': 98483, 'user_id': '9222147', 'description': '', 'deeplink': 'magicpin://profileuser?userId=9222147', 'profile_pic': '', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T21:52:45+05:30', 'post_id': 928838145, 'review': '', 'user': {'name': 'pinki', 'vanity': '1 Followers', 'badge': None, 'total_visits': 7, 'total_spent': 3207, 'user_id': '9836153', 'description': '', 'deeplink': 'magicpin://profileuser?userId=9836153', 'profile_pic': '', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T22:10:12+05:30', 'post_id': 928838005, 'review': '', 'user': {'name': 'Sudhanshu Shekhar', 'vanity': '3 Followers', 'badge': None, 'total_visits': 78, 'total_spent': 522579, 'user_id': '8217943', 'description': '', 'deeplink': 'magicpin://profileuser?userId=8217943', 'profile_pic': 'https://lh3.googleusercontent.com/PduS8X6YNq_gCRe_1sQ0zi6LFDalhGoeDeV_M7BJg_szWV7TLeqm2QotxNMczJDjLEQzJjTq7H3tAEtHLiDobBBguLBc=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 2, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T22:01:58+05:30', 'post_id': 928837918, 'review': 'Loved the ambience and taste at this place', 'user': {'name': 'Sandip Raul', 'vanity': '43 Followers', 'badge': None, 'total_visits': 87, 'total_spent': 257686, 'user_id': '3179212', 'description': '', 'deeplink': 'magicpin://profileuser?userId=3179212', 'profile_pic': 'https://lh3.googleusercontent.com/3nl1P7BGqdGiVdEGzjJ24SnV-aPDSe8KUv38d_kvo6G82NefqsVvlMN4l9AbDPhldMlWDN7JUgjzQvp9gKNFr9Sz1jt7K4nskEOIkXU=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T21:58:06+05:30', 'post_id': 928837904, 'review': 'Loved the quality of products, safety precautions, variety of options, location of store, and promotions & discounts at this place', 'user': {'name': 'Killer', 'vanity': '7 Followers', 'badge': None, 'total_visits': 28, 'total_spent': 21599, 'user_id': '7412507', 'description': '', 'deeplink': 'magicpin://profileuser?userId=7412507', 'profile_pic': 'https://lh3.googleusercontent.com/_fhBywD7kvez24utn17OhvtbYK-jMvnL9frdFLPxV_vJk2lILIkLE6FRxW58mfTZRBHNB0im-nSGE6GSzLJ3ULJMsgzC=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 
'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T22:05:34+05:30', 'post_id': 928837839, 'review': 'Loved the range of products, quality of products, delivery time, safe packaging, and promotions & discounts at this place', 'user': {'name': 'Axat', 'vanity': '', 'badge': None, 'total_visits': 26, 'total_spent': 22707, 'user_id': '9571962', 'description': '', 'deeplink': 'magicpin://profileuser?userId=9571962', 'profile_pic': '', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T21:52:34+05:30', 'post_id': 928837057, 'review': '', 'user': {'name': 'Manna11', 'vanity': '', 'badge': None, 'total_visits': 3, 'total_spent': 1209, 'user_id': '9840078', 'description': '', 'deeplink': 'magicpin://profileuser?userId=9840078', 'profile_pic': '', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 4, 'reply_date': '0001-01-01T00:00:00Z'}, {'rating_text': 'Outlet Rating', 'reply': '', 'merchant': None, 'review_title': '', 'review_date': '2020-12-14T22:10:41+05:30', 'post_id': 928836452, 'review': 'Loved the range of products at this place', 'user': {'name': 'Rushikesh Ingale', 'vanity': '23 Followers', 'badge': None, 'total_visits': 116, 'total_spent': 217734, 'user_id': '6911448', 'description': '', 'deeplink': 'magicpin://profileuser?userId=6911448', 'profile_pic': 'https://lh3.googleusercontent.com/-fb2w-cH1_0rjjoK2GYQDYr1tu_SAF_NyLN7eUHvTATCwO9fP6QEaV_wd6blNamjhJ5radva3axs03SDO46HVhYosPI=s120', 'is_following': False}, 'media_list': [{'image_url': '', 'aspect_ratio': 1}], 'rating': 5, 'reply_date': '0001-01-01T00:00:00Z'}], 'price': 'Average Spent: \n₹350'}]
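Since pandas is already imported but not used, the collected list can also be loaded into a DataFrame for further work - an optional sketch, assuming every card yielded a 'ratings' list:

df = pd.DataFrame(rest_list)                    # one row per restaurant card
df['review_count'] = df['ratings'].apply(len)   # number of reviews pulled from the API
print(df[['name', 'location', 'rating', 'review_count', 'price']])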

Related

my list of dictionaries is not correctly filled

I'm trying to work with a list of dictionaries and I have a problem.
Here is my code:
for i in range(len(self.dict_trie_csv_infos)):
    self.template_json.append(self.template_json[0])

# add values to json
for i in range(len(self.dict_trie_csv_infos)):
    self.template_json[i]['id'] = self.dict_trie_csv_infos[i]['id']
    print(self.template_json[i]['id'])
    self.template_json[i]['name'] = self.dict_trie_csv_infos[i]['name']
    self.template_json[i]['title'] = self.dict_trie_csv_infos[i]['event title']
    self.template_json[i]['startDateTime'] = self.dict_trie_csv_infos[i]
    # fill the smart contract now
    self.template_json[i]['ticketCollection'][0]['collectionName'] = self.dict_trie_csv_infos[i]['smart_contract']['collectionName']
    self.template_json[i]['ticketCollection'][0]['endTime'] = self.dict_trie_csv_infos[i]['smart_contract']['sale_params']['end_time']
    self.template_json[i]['ticketCollection'][0]['pricePerToken'] = self.dict_trie_csv_infos[i]['smart_contract']['sale_params']['price_per_token']
    self.template_json[i]['ticketCollection'][0]['totalTicketsCount'] = ''
    self.template_json[i]['ticketCollection'][0]['soldTicketsCount'] = ''
print(self.template_json[0])
The first print prints 1, then 2, then 3... but self.template_json[0] prints my last element, and print(self.template_json) gives me the same result every time:
{'id': '4', 'name': 'Stade de France', 'title': 'Coldplay World Tour', 'startDateTime': 1651615200, 'endDateTime': 1659045600, 'address': '', 'locationName': '93200 Saint-Denis', 'totalTicketsCount': '10000', 'assetUrl': 'https://coldplay.com/coldplay_asset.mp4', 'lineup': [''], 'ticketCollection': [{'collectionName': 'KT2jH58CPkYBe3bRuTCET6A4NhnosX2BAnp9', 'scAddress': '', 'collectionAddress': '', 'pricePerToken': 50, 'maxMintPerUser': '', 'saleSize': '', 'endTime': 1659045600, 'totalTicketsCount': '', 'soldTicketsCount': ''}], 'adress': 'KT1ffDxCJH9EPimNm19ifBEgG9bFRgptJwop'}
Here is my template_json before the loop:
[{'id': '', 'name': '', 'title': '', 'startDateTime': '', 'endDateTime': '', 'address': '', 'locationName': '', 'totalTicketsCount': '', 'assetUrl': '', 'lineup': [], 'ticketCollection': [{'collectionName': '', 'scAddress': '', 'collectionAddress': '', 'pricePerToken': '', 'maxMintPerUser': '', 'saleSize': ''}]}, {'id': '', 'name': '', 'title':
'', 'startDateTime': '', 'endDateTime': '', 'address': '', 'locationName': '', 'totalTicketsCount': '', 'assetUrl': '', 'lineup': [], 'ticketCollection': [{'collectionName': '', 'scAddress': '', 'collectionAddress': '', 'pricePerToken': '', 'maxMintPerUser': '', 'saleSize': ''}]}, {'id': '', 'name': '', 'title': '', 'startDateTime': '', 'endDateTime': '', 'address': '', 'locationName': '', 'totalTicketsCount': '', 'assetUrl': '', 'lineup': [], 'ticketCollection': [{'collectionName': '', 'scAddress': '', 'collectionAddress': '', 'pricePerToken': '', 'maxMintPerUser': '', 'saleSize': ''}]}, {'id': '', 'name': '', 'title': '', 'startDateTime': '', 'endDateTime': '', 'address': '', 'locationName': '', 'totalTicketsCount': '', 'assetUrl': '', 'lineup': [], 'ticketCollection': [{'collectionName': '', 'scAddress': '', 'collectionAddress': '', 'pricePerToken': '', 'maxMintPerUser': '', 'saleSize': ''}]}, {'id': '', 'name': '', 'title': '', 'startDateTime': '', 'endDateTime': '', 'address': '', 'locationName': '', 'totalTicketsCount': '', 'assetUrl': '', 'lineup': [], 'ticketCollection': [{'collectionName': '', 'scAddress': '', 'collectionAddress': '', 'pricePerToken': '', 'maxMintPerUser': '', 'saleSize': ''}]}, {'id': '', 'name': '', 'title': '', 'startDateTime': '', 'endDateTime': '', 'address': '', 'locationName': '', 'totalTicketsCount': '', 'assetUrl': '', 'lineup': [], 'ticketCollection': [{'collectionName': '', 'scAddress': '', 'collectionAddress': '', 'pricePerToken': '', 'maxMintPerUser': '', 'saleSize': ''}]}]
And here is my print(self.dict_tries_csv):
{'event_id': 1, 'collection_name': 'Mouse On', 'smart_contract': {'multisig': 'KT1Aer6TxNwoMJejoqsNP8TEN7J6STgMtJcA', 'sale_params': {'is_presale': False, 'metadata_list': [], 'price_per_token': 4, 'max_mint_per_user': 5, 'sale_size': 500, 'sale_currency': {'xtz': None}, 'start_time': 1656626400, 'end_time': 1657490400}, 'collectionName': 'KT1Apf8CPkYBe3bRuTCET6A4NhnosX2BAnp9', 'scAddress': 'KT1AKqxCJH9EPimNm1wo1BEgG9bFRgptJwkk'}, 'id': '1', 'event title': 'Mouse Party', 'event start date': '10/07/2022 18:30', 'event end date': '11/07/2022 01:00', 'name': "L'Astrolabe", 'address of the location': '1 Rue Alexandre Avisse 45000 Orléans', 'total ticket number': '500', 'maximum tickets per user': '5', 'sale start date': '01/07/2022', 'line up': 'Mehdi Maïzi-Rad Cartier-Squidji', 'asset url': 'https://photos.com/mouseparty.png', 'lineup': ['Mehdi Maïzi', 'Rad Cartier', 'Squidji']}
I expected to get id == 1
All your list elements are the same dictionary. You need to copy them.
from copy import deepcopy

for i in range(len(self.dict_trie_csv_infos)):
    self.template_json.append(deepcopy(self.template_json[0]))
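For context, a minimal standalone sketch of the aliasing problem (not from the original post): appending the same dict object repeatedly makes every list element point at one underlying dict, so the last assignment shows up everywhere, while deepcopy creates independent copies.

from copy import deepcopy

template = {'id': ''}

aliased = [template, template]              # both entries point at the same dict object
aliased[0]['id'] = '1'
print(aliased[1]['id'])                     # prints '1' - the "other" element changed too

fresh = {'id': ''}
independent = [deepcopy(fresh) for _ in range(2)]
independent[0]['id'] = '1'
print(independent[1]['id'])                 # prints '' - each element is its own dict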

How to Transform data in a nested json?

I have the following data, and when I used json_flatten I was able to extract most of the data except for deliveryMethod.items and languages.items.
I also tried pd.json_normalize(a, record_path='deliveryMethod.items'), but it doesn't seem to be working.
a = {'ID': '1', 'Name': 'ABC', 'Center': 'Center For Education', 'providerNameAr': 'ABC', 'city': {'id': 1, 'cityEn': 'LA', 'regionId': 0, 'region': None}, 'cityName': None, 'LevelNumber': 'ABCD', 'activityStartDate': '09/01/2020', 'activityEndDate': '09/02/2020', 'activityType': {'lookUpId': 2, 'lookUpEn': 'Course', 'code': None, 'parent': None, 'hasParent': False}, 'deliveryMethod': {'items': [{'lookUpId': 2, 'lookUpEn': 'online', 'code': None, 'parent': None, 'hasParent': False}]}, 'languages': {'items': [{'lookUpId': 1, 'lookUpEn': 'English', 'code': None, 'parent': None, 'hasParent': False}]}, 'activityCategory': {'lookUpId': 1, 'lookUpEn': 'Regular', 'code': None, 'parent': None, 'hasParent': False}, 'address': 'LA', 'phoneNumber': '-11111', 'emailAddress': 'ABCS#Gmail.com', 'isAllSpeciality': True, 'requestId': 23, 'parentActivityId': None, 'sppData': None}
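No answer is attached here, but a likely fix is to pass the nested path as a list of keys instead of a dotted string - a hedged sketch, assuming a reasonably recent pandas:

import pandas as pd

# 'deliveryMethod.items' as one string is treated as a single literal key and is not found;
# a list of keys walks the nesting level by level instead
delivery = pd.json_normalize(a, record_path=['deliveryMethod', 'items'], meta=['ID', 'Name'])
languages = pd.json_normalize(a, record_path=['languages', 'items'], meta=['ID', 'Name'])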

Iterating through JSON array and appending values to pandas df based on condition

I have two datasets - one as a dataframe and the other as an array of JSON files.
Each line in the df has a string (folio number) that identifies a piece of land (Ex: '0101000000030'), and a date (in datetime) a permit was applied for.
Every JSON file in the array has a corresponding number identifying that land. It also has dates the property was sold, to whom it was sold, and the seller.
I need to take the folio number and the date the permit was applied for and run it through the array of JSON files until it finds the matching folio.
Then, it needs to extract the property's owner information by finding who owned the property when the permit was applied for and append it to the corresponding row in the df.
Desired Output
FirstSubmissionDate Folio PropertyOwner
05/17/2018 '0101000000030' blahblah
Input DF
FirstSubmissionDate Folio
05/17/2018 '0101000000030'
Input JSON
{'Additionals': {'AddtionalInfo': [{'Key': 'LAND USE AND RESTRICTIONS',
'Value': [{'InfoName': 'Community Development District',
'InfoValue': 'COUNTYGIS',
'Message': ''},
{'InfoName': 'Community Redevelopment Area',
'InfoValue': 'COUNTYGIS',
'Message': ''},
{'InfoName': 'Empowerment Zone', 'InfoValue': 'COUNTYGIS', 'Message': ''},
{'InfoName': 'Enterprise Zone', 'InfoValue': 'COUNTYGIS', 'Message': ''},
{'InfoName': 'Urban Development',
'InfoValue': 'COUNTYGIS',
'Message': ''},
{'InfoName': 'Zoning Code', 'InfoValue': 'COUNTYGIS', 'Message': ''},
{'InfoName': 'Existing Land Use',
'InfoValue': 'COUNTYGIS',
'Message': ''},
{'InfoName': 'Government Agencies and Community Services',
'InfoValue': 'http://gisweb.miamidade.gov/communityservices/CommunityServicesAll.html?x=&y=&bufferDistance=5&address=60 SE 2 ST',
'Message': ''}]},
{'Key': 'OTHER GOVERNMENTAL JURISDICTIONS',
'Value': [{'InfoName': 'Business Incentives',
'InfoValue': 'https://gisweb.miamidade.gov/businessincentive/default.aspx?searchtype=address&paramvalue=',
'Message': ''},
{'InfoName': 'Childrens Trust',
'InfoValue': 'https://www.thechildrenstrust.org/',
'Message': ''},
{'InfoName': 'City of Miami',
'InfoValue': 'http://www.miamigov.com/home/',
'Message': ''},
{'InfoName': 'Environmental Considerations',
'InfoValue': 'https://gisweb.miamidade.gov/environmentalconsiderations/default.aspx?searchtype=address&paramvalue=60 SE 2 ST',
'Message': ''},
{'InfoName': 'Florida Inland Navigation District',
'InfoValue': 'http://www.aicw.org',
'Message': ''},
{'InfoName': 'PA Bulletin Board',
'InfoValue': 'http://bbs.miamidade.gov/',
'Message': ''},
{'InfoName': 'Special Taxing District and Other Non-Ad valorem Assessment',
'InfoValue': 'http://www.miamidade.gov/Apps/PA/PAOnlineTools/Taxes/NonAdvalorem.aspx?folio=0101000000030',
'Message': ''},
{'InfoName': 'School Board',
'InfoValue': 'http://www.dadeschools.net/',
'Message': ''},
{'InfoName': 'South Florida Water Mgmt District',
'InfoValue': 'http://www.sfwmd.gov/portal/page/portal/sfwmdmain/home%20page',
'Message': ''},
{'InfoName': 'Tax Collector',
'InfoValue': 'http://www.miamidade.gov/taxcollector/',
'Message': ''}]}],
'FooterMessage': '',
'HeaderMessage': "* The information listed below is not derived from the Property Appraiser's Office records. It is provided for convenience and is derived from other government agencies."},
'Assessment': {'AssessmentInfos': [{'AssessedValue': 5587359,
'BuildingOnlyValue': 0,
'ExtraFeatureValue': 0,
'LandValue': 7618560,
'Message': None,
'TotalValue': 7618560,
'Year': 2021},
{'AssessedValue': 5079418,
'BuildingOnlyValue': 0,
'ExtraFeatureValue': 0,
'LandValue': 6963200,
'Message': None,
'TotalValue': 6963200,
'Year': 2020},
{'AssessedValue': 4617653,
'BuildingOnlyValue': 0,
'ExtraFeatureValue': 0,
'LandValue': 6963200,
'Message': None,
'TotalValue': 6963200,
'Year': 2019}],
'Messages': [{'Message': '', 'Year': 2021},
{'Message': '', 'Year': 2020},
{'Message': '', 'Year': 2019}]},
'Benefit': {'BenefitInfos': [{'Description': 'Non-Homestead Cap',
'Message': None,
'Seq': '5',
'TaxYear': 2021,
'Type': 'Assessment Reduction',
'Url': 'http://www.miamidade.gov/pa/property_value_cap.asp',
'Value': 2031201},
{'Description': 'Non-Homestead Cap',
'Message': None,
'Seq': '5',
'TaxYear': 2020,
'Type': 'Assessment Reduction',
'Url': 'http://www.miamidade.gov/pa/property_value_cap.asp',
'Value': 1883782},
{'Description': 'Non-Homestead Cap',
'Message': None,
'Seq': '5',
'TaxYear': 2019,
'Type': 'Assessment Reduction',
'Url': 'http://www.miamidade.gov/pa/property_value_cap.asp',
'Value': 2345547}],
'Messages': []},
'Building': {'BuildingInfos': [], 'Messages': []},
'ClassifiedAgInfo': {'Acreage': 0,
'CalculatedValue': 0,
'LandCode': None,
'LandUse': None,
'Message': None,
'UnitPrice': 0},
'Completed': True,
'District': 6,
'ExtraFeature': {'ExtraFeatureInfos': [], 'Messages': []},
'GeoParcel': None,
'Land': {'Landlines': [{'AdjustedUnitPrice': 465,
'CalculatedValue': 7618560,
'Depth': 0,
'FrontFeet': 0,
'LandUse': 'GENERAL',
'LandlineType': 'C',
'Message': None,
'MuniZone': 'T6-80-O',
'MuniZoneDescription': None,
'PAZoneDescription': 'COMMERCIAL',
'PercentCondition': 1,
'RollYear': 2021,
'TotalAdjustments': 1,
'UnitType': 'Square Ft.',
'Units': 16384,
'UseCode': '00',
'Zone': '6401'},
{'AdjustedUnitPrice': -1,
'CalculatedValue': -1,
'Depth': 0,
'FrontFeet': 0,
'LandUse': 'GENERAL',
'LandlineType': 'C',
'Message': None,
'MuniZone': 'T6-80-O',
'MuniZoneDescription': None,
'PAZoneDescription': 'COMMERCIAL',
'PercentCondition': 1,
'RollYear': 2020,
'TotalAdjustments': 1,
'UnitType': 'Square Ft.',
'Units': 16384,
'UseCode': '00',
'Zone': '6401'},
{'AdjustedUnitPrice': -1,
'CalculatedValue': -1,
'Depth': 0,
'FrontFeet': 0,
'LandUse': 'GENERAL',
'LandlineType': 'C',
'Message': None,
'MuniZone': 'T6-80-O',
'MuniZoneDescription': None,
'PAZoneDescription': 'COMMERCIAL',
'PercentCondition': 1,
'RollYear': 2019,
'TotalAdjustments': 1,
'UnitType': 'Square Ft.',
'Units': 16384,
'UseCode': '00',
'Zone': '6401'}],
'Messages': [{'Message': '', 'Year': 2021},
{'Message': 'The calculated values for this property have been overridden. Please refer to the Land, Building, and XF Values in the Assessment Section, in order to obtain the most accurate values.',
'Year': 2020},
{'Message': 'The calculated values for this property have been overridden. Please refer to the Land, Building, and XF Values in the Assessment Section, in order to obtain the most accurate values.',
'Year': 2019}]},
'LegalDescription': {'Description': 'MIAMI NORTH PB B-41|BEG 12.2FT W OF X OF S/L OF SE 2|ST & W/L OF SE 1 AVE TH S11.85FT|SWLY A/D 72.55FT S52.71FT|W108.69FT N10FT W4.6FT N123.52FT|E137.4FT TO POB|LOT SIZE 16384 SQ FT|COC 25843-0025 26307-3840 0707 6',
'Message': None,
'Number': None},
'MailingAddress': {'Address1': '1000 BRICKELL AVE STE 400',
'Address2': '',
'Address3': '',
'City': 'MIAMI',
'Country': 'USA',
'Message': None,
'State': 'FL',
'ZipCode': '33131'},
'Message': '',
'OwnerInfos': [{'Description': 'Sole Owner',
'MarriedFlag': '0',
'Message': None,
'Name': '16 SE 2ND STREET DOWNTOWN',
'PercentageOwn': 1,
'Role': None,
'ShortDescription': 'Sole Owner',
'TenancyCd': 'S'},
{'Description': 'Sole Owner',
'MarriedFlag': '0',
'Message': None,
'Name': 'INVESTMENT LLC',
'PercentageOwn': 1,
'Role': None,
'ShortDescription': 'Sole Owner',
'TenancyCd': 'S'}],
'PropertyInfo': {'BathroomCount': 0,
'BedroomCount': 0,
'BuildingActualArea': 0,
'BuildingBaseArea': 0,
'BuildingEffectiveArea': 0,
'BuildingGrossArea': 0,
'BuildingHeatedArea': 0,
'DORCode': '1081',
'DORDescription': 'VACANT LAND - COMMERCIAL : VACANT LAND',
'DORDescriptionCurrent': None,
'EncodedFolioAndTaxYear': 'J1COeydnmm%2fHHVEoyromqjt3GPqH8da%2fsulgVBOgI7w%3d',
'FloorCount': 0,
'FolioNumber': '01-0100-000-0030',
'HalfBathroomCount': 0,
'HxBaseYear': 0,
'LotSize': 16384,
'Message': None,
'Municipality': 'Miami',
'Neighborhood': 69010,
'NeighborhoodDescription': 'Miami CBD',
'ParentFolio': '',
'PercentHomesteadCapped': 0,
'PlatBook': 'B',
'PlatPage': '41',
'PrimaryZone': '6401',
'PrimaryZoneDescription': 'COMMERCIAL',
'ShowCurrentValuesFlag': 'N',
'Status': 'AC Active',
'Subdivision': '010100000',
'SubdivisionDescription': '353017046',
'UnitCount': 0,
'YearBuilt': '0'},
'RollYear1': 2021,
'SalesInfos': [{'DateOfSale': '6/23/2021',
'DocumentStamps': 276000,
'EncodedRecordBookAndPage': 'lHVlhHQhIZoJRUYKiXnhi4goVgjenckUAcgPekALEZ8LlG%2bmH%2bycTA%3d%3d',
'GranteeName1': '16 SE 2ND STREET DOWNTOWN',
'GranteeName2': 'INVESTMENT LLC',
'GrantorName1': '16 SE 2ND STREET LLC',
'GrantorName2': '',
'Message': None,
'OfficialRecordBook': '32602',
'OfficialRecordPage': '3521',
'QualificationDescription': 'Qual on DOS, multi-parcel sale',
'QualifiedFlag': 'Q',
'QualifiedSYear': None,
'QualifiedSourceCode': '',
'ReasonCode': '05',
'ReviewCode': None,
'SaleId': 5,
'SaleInstrument': 'WDE',
'SalePrice': 46000000,
'VacantFlag': '\x00',
'ValidCode': None,
'VerifyCode': None},
{'DateOfSale': '5/24/2013',
'DocumentStamps': 0,
'EncodedRecordBookAndPage': 'lHVlhHQhIZoJRUYKiXnhiyo2fiU6Ad2Yj6ROwqxBp26vA0B1JkALuQ%3d%3d',
'GranteeName1': '16 SE 2ND STREET LLC',
'GranteeName2': '',
'GrantorName1': 'BURDINES 1225 LLC',
'GrantorName2': '',
'Message': None,
'OfficialRecordBook': '28688',
'OfficialRecordPage': '1169',
'QualificationDescription': 'Financial inst or "In Lieu of Forclosure" stated',
'QualifiedFlag': 'U',
'QualifiedSYear': None,
'QualifiedSourceCode': '',
'ReasonCode': '12',
'ReviewCode': None,
'SaleId': 4,
'SaleInstrument': 'DEE',
'SalePrice': 32620638,
'VacantFlag': '\x00',
'ValidCode': None,
'VerifyCode': None},
{'DateOfSale': '8/1/1989',
'DocumentStamps': 0,
'EncodedRecordBookAndPage': 'lHVlhHQhIZoJRUYKiXnhi9bvfovmAqmTIZ5uJf3HEgtQChvRqiPQDw%3d%3d',
'GranteeName1': '',
'GranteeName2': '',
'GrantorName1': '',
'GrantorName2': '',
'Message': None,
'OfficialRecordBook': '14202',
'OfficialRecordPage': '2339',
'QualificationDescription': 'Deeds that include more than one parcel',
'QualifiedFlag': 'Q',
'QualifiedSYear': None,
'QualifiedSourceCode': '',
'ReasonCode': '02',
'ReviewCode': None,
'SaleId': 3,
'SaleInstrument': '',
'SalePrice': 6200000,
'VacantFlag': '\x00',
'ValidCode': None,
'VerifyCode': None},
{'DateOfSale': '9/1/2003',
'DocumentStamps': 0,
'EncodedRecordBookAndPage': 'lHVlhHQhIZoJRUYKiXnhi5pTmn2bXcBBM42%2bwPcIyhry9UhcpSwX4g%3d%3d',
'GranteeName1': '',
'GranteeName2': '',
'GrantorName1': '',
'GrantorName2': '',
'Message': None,
'OfficialRecordBook': '21695',
'OfficialRecordPage': '3500',
'QualificationDescription': 'Deeds that include more than one parcel',
'QualifiedFlag': 'Q',
'QualifiedSYear': None,
'QualifiedSourceCode': '',
'ReasonCode': '02',
'ReviewCode': None,
'SaleId': 2,
'SaleInstrument': '',
'SalePrice': 8800000,
'VacantFlag': '\x00',
'ValidCode': None,
'VerifyCode': None},
{'DateOfSale': '7/1/2007',
'DocumentStamps': 0,
'EncodedRecordBookAndPage': 'lHVlhHQhIZoJRUYKiXnhi5bVa6yEIUa%2bSDngqM2N5YUM89ag%2fj8HOA%3d%3d',
'GranteeName1': '',
'GranteeName2': '',
'GrantorName1': '',
'GrantorName2': '',
'Message': None,
'OfficialRecordBook': '25843',
'OfficialRecordPage': '0025',
'QualificationDescription': 'Other disqualified',
'QualifiedFlag': 'U',
'QualifiedSYear': None,
'QualifiedSourceCode': '',
'ReasonCode': '03',
'ReviewCode': None,
'SaleId': 1,
'SaleInstrument': '',
'SalePrice': 21500000,
'VacantFlag': '\x00',
'ValidCode': None,
'VerifyCode': None}],
'SiteAddress': [{'Address': '60 SE 2 ST, Miami, FL 33131-2103',
'BuildingNumber': 1,
'City': 'Miami',
'HouseNumberSuffix': '',
'Message': None,
'StreetName': '2',
'StreetNumber': 60,
'StreetPrefix': 'SE',
'StreetSuffix': 'ST',
'StreetSuffixDirection': '',
'Unit': '',
'Zip': '33131-2103'}],
'Taxable': {'Messages': [],
'TaxableInfos': [{'CityExemptionValue': 0,
'CityTaxableValue': 5587359,
'CountyExemptionValue': 0,
'CountyTaxableValue': 5587359,
'Message': None,
'RegionalExemptionValue': 0,
'RegionalTaxableValue': 5587359,
'SchoolExemptionValue': 0,
'SchoolTaxableValue': 7618560,
'Year': 2021},
{'CityExemptionValue': 0,
'CityTaxableValue': 5079418,
'CountyExemptionValue': 0,
'CountyTaxableValue': 5079418,
'Message': None,
'RegionalExemptionValue': 0,
'RegionalTaxableValue': 5079418,
'SchoolExemptionValue': 0,
'SchoolTaxableValue': 6963200,
'Year': 2020},
{'CityExemptionValue': 0,
'CityTaxableValue': 4617653,
'CountyExemptionValue': 0,
'CountyTaxableValue': 4617653,
'Message': None,
'RegionalExemptionValue': 0,
'RegionalTaxableValue': 4617653,
'SchoolExemptionValue': 0,
'SchoolTaxableValue': 6963200,
'Year': 2019}]}}
I can distill down to the sales info:
for y in variable_name[1]['SalesInfos']:
    y = y['DateOfSale']
    y = datetime.strptime(y, '%m/%d/%Y')
    print(y)
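No answer is attached here, but one possible approach (a hedged sketch - `json_files` is an assumed name for the list of parsed JSON dicts, and the folio/date handling is inferred from the sample above) is to strip the dashes from each record's PropertyInfo FolioNumber, match it against the df's Folio, and take the grantee of the most recent sale on or before the permit date as the owner:

from datetime import datetime

def owner_at_date(record, permit_date):
    # collect sales whose DateOfSale falls on or before the permit application date
    past_sales = []
    for sale in record.get('SalesInfos', []):
        sale_date = datetime.strptime(sale['DateOfSale'], '%m/%d/%Y')
        if sale_date <= permit_date:
            past_sales.append((sale_date, sale))
    if not past_sales:
        return None
    # the buyer (grantee) of the most recent earlier sale is taken as the owner at that time
    _, latest = max(past_sales, key=lambda pair: pair[0])
    return (latest['GranteeName1'] + ' ' + latest['GranteeName2']).strip()

owners = []
for _, row in df.iterrows():
    folio = row['Folio'].strip("'")              # the example shows quotes around the folio string
    permit_date = row['FirstSubmissionDate']     # assumed to already be a datetime
    owner = None
    for record in json_files:                    # assumed list of parsed JSON dicts
        # FolioNumber is stored with dashes ('01-0100-000-0030'), the df value without
        if record['PropertyInfo']['FolioNumber'].replace('-', '') == folio:
            owner = owner_at_date(record, permit_date)
            break
    owners.append(owner)

df['PropertyOwner'] = owners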

I'm trying to normalize the documents column within the dataframe

[{
'processingTechniques': ['Hot rolling'],
'summary': 'Metals Long Products Rebar in Coil',
'applications': ['CONCRETE REINFORCEMENT', 'METAL DOWNSTREAM INDUSTRIALS', 'CUT AND BEND', 'EPOXY COATING '],
'regions': ['MEA'],
'description': 'Metals Long Products Rebar in Coil',
'industrySegments': None,
'grade_id': '89a63243-74c7-e611-8197-06b69393ae39',
'name': '40',
'documents': [{
'documentType': 'TDS',
'title': 'Rebar in Coil_40_Global_Technical_Data_Sheet',
'url': '/en/products/documents/rebar-in-coil_40_global_technical_data_sheet/en',
'language': 'English',
'region': 'Global',
'revision': '20210812',
'id': '2bc4102f-8df7-e611-819b-06b69393ae39'
}]
}, {
'processingTechniques': ['Hot rolling'],
'summary': 'Metals Long Products Rebar in Coil',
'applications': ['CONCRETE REINFORCEMENT', 'METAL DOWNSTREAM INDUSTRIALS', 'CUT AND BEND', 'EPOXY COATING '],
'regions': ['MEA'],
'description': 'Metals Long Products Rebar in Coil',
'industrySegments': None,
'grade_id': 'dddd0468-79c7-e611-8197-06b69393ae39',
'name': '460B',
'documents': [{
'documentType': 'TDS',
'title': 'Rebar in Coil_460B_MEA_Technical_Data_Sheet',
'url': '/en/products/documents/rebar-in-coil_460b_mea_technical_data_sheet/en',
'language': 'English',
'region': 'MEA',
'revision': '20210812',
'id': '0e63bc98-c343-e811-80fd-005056857ef3'
}]
}, {
'processingTechniques': ['Hot rolling'],
'summary': 'Metals Long Products Rebar in Coil',
'applications': ['CONCRETE REINFORCEMENT', 'METAL DOWNSTREAM INDUSTRIALS', 'CUT AND BEND', 'EPOXY COATING '],
'regions': ['MEA'],
'description': 'Metals Long Products Rebar in Coil',
'industrySegments': None,
'grade_id': 'cd695035-76c7-e611-8197-06b69393ae39',
'name': '60',
'documents': [{
'documentType': 'TDS',
'title': 'Rebar in Coil_60_MEA_Technical_Data_Sheet',
'url': '/en/products/documents/rebar-in-coil_60_mea_technical_data_sheet/en',
'language': 'English',
'region': 'MEA',
'revision': '20210812',
'id': '733946d8-c343-e811-80fd-005056857ef3'
}]
}, {
'processingTechniques': ['Hot rolling'],
'summary': 'Metals Long Products Rebar in Coil',
'applications': ['CONCRETE REINFORCEMENT', 'METAL DOWNSTREAM INDUSTRIALS', 'CUT AND BEND', 'EPOXY COATING '],
'regions': ['MEA'],
'description': 'Metals Long Products Rebar in Coil',
'industrySegments': None,
'grade_id': 'c99a8cc9-79c7-e611-8197-06b69393ae39',
'name': 'B500B',
'documents': [{
'documentType': 'TDS',
'title': 'Rebar in Coil_B500B_MEA_Technical_Data_Sheet',
'url': '/en/products/documents/rebar-in-coil_b500b_mea_technical_data_sheet/en',
'language': 'English',
'region': 'MEA',
'revision': '20210812',
'id': 'bc25a637-c443-e811-80fd-005056857ef3'
}]
}]
The code written to convert this JSON to a dataframe after normalizing the documents is this:
gr2 = pd.json_normalize(result, ['documents'], meta = ['regions', 'name', 'description', 'grade_id', 'processingTechniques','summary', 'applications'])
gr2['product_id'] = prod_id
gr2.head()
result is the json file attached above.
After running the above code, I'm getting this error
Can anyone help me with this? I just want the documents to be normalised along with the other columns.
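The error itself is not shown above, so this is not necessarily the fix for that exact error, but a hedged alternative is to flatten the documents manually and build the DataFrame from plain dicts:

import pandas as pd

rows = []
for grade in result:                                    # `result` is the list shown above
    for doc in grade.get('documents') or []:
        row = dict(doc)                                 # documentType, title, url, language, ...
        # carry the grade-level fields alongside each document
        for key in ('regions', 'name', 'description', 'grade_id',
                    'processingTechniques', 'summary', 'applications'):
            row[key] = grade.get(key)
        rows.append(row)

gr2 = pd.DataFrame(rows)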

Extract specific region from image using segmentation in python

I have a JSON file where the annotation is stored as below:
{'licenses': [{'name': '', 'id': 0, 'url': ''}], 'info': {'contributor': '', 'date_created': '', 'description': '', 'url': '', 'version': '', 'year': ''}, 'categories': [{'id': 1, 'name': 'book', 'supercategory': ''}, {'id': 2, 'name': 'ceiling', 'supercategory': ''}, {'id': 3, 'name': 'chair', 'supercategory': ''}, {'id': 4, 'name': 'floor', 'supercategory': ''}, {'id': 5, 'name': 'object', 'supercategory': ''}, {'id': 6, 'name': 'person', 'supercategory': ''}, {'id': 7, 'name': 'screen', 'supercategory': ''}, {'id': 8, 'name': 'table', 'supercategory': ''}, {'id': 9, 'name': 'wall', 'supercategory': ''}, {'id': 10, 'name': 'window', 'supercategory': ''}, {'id': 11, 'name': '__background__', 'supercategory': ''}], 'images': [{'id': 1, 'width': 848, 'height': 480, 'file_name': '153058384000.png', 'license': 0, 'flickr_url': '', 'coco_url': '', 'date_captured': 0}], 'annotations': [{'id': 1, 'image_id': 1, 'category_id': 7, 'segmentation': [[591.81, 146.75, 848.0, 119.83, 848.0, 289.18, 606.39, 288.06]], 'area': 38747.0, 'bbox': [591.81, 119.83, 256.19, 169.35], 'iscrowd': 0, 'attributes': {'occluded': False}}]}
I want to select a specific region from the image using the 'segmentation': [[591.81, 146.75, 848.0, 119.83, 848.0, 289.18, 606.39, 288.06]] field within the annotations in the above JSON file.
The image I am using is below.
I tried with OpenCV and PIL, but I didn't get the output I wanted.
Note: segmentation may have more than 8 coordinates
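A common approach (a hedged sketch, assuming OpenCV is installed and the image file named in the JSON, '153058384000.png', is available locally) is to reshape the flat COCO polygon into (x, y) points, fill them into a mask, and keep only the pixels inside the polygon:

import cv2
import numpy as np

image = cv2.imread('153058384000.png')                  # file name taken from the JSON above

# COCO stores each polygon as a flat x1, y1, x2, y2, ... list (any even length works here)
segmentation = [591.81, 146.75, 848.0, 119.83, 848.0, 289.18, 606.39, 288.06]
points = np.array(segmentation).reshape(-1, 2).astype(np.int32)

# fill the polygon into a single-channel mask, then keep only the masked pixels
mask = np.zeros(image.shape[:2], dtype=np.uint8)
cv2.fillPoly(mask, [points], 255)
region = cv2.bitwise_and(image, image, mask=mask)

# optionally crop to the polygon's bounding box (this matches the 'bbox' field)
x, y, w, h = cv2.boundingRect(points)
cropped = region[y:y + h, x:x + w]
cv2.imwrite('region.png', cropped)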
