How to parse complex JSON with Python? - python

I am trying to parse this JSON file and I am having trouble.
The JSON looks like this:
<ListObject list at 0x2161945a860> JSON: {
"data": [
{
"amount": 100,
"available_on": 1621382400,
"created": 1621264875,
"currency": "usd",
"description": "0123456",
"exchange_rate": null,
"fee": 266,
"fee_details": [
{
"amount": 266,
"application": null,
"currency": "usd",
"description": "processing fees",
"type": "fee"
}
],
"id": "txn_abvgd1234",
"net": 9999,
"object": "balance_transaction",
"reporting_category": "charge",
"source": "cust1",
"sourced_transfers": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/source"
},
"status": "pending",
"type": "charge"
},
{
"amount": 25984,
"available_on": 1621382400,
"created": 1621264866,
"currency": "usd",
"description": "0326489",
"exchange_rate": null,
"fee": 93,
"fee_details": [
{
"amount": 93,
"application": null,
"currency": "usd",
"description": "processing fees",
"type": "fee"
}
],
"id": "txn_65987jihgf4984oihydgrd",
"net": 9874,
"object": "balance_transaction",
"reporting_category": "charge",
"source": "cust2",
"sourced_transfers": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/source"
},
"status": "pending",
"type": "charge"
}
],
"has_more": true,
"object": "list",
"url": "/v1/balance_"
}
I am trying to parse it in Python with this script:
import pandas as pd
df = pd.json_normalize(json)
df.head()
but what I am getting is:
What I need is to parse each of these data points into its own column, so I will have 2 rows of data with a column for each data point.
Something like this:
How do I do this now?

All but one of your fields are direct copies from the JSON, so you can just make a list of the fields you can copy, and then do the extra processing for the fee_details.
import json
import pandas as pd
inp = """{
"data": [
{
"amount": 100,
"available_on": 1621382400,
"created": 1621264875,
"currency": "usd",
"description": "0123456",
"exchange_rate": null,
"fee": 266,
"fee_details": [
{
"amount": 266,
"application": null,
"currency": "usd",
"description": "processing fees",
"type": "fee"
}
],
"id": "txn_abvgd1234",
"net": 9999,
"object": "balance_transaction",
"reporting_category": "charge",
"source": "cust1",
"sourced_transfers": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/source"
},
"status": "pending",
"type": "charge"
},
{
"amount": 25984,
"available_on": 1621382400,
"created": 1621264866,
"currency": "usd",
"description": "0326489",
"exchange_rate": null,
"fee": 93,
"fee_details": [
{
"amount": 93,
"application": null,
"currency": "usd",
"description": "processing fees",
"type": "fee"
}
],
"id": "txn_65987jihgf4984oihydgrd",
"net": 9874,
"object": "balance_transaction",
"reporting_category": "charge",
"source": "cust2",
"sourced_transfers": {
"data": [],
"has_more": false,
"object": "list",
"total_count": 0,
"url": "/v1/source"
},
"status": "pending",
"type": "charge"
}
],
"has_more": true,
"object": "list",
"url": "/v1/balance_"
}"""
copies = [
'id',
'net',
'object',
'reporting_category',
'source',
'amount',
'available_on',
'created',
'currency',
'description',
'exchange_rate',
'fee'
]
data = json.loads(inp)
rows = []
for inrow in data['data']:
    outrow = {}
    for copy in copies:
        outrow[copy] = inrow[copy]
    outrow['fee_details'] = inrow['fee_details'][0]['description']
    rows.append(outrow)
df = pd.DataFrame(rows)
print(df)
Output:
timr@tims-gram:~/src$ python x.py
id net object reporting_category source amount ... created currency description exchange_rate fee fee_details
0 txn_abvgd1234 9999 balance_transaction charge cust1 100 ... 1621264875 usd 0123456 None 266 processing fees
1 txn_65987jihgf4984oihydgrd 9874 balance_transaction charge cust2 25984 ... 1621264866 usd 0326489 None 93 processing fees
[2 rows x 13 columns]
timr@tims-gram:~/src$
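Alternatively, pandas.json_normalize can flatten this in one call. A minimal sketch, assuming the same parsed data dict as above and that every transaction has exactly one fee_details entry:
import json
import pandas as pd

data = json.loads(inp)  # inp is the JSON string from the script above

# record_path expands the nested fee_details list (one row per fee entry,
# here one per transaction); meta copies the transaction-level fields.
df2 = pd.json_normalize(
    data['data'],
    record_path='fee_details',
    record_prefix='fee_details.',
    meta=['id', 'net', 'object', 'reporting_category', 'source', 'amount',
          'available_on', 'created', 'currency', 'description',
          'exchange_rate', 'fee'],
)
print(df2)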

Related

Loading variables into json string using python for MS teams

The third-party system I am using (a vendor product) still uses Python 2.7 and doesn't support Python 3+, so bear with me; I'm fully aware Python 3 is out, and this is a limitation of the system I have to use rather than a choice.
I am trying to do an integration between this third-party product and MS Teams: basically, the third-party system provides data, I read it into my Python script, and I output a message to Teams using a webhook. It mostly works, but I'm struggling to load some of the variables from the system's data.
For example, in my code, I use the following:
messageID='"{}"'.format(item["messageId"])
recipient='"{}"'.format(item["recipient"]["email"])
subject='"{}"'.format(item["subject"])
sender='"{}"'.format(item["sender"]["email"])
which has output like this:
messageId="34239482030783472@test.net"
recipient="testuser@domain.com"
subject="Email subject here"
sender="sender@domain2.com"
This is all fine, the trouble comes when I need to format my string to post to the Teams webhook.
It currently looks like:
teams_card='{"#type": "MessageCard","#context": "http://schema.org/extensions","themeColor": "0076D7","summary": “PTR”,”sections": [{"activityTitle": "PTR Incident Created","activitySubtitle": “End “User Exposed to Phishing Threat,”facts": [{"name": “Message” ID,”value": %s}, {"name": "Subject”,”value": %s},{“name": "End User","value": %s},{“name": “sender”,”value": %s}],”markdown": true}],"potentialAction": [{"#type": "OpenUri","name": "View Related Emails","targets": [{"os": "default","uri": "https://maskedurlhere.com”}]}]}’ % (messageId,subject,recipient,sender)
which throws an error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: not enough arguments for format string
I tried to use .format option also, but this fails with a different error:
teams_card='{"#type": "MessageCard","#context": "http://schema.org/extensions","themeColor": "0076D7","summary": “PTR”,”sections": [{"activityTitle": "PTR Incident Created","activitySubtitle": “End “User Exposed to Phishing Threat,”facts": [{"name": “Message” ID,”value": %s}, {"name": "Subject”,”value": %s},{“name": "End User","value": %s},{“name": “sender”,”value": %s}],”markdown": true}],"potentialAction": [{"#type": "OpenUri","name": "View Related Emails","targets": [{"os": "default","uri": "https://maskedurlhere.com”}]}]}’.format(messageId,subject,recipient,sender)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
KeyError: '"#type"'
The teams card variable is fine and posts to Teams successfully when it's just text, but trying to load in these variables doesn't seem to work at all.
Any ideas?
To pass dynamic values into the JSON you need to use placeholders like ${value}.
Please follow the example JSON format below.
Template JSON
{
"type": "AdaptiveCard",
"body": [
{
"type": "Container",
"style": "emphasis",
"items": [
{
"type": "ColumnSet",
"columns": [
{
"type": "Column",
"items": [
{
"type": "TextBlock",
"size": "Large",
"weight": "Bolder",
"text": "**EXPENSE APPROVAL**",
"wrap": true
}
],
"width": "stretch"
},
{
"type": "Column",
"items": [
{
"type": "Image",
"url": "${status_url}",
"altText": "${status}",
"height": "30px"
}
],
"width": "auto"
}
]
}
],
"bleed": true
},
{
"type": "Container",
"items": [
{
"type": "ColumnSet",
"columns": [
{
"type": "Column",
"items": [
{
"type": "TextBlock",
"size": "ExtraLarge",
"text": "${purpose}",
"wrap": true
}
],
"width": "stretch"
},
{
"type": "Column",
"items": [
{
"type": "ActionSet",
"actions": [
{
"type": "Action.OpenUrl",
"title": "EXPORT AS PDF",
"url": "https://adaptivecards.io"
}
]
}
],
"width": "auto"
}
]
},
{
"type": "TextBlock",
"spacing": "Small",
"size": "Small",
"weight": "Bolder",
"color": "Accent",
"text": "[${code}](https://adaptivecards.io)",
"wrap": true
},
{
"type": "FactSet",
"spacing": "Large",
"facts": [
{
"title": "Submitted By",
"value": "**${created_by_name}** ${creater_email}"
},
{
"title": "Duration",
"value": "${formatTicks(min(select(expenses, x, int(x.created_by))), 'yyyy-MM-dd')} - ${formatTicks(max(select(expenses, x, int(x.created_by))), 'yyyy-MM-dd')}"
},
{
"title": "Submitted On",
"value": "${formatDateTime(submitted_date, 'yyyy-MM-dd')}"
},
{
"title": "Reimbursable Amount",
"value": "$${formatNumber(sum(select(expenses, x, if(x.is_reimbursable, x.total, 0))), 2)}"
},
{
"title": "Awaiting approval from",
"value": "**${approver}** ${approver_email}"
},
{
"title": "Submitted to",
"value": "**${other_submitter}** ${other_submitter_email}"
}
]
}
]
},
{
"type": "Container",
"spacing": "Large",
"style": "emphasis",
"items": [
{
"type": "ColumnSet",
"columns": [
{
"type": "Column",
"items": [
{
"type": "TextBlock",
"weight": "Bolder",
"text": "DATE",
"wrap": true
}
],
"width": "auto"
},
{
"type": "Column",
"spacing": "Large",
"items": [
{
"type": "TextBlock",
"weight": "Bolder",
"text": "CATEGORY",
"wrap": true
}
],
"width": "stretch"
},
{
"type": "Column",
"items": [
{
"type": "TextBlock",
"weight": "Bolder",
"text": "AMOUNT",
"wrap": true
}
],
"width": "auto"
}
]
}
],
"bleed": true
},
{
"$data": "${expenses}",
"type": "Container",
"items": [
{
"type": "ColumnSet",
"columns": [
{
"type": "Column",
"items": [
{
"type": "TextBlock",
"text": "${formatDateTime(created_time, 'MM-dd')}",
"wrap": true
}
],
"width": "auto"
},
{
"type": "Column",
"spacing": "Medium",
"items": [
{
"type": "TextBlock",
"text": "${description}",
"wrap": true
}
],
"width": "stretch"
},
{
"type": "Column",
"items": [
{
"type": "TextBlock",
"text": "$${formatNumber(total, 2)}",
"wrap": true
}
],
"width": "auto"
},
{
"type": "Column",
"spacing": "Small",
"selectAction": {
"type": "Action.ToggleVisibility",
"targetElements": [
"cardContent${$index}",
"chevronDown${$index}",
"chevronUp${$index}"
]
},
"verticalContentAlignment": "Center",
"items": [
{
"type": "Image",
"id": "chevronDown${$index}",
"url": "https://adaptivecards.io/content/down.png",
"width": "20px",
"altText": "${description} $${total} collapsed"
},
{
"type": "Image",
"id": "chevronUp${$index}",
"url": "https://adaptivecards.io/content/up.png",
"width": "20px",
"altText": "${description} $${total} expanded",
"isVisible": false
}
],
"width": "auto"
}
]
},
{
"type": "Container",
"id": "cardContent${$index}",
"isVisible": false,
"items": [
{
"type": "Container",
"items": [
{
"$data": "${custom_fields}",
"type": "TextBlock",
"text": "* ${value}",
"isSubtle": true,
"wrap": true
},
{
"type": "Container",
"items": [
{
"type": "Input.Text",
"id": "comment${$index}",
"placeholder": "Add your comment here."
}
]
}
]
},
{
"type": "Container",
"items": [
{
"type": "ColumnSet",
"columns": [
{
"type": "Column",
"items": [
{
"type": "ActionSet",
"actions": [
{
"type": "Action.Submit",
"title": "Send",
"data": {
"id": "_qkQW8dJlUeLVi7ZMEzYVw",
"action": "comment",
"lineItem": 1
}
}
]
}
],
"width": "auto"
}
]
}
]
}
]
}
]
},
{
"type": "ColumnSet",
"spacing": "Large",
"separator": true,
"columns": [
{
"type": "Column",
"items": [
{
"type": "TextBlock",
"horizontalAlignment": "Right",
"text": "Total Expense Amount \t",
"wrap": true
},
{
"type": "TextBlock",
"horizontalAlignment": "Right",
"text": "Non-reimbursable Amount",
"wrap": true
},
{
"type": "TextBlock",
"horizontalAlignment": "Right",
"text": "Advance Amount",
"wrap": true
}
],
"width": "stretch"
},
{
"type": "Column",
"items": [
{
"type": "TextBlock",
"text": "$${formatNumber(sum(select(expenses, x, x.total)), 2)}",
"wrap": true
},
{
"type": "TextBlock",
"text": "(-) $${formatNumber(sum(select(expenses, x, if(x.is_reimbursable, 0, x.total))), 2)} \t",
"wrap": true
},
{
"type": "TextBlock",
"text": "(-) 0.00 \t",
"wrap": true
}
],
"width": "auto"
},
{
"type": "Column",
"width": "auto"
}
]
},
{
"type": "Container",
"style": "emphasis",
"items": [
{
"type": "ColumnSet",
"columns": [
{
"type": "Column",
"items": [
{
"type": "TextBlock",
"horizontalAlignment": "Right",
"text": "Amount to be Reimbursed",
"wrap": true
}
],
"width": "stretch"
},
{
"type": "Column",
"items": [
{
"type": "TextBlock",
"weight": "Bolder",
"text": "$${formatNumber(sum(select(expenses, x, if(x.is_reimbursable, x.total, 0))), 2)}",
"wrap": true
}
],
"width": "auto"
},
{
"type": "Column",
"width": "auto"
}
]
}
],
"bleed": true
},
{
"type": "ColumnSet",
"columns": [
{
"type": "Column",
"selectAction": {
"type": "Action.ToggleVisibility",
"targetElements": [
"cardContent4",
"showHistory",
"hideHistory"
]
},
"verticalContentAlignment": "Center",
"items": [
{
"type": "TextBlock",
"id": "showHistory",
"horizontalAlignment": "Right",
"color": "Accent",
"text": "Show history",
"wrap": true
},
{
"type": "TextBlock",
"id": "hideHistory",
"horizontalAlignment": "Right",
"color": "Accent",
"text": "Hide history",
"wrap": true,
"isVisible": false
}
],
"width": 1
}
]
},
{
"type": "Container",
"id": "cardContent4",
"isVisible": false,
"items": [
{
"type": "Container",
"items": [
{
"type": "TextBlock",
"text": "* Expense submitted by **${created_by_name}** on {{DATE(${formatDateTime(created_date, 'yyyy-MM-ddTHH:mm:ssZ')}, SHORT)}}",
"isSubtle": true,
"wrap": true
},
{
"type": "TextBlock",
"text": "* Expense ${expenses[0].status} by **${expenses[0].approver}** on {{DATE(${formatDateTime(approval_date, 'yyyy-MM-ddTHH:mm:ssZ')}, SHORT)}}",
"isSubtle": true,
"wrap": true
}
]
}
]
},
{
"type": "Container",
"items": [
{
"type": "ActionSet",
"actions": [
{
"type": "Action.Submit",
"title": "Approve",
"style": "positive",
"data": {
"id": "_qkQW8dJlUeLVi7ZMEzYVw",
"action": "approve"
}
},
{
"type": "Action.ShowCard",
"title": "Reject",
"style": "destructive",
"card": {
"type": "AdaptiveCard",
"body": [
{
"type": "Input.Text",
"id": "RejectCommentID",
"placeholder": "Please specify an appropriate reason for rejection.",
"isMultiline": true
}
],
"actions": [
{
"type": "Action.Submit",
"title": "Send",
"data": {
"id": "_qkQW8dJlUeLVi7ZMEzYVw",
"action": "reject"
}
}
],
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json"
}
}
]
}
]
}
],
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"version": "1.2",
"fallbackText": "This card requires Adaptive Cards v1.2 support to be rendered properly."
}
Data Json
{
"code": "ER-13052",
"message": "success",
"created_by_name" : "Matt Hidinger",
"created_date" : "2019-07-15T18:33:12+0800",
"submitted_date": "2019-04-14T18:33:12+0800",
"creater_email" : "matt#contoso.com",
"status" : "Pending",
"status_url" : "https://adaptivecards.io/content/pending.png",
"approver": "Thomas",
"purpose" : "Trip to UAE",
"approval_date" : "2019-07-15T22:33:12+0800",
"approver" : "Thomas",
"approver_email" : "thomas#contoso.com",
"other_submitter" : "David",
"other_submitter_email" : "david#contoso.com",
"expenses": [
{
"expense_id": "16367000000083065",
"approver" : "Thomas",
"date": "2017-02-21",
"description": "Air Travel Expense",
"created_by": "636965431200000000",
"created_by_name": "PATRICIA",
"employee_number": "E001",
"currency_id": "16367000000000097",
"currency_code": "USD",
"paid_through_account_id": "16367000000036003",
"paid_through_account_name": "Employee Reimbursements",
"bcy_total": 13900.79,
"bcy_subtotal": 13900.79,
"total": 300,
"total_without_tax": 300,
"is_billable": true,
"is_reimbursable": true,
"reference_number": "DD145",
"due_days": "Due in 15 days",
"merchant_id": "16367000000074027",
"merchant_name": "ABS Solutions",
"status": "approved",
"created_time": "2019-06-19T18:33:12+0800",
"last_modified_time": "2017-02-21T18:42:46+0530",
"receipt_name": "receipt1.jpg",
"report_id": "16367000000083075",
"mileage_type": "non_mileage",
"report_name": "Purchase",
"is_receipt_only": false,
"distance": 0,
"per_diem_rate": 0,
"per_diem_days": 0,
"per_diem_id": "",
"per_diem_name": "",
"expense_type": "non_mileage",
"location": "Washington",
"receipt_type": "jpg",
"policy_violated": false,
"comments_count": 0,
"report_status": "submitted",
"price_precision": 2,
"mileage_rate": 0,
"mileage_unit": "km",
"receipt_status": "processed",
"is_uncategorized": false,
"is_expired": false,
"gl_code": "LG001",
"exchange_rate": 66.943366,
"start_reading": "",
"end_reading": "",
"payment_mode": "Check",
"customer_id": "27927000000075081",
"customer_name": "ACME Corp.",
"custom_fields": [
{
"customfield_id": "16367000000277001",
"label": "Other Name",
"value": "Leg 1 on Tue, Jun 19th, 2019 at 6:00 AM."
},
{
"customfield_id": "16367000000277001",
"label": "Other Name",
"value": "Leg 2 on Tue, Jun 19th, 2019 at 7:15 PM."
}
],
"project_id": "27927000001243001",
"project_name": "Coffee Research",
"transaction_description": "",
"tax_id": "16367000000086001",
"tax_name": "Sales Tax",
"tax_percentage": 2,
"amount": 207.65,
"is_inclusive_tax": false,
"vehicle_type": "Bike",
"vehicle_id": "17456000000078029",
"fuel_type": "lpg",
"engine_capacity_range": "between_1401cc_and_1600cc",
"is_personal": false,
"policy_id": "16367000000092011",
"policy_name": "LIC",
"documents": [
{
"file_name": "receipt1.jpg",
"file_size_formatted": "71.8 KB",
"attachment_order": 1,
"document_id": "16367000000083071"
}
],
"reimbursement_reference": "",
"reimbursement_date": "",
"reimbursement_paid_through_account_id": "",
"reimbursement_paid_through_account_name": "",
"reimbursement_currency_id": "",
"reimbursement_currency_code": ""
},
{
"expense_id": "16367000000083065",
"date": "2019-06-19",
"description": "Auto Mobile Expense",
"created_by": "636965431200000000",
"created_by_name": "PATRICIA",
"employee_number": "E001",
"currency_id": "16367000000000097",
"currency_code": "USD",
"paid_through_account_id": "16367000000036003",
"paid_through_account_name": "Employee Reimbursements",
"bcy_total": 13900.79,
"bcy_subtotal": 13900.79,
"total": 100,
"total_without_tax": 100,
"is_billable": true,
"is_reimbursable": true,
"reference_number": "DD145",
"due_days": "Due in 15 days",
"merchant_id": "16367000000074027",
"merchant_name": "ABS Solutions",
"status": "submitted",
"created_time": "2019-06-19T18:33:12+0800",
"last_modified_time": "2017-02-21T18:42:46+0530",
"receipt_name": "receipt1.jpg",
"report_id": "16367000000083075",
"mileage_type": "non_mileage",
"report_name": "Purchase",
"is_receipt_only": false,
"distance": 0,
"per_diem_rate": 0,
"per_diem_days": 0,
"per_diem_id": "",
"per_diem_name": "",
"expense_type": "non_mileage",
"location": "Washington",
"receipt_type": "jpg",
"policy_violated": false,
"comments_count": 0,
"report_status": "submitted",
"price_precision": 2,
"mileage_rate": 0,
"mileage_unit": "km",
"receipt_status": "processed",
"is_uncategorized": false,
"is_expired": false,
"gl_code": "LG001",
"exchange_rate": 66.943366,
"start_reading": "",
"end_reading": "",
"payment_mode": "Check",
"customer_id": "27927000000075081",
"customer_name": "ACME Corp.",
"custom_fields": [
{
"customfield_id": "16367000000277001",
"label": "Other Name",
"value": " Contoso Car Rentrals, Tues 6/19 at 7:00 AM"
}
],
"project_id": "27927000001243001",
"project_name": "Coffee Research",
"transaction_description": "",
"tax_id": "16367000000086001",
"tax_name": "Sales Tax",
"tax_percentage": 2,
"amount": 207.65,
"is_inclusive_tax": false,
"vehicle_type": "Bike",
"vehicle_id": "17456000000078029",
"fuel_type": "lpg",
"engine_capacity_range": "between_1401cc_and_1600cc",
"is_personal": false,
"policy_id": "16367000000092011",
"policy_name": "LIC",
"documents": [
{
"file_name": "receipt1.jpg",
"file_size_formatted": "71.8 KB",
"attachment_order": 1,
"document_id": "16367000000083071"
}
],
"reimbursement_reference": "",
"reimbursement_date": "",
"reimbursement_paid_through_account_id": "",
"reimbursement_paid_through_account_name": "",
"reimbursement_currency_id": "",
"reimbursement_currency_code": ""
},
{
"expense_id": "16367000000083065",
"date": "2019-06-21",
"description": "Excess Baggage Cost",
"created_by": "636967159200000000",
"created_by_name": "PATRICIA",
"employee_number": "E001",
"currency_id": "16367000000000097",
"currency_code": "USD",
"paid_through_account_id": "16367000000036003",
"paid_through_account_name": "Employee Reimbursements",
"bcy_total": 13900.79,
"bcy_subtotal": 13900.79,
"total": 50.38,
"total_without_tax": 4.3,
"is_billable": true,
"is_reimbursable": false,
"reference_number": "DD145",
"due_days": "Due in 15 days",
"merchant_id": "16367000000074027",
"merchant_name": "ABS Solutions",
"status": "submitted",
"created_time": "2019-06-21T18:33:12+0800",
"last_modified_time": "2017-02-21T18:42:46+0530",
"receipt_name": "receipt1.jpg",
"report_id": "16367000000083075",
"mileage_type": "non_mileage",
"report_name": "Purchase",
"is_receipt_only": false,
"distance": 0,
"per_diem_rate": 0,
"per_diem_days": 0,
"per_diem_id": "",
"per_diem_name": "",
"expense_type": "non_mileage",
"location": "Washington",
"receipt_type": "jpg",
"policy_violated": false,
"comments_count": 0,
"report_status": "submitted",
"price_precision": 2,
"mileage_rate": 0,
"mileage_unit": "km",
"receipt_status": "processed",
"is_uncategorized": false,
"is_expired": false,
"gl_code": "LG001",
"exchange_rate": 66.943366,
"start_reading": "",
"end_reading": "",
"payment_mode": "Check",
"customer_id": "27927000000075081",
"customer_name": "ACME Corp.",
"custom_fields": [
],
"project_id": "27927000001243001",
"project_name": "Coffee Research",
"transaction_description": "",
"tax_id": "16367000000086001",
"tax_name": "Sales Tax",
"tax_percentage": 2,
"amount": 207.65,
"is_inclusive_tax": false,
"vehicle_type": "Bike",
"vehicle_id": "17456000000078029",
"fuel_type": "lpg",
"engine_capacity_range": "between_1401cc_and_1600cc",
"is_personal": false,
"policy_id": "16367000000092011",
"policy_name": "LIC",
"documents": [
{
"file_name": "receipt1.jpg",
"file_size_formatted": "71.8 KB",
"attachment_order": 1,
"document_id": "16367000000083071"
}
],
"reimbursement_reference": "",
"reimbursement_date": "",
"reimbursement_paid_through_account_id": "",
"reimbursement_paid_through_account_name": "",
"reimbursement_currency_id": "",
"reimbursement_currency_code": ""
}
]
}
Please go through this for more info.
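For the original MessageCard webhook, another option that sidesteps the %s/format quoting problems entirely is to build the card as a Python dict and let json.dumps do the serialization. A minimal Python 2.7 sketch, with the values and webhook URL taken as placeholders from the question:
import json
import urllib2  # Python 2.7 standard-library HTTP client

# Placeholder values; in the real script these come from item[...].
messageId = "34239482030783472@test.net"
subject = "Email subject here"
recipient = "testuser@domain.com"
sender = "sender@domain2.com"

teams_card = {
    "@type": "MessageCard",
    "@context": "http://schema.org/extensions",
    "themeColor": "0076D7",
    "summary": "PTR",
    "sections": [{
        "activityTitle": "PTR Incident Created",
        "activitySubtitle": "End User Exposed to Phishing Threat",
        "facts": [
            {"name": "Message ID", "value": messageId},
            {"name": "Subject", "value": subject},
            {"name": "End User", "value": recipient},
            {"name": "Sender", "value": sender},
        ],
        "markdown": True,
    }],
    "potentialAction": [{
        "@type": "OpenUri",
        "name": "View Related Emails",
        "targets": [{"os": "default", "uri": "https://maskedurlhere.com"}],
    }],
}

# json.dumps emits valid JSON, so no manual quoting or %s escaping is needed.
req = urllib2.Request(
    "https://maskedurlhere.com",  # placeholder webhook URL from the question
    data=json.dumps(teams_card),
    headers={"Content-Type": "application/json"},
)
urllib2.urlopen(req)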

Best way to build denormalized dataframe with pandas from Spotify API

I just downloaded some JSON from Spotify and took a look at pd.json_normalize().
But if I normalize the data I still have dictionaries within my dataframe. Also, setting the level doesn't help.
Data I want to have in my dataframe:
{
"collaborative": false,
"description": "",
"external_urls": {
"spotify": "https://open.spotify.com/playlist/5"
},
"followers": {
"href": null,
"total": 0
},
"href": "https://api.spotify.com/v1/playlists/5?additional_types=track",
"id": "5",
"images": [
{
"height": 640,
"url": "https://i.scdn.co/image/a",
"width": 640
}
],
"name": "Another",
"owner": {
"display_name": "user",
"external_urls": {
"spotify": "https://open.spotify.com/user/user"
},
"href": "https://api.spotify.com/v1/users/user",
"id": "user",
"type": "user",
"uri": "spotify:user:user"
},
"primary_color": null,
"public": true,
"snapshot_id": "M2QxNTcyYTkMDc2",
"tracks": {
"href": "https://api.spotify.com/v1/playlists/100&additional_types=track",
"items": [
{
"added_at": "2020-12-13T18:34:09Z",
"added_by": {
"external_urls": {
"spotify": "https://open.spotify.com/user/user"
},
"href": "https://api.spotify.com/v1/users/user",
"id": "user",
"type": "user",
"uri": "spotify:user:user"
},
"is_local": false,
"primary_color": null,
"track": {
"album": {
"album_type": "album",
"artists": [
{
"external_urls": {
"spotify": "https://open.spotify.com/artist/1dfeR4Had"
},
"href": "https://api.spotify.com/v1/artists/1dfDbWqFHLkxsg1d",
"id": "1dfeR4HaWDbWqFHLkxsg1d",
"name": "Q",
"type": "artist",
"uri": "spotify:artist:1dfeRqFHLkxsg1d"
}
],
"available_markets": [
"CA",
"US"
],
"external_urls": {
"spotify": "https://open.spotify.com/album/6wPXmlLzZ5cCa"
},
"href": "https://api.spotify.com/v1/albums/6wPXUJ9LzZ5cCa",
"id": "6wPXUmYJ9zZ5cCa",
"images": [
{
"height": 640,
"url": "https://i.scdn.co/image/ab676620a47",
"width": 640
},
{
"height": 300,
"url": "https://i.scdn.co/image/ab67616d0620a47",
"width": 300
},
{
"height": 64,
"url": "https://i.scdn.co/image/ab603e6620a47",
"width": 64
}
],
"name": "The (Deluxe ",
"release_date": "1920-07-17",
"release_date_precision": "day",
"total_tracks": 15,
"type": "album",
"uri": "spotify:album:6m5cCa"
},
"artists": [
{
"external_urls": {
"spotify": "https://open.spotify.com/artist/1dg1d"
},
"href": "https://api.spotify.com/v1/artists/1dsg1d",
"id": "1dfeR4HaWDbWqFHLkxsg1d",
"name": "Q",
"type": "artist",
"uri": "spotify:artist:1dxsg1d"
}
],
"available_markets": [
"CA",
"US"
],
"disc_number": 1,
"duration_ms": 21453,
"episode": false,
"explicit": false,
"external_ids": {
"isrc": "GBU6015"
},
"external_urls": {
"spotify": "https://open.spotify.com/track/5716J"
},
"href": "https://api.spotify.com/v1/tracks/5716J",
"id": "5716J",
"is_local": false,
"name": "Another",
"popularity": 73,
"preview_url": null,
"track": true,
"track_number": 3,
"type": "track",
"uri": "spotify:track:516J"
},
"video_thumbnail": {
"url": null
}
}
],
"limit": 100,
"next": null,
"offset": 0,
"previous": null,
"total": 1
},
"type": "playlist",
"uri": "spotify:playlist:fek"
}
So what are best practices for reading nested data like this into one dataframe in pandas?
I'm glad for any advice.
EDIT:
Basically I want all keys as columns in my dataframe, but json_normalize stops at "tracks.items", and if I normalize that again I have the same problem recursively.
It depends on the information you are looking for. Take a look at pandas.read_json() to see if that can work. Also, you can select data like this:
json_output = {"collaborative": 'false',"description": "", "external_urls": {"spotify": "https://open.spotify.com/playlist/5"}}
df['collaborative'] = json_output['collaborative'] #set value of your df to value of returned json values
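If the goal is one row per track with the playlist-level fields repeated, pd.json_normalize can drill into tracks.items via record_path. A minimal sketch; the playlist dict below is a trimmed stand-in for the JSON shown in the question so the snippet runs on its own:
import pandas as pd

# Trimmed stand-in for the playlist JSON from the question.
playlist = {
    "id": "5",
    "name": "Another",
    "owner": {"display_name": "user"},
    "tracks": {
        "items": [
            {
                "added_at": "2020-12-13T18:34:09Z",
                "track": {
                    "name": "Another",
                    "duration_ms": 21453,
                    "album": {"name": "The (Deluxe ", "release_date": "1920-07-17"},
                },
            }
        ]
    },
}

# record_path walks into tracks -> items, so each track becomes a row and its
# nested dicts are flattened with dots; meta repeats playlist-level fields.
df = pd.json_normalize(
    playlist,
    record_path=["tracks", "items"],
    meta=["id", "name", ["owner", "display_name"]],
    record_prefix="track.",
    meta_prefix="playlist.",
)
print(df.columns.tolist())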

stripe: How to convert stripe model object into JSON to get complete hierarchical data?

How can I convert the Stripe model object into JSON so I receive the complete hierarchical data at the client end?
stripeCustomer = stripe.Customer.retrieve(<stripe customer id>)
sendResponseToClient(stripeCustomer)
I am receiving only the first level of data as JSON at the client end; the second-level data from the Stripe JSON object is not formatted.
JSON data example (the second level of data is not received at the client end):
<Customer customer id=cus_DTWEPsfrHx3ikZ at 0x00000a> JSON: {
"id": "cus_DTWEPsfrHx3ikZ",
"object": "customer",
"account_balance": 0,
"created": 1535093686,
"currency": "usd",
"default_source": null,
"delinquent": false,
"description": null,
"discount": null,
"email": "rakesh16+test9#gmail.com",
"invoice_prefix": "E91FF30",
"livemode": false,
"metadata": {
},
"shipping": null,
"sources": {
"object": "list",
"data": [
],
"has_more": false,
"total_count": 0,
"url": "/v1/customers/cus_DTWEPsfrHx3ikZ/sources"
},
"subscriptions": {
"object": "list",
"data": [
{
"id": "sub_DTWEALN3urFael",
"object": "subscription",
"application_fee_percent": null,
"billing": "charge_automatically",
"billing_cycle_anchor": 1535093688,
"cancel_at_period_end": false,
"canceled_at": null,
"created": 1535093688,
"current_period_end": 1537772088,
"current_period_start": 1535093688,
"customer": "cus_DTWEPsfrHx3ikZ",
"days_until_due": null,
"discount": null,
"ended_at": null,
"items": {
"object": "list",
"data": [
{
"id": "si_DTWEuZaU4pw9Cv",
"object": "subscription_item",
"created": 1535093688,
"metadata": {
},
"plan": {
"id": "plan_free",
"object": "plan",
"active": true,
"aggregate_usage": null,
"amount": 0,
"billing_scheme": "per_unit",
"created": 1535008667,
"currency": "usd",
"interval": "month",
"interval_count": 1,
"livemode": false,
"metadata": {
},
"nickname": "free",
"product": "prod_DT8B8auk3CRNdw",
"tiers": null,
"tiers_mode": null,
"transform_usage": null,
"trial_period_days": null,
"usage_type": "licensed"
},
"quantity": 1,
"subscription": "sub_DTWEALN3urFael"
}
],
"has_more": false,
"total_count": 1,
"url": "/v1/subscription_items?subscription=sub_DTWEALN3urFael"
},
"livemode": false,
"metadata": {
},
"plan": {
"id": "plan_free",
"object": "plan",
"active": true,
"aggregate_usage": null,
"amount": 0,
"billing_scheme": "per_unit",
"created": 1535008667,
"currency": "usd",
"interval": "month",
"interval_count": 1,
"livemode": false,
"metadata": {
},
"nickname": "free",
"product": "prod_DT8B8auk3CRNdw",
"tiers": null,
"tiers_mode": null,
"transform_usage": null,
"trial_period_days": null,
"usage_type": "licensed"
},
"quantity": 1,
"start": 1535093688,
"status": "active",
"tax_percent": null,
"trial_end": null,
"trial_start": null
}
],
"has_more": false,
"total_count": 1,
"url": "/v1/customers/cus_DTWEPsfrHx3ikZ/subscriptions"
},
"tax_info": null,
"tax_info_verification": null
}
You can convert the returned Stripe object model to JSON using the following technique to ignore non-serializable fields:
stripeObject = stripe.SomeAPICall(...)
jsonEncoded = json.dumps(stripeObject, default=lambda o: '<not serializable>')
pythonDict = json.loads(jsonEncoded)
You can also use the to_dict() API, which converts the Stripe model object into dictionary format that can then be converted into a JSON string.
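A minimal sketch of the to_dict() approach, assuming the stripe library is configured with your API key and reusing the customer ID from the question:
import json
import stripe

stripe.api_key = "sk_test_xxx"  # placeholder secret key

stripeCustomer = stripe.Customer.retrieve("cus_DTWEPsfrHx3ikZ")

# to_dict() turns the StripeObject into plain Python structures; json.dumps
# then serializes the full nested hierarchy (sources, subscriptions, plans).
# The default= fallback guards against anything that is not serializable.
payload = json.dumps(stripeCustomer.to_dict(),
                     default=lambda o: '<not serializable>')

sendResponseToClient(payload)  # the send function from the question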

Cannot import grafana dashboard via Grafana API

I am trying to import a Grafana dashboard using the HTTP API, following the Grafana documentation.
Grafana Version: 5.1.3
OS -Windows 10
This is what I tried:
curl --user admin:admin "http://localhost:3000/api/dashboards/db" -X POST -H "Content-Type:application/json;charset=UTF-8" --data-binary @c:/Users/Mahadev/Desktop/Dashboard.json
and
Here is my python code
import requests
headers = {
'Content-Type': 'application/json;charset=UTF-8',
}
data = open('C:/Users/Mahadev/Desktop/Dashboard.json', 'rb').read()
response = requests.post('http://admin:admin#localhost:3000/api/dashboards/db', headers=headers, data=data)
print (response.text)
The output of both is:
[{"fieldNames":["Dashboard"],"classification":"RequiredError","message":"Required"}]
It is asking for a root property called dashboard in my JSON payload. Can anybody suggest how to use that property and what data I should provide?
If anyone wants to dig deeper, here are some links:
https://github.com/grafana/grafana/issues/8193
https://github.com/grafana/grafana/issues/2816
https://community.grafana.com/t/how-can-i-import-a-dashboard-from-a-json-file/669
https://github.com/grafana/grafana/issues/273
https://github.com/grafana/grafana/issues/5811
https://stackoverflow.com/questions/39968111/unable-to-post-to-grafana-using-python3-module-requests
https://stackoverflow.com/questions/39954475/post-request-works-in-postman-but-not-in-python/39954514#39954514
https://www.bountysource.com/issues/44431991-use-api-to-import-json-file-error
https://github.com/grafana/grafana/issues/7029
Maybe you should try to download your dashboard from the API, so you have a "proper" JSON model to push afterwards.
You can download it with the following command:
curl -H "Authorization: Bearer $TOKEN" https://grafana.domain.tld/api/dashboards/uid/$DASHBOARD_UID
Another way to do it: you can download a dashboard JSON from the Grafana website (grafana.com/dashboards) and try to upload it with your current code. ;)
The dashboard field contains everything that will be displayed: alerts, graphs, etc.
Here is an example of dashboard.json:
{
"meta": {
"type": "db",
"canSave": true,
"canEdit": true,
"canAdmin": false,
"canStar": true,
"slug": "status-app",
"url": "/d/lOy3lIImz/status-app",
"expires": "0001-01-01T00:00:00Z",
"created": "2018-06-04T11:40:20+02:00",
"updated": "2018-06-14T17:51:23+02:00",
"updatedBy": "jean",
"createdBy": "jean",
"version": 89,
"hasAcl": false,
"isFolder": false,
"folderId": 0,
"folderTitle": "General",
"folderUrl": "",
"provisioned": false
},
"dashboard": {
"annotations": {
"list": [
{
"builtIn": 1,
"datasource": "-- Grafana --",
"enable": true,
"hide": true,
"iconColor": "rgba(0, 211, 255, 1)",
"name": "Annotations & Alerts",
"type": "dashboard"
}
]
},
"editable": true,
"gnetId": null,
"graphTooltip": 0,
"id": 182,
"links": [],
"panels": [
{
"alert": {
"conditions": [
{
"evaluator": {
"params": [
1
],
"type": "lt"
},
"operator": {
"type": "and"
},
"query": {
"params": [
"A",
"5m",
"now"
]
},
"reducer": {
"params": [],
"type": "avg"
},
"type": "query"
}
],
"executionErrorState": "alerting",
"frequency": "60s",
"handler": 1,
"name": "Status of alert",
"noDataState": "alerting",
"notifications": [
{
"id": 7
}
]
},
"aliasColors": {},
"bars": false,
"dashLength": 10,
"dashes": false,
"datasource": "Collectd",
"fill": 1,
"gridPos": {
"h": 7,
"w": 8,
"x": 0,
"y": 0
},
"id": 4,
"legend": {
"alignAsTable": true,
"avg": true,
"current": true,
"max": false,
"min": false,
"rightSide": false,
"show": true,
"total": false,
"values": true
},
"lines": true,
"linewidth": 1,
"links": [],
"nullPointMode": "connected",
"percentage": false,
"pointradius": 5,
"points": false,
"renderer": "flot",
"seriesOverrides": [],
"spaceLength": 10,
"stack": false,
"steppedLine": false,
"targets": [
{
"alias": "Status",
"groupBy": [
{
"params": [
"$__interval"
],
"type": "time"
},
{
"params": [
"null"
],
"type": "fill"
}
],
"measurement": "processes_processes",
"orderByTime": "ASC",
"policy": "default",
"query": "SELECT mean(value) FROM \"processes_processes\" WHERE (\"instance\" = '' AND \"host\" = 'Webp01') AND $timeFilter GROUP BY time($interval) fill(null)",
"rawQuery": true,
"refId": "A",
"resultFormat": "time_series",
"select": [
[
{
"params": [
"value"
],
"type": "field"
},
{
"params": [],
"type": "mean"
}
]
],
"tags": [
{
"key": "instance",
"operator": "=",
"value": ""
},
{
"condition": "AND",
"key": "host",
"operator": "=",
"value": "Webp01"
}
]
}
],
"thresholds": [
{
"colorMode": "critical",
"fill": true,
"line": true,
"op": "lt",
"value": 1
}
],
"timeFrom": null,
"timeShift": null,
"title": "Status of ",
"tooltip": {
"shared": true,
"sort": 0,
"value_type": "individual"
},
"type": "graph",
"xaxis": {
"buckets": null,
"mode": "time",
"name": null,
"show": true,
"values": []
},
"yaxes": [
{
"format": "short",
"label": null,
"logBase": 1,
"max": null,
"min": null,
"show": true
},
{
"format": "short",
"label": null,
"logBase": 1,
"max": null,
"min": null,
"show": true
}
],
"yaxis": {
"align": false,
"alignLevel": null
}
}
],
"refresh": "5m",
"schemaVersion": 16,
"style": "dark",
"tags": [
"web",
"nodejs"
],
"templating": {
"list": []
},
"time": {
"from": "now/d",
"to": "now"
},
"timepicker": {
"hidden": false,
"refresh_intervals": [
"5s",
"10s",
"30s",
"1m",
"5m",
"15m",
"30m",
"1h",
"2h",
"1d"
],
"time_options": [
"5m",
"15m",
"1h",
"6h",
"12h",
"24h",
"2d",
"7d",
"30d"
]
},
"timezone": "",
"title": "Status APP",
"uid": "lOy3lIImz",
"version": 89
},
}
Edit:
Here is a JSON snippet for templating your dashboard:
"templating": {
"list": [
{
"allValue": null,
"current": {
"text": "PRD_Web01",
"value": "PRD_Web01"
},
"datasource": "Collectd",
"hide": 0,
"includeAll": false,
"label": null,
"multi": false,
"name": "host",
"options": [],
"query": "SHOW TAG VALUES WITH KEY=host",
"refresh": 1,
"regex": "",
"sort": 0,
"tagValuesQuery": "",
"tags": [],
"tagsQuery": "",
"type": "query",
"useTags": false
},
{
"allValue": null,
"current": {
"text": "sda",
"value": "sda"
},
"datasource": "Collectd",
"hide": 0,
"includeAll": false,
"label": null,
"multi": false,
"name": "device",
"options": [],
"query": "SHOW TAG VALUES FROM \"disk_read\" WITH KEY = \"instance\"",
"refresh": 1,
"regex": "",
"sort": 0,
"tagValuesQuery": "",
"tags": [],
"tagsQuery": "",
"type": "query",
"useTags": false
}
]
},
As I read your answer, I guess you will be OK with this ;). I will try to keep a better eye on this thread.
Can you show what your dashboard JSON looks like? The JSON MUST contain a dashboard key with all the details inside its value, like the following:
{
"dashboard": {
"id": null,
"uid": null,
"title": "Production Overview",
"tags": [ "templated" ],
"timezone": "browser",
"schemaVersion": 16,
"version": 0
},
"folderId": 0,
"overwrite": false
}
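Applied to the Python snippet from the question, a sketch that wraps the exported model in that dashboard envelope before posting (paths and credentials as in the question):
import json
import requests

headers = {'Content-Type': 'application/json;charset=UTF-8'}

with open('C:/Users/Mahadev/Desktop/Dashboard.json') as f:
    model = json.load(f)

# A file exported via the API already has a "dashboard" wrapper; otherwise
# the file itself is the dashboard model.
dashboard = model.get('dashboard', model)
dashboard['id'] = None  # let Grafana assign a new id on import

payload = {
    "dashboard": dashboard,
    "folderId": 0,
    "overwrite": True,  # replace an existing dashboard with the same uid
}

response = requests.post(
    'http://admin:admin@localhost:3000/api/dashboards/db',
    headers=headers,
    data=json.dumps(payload),
)
print(response.status_code, response.text)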

Parsing JSON with multiple arrays and comparing values with Excel data using Python 3.6

I have a JSON like:
{
"results": [{
"data": {
"child": [{
"sex": "2",
"birthDateReliability": "0",
"applicationInternalIdentifier": "cmpclt",
"birthDate": "2016-07-04",
"firstName": "Anna"
}],
"consumerType": "PRIVATE",
"countryCode": "FR",
"initialAppSourceCode": "ABCDWEB",
"optin": [{
"optinSourceApplication": "ABCDWEB",
"acceptanceDate": "2017-02-10T10:14:55.037Z",
"marketingGroupService": "XYZXYX-ABC"
}, {
"optinSourceApplication": "ABCDWEB",
"acceptanceDate": "2017-02-10T10:14:55.037Z",
"marketingGroupService": "XYZXYX-DEF"
}, {
"optinSourceApplication": "ABCDWEB",
"acceptanceDate": "2017-02-10T10:14:55.037Z",
"marketingGroupService": "XYZXYX-GHI"
}, {
"optinSourceApplication": "ABCDWEB",
"acceptanceDate": "2017-02-10T10:14:55.037Z",
"marketingGroupService": "XYZXYX-JKL"
}, {
"optinSourceApplication": "ABCDWEB",
"acceptanceDate": "2017-02-10T10:14:55.037Z",
"marketingGroupService": "XYZXYX-MNO"
}],
"didsys_KGexample": true,
"addressLine1": "123 Street",
"marketCode": "10107"
},
"lastUpdatedTimestamp": 1486721887742,
"socialProviders": "site",
"password": {
"hashSettings": {
"rounds": 9504778,
"salt": "XXXXXXXXX",
"algorithm": "xyz"
},
"hash": "$AF$$$F$$$$$ZX$$$$$J$--"
},
"iRank": 0,
"created": "2017-02-10T10:15:36.814Z",
"lastLoginTimestamp": 1486721736970,
"oldestDataUpdated": "2017-02-10T10:15:36.861Z",
"isLockedOut": false,
"profile": {
"zip": "12345",
"lastName": "Shah",
"email": "abc#gmail.com",
"locale": "en",
"firstName": "Jiten",
"city": "London"
},
"isVerified": false,
"createdTimestamp": 1486721736814,
"identities": [{
"lastName": "Shah",
"zip": "12345",
"isLoginIdentity": true,
"locale": "en",
"lastUpdatedTimestamp": 1486721887742,
"lastUpdated": "2017-02-10T10:18:07.742Z",
"provider": "site",
"allowsLogin": true,
"isExpiredSession": false,
"providerUID": "jbx63a0ed2f9a1cfa8cgh7dsdsl3",
"city": "London",
"oldestDataUpdatedTimestamp": 1486721736861,
"email": "abc#gmail.com",
"oldestDataUpdated": "2017-02-10T10:15:36.861Z",
"firstName": "Jiten"
}],
"lastUpdated": "2017-02-10T10:18:07.742Z",
"emails": {
"unverified": ["abc#gmail.com"],
"verified": []
},
"isRegistered": true,
"regSource": "https://abcn.net/user/register",
"lastLoginLocation": {
"state": "H9",
"coordinates": {
"lon": -0.0930938720703125,
"lat": 51.51420593261719
},
"country": "GB",
"city": "London"
},
"isActive": true,
"lastLogin": "2017-02-10T10:15:36.970Z",
"oldestDataUpdatedTimestamp": 1486721736861,
"UID": "ed9af442a4a7bd63a08a1cfa8c9d02f9",
"registered": "2017-02-10T10:18:07.993Z",
"rbaPolicy": {
"riskPolicyLocked": false
},
"loginIDs": {
"unverifiedEmails": [],
"emails": ["abc#gmail.com"]
},
"registeredTimestamp": 1486721887993,
"loginProvider": "site"
}],
"objectsCount": 1,
"totalCount": 1,
"statusCode": 200,
"errorCode": 0,
"statusReason": "OK",
"callId": "5164e6c985ee4ed9bcd76ebd403cfaaa",
"time": "2017-02-14T14:24:26.487Z"
}
I have an Excel sheet where I maintain all the above JSON keys/values in two columns.
Now I want to compare/validate all the values from the Excel sheet against the keys/values of the above JSON file using Python 3.6.
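One possible approach, sketched under the assumption that the Excel sheet has two columns named Key and Value, where Key is a dot-separated path into the JSON (the column names and file names below are hypothetical):
import json
import pandas as pd

# Hypothetical file names.
with open('response.json') as f:
    payload = json.load(f)

def lookup(obj, path):
    """Walk a dot-separated path (list indices allowed) through the JSON."""
    for part in path.split('.'):
        obj = obj[int(part)] if isinstance(obj, list) else obj[part]
    return obj

expected = pd.read_excel('expected.xlsx')  # columns: Key, Value
for _, row in expected.iterrows():
    actual = lookup(payload, row['Key'])   # e.g. "results.0.profile.email"
    status = 'OK' if str(actual) == str(row['Value']) else 'MISMATCH'
    print(row['Key'], status)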
