Here is the empty JSON that I intend to use in my back end:
{
"user": "",
"mids": {
"merchant_id": {
"name": "",
"cruise_credentials": {
"APIkey": "",
"APIidentifier": "",
"OrgUnitId": ""
},
"SAWB": {
"ProfileID": "",
"AccesKey": "",
"SecretKey": ""
}
}
}
}
Each user might have more than one Merchant ID. As such, I need to be able to add another whole merchant_id subtree, as follows:
{
"user": "",
"mids": {
"merchant_id": {
"name": "",
"cruise_credentials": {
"APIkey": "",
"APIidentifier": "",
"OrgUnitId": ""
},
"SAWB": {
"ProfileID": "",
"AccesKey": "",
"SecretKey": ""
}
},
"merchant_id2": {
"name": "",
"cruise_credentials": {
"APIkey": "",
"APIidentifier": "",
"OrgUnitId": ""
},
"SAWB": {
"ProfileID": "",
"AccesKey": "",
"SecretKey": ""
}
}
}
}
Is there an easy way to achieve this addition/removal of a JSON subtree in Python?
Thanks a lot!
You can try .update():
base = {
"user": "",
"mids": {
"merchant_id": {
"name": "",
"cruise_credentials": {
"APIkey": "",
"APIidentifier": "",
"OrgUnitId": ""
},
"SAWB": {
"ProfileID": "",
"AccesKey": "",
"SecretKey": ""
}
}
}
}
another_merchant = {"merchant_id2": {
"name": "",
"cruise_credentials": {
"APIkey": "",
"APIidentifier": "",
"OrgUnitId": ""
},
"SAWB": {
"ProfileID": "",
"AccesKey": "",
"SecretKey": ""
}
}
}
base["mids"].update(another_merchant)
print(json.dumps(base, indent=4))
{
"user": "",
"mids":
{
"merchant_id":
{
"name": "",
"cruise_credentials":
{
"APIkey": "",
"APIidentifier": "",
"OrgUnitId": ""
},
"SAWB":
{
"ProfileID": "",
"AccesKey": "",
"SecretKey": ""
}
},
"merchant_id2":
{
"name": "",
"cruise_credentials":
{
"APIkey": "",
"APIidentifier": "",
"OrgUnitId": ""
},
"SAWB":
{
"ProfileID": "",
"AccesKey": "",
"SecretKey": ""
}
}
}
}
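For removal, dict.pop is the counterpart to .update(). A minimal sketch (the subtrees are trimmed to just "name" for brevity):

```python
# Removing a merchant subtree again with dict.pop
base = {
    "user": "",
    "mids": {
        "merchant_id": {"name": ""},
        "merchant_id2": {"name": ""},
    },
}

# The None default avoids a KeyError if the key is already gone
base["mids"].pop("merchant_id2", None)
print(list(base["mids"]))  # ['merchant_id']
```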
Thanks a lot for your help.
There are two solutions, depending on how we want to handle this.
Using a LIST of merchant entries allows us to simply append a new merchant dictionary after the first one:
{
"user": "",
"merchant_id": [
{
"name": "",
"cruise_credentials": {
"APIkey": "",
"APIidentifier": "",
"OrgUnitId": ""
},
"SAWB": {
"ProfileID": "",
"AccesKey": "",
"SecretKey": ""
}
},
{
"name": "",
"cruise_credentials": {
"APIkey": "",
"APIidentifier": "",
"OrgUnitId": ""
},
"SAWB": {
"ProfileID": "",
"AccesKey": "",
"SecretKey": ""
}
}
]
}
Or we can use the .update() function. However, this forces us to rename the key ourselves (e.g. merchant_id to merchant_id2) and requires additional logic.
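The list-based approach can be sketched like this (the make_merchant helper is hypothetical, just to avoid repeating the empty schema; removal filters by the "name" field):

```python
def make_merchant(name):
    # Build one empty merchant entry matching the schema above
    return {
        "name": name,
        "cruise_credentials": {"APIkey": "", "APIidentifier": "", "OrgUnitId": ""},
        "SAWB": {"ProfileID": "", "AccesKey": "", "SecretKey": ""},
    }

merchants = {"user": "", "merchant_id": []}

# Addition: just append another dictionary to the list
merchants["merchant_id"].append(make_merchant("first"))
merchants["merchant_id"].append(make_merchant("second"))

# Removal: filter the list by the "name" field
merchants["merchant_id"] = [
    m for m in merchants["merchant_id"] if m["name"] != "first"
]
```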
I have a program that takes a file and transforms it into JSON format.
I'm trying to get all the values of certain keys into a list, but because the JSON has a bunch of keys that are present multiple times, I can't find a way to do it properly.
My json file looks like this
{
"data": {
"__schema": {
"queryType": {
"fields": [
{
"description": "",
"name": "project"
},
{
"description": "",
"name": "projectEventFeed"
},
{
"description": "",
"name": "projectEventFeedFetchMore"
},
{
"description": "",
"name": "projectRecentEventFeed"
},
{
"description": "",
"name": "unseenProjectActivityCount"
},
{
"description": "",
"name": "projectFiles"
},
{
"description": "",
"name": "projectFilesIdSet"
},
{
"description": "",
"name": "projectFileMessages"
},
{
"description": "",
"name": "projectUserStatus"
},
{
"description": "",
"name": "projectFileScribble"
},
{
"description": "",
"name": "user"
},
{
"description": "",
"name": "viewer"
},
{
"description": "",
"name": "profile"
},
{
"description": "",
"name": "site"
},
{
"description": "",
"name": "designers"
},
{
"description": "",
"name": "predictImageCategory"
},
{
"description": "",
"name": "getPortfolioDesign"
}
]
}
}
}
}
My goal is to get all the name values into a list.
Before turning the file into JSON, I tried getting the names with a regex but failed.
With the JSON format I tried the following:
map(lambda parsed_json: parsed_json['data']['__schema']['queryType']['fields']['name'], List)
(I'm importing List from typing.)
But when I try to turn the map into a list, I get
TypeError: Parameters to generic types must be types. Got 0.
from the conversion.
You could just use a list comprehension on the nested 'fields' key of the dict you converted from your JSON:
d = {"data": {"__schema": {"queryType": {"fields": [{"description": "", "name": "project"}, {"description": "", "name": "projectEventFeed"}, {"description": "", "name": "projectEventFeedFetchMore"}, {"description": "", "name": "projectRecentEventFeed"}, {"description": "", "name": "unseenProjectActivityCount"}, {"description": "", "name": "projectFiles"}, {"description": "", "name": "projectFilesIdSet"}, {"description": "", "name": "projectFileMessages"}, {"description": "", "name": "projectUserStatus"}, {"description": "", "name": "projectFileScribble"}, {"description": "", "name": "user"}, {"description": "", "name": "viewer"}, {"description": "", "name": "profile"}, {"description": "", "name": "site"}, {"description": "", "name": "designers"}, {"description": "", "name": "predictImageCategory"}, {"description": "", "name": "getPortfolioDesign"}]}}}}
fields = [f['name'] for f in d['data']['__schema']['queryType']['fields']]
print(fields)
# ['project', 'projectEventFeed', 'projectEventFeedFetchMore', 'projectRecentEventFeed', 'unseenProjectActivityCount', 'projectFiles', 'projectFilesIdSet', 'projectFileMessages', 'projectUserStatus', 'projectFileScribble', 'user', 'viewer', 'profile', 'site', 'designers', 'predictImageCategory', 'getPortfolioDesign']
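If the 'name' keys can appear at varying depths rather than only under 'fields', a small recursive walk is an alternative (a sketch, not required for the layout above; the sample dict is trimmed):

```python
def collect_values(obj, key):
    """Recursively gather every value stored under `key` in nested dicts/lists."""
    found = []
    if isinstance(obj, dict):
        for k, v in obj.items():
            if k == key:
                found.append(v)
            found.extend(collect_values(v, key))
    elif isinstance(obj, list):
        for item in obj:
            found.extend(collect_values(item, key))
    return found

d = {"data": {"__schema": {"queryType": {"fields": [
    {"description": "", "name": "project"},
    {"description": "", "name": "user"},
]}}}}
print(collect_values(d, 'name'))  # ['project', 'user']
```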
I'm trying to read JSON from a text file. I can usually convert the text to JSON, but for some data it throws this error: Extra data: line 2 column 1 (char 876): JSONDecodeError.
Here is the error stacktrace.
Extra data: line 2 column 1 (char 876): JSONDecodeError
Traceback (most recent call last):
File "/var/task/lambda_function.py", line 28, in lambda_handler
d = json.loads(got_text)
File "/var/lang/lib/python3.6/json/__init__.py", line 354, in loads
return _default_decoder.decode(s)
File "/var/lang/lib/python3.6/json/decoder.py", line 342, in decode
raise JSONDecodeError("Extra data", s, end)
json.decoder.JSONDecodeError: Extra data: line 2 column 1 (char 876)
Here is the code.
retr = s3_client.get_object(Bucket=bucket, Key=key)
bytestream = BytesIO(retr['Body'].read())
got_text = GzipFile(mode='rb', fileobj=bytestream).read().decode('utf-8')
print(got_text)
d = json.loads(got_text)
print("json output")
print(d)
Here is the json.
{
"_metadata": {
"bundled": [
"Segment.io"
],
"unbundled": []
},
"anonymousId": "98cc0c53-jkhjkhj-42d5-8ee1-08a6d6f4e774",
"context": {
"library": {
"name": "analytics.js",
"version": "3.2.5"
},
"page": {
"path": "/login",
"referrer": "http://localhost:8000/",
"search": "",
"title": "Sign in or Register | Your Platform Name Here",
"url": "http://localhost:8000/login"
},
"userAgent": "Mozilla/5.0 ",
"ip": "67.67.88.68"
},
"integrations": {},
"messageId": "ajs-dfbdfbdfbdb",
"properties": {
"path": "/login",
"referrer": "http://localhost:8000/",
"search": "",
"title": "Sign in or Register | Your Platform Name Here",
"url": "http://localhost:8000/login"
},
"receivedAt": "2018-02-05T09:21:02.539Z",
"sentAt": "2018-02-05T09:21:02.413Z",
"timestamp": "2018-02-05T09:21:02.535Z",
"type": "page",
"userId": "16",
"channel": "client",
"originalTimestamp": "2018-02-05T09:21:02.409Z",
"projectId": "dfbfbdfb",
"version": 2
}
What could be the problem?
Looks like you have bad (typographic) quotes in your JSON data. Just replace the invalid quotes with straight quotes and then parse it as JSON.
import json
d = '''{
"_metadata": {
"bundled": [
"Segment.io"
],
"unbundled": []
},
"anonymousId": "98cc0c53-jkhjkhj-42d5-8ee1-08a6d6f4e774",
"context": {
"library": {
"name": "analytics.js",
"version": "3.2.5"
},
"page": {
"path": "/login",
"referrer": "http://localhost:8000/",
"search": "",
"title": "Sign in or Register | Your Platform Name Here",
"url": "http://localhost:8000/login"
},
"userAgent": "Mozilla/5.0 ",
"ip": “67.67.688.68”
},
"integrations": {},
"messageId": "ajs-dfbdfbdfbdb”,
"properties": {
"path": "/login",
"referrer": "http://localhost:8000/",
"search": "",
"title": "Sign in or Register | Your Platform Name Here",
"url": "http://localhost:8000/login"
},
"receivedAt": "2018-02-05T09:21:02.539Z",
"sentAt": "2018-02-05T09:21:02.413Z",
"timestamp": "2018-02-05T09:21:02.535Z",
"type": "page",
"userId": "16",
"channel": "client",
"originalTimestamp": "2018-02-05T09:21:02.409Z",
"projectId": “dfbfbdfb”,
"version": 2
}
'''
d = d.replace("“", '"').replace("”", '"')
print(json.loads(d))
Output:
{u'projectId': u'dfbfbdfb', u'timestamp': u'2018-02-05T09:21:02.535Z', u'version': 2, u'userId': u'16', u'integrations': {}, u'receivedAt': u'2018-02-05T09:21:02.539Z', u'_metadata': {u'bundled': [u'Segment.io'], u'unbundled': []}, u'anonymousId': u'98cc0c53-jkhjkhj-42d5-8ee1-08a6d6f4e774', u'originalTimestamp': u'2018-02-05T09:21:02.409Z', u'context': {u'userAgent': u'Mozilla/5.0 ', u'page': {u'url': u'http://localhost:8000/login', u'path': u'/login', u'search': u'', u'title': u'Sign in or Register | Your Platform Name Here', u'referrer': u'http://localhost:8000/'}, u'library': {u'version': u'3.2.5', u'name': u'analytics.js'}, u'ip': u'67.67.688.68'}, u'messageId': u'ajs-dfbdfbdfbdb', u'type': u'page', u'properties': {u'url': u'http://localhost:8000/login', u'path': u'/login', u'search': u'', u'title': u'Sign in or Register | Your Platform Name Here', u'referrer': u'http://localhost:8000/'}, u'channel': u'client', u'sentAt': u'2018-02-05T09:21:02.413Z'}
In your case
got_text = got_text.replace("“", '"').replace("”", '"')
d = json.loads(got_text)
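Worth noting: "Extra data: line 2 column 1" can also mean the payload holds more than one JSON document, e.g. one object per line, which json.loads refuses in a single call. A hedged sketch for that case (got_text here is a stand-in for the decoded S3 payload):

```python
import json

got_text = '{"a": 1}\n{"a": 2}\n'  # stand-in: two JSON documents, one per line

# Parse each non-empty line as its own JSON document
records = [json.loads(line) for line in got_text.splitlines() if line.strip()]
print(records)  # [{'a': 1}, {'a': 2}]
```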
Pay attention to several of the strings you have. JSON doesn't support the typographic ” quotes that sometimes appear in your data.
Lines with wrong quotes:
"projectId":“dfbfbdfb”,
"messageId":"ajs-dfbdfbdfbdb”,
"ip":“67.67.688.68”
Here is the fixed JSON:
{
"_metadata": {
"bundled": [
"Segment.io"
],
"unbundled": []
},
"anonymousId": "98cc0c53-jkhjkhj-42d5-8ee1-08a6d6f4e774",
"context": {
"library": {
"name": "analytics.js",
"version": "3.2.5"
},
"page": {
"path": "/login",
"referrer": "http://localhost:8000/",
"search": "",
"title": "Sign in or Register | Your Platform Name Here",
"url": "http://localhost:8000/login"
},
"userAgent": "Mozilla/5.0 ",
"ip": "67.67.688.68"
},
"integrations": {},
"messageId": "ajs-dfbdfbdfbdb",
"properties": {
"path": "/login",
"referrer": "http://localhost:8000/",
"search": "",
"title": "Sign in or Register | Your Platform Name Here",
"url": "http://localhost:8000/login"
},
"receivedAt": "2018-02-05T09:21:02.539Z",
"sentAt": "2018-02-05T09:21:02.413Z",
"timestamp": "2018-02-05T09:21:02.535Z",
"type": "page",
"userId": "16",
"channel": "client",
"originalTimestamp": "2018-02-05T09:21:02.409Z",
"projectId": "dfbfbdfb",
"version": 2
}
I have a script that writes a JSON web-service response to an Esri file geodatabase. There is a 1-M relationship between addresses and requests; the requests are read and written as 3 fields. Below is an example of the JSON response and how the data appears in a table. I would like my data to read as one feature/row, with the multiple request types and quantities associated with the one address and Service Request number, i.e. Address, Type, E-Waste Item 1, Quantity 1, E-Waste Item 2, Quantity 2, etc.
{
"Response": {
"ListOfServiceRequest": {
"ServiceRequest": [
{
"ActionTaken": "",
"AddressVerified": "Y",
"Anonymous": "N",
"AssignTo": "EV",
"Assignee": "",
"CreatedByUserLogin": "MYLATHREEONEONE",
"CreatedDate": "02/17/2015 16:53:25",
"CustomerAccessNumber": "",
"Email": "mylathreeoneone#gmail.com",
"FirstName": "Myla",
"HomePhone": "2131234567",
"IntegrationId": "02172015165417667",
"LADWPAccountNo": "",
"Language": "",
"LastName": "Threeoneone",
"Latitude": "34.176277",
"ListOfAuditTrailItem2": {
"AuditTrailItem2": [
{
"Date": "02/27/2015 11:45:46",
"EmployeeLogin": "EAIBOS",
"Field": "Status",
"NewValue": "Closed",
"OldValue": "Open"
},
{
"Date": "02/27/2015 11:45:46",
"EmployeeLogin": "EAIBOS",
"Field": "Updated By User",
"NewValue": "EAIBOS",
"OldValue": "MYLATHREEONEONE"
}
]
},
"ListOfChildServiceRequest": {},
"ListOfLa311BarricadeRemoval": {},
"ListOfLa311BeesOrBeehive": {},
"ListOfLa311BillingCsscAdjustment": {},
"ListOfLa311BillingEccAdjustment": {},
"ListOfLa311BillingRsscAdjustment": {},
"ListOfLa311BillingRsscExemption": {},
"ListOfLa311BrushItemsPickup": {},
"ListOfLa311BulkyItem": {},
"ListOfLa311BusPadLanding": {},
"ListOfLa311Containers": {},
"ListOfLa311CurbRepair": {},
"ListOfLa311DeadAnimalRemoval": {},
"ListOfLa311DocumentLog": {},
"ListOfLa311ElectronicWaste": {
"La311ElectronicWaste": [
{
"CollectionLocation": "Gated Community/Multifamily Dw",
"DriverFirstName": "Moody",
"DriverLastName": "Frederick10/09/2014",
"ElectronicWestType": "Microwaves",
"GatedCommunityMultifamilyDwelling": "Curb",
"IllegalDumpCollectionLoc": "",
"IllegallyDumped": "N",
"ItemCount": "3",
"LastUpdatedBy": "",
"MobileHomeSpace": "",
"Name": "021720151654176711",
"OtherElectronicWestType": "",
"ServiceDateRendered": "",
"TruckNo": "SC Truck 10",
"Type": "Electronic Waste"
},
{
"CollectionLocation": "Gated Community/Multifamily Dw",
"DriverFirstName": "Moody",
"DriverLastName": "Frederick10/09/2014",
"ElectronicWestType": "Televisions (Any Size)",
"GatedCommunityMultifamilyDwelling": "Curb",
"IllegalDumpCollectionLoc": "",
"IllegallyDumped": "N",
"ItemCount": "6",
"LastUpdatedBy": "",
"MobileHomeSpace": "",
"Name": "021720151654176722",
"OtherElectronicWestType": "",
"ServiceDateRendered": "",
"TruckNo": "SC Truck 10SC Truck 10",
"Type": "Electronic Waste"
},
{
"CollectionLocation": "Gated Community/Multifamily Dw",
"DriverFirstName": "Moody",
"DriverLastName": "Frederick10/09/2014",
"ElectronicWestType": "VCR/DVD Players",
"GatedCommunityMultifamilyDwelling": "Curb",
"IllegalDumpCollectionLoc": "",
"IllegallyDumped": "N",
"ItemCount": "1",
"LastUpdatedBy": "",
"MobileHomeSpace": "",
"Name": "021720151654176723",
"OtherElectronicWestType": "",
"ServiceDateRendered": "",
"TruckNo": "SC Truck 10SC Truck 10",
"Type": "Electronic Waste"
}
]
},
"ListOfLa311Flooding": {},
"ListOfLa311GeneralStreetInspection": {},
"ListOfLa311GenericBc": {
"La311GenericBc": [
{
"ATTRIB_08": "N",
"ATTRIB_16": "",
"ListOfLa311GenericbcAuditTrail": {
"La311GenericbcAuditTrail": [
{
"Date": "02/27/2015 15:15:28",
"EmployeeLogin": "EAIBOS",
"Field": "Driver Last Name",
"NewValue": "Frederick10/09/2014",
"OldValue": "Frederick"
},
{
"Date": "02/27/2015 11:45:46",
"EmployeeLogin": "EAIBOS",
"Field": "Driver First Name",
"NewValue": "Moody",
"OldValue": ""
},
{
"Date": "02/27/2015 11:45:46",
"EmployeeLogin": "EAIBOS",
"Field": "Driver Last Name",
"NewValue": "Frederick",
"OldValue": ""
},
{
"Date": "02/27/2015 11:45:46",
"EmployeeLogin": "EAIBOS",
"Field": "Truck No",
"NewValue": "SC Truck 10",
"OldValue": ""
}
]
},
"NAME": "021720151654176711",
"PAR_ROW_ID": "1-24QH7",
"ROW_ID": "1-24QHR",
"TYPE": "Electronic Waste"
},
{
"ATTRIB_08": "N",
"ATTRIB_16": "",
"ListOfLa311GenericbcAuditTrail": {
"La311GenericbcAuditTrail": [
{
"Date": "02/27/2015 15:15:28",
"EmployeeLogin": "EAIBOS",
"Field": "Driver Last Name",
"NewValue": "Frederick10/09/2014",
"OldValue": "Frederick"
},
{
"Date": "02/27/2015 11:45:46",
"EmployeeLogin": "EAIBOS",
"Field": "Driver First Name",
"NewValue": "Moody",
"OldValue": ""
},
{
"Date": "02/27/2015 11:45:46",
"EmployeeLogin": "EAIBOS",
"Field": "Driver Last Name",
"NewValue": "Frederick",
"OldValue": ""
},
{
"Date": "02/27/2015 11:45:46",
"EmployeeLogin": "EAIBOS",
"Field": "Truck No",
"NewValue": "SC Truck 10SC Truck 10",
"OldValue": ""
}
]
},
"NAME": "021720151654176722",
"PAR_ROW_ID": "1-24QH7",
"ROW_ID": "1-24QHS",
"TYPE": "Electronic Waste"
},
{
"ATTRIB_08": "N",
"ATTRIB_16": "",
"ListOfLa311GenericbcAuditTrail": {
"La311GenericbcAuditTrail": [
{
"Date": "02/27/2015 15:15:28",
"EmployeeLogin": "EAIBOS",
"Field": "Driver Last Name",
"NewValue": "Frederick10/09/2014",
"OldValue": "Frederick"
},
{
"Date": "02/27/2015 11:45:46",
"EmployeeLogin": "EAIBOS",
"Field": "Driver First Name",
"NewValue": "Moody",
"OldValue": ""
},
{
"Date": "02/27/2015 11:45:46",
"EmployeeLogin": "EAIBOS",
"Field": "Driver Last Name",
"NewValue": "Frederick",
"OldValue": ""
},
{
"Date": "02/27/2015 11:45:46",
"EmployeeLogin": "EAIBOS",
"Field": "Truck No",
"NewValue": "SC Truck 10SC Truck 10",
"OldValue": ""
}
]
},
"NAME": "021720151654176723",
"PAR_ROW_ID": "1-24QH7",
"ROW_ID": "1-24QHT",
"TYPE": "Electronic Waste"
},
{
"ATTRIB_08": "",
"ATTRIB_16": "",
"ListOfLa311GenericbcAuditTrail": {},
"NAME": "02172015165417667100",
"PAR_ROW_ID": "1-24QH7",
"ROW_ID": "1-24QHU",
"TYPE": "GIS"
},
{
"ATTRIB_08": "",
"ATTRIB_16": "",
"ListOfLa311GenericbcAuditTrail": {},
"NAME": "021720151654176671",
"PAR_ROW_ID": "1-24QH7",
"ROW_ID": "1-24QHQ",
"TYPE": "SR Photo ID"
}
]
},
"ListOfLa311GisLayer": {
"La311GisLayer": [
{
"A_Call_No": "",
"Area": "0",
"BOSRadioHolderName": "",
"CommunityPlanningArea": "",
"Day": "THURSDAY",
"DirectionSuffix": "",
"DistrictAbbr": "",
"DistrictName": "EV",
"DistrictNumber": "",
"DistrictOffice": "",
"Fraction": "",
"LastUpdatedBy": "",
"Name": "02172015165417667100",
"R_Call_No": "",
"SectionId": "",
"ShortDay": "Thu",
"StreetFrom": "",
"StreetLightId": "",
"StreetLightStatus": "",
"StreetTo": "",
"Type": "GIS",
"Y_Call_No": ""
}
]
},
"ListOfLa311GraffitiRemoval": {},
"ListOfLa311GuardWarningRailMaintenance": {},
"ListOfLa311GutterRepair": {},
"ListOfLa311HomelessEncampment": {},
"ListOfLa311IllegalAutoRepair": {},
"ListOfLa311IllegalConstruction": {},
"ListOfLa311IllegalConstructionFence": {},
"ListOfLa311IllegalDischargeOfWater": {},
"ListOfLa311IllegalDumpingInProgress": {},
"ListOfLa311IllegalDumpingPickup": {},
"ListOfLa311IllegalExcavation": {},
"ListOfLa311IllegalSignRemoval": {},
"ListOfLa311IllegalVending": {},
"ListOfLa311InformationOnly": {},
"ListOfLa311LandMudSlide": {},
"ListOfLa311LeafBlowerViolation": {},
"ListOfLa311ManualPickup": {},
"ListOfLa311MedianIslandMaintenance": {},
"ListOfLa311MetalHouseholdAppliancesPickup": {},
"ListOfLa311MoveInMoveOut": {},
"ListOfLa311MultipleStreetlightIssue": {},
"ListOfLa311NewsRackViolation": {},
"ListOfLa311Obstructions": {},
"ListOfLa311Other": {},
"ListOfLa311OvergrownVegetationPlants": {},
"ListOfLa311PalmFrondsDown": {},
"ListOfLa311Pothole": {},
"ListOfLa311Resurfacing": {},
"ListOfLa311SanitationBillingBif": {},
"ListOfLa311SanitationBillingCssc": {},
"ListOfLa311SanitationBillingEcc": {},
"ListOfLa311SanitationBillingInquiry": {},
"ListOfLa311SanitationBillingLifeline": {},
"ListOfLa311SanitationBillingRssc": {},
"ListOfLa311SanitationBillingSrf": {},
"ListOfLa311ServiceNotComplete": {},
"ListOfLa311ServiceRequestNotes": {
"La311ServiceRequestNotes": [
{
"Comment": "Out on the sidewalk near the curb. Hopefully it is still there.",
"CommentType": "Address Comments",
"CreatedByUser": "MYLATHREEONEONE",
"CreatedDate": "02/17/2015 16:53:26",
"Date1": "",
"Date2": "",
"Date3": "",
"FeedbackSRType": "",
"IntegrationId": "021720151654176661",
"IsSrNoAvailable": "",
"ListOfLa311SrNotesAuditTrail": {},
"Notification": "N",
"Text1": ""
},
{
"Comment": "So glad to get rid of this old junk. Thanks.",
"CommentType": "External",
"CreatedByUser": "MYLATHREEONEONE",
"CreatedDate": "02/17/2015 16:53:26",
"Date1": "",
"Date2": "",
"Date3": "",
"FeedbackSRType": "",
"IntegrationId": "021720151654176662",
"IsSrNoAvailable": "",
"ListOfLa311SrNotesAuditTrail": {},
"Notification": "N",
"Text1": ""
}
]
},
"ListOfLa311SidewalkRepair": {},
"ListOfLa311SingleStreetlightIssue": {},
"ListOfLa311SrPhotoId": {
"La311SrPhotoId": [
{
"LastUpdatedBy": "",
"Name": "021720151654176671",
"PhotoId": "https://myla311.lacity.org/portal/docview?id=04b8ba678fe21d32b05673eb9ad7711b",
"Type": "SR Photo ID"
}
]
},
"ListOfLa311StreetSweeping": {},
"ListOfLa311StreetTreeInspection": {},
"ListOfLa311StreetTreeViolations": {},
"ListOfLa311SubscribeDuplicateSr": {},
"ListOfLa311TablesAndChairsObstructing": {},
"ListOfLa311TreeEmergency": {},
"ListOfLa311TreeObstruction": {},
"ListOfLa311TreePermits": {},
"ListOfLa311WeedAbatementForPrivateParcels": {},
"LoginUser": "",
"Longitude": "-118.455249",
"MobilOS": "",
"NewContactEmail": "",
"NewContactFirstName": "",
"NewContactLastName": "",
"NewContactPhone": "",
"Owner": "BOS",
"ParentSRLinkDate": "",
"ParentSRLinkUser": "",
"ParentSRNumber": "",
"ParentSRStatus": "",
"ParentSRType": "",
"Priority": "Normal",
"ReasonCode": "",
"RescheduleCounter": "",
"ResolutionCode": "SW",
"SRAddress": "5810 N WILLIS AVE, 91411",
"SRAddressName": "",
"SRAreaPlanningCommission": "South Valley APC",
"SRAreaPlanningCommissionId": "3",
"SRCity": "",
"SRCommunityPoliceStation": "VALLEY BUREAU",
"SRCommunityPoliceStationAPREC": "VAN NUYS",
"SRCommunityPoliceStationPREC": "9",
"SRCouncilDistrictMember": "Tom LaBonge",
"SRCouncilDistrictNo": "4",
"SRCrossStreet": "",
"SRDirection": "N",
"SRHouseNumber": "5810",
"SRNeighborhoodCouncilId": "20",
"SRNeighborhoodCouncilName": "VAN NUYS NC",
"SRNumber": "1-3580171",
"SRStreetName": "WILLIS",
"SRSuffix": "AVE",
"SRTBColumn": "J",
"SRTBMapGridPage": "561",
"SRTBRow": "1",
"SRType": "Electronic Waste",
"SRUnitNumber": "5810",
"SRXCoordinate": "6423983",
"SRYCoordinate": "1886848",
"ServiceDate": "02/19/2015 00:00:00",
"Source": "",
"Status": "Closed",
"UpdatedByUserLogin": "EAIBOS",
"UpdatedDate": "02/27/2015 15:14:22",
"Zipcode": "91411"
}
]
},
"NumOutputObjects": "1"
}
}
Script
import json
import jsonpickle
import requests
import arcpy

# Raw strings so the backslashes in Windows paths are not treated as escapes
fc = r"C:\MYLATesting.gdb\MYLA311"
if arcpy.Exists(fc):
    arcpy.Delete_management(fc)
ListTable = r"C:\MYLATesting.gdb\ListView"
if arcpy.Exists(ListTable):
    arcpy.Delete_management(ListTable)

f2 = open(r'C:\Users\Administrator\Desktop\DetailView.json', 'r')
data2 = jsonpickle.encode(jsonpickle.decode(f2.read()))
url2 = "myURL"
headers2 = {'Content-type': 'text/plain', 'Accept': '/'}
r2 = requests.post(url2, data=data2, headers=headers2)
decoded2 = json.loads(r2.text)

items = []
for sr in decoded2['Response']['ListOfServiceRequest']['ServiceRequest']:
    SRAddress = sr['SRAddress']
    latitude = sr['Latitude']
    longitude = sr['Longitude']
    for ew in sr["ListOfLa311ElectronicWaste"]["La311ElectronicWaste"]:
        CommodityType = ew['Type']
        ItemType = ew['ElectronicWestType']
        ItemCount = ew['ItemCount']
        items.append((SRAddress,
                      latitude,
                      longitude,
                      CommodityType,
                      ItemType,
                      ItemCount))

import numpy as np  # NOTE THIS
dt = np.dtype([('SRAddress', 'U40'),
               ('latitude', '<f8'),
               ('longitude', '<f8'),
               ('Type', 'U40'),
               ('ElectronicWestType', 'U40'),
               ('ItemCount', 'U40')])
arr = np.array(items, dtype=dt)
sr = arcpy.SpatialReference(4326)
arcpy.da.NumPyArrayToFeatureClass(arr, fc, ['longitude', 'latitude'], sr)
print(json.dumps(decoded2, sort_keys=True, indent=4))
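For the desired one-feature-per-address layout, the flat (address, lat, lon, type, item, count) tuples could be grouped by address before building the array. A minimal sketch using only the standard library (the sample tuples and the resulting column order are assumptions, not the final geodatabase schema):

```python
from collections import OrderedDict

# Stand-in for the (SRAddress, lat, lon, CommodityType, ItemType, ItemCount) tuples
items = [
    ("5810 N WILLIS AVE, 91411", "34.176277", "-118.455249",
     "Electronic Waste", "Microwaves", "3"),
    ("5810 N WILLIS AVE, 91411", "34.176277", "-118.455249",
     "Electronic Waste", "Televisions (Any Size)", "6"),
]

rows = OrderedDict()
for addr, lat, lon, ctype, itype, count in items:
    # First occurrence of an address creates the row; later ones extend it
    row = rows.setdefault(addr, [addr, lat, lon, ctype])
    row.extend([itype, count])  # appends E-Waste Item N, Quantity N pairs

wide = list(rows.values())
# one row per address: [addr, lat, lon, type, item1, qty1, item2, qty2, ...]
```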