Dictionary to Pandas Dataframe without un-nesting some values - python

I have the dictionary below, and I only want the columns to be key, metrics and collectionPeriod. These columns can hold nested values, which I would leave as-is for now and un-nest later. But for some reason the values in the dataframe look off.
{'key': {'formFactor': 'PHONE', 'origin': 'https://www.sample'},
 'metrics': {
     'cumulative_layout_shift': {
         'histogram': [{'start': '0.00', 'end': '0.10', 'density': 0.7861256879559706},
                       {'start': '0.10', 'end': '0.25', 'density': 123},
                       {'start': '0.25', 'density': 111}],
         'percentiles': {'p75': '0.07'}},
     'experimental_interaction_to_next_paint': {
         'histogram': [{'start': 0, 'end': 200, 'density': 0.5416453755748598},
                       {'start': 200, 'end': 500, 'density': 1},
                       {'start': 500, 'density': 23}],
         'percentiles': {'p75': 504}},
     'experimental_time_to_first_byte': {
         'histogram': [{'start': 0, 'end': 800, 'density': 123},
                       {'start': 800, 'end': 1800, 'density': 123},
                       {'start': 1800, 'density': 23}],
         'percentiles': {'p75': 877}},
     'first_contentful_paint': {
         'histogram': [{'start': 0, 'end': 1800, 'density': 22},
                       {'start': 1800, 'end': 3000, 'density': 664},
                       {'start': 3000, 'density': 67}],
         'percentiles': {'p75': 1662}},
     'first_input_delay': {
         'histogram': [{'start': 0, 'end': 100, 'density': 234},
                       {'start': 100, 'end': 300, 'density': 44},
                       {'start': 300, 'density': 555}],
         'percentiles': {'p75': 34}},
     'largest_contentful_paint': {
         'histogram': [{'start': 0, 'end': 2500, 'density': 0.7725250984877367},
                       {'start': 2500, 'end': 4000, 'density': 777},
                       {'start': 4000, 'density': 544}],
         'percentiles': {'p75': 2352}}},
 'collectionPeriod': {'firstDate': {'year': 2022, 'month': 10, 'day': 14},
                      'lastDate': {'year': 2022, 'month': 11, 'day': 10}}}
When I pass the above dict (res) to the code below, I get an index that is actually made up of the nested keys, which I don't want. The dataframe should only have one row:
df = pd.DataFrame.from_dict(res, orient='columns')
df

Given the format of the data, wrap res in a one-element list so pandas treats it as a single record; pd.DataFrame.from_dict() then outputs the desired one-row format:
df = pd.DataFrame.from_dict([res])
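For reference, a minimal sketch (not from the original answer) of what this produces, and one way the nested columns could be un-nested later with pd.json_normalize; the variable names are just for illustration:

import pandas as pd

df = pd.DataFrame.from_dict([res])   # effectively the same as pd.DataFrame([res])
print(df.shape)                      # (1, 3) -> one row
print(df.columns.tolist())           # ['key', 'metrics', 'collectionPeriod']

# Later, a nested column can be expanded, e.g. the 'key' column:
key_df = pd.json_normalize(df['key'].tolist())   # columns: formFactor, origin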

Related

Iterating over JSON data and printing. (or creating Pandas DataFrame from JSON file)

I’m trying to use Python to print specific values from a JSON file that I pulled from an API. From what I understand, I am pulling it as a JSON file that has a list of dictionaries of players, with a nested dictionary for each player containing their data (i.e. name, team, etc.).
I’m running into issues printing the values within the JSON file, as each character is printing on a separate line.
The end result I am trying to get to is a Pandas DataFrame containing all the values from the JSON file, but I can’t even seem to iterate through the JSON file correctly.
Here is my code:
import json
import requests

url = "https://api-football-v1.p.rapidapi.com/v3/players"
querystring = {"league": "39", "season": "2020", "page": "2"}
headers = {
    "X-RapidAPI-Host": "api-football-v1.p.rapidapi.com",
    "X-RapidAPI-Key": "xxxxxkeyxxxxx"
}

response = requests.request("GET", url, headers=headers, params=querystring).json()
response_dump = json.dumps(response)

for item in response_dump:
    for player_item in item:
        print(player_item)
This is the output when I print the JSON response (first two items):
{'get': 'players', 'parameters': {'league': '39', 'page': '2', 'season': '2020'}, 'errors': [], 'results': 20, 'paging': {'current': 2, 'total': 37}, 'response': [{'player': {'id': 301, 'name': 'Benjamin Luke Woodburn', 'firstname': 'Benjamin Luke', 'lastname': 'Woodburn', 'age': 23, 'birth': {'date': '1999-10-15', 'place': 'Nottingham', 'country': 'England'}, 'nationality': 'Wales', 'height': '174 cm', 'weight': '72 kg', 'injured': False, 'photo': 'https://media.api-sports.io/football/players/301.png'}, 'statistics': [{'team': {'id': 40, 'name': 'Liverpool', 'logo': 'https://media.api-sports.io/football/teams/40.png'}, 'league': {'id': 39, 'name': 'Premier League', 'country': 'England', 'logo': 'https://media.api-sports.io/football/leagues/39.png', 'flag': 'https://media.api-sports.io/flags/gb.svg', 'season': 2020}, 'games': {'appearences': 0, 'lineups': 0, 'minutes': 0, 'number': None, 'position': 'Attacker', 'rating': None, 'captain': False}, 'substitutes': {'in': 0, 'out': 0, 'bench': 3}, 'shots': {'total': None, 'on': None}, 'goals': {'total': 0, 'conceded': 0, 'assists': None, 'saves': None}, 'passes': {'total': None, 'key': None, 'accuracy': None}, 'tackles': {'total': None, 'blocks': None, 'interceptions': None}, 'duels': {'total': None, 'won': None}, 'dribbles': {'attempts': None, 'success': None, 'past': None}, 'fouls': {'drawn': None, 'committed': None}, 'cards': {'yellow': 0, 'yellowred': 0, 'red': 0}, 'penalty': {'won': None, 'commited': None, 'scored': 0, 'missed': 0, 'saved': None}}]}, {'player': {'id': 518, 'name': 'Meritan Shabani', 'firstname': 'Meritan', 'lastname': 'Shabani', 'age': 23, 'birth': {'date': '1999-03-15', 'place': 'München', 'country': 'Germany'}, 'nationality': 'Germany', 'height': '185 cm', 'weight': '78 kg', 'injured': False, 'photo': 'https://media.api-sports.io/football/players/518.png'}, 'statistics': [{'team': {'id': 39, 'name': 'Wolves', 'logo': 'https://media.api-sports.io/football/teams/39.png'}, 'league': {'id': 39, 'name': 'Premier League', 'country': 'England', 'logo': 'https://media.api-sports.io/football/leagues/39.png', 'flag': 'https://media.api-sports.io/flags/gb.svg', 'season': 2020}, 'games': {'appearences': 0, 'lineups': 0, 'minutes': 0, 'number': None, 'position': 'Midfielder', 'rating': None, 'captain': False}, 'substitutes': {'in': 0, 'out': 0, 'bench': 3}, 'shots': {'total': None, 'on': None}, 'goals': {'total': 0, 'conceded': 0, 'assists': None, 'saves': None}, 'passes': {'total': None, 'key': None, 'accuracy': None}, 'tackles': {'total': None, 'blocks': None, 'interceptions': None}, 'duels': {'total': None, 'won': None}, 'dribbles': {'attempts': None, 'success': None, 'past': None}, 'fouls': {'drawn': None, 'committed': None}, 'cards': {'yellow': 0, 'yellowred': 0, 'red': 0}, 'penalty': {'won': None, 'commited': None, 'scored': 0, 'missed': 0, 'saved': None}}]},
This is the data type of each layer of the JSON file, from when I iterated through it with a For loop:
print(type(response)) <class 'dict'>
print(type(response_dump)) <class 'str'>
print(type(item)) <class 'str'>
print(type(player_item)) <class 'str'>
You do not need json.dumps() here; it turns the parsed dict back into a string, which is why you end up iterating character by character. Just iterate over the parsed response directly:
for player in response['response']:
    print(player)
{'player': {'id': 301, 'name': 'Benjamin Luke Woodburn', 'firstname': 'Benjamin Luke', 'lastname': 'Woodburn', 'age': 23, 'birth': {'date': '1999-10-15', 'place': 'Nottingham', 'country': 'England'}, 'nationality': 'Wales', 'height': '174 cm', 'weight': '72 kg', 'injured': False, 'photo': 'https://media.api-sports.io/football/players/301.png'}, 'statistics': [{'team': {'id': 40, 'name': 'Liverpool', 'logo': 'https://media.api-sports.io/football/teams/40.png'}, 'league': {'id': 39, 'name': 'Premier League', 'country': 'England', 'logo': 'https://media.api-sports.io/football/leagues/39.png', 'flag': 'https://media.api-sports.io/flags/gb.svg', 'season': 2020}, 'games': {'appearences': 0, 'lineups': 0, 'minutes': 0, 'number': None, 'position': 'Attacker', 'rating': None, 'captain': False}, 'substitutes': {'in': 0, 'out': 0, 'bench': 3}, 'shots': {'total': None, 'on': None}, 'goals': {'total': 0, 'conceded': 0, 'assists': None, 'saves': None}, 'passes': {'total': None, 'key': None, 'accuracy': None}, 'tackles': {'total': None, 'blocks': None, 'interceptions': None}, 'duels': {'total': None, 'won': None}, 'dribbles': {'attempts': None, 'success': None, 'past': None}, 'fouls': {'drawn': None, 'committed': None}, 'cards': {'yellow': 0, 'yellowred': 0, 'red': 0}, 'penalty': {'won': None, 'commited': None, 'scored': 0, 'missed': 0, 'saved': None}}]}
{'player': {'id': 518, 'name': 'Meritan Shabani', 'firstname': 'Meritan', 'lastname': 'Shabani', 'age': 23, 'birth': {'date': '1999-03-15', 'place': 'München', 'country': 'Germany'}, 'nationality': 'Germany', 'height': '185 cm', 'weight': '78 kg', 'injured': False, 'photo': 'https://media.api-sports.io/football/players/518.png'}, 'statistics': [{'team': {'id': 39, 'name': 'Wolves', 'logo': 'https://media.api-sports.io/football/teams/39.png'}, 'league': {'id': 39, 'name': 'Premier League', 'country': 'England', 'logo': 'https://media.api-sports.io/football/leagues/39.png', 'flag': 'https://media.api-sports.io/flags/gb.svg', 'season': 2020}, 'games': {'appearences': 0, 'lineups': 0, 'minutes': 0, 'number': None, 'position': 'Midfielder', 'rating': None, 'captain': False}, 'substitutes': {'in': 0, 'out': 0, 'bench': 3}, 'shots': {'total': None, 'on': None}, 'goals': {'total': 0, 'conceded': 0, 'assists': None, 'saves': None}, 'passes': {'total': None, 'key': None, 'accuracy': None}, 'tackles': {'total': None, 'blocks': None, 'interceptions': None}, 'duels': {'total': None, 'won': None}, 'dribbles': {'attempts': None, 'success': None, 'past': None}, 'fouls': {'drawn': None, 'committed': None}, 'cards': {'yellow': 0, 'yellowred': 0, 'red': 0}, 'penalty': {'won': None, 'commited': None, 'scored': 0, 'missed': 0, 'saved': None}}]}
or
for player in response['response']:
    print(player['player'])
{'id': 301, 'name': 'Benjamin Luke Woodburn', 'firstname': 'Benjamin Luke', 'lastname': 'Woodburn', 'age': 23, 'birth': {'date': '1999-10-15', 'place': 'Nottingham', 'country': 'England'}, 'nationality': 'Wales', 'height': '174 cm', 'weight': '72 kg', 'injured': False, 'photo': 'https://media.api-sports.io/football/players/301.png'}
{'id': 518, 'name': 'Meritan Shabani', 'firstname': 'Meritan', 'lastname': 'Shabani', 'age': 23, 'birth': {'date': '1999-03-15', 'place': 'München', 'country': 'Germany'}, 'nationality': 'Germany', 'height': '185 cm', 'weight': '78 kg', 'injured': False, 'photo': 'https://media.api-sports.io/football/players/518.png'}
To get a DataFrame, simply call pd.json_normalize(). Since your question is not entirely clear, I am not sure which information is needed and how it should be displayed; that would be best asked as a new question with exactly that focus:
pd.json_normalize(response['response'])
EDIT
Based on your comment and improvement:
pd.concat(
    [pd.json_normalize(response, ['response']),
     pd.json_normalize(response, ['response', 'statistics'])],
    axis=1
).drop(['statistics'], axis=1)
   player.id             player.name  ... penalty.missed penalty.saved
0        301  Benjamin Luke Woodburn  ...              0          None
1        518         Meritan Shabani  ...              0          None

[2 rows x 59 columns]
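As a side note, a similar frame can be built with a single json_normalize call by walking into 'statistics' via record_path and carrying selected player fields along via meta; a sketch, where the chosen meta fields are just examples:

df = pd.json_normalize(
    response['response'],
    record_path=['statistics'],
    meta=[['player', 'id'], ['player', 'name'], ['player', 'nationality']],
)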

How to append the sum of keys of each dictionary to another key?

I have JSON data in the format below:
l = {'itc': 'ball',
     'classes': [{'scores': [{'len': 89, 'width': 50}, {'len': 27, 'width': 17}]},
                 {'scores': [{'len': 90, 'width': 44}, {'len': 0, 'width': 0}]},
                 {'scores': [{'len': 50, 'width': 26}, {'len': 0, 'width': 0}]}]}
Now I want to create a new list of dictionaries. like below:-
output= [{'result': [{'len': 89, 'width': 50}, {'len': 27, 'width': 17}], 'total': 116}, {'result': [{'len': 90, 'width': 44}, {'len': 0, 'width': 0}], 'total': 90}, {'result': [{'len': 50, 'width': 26}, {'len': 0, 'width': 0}], 'total': 50}]
I was able to split the values and place them in the required format, but I cannot get the sum of the 'len' keys of each 'scores' list into the 'total' of the corresponding 'result' dictionary. Instead it keeps a running sum over all the dictionaries. The code and the output I got are as follows:
added = []
output = []
for k, v in l.items():
    if k == 'classes':
        for i in v:
            for ke, ve in i.items():
                if ke == 'scores':
                    for j in ve:
                        for key, val in j.items():
                            if key == 'len':
                                add = val
                                added.append(add)
                    sumed = sum(added)
                    out = {'result': ve, 'total': sumed}
                    output.append(out)
print(output)
Output I got:-
[{'result': [{'len': 89, 'width': 50}, {'len': 27, 'width': 17}], 'total': 116}, {'result': [{'len': 90, 'width': 44}, {'len': 0, 'width': 0}], 'total': 206}, {'result': [{'len': 50, 'width': 26}, {'len': 0, 'width': 0}], 'total': 256}]
As you could see that its summing up all the values and appending them to key total. How do I append sum of each dictionary score to the total key of each dictionary result as below?
output= [{'result': [{'len': 89, 'width': 50}, {'len': 27, 'width': 17}], 'total': 116}, {'result': [{'len': 90, 'width': 44}, {'len': 0, 'width': 0}], 'total': 90}, {'result': [{'len': 50, 'width': 26}, {'len': 0, 'width': 0}], 'total': 50}]
Use sum to get the total:
res = [{"result" : cl["scores"], "total" : sum(d["len"] for d in cl["scores"])} for cl in l["classes"]]
print(res)
Output
[{'result': [{'len': 89, 'width': 50}, {'len': 27, 'width': 17}], 'total': 116}, {'result': [{'len': 90, 'width': 44}, {'len': 0, 'width': 0}], 'total': 90}, {'result': [{'len': 50, 'width': 26}, {'len': 0, 'width': 0}], 'total': 50}]
Or the equivalent for-loop:
res = []
for cl in l["classes"]:
    scores = cl["scores"]
    total = sum(d["len"] for d in cl["scores"])
    res.append({"result": scores, "total": total})
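For completeness, the original loop can also be repaired by resetting the accumulator for each 'scores' list, so the totals no longer carry over; a minimal sketch keeping the original variable names:

output = []
for cl in l['classes']:
    added = []                     # reset for every 'scores' list
    for score in cl['scores']:
        added.append(score['len'])
    output.append({'result': cl['scores'], 'total': sum(added)})

print(output)
# [{'result': [{'len': 89, 'width': 50}, {'len': 27, 'width': 17}], 'total': 116}, ...]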

python invalid json format

I am pulling data from a REST API with the script below, but the output is not valid JSON and I cannot view it in a JSON viewer. Do you have any idea how to fix it?
headers = {
    "cookie": "JSESSIONID=node07z8uqc8xfd776gx6z7wslnoy4708978.node0",
    "Authorization": "Basic SElBQV9HVUlfUmVwb3J0OlB3MiN6MjdLWmxJam16TFFIYTFv",
    "Content-Type": "application/json"
}

response = requests.request("POST", url, json=payload, headers=headers, params=querystring, verify=False)
res = response.json()
output:
{'status': {'total': 1, 'matched': 1, 'processed': 1, 'completed': True, 'aborted': False, 'hasErrors': False}, 'result': [{'signature': 'raidOwner#33414-SACONC1ANKT', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}]}, {'signature': 'raidOwner#33414-TCTMDELLVMPL07_PROBE', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [18, 47, 47, 47, 44, 47, 52, 51, 47, 48, 47, 46]}]}, {'signature': 'raidOwner#33414-BUTTE14', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [281, 50, 41, 67, 211, 62, 45, 57, 35, 51, 41, 36]}]}, {'signature': 'raidOwner#33414-SADEC1ANKT', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}]}, {'signature': 'raidOwner#33414-TMMENGEN', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [7, 10, 3, 7, 10, 7, 6, 7, 7, 10, 3, 10]}]}, {'signature': 'raidOwner#33414-UYGURWANK01_-1_2', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [8, 3, 3, 9, 6, 3, 0, 6, 6, 13, 6, 9]}]}, {'signature': 'raidOwner#33414-HISARVIO', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [187, 202, 148, 195, 186, 166, 165, 161, 169, 205, 179, 175]}]}, {'signature': 'raidOwner#33414-TDMTMLTEST08', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [721, 158, 144, 126, 138, 708, 304, 138, 203, 951, 144, 153]}]}, {'signature': 'raidOwner#33414-ARCWAYODM', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [4675, 9151, 9572, 2909, 12598, 38262, 21859, 7810, 10647, 3113, 7722, 9493]}]}, {'signature': 'raidOwner#33414-COLUMBIAODM', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [9863, 4897, 5928, 8630, 4940, 5189, 8824, 5612, 6019, 7568, 5822, 5311]}]}, {'signature': 'raidOwner#33414-PATRIOTODM', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [32157, 17399, 25190, 34330, 31522, 26501, 33375, 16874, 33765, 15966, 16775, 38235]}]}, {'signature': 'raidOwner#33414-PAPELODM', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}]}, {'signature': 'raidOwner#33414-EXPERIAODM2', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [242376, 656037, 580521, 548083, 533699, 431400, 269603, 243191, 162227, 120463, 119748, 204799]}]}, {'signature': 'raidOwner#33414-BTEST10', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 
'interval': 300, 'start': '20210627_230500', 'data': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}]}, {'signature': 'raidOwner#33414-PAY1ODM', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [7, 3, 3, 7, 3, 3, 7, 3, 3, 7, 3, 7]}]}, {'signature': 'raidOwner#33414-BUTTE26', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [56760, 86203, 94584, 65130, 61539, 82702, 60885, 69314, 94918, 27351, 54816, 27491]}]}, {'signature': 'raidOwner#33414-FROGODM', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [2224526, 2393403, 1590811, 1312885, 1089164, 1016038, 1149478, 926692, 995665, 1241744, 1017906, 1188569]}]}, {'signature': 'raidOwner#33414-BUTTE08', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}]}, {'signature': 'raidOwner#33414-KACKARSNP2', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [4, 3, 3, 3, 3, 3, 0, 3, 3, 3, 3, 3]}]}, {'signature': 'raidOwner#33414-KACKARSNP1', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [4, 3, 3, 3, 0, 3, 3, 3, 3, 3, 3, 3]}]}, {'signature': 'raidOwner#33414-KACKARSI', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [4, 3, 3, 3, 3, 3, 0, 3, 3, 3, 3, 3]}]}, {'signature': 'raidOwner#33414-KACKARDEV', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [4, 3, 3, 3, 3, 3, 3, 3, 3, 0, 3, 7]}]}, {'signature': 'raidOwner#33414-NIMCELL', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}]}, {'signature': 'raidOwner#33414-TMFILYOS01', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [131, 123, 119, 130, 120, 119, 127, 122, 119, 130, 119, 121]}]}, {'signature': 'raidOwner#33414-KACKARST', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [4, 3, 3, 3, 3, 3, 0, 3, 3, 3, 3, 3]}]}, {'signature': 'raidOwner#33414-ACTONODM', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [18, 14, 14, 17, 14, 14, 17, 14, 14, 24, 14, 13]}]}, {'signature': 'raidOwner#33414-BUTTE04', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}]}, {'signature': 'raidOwner#33414-KACKARPRP', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [7, 3, 3, 7, 3, 3, 7, 3, 3, 3, 7, 3]}]}, {'signature': 'raidOwner#33414-TMFILYOS0-1_2', 'syn_raidLdev_transferRate': 
[{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [47, 30, 43, 27, 34, 40, 34, 38, 31, 40, 34, 27]}]}, {'signature': 'raidOwner#33414-GIOVANNIODM', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [38986, 41206, 99873, 168996, 173111, 45839, 11291, 10436, 21897, 10855, 11033, 14073]}]}, {'signature': 'raidOwner#33414-AXMB26', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [19551, 18129, 18708, 16765, 16456, 21697, 25549, 20731, 21459, 17117, 15462, 18972]}]}, {'signature': 'raidOwner#33414-AXMB25', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [15130, 13918, 14320, 13455, 12740, 20026, 21771, 14503, 14235, 13461, 13559, 16014]}]}, {'signature': 'raidOwner#33414-AXMB24', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [16575, 15983, 19261, 23725, 16914, 15068, 15720, 15878, 15094, 14545, 13987, 15707]}]}, {'signature': 'raidOwner#33414-AXMB23', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [25051, 22608, 21930, 21603, 21612, 21932, 21839, 20880, 22166, 19543, 19857, 20422]}]}, {'signature': 'raidOwner#33414-ULTIADB03', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}]}, {'signature': 'raidOwner#33414-AXMB22', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [21042, 21118, 20551, 19350, 20186, 20724, 20651, 21685, 20426, 21073, 20949, 21873]}]}, {'signature': 'raidOwner#33414-MONOODM', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [653882, 578192, 601919, 584437, 441061, 376242, 462738, 434533, 513835, 678884, 536437, 453086]}]}, {'signature': 'raidOwner#33414-POTASYUM02', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [1056, 1417, 942, 847, 1363, 970, 1086, 1038, 945, 836, 1174, 1086]}]}, {'signature': 'raidOwner#33414-AXMB21', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [4831, 6172, 8309, 5106, 4622, 5239, 4637, 5695, 5355, 6042, 5024, 5472]}]}, {'signature': 'raidOwner#33414-AXMB20', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [6482, 7090, 8353, 6121, 5609, 6338, 7364, 7165, 6522, 6442, 5375, 5910]}]}, {'signature': 'raidOwner#33414-TROIAODM_-DATA_OTHER', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [232920, 288949, 1033782, 670121, 1241934, 215641, 708520, 321912, 123164, 556778, 651439, 208483]}]}, {'signature': 'raidOwner#33414-MAXIMUSODM', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 
'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [7, 7, 7, 7, 7, 7, 10, 7, 3, 7, 7, 7]}]}, {'signature': 'raidOwner#33414-CACA2', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]}]}, {'signature': 'raidOwner#33414-TMLBIBOPODM', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [3330, 7786, 5519, 3111, 3957, 4372, 4061, 3852, 5314, 2888, 4494, 4486]}]}, {'signature': 'raidOwner#33414-ASCEND04', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [139, 129, 102, 115, 113, 91, 132, 132, 126, 106, 91, 92]}]}, {'signature': 'raidOwner#33414-KACKARDAY', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [7, 3, 7, 3, 7, 7, 3, 10, 3, 7, 3, 7]}]}, {'signature': 'raidOwner#33414-GORELE3', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [2197, 2065, 3256, 2220, 1789, 1085, 5257, 2072, 1785, 2224, 2466, 1346]}]}, {'signature': 'raidOwner#33414-MAXIMUSODM_ULTIMAODM', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [1192, 699, 2037, 780, 677, 1472, 1264, 728, 1504, 855, 799, 1500]}]}, {'signature': 'raidOwner#33414-ULTIMAODM', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [224369, 32483, 36993, 37510, 30518, 37591, 48513, 29792, 45712, 30970, 32095, 32290]}]}, {'signature': 'raidOwner#33414-ASCEND03', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [156, 122, 96, 118, 112, 91, 127, 132, 126, 110, 98, 92]}]}, {'signature': 'raidOwner#33414-JAGUARWANK11', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [2314, 1034, 8468, 2153, 1189, 894, 232, 344, 170, 4037, 232, 782]}]}, {'signature': 'raidOwner#33414-BUTTE01', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [1814, 1609, 1279, 1394, 1320, 1249, 2002, 1329, 1270, 1359, 1284, 3383]}]}, {'signature': 'raidOwner#33414-TMFILYOS02', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [109, 109, 113, 109, 110, 112, 116, 116, 123, 113, 116, 114]}]}, {'signature': 'raidOwner#33414-RUFFLESODM', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [176860, 16445, 3234, 20159, 7399, 3638, 716597, 5550, 3403, 13726, 2837, 4777]}]}, {'signature': 'raidOwner#33414-TMGEREDE', 'syn_raidLdev_transferRate': [{'type': 'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [4, 7, 10, 7, 6, 7, 7, 6, 7, 10, 3, 7]}]}, {'signature': 'raidOwner#33414-SREDMMAPP5', 'syn_raidLdev_transferRate': [{'type': 
'timeseries', 'name': 'Transfer Rate', 'unit': 'KB/s', 'interval': 300, 'start': '20210627_230500', 'data': [0, 3, 0, 3, 0, 3, 3, 0, 3, 0, 3, 0]}]}]}
Even though the method used to retrieve the response "as JSON" rather than as text is called .json(), what it returns is not JSON but a Python <class 'dict'>.
Therefore you need to import json and use json.dumps(response.json()) (or json.dumps(res), since res already holds the parsed dict) to convert this dictionary to valid JSON, and then you will be able to see it in a JSON viewer.
In addition, you can also just use response.text instead of json.dumps(...), since the raw response body is already valid JSON.
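A minimal sketch of the difference, using the variable names from the question (url, payload and querystring are omitted, as above):

import json

data = response.json()           # Python dict: its repr uses single quotes, True/False, None
print(type(data))                # <class 'dict'> -> printing this is not valid JSON text

valid_json = json.dumps(data)    # serialize back to standard JSON: double quotes, true/false/null
print(type(valid_json))          # <class 'str'> -> this can be pasted into a JSON viewer

# response.text is the raw body the server sent, which is already valid JSON
print(response.text[:100])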

Is one of the numbers in this list in between the two given integers?

I have a list with barline ticks and midi notes that can overlap the barlines. So I made a list of 'barlineticks':
barlinepos = [0, 768.0, 1536.0, 2304.0, 3072.0, 3840.0, 4608.0, 5376.0, 6144.0, 6912.0, 0, 576.0, 1152.0, 1728.0, 2304.0, 2880.0, 3456.0, 4032.0, 4608.0, 5184.0, 5760.0, 6336.0, 6912.0, 7488.0]
And a MidiFile:
{'type': 'time_signature', 'numerator': 4, 'denominator': 4, 'time': 0, 'duration': 768, 'ID': 0}
{'type': 'set_tempo', 'tempo': 500000, 'time': 0, 'ID': 1}
{'type': 'track_name', 'name': 'Tempo Track', 'time': 0, 'ID': 2}
{'type': 'track_name', 'name': 'New Instrument', 'time': 0, 'ID': 3}
{'type': 'note_on', 'time': 0, 'channel': 0, 'note': 48, 'velocity': 100, 'ID': 4, 'duration': 956}
{'type': 'time_signature', 'numerator': 3, 'denominator': 4, 'time': 768, 'duration': 6911, 'ID': 5}
{'type': 'note_on', 'time': 768, 'channel': 0, 'note': 46, 'velocity': 100, 'ID': 6, 'duration': 575}
{'type': 'note_off', 'time': 956, 'channel': 0, 'note': 48, 'velocity': 0, 'ID': 7}
{'type': 'note_off', 'time': 1343, 'channel': 0, 'note': 46, 'velocity': 0, 'ID': 8}
{'type': 'end_of_track', 'time': 7679, 'ID': 9}
And I want to check if a midi note overlaps a barline. Every note_on message has a 'time' and a 'duration' value. I have to check whether one of the barline ticks (in the list) falls inside the range of the note ('time' to 'time' + 'duration'). I tried:
if barlinepos in range(0, 956):
    print(True)
Of course this doesn't work because barlinepos is a list. How can I check if one of the values in the list results in True?
Simple iteration solves the requirement:
for msg in midifile:
    # only note_on messages carry a 'duration' here
    if msg['type'] != 'note_on':
        continue
    start, end = msg['time'], msg['time'] + msg['duration']
    for tick in barlinepos:
        if start <= tick <= end:
            print(True)
            break
    else:
        # no barline tick fell inside this note's range
        print(False)
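Equivalently, a more compact sketch using any(), under the same assumption that only note_on messages are of interest:

for msg in midifile:
    if msg['type'] == 'note_on':
        start, end = msg['time'], msg['time'] + msg['duration']
        crosses_barline = any(start <= tick <= end for tick in barlinepos)
        print(crosses_barline)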

How can i add the dictionary into list using append function or the other function?

Excuse me, I need your help!
Code Script
tracks_ = []
track = {}

if category == 'reference':
    for i in range(len(tracks)):
        if len(tracks) >= 1:
            _track = tracks[i]
            track['id'] = _track['id']
            tracks_.append(track)

print(tracks_)
tracks File
[{'id': 345, 'mode': 'ghost', 'missed': 27, 'box': [0.493, 0.779, 0.595, 0.808], 'score': 89, 'class': 1, 'time': 3352}, {'id': 347, 'mode': 'ghost', 'missed': 9, 'box': [0.508, 0.957, 0.631, 0.996], 'score': 89, 'class': 1, 'time': 5463}, {'id': 914, 'mode': 'track', 'missed': 0, 'box': [0.699, 0.496, 0.991, 0.581], 'score': 87, 'class': 62, 'time': 6549}, {'id': 153, 'mode': 'track', 'missed': 0, 'box': [0.613, 0.599, 0.88, 0.689], 'score': 73, 'class': 62, 'time': 6549}, {'id': 588, 'mode': 'track', 'missed': 0, 'box': [0.651, 0.685, 0.958, 0.775], 'score': 79, 'class': 62, 'time': 6549}, {'id': 972, 'mode': 'track', 'missed': 0, 'box': [0.632, 0.04, 0.919, 0.126], 'score': 89, 'class': 62, 'time': 6549}, {'id': 300, 'mode': 'ghost', 'missed': 6, 'box': [0.591, 0.457, 0.74, 0.498], 'score': 71, 'class': 62, 'time': 5716}]
Based on the code and the input above, when I print out tracks_ the result is
[{'id': 300}, {'id': 300}, {'id': 300}, {'id': 300}, {'id': 300}, {'id': 300}, {'id': 300}]
but the result should look like this:
[{'id': 345}, {'id': 347}, {'id': 914}, {'id': 153}, {'id': 588}, {'id': 972}, {'id': 300}]
You are appending the same dict (track) to your list tracks_ over and over, so the list only holds references to a single dict; any later modification to track is reflected in every element of the list. To fix it, create a new dict on each iteration:
if category == 'reference' and len(tracks) >= 1:
    for d in tracks:
        tracks_.append({'id': d['id']})
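To illustrate the aliasing problem itself, a minimal sketch with toy values (not from the question):

track = {}
items = []
for i in (1, 2, 3):
    track['id'] = i        # mutates the one and only dict object
    items.append(track)    # appends another reference to that same object
print(items)               # [{'id': 3}, {'id': 3}, {'id': 3}]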
You could also use a list comprehension:
tracks_ = [{'id': t['id']} for t in tracks]
tracks_
output:
[{'id': 345},
{'id': 347},
{'id': 914},
{'id': 153},
{'id': 588},
{'id': 972},
{'id': 300}]
