I'm using Python's Requests library to send a Pandas DataFrame to a Flask server. The DataFrame has about 2 million rows and 16 columns. I want to send a config dictionary along with the DataFrame as metadata. At the moment I am able to send the DataFrame as JSON, but I can't find a way to attach the metadata in the same POST request.
Here's my code:
Client side:
# Post request containing 1. The dataset (pandas df) 2. The metadata (dict)
dataset = dataset.to_json(orient='split')
metadata = {'dataset ID': "makis", 'date start': "1", 'date end': "2"}
url = "http://localhost:8081/upload_dataset"
r = requests.post(url, data=dataset)
return r.text
Server side:
@app.route("/upload_dataset", methods=['POST'])
def upload_dataset():
    from werkzeug.datastructures import FileStorage
    payload = request.stream
    dataset = pd.read_json(payload, typ='frame', orient='split')
    FileStorage(payload).save('dataset.csv')
    return 'File Uploaded & Standing by', 200
Once serialized to JSON, your dataset is plain text. To add more parameters from there, your only options are to embed the payload together with the metadata as POST parameters, which means URL-encoding the JSON, or to embed the payload inside a top-level JSON post, which means double-encoding it as JSON.
You would gain in clarity, and maybe performance, if you left the JSON encoding to requests instead. That way you can add the metadata and still encode/decode only once.
Example
dataset = dataset.to_dict(orient='list')
post_data = {'dataset ID': "makis", 'date start': "1", 'date end': "2", 'payload': dataset}
url = "http://localhost:8081/upload_dataset"
r = requests.post(url, json=post_data)
Server side:
@app.route("/upload_dataset", methods=['POST'])
def upload_dataset():
    post_data = request.get_json()
    # Use the metadata keys, e.g. post_data['date start']
    dataset = pd.DataFrame.from_dict(post_data['payload'], orient='columns')
    return 'File Uploaded & Standing by', 200
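As a quick sanity check (a minimal, self-contained sketch), the to_dict(orient='list') / DataFrame.from_dict(..., orient='columns') pair round-trips the frame, so the server reconstructs exactly what the client had:

import pandas as pd

df = pd.DataFrame({'a': [1, 2], 'b': ['x', 'y']})

payload = df.to_dict(orient='list')                            # {'a': [1, 2], 'b': ['x', 'y']}
restored = pd.DataFrame.from_dict(payload, orient='columns')

assert restored.equals(df)   # columns, order and values survive the round trip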
I am on a system without pip, and installing it is not planned. But I have to do a POST with Python to an API (with this POST I want to add a row to a Postgres DB).
It works with the following code, but two of the columns are jsonb in Postgres.
With this code those columns end up with quotation marks, so the JSON doesn't work properly.
How can I avoid this?
import urllib.parse
import urllib.request

def post_wo_requ():
    json_post = {'store': 'Jack', 'store_detail': {'Tel1': '00000', 'Tel2': '11111'}}
    url = host + '/' + pg_table
    data = urllib.parse.urlencode(json_post)
    data = data.encode('utf-8')
    print(data)
The printed data shows this:
b'store=Jack&store_detail=%7B%27Tel1%27%3A+%2700000%27%2C+%27Tel2%27%3A+%2711111%27%7D'
and the JSON in the DB is this:
"{'Tel1': '00000', 'Tel2': '11111'}"
Do you need to convert the data to a string?
Wouldn't this ("classic") approach work?
from urllib import request
import json

url = 'https://myapp.domain/endpoint'
json_data = {'store': 'Jack', 'store_detail': {'Tel1': '00000', 'Tel2': '11111'}}

req = request.Request(url, method="POST")
req.add_header('Content-Type', 'application/json')
data = json.dumps(json_data)
data = data.encode()
r = request.urlopen(req, data=data)
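To see why the jsonb columns end up with quotation marks: urlencode stringifies the nested dict using Python's repr (single quotes), while json.dumps produces real JSON. A quick comparison:

import json
import urllib.parse

json_post = {'store': 'Jack', 'store_detail': {'Tel1': '00000', 'Tel2': '11111'}}

print(urllib.parse.urlencode(json_post))
# store=Jack&store_detail=%7B%27Tel1%27...  <- repr of the inner dict, not JSON

print(json.dumps(json_post))
# {"store": "Jack", "store_detail": {"Tel1": "00000", "Tel2": "11111"}}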
I would like to pass a JSON object to a FastAPI backend. Here is what I am doing in the frontend app:
data = {'labels': labels, 'sequences': sequences}
response = requests.post(api_url, data = data)
Here is what the backend API looks like in FastAPI:
@app.post("/api/zero-shot/")
async def Zero_Shot_Classification(request: Request):
    data = await request.json()
However, I am getting this error:
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
You should use the json parameter instead (which would change the Content-Type header to application/json):
payload = {'labels': labels, 'sequences': sequences}
r = requests.post(url, json=payload)
not the data parameter, which is used for sending form data (with the Content-Type defaulting to application/x-www-form-urlencoded, or multipart/form-data if files are also included in the request), unless you serialise the JSON yourself and manually set the Content-Type header to application/json, as described in this answer:
payload = {'labels': labels, 'sequences': sequences}
r = requests.post(url, data=json.dumps(payload), headers={'Content-Type': 'application/json'})
Also, please have a look at the documentation on how to benefit from using Pydantic models when sending JSON request bodies, as well as this answer and this answer for more options and examples on how to define an endpoint expecting JSON data.
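For illustration, a minimal sketch of the Pydantic-model variant of the endpoint above (assuming labels and sequences are lists of strings; adjust the field types to your data):

from typing import List

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ZeroShotRequest(BaseModel):
    labels: List[str]
    sequences: List[str]

@app.post("/api/zero-shot/")
async def zero_shot_classification(body: ZeroShotRequest):
    # FastAPI parses and validates the JSON body into the model for you
    return {"n_labels": len(body.labels), "n_sequences": len(body.sequences)}

The client call stays the same: requests.post(api_url, json={'labels': labels, 'sequences': sequences}).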
I have called many APIs with urllib2 using JSON. But now I want to create a form-data API call with urllib2, and it is not working.
I have a POST API with this URL and data:
Dummy url = https://www.example.com/xyz?id=32323232
dummy data {'data': "here"}
The data should be sent as form-data, not as raw JSON.
How can I write this in Python with urllib2?
import urllib
import urllib2

url = 'https://www.example.com/xyz?id=32323232'
data = {'data': "here"}
data = urllib.urlencode(data)  # urllib2 needs an encoded string, not a dict
header = {'Content-Type': 'multipart/form-data'}
request = urllib2.Request(url, data, header)
response = urllib2.urlopen(request)
Refer : https://www.pythonforbeginners.com/python-on-the-web/how-to-use-urllib2-in-python/
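Note that a real multipart/form-data request also needs a boundary and a matching body, which urllib2 has no helper for. If the endpoint strictly requires multipart, the body has to be built by hand. A rough sketch of that, assuming Python 2's urllib2 and text-only fields:

import urllib2
import uuid

def post_form_data(url, fields):
    # Build a multipart/form-data body by hand
    boundary = uuid.uuid4().hex
    lines = []
    for name, value in fields.items():
        lines.append('--' + boundary)
        lines.append('Content-Disposition: form-data; name="%s"' % name)
        lines.append('')
        lines.append(str(value))
    lines.append('--' + boundary + '--')
    lines.append('')
    body = '\r\n'.join(lines)

    request = urllib2.Request(url, data=body)
    request.add_header('Content-Type', 'multipart/form-data; boundary=' + boundary)
    return urllib2.urlopen(request)

response = post_form_data('https://www.example.com/xyz?id=32323232', {'data': 'here'})
print(response.read())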
I've created a push streaming dataset (history on) and I've managed to post data to it from a Python script using the "Push URL" which I got from the API Info tab for the dataset in question. What I also need to do is to delete the historic data so as to clear out my test data and/or be able to reset the dataset and re-populate from scratch as and when necessary.
The Push Url is of the form https://api.powerbi.com/beta/xxxxxxxx/datasets/xxxxxxxxxxxx/rows?key=xxxxxxxxxxxxxxx
The following code works fine and the data is posted;
import requests
import pyodbc as db
import pandas as pd
API_ENDPOINT = "https://api.powerbi.com/beta/xxxxxxxx/datasets/xxxxxxxxxxxx/rows?key=xxxxxxxxxxxxxxx"
dbcon = db.connect('DRIVER={SQL Server};SERVER=tcp:fxdb.database.windows.net;DATABASE=FXDatabase;UID=xxxx;PWD=xxxx')
df = pd.read_sql("select statement etc...", dbcon)
data = df.to_dict(orient='records')
response = requests.post(API_ENDPOINT, json=data)
But adding this:
response = requests.delete(API_ENDPOINT)
gives me:
404
{
"error":{
"code":"","message":"No HTTP resource was found that matches the request URI 'http://api.powerbi.com/beta/...
I couldn't figure this out, so I started looking into OAuth2 authentication, thinking that perhaps the Push URL is only for posting data. After registering the app at https://dev.powerbi.com/apps my code now looks like this:
import requests
import pyodbc as db
import pandas as pd
API_ENDPOINT = "https://api.powerbi.com/beta/xxxxxxxxxxxxxx/datasets/xxxxxxxxxxxxxxx/rows"
data = {
    'grant_type': 'password',
    'scope': 'openid',
    'resource': r'https://analysis.windows.net/powerbi/api',
    'client_id': 'xxxxxxxxx',
    'username': 'xxxxxxxxx',
    'password': 'xxxxxxxx'
}
response = requests.post('https://login.microsoftonline.com/common/oauth2/token', data=data)
access_token = response.json().get('access_token')
headers = {'Authorization': 'Bearer ' + access_token}
dbcon = db.connect('DRIVER={SQL Server};SERVER=tcp:fxdb.database.windows.net;DATABASE=FXDatabase;UID=xxxx;PWD=xxxx')
df = pd.read_sql("select statement etc...", dbcon)
data = df.to_dict(orient='records')
response = requests.post(API_ENDPOINT, json=data, headers=headers)
response = requests.delete(API_ENDPOINT, headers=headers)
The authentication works, returning status code 200. The POST returns 401 (this worked with the previous method) and the DELETE still returns 404.
Thanks to jonrsharpe who pointed me in the right direction.
Revisiting the API documentation I discovered a call to get the table names;
GET https://api.powerbi.com/v1.0/myorg/datasets/{datasetKey}/tables
so after authenticating I ran;
response = requests.get("https://api.powerbi.com/v1.0/myorg/datasets/xxxxxxxx/tables", headers=headers)
The content of the response told me that there was a table called "RealTimeData" inside my dataset, which must be a default name because I haven't knowingly created this table.
I have now updated the endpoint to;
API_ENDPOINT = "https://api.powerbi.com/v1.0/myorg/datasets/xxxxxxxxx/tables/RealTimeData/rows"
and all works perfectly.
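For reference, the full sequence looks like this (a sketch using the placeholder IDs above; access_token and data come from the earlier snippets): discover the table name, then DELETE and re-POST the rows.

import requests

headers = {'Authorization': 'Bearer ' + access_token}

# Discover the table name inside the push dataset
r = requests.get("https://api.powerbi.com/v1.0/myorg/datasets/xxxxxxxx/tables",
                 headers=headers)
print(r.json())  # shows a table called "RealTimeData"

API_ENDPOINT = "https://api.powerbi.com/v1.0/myorg/datasets/xxxxxxxx/tables/RealTimeData/rows"

# Clear the historic rows, then push the fresh data
requests.delete(API_ENDPOINT, headers=headers)
requests.post(API_ENDPOINT, json=data, headers=headers)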
Thanks Jon!
I have a working Flask route that can print the accepted JSON (it is sent as a POST with the body containing JSON).
@app.route('/json', methods=['POST'])
def jsonify():
    json_dict = json.load(request.json)
    print("\njson0:\n")
How can I load it into a DataFrame?
You need to specify the content type as 'application/json' in your request. For example :
requests.post(url, headers={'Content-Type': 'application/json'}, data=json.dumps({'text': 'Hello'}))
Then in Flask you should use request.get_json() which is better than .json.
To load your JSON into a DataFrame you can simply write:
pd.DataFrame(json_dict)
Your JSON should be formatted as [{'name':'Jask','age':24},{'name':'Bob','age':30}], for instance.
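Putting it together, a minimal sketch of such a route (the endpoint name and payload shape are taken from the question and the example above):

import pandas as pd
from flask import Flask, request

app = Flask(__name__)

@app.route('/json', methods=['POST'])
def load_json():
    json_dict = request.get_json()   # e.g. [{'name': 'Jask', 'age': 24}, {'name': 'Bob', 'age': 30}]
    df = pd.DataFrame(json_dict)     # one row per record
    print(df)
    return 'OK', 200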