Python CherryPy: get the query_string parsed

I'm trying to get the query string and the body data (the GET params and the POST params) from a request:
curl --data "foo=bar&hello=world" "http://localhost:8080/mypath?orange=5&apple=8"
query_string = cherrypy.request.query_string # 'orange=5&apple=8'
post_data = cherrypy.request.body.params # {'foo': 'bar', 'hello': 'world'}
The post_data is correctly formed as a dict.
How can I parse the query_string the same way as the post_data?
I was reading the CherryPy docs and found this:
process_query_string()
Parse the query string into Python structures. (Core)
But this is not working; cherrypy.request.process_query_string() is returning None.
Any ideas?

CherryPy uses cherrypy.lib.httputil.parse_query_string to populate request.params with GET parameters. You can use it like this:
from cherrypy.lib.httputil import parse_query_string
parse_query_string(cherrypy.request.query_string)
This returns a dict with the parsed query-string parameters.
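For context, a minimal sketch of how this might look inside a handler (the MyPath class and mount point are just illustrative, not from the question):
import cherrypy
from cherrypy.lib.httputil import parse_query_string

class MyPath:
    @cherrypy.expose
    def index(self, **kwargs):
        # 'orange=5&apple=8' -> {'orange': '5', 'apple': '8'}
        query_params = parse_query_string(cherrypy.request.query_string)
        # POST body params are already parsed into a dict by CherryPy
        post_params = cherrypy.request.body.params
        return str({'query': query_params, 'post': post_params})

if __name__ == '__main__':
    cherrypy.quickstart(MyPath(), '/mypath')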

import urllib.parse
query = urllib.parse.parse_qs(cherrypy.request.query_string, True)  # values come back as lists, e.g. {'orange': ['5'], 'apple': ['8']}

Related

Azure SQL DB query convert to JSON with Flask API

I am trying to query a SQL Server database hosted on Azure through a Flask API and convert the results to JSON; what I am trying is below. It works, however the results come through with escape characters, and there don't appear to be any obvious special characters in the data. If I use the API to exec a stored procedure with a parameter, the JSON comes through in the format that I want. Any suggestions on how to alter this so that I get standard JSON format?
import json

import pyodbc
from flask import Flask
from flask_restful import Api, Resource, reqparse

app = Flask(__name__)
api = Api(app)

parser = reqparse.RequestParser()
parser.add_argument('customer')

conn = pyodbc.connect(serverconnectionstring)

class Customer(Resource):
    def get(self):
        cursor = conn.cursor()
        query = "SELECT * FROM [dbo].[testforjson]"
        result = cursor.execute(query)
        items = [dict(zip([key[0] for key in cursor.description], row)) for row in result]
        jsonitems = json.dumps(items)
        return jsonitems

api.add_resource(Customer, '/customer')

if __name__ == '__main__':
    app.run()
example output:
"[{\"field1\": \"B2653\", \"field2\": \"ERLOP\"}, {\"field1\": \"C2653\", \"field2\": \"ERLOP\"}]
desired output:
[
    {
        "field1": "B2653",
        "field2": "ERLOP"
    },
    {
        "field1": "C2653",
        "field2": "ERLOP"
    }
]
Many thanks to #njzk2 for the help and detailed explanation. I'm posting it as an answer to close out this question:
Please try returning items directly from that get method:
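Roughly, only the last lines of the get method from the question change; everything else stays the same:
def get(self):
    cursor = conn.cursor()
    query = "SELECT * FROM [dbo].[testforjson]"
    result = cursor.execute(query)
    items = [dict(zip([key[0] for key in cursor.description], row)) for row in result]
    # return the list of dicts itself instead of a json.dumps string;
    # Flask-RESTful serializes it and sets the application/json content type
    return items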
Quick explanation: if you return an object, Flask will attempt to return a JSON representation and set the JSON content type. If you return a string, Flask doesn't know your intention and sends a string content type. It's then up to your client to figure out what the intention was. In most cases a string content type means the result is presented to you as a string, but you can also ignore the content type and parse that string as JSON.
Glad to hear it worked for you. Thanks again, #njzk2. This may be beneficial to other community members.

How to post a kafka schema using python

I am trying to post a kafka schema using python.
From the CLI I would use a syntax like:
curl -X POST -H "Content-Type: application/vnd.schemaregistry.v1+json" --data '{"schema": "{\"type\":\"record\",\"name\":\"VisualDetections\",\"namespace\":\"com.namespace.something\",\"fields\":[{\"name\":\"vehicle_id\",\"type\":\"int\"},{\"name\":\"source\",\"type\":\"string\"},{\"name\":\"width\",\"type\":\"int\"},{\"name\":\"height\",\"type\":\"int\"},{\"name\":\"annotated_frame\",\"type\":[\"string\",\"null\"]},{\"name\":\"version\",\"type\":\"string\"},{\"name\":\"fps\",\"type\":\"int\"},{\"name\":\"mission_id\",\"type\":\"int\"},{\"name\":\"sequence\",\"type\":{\"type\":\"array\",\"items\":{\"type\":\"record\",\"name\":\"sequence_record\",\"fields\":[{\"name\":\"frame_id\",\"type\":\"int\"},{\"name\":\"timestamp\",\"type\":\"long\"},{\"name\":\"localization\",\"type\":{\"type\":\"array\",\"items\":{\"type\":\"record\",\"name\":\"localization_record\",\"fields\":[{\"name\":\"latitude\",\"type\":\"double\"},{\"name\":\"longitude\",\"type\":\"double\"},{\"name\":\"class\",\"type\":\"string\"},{\"name\":\"object_id\",\"type\":\"int\"},{\"name\":\"confidence\",\"type\":\"double\"},{\"name\":\"bbox\",\"type\":{\"type\":\"record\",\"name\":\"bbox\",\"fields\":[{\"name\":\"x_min\",\"type\":\"int\"},{\"name\":\"y_min\",\"type\":\"int\"},{\"name\":\"x_max\",\"type\":\"int\"},{\"name\":\"y_max\",\"type\":\"int\"}]}}]}}}]}}}]}"}' http://server_ip:8081/subjects/VisualDetections-value/versions/
When I tried to transfer this command to Python, I tried something like:
import requests
import json

topic = 'VisualDetections'
headers = {'Content-Type': 'application/vnd.schemaregistry.v1+json'}

with open(avro_path) as fp:
    data = {'schema': json.load(fp)}

data_json = json.dumps(data)
cmd = 'http://server_ip:8081/subjects/{}-value/versions/'.format(topic)
response = requests.post(cmd, headers=headers, data=data_json)
The above returns {"error_code":500,"message":"Internal Server Error"}. I have tried other options, like:
with open(avro_path) as fp:
    data = json.load(fp)
with error code:
"error_code":422,"message":"Unrecognized field: name"
Here avro_path just points to a JSON file containing the Avro schema (it can be uploaded too if useful).
I am not sure exactly how I should post this data. Also, I did not take the -H argument of the CLI call into consideration, since I couldn't find an equivalent Python argument (not sure it plays any role though). Can anyone provide a solution to this issue?
For the second error, the payload needs to be {'schema': "schema string"}
For the first, I think it's a matter of the encoding: json.load will read the file into a dict rather than just a string.
Notice
>>> import json
>>> schema = {"type":"record"} # example when using json.load() ... other data excluded
>>> json.dumps({'schema': schema})
'{"schema": {"type": "record"}}' # the schema value is not a string
>>> json.dumps({'schema': json.dumps(schema)})
'{"schema": "{\\"type\\": \\"record\\"}"}' # here it is
Try just reading the file
url = 'http://server_ip:8081/subjects/{}-value/versions/'.format(topic)

with open(avro_path) as fp:
    data = {'schema': fp.read().strip()}

response = requests.post(url, headers=headers, data=json.dumps(data))
Otherwise, you would json.load the file and then use json.dumps twice, as shown above.
You may also try json=data rather than data=json.dumps(data).
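Putting the first suggestion together, a minimal sketch (server_ip and avro_path are placeholders from the question; the json=data variant would drop the outer json.dumps):
import json
import requests

topic = 'VisualDetections'
url = 'http://server_ip:8081/subjects/{}-value/versions/'.format(topic)
headers = {'Content-Type': 'application/vnd.schemaregistry.v1+json'}

with open(avro_path) as fp:
    # keep the schema as a raw string; the registry expects {"schema": "<schema as a string>"}
    payload = {'schema': fp.read().strip()}

response = requests.post(url, headers=headers, data=json.dumps(payload))
print(response.status_code, response.text)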

Jsonify response data with backslash

I have a Flask API that sends a response in JSON format:
rep = {'application_id': 32657, 'business_rules_id': 20}  # a Python dictionary
rep_json = json.dumps(rep, cls=CustomizedEncoder)  # converts to a JSON-formatted string
return jsonify(rep_json), 200  # return the Flask response (with headers etc.)
I can see the Flask response body data, and the response is something like:
b'"{\\"application_id\\": 32567, \\"business_rules_id\\": 20}"\n'
or in postman body
"{\"application_id\": 32567, \"business_rules_id\": 20}
Shouldn't I get a response in JSON format (without the backslashes)? I guess the reason is that json.dumps dumps the dict to JSON once, and then jsonify dumps it a second time, which causes the double quotes to be escaped.
The reason I need to run the following is that I need a customized encoder, which jsonify probably does not support.
rep_json = json.dumps(rep, cls=CustomizedEncoder)
My other solution is to dumps and then loads, but that looks redundant. Is there a different approach that uses a customized encoder while returning a Flask response?
This is another way that I tried, but it looks weird:
rep = {'application_id': 32657, 'business_rules_id': 20}  # a python dictionary
rep_json = json.dumps(rep, cls=CustomizedEncoder)  # converts to a json format string
return jsonify(json.loads(rep_json)), 200  # return the flask response (with headers etc)
You can configure your app to use a custom encoder with app.json_encoder = CustomizedEncoder:
https://kite.com/python/docs/flask.app.Flask.json_encoder
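A minimal sketch of that approach, assuming a Flask version where app.json_encoder is still available and that CustomizedEncoder is defined as in the question (the route name is just for illustration):
from flask import Flask, jsonify

app = Flask(__name__)
app.json_encoder = CustomizedEncoder  # jsonify will now use the custom encoder

@app.route('/result')
def result():
    rep = {'application_id': 32657, 'business_rules_id': 20}
    # pass the dict straight to jsonify; no json.dumps beforehand,
    # so nothing gets double-encoded and the backslashes disappear
    return jsonify(rep), 200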

List of query params with Flask request.args

I am trying to pass comma-separated query parameters to a Flask endpoint.
An example URI would be:
localhost:3031/someresource#?status=1001,1002,1003
Looking at the return value of request.args or request.args.getlist('status'), I see that I only get a string:
ipdb> pp request.args
ImmutableMultiDict([('status', '1001,1002,1003')])
ipdb> request.args.getlist('status')
['1001,1002,1003']
I know I can split the string by comma, but that feels hacky. Is there a more idiomatic way to handle this in Flask? Or are my query params in the wrong format?
Solution
Since Flask does not directly support comma-separated query params, I put this in my base controller to support comma-separated or duplicate query params on all endpoints:
request_data = {}
params = request.args.getlist('status') or request.form.getlist('status')

if len(params) == 1 and ',' in params[0]:
    request_data['status'] = comma_separated_params_to_list(params[0])
else:
    request_data['status'] = params

def comma_separated_params_to_list(param):
    result = []
    for val in param.split(','):
        if val:
            result.append(val)
    return result
Flask's getlist expects the key to be passed multiple times:
from flask import Flask, request

app = Flask(__name__)

@app.route('/')
def status():
    first_status = request.args.get("status")
    statuses = request.args.getlist("status")
    return "First Status: '{}'\nAll Statuses: '{}'".format(first_status, statuses)
❯ curl "http://localhost:5000?status=5&status=7"
First Status: '5'
All Statuses: '['5', '7']'
There's no standard for this; how multiple GET args are parsed/passed depends on which language/framework you're using. Flask is built on Werkzeug, so it allows this style, but you'll have to look it up if you switch away from Flask.
As an aside, it's not uncommon in REST API design to use commas to pass multiple values for the same key, since it's easier for the user. You're parsing GET args anyway, so parsing the resulting string is not that much more hacky. You can choose to raise a 400 HTTP error if their comma-separated string isn't well formed.
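For example, a minimal sketch of the split-and-validate approach (the all-digits check is just an illustrative validation rule, not something from the question):
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

@app.route('/someresource')
def someresource():
    raw = request.args.get('status', '')
    statuses = [s for s in raw.split(',') if s]
    if not all(s.isdigit() for s in statuses):
        # reject malformed input such as ?status=1001,abc
        abort(400, description="status must be a comma-separated list of integers")
    return jsonify(statuses)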
Some other languages (notably PHP) support 'array' syntax, so that is used sometimes:
/request?status[]=1000&status[]=1001&status[]=1002
This is what you might want here:
request.args.to_dict(flat=False)
flat is True by default, so by setting it to False, you allow it to return a dict with values inside a list when there's more than one.
According to the to_dict documentation:
to_dict(flat=True)
Return the contents as regular dict. If flat is True the returned dict will only have the first item present, if flat is False all values will be returned as lists.
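A quick illustration of the difference, assuming the key is actually repeated in the query string (e.g. ?status=1001&status=1002&status=1003); note that a single comma-separated value would still come back as one string inside the list:
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route('/someresource')
def someresource():
    # for ?status=1001&status=1002&status=1003
    flat = request.args.to_dict()             # {'status': '1001'}  (first value only)
    full = request.args.to_dict(flat=False)   # {'status': ['1001', '1002', '1003']}
    return jsonify({'flat': flat, 'full': full})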

Using python 'requests' to send JSON boolean

I've got a really simple question, but I can't figure out how to do it. The problem is that I want to send the following payload using Python and Requests:
{ 'on': true }
Doing it like this:
payload = { 'on':true }
r = requests.put("http://192.168.2.196/api/newdeveloper/lights/1/state", data = payload)
Doesn't work, because I get the following error:
NameError: name 'true' is not defined
Sending true as the string 'true' is not accepted by my server, so that's not an option. Does anyone have a suggestion? Thanks!
You need to JSON-encode it to turn it into a string:
import json
payload = json.dumps({"on":True})
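The full call would then look roughly like this (URL taken from the question):
import json
import requests

payload = json.dumps({'on': True})  # serializes to the string '{"on": true}'
r = requests.put("http://192.168.2.196/api/newdeveloper/lights/1/state", data=payload)
print(r.status_code, r.text)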
It should be {'on': True}, with a capital T.
Starting from requests 2.4.2, instead of passing in the payload with the data parameter, you can use the json parameter like this:
payload = {'on': True}
requests.put(url, json=payload)
And the request body will be correctly formatted as a JSON payload (i.e. {"on": true}).
To send it lowercase like that as a literal string (if that's what your endpoint requires), put it in quotes: {'on': 'true'}.
