I'm building an API for a new web service using Python, Flask-RESTful and PyMongo.
A sample MongoDB document should look like this:
{
    "domain": "foobar.com",
    "attributes": {
        "web": ["akamai", "google-analytics", "drupal", ...]
    }
}
The imports:
from flask import Flask, jsonify
from flask.ext.restful import Api, Resource, reqparse
from pymongo import MongoClient
The class:
class AttributesAPI(Resource):
    def __init__(self):
        self.reqparse = reqparse.RequestParser()
        self.reqparse.add_argument('domain', type=str, required=True,
                                   help='No domain given', location='json')
        self.reqparse.add_argument('web', type=str, action='append', required=True,
                                   help='No array/list of web stuff given', location='json')
        super(AttributesAPI, self).__init__()

    def post(self):
        args = self.reqparse.parse_args()
        post = db.core.update({'domain': args['domain']},
                              {'$set': {'attr': {'web': args['web']}}},
                              upsert=True)
        return post
When I POST with curl, I use this:
curl -i -H "Content-Type: application/json" -X POST -d '{"domain":"foobar", "web":"akamai", "web":"drupal", "web":"google-analytics"}' http://localhost:5000/v1/attributes
However, this is what gets saved in my document:
{ "_id" : ObjectId("5313a9006759a3e0af4e548a"), "attr" : { "web" : [ "google-analytics" ] }, "domain" : "foobar.com"}
It only stores the last value given in the curl request for 'web'. I also tried the curl command with multiple -d parameters, as described in the reqparse documentation, but that throws a 400 BAD REQUEST error.
Any ideas why it is only saving the last value instead of all the values as a list?
In JSON objects and in Python dictionaries, keys are unique; you cannot repeat the web key here and expect all the values to survive, because the last one wins. Use one web key instead and make the value a list:
{"domain": "foobar", "web": ["akamai", "drupal", "google-analytics"]}
and it should be processed as such.
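You can see the collapse with nothing but the standard library; Python's json module (like most JSON parsers) silently keeps only the last occurrence of a duplicated key:

import json

payload = '{"web": "akamai", "web": "drupal", "web": "google-analytics"}'
print(json.loads(payload))
# {'web': 'google-analytics'}

With the value sent as a list, the original curl call becomes:

curl -i -H "Content-Type: application/json" -X POST -d '{"domain": "foobar", "web": ["akamai", "drupal", "google-analytics"]}' http://localhost:5000/v1/attributes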
In addition to @Martijn Pieters' answer, you would need to set the location parameter of your self.reqparse.add_argument call to a tuple of 'json' and 'values', and use action='append' for the list argument:
self.reqparse.add_argument('web', action='append', type=str, required=True, help='No array/list of web stuff given', location=('json', 'values'))
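For completeness, a minimal sketch of how the resource might look once the arguments parse as a list. The db handle and database name are assumptions (MongoClient()['mydb'] is purely illustrative), and it uses update_one, the modern PyMongo call; note that the raw update result is not JSON-serializable, so the sketch returns a plain dict instead:

from flask_restful import Resource, reqparse
from pymongo import MongoClient

db = MongoClient()['mydb']  # hypothetical connection and database name


class AttributesAPI(Resource):
    def __init__(self):
        self.reqparse = reqparse.RequestParser()
        self.reqparse.add_argument('domain', type=str, required=True,
                                   help='No domain given', location='json')
        self.reqparse.add_argument('web', type=str, action='append', required=True,
                                   help='No array/list of web stuff given', location='json')
        super(AttributesAPI, self).__init__()

    def post(self):
        args = self.reqparse.parse_args()
        # args['web'] is now a real list, e.g. ['akamai', 'drupal', 'google-analytics']
        db.core.update_one({'domain': args['domain']},
                           {'$set': {'attr': {'web': args['web']}}},
                           upsert=True)
        # return something JSON-serializable rather than the raw update result
        return {'domain': args['domain'], 'attr': {'web': args['web']}}, 201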
I'm using the FastAPI framework and I want to send a list of lists using query parameters. I can send a flat list using the syntax below, but I am unable to pass a list of lists.
sections_to_consider: Optional[List[str]] = Query(None)
This only gives me a single flat list in the output. What I want is something like this:
{
    "sections_to_consider": [
        ["string", "string2"],
        ["string3", "string4"]
    ]
}
I tried the syntax below but I get an error.
sections_to_consider: Optional[List[list]] = Query(None)
sections_to_consider: Optional[List[List[str]]] = Query(None)
I need to accept a list of lists. This is an optional parameter but, if passed, each inner list must have exactly 2 strings.
Is there any way to do it in FastAPI? Or any work around?
Thanks in advance.
As of now, FastAPI does not support nested lists in query parameters; you can find more in the multi-selection discussions (Multiselect and dropdownMenu).
A workaround is to use the request body: instead of sending the List[List[str]] via Query, send the data in the request body,
either with a new model class,
or by using Depends (a sketch of the Depends variant follows the class example below).
Example using class: app.py
from typing import List

import uvicorn
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class newList(BaseModel):
    sections_to_consider: List[List[str]]


@app.post("/items/")
async def read_items(q: newList):
    return q


if __name__ == '__main__':
    uvicorn.run("app:app", host='127.0.0.1', port=8000, reload=True)
You can test it with a curl request, or with the Request URL and request body from the OpenAPI docs.
curl request:
curl -X 'POST' \
'http://127.0.0.1:8000/items/' \
-H 'accept: application/json' \
-H 'Content-Type: application/json' \
-d '{
"sections_to_consider": [
["string", "string2"],
["string3", "string4"]
]
}'
Request URL:
http://127.0.0.1:8000/items/
Request Body:
{
    "sections_to_consider": [
        ["string", "string2"],
        ["string3", "string4"]
    ]
}
The Response from the request will be as below:
ResponseBody:
{
    "sections_to_consider": [
        [
            "string",
            "string2"
        ],
        [
            "string3",
            "string4"
        ]
    ]
}
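For the Depends variant mentioned above, here is a minimal sketch; the helper name get_sections and the /items-dep/ path are illustrative, not part of the original answer. The dependency declares the body model, and the path operation receives the already-extracted list:

from typing import List

from fastapi import Depends, FastAPI
from pydantic import BaseModel

app = FastAPI()


class NewList(BaseModel):
    sections_to_consider: List[List[str]]


async def get_sections(payload: NewList) -> List[List[str]]:
    # FastAPI reads the Pydantic model from the JSON request body
    return payload.sections_to_consider


@app.post("/items-dep/")
async def read_items(sections: List[List[str]] = Depends(get_sections)):
    return {"sections_to_consider": sections}

If each inner list must contain exactly two strings, as the question requires, declaring the field as List[Tuple[str, str]] (with Tuple imported from typing) should make Pydantic reject inner lists of any other length.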
I am learning the Flask-RESTful API. While following some tutorials I came across this example:
class Student(Resource):
    def get(self):
        return {student data}

    def post(self, details):
        return {data stored}

api.add_resource(Student, '/student')
Here, looking at the above example, we can use /student with the GET and POST methods to retrieve and store data.
But I would like to have two different endpoints, one for retrieving and one for storing data.
For example:
/student/get
which will call the get() function of the Student class to retrieve the records of all students, and
/student/post
which will call the post() function of the Student class to store the posted data.
Is it possible to have a single Student class but call different methods referred to by different endpoints?
Yes, it is possible to have a single Resource class and call different methods from different endpoints.
Scenario:
We have a Student class with a get and a post method.
We can use different routes to execute the get and post methods separately or combined.
E.g.:
Endpoint http://localhost:5000/students/get can only be used for GET requests to the Student class.
Endpoint http://localhost:5000/students/post can only be used for POST requests to the Student class.
Endpoint http://localhost:5000/students/ can be used for both GET and POST requests to the Student class.
Solution:
To control which requests each endpoint accepts, we need to pass some keyword arguments to the resource class.
We will use the resource_class_kwargs option of the add_resource method. Details of add_resource can be found in the official documentation.
We will block any unwanted method call using the abort function, returning HTTP status 405 with the response message Method not allowed.
Code:
from flask import Flask
from flask_restful import Resource, Api, abort, reqparse

app = Flask(__name__)
api = Api(app)

parser = reqparse.RequestParser()
parser.add_argument('id', type=int, help='ID of the student')
parser.add_argument('name', type=str, help='Name of the student')


def abort_if_method_not_allowed():
    abort(405, message="Method not allowed")


students = [{"id": 1, "name": "Shovon"},
            {"id": 2, "name": "arsho"}]


class Student(Resource):
    def __init__(self, **kwargs):
        self.get_request_allowed = kwargs.get("get_request_allowed", False)
        self.post_request_allowed = kwargs.get("post_request_allowed", False)

    def get(self):
        if not self.get_request_allowed:
            abort_if_method_not_allowed()
        return students

    def post(self):
        if not self.post_request_allowed:
            abort_if_method_not_allowed()
        student_arguments = parser.parse_args()
        student = {'id': student_arguments['id'],
                   'name': student_arguments['name']}
        students.append(student)
        return student, 201


api.add_resource(Student, '/students', endpoint="student",
                 resource_class_kwargs={"get_request_allowed": True, "post_request_allowed": True})
api.add_resource(Student, '/students/get', endpoint="student_get",
                 resource_class_kwargs={"get_request_allowed": True})
api.add_resource(Student, '/students/post', endpoint="student_post",
                 resource_class_kwargs={"post_request_allowed": True})
Expected behaviors:
curl http://localhost:5000/students/get should call the get method of Student class.
curl http://localhost:5000/students/post -d "id=3" -d "name=Sho" -X POST -v should call the post method of Student class.
curl http://localhost:5000/students can call both of the methods of Student class.
Testing:
We will call each of the endpoints listed above and check that the behavior is as expected.
Output of get request in students/get using curl http://localhost:5000/students/get:
[
{
"id": 1,
"name": "Shovon"
},
{
"id": 2,
"name": "arsho"
}
]
Output of post request in students/post using curl http://localhost:5000/students/post -d "id=3" -d "name=Shody" -X POST -v:
{
"id": 3,
"name": "Shody"
}
Output of get request in students using curl http://localhost:5000/students:
[
{
"id": 1,
"name": "Shovon"
},
{
"id": 2,
"name": "arsho"
},
{
"id": 3,
"name": "Shody"
}
]
Output of post request in students using curl http://localhost:5000/students -d "id=4" -d "name=Ahmedur" -X POST -v:
{
"id": 4,
"name": "Ahmedur"
}
Output of post request in students/get using curl http://localhost:5000/students/get -d "id=5" -d "name=Rahman" -X POST -v:
{
"message": "Method not allowed"
}
Output of get request in students/post using curl http://localhost:5000/students/post:
{
"message": "Method not allowed"
}
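As a side note that is not part of the original answer: Flask-RESTful documents that extra keyword arguments to add_resource are passed straight to Flask's add_url_rule, so restricting the allowed HTTP methods per URL may also work without the keyword-argument gate. A rough sketch, building on the code above and assuming your Flask-RESTful version forwards methods as documented (the endpoint names and URLs are illustrative):

# Flask itself answers 405 for any method not listed for a given URL rule,
# so this class needs no get/post gating of its own.
class SimpleStudent(Resource):
    def get(self):
        return students

    def post(self):
        student_arguments = parser.parse_args()
        student = {'id': student_arguments['id'],
                   'name': student_arguments['name']}
        students.append(student)
        return student, 201


api.add_resource(SimpleStudent, '/simple-students/get',
                 endpoint="simple_student_get", methods=['GET'])
api.add_resource(SimpleStudent, '/simple-students/post',
                 endpoint="simple_student_post", methods=['POST'])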
References:
Official documentation of add_resource method
I am trying to make Python Eve create different collections programmatically.
Let's say I want to expose an endpoint that receives a schema, so that it can create that collection in Mongo, i.e.:
DOMAIN = {}

@app.route('/gen')
def gen(schema):
    app.config['DOMAIN'][schema.name] = schema.def  # <-- This obviously does not work; I don't know how to approach it
So that via curl I could post this schema definition:
curl -H "Content-Type: application/json" -d '[{"name":"test", "def": "{\"age\":\"int\"}"}]' http://localhost:5000/gen
and then POST objects of this newly created collection (test):
curl -H "Content-Type: application/json" -d '[{"age":5}]' http://localhost:5000/test
Obviously this is just the initial problem. To persist it, I will later need to save this data in Mongo and load it when the application starts, so that Mongo effectively defines the Python Eve DOMAIN itself. I hope this will also be possible to achieve.
My approach is to use custom settings for Eve() object:
app = eve.Eve(settings=settings)
where settings contains DOMAIN definition:
settings = {
    "SERVER_NAME": None,
    "DEBUG": True,
    # MongoDB params
    "MONGO_HOST": '...',
    "MONGO_PORT": 27017,
    "MONGO_USERNAME": '...',
    "MONGO_PASSWORD": '...',
    # .....
    # add to DOMAIN all collections defined in `schema`
    "DOMAIN": {
        # doc
        'app_doc': {
            'item_title': 'app_doc',
            'resource_methods': ['GET', 'POST', 'DELETE'],
            'allow_unknown': True,
            'schema': {
                'name': {'type': 'string', 'required': True},
                # ....
            }
        }
    }
}
The settings variable can be modified to receive parameters from the database (I use a collection named app_schema where I keep these custom endpoint definitions).
Just connect to Mongo (I use pymongo) before instantiating Eve(), fill settings["DOMAIN"] with all the data from the app_schema collection, then pass this settings variable to Eve(settings=settings). Example here:
import json

from pymongo import MongoClient

# setup Mongo connection (see config.py - default schemas and DOMAIN are stored there)
client = MongoClient(settings["MONGO_HOST"], settings["MONGO_PORT"])
db = client.docfill

# check the app_schema collection
tab_schemas = db.app_schema.find({})


def load_settings_from_schema_collection():
    """
    Extends the settings DOMAIN variable with metadata about the "ent_"-prefixed collections.
    Creates additional API endpoints from the definitions found in the schema collection;
    this is the heavy part, as it analyzes each schema definition and creates endpoints for
    the various operations (GET/POST/DELETE/PUT/PATCH and search).
    :return:
    """
    i = 0
    # add the definitions from the app_schema collection to `settings`
    global settings
    # parse each doc, build a settings table for it, then insert that table into the DOMAIN definition
    for doc in tab_schemas:
        i = i + 1
        # this name should be unique for each collection
        this_collection_name = "ent_" + doc["collection"]
        # only allow "ent_"-prefixed schemas to be overridden
        this_schema_setting = {
            "datasource": {
                "source": this_collection_name  # important: the collection must be defined with the prefix
            },
            "resource_methods": ['GET', 'POST', 'DELETE'],
            "item_methods": ['GET', 'DELETE', 'PUT', 'PATCH'],
            "schema": {}
        }
        for fld_meta in doc["content"]:
            this_schema_setting["schema"][fld_meta] = {"type": doc["content"][fld_meta]["type"]}
            # is there a required option?
            if "required" in doc["content"][fld_meta]:
                this_schema_setting["schema"][fld_meta]["required"] = bool(doc["content"][fld_meta]["required"])
        settings["DOMAIN"][this_collection_name] = this_schema_setting
    # dump the settings variable to config.js (just for inspecting what ends up in settings)
    file = "config.js"
    with open(file, 'w') as filetowrite:
        filetowrite.write('settings = ' + json.dumps(settings, indent=4, sort_keys=True))
    return 1


# load settings from the schema collection in MongoDB: collection=app_schema
load_settings_from_schema_collection()
And finally, start server:
app = eve.Eve(settings=settings)
app.run(...)
Hope it helps!
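To get closer to the original goal of a /gen endpoint that registers a collection at runtime, recent Eve versions expose app.register_resource(name, settings) for adding a domain resource after the app has been created. A rough sketch, assuming that method is available in your Eve version; the /gen route and the payload shape are taken from the question, everything else is illustrative:

import json

from flask import jsonify, request

# assumes `app = eve.Eve(settings=settings)` from above

@app.route('/gen', methods=['POST'])
def gen():
    # expected payload, as in the question: [{"name": "test", "def": "{\"age\": \"int\"}"}]
    for item in request.get_json():
        schema_def = json.loads(item["def"])
        # map the posted {"field": "type"} pairs to a minimal resource definition
        resource_settings = {
            "schema": {field: {"type": ftype} for field, ftype in schema_def.items()},
            "resource_methods": ["GET", "POST", "DELETE"],
        }
        app.register_resource(item["name"], resource_settings)
    return jsonify({"status": "ok"})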
I am trying to write some test cases for code I've developed using Elasticsearch and Django. The concept is straightforward: I just want to test a GET request, which will run an Elasticsearch query. However, I am constructing the query as a nested dict. When I pass the nested dict to the test Client object, it gets passed through Django until it ends up at the urlencode function, which doesn't look like it can handle nested dicts, only MultiValueDicts. Any suggestions or solutions? I don't want to use any additional packages, as I don't want this application to depend on potentially unsupported packages.
Generic Code:
class MyViewTest(TestCase):
    es_connection = elasticsearch.Elasticsearch("localhost:9200")

    def test_es_query(self):
        client = Client()
        query = {
            "query": {
                "term": {
                    "city": "some city"
                }
            }
        }
        response = client.get("", query)
        print(response)
Link for urlencode function: urlencode Django
The issue is clearly in the conditional statement where the urlencode function checks whether the dictionary value is a str or bytes object. If it isn't, it creates a generator object, which can never reach the nested portions of the dictionary.
EDIT: 07/25/2018
So I was able to come up with a temporary workaround to at least run the test. However, it is ugly and I feel like there must be a better way. The first thing I tried was specifying the content_type and converting the dict to a JSON string first, but Django still kicked back an error from the urlencode function:
class MyViewTest(TestCase):
    es_connection = elasticsearch.Elasticsearch("localhost:9200")

    def test_es_query(self):
        client = Client()
        query = {
            "query": {
                "term": {
                    "city": "some city"
                }
            }
        }
        response = client.get("", data=json.dumps(query), content_type="application/json")
        print(response)
So instead I had to do:
class MyViewTest(TestCase):
    es_connection = elasticsearch.Elasticsearch("localhost:9200")

    def test_es_query(self):
        client = Client()
        query = {
            "query": {
                "term": {
                    "city": "some city"
                }
            }
        }
        query = json.dumps(query)
        response = client.get("", data={"q": query}, content_type="application/json")
        print(response)
This let me send the HttpRequest to my view and parse the query back out with:
json.loads(request.GET["q"])
Then I was able to successfully get the requested data from Elasticsearch and return it as an HttpResponse. I feel like there has to be a way in Django to pass a JSON-formatted string directly to the Client object's get function. I thought specifying the content_type as application/json would work, but it still calls the urlencode function. Any ideas? I really don't want to put this current workaround into production.
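One possible alternative, not from the original post: Django's test Client also has a generic() method that sends a raw body with any HTTP verb, so the JSON query can bypass urlencode entirely. A sketch, assuming the view is adapted to read json.loads(request.body) instead of request.GET:

import json

from django.test import Client, TestCase


class MyViewBodyTest(TestCase):
    def test_es_query_with_body(self):
        client = Client()
        query = {"query": {"term": {"city": "some city"}}}
        # generic() skips the query-string encoding and puts the JSON payload
        # directly into the body of the GET request
        response = client.generic("GET", "", json.dumps(query),
                                  content_type="application/json")
        print(response)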
Using Django trunk r13359 and django-piston, I created a small RESTful service that stores string values.
This is the model I am using to store strings:
class DataStore(models.Model):
    data = models.CharField(max_length=200)
    url = models.URLField(default='', verify_exists=False, blank=True)
I used curl to post following data:
curl -d "data=somedata" http://localhost:8000/api/datastorage/
This is the code that handles storage, as part of the django-piston handler:
store = DataStore()
store.url = request.POST.get('url',""),
store.data = request.POST['data'],
store.save()
return {'data':store}
When I post the data with curl I get the following response body, which is expected:
{
    "result": {
        "url": [
            ""
        ],
        "data": [
            "somedata"
        ],
        "id": 1
    }
}
What's not expected, however, is that when I look at the stored instance in the Django admin, the value stored in the data field looks something like this:
(u'somedata',)
and the following is stored in the url field:
('',)
What's even more interesting is that when I query the service with curl to see what is stored, I get the following:
{
    "result": {
        "url": [
            "('',)"
        ],
        "data": [
            "(u'somedata',)"
        ],
        "id": 1
    }
}
I'm stumped... any ideas what could be going on?
Actually your response is also not what should be expected; note the [] around your strings, which shouldn't be there.
Your error is the trailing comma at the end of these two lines:
store.url = request.POST.get('url',""),
store.data = request.POST['data'],
Python interprets this as storing a tuple in url and data, and Django implicitly converts those tuples to strings, resulting in the behaviour you see. Just remove the two commas and you'll be fine.
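A minimal illustration of the trailing-comma pitfall, followed by the corrected handler lines (unchanged from the question apart from the removed commas):

# A trailing comma turns the right-hand side into a one-element tuple:
data = "somedata",
print(data)       # ('somedata',)
print(str(data))  # "('somedata',)"  <- this string is what ends up in the CharField

# Corrected handler lines, without the trailing commas:
store = DataStore()
store.url = request.POST.get('url', "")
store.data = request.POST['data']
store.save()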