OK, I'm sure there's something wrong with this jsonschema, but I just can't seem to wrap my head around the problem.
I'm not going to post the actual code, but a minimal example that reproduces the issue.
Here's what I had before, which worked fine:
person = {
'type': 'object',
'properties': {
'name': {
'type': 'string'
},
'auth_token': {
'type': 'string',
},
'username': {
'type': 'string'
},
'password': {
'type': 'string'
}
},
'oneOf': [
{
'required': ['auth_token']
},
{
'required': ['username', 'password']
}
],
'required': ['name']
}
The idea here was that you always need to provide the name of the person, and then either an auth token or a username and password pair. As I said above, this validation worked fine: we have parametrized tests that send all possible combinations of invalid JSON and evaluate the resulting error messages, and those tests pass.
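For concreteness, here is how that schema behaves under the `jsonschema` library (a minimal sketch; the `is_valid` helper is mine, the field values are made up):

```python
from jsonschema import ValidationError, validate

# The schema from above, condensed.
person = {
    'type': 'object',
    'properties': {
        'name': {'type': 'string'},
        'auth_token': {'type': 'string'},
        'username': {'type': 'string'},
        'password': {'type': 'string'},
    },
    'oneOf': [
        {'required': ['auth_token']},
        {'required': ['username', 'password']},
    ],
    'required': ['name'],
}

def is_valid(doc):
    """True if doc validates against the person schema (helper of mine)."""
    try:
        validate(doc, person)
        return True
    except ValidationError:
        return False

# A name plus exactly one credential style passes:
print(is_valid({'name': 'Jane', 'auth_token': 'abc'}))                 # True
print(is_valid({'name': 'Jane', 'username': 'jd', 'password': 'pw'}))  # True
# Both credential styles at once match two oneOf branches, so oneOf fails:
print(is_valid({'name': 'Jane', 'auth_token': 'abc',
                'username': 'jd', 'password': 'pw'}))                  # False
```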
But then a new requirement came in and I needed to add a second mutually exclusive required pair of fields, which I did in this way:
person = {
'type': 'object',
'properties': {
'name': {
'type': 'string'
},
'auth_token': {
'type': 'string',
},
'username': {
'type': 'string'
},
'password': {
'type': 'string'
},
'project_id': {
'type': 'number'
},
'contract_date_from': {
'type': 'string'
},
'contract_date_to': {
'type': 'string'
}
},
'allOf': [
{
'oneOf': [
{
'required': ['auth_token']
},
{
'required': ['username', 'password']
}
]
},
{
'oneOf': [
{
'required': ['project_id']
},
{
'required': ['contract_date_from', 'contract_date_to']
}
]
}
],
'required': ['name']
}
But now the second validation always fails, whether the JSON provided is valid or invalid. The error message I get is:
{'name': 'John Doe', 'auth_token': '9d9a324b-26de-4ac3-85eb-05566e4a7204', 'username': None, 'password': None, 'project_id': 2785, 'contract_date_from': None, 'contract_date_to': None} is valid under each of {'required': ['contract_date_from', 'contract_date_to']}, {'required': ['project_id']}
No matter what values I send in those three fields (i.e. project_id, contract_date_from and contract_date_to), it fails with the same error. I've tried leaving all three empty, filling in all three, and every combination in between, but the error stays the same.
I've been reading the documentation for json schema but I can't seem to grasp what's going on with this example. I'm considering trying different approaches for this, but I'd really like to understand why this is not working. Any help is appreciated!
{'name': 'John Doe', 'auth_token': '9d9a324b-26de-4ac3-85eb-05566e4a7204', 'username': None, 'password': None, 'project_id': 2785, 'contract_date_from': None, 'contract_date_to': None} is valid under each of {'required': ['contract_date_from', 'contract_date_to']}, {'required': ['project_id']}
Read the error message more carefully: you asked for either project_id, or contract_date_from and contract_date_to, but you are providing all three. Providing a null value for a property is still providing that property. The error message is confusing, but you would be failing validation anyway, because null is not a string. Your evaluator simply runs the allOf -> oneOf branches first, so that is the error that comes back first. You should still get the type violation errors as well, though (if you don't, that's a bug: evaluators are required to report all errors, not just the first).
You can make the errors clearer, at the expense of brevity, by moving the "type" checks to live next to the "required" keywords. That ensures the oneOf branches produce failures rather than spurious successes, and should make the error messages more obvious.
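A sketch of that suggestion, using the field names from the question (assuming the `jsonschema` library; the `check` helper is mine): each oneOf branch pairs its `required` keys with the matching `type` constraints, so a branch that only receives nulls fails outright instead of counting as a match.

```python
from jsonschema import ValidationError, validate

# Each oneOf branch carries both the 'required' keys and their types.
schema = {
    'type': 'object',
    'oneOf': [
        {
            'required': ['project_id'],
            'properties': {'project_id': {'type': 'number'}},
        },
        {
            'required': ['contract_date_from', 'contract_date_to'],
            'properties': {
                'contract_date_from': {'type': 'string'},
                'contract_date_to': {'type': 'string'},
            },
        },
    ],
}

def check(doc):
    """Return 'valid' or the first validation error message."""
    try:
        validate(doc, schema)
        return 'valid'
    except ValidationError as e:
        return e.message

# The nulls no longer satisfy the date branch (null is not a string),
# so only the project_id branch matches and oneOf succeeds exactly once:
print(check({'project_id': 2785,
             'contract_date_from': None,
             'contract_date_to': None}))
```

Note that this only improves the oneOf behavior; the cleanest client-side fix is still to omit absent keys rather than send them as null.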
AVOID EVAL
My question was answered and I initially used eval, but after reading up on what eval does and can do I stopped using it and went with an alternative found here: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/eval#Do_not_ever_use_eval!
In my application I'm building the whole chart options object in the backend and returning it as a JSON response:
def get_chart_data(request):
chart = {
'title': {
'text': ''
},
'xAxis': {
'categories': [],
'title': {
'text': ''
},
'type': 'category',
'crosshair': True
},
'yAxis': [{
'allowDecimals': False,
'min': 0,
'title': {
'text': ''
}
}, {
'allowDecimals': False,
'min': 0,
'title': {
'text': ''
},
'opposite': True
}],
'series': [{
'type': 'column',
'yAxis': 1,
'name': '',
'data': []
}, {
'type': 'line',
'name': '',
'data': []
}, {
'type': 'line',
'name': '',
'data': []
}]
}
return JsonResponse(chart)
I then fetch it with Ajax and pass the response straight to Highcharts:
Highcharts.chart('dashboard1', data);
This works so far, but I've run into problems when I want to use Highcharts functions as part of the options, for example setting a text color with Highcharts.getOptions().colors[0]:
'title': {
'text': 'Rainfall',
'style': {
'color': Highcharts.getOptions().colors[0]
}
},
If I don't quote this when building the options in views.py, it is treated as Python code and raises an error; if I do quote it, it arrives in JavaScript as a string, which doesn't work.
Is this possible, or should I just build the options in JavaScript and fetch only the data part from the backend instead of the whole thing?
You could return the JS code from Django as a string and then run eval() on it, but executing code like that opens the possibility of an XSS attack, especially if the information is user-submitted.
Your best bet otherwise would be to create the styling on the JS end if possible, and manipulate the incoming data.
document.querySelector('a').addEventListener('click', function (e) {
e.preventDefault();
var complexJson = {"parent": {"child": "alert('Here is a nested alert!')"}}
var alertString = "alert('Here is a simple alert!')";
eval(complexJson["parent"]["child"])
eval(alertString)
})
<a href="#">Click me!</a>
I'm trying to deploy a Custom Instance Template using gcloud deployment-manager, but I keep getting this error:
ERROR: (gcloud.deployment-manager.deployments.update) Error in Operation [operation-1507833758152-55b5de788f540-e3be8bf6-a792d98e]: errors:
- code: RESOURCE_ERROR
location: /deployments/my-project/resources/worker-template
message: '{"ResourceType":"compute.v1.instanceTemplate","ResourceErrorCode":"400","ResourceErrorMessage":{"code":400,"errors":[{"domain":"global","message":"Invalid
value for field ''resource.properties'': ''''. Instance Templates must provide
instance properties.","reason":"invalid"}],"message":"Invalid value for field
''resource.properties'': ''''. Instance Templates must provide instance properties.","statusMessage":"Bad
Request","requestPath":"https://www.googleapis.com/compute/v1/projects/my-project/global/instanceTemplates","httpMethod":"POST"}}'
My python generate_config function is this:
def generate_config(context):
resources = [{
'type': 'compute.v1.instanceTemplate',
'name': 'worker-template',
'properties': {
'zone': context.properties['zone'],
'description': 'Worker Template',
'machineType': context.properties['machineType'],
'disks': [{
'deviceName': 'boot',
'type': 'PERSISTENT',
'boot': True,
'autoDelete': True,
'initializeParams': {
'sourceImage': '/'.join([
context.properties['compute_base_url'],
'projects', context.properties['os_project'],
'global/images/family', context.properties['os_project_family']
])
}
}],
'networkInterfaces': [{
'network': '$(ref.' + context.properties['network'] + '.selfLink)',
'accessConfigs': [{
'name': 'External NAT',
'type': 'ONE_TO_ONE_NAT'
}]
}]
}
}]
return {'resources': resources}
Properties is not empty, so the error message doesn't make much sense. Any ideas?
Thx!
After reading this example, I just found that the correct structure for compute.v1.instanceTemplate is:
...
'type': 'compute.v1.instanceTemplate',
'name': 'worker-template',
'properties': {
'project': 'my-project',
'properties': {
'zone': context.properties['zone'],
...
}
}
...
The structure follows this doc
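Putting that together, the corrected generate_config looks roughly like this (a sketch: the FakeContext stub and the use of context.env['project'] are my assumptions for illustration; the disks and networkInterfaces sections stay exactly as in the original function):

```python
def generate_config(context):
    # compute.v1.instanceTemplate wraps the actual instance settings in a
    # second, nested 'properties' dict; the outer one holds resource-level
    # settings such as the project.
    resources = [{
        'type': 'compute.v1.instanceTemplate',
        'name': 'worker-template',
        'properties': {
            'project': context.env['project'],  # assumption: deployment's project
            'properties': {
                'machineType': context.properties['machineType'],
                # ... 'disks' and 'networkInterfaces' exactly as in the
                # original function ...
            },
        },
    }]
    return {'resources': resources}


# Minimal stand-in for deployment-manager's context object, just to
# exercise the function locally (hypothetical values):
class FakeContext:
    env = {'project': 'my-project'}
    properties = {'machineType': 'n1-standard-1'}

config = generate_config(FakeContext())
print(config['resources'][0]['properties']['properties']['machineType'])
```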
I am using Python Eve, which is awesome; however, I've run into a problem and I'm not sure there is a solution.
I have a 'fields' dict in this schema:
'profiles': {
'fields': {
'type': 'dict',
'default': {}
}
}
I'd like to update the 'fields' field with a PATCH, but a PATCH request will never REMOVE a key inside 'fields'; on the other hand, I can't use PUT, or all my other profile fields (not shown above) will disappear.
I tried using a subresource like this:
'profile-fields': {
'schema': {
'fields': {
'type': 'dict',
'default': {}
}
},
'datasource': {
'source': 'profiles',
'projection': { 'fields': 1 }
}
},
but as the Python Eve documentation states:
Please note that POST and PATCH methods will still allow the whole schema to be manipulated
http://python-eve.org/config.html#multiple-api-endpoints-one-datasource
Anyone know of a way to do this?
For example:
# Create a record
POST /api/profiles
{
'name': 'Test',
'fields': {
'one': 1,
'two': 2
}
}
# => { _created: 'blah', _id: '123456' }
# then update fields with a PATCH request
PATCH /api/profiles/123456
{
'fields': {
'three': 3,
'four': 4
}
}
# then get the updated record
GET /api/profiles/123456
# RESPONSE
{
'_id': '123456',
'name': 'Test',
'fields': {
'one': 1,
'two': 2,
'three': 3,
'four': 4
}
}
I have just conceded to using a PUT request and sending the entire object back again, which is ok I guess, just thought there might be a way to do this.
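That conceded read-modify-write cycle can at least be kept in one place. A sketch (the endpoint URL is hypothetical, the If-Match header follows Eve's default concurrency control, and build_replacement is a helper name of my own):

```python
def build_replacement(doc, new_fields):
    """Strip Eve's metadata (_id, _etag, _created, ...) and replace the
    'fields' dict wholesale, so removed keys actually disappear."""
    body = {k: v for k, v in doc.items() if not k.startswith('_')}
    body['fields'] = new_fields
    return body


def replace_fields(profile_id, new_fields):
    """GET the document, rebuild it, PUT it back with the current _etag."""
    import requests  # assumption: the client uses the requests library
    base = 'http://localhost:5000/api/profiles'  # hypothetical endpoint
    doc = requests.get(f'{base}/{profile_id}').json()
    resp = requests.put(f'{base}/{profile_id}',
                        json=build_replacement(doc, new_fields),
                        headers={'If-Match': doc['_etag']})
    return resp.status_code


# The pure helper by itself:
print(build_replacement(
    {'_id': '123456', '_etag': 'abc', 'name': 'Test',
     'fields': {'one': 1, 'two': 2}},
    {'three': 3}))
```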
I'm using Tornado_JSON, which is based on jsonschema, and there is a problem with my schema definition. I tried fixing it in an online schema validator, and the problem seems to lie in "additionalItems": True. True with a capital T works in Python but makes the online validator reject the schema ("Schema is invalid JSON."). With true the online validator is happy and the example JSON validates against the schema, but my Python script no longer starts (NameError: name 'true' is not defined). Can this be resolved somehow?
@schema.validate(
    input_schema={
        'type': 'object',
        'properties': {
            'DB': {
                'type': 'number'
            },
            'values': {
                'type': 'array',
                'items': [
                    {
                        'type': 'array',
                        'items': [
                            {
                                'type': 'string'
                            },
                            {
                                'type': [
                                    'number',
                                    'string',
                                    'boolean',
                                    'null'
                                ]
                            }
                        ]
                    }
                ],
                'additionalItems': true
            }
        }
    },
    input_example={
        'DB': 22,
        'values': [['INT', 44], ['REAL', 33.33], ['CHAR', 'b']]
    }
)
I changed it according to your comments (external file parsed with json.loads()). Perfect, thank you.
Put the schema in a triple-quoted string or an external file, then parse it with json.loads(). Use the lower-case spelling.
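A sketch of that approach: keep the schema as real JSON (inline here, but an external file works the same way) and parse it with json.loads(), so JSON's lower-case true comes back as Python's True:

```python
import json

# The schema as JSON text: double quotes and lower-case true are
# mandatory here, because this is JSON, not a Python literal.
schema_text = """
{
    "type": "object",
    "properties": {
        "DB": {"type": "number"},
        "values": {
            "type": "array",
            "items": [
                {"type": "array",
                 "items": [{"type": "string"},
                           {"type": ["number", "string", "boolean", "null"]}]}
            ],
            "additionalItems": true
        }
    }
}
"""

input_schema = json.loads(schema_text)
# json.loads turns JSON's true into Python's True:
print(input_schema['properties']['values']['additionalItems'])  # True
```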
The error stems from trying to put a builtin Python datatype into a JSON schema. The latter is a template syntax that is used to check type consistency and should not hold actual data. Instead, under input_schema you'll want to define "additionalItems" to be of { "type": "boolean" } and then add it to the test JSON in your input_example with a boolean after for testing purposes.
Also, I'm not too familiar with Tornado_JSON but it looks like you aren't complying with the schema definition language by placing "additionalItems" inside of the "values" property. Bring that up one level.
More specifically, I think what you're trying to do should look like:
"values": {
...value schema definition...
},
"additionalItems": {
"type": "boolean"
}
And the input examples would become:
input_example={
"DB": 22,
"values": [['INT', 44],['REAL', 33.33],['CHAR', 'b']],
"additionalItems": true
}
As far as I understand, Python Eve does not support two levels of embedding; can you confirm?
To put it another way: given a document A referring to a document B, which in turn refers to a document C, it is not possible to have Eve serve A documents with C embedded, right?
I think this is not possible, since the docs also state the following:
We do not support multiple layers embeddings
This doesn't work; I have tried it. The settings.py file below shows the double embedding I've tried.
import os
from schemas.subjects import subject_schema
from schemas.units import units_schema
# We want to seamlessly run our API both locally and on Heroku. If running on
# Heroku, sensible DB connection settings are stored in environment variables.
# MONGO_URI = os.environ.get('MONGODB_URI', 'mongodb://user:user@localhost:27017/evedemo')
MONGO_HOST = 'localhost'
MONGO_PORT = 27017
MONGO_DBNAME = 'test'
# Enable reads (GET), inserts (POST) and DELETE for resources/collections
# (if you omit this line, the API will default to ['GET'] and provide
# read-only access to the endpoint).
RESOURCE_METHODS = ['GET', 'POST', 'DELETE']
# Enable reads (GET), edits (PATCH) and deletes of individual items
# (defaults to read-only item access).
ITEM_METHODS = ['GET', 'PATCH', 'DELETE']
# We enable standard client cache directives for all resources exposed by the
# API. We can always override these global settings later.
CACHE_CONTROL = 'max-age=20'
CACHE_EXPIRES = 20
# Our API will expose three resources (MongoDB collections): 'people',
# 'works' and 'interests'. In order to allow for proper data validation, we
# define behaviour and structure.
people = {
# 'title' tag used in item links.
'item_title': 'person',
# by default the standard item entry point is defined as
# '/people/<ObjectId>/'. We leave it untouched, and we also enable an
# additional read-only entry point. This way consumers can also perform GET
# requests at '/people/<lastname>/'.
'additional_lookup': {
'url': 'regex("[\w]+")',
'field': 'lastname'
},
# Schema definition, based on Cerberus grammar. Check the Cerberus project
# (https://github.com/pyeve/cerberus) for details.
'schema': {
'id':{'type':'integer','required':True},
'firstname': {
'type': 'string',
'minlength': 1,
'maxlength': 10,
},
'lastname': {
'type': 'string',
'minlength': 1,
'maxlength': 15,
'required': True,
# talk about hard constraints! For the purpose of the demo
# 'lastname' is an API entry-point, so we need it to be unique.
'unique': True,
},
# 'role' is a list, and can only contain values from 'allowed'.
'role': {
'type': 'list',
'allowed': ["author", "contributor", "copy"],
},
# An embedded 'strongly-typed' dictionary.
'location': {
'type': 'dict',
'schema': {
'address': {'type': 'string'},
'city': {'type': 'string'}
},
},
'born': {
'type': 'datetime',
},
}
}
works = {
# if 'item_title' is not provided Eve will just strip the final
# 's' from resource name, and use it as the item_title.
#'item_title': 'work',
# We choose to override global cache-control directives for this resource.
'cache_control': 'max-age=10,must-revalidate',
'cache_expires': 10,
'schema': {
'title': {
'type': 'string',
'required': True,
},
'description': {
'type': 'string',
},
'owner': {
'type': 'objectid',
'required': True,
# referential integrity constraint: value must exist in the
# 'people' collection. Since we aren't declaring a 'field' key,
# will default to `people._id` (or, more precisely, to whatever
# ID_FIELD value is).
'data_relation': {
'resource': 'people',
# make the owner embeddable with ?embedded={"owner":1}
'embeddable': True
},
},
}
}
interests={
'cache_control': 'max-age=10,must-revalidate',
'cache_expires': 10,
'schema':{
'interest':{
'type':'objectid',
'required':True,
'data_relation': {
'resource': 'works',
# make the interest embeddable with ?embedded={"interest":1}
'embeddable': True
},
}
}
}
# The DOMAIN dict explains which resources will be available and how they will
# be accessible to the API consumer.
DOMAIN = {
'people': people,
'works': works,
'interests':interests,
}
The end-point:
http://127.0.0.1:5000/works?embedded={"owner":1} works as expected.
http://127.0.0.1:5000/interests?embedded={"interest":1}&embedded={"owner":1} doesn't return the desired result.