I'd like to build a Django app with both a GraphQL endpoint and a REST API. Maintaining both separately would be too much of a pain; I'm looking for a good way to only maintain the GraphQL service and have the REST endpoints generated automagically.
Does anyone know about a good way to do this?
I know there are ways to build a GraphQL server on top of REST endpoints, but I'd rather have it the other way around, as the REST API requirement might go away in the future.
If you don't mind using Node.js, there is a library that does that (graphql2rest): https://github.com/sisense/graphql2rest
You can use it to automatically generate a REST API from your existing GraphQL schema.
"GraphQL2REST is a Node.js library that reads your GraphQL schema and a user-provided manifest file and automatically generates an Express router with fully RESTful HTTP routes — a full-fledged REST API."
If your only problem is not having a dedicated client on the client side, and you can live with long URLs, then your GraphQL endpoint already is your REST-like API. Disclaimer: untested code, for educational purposes only ;)
Read all posts:
GET /api/posts
=>
GET /graphql?query={posts{id title body}}
Create a post
POST /api/posts
{ "title": "Hello", "body": "world" }
=>
POST /graphql?query=mutation m($title:String!,$body:String!){createPost(title:$title,body:$body){id title body}}
{ "variables": { "title": "Hello", "body": "world" } }
Your code can then work in a REST-like manner (imagine Redux actions calling REST APIs).
If you want something more in terms of a server, you can easily reverse what you described here:
I know there are ways to build a GraphQL server on top of REST endpoints,
You can build a proxy that rewrites your REST queries into GraphQL queries. This might be much easier than the reverse (wrapping REST endpoints in a GraphQL server), since your GraphQL API is much more powerful.
Some Node.js Express code, since I don't know any Python :(
const Lokka = require('lokka').Lokka;
const Transport = require('lokka-transport-http').Transport;
const express = require('express');

// Lokka client pointed at your GraphQL endpoint
const client = new Lokka({
  transport: new Transport('...')
});

const app = express();

// a REST route that proxies to a fixed GraphQL query
app.get('/api/posts', (req, res) => {
  client.query('{posts{id title body}}').then(result => {
    res.status(200).send(result);
  }).catch(error => {
    res.status(500).end(); // or other error handling
  });
});

app.listen(3000); // any free port
I have an API Gateway defined in the Python CDK that will accept RESTful requests (e.g. via curl) to upload / read / delete files from an S3 bucket:
api = api_gw.RestApi(self, "file-api",
                     rest_api_name="File REST Service")
file = api.root.add_resource("{id}")

get_files_integration = api_gw.LambdaIntegration(handler,
    request_templates={"application/json": '{ "statusCode": "200" }'})
post_file_integration = api_gw.LambdaIntegration(handler)
get_file_integration = api_gw.LambdaIntegration(handler)
delete_file_integration = api_gw.LambdaIntegration(handler)

api.root.add_method("GET", get_files_integration,
                    authorization_type=api_gw.AuthorizationType.COGNITO,
                    authorizer=auth)
file.add_method("POST", post_file_integration)      # POST /{id}
file.add_method("GET", get_file_integration)        # GET /{id}
file.add_method("DELETE", delete_file_integration)  # DELETE /{id}
Is it possible to enable CORS on the API Gateway so that it will perform pre-flight checks and allow external access from a localhost on another machine?
I have attempted to use the add_cors_preflight() method defined in the documentation I can find, but I believe this may no longer be valid as of CDK 2.0.
Yes, IResource.add_cors_preflight() does exactly this.
You can also specify default CORS config with the default_cors_preflight_options attribute of RestApi.
Here are the examples from the docs. They're in TypeScript, but it works the same in Python.
The following example will enable CORS for all methods and all origins on all resources of the API:
new apigateway.RestApi(this, 'api', {
defaultCorsPreflightOptions: {
allowOrigins: apigateway.Cors.ALL_ORIGINS,
allowMethods: apigateway.Cors.ALL_METHODS // this is also the default
}
})
The following example will add an OPTIONS method to the myResource API resource, which only allows GET and PUT HTTP requests from the origin https://amazon.com.
declare const myResource: apigateway.Resource;
myResource.addCorsPreflight({
allowOrigins: [ 'https://amazon.com' ],
allowMethods: [ 'GET', 'PUT' ]
});
I'm trying my hand at building a simple CMS with React, Flask, and MongoDB. I am trying to find a way to get data from MongoDB through Flask to render the correct React components.
The data is stored in MongoDB as:
{
title: "home",
modules: {
headerBlock: {
title: "My Website"
byline: "Some other text here"
}
}
}
I can get that data into Python fairly easily, but then I need to get Flask to render the React components. It would translate to:
<Header title="My Website" byline="Some other text here" />
So there needs to be some way for Flask to provide a container and information about which components to render. (There will be more than one component).
Any help or tips or pushes in the right direction would be appreciated!
It depends. If you are building a single-page application (SPA) with React on the front end, you need to pass data from your back end (Flask) to React as JSON.
Flask has a function called jsonify that returns a JSON response.
I developed a simple app using React Native and I am trying to connect it to a database. I looked online and found out that many companies like Facebook, Instagram, and YouTube use MySQL + Python.
I want to use Python to manipulate (insert/delete/update) the database, but in order to do that I need to create an API. Using Node.js + Express I would do something like this:
import express from 'express';
import mysql from 'mysql';
const app = express();
const port = 3000; // any free port

app.listen(port, () => {
  console.log('server started');
  // do stuff
});
//database
const database = mysql.createConnection({
host : 'localhost',
user : 'root',
password : 'password12345',
database : 'database_app'
});
//create
app.get('/create', (req, res)=> {
// check error
// create query
});
//delete
app.get('/delete', (req, res)=> {
// check error
// delete query
});
//update
app.get('/update', (req, res)=> {
// check error
// update query
});
How do I do the same with Python? And how do I connect it to my app?
You'll need something like Django or Flask to serve up the data for it to be consumed by the front end.
In Flask, you make a blueprint, then use that blueprint to serve routes, which is analogous to what you were doing in your Node.js code.
app.get('/update', (req, res)=> {
// check error
// update query
});
The Flask version would look like:
from flask import Blueprint, jsonify

users_blueprint = Blueprint('users', __name__, template_folder='./templates')

@users_blueprint.route('/users/', methods=['GET'])
def get_users():
    return jsonify({
        'status': 'success',
        'message': 'test response msg',
        'data': mydata  # mydata would come from your Model
    })
You could use code like this after registering the Users Blueprint and creating a Model.
First of all, I don't have a strong Python background, but I know that people use Flask for building REST APIs. The way you connect the API to your app is via URL: start your server, and on the client side of your RN app fetch the data from the URL where your REST API is mounted (in dev mode, probably localhost:$PORT).
I am configuring my mobile applications with Firebase Cloud Messaging.
I've finally figured out how to send these annoying-to-configure notifications.
My python code looks like this
import json
import requests

url = 'https://fcm.googleapis.com/fcm/send'
body = {
    "data": {
        "title": "mytitle",
        "body": "mybody",
        "url": "myurl"
    },
    "notification": {
        "title": "My web app name",
        "body": "message",
        "content_available": "true"
    },
    "to": "device_id_here"
}
headers = {
    "Content-Type": "application/json",
    "Authorization": "key=api_key_here"
}
requests.post(url, data=json.dumps(body), headers=headers)
I would think that putting this in a for loop and swapping device IDs to send thousands of notifications would put an immense strain on the server and be bad programming practice. (Correct me if I'm wrong.)
Now the documentation tells me to create "device groups" (https://firebase.google.com/docs/cloud-messaging/notifications), which store device IDs so you can send in bulk. This is annoying and inefficient, as the groups for my web application are constantly changing.
Plain and Simple
How do I send the notification above to an array of device IDs that I specify in my Python code, so that I can make only one POST to FCM instead of thousands?
To send an FCM message to multiple devices, use the key "registration_ids" instead of "to":
"registration_ids": ["fcm_token1", "fcm_token2"]
Have a look at this package and see how they implemented it.
Instead of "to":"device_id" you should use "to":"topic" ,
topic is use from group messaging in FCM or GCM
https://developers.google.com/cloud-messaging/topic-messaging
I want to build an upload-centric app using Django. One way to do this is with nginx's upload module (non-blocking), but it has its problems. Node.js is supposed to be a good candidate for this type of application. But how can I make Node.js act as an upload_handler() for Django? I'm not sure where to look for examples.
Okay, I'm not an expert on the subject or anything, but the way I understand it, nginx is a proxy that runs in front of the web server that serves your Django app, right?
You could build a simple Node.js server that does the same: listen on port 80, wait until the request has been completely sent to the server, and then forward it to the web server that serves the Django app. If your problem is that the web server's threads are being tied up by long-running uploads, then I think this would solve that problem.
Here's some code, just off the top of my head:
var http = require('http');

var server = http.createServer(function (req, res) {
  // buffer the whole request body (including file uploads) in memory
  var chunks = [];
  req.on('data', function (chunk) {
    chunks.push(chunk);
  });
  req.on('end', function () {
    // at this point the request is completely uploaded (including files),
    // so it can be forwarded to the Django web server
    var body = Buffer.concat(chunks);
    var dj_req = http.request({
      host: 'localhost',
      port: 8000,
      method: req.method,
      path: req.url,
      headers: req.headers
    }, function (dj_res) {
      // the response has been received from the Django server,
      // so we can stream it back to the waiting client
      res.writeHead(dj_res.statusCode, dj_res.headers);
      dj_res.pipe(res);
    });
    // send the buffered request to the Django web server
    dj_req.end(body);
  });
});

server.listen(80);
Make Node.js write the file to disk and proxy the upload POST to Django along with a '_node_file' value, so your Django view knows where to get the file.