I have a Flask view that returns some JSON data and want to get that same data in a Flask-Script command. I was going to use the Requests library but don't know what URL to use without hard-coding the host. How can I get the JSON data returned by the view?
@manager.command
def showdata():
    data = requests.get('/data-page')  # fails, requests needs a full URL including the host
    print(data.json())
When you're inside your own application, you don't need to make an external request at all. Use the test client to make a request against the app directly.
import json

c = app.test_client()
r = c.get('/data-page')
data = json.loads(r.get_data(as_text=True))
print(data)
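With a reasonably recent Flask, the test client response can also parse itself, which saves the manual json.loads (this depends on your Flask version, so treat it as an aside):

r = c.get('/data-page')
data = r.get_json()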
Ideally, extract the logic from the presentation in the view and just call a function rather than making any request at all.
# command
print(get_data())
# view
return jsonify(get_data())
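Put together, the shared-function approach might look like the sketch below; get_data() is a placeholder for whatever actually builds the data, and the Flask-Script wiring is assumed rather than taken from the question:

from flask import Flask, jsonify
from flask_script import Manager

app = Flask(__name__)
manager = Manager(app)

def get_data():
    # placeholder for the real logic that assembles the data
    return {'name': 'value'}

@app.route('/data-page')
def data_page():
    # the view is only presentation around get_data()
    return jsonify(get_data())

@manager.command
def showdata():
    # the command calls the same function directly, no HTTP involved
    print(get_data())

if __name__ == '__main__':
    manager.run()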
I wrote a Python program that gets a photo from a web server. The photo is obtained by sending a POST request to my URL (the returned photo depends on the data in the POST request):
import requests

myobj = {'x': [1, 2, 3, 4], 'y': [1, 2, 3, 6]}
x = requests.post('http://cainevisualizer.azurewebsites.net/plot.png', data=myobj)
x is a requests.Response object with attributes exposing its content, status code, response URL, text (as unicode), and so on. However, it appears that, in order to send an image in a Twilio text message, Message().media requires the URL of the image.
message = Message()
message.media(myURL)
Again, the web server (in Flask) returns an image after a POST request rather than returning a unique URL to the image. Is there an API or some other way to convert a MIME image into a unique URL? Any advice appreciated.
I think I found a solution to my own question. I changed the web server that hosts the photo to accept GET requests, and I now pass my parameters in a GET request to the web server. The GET request looks like this:
import requests
data = {'x[]': [1,2,3,4], 'y[]': [4,5,6,7]}
response = requests.get('http://cainevisualizer.azurewebsites.net/plot.png', params=data)
url = response.url
This request encodes the parameters from the data dictionary into the URL itself: [1,2,3,4] and [4,5,6,7] are passed as query parameters rather than in the body of the request (or anywhere else BUT the URL itself), so response.url is a single URL that fully describes the request.
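For example, the resulting response.url comes back with the lists flattened into repeated query parameters, roughly like this (the exact percent-encoding of the brackets may differ):

http://cainevisualizer.azurewebsites.net/plot.png?x%5B%5D=1&x%5B%5D=2&x%5B%5D=3&x%5B%5D=4&y%5B%5D=4&y%5B%5D=5&y%5B%5D=6&y%5B%5D=7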
I now use request.args.getlist('x[]') and request.args.getlist('y[]') in the web server to get the information from this GET request... it looks something like this:
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route('/plot.png', methods=['GET', 'POST'])
def plot():
    xs = request.args.getlist('x[]')
    ys = request.args.getlist('y[]')
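For completeness, here is one way the rest of the view could turn those lists into a PNG response. This is only a sketch that assumes matplotlib; the original post stops at reading the query parameters:

import io

import matplotlib
matplotlib.use('Agg')  # render without a display
import matplotlib.pyplot as plt
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route('/plot.png', methods=['GET', 'POST'])
def plot():
    # query parameters arrive as strings, so convert them to numbers first
    xs = [float(x) for x in request.args.getlist('x[]')]
    ys = [float(y) for y in request.args.getlist('y[]')]

    fig, ax = plt.subplots()
    ax.plot(xs, ys)

    buf = io.BytesIO()
    fig.savefig(buf, format='png')
    plt.close(fig)

    response = make_response(buf.getvalue())
    response.headers['Content-Type'] = 'image/png'
    return response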
Good day,
I am currently trying to figure out a way to make non-blocking requests inside a simple mitmproxy script, but the documentation doesn't seem clear to me at first glance.
I think it's probably easiest if I show my current code and describe my issue below:
from copy import copy
from mitmproxy import http

def request(flow: http.HTTPFlow):
    headers = copy(flow.request.headers)
    headers.update({"Authorization": "<removed>", "Requested-URI": flow.request.pretty_url})
    req = http.HTTPRequest(
        first_line_format="origin_form",
        scheme=flow.request.scheme,
        port=443,
        path="/",
        http_version=flow.request.http_version,
        content=flow.request.content,
        host="my.api.xyz",
        headers=headers,
        method=flow.request.method
    )
    print(req.get_text())
    flow.response = http.HTTPResponse.make(
        200, req.content,
    )
Basically, I would like to intercept every HTTP(S) request that is made and issue a non-blocking request to an API endpoint at https://my.api.xyz/, which should take all of the original headers and return a PNG screenshot of the originally requested URL.
However, the code above produces empty content and the print outputs nothing either.
My issue seems to be related to: mitmproxy http get request in script and Resubmitting a request from a response in mitmproxy, but I still couldn't figure out a proper way of sending requests inside mitmproxy.
The following piece of code probably does what you are looking for:
from copy import copy
from mitmproxy import http
from mitmproxy import ctx
from mitmproxy.addons import clientplayback

def request(flow: http.HTTPFlow):
    ctx.log.info("Inside request")
    if hasattr(flow.request, 'is_custom'):
        return
    headers = copy(flow.request.headers)
    headers.update({"Authorization": "<removed>", "Requested-URI": flow.request.pretty_url})
    req = http.HTTPRequest(
        first_line_format="origin_form",
        scheme='http',
        port=8000,
        path="/",
        http_version=flow.request.http_version,
        content=flow.request.content,
        host="localhost",
        headers=headers,
        method=flow.request.method
    )
    req.is_custom = True
    playback = ctx.master.addons.get('clientplayback')
    f = flow.copy()
    f.request = req
    playback.start_replay([f])
It uses the clientplayback addon to send out the request. When this new request is sent, it generates another request event, which would cause an infinite loop; that is the reason for the is_custom attribute I add to the request. If the request that generated this event is one we created ourselves, we don't create a new request from it.
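To try it, save the snippet as a script file and load it with mitmproxy's -s option (for example, mitmdump -s replay_addon.py, where replay_addon.py is just a placeholder name for the file above).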
I have several Python scripts that are used from the CLI. Now I have been asked if I could provide a web API to perform some of the work normally done from the CLI. I would like to respond with JSON formatted data. I know little about JSON or API's.
How do I retrieve the query data from the http request?
How do I create the HTTP headers for the response from my script?
Given the following URL, how can I respond with a JSON reply of the name "Steve"?
http://myserver.com/api.py?query=who
I already have a functioning Apache web server running, and can serve up HTML and CGI scripts. It's simply coding the Python script that I need some help with. I would prefer not to use a framework, like Flask.
A simple code example would be appreciated.
The following is the Python code that I've come up with, but I know it's not the correct way to do this, and it doesn't use JSON:
#!/usr/local/bin/python3.7
import cgi  # CGI module

# Set page headers and start page
print("Content-type: text/plain")
print("X-Content-Type-Options: nosniff\x0d\x0a\x0d\x0a")

# Form defaults
query_arg = None

# Get form values, if any
form = cgi.FieldStorage()

# Get the rest of the values if the form was submitted
if "query" in form.keys():
    query_arg = form["query"].value

if query_arg == "who":
    print("Steve", end="", flush=True)
You are trying to build a request handler with core Python code that can handle HTTP requests. In my opinion that's not a good idea: there are multiple security scenarios attached to it, and cross-server requests are difficult to handle correctly on your own. I suggest using Flask, which is very lightweight and gives you a ready-made routing mechanism to handle all kinds of requests in very little code. Below is a snippet that generates an HTTP JSON response; hope it helps.
import sys
import flask
import random, string
from flask import jsonify

class Utils:
    def make_response(self):
        response = jsonify({
            'message': 'success',
        })
        response.status_code = 200
        return response
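Applied to the original ?query=who URL, a minimal self-contained Flask app might look like the sketch below. The route path and port are assumptions for illustration, not part of the original answer:

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route('/api.py')
def api():
    # read the ?query=... parameter from the URL
    query_arg = request.args.get('query')
    if query_arg == 'who':
        # jsonify sets the Content-Type: application/json header for us
        return jsonify({'name': 'Steve'})
    return jsonify({'error': 'unknown query'}), 400

if __name__ == '__main__':
    app.run(port=5000)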
I am trying to update an already saved form on a system using HTTP requests. Due to the server configuration for the third party app we use, updating by POST requires sending a fully filled out payload every single time.
I want to get around this by recovering the form data already present on the server and converting it into a dictionary, then changing any values I need and reposting to make the changes server side.
The application we use sends a POST request when the save button is clicked for a particular form.
Here I send a post request with no payload.
[This simulates pressing the save button and is also the point where dev tools shows me the payload I want to capture.]
post_test = self.session.post(url_to_retrieve_from)
I thought that I should now be able to print the output, which should resemble what Google Dev Tools captures under Form Data.
print(post_test.text)
This just gives me the HTML of the web page.
If Dev Tools can get this from the server then I should also be able to?
Example of the data I am trying to get via requests: the Form Data panel shown in dev tools.
If Dev Tools can get this from the server then I should also be able to?
Yes, of course. In requests you pass form data with the data keyword:
import requests
url = 'http://www.example.com'
data = {
'name': 'value',
}
response = requests.post(url, data=data)
You can get the data you sent with a request from the response in this way:
import requests
response = requests.post('http://your_url', data=data) # send request
body = response.request.body
parsed_data = dict(data.split('=') for data in body.split('&')) # parse request body
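Splitting on '=' and '&' works for a simple payload like this one, but it breaks once values are percent-encoded or repeated; the standard library's urllib.parse.parse_qs handles those cases, as a small alternative sketch:

from urllib.parse import parse_qs

# values come back as lists, e.g. {'name': ['value']}
parsed_data = parse_qs(body)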
You can find more information about the data argument in the requests documentation.
In the documentation, in the class requests.Response we can find the attribute:
request = None
The PreparedRequest object to which this is a response.
In the requests.PreparedRequest class we can read:
body = None
request body to send to the server.
I have a Flask based app running which had a path responding to a POST command. The incoming data was json so I used the get_json() method to parse the data.
I have now changed the server to run nginx and uWSGI because I now use SSL. All paths in the app work (GET), but the POST-based path no longer sees the incoming data as JSON and fails. The data is visible in request.data, but the get_json() method fails.
@school_app.route('/school/queries', methods=['POST'])
def school_queries():
    req = request.get_json(silent=True, force=True)
    command_name = req["result"]["parameters"]["command-name"]
I have also tried to push the request.data through json.loads but this fails as well.
req = json.loads(request.data)
I'm assuming that the server changes have impacted the data, but I can't see why it can no longer be parsed as JSON.
So, the following code works...
data = request.data
req = json.loads(data)
command_name = req["result"]["parameters"]["command-name"]
Still not sure why the get_json method stopped working after the switch to SSL/nginx/uwsgi but at least it works.
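For reference, a defensive version of the view that tries Flask's parser first and keeps the raw-body workaround as a fallback (a sketch based on the code above, not a fix for the underlying cause):

import json
from flask import request

@school_app.route('/school/queries', methods=['POST'])
def school_queries():
    # try Flask's parser first, then fall back to parsing the raw body
    req = request.get_json(silent=True)
    if req is None:
        req = json.loads(request.data)
    command_name = req["result"]["parameters"]["command-name"]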