I am trying to intercept and modify a GraphQL response's body. Here is my addon code:
from mitmproxy import ctx
from mitmproxy import http
import json


def response(flow: http.HTTPFlow) -> None:
    if flow.request.pretty_url == "https://my.graphql/endpoint":
        request_data = json.loads(flow.request.get_text())
        if request_data["operationName"] == "MyOperationName":
            data = json.loads(flow.response.get_text())
            data["data"]["product"]["name"] = "New Name"
            flow.response.text = json.dumps(data)
I can see the modified response in the mitmproxy console, but the iOS simulator I am using is still getting the original response. Does anyone know how I can pass the modified response on to the device?
From the documentation
def response(self, flow: mitmproxy.http.HTTPFlow):
    """
    The full HTTP response has been read.
    Note: If response streaming is active, this event fires after the entire body has been streamed.
    HTTP trailers, if present, have not been transmitted to the client yet and can still be modified.
    """
    ctx.log(f"response: {flow=}")
It appears that you might be streaming the response body, in which case modifications to it are ignored.
Consider using the request event hook instead, or switching streaming off for the flows you want to modify, as in the sketch below.
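If streaming turns out to be the cause, one option is to opt those flows out of streaming in the responseheaders hook, so the full body is buffered and edits made in the response hook reach the client. This is only a minimal sketch, assuming streaming was enabled elsewhere (e.g. via the stream_large_bodies option) and that this hook runs after whatever enabled it:

from mitmproxy import http


def responseheaders(flow: http.HTTPFlow) -> None:
    # Buffer the whole body for this endpoint instead of streaming it,
    # so edits made later in the response hook are actually forwarded.
    if flow.request.pretty_url == "https://my.graphql/endpoint":
        flow.response.stream = False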
Related
The request I am trying to intercept and modify is a GET request with only one parameter, which I try to modify:
from mitmproxy import http


def request(flow: http.HTTPFlow) -> None:
    if flow.request.pretty_url.startswith(BASE_URL):
        flow.request.url = BASE_URL.replace('abc', 'def')
The above shows what I am trying to do in a nutshell. But unfortunately, according to the docs,
this event fires after the entire body has been streamed.
In the end, I am not able to modify the request. Am I missing something here? Because if modifying requests is not possible, then what is the point of mitmproxy?
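For reference, the streaming caveat in the docs only applies when request streaming is actually enabled; with streaming off, rewriting the request in the request hook does work before anything is sent upstream. A minimal sketch (the host and parameter name here are made up for illustration):

from mitmproxy import http


def request(flow: http.HTTPFlow) -> None:
    # Rewrite the single query parameter before the request goes upstream.
    if flow.request.pretty_url.startswith("https://example.com/api"):
        flow.request.query["key"] = "def"  # hypothetical parameter name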
I'm updating some APIs that I have written using Python and FastAPI. In the current version I'm sending the info using query parameters, but now I want to send the data in the request body instead, and the code doesn't seem to work.
Below you can see a sample of the code on the server side, where FastAPI should return the text I'm sending (for this example only the "string_to_show" field, but in the real project there would be more fields). I know this is bare-bones code, but it is just a sample.
from fastapi import FastAPI, Path, Query
from pydantic import BaseModel
import uvicorn

app = FastAPI()


class req_body(BaseModel):
    string_to_show: str


@app.get("/test/")
def scraper_shield(payload: req_body):
    request_feedback = str(payload.string_to_show)
    return request_feedback


if __name__ == "__main__":
    uvicorn.run(app)
On the client side I'm using this code, which simply sends the payload to the server.
import requests
payload = {'string_to_show':'Python is great!'}
r = requests.get('http://127.0.0.1:8000/test/', params=payload)
print(r.text)
When sending the request, I should get the string "Python is great!" back, but instead I'm getting some errors; below are the client and server messages:
CLIENT: {"detail":[{"loc":["body"],"msg":"field required","type":"value_error.missing"}]}
SERVER: "GET /test/?string_to_show=Python+is+great%21 HTTP/1.1" 422 Unprocessable Entity
GET requests are not supposed to have a body.
https://dropbox.tech/developers/limitations-of-the-get-method-in-http
The POST method is meant for that.
If you really need richer parametrization with GET (though I would reconsider that), think about putting the values into custom header(s). A sketch of the POST variant is shown below.
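A minimal sketch of what the POST variant could look like, reusing the model and route from the question (server first, then the matching client call):

from fastapi import FastAPI
from pydantic import BaseModel
import uvicorn

app = FastAPI()


class req_body(BaseModel):
    string_to_show: str


@app.post("/test/")  # POST instead of GET, so a JSON body is expected
def scraper_shield(payload: req_body):
    return str(payload.string_to_show)


if __name__ == "__main__":
    uvicorn.run(app)

and on the client side:

import requests

payload = {'string_to_show': 'Python is great!'}
r = requests.post('http://127.0.0.1:8000/test/', json=payload)  # json= puts the dict into the body
print(r.text)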
You are sending the parameters as a query string when using
r = requests.get('http://127.0.0.1:8000/test/', params=payload). You have to instead use
r = requests.get('http://127.0.0.1:8000/test/', json=payload), which will create a request body and send it correctly. This is also what the errors tell you: the required body is missing.
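For completeness, here is what that client change looks like, keeping the GET route from the question and only swapping params= for json= (whether every proxy or server along the way accepts a GET request with a body is a separate concern):

import requests

payload = {'string_to_show': 'Python is great!'}
# json= serialises the dict into the request body and sets the
# Content-Type: application/json header that the Pydantic model needs.
r = requests.get('http://127.0.0.1:8000/test/', json=payload)
print(r.text)  # expected: "Python is great!"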
Good day,
I am currently trying to figure out a way to make non-blocking requests inside a simple mitmproxy script, but the documentation doesn't seem clear to me at first glance.
I think it's easiest if I show my current code and describe my issue below:
from copy import copy
from mitmproxy import http


def request(flow: http.HTTPFlow):
    headers = copy(flow.request.headers)
    headers.update({"Authorization": "<removed>", "Requested-URI": flow.request.pretty_url})
    req = http.HTTPRequest(
        first_line_format="origin_form",
        scheme=flow.request.scheme,
        port=443,
        path="/",
        http_version=flow.request.http_version,
        content=flow.request.content,
        host="my.api.xyz",
        headers=headers,
        method=flow.request.method
    )
    print(req.get_text())
    flow.response = http.HTTPResponse.make(
        200, req.content,
    )
Basically, I would like to intercept any HTTP(S) request made and fire a non-blocking request to an API endpoint at https://my.api.xyz/, which should take all the original headers and return a PNG screenshot of the originally requested URL.
However, the code above produces empty content and the print returns nothing either.
My issue seems to be related to: mtmproxy http get request in script and Resubmitting a request from a response in mitmproxy, but I still couldn't figure out a proper way of sending requests inside mitmproxy.
The following piece of code probably does what you are looking for:
from copy import copy
from mitmproxy import http
from mitmproxy import ctx
from mitmproxy.addons import clientplayback


def request(flow: http.HTTPFlow):
    ctx.log.info("Inside request")
    # Ignore flows generated by our own replays, otherwise we would loop forever.
    if hasattr(flow.request, 'is_custom'):
        return
    headers = copy(flow.request.headers)
    headers.update({"Authorization": "<removed>", "Requested-URI": flow.request.pretty_url})
    req = http.HTTPRequest(
        first_line_format="origin_form",
        scheme='http',
        port=8000,
        path="/",
        http_version=flow.request.http_version,
        content=flow.request.content,
        host="localhost",
        headers=headers,
        method=flow.request.method
    )
    req.is_custom = True
    # Replay a copy of the flow, carrying the new request, through the clientplayback addon.
    playback = ctx.master.addons.get('clientplayback')
    f = flow.copy()
    f.request = req
    playback.start_replay([f])
It uses the clientplayback addon to send out the request. When this new request is sent, it generates another request event, which would result in an infinite loop. That is the reason for the is_custom attribute I added to the request: if the request that generated this event is one we created ourselves, we don't want to create yet another request from it.
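Note that the exact API differs between mitmproxy versions; on more recent releases (roughly 7+), the same idea can be expressed with http.Request.make and the replay.client command. This is only a sketch under that assumption:

from mitmproxy import ctx, http


def request(flow: http.HTTPFlow):
    # Same guard idea as above: don't touch flows that are themselves replays.
    if flow.is_replay == "request":
        return

    f = flow.copy()
    f.request = http.Request.make(
        method=flow.request.method,
        url="http://localhost:8000/",
        content=flow.request.content or b"",
        headers=dict(flow.request.headers),
    )
    # Hand the copied flow to the client-replay machinery.
    ctx.master.commands.call("replay.client", [f])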
I am trying to update an already saved form on a system using HTTP requests. Due to the server configuration for the third party app we use, updating by POST requires sending a fully filled out payload every single time.
I want to get round this by recovering the form data already present on the server and converting it into a dictionary, changing any values I need, and re-posting to make the changes server-side.
The application we use sends a POST request when the save button is clicked for a particular form.
Here I send a post request with no payload.
[This simulates pressing the save button and is also the point where Dev Tools shows me the payload I want to capture.]
post_test = self.session.post(url_to_retrieve_from)
I thought that I should now be able to print the output, which should resemble what the Form Data section in Google Dev Tools captures.
print(post_test.text)
This just gives me the HTML of the webpage.
If Dev Tools can get this from the server then I should also be able to?
Example of the data I am trying to get via requests:
Form Data
If Dev Tools can get this from the server then I should also be able to?
Yes, of course. In requests you pass form data via the data keyword:
import requests

url = 'http://www.example.com'
data = {
    'name': 'value',
}
response = requests.post(url, data=data)
You can get the data you sent with a request from the response in this way:
import requests
response = requests.post('http://your_url', data=data) # send request
body = response.request.body
parsed_data = dict(data.split('=') for data in body.split('&')) # parse request body
Here you can find more information about the data argument.
In the documentation, in the class requests.Response we can find the attribute:
request = None
The PreparedRequest object to which this is a response.
In requests.PreparedRequest class we can read:
body = None
request body to send to the server.
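As a side note on the body-parsing snippet above (not from the original answer): the manual split does not URL-decode the values; the standard library's urllib.parse.parse_qsl handles that for you:

from urllib.parse import parse_qsl

body = 'name=va%20lue&flag=1'        # e.g. response.request.body from above
parsed_data = dict(parse_qsl(body))  # {'name': 'va lue', 'flag': '1'}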
I have a Flask-based app running with a route responding to POST requests. The incoming data was JSON, so I used the get_json() method to parse it.
I have now changed the server to run nginx and uwsgi, as I now use SSL. All GET routes in the app work, but the POST-based route no longer parses the incoming data and fails. The data is visible in request.data, but the get_json method fails.
@school_app.route('/school/queries', methods=['POST'])
def school_queries():
    req = request.get_json(silent=True, force=True)
    command_name = req["result"]["parameters"]["command-name"]
I have also tried to push the request.data through json.loads but this fails as well.
req = json.loads(request.data)
I'm assuming that the server changes have impacted the data, but I can't see why it can no longer be parsed as JSON.
So, the following code works...
data = request.data
req = json.loads(data)
command_name = req["result"]["parameters"]["command-name"]
Still not sure why the get_json method stopped working after the switch to SSL/nginx/uwsgi, but at least it works.
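Not part of the original thread, but for anyone hitting the same symptom, here is a hedged diagnostic sketch (the route name is taken from the question; the logging and fallback are additions) that logs what actually arrives behind nginx/uwsgi and falls back to manual parsing when get_json() gives up:

import json
from flask import request, jsonify


@school_app.route('/school/queries', methods=['POST'])
def school_queries():
    # Log what the proxy forwards; a missing or unexpected Content-Type
    # is a common reason get_json() behaves differently behind nginx/uwsgi.
    school_app.logger.info("Content-Type: %r, length: %s",
                           request.content_type, request.content_length)

    req = request.get_json(silent=True, force=True)
    if req is None:
        # Fall back to the raw body, as in the workaround above.
        req = json.loads(request.data)

    command_name = req["result"]["parameters"]["command-name"]
    return jsonify({"command": command_name})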