With treq/twisted, requests received out of order - python

I am new to Twisted and started using treq because of its similarity to Requests (very easy to use basic authentication, etc.). I have an HTTPConnectionPool with maxPersistentPerHost=1 and persistence=True, and I send 4 requests in a row to a host: treq.put(1), treq.get(1), treq.put(2) and treq.get(1). However, the Apache server running on the host receives the requests out of order (checked /var/log/apache2/access.log). Using netstat I saw 4 connections to the host; I was expecting only 1 connection, with all requests going over that connection in order. Any idea what I am doing wrong or missing?
Thanks!
Reshad.
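For reference, here is a rough sketch of the setup described above, assuming the pool is handed to treq explicitly through its HTTPClient (the URLs and payloads are placeholders):
from twisted.internet import defer, reactor
from twisted.web.client import Agent, HTTPConnectionPool
from treq.client import HTTPClient

def main():
    pool = HTTPConnectionPool(reactor, persistent=True)
    pool.maxPersistentPerHost = 1  # at most one cached connection per host
    client = HTTPClient(Agent(reactor, pool=pool))
    # Each call returns a Deferred immediately, so all four requests are
    # issued concurrently rather than strictly one after another.
    requests = [
        client.put("http://example.com/items/1", data=b"one"),
        client.get("http://example.com/items/1"),
        client.put("http://example.com/items/2", data=b"two"),
        client.get("http://example.com/items/1"),
    ]
    defer.DeferredList(requests).addBoth(lambda _: reactor.stop())

reactor.callWhenRunning(main)
reactor.run()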

Related

Python requests being fingerprinted?

I'm hacking together an Amazon API, and when using python requests without a proxy, it prompts for a captcha. When routing the same python requests traffic through Fiddler, it seems to pass without a problem. Is it possible that Amazon is fingerprinting python requests and Fiddler changes the fingerprint since it's a proxy?
I viewed the headers sent from Fiddler and from python requests and they are the same.
There are no extra proxying/Fiddler rules/filters set in Fiddler that would create a change.
To be clear, all the proxying mentioned is done locally, so it does not change the public IP address.
Thank you!
The reason is that websites fingerprint your requests using the TLS Client Hello packet. There are libraries like JA3 that generate a fingerprint for each request, and sites will intentionally block HTTP clients like requests or urllib. If you use a MITM proxy, the proxy creates a new TLS connection with the server, so the server only sees the proxy's fingerprint and does not block it.
If the server only blocks certain popular HTTP libraries, you can simply change the TLS version, and you will have a different fingerprint than the default one.
If the server only allows popular real-world browsers and accepts only those as valid requests, you will need a library that can simulate browser fingerprints, one of which is curl-impersonate and its Python binding curl_cffi.
pip install curl_cffi
from curl_cffi import requests
# Notice the impersonate parameter
r = requests.get("https://tls.browserleaks.com/json", impersonate="chrome101")
print(r.json())
# output: {'ja3_hash': '53ff64ddf993ca882b70e1c82af5da49'
# the fingerprint should be the same as target browser

Creating a Charles proxy alternative using Python

I am using Charles proxy right now to monitor traffic between my devices and a website. The traffic is SSL and I am able to read it in Charles. The issue is that Charles makes the content hard to read when I am filtering through hundreds of variables in a JSON object. I created a program that filters the JSON after exporting the Charles log. My next step is to get rid of Charles completely and create my own proxy in Python that can view HTTP and HTTPS data. I was wondering whether Scapy or any other existing library would work for this? I am interested in Scapy because I could save the proxy log as a pcap file.
Reading through mitmproxy would be overwhelming since it's a huge code base. If you would like to implement the proxy server from scratch, here is what I learned while developing Proxyman:
Learn how to set up a tiny proxy server: basically, open a listening socket on your port (9090, for example), accept any incoming request, and read the first line of the HTTP message. This can be done with a lightweight http-parser or any Python parser. The raw HTTP message looks like:
CONNECT google.com:443 HTTP/1.1
Parse out the host (google.com) and resolve its IP, open a socket connection to the destination, and start relaying data back and forth between the client <-> the destination server (a rough sketch of this tunnel appears below).
This first step is enough to implement a plain HTTP proxy: use http-parser to parse the rest of the HTTP message, so you can get the headers and body from the request/response and present them in the UI.
Learn how HTTPS and SSL work: use OpenSSL to generate a self-signed certificate, and learn how to generate the chain certificates too.
Learn how to import those certificates into the macOS keychain using the security CLI or the Security framework from Apple.
When that's done, it's time to start the HTTPS interception: repeat the second step, but perform an SSL handshake with the appropriate certificate on both sides (client -> your proxy server, and your proxy server -> destination).
Then parse the HTTP message as usual and get the rest of the message.
Overall, there are a lot of open-source projects out there, but I suggest starting from a simple version before moving on.
Hope that helps.
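For the first two steps, here is a minimal sketch of that tunnel idea (assumptions: plain sockets and threads, CONNECT/HTTPS tunnelling only, no TLS interception yet, and a hypothetical local port 9090):
import socket
import threading

LISTEN_PORT = 9090  # hypothetical local port

def tunnel(src, dst):
    # Copy bytes from src to dst until either side closes the connection.
    try:
        while True:
            data = src.recv(4096)
            if not data:
                break
            dst.sendall(data)
    except OSError:
        pass
    finally:
        src.close()
        dst.close()

def handle_client(client):
    # Read the first chunk and look at the first line of the HTTP message,
    # e.g. "CONNECT google.com:443 HTTP/1.1". Simplified: assumes the whole
    # request head arrives in one recv and ignores non-CONNECT requests.
    head = client.recv(4096).decode("latin-1")
    first_line = head.split("\r\n", 1)[0]
    parts = first_line.split(" ")
    if len(parts) != 3 or parts[0] != "CONNECT":
        client.close()
        return
    host, _, port = parts[1].partition(":")
    upstream = socket.create_connection((host, int(port or 443)))
    client.sendall(b"HTTP/1.1 200 Connection Established\r\n\r\n")
    # Relay bytes in both directions between the client and the destination.
    threading.Thread(target=tunnel, args=(client, upstream), daemon=True).start()
    threading.Thread(target=tunnel, args=(upstream, client), daemon=True).start()

def main():
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("127.0.0.1", LISTEN_PORT))
    server.listen(50)
    while True:
        conn, _ = server.accept()
        threading.Thread(target=handle_client, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    main()
To try it out, point a client at the proxy, for example: curl -x http://127.0.0.1:9090 https://example.com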

Sending GET request for an IP address [duplicate]

This question already has answers here: How to send http requests to flask server (2 answers). Closed 4 years ago.
I need to send a request to a web server of mine to start a stream. The web server is located at 0.0.0.0 (of course I can change the address).
How can I send a "GET" request to that server?
I already tried using httplib or urllib2 or 3, and they don't seem to work with an IP address.
I know a local DNS server could map the address to a hostname, but having to set that up every time I want to run the code on a new network is not the goal.
Thanks a lot.
You could use requests:
import requests
requests.get('http://0.0.0.0')
or, even better, a session:
s = requests.Session()
s.get('http://0.0.0.0')
r = s.get('https://httpbin.org/cookies')
which keeps the connection persistent and is probably more like what you want.
See more about requests sessions at http://docs.python-requests.org/en/master/user/advanced/
Or you could just convert the IP address to a hostname using socket.gethostbyaddr(ip) and then use that hostname with urllib.
It probably doesn't work because you haven't included the protocol to use (i.e. http:// or https://). Try it like this:
import urllib2
urllib2.urlopen('http://0.0.0.0')

How to access parse.com REST API after migrating to local server?

I had been using Parse as my project's server, and now that I have migrated the data to a local server and successfully got the dashboard working, I cannot find a way to access the api.parse.com/1/ API.
I used to use Python to make REST requests, which basically establishes a socket connection to api.parse.com on port 443. Now I am trying to connect to localhost on port 1337, which is where the parse-server instance is running. However, I have not been able to access the API the same way as before.
One thing to note is that I can successfully curl to get a basic JSON response from requests like
curl -X GET -H ... http://localhost:1337/parse/classes/_User
The question is: what connection now replaces api.parse.com for locally hosted parse-server instances?
Ah, I found the answer:
You make the socket connection to the address and port where parse-server is running, and then instead of doing
conn.request("GET", "/1/login?%s"%...)
you do:
conn.request("GET", "/parse/login?%s"%...)
Hope this helps.
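For completeness, a rough sketch of such a request against a local parse-server (assuming the default /parse mount point on port 1337 and a hypothetical application id; depending on your configuration, additional headers such as a REST API key may also be required):
import http.client
import json
import urllib.parse

conn = http.client.HTTPConnection("localhost", 1337)
params = urllib.parse.urlencode({"username": "alice", "password": "secret"})  # hypothetical credentials
conn.request(
    "GET",
    "/parse/login?%s" % params,
    headers={"X-Parse-Application-Id": "myAppId"},  # hypothetical app id
)
response = conn.getresponse()
print(response.status, json.loads(response.read()))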

Connecting to socket with authentication in python

I'm trying to connect to a mongodb instance through a python socket. The url looks like this
username:password#host.com:port
How can I connect to this with a Python socket?
The following code gives me this error: [Errno -5] No address associated with hostname
import socket
from tornado import iostream
full_url = '%s:%s#%s' % (username, password, host)
s = socket.socket()
s.connect((full_url, port))
stream = iostream.IOStream(s)
EDIT - the reason I ask is that asyncmongo doesn't support this type of URL right now. I'm trying to see if I can write a patch. The asyncmongo library connects using a socket like the one in the code above.
You should use a driver to connect to mongodb. If you are using Tornado (it looks like you intend to do so), try asyncmongo; if you are using a threaded web server/application framework (Django, Pylons, etc) you can use PyMongo directly.
Edit: As for why this code doesn't work: the socket module doesn't accept URLs for connections, just a hostname and a port. It is a low-level library. To connect to (web) URLs, consider using urllib2 or httplib.
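For example, a minimal sketch of that point (hypothetical host and port; credentials are not part of the socket address at all):
import socket

# Connect with a (hostname, port) tuple only; no scheme, no credentials.
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect(("host.com", 27017))  # hypothetical MongoDB host and default port
s.close()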
Edit 2: Authentication in MongoDB is not handled at the transport level; it's handled at the application level. I suggest you first read Implementing Authentication in a Driver, and then take a look at how PyMongo implements authentication (in connection.py and database.py). You'll also need to port or reimplement the MongoDB connection URI parsing for asyncmongo, which is documented here.
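As an illustration of the URI handling mentioned above, a rough sketch using PyMongo's own parser (assuming PyMongo is installed; asyncmongo would need its own equivalent of this):
from pymongo import uri_parser

# Split a MongoDB connection URI into its parts (hypothetical credentials/host).
parsed = uri_parser.parse_uri("mongodb://username:password@host.com:27017/dbname")
print(parsed["nodelist"])  # [('host.com', 27017)]
print(parsed["username"], parsed["database"])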
