How would I use Scapy 2.4.3+ to spoof an HTTP response? I tried the packet below against a target machine that has been ARP spoofed. The attacker's machine does not have port forwarding enabled, so the target's HTTP request has to be answered by my spoofed packet, which is what I am trying to do. However, once the packet is sent with scapy.send(packet), the HTML it carries never gets rendered on the target. How would I adapt the packet below so that the HTTP response renders on the target machine?
packet = scapy.IP(src=server_ip, dst=target_ip) / scapy.TCP() / HTTP() / HTTPResponse(Server=server_ip) / "<html><p>Hi</p></html>"
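One likely reason nothing renders: a bare scapy.TCP() defaults to sport=20, dport=80, seq=0 and a SYN flag, so the target's TCP stack discards the segment before the browser ever sees it. The spoofed response has to fit the target's established connection: sport=80, dport taken from the sniffed request, seq set to the request's ack, ack set to the request's seq plus its payload length, and flags="PA". The payload also needs to be a complete HTTP response, not bare HTML. A minimal sketch of building that payload in pure Python (independent of Scapy's HTTP layer; the helper name is my own):

```python
def build_http_response(body: str) -> bytes:
    """Build a complete HTTP/1.1 response, byte for byte, so the
    browser has the status line and Content-Length it needs to render."""
    payload = body.encode()
    headers = (
        "HTTP/1.1 200 OK\r\n"
        "Content-Type: text/html\r\n"
        f"Content-Length: {len(payload)}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )
    return headers.encode() + payload
```

The resulting bytes can then be carried in a Raw layer on top of an IP/TCP header whose ports, seq, and ack were copied from the sniffed request.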
I'm trying to do some network analysis and have difficulty discerning between HTTP/1.1 and HTTP/2 traffic.
For starters, HTTP/1.1 is pretty straightforward (TCP port 80 + I can check the HTTP header to get the version since it's unencrypted). For HTTP/3 I can check if it's UDP port 443 and if the payload is QUIC.
The problem arises when I try to discern between HTTP/1.1 and HTTP/2 on TCP port 443 (HTTPS), since it's encrypted I cannot check which version we're using...
EDIT: For TLS 1.1 we can assume it's HTTP/1.1, since HTTP/2 requires TLS 1.2 or newer. For TLS 1.2 we can check the ALPN field in the Server Hello response, but for TLS 1.3 I don't see any option, since the ALPN field is not present in TLS 1.3 Server Hello headers...
Any ideas on how to figure out the HTTPS version using Scapy, especially for TLS1.3?
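One detail that may help for TLS 1.3: only the server's chosen protocol moves into EncryptedExtensions; the ClientHello and its ALPN offer stay in plaintext (unless ECH is in use). So if the client offers only h2, the stream can be classified without the Server Hello. A hypothetical classifier combining the rules above (the function name and its inputs are my own invention, not a Scapy API):

```python
def guess_http_version(transport, port, tls_version=None, alpn=None):
    """Best-effort HTTP version guess from transport metadata.
    `alpn` is the negotiated (or sole offered) ALPN protocol, if known."""
    if transport == "udp" and port == 443:
        return "HTTP/3"        # caller is assumed to have confirmed QUIC
    if port == 80:
        return "HTTP/1.x"      # plaintext: read the version from the header
    if port == 443:
        if alpn == "h2":
            return "HTTP/2"
        if alpn == "http/1.1":
            return "HTTP/1.1"
        if tls_version in ("TLS1.0", "TLS1.1"):
            return "HTTP/1.1"  # HTTP/2 requires TLS 1.2+
    return "unknown"
```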
I am attempting to send a GET request to a server to update particular JSON values. I have been using these similar posts to do so:
Python-Scapy or the like-How can I create an HTTP GET request at the packet level
Filter HTTP Get requests packets using scapy
How can I alter the JSON object and override the values highlighted in the image that are being sent?
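Without the image it is unclear exactly which fields are being changed, but the general pattern is to decode the JSON body, override the chosen keys, and re-encode it before re-injecting it into the packet. A hypothetical helper (override_json_values is my own name, not from the linked posts):

```python
import json

def override_json_values(payload: bytes, overrides: dict) -> bytes:
    """Decode a JSON body, override selected top-level keys, re-encode.
    When putting the result back into an HTTP message, remember to
    recompute Content-Length to match the new body."""
    data = json.loads(payload)
    data.update(overrides)
    return json.dumps(data).encode()
```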
I'm using Python requests to play with a REST API. The response format is JSON, and let's assume the server always sends correct data. Given that HTTP uses TCP for transmission, do I still have to check for the existence of a required key if requests throws no exception?
For TCP transmissions, you don't need to verify the response if you assume that the server always sends correct data:
TCP provides reliable, ordered, and error-checked delivery of a stream of octets between applications running on hosts communicating by an IP network.
Source: Wikipedia
Of course, it's always a good idea to add some error handling and verification to your code just in case the server doesn't send what you'd expect.
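To illustrate that "just in case" verification: TCP guarantees the bytes arrive intact and in order, not that the server put the expected keys in them. A small sketch of a defensive parse (extract_required and its default key names are illustrative, not part of requests):

```python
import json

def extract_required(raw: str, required=("id", "name")):
    """Parse a JSON response body and fail loudly if a required key
    is missing -- transport-level reliability says nothing about the
    shape of the data the server chose to send."""
    data = json.loads(raw)
    missing = [key for key in required if key not in data]
    if missing:
        raise KeyError(f"response missing keys: {missing}")
    return data
```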
I need to intercept an HTTP Response packet from the server and replace it with my own response, or at least modify that response, before it arrives to my browser.
I'm already able to sniff this response and print it, the problem is with manipulating/replacing it.
Is there a way to do so with the Scapy library?
Or do I have to connect my browser through a proxy to manipulate the response?
If you want to work from your ordinary browser, then you need a proxy between the browser and the server in order to manipulate the traffic. See e.g. https://portswigger.net/burp/, a proxy built specifically for penetration testing with easy replacement of responses/requests (and it is scriptable, too).
If you want to script the whole session in Scapy, then you can create requests and responses to your liking, but the response does not go to the browser. Alternatively, you can record an ordinary web session (with tcpdump/Wireshark/Scapy) into a pcap, then use Scapy to read the pcap, modify it, and send similar requests to the server.
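As a scriptable middle ground between Burp and raw Scapy, mitmproxy lets you write the rewrite rule in Python. A sketch of an addon along those lines (the response hook name and the flow.response attributes follow mitmproxy's addon API; the rewrite itself is illustrative -- save as e.g. rewrite.py and run with mitmdump -s rewrite.py):

```python
def response(flow):
    """mitmproxy calls this once per server response, before the
    response is forwarded to the browser; mutate it in place."""
    if "text/html" in flow.response.headers.get("content-type", ""):
        flow.response.text = "<html><p>intercepted</p></html>"
```

The browser must be configured to use the proxy (and trust its CA certificate for HTTPS), after which every matching response is rewritten transparently.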
I am attempting to work with large packet captures from Wireshark that have been output in PDML format. These captures are then loaded into Python using the lxml library to traverse over them. The issue I am having is that I can pull out information regarding a single HTTP response packet, and then I need a way to associate this with its HTTP request packet.
The current solution I was thinking of implementing is to search for an HTTP request packet that is part of the same TCP stream as the response, however this seems like an inefficient solution to the problem, having to continually separate out TCP streams and then search through them for the request packet.
Is there a simple way to associate response packets with requests that I am missing?
The best solution I have come up with so far is to use XPath, under the assumption that each TCP connection contains only one request/response pair.
# Get the stream index from the packet
streamIndex = packet.xpath('proto/field[@name="tcp.stream"]')[0].attrib['show']
# Use that stream index to find the matching request packet
return packet.xpath('/pdml/packet[proto/field[@name="tcp.stream" and @show="' + streamIndex + '"] and proto/field[@name="http.request.full_uri"]]')[0]
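To avoid re-running a full-document XPath search for every response, one pass over the capture can index each request packet by its tcp.stream value, after which pairing each response is a dictionary lookup. A sketch using the standard library's ElementTree (lxml's xpath would work the same way; index_requests_by_stream is a made-up name):

```python
import xml.etree.ElementTree as ET

def index_requests_by_stream(pdml_root):
    """One pass over the PDML tree: map tcp.stream index -> the HTTP
    request packet on that stream, assuming one request per stream."""
    requests = {}
    for packet in pdml_root.iter("packet"):
        stream = packet.find(".//field[@name='tcp.stream']")
        if stream is None:
            continue
        if packet.find(".//field[@name='http.request.full_uri']") is not None:
            requests[stream.get("show")] = packet
    return requests
```

A response packet is then paired by looking up its own tcp.stream value in the returned dict.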