I am using the Python requests library for a GET request.
The request goes through a proxy, and I want to set a proxy header (X-AUTH) on the request to the proxy.
I could not find a way to set a proxy header in requests. I want something similar to curl's --proxy-header option. Setting it in the normal headers does not seem to work.
import requests

proxies = {
    "http": "http://myproxy:8000",
    "https": "http://myproxy:8000",
}
r = requests.get("https://abc.xyz.com/some/endpoint",
                 proxies=proxies, headers={"X-AUTH": "mysecret"})
print(r.text)
I have changed the endpoints for privacy.
The equivalent curl call works when --proxy-header is used.
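One possible workaround, sketched below: requests does not expose a --proxy-header equivalent directly, but its HTTPAdapter builds the headers it sends to the proxy in its proxy_headers() method, and for HTTPS URLs those headers are attached to the CONNECT request. Subclassing the adapter should let you inject X-AUTH (proxy URL and header value taken from the question; this is an assumption about your setup, not an official requests feature):

```python
import requests
from requests.adapters import HTTPAdapter

class ProxyHeaderAdapter(HTTPAdapter):
    """Add a custom header to the headers sent to the proxy
    (on the CONNECT request for HTTPS URLs)."""
    def proxy_headers(self, proxy):
        headers = super().proxy_headers(proxy)  # keeps Proxy-Authorization, if any
        headers["X-AUTH"] = "mysecret"
        return headers

proxies = {
    "http": "http://myproxy:8000",
    "https": "http://myproxy:8000",
}

s = requests.Session()
s.mount("http://", ProxyHeaderAdapter())
s.mount("https://", ProxyHeaderAdapter())
# r = s.get("https://abc.xyz.com/some/endpoint", proxies=proxies)
```

The mounted adapter applies to every request the session makes to matching URLs, so the extra proxy header is set in one place.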
I used Selenium to download special reports from a web page that requires login. The page uses an integrated OKTA authentication plugin. I figured it would be better and more efficient to use the internal API directly, so I tried to use the requests library with a session, but without success. I tried this code, but it ends with a 400 error.
import requests

payload = {
    "password": "password",
    "username": "username",
    "options": {"warnBeforePasswordExpired": True, "multiOptionalFactorEnroll": True},
}

with requests.Session() as s:
    p = s.post('https://sso.johndeere.com/api/v1/authn', data=payload)
    r = s.get("requested_url")
    print(p)
I am unable to get through the auth. Does anybody have experience with the OKTA auth plugin and the requests library?
Thanks and best regards
Merry Christmas and welcome to Stack Overflow!
Firstly, an HTTP 400 error means something is wrong on the client side of the request. You can learn more about it here.
You seem to be missing some important header configuration. You need to set the Content-Type header correctly, otherwise the destination server won't be able to process your data.
Also, as a bonus point: you need to format your payload into a valid JSON string before sending the request.
import requests
import json

# Set up proper headers
headers = {
    "accept": "application/json, text/plain, */*",
    "content-type": "application/json; charset=UTF-8"
}

# Your body data here
payload = {
    "password": "password",
    "username": "username",
    "options": {"warnBeforePasswordExpired": True, "multiOptionalFactorEnroll": True},
}
payload_json = json.dumps(payload)  # Serialize it into a valid JSON string

with requests.Session() as s:
    p = s.post('https://sso.johndeere.com/api/v1/authn', headers=headers, data=payload_json)
    r = s.get("requested_url")
    print(p.content)
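A side note, assuming requests 2.4.2 or later: instead of calling json.dumps yourself and setting Content-Type manually, you can pass the dict via the json= keyword and requests does both for you. You can verify this offline with a prepared request:

```python
import requests

payload = {"username": "username", "password": "password",
           "options": {"warnBeforePasswordExpired": True,
                       "multiOptionalFactorEnroll": True}}

# json= serializes the body and sets the Content-Type header for you
req = requests.Request('POST', 'https://sso.johndeere.com/api/v1/authn',
                       json=payload).prepare()
print(req.headers['Content-Type'])  # application/json
```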
Requests isn't using the proxies I pass to it. The site at the URL I'm using shows which IP the request came from, and it's always my IP, not the proxy IP. I'm getting my proxy IPs from sslproxies.org, which are supposed to be anonymous.
url = 'http://www.lagado.com/proxy-test'
proxies = {'http': 'x.x.x.x:xxxx'}
headers = {'User-Agent': 'Mozilla...etc'}
res = requests.get(url, proxies=proxies, headers=headers)
Are there certain headers that need to be used or something else that needs to be configured so that my ip is hidden from the server?
The docs state that proxy URLs must include the scheme, where a proxy URL has the form scheme://hostname. So you should add 'http://' or 'socks5://' to your proxy URL, depending on the protocol you're using.
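Applied to the snippet above, the proxies dict would look like this (IP and port placeholders kept from the question):

```python
proxies = {
    'http': 'http://x.x.x.x:xxxx',    # scheme added
    'https': 'http://x.x.x.x:xxxx',   # also covers https:// URLs
    # or 'socks5://x.x.x.x:xxxx' for a SOCKS proxy (requires requests[socks])
}
```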
I am trying to write a Python script that will make a request to a desktop application listening on port 8080. Below is the code that I use to make the request.
import requests

payload = {"url": "abcdefghiklmnopqrstuvwxyz=",
           "password": "qertyuioplkjhgfdsazxvnm=",
           "token": "abcdefghijklmn1254786=="}
headers = {'Content-Type': 'application/json'}

r = requests.post('http://localhost:9015/login', params=payload, headers=headers)
response = requests.get("http://localhost:9015/login")
print(r.status_code)
After making the request, I get a response code of 401.
However, when I try the same using the Postman app, I get a successful response. The following are the details I give in Postman:
URL: http://localhost:9015/login
METHOD : POST
Headers: Content-Type:application/json
Body: {"url":"abcdefghiklmnopqrstuvwxyz=",
"password":"qertyuioplkjhgfdsazxvnm=",
"token":"abcdefghijklmn1254786=="}
Can I get some suggestions on where I am going wrong with my python script?
You pass params when you should pass data; or, even better, json, which sets the Content-Type header automatically. So it should be:
import json
r = requests.post('http://localhost:9015/login', data=json.dumps(payload), headers=headers)
or
r = requests.post('http://localhost:9015/login', json=payload)
(params adds key-value pairs to the query parameters in the URL)
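You can see the difference without hitting the server by preparing the requests offline (the token value here is made up for illustration):

```python
import requests

payload = {"token": "abc"}

# params= puts the pairs into the query string...
with_params = requests.Request('POST', 'http://localhost:9015/login',
                               params=payload).prepare()
print(with_params.url)  # http://localhost:9015/login?token=abc

# ...while json= puts them in the body and leaves the URL alone
with_json = requests.Request('POST', 'http://localhost:9015/login',
                             json=payload).prepare()
print(with_json.url)    # http://localhost:9015/login
```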
I'm coding a little snippet to fetch data from a web page, and I'm currently behind a HTTP/HTTPS proxy. The requests are created like this:
headers = {
    'Proxy-Connection': 'Keep-Alive',
    'Connection': None,
    'User-Agent': 'curl/1.2.3',
}
r = requests.get("https://www.google.es", headers=headers, proxies=proxyDict)
At first, neither HTTP nor HTTPS worked, and the proxy returned 403 after the request. It was also weird that I could make HTTP/HTTPS requests with curl, fetch packages with apt-get, and browse the web. Looking at Wireshark, I noticed some differences between the curl request and the Requests one. After setting User-Agent to a fake curl version, the proxy instantly let me make HTTP requests, so I assume the proxy filters requests by User-Agent.
So now I know why my code fails, and I can make HTTP requests, but the code keeps failing with HTTPS. I set the headers the same way as with HTTP, but looking at Wireshark, no headers are sent in the CONNECT message, so the proxy sees no User-Agent and returns an ACCESS DENIED response.
I think that if only I could send the headers with the CONNECT message, I could make HTTPS requests easily, but I'm racking my brain over how to tell Requests that I want to send those headers.
OK, so I found a way after looking at http.client. It's a bit lower level than using Requests, but at least it works.
import http.client

def HTTPSProxyRequest(method, host, url, proxy, headers=None, proxy_headers=None, port=443):
    # connect to the proxy, then tunnel to host:port; proxy_headers go on the CONNECT
    https = http.client.HTTPSConnection(proxy[0], proxy[1])
    https.set_tunnel(host, port, headers=proxy_headers)
    https.connect()
    https.request(method, url, headers=headers or {})
    response = https.getresponse()
    return response.read(), response.status

# calling the function
HTTPSProxyRequest('GET', 'google.com', '/index.html', ('myproxy.com', 8080))
Trying to send a simple get request via a proxy. I have the 'Proxy-Authorization' and 'Authorization' headers, don't think I needed the 'Authorization' header, but added it anyway.
import base64
import requests

URL = 'https://www.google.com'
sess = requests.Session()
user = 'someuser'
password = 'somepass'
token = base64.b64encode(('%s:%s' % (user, password)).encode()).strip().decode()
sess.headers.update({'Proxy-Authorization': 'Basic %s' % token})
sess.headers['Authorization'] = 'Basic %s' % token
resp = sess.get(URL)
I get the following error:
requests.packages.urllib3.exceptions.ProxyError: Cannot connect to proxy. Socket error: Tunnel connection failed: 407 Proxy Authentication Required.
However, when I change the URL to plain http://www.google.com, it works fine.
Do proxies use Basic, Digest, or some other sort of authentication for HTTPS? Is it proxy-server specific? How do I discover that info? I need to achieve this using the requests library.
UPDATE
It seems that with HTTP requests we have to pass in a Proxy-Authorization header, but with HTTPS requests we need to embed the username and password in the proxy URL:
# HTTP
import requests, base64

URL = 'http://www.google.com'
user = <username>
password = <password>
proxies = {'http': 'http://<IP>:<PORT>'}
token = base64.b64encode(('%s:%s' % (user, password)).encode()).strip().decode()
myheader = {'Proxy-Authorization': 'Basic %s' % token}
r = requests.get(URL, proxies=proxies, headers=myheader)
print(r.status_code)  # 200
# HTTPS
import requests

URL = 'https://www.google.com'
user = <username>
password = <password>
proxies = {'https': 'http://<user>:<password>@<IP>:<PORT>'}
r = requests.get(URL, proxies=proxies)
print(r.status_code)  # 200
When sending an HTTP request, if I leave out the header and pass in a proxy URL formatted with user/pass, I get a 407 response.
When sending an HTTPS request, if I pass in the header and leave the proxy URL unformatted, I get the ProxyError mentioned earlier.
I am using requests 2.0.0 and a Squid proxy-caching web server. Why doesn't the header option work for HTTPS? Why does the formatted proxy URL not work for HTTP?
The answer is that the HTTP case is bugged. The expected behaviour in that case is the same as the HTTPS case: that is, you provide your authentication credentials in the proxy URL.
The reason the header option doesn't work for HTTPS is that HTTPS via proxies is totally different to HTTP via proxies. When you route a HTTP request via a proxy, you essentially just send a standard HTTP request to the proxy with a path that indicates a totally different host, like this:
GET http://www.google.com/ HTTP/1.1
Host: www.google.com
The proxy then basically forwards this on.
For HTTPS that can't possibly work, because you need to negotiate an SSL connection with the remote server. Rather than doing anything like the HTTP case, you use the CONNECT verb. The proxy server connects to the remote end on behalf of the client, and from then on just proxies the TCP data. (More information here.)
When you attach a Proxy-Authorization header to the HTTPS request, we don't put it on the CONNECT message, we put it on the tunnelled HTTPS message. This means the proxy never sees it, so refuses your connection. We special-case the authentication information in the proxy URL to make sure it attaches the header correctly to the CONNECT message.
Requests and urllib3 are currently in discussion about the right place for this bug fix to go. The GitHub issue is currently here. I expect that the fix will be in the next Requests release.
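You can see that special-casing without a network connection: HTTPAdapter.proxy_headers() turns the userinfo embedded in the proxy URL into the Proxy-Authorization header that gets attached to the CONNECT message (the proxy host and credentials below are made up for illustration):

```python
from requests.adapters import HTTPAdapter

# userinfo in the proxy URL becomes a Basic Proxy-Authorization header
headers = HTTPAdapter().proxy_headers('http://someuser:somepass@myproxy.com:8080')
print(headers)  # {'Proxy-Authorization': 'Basic c29tZXVzZXI6c29tZXBhc3M='}
```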