I have a squid proxy that requires authentication. In squid.conf I am using:
auth_param digest program /usr/lib64/squid/digest_pw_auth -c /etc/squid/passwords
auth_param digest realm proxy
acl authenticated proxy_auth REQUIRED
http_access allow authenticated
From this I expect the authentication method to be HTTP Digest.
Here is my python code:
import requests
from requests.auth import HTTPDigestAuth

auth = HTTPDigestAuth("user", "pass")
r = requests.get("http://www.google.com", allow_redirects=True,
                 headers=Configuration.HEADERS, proxies=proxy_list(), auth=auth)
I am receiving this error:
407 Proxy Authentication Required
I have also tried authenticating with:
auth = HTTPProxyAuth('user', 'password')
and:
http://user:password@ip
With no luck...
Can anybody help?
Thanks
HTTPDigestAuth doesn't authenticate you with the proxy, it authenticates you with the website. Right now Requests doesn't have any built-in way of using Digest Auth with a proxy, and there are no plans to add built-in support.
You'll have to either use Basic Auth with the proxy (by putting your credentials in the proxy URL, e.g. proxies={'http': 'http://user:password@domain.com'}) or write your own authentication handler for Proxy Digest Auth.
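For example, a minimal sketch of the proxy-URL approach (the proxy host and port are placeholders); note this sends the credentials to the proxy as Basic auth, not Digest:
import requests

proxies = {
    'http': 'http://user:password@proxy.example.com:3128',   # hypothetical proxy
    'https': 'http://user:password@proxy.example.com:3128',
}
r = requests.get('http://www.google.com', proxies=proxies)
print(r.status_code)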
I am trying to implement server-side code to authenticate the client using a certificate and authorize based on the groups contained in the certificate.
The client side code goes like this:
import json
import requests
clientCrt = "cc.crt"
clientKey = "ck.key"
url = "https://example.com/api"
payload = { "someId": "myID" }
certServer = 'cs.crt'
headers = {'content-type': 'application/json'}
r = requests.post(url, data=json.dumps(payload), verify=certServer,
                  headers=headers, cert=(clientCrt, clientKey))
print(r.status_code)
print(r.json())
I want to have a corresponding server-side implementation, specifically to check whether a request should be honoured based on clientCrt.
Can someone share how I can access clientCrt on the server side and extract the fields of the certificate?
Note: I am not looking for mutual TLS Auth, I am interested in Service Authentication and Authorization
Mutual TLS is not configured on a default WSGI serving connection object; this needs to be configured explicitly. See this page for more details: https://www.ajg.id.au/2018/01/01/mutual-tls-with-python-flask-and-werkzeug/. Once you have the connection object handy, you can use request.environ['peercert'].get_subject() to get the details of the client cert.
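A minimal sketch of that approach, assuming werkzeug has been patched as described in the linked article so the client certificate ends up in request.environ['peercert'] (the file names are placeholders):
import ssl
from flask import Flask, request

app = Flask(__name__)

@app.route('/api', methods=['POST'])
def api():
    cert = request.environ['peercert']   # populated by the patched handler
    subject = cert.get_subject()         # inspect e.g. CN/OU for authorization
    return str(subject)

if __name__ == '__main__':
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain('server.crt', 'server.key')
    ctx.load_verify_locations('ca.crt')   # CA that issued the client certs
    ctx.verify_mode = ssl.CERT_REQUIRED   # reject clients without a valid cert
    app.run(ssl_context=ctx)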
A better way to handle this is by delegating it to Gunicorn or an nginx proxy. See https://eugene.kovalev.systems/posts/flask-client-side-tls-authentication/ for more examples.
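For the nginx route, a sketch of the application side, assuming nginx verifies the client certificate and forwards it URL-escaped in an X-SSL-Client-Cert header (the header name is whatever you configure with proxy_set_header):
from urllib.parse import unquote

from cryptography import x509
from flask import Flask, abort, request

app = Flask(__name__)

@app.route('/api', methods=['POST'])
def api():
    escaped = request.headers.get('X-SSL-Client-Cert')  # assumed header name
    if not escaped:
        abort(403)  # nginx forwarded no verified certificate
    cert = x509.load_pem_x509_certificate(unquote(escaped).encode())
    return str(cert.subject)  # certificate fields for authorization decisions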
I've got a Windows server (Navision) offering web access to its APIs through Active Directory authentication.
I'm trying to make a request to the web server through Active Directory authentication from an external Linux-based host.
I successfully authenticated using the python-ldap library:
import ldap
import urllib2
DOMAINHOST='domain_ip_host'
USERNAME='administrator@mydomain'
PASSWORD='mycleanpassword'
URL='http://...'
conn = ldap.open(DOMAINHOST)
ldap.set_option(ldap.OPT_REFERRALS, 0)
try:
    print conn.simple_bind_s(USERNAME, PASSWORD)
except ldap.INVALID_CREDENTIALS:
    user_error_msg('wrong password provided')
The output in this case is:
(97, [], 1, [])
representing a successful authentication.
I'd need to reuse this successful authentication to communicate with the Navision web service, e.g. by using the urllib2 library.
req = urllib2.Request(URL)
res = urllib2.urlopen(req)
Of course, since no authentication is carried over, the request fails with a 401 Unauthorized error.
I also tried to use the python-ntlm library:
from ntlm import HTTPNtlmAuthHandler

passman = urllib2.HTTPPasswordMgrWithDefaultRealm()
passman.add_password(None, URL, USERNAME, PASSWORD)
# create the NTLM authentication handler
auth_NTLM = HTTPNtlmAuthHandler.HTTPNtlmAuthHandler(passman)
# other authentication handlers
auth_basic = urllib2.HTTPBasicAuthHandler(passman)
auth_digest = urllib2.HTTPDigestAuthHandler(passman)
# disable proxies (if you want to stay within the corporate network)
proxy_handler = urllib2.ProxyHandler({})
# create and install the opener
opener = urllib2.build_opener(proxy_handler, auth_NTLM, auth_digest, auth_basic)
urllib2.install_opener(opener)
# retrieve the result
response = urllib2.urlopen(URL)
print(response.read())
Also in this case, a 401 Unauthorized error is provided.
How can I successfully make a web request by authenticating the user against Active Directory?
If it's a Dynamics NAV web service you want to call (I couldn't tell from the code, but the tag suggests so), you have to activate NTLM on your NST.
Just change the property 'ServicesUseNTLMAuthentication' from False to True in your CustomSettings.config, or use the Microsoft Dynamics NAV Administration MMC. Don't forget to restart the service after changing it.
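Once NTLM is activated, the urllib2/python-ntlm code above should start working. Alternatively, a minimal sketch using the requests_ntlm package (the endpoint URL is a placeholder):
import requests
from requests_ntlm import HttpNtlmAuth

URL = 'http://navserver:7047/DynamicsNAV/WS/SystemService'  # hypothetical NAV endpoint
r = requests.get(URL, auth=HttpNtlmAuth('mydomain\\administrator', 'mycleanpassword'))
print(r.status_code)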
How can I use automatic NTLM authentication from python on Windows?
I want to be able to access the TFS REST API from windows without hardcoding my password, the same as I do from the web browser (firefox's network.automatic-ntlm-auth.trusted-uris, for example).
I found this answer which works great for me because:
I'm only going to run it from Windows, so portability isn't a problem
The response is a simple json document, so no need to store an open session
It's using the WinHTTP.WinHTTPRequest.5.1 COM object to handle authentication natively:
import win32com.client

URL = 'http://bigcorp/tfs/page.aspx'

# WinHTTP negotiates NTLM natively using the logged-on user's credentials
COM_OBJ = win32com.client.Dispatch('WinHTTP.WinHTTPRequest.5.1')
COM_OBJ.SetAutoLogonPolicy(0)    # 0 = always send credentials automatically
COM_OBJ.Open('GET', URL, False)  # False = synchronous request
COM_OBJ.Send()
print(COM_OBJ.ResponseText)
You can do that with https://github.com/requests/requests-kerberos. Under the hood it uses https://github.com/mongodb-labs/winkerberos. The latter is marked as beta and I'm not sure how stable it is, but I have had requests-kerberos in use for a while without any issues.
Maybe a more stable solution would be https://github.com/brandond/requests-negotiate-sspi, which uses pywin32's SSPI implementation.
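A minimal sketch with requests-kerberos (the TFS URL is a placeholder):
import requests
from requests_kerberos import HTTPKerberosAuth, OPTIONAL

# OPTIONAL relaxes mutual authentication, which some setups cannot complete
r = requests.get('http://bigcorp/tfs/page.aspx',
                 auth=HTTPKerberosAuth(mutual_authentication=OPTIONAL))
print(r.status_code)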
I found a solution here: https://github.com/mullender/python-ntlm/issues/21
pip install requests
pip install requests_negotiate_sspi
import requests
from requests_negotiate_sspi import HttpNegotiateAuth

GetUrl = "http://servername/api/controller/Methodname"  # set your GET Web API URL here
response = requests.get(GetUrl, auth=HttpNegotiateAuth())
print("Get Request Output:")
print("--------------------")
print(response.content)
For requests over HTTPS:
import requests
from requests_negotiate_sspi import HttpNegotiateAuth
import urllib3

urllib3.disable_warnings()  # silence the warning triggered by verify=False
GetUrl = "https://servername/api/controller/Methodname"  # set your GET Web API URL here
response = requests.get(GetUrl, auth=HttpNegotiateAuth(), verify=False)
print("Get Request Output:")
print("--------------------")
print(response.content)
NTLM credentials are based on data obtained during the interactive logon process and include a one-way hash of the password, so you have to provide the credentials.
Python has the requests_ntlm library that allows for HTTP NTLM authentication.
You can reference this article on accessing the TFS REST API:
Python Script to Access Team Foundation Server (TFS) Rest API
If you are using TFS 2017 or VSTS, you can try using a Personal Access Token in a Basic Auth HTTP header along with your REST request.
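A minimal sketch of the PAT approach (the URL and token are placeholders); with Basic auth the username can be left empty and the token passed as the password:
import requests

pat = '<personal-access-token>'  # placeholder
r = requests.get('https://myserver/tfs/DefaultCollection/_apis/projects?api-version=2.0',
                 auth=('', pat))
print(r.status_code)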
Trying to send a simple GET request via a proxy. I have the 'Proxy-Authorization' and 'Authorization' headers; I don't think I needed the 'Authorization' header, but I added it anyway.
import requests, base64

URL = 'https://www.google.com'
sess = requests.Session()
user = 'someuser'
password = 'somepass'
token = base64.encodestring('%s:%s' % (user, password)).strip()
sess.headers.update({'Proxy-Authorization': 'Basic %s' % token})
sess.headers['Authorization'] = 'Basic %s' % token
resp = sess.get(URL)
I get the following error:
requests.packages.urllib3.exceptions.ProxyError: Cannot connect to proxy. Socket error: Tunnel connection failed: 407 Proxy Authentication Required.
However, when I change the URL to plain http://www.google.com, it works fine.
Do proxies use Basic, Digest, or some other sort of authentication for https? Is it proxy server specific? How do I discover that info? I need to achieve this using the requests library.
UPDATE
It seems that with HTTP requests we have to pass in a Proxy-Authorization header, but with HTTPS requests we need to format the proxy URL with the username and password:
#HTTP
import requests, base64

URL = 'http://www.google.com'
user = <username>
password = <password>
proxy = {'http': 'http://<IP>:<PORT>'}
token = base64.encodestring('%s:%s' % (user, password)).strip()
myheader = {'Proxy-Authorization': 'Basic %s' % token}
r = requests.get(URL, proxies=proxy, headers=myheader)
print r.status_code  # 200
#HTTPS
import requests

URL = 'https://www.google.com'
user = <username>
password = <password>
proxy = {'https': 'http://<user>:<password>@<IP>:<PORT>'}
r = requests.get(URL, proxies=proxy)
print r.status_code  # 200
When sending an HTTP request, if I leave out the header and pass in a proxy URL formatted with user/pass, I get a 407 response.
When sending an HTTPS request, if I pass in the header and leave the proxy URL unformatted, I get the ProxyError mentioned earlier.
I am using requests 2.0.0, and a Squid proxy-caching web server. Why doesn't the header option work for HTTPS? Why does the formatted proxy not work for HTTP?
The answer is that the HTTP case is bugged. The expected behaviour in that case is the same as the HTTPS case: that is, you provide your authentication credentials in the proxy URL.
The reason the header option doesn't work for HTTPS is that HTTPS via proxies is totally different to HTTP via proxies. When you route an HTTP request via a proxy, you essentially just send a standard HTTP request to the proxy with a path that indicates a totally different host, like this:
GET http://www.google.com/ HTTP/1.1
Host: www.google.com
The proxy then basically forwards this on.
For HTTPS that can't possibly work, because you need to negotiate an SSL connection with the remote server. Rather than doing anything like the HTTP case, you use the CONNECT verb. The proxy server connects to the remote end on behalf of the client, and from then on just proxies the TCP data. (More information here.)
When you attach a Proxy-Authorization header to the HTTPS request, we don't put it on the CONNECT message, we put it on the tunnelled HTTPS message. This means the proxy never sees it, so refuses your connection. We special-case the authentication information in the proxy URL to make sure it attaches the header correctly to the CONNECT message.
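With the credentials in the proxy URL, the tunnel request ends up looking roughly like this (token elided):
CONNECT www.google.com:443 HTTP/1.1
Host: www.google.com:443
Proxy-Authorization: Basic <token>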
Requests and urllib3 are currently in discussion about the right place for this bug fix to go. The GitHub issue is currently here. I expect that the fix will be in the next Requests release.
I am trying to do OAuth authentication using django-social-auth, which uses oauth2 under the hood. I am adding a custom backend for Vimeo. The Vimeo API requires all API calls to use a custom user agent.
oauth2 uses httplib2 and doesn't have a hook point to set a user agent. Is there a way I can say "all network requests from here on should use this custom header"?
If I got your question right, you can send the User-Agent along with the request headers:
import httplib2

h = httplib2.Http(".cache")
resp, content = h.request("https://example.org/chap/2", "PUT",
                          body="This is text",
                          headers={'User-Agent': 'my user agent'})
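If you need every request issued by oauth2/httplib2 to carry the header without touching each call site, one option is to monkey-patch httplib2.Http.request; a minimal sketch:
import httplib2

_original_request = httplib2.Http.request

def _request_with_ua(self, uri, method="GET", body=None, headers=None,
                     redirections=httplib2.DEFAULT_MAX_REDIRECTS,
                     connection_type=None):
    # inject the custom User-Agent unless the caller already set one
    headers = dict(headers or {})
    headers.setdefault('User-Agent', 'my user agent')
    return _original_request(self, uri, method=method, body=body,
                             headers=headers, redirections=redirections,
                             connection_type=connection_type)

httplib2.Http.request = _request_with_ua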