Preemptive authentication with Zeep and requests - python

I need to talk to a SOAP server which requires "preemptive authentication" (it uses BasicAuth).
I have no idea how to configure my zeep client to behave accordingly.
As it says here, the SoapUI tool can be configured to use "preemptive authentication".
Can anyone please help me achieve the same, either by configuring zeep or requests?
Here is my code, which is pretty standard:
from requests import Session
from requests.auth import HTTPBasicAuth
from zeep import Client
from zeep.transports import Transport

session = Session()
session.verify = False  # ignore certificate validation
session.auth = HTTPBasicAuth(user, pwd)
transport = Transport(session=session)
client = Client(wsdl, transport=transport)
# ...
response = client.service.Operation(**params)
The above fails to authenticate and ends up with an SSL error, which is expected.
Any help is much appreciated. Thank you.

In theory, you should be able to do this by creating a session and setting the Authorization header directly. That way the header is sent with the initial request itself, rather than relying on auth machinery that waits for a challenge.
import base64
import requests
import zeep

session = requests.Session()
# base64-encode "user:pwd" and set the header up front so it is sent preemptively
token = base64.b64encode(f'{user}:{pwd}'.encode()).decode()
session.headers['Authorization'] = 'Basic ' + token
transport = zeep.Transport(session=session)
client = zeep.Client(wsdl=soapURI, transport=transport)

Related

Flask server-side implementation for certificate-based auth

I am trying to implement server-side code to authenticate the client using a certificate and authorize based on the groups associated with the certificate.
The client-side code goes like this:
import json
import requests
clientCrt = "cc.crt"
clientKey = "ck.key"
url = "https://example.com/api"
payload = { "someId": "myID" }
certServer = 'cs.crt'
headers = {'content-type': 'application/json'}
r = requests.post(url, data=json.dumps(payload), verify=certServer,
                  headers=headers, cert=(clientCrt, clientKey))
print(r.status_code)
print(r.json())
I want a corresponding server-side implementation, specifically to check whether the request should be honoured based on clientCrt.
Can someone share how I can access clientCrt on the server side and extract the fields of the certificate?
Note: I am not looking for mutual TLS Auth, I am interested in Service Authentication and Authorization
Mutual TLS is not configured on the default WSGI connection object; it needs to be set up explicitly. See this page for details: https://www.ajg.id.au/2018/01/01/mutual-tls-with-python-flask-and-werkzeug/. Once you have the connection object handy, you can use request.environ['peercert'].get_subject() to get the details of the client certificate.
A better way to handle this is to delegate it to Gunicorn or an nginx proxy. See https://eugene.kovalev.systems/posts/flask-client-side-tls-authentication/ for more examples.
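For reference, here is a minimal sketch of requiring client certificates when running the Flask development server directly; the certificate file names are placeholders, and exposing the peer certificate via request.environ still needs the custom request handler described in the first link:
import ssl
from flask import Flask

app = Flask(__name__)

@app.route("/api", methods=["POST"])
def api():
    # With the custom request handler from the linked article in place, the
    # client certificate details can be read from request.environ here.
    return {"ok": True}

if __name__ == "__main__":
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain("server.crt", "server.key")  # server certificate and key (placeholders)
    ctx.load_verify_locations("ca.crt")              # CA that issued the client certs (placeholder)
    ctx.verify_mode = ssl.CERT_REQUIRED              # reject clients without a valid cert
    app.run(ssl_context=ctx)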

SOAP API Get Cookie

I'm using a SOAP API to get an authentication key with a cookie that is supposed to be returned.
from zeep import Client
client = Client("AuthenticationService.xml")
result = client.service.ValidateUser(username, password, "")
result
However, the result I get is a Boolean True but no cookie containing the authentication key.
The same request made with SoapUI returns a cookie; I'm wondering how I can do this in Python.
To handle cookies, use a requests.Session for the transport.
A simple use case would look like this:
from zeep import Client
from requests import Session
from zeep.transports import Transport
session = Session()
# disable TLS verification
session.verify = False
transport = Transport(session=session)
client = Client("AuthenticationService.xml", transport=transport)
result = client.service.ValidateUser(username, password, "")
# then check the cookies collected by the session
client.transport.session.cookies
Hope this helps.
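As a small follow-up sketch (the cookie name 'AuthKey' is an assumption, not something defined by this service), individual values can be read from the jar like a dictionary:
jar = client.transport.session.cookies
print(dict(jar))               # all cookies as name -> value pairs
auth_key = jar.get('AuthKey')  # 'AuthKey' is a hypothetical cookie name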

How to authenticate internal corporate proxy with credentials in order to reach external API

We have a requirement to consume an external API; in order to reach their endpoint, we first need to authenticate against our corporate proxy.
How can we achieve this using Python? C# seems to have something for this: CredentialCache.DefaultCredentials.
How can it be done in Python? So far I have tried:
import requests
proxies = {"https": "https://url:port/file"}
client_cert = ("cert/path", "key/path")  # requests expects the (cert, key) order
data = """xml request"""
requests.post(url, proxies=proxies, data=data, cert=client_cert)
I have read in the docs that there is HTTP digest authentication, where I can use https://username:password@url:port/file.
Any suggestions?
ERROR:
HTTPSConnectionPool, failed to establish connection
Actually, my question has an answer:
proxy = {"http": "http://username:password@proxy:port", "https": "http://username:password@proxy:port"}
requests.post(url, headers=headers, auth=auth, cert=cert, data=payload, proxies=proxy)  # ===> works
Or else we can set the environment variables:
export https_proxy="http://username:password@proxy:port"
export http_proxy="http://username:password@proxy:port"
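The same can be done from inside the script before any request is made; this is only a sketch with placeholder proxy address, credentials, and endpoint URL, relying on requests reading proxy settings from the environment by default:
import os
import requests

# Placeholder proxy address and credentials
os.environ["HTTP_PROXY"] = "http://username:password@proxyhost:8080"
os.environ["HTTPS_PROXY"] = "http://username:password@proxyhost:8080"

# requests picks the proxy up from the environment unless trust_env is disabled
response = requests.post("https://api.example.com/endpoint", data="<request/>")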
In my case our company had multiple proxies and I was using the wrong proxy details; once I tried the correct one, it worked. Thanks to Stack Overflow.

Python "requests" library: HTTP basic authentication for each request

In a Python script I use the "Requests" library with HTTP basic authentication and a custom CA certificate to trust like this:
import requests
response = requests.get(base_url, auth=(username, password), verify=ssl_ca_file)
All the requests I need to make have to use these parameters. Is there a Pythonic way to set them as defaults for all requests?
Use Session(). The documentation states:
The Session object allows you to persist certain parameters across requests.
import requests

s = requests.Session()
s.auth = (username, password)  # sent with every request made through this session
s.verify = ssl_ca_file         # CA bundle used to verify every request
s.get(base_url)
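One small note (the /health path below is purely illustrative): arguments passed to an individual request still override the session defaults for that call only:
# Per-request arguments take precedence over the session-level defaults
r = s.get(base_url + "/health", verify=False)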

requests library https get via proxy leads to error

I'm trying to send a simple GET request via a proxy. I have the 'Proxy-Authorization' and 'Authorization' headers; I don't think I needed the 'Authorization' header, but I added it anyway.
import base64
import requests

URL = 'https://www.google.com'
sess = requests.Session()
user = 'someuser'
password = 'somepass'
token = base64.encodestring('%s:%s' % (user, password)).strip()
sess.headers.update({'Proxy-Authorization': 'Basic %s' % token})
sess.headers['Authorization'] = 'Basic %s' % token
resp = sess.get(URL)
I get the following error:
requests.packages.urllib3.exceptions.ProxyError: Cannot connect to proxy. Socket error: Tunnel connection failed: 407 Proxy Authentication Required.
However, when I change the URL to plain http://www.google.com, it works fine.
Do proxies use Basic, Digest, or some other sort of authentication for HTTPS? Is it proxy-server specific? How do I discover that information? I need to achieve this using the requests library.
UPDATE
It seems that with HTTP requests we have to pass in a Proxy-Authorization header, but with HTTPS requests we need to embed the username and password in the proxy URL.
#HTTP
import requests, base64
URL = 'http://www.google.com'
user = <username>
password = <password>
proxies = {'http': 'http://<IP>:<PORT>'}
token = base64.encodestring('%s:%s' % (user, password)).strip()
myheader = {'Proxy-Authorization': 'Basic %s' % token}
r = requests.get(URL, proxies=proxies, headers=myheader)
print r.status_code  # 200
#HTTPS
import requests
URL = 'https://www.google.com'
user = <username>
password = <password>
proxy = {'https': 'http://<user>:<password>@<IP>:<PORT>'}
r = requests.get(URL, proxies=proxy)
print r.status_code  # 200
When sending an HTTP request, if I leave out the header and pass in a proxy formatted with user/pass, I get a 407 response.
When sending an HTTPS request, if I pass in the header and leave the proxy URL without credentials, I get the ProxyError mentioned earlier.
I am using requests 2.0.0, and a Squid proxy-caching web server. Why doesn't the header option work for HTTPS? Why does the formatted proxy not work for HTTP?
The answer is that the HTTP case is bugged. The expected behaviour in that case is the same as the HTTPS case: that is, you provide your authentication credentials in the proxy URL.
The reason the header option doesn't work for HTTPS is that HTTPS via proxies is totally different to HTTP via proxies. When you route an HTTP request via a proxy, you essentially just send a standard HTTP request to the proxy with a path that indicates a totally different host, like this:
GET http://www.google.com/ HTTP/1.1
Host: www.google.com
The proxy then basically forwards this on.
For HTTPS that can't possibly work, because you need to negotiate an SSL connection with the remote server. Rather than doing anything like the HTTP case, you use the CONNECT verb. The proxy server connects to the remote end on behalf of the client, and from then on just proxies the TCP data. (More information here.)
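For comparison, the tunnel is opened with a message roughly of this shape (the exact headers vary by client):
CONNECT www.google.com:443 HTTP/1.1
Host: www.google.com:443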
When you attach a Proxy-Authorization header to the HTTPS request, we don't put it on the CONNECT message, we put it on the tunnelled HTTPS message. This means the proxy never sees it, and so refuses your connection. We special-case the authentication information in the proxy URL to make sure it attaches the header correctly to the CONNECT message.
Requests and urllib3 are currently in discussion about the right place for this bug fix to go. The GitHub issue is currently here. I expect that the fix will be in the next Requests release.
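In practice, then, the working pattern for HTTPS through an authenticating proxy looks like this; the proxy host, port, and credentials below are placeholders:
import requests

# Credentials go in the proxy URL so they can be attached to the CONNECT request
proxies = {
    'http': 'http://user:password@proxy.example.com:3128',
    'https': 'http://user:password@proxy.example.com:3128',
}
r = requests.get('https://www.google.com', proxies=proxies)
print(r.status_code)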
