Flask server-side implementation for certificate-based auth - Python

I am trying to implement server-side code to authenticate the client using a certificate and to authorize it based on the groups associated with the certificate.
The client-side code goes like this:
import json
import requests
clientCrt = "cc.crt"
clientKey = "ck.key"
url = "https://example.com/api"
payload = { "someId": "myID" }
certServer = 'cs.crt'
headers = {'content-type': 'application/json'}
r = requests.post(url, data=json.dumps(payload), verify=certServer,
                  headers=headers, cert=(clientCrt, clientKey))
print(r.status_code)
print(r.json())
I want to have a corresponding server-side implementation, specifically to check whether the request should be honoured based on the clientCrt.
Can someone share how I can access the clientCrt on the server side and extract the certificate's fields?
Note: I am not looking for mutual TLS auth; I am interested in service authentication and authorization.

Mutual TLS is not configured on a default WSGI serving connection object; it needs to be configured explicitly. See this page for more details - https://www.ajg.id.au/2018/01/01/mutual-tls-with-python-flask-and-werkzeug/. Once you have the connection object handy, you can use request.environ['peercert'].get_subject() to get the details of the client cert.
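A minimal sketch of that approach using the stdlib ssl module with Werkzeug's dev server. Note that ssl's getpeercert() returns a dict rather than a pyOpenSSL object with get_subject(); the file names and the OU check below are illustrative assumptions, not part of the original answer:

import ssl
from flask import Flask, request, jsonify
from werkzeug.serving import WSGIRequestHandler, run_simple

app = Flask(__name__)

class PeerCertWSGIRequestHandler(WSGIRequestHandler):
    """Copies the verified client certificate into the WSGI environ."""
    def make_environ(self):
        environ = super().make_environ()
        # self.connection is the TLS socket once an ssl_context is in use.
        environ['peercert'] = self.connection.getpeercert()
        return environ

@app.route('/api', methods=['POST'])
def api():
    cert = request.environ.get('peercert')
    if not cert:
        return jsonify(error='client certificate required'), 401
    # 'subject' is a tuple of RDN tuples, e.g. ((('commonName', 'my-service'),),)
    subject = dict(rdn[0] for rdn in cert.get('subject', ()))
    if subject.get('organizationalUnitName') != 'trusted-services':  # illustrative group check
        return jsonify(error='forbidden'), 403
    return jsonify(ok=True)

if __name__ == '__main__':
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain('server.crt', 'server.key')  # the server's own cert/key
    ctx.verify_mode = ssl.CERT_REQUIRED              # demand a client certificate
    ctx.load_verify_locations('ca.crt')              # CA that issued the client cert
    run_simple('0.0.0.0', 4443, app, ssl_context=ctx,
               request_handler=PeerCertWSGIRequestHandler)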
A better way to handle this is to delegate TLS termination to Gunicorn or an nginx proxy. See https://eugene.kovalev.systems/posts/flask-client-side-tls-authentication/ for more examples.
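For the nginx route, a minimal sketch of the Flask side, assuming nginx terminates TLS with ssl_verify_client on; and forwards the verification result in custom headers. The header names (X-SSL-Client-Verify, X-SSL-Client-S-DN) and the group check are assumptions to be matched against your nginx config:

# Assumed nginx config:
#   proxy_set_header X-SSL-Client-Verify $ssl_client_verify;
#   proxy_set_header X-SSL-Client-S-DN   $ssl_client_s_dn;
from flask import Flask, request, abort, jsonify

app = Flask(__name__)

@app.route('/api', methods=['POST'])
def api():
    # nginx sets $ssl_client_verify to "SUCCESS" only when the client cert
    # chains to the configured CA.
    if request.headers.get('X-SSL-Client-Verify') != 'SUCCESS':
        abort(401)
    subject_dn = request.headers.get('X-SSL-Client-S-DN', '')
    # Authorization: map a DN component to a group (the OU value is illustrative).
    if 'OU=trusted-services' not in subject_dn:
        abort(403)
    return jsonify(someId=request.get_json().get('someId'))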

Related

Preemptive authentication with Zeep and requests

I need to talk to a SOAP server which requires "preemptive authentication" (it uses BasicAuth).
I have no idea how to configure my zeep client to make it behave accordingly.
As it says here, the SoapUI tool can be configured to use "preemptive authentication".
Can anyone please help me achieve the same? (either by configuring zeep or requests)
Here is my code, which is pretty standard:
session = Session()
session.verify = False # ignore certificate
session.auth = HTTPBasicAuth(user, pwd)
transport = Transport(session=session)
client = Client(wsdl, transport=transport)
# ...
response = client.service.Operation(**params)
The above fails to authenticate and ends up with an SSL error, which is expected.
Any help is much appreciated. Thank you
In theory, you should be able to do this by creating a session and setting the header directly. That way the header is sent with the original request instead of relying on auth behaviour that waits for a challenge.
import base64
import requests
import zeep

session = requests.Session()
# Build the Basic credentials up front so they go out with the very first request.
token = base64.b64encode(('%s:%s' % (user, pwd)).encode()).decode()
session.headers['Authorization'] = 'Basic ' + token
transport = zeep.Transport(session=session)
client = zeep.Client(wsdl=soapURI, transport=transport)

How to authenticate internal corporate proxy with credentials in order to reach external API

We have a requirement to consume an external API; in order to reach their endpoint, we first need to authenticate with our corporate proxy.
How can we achieve this using Python? C# seems to have this built in:
c# ---> CredentialCache.DefaultCredentials;
How do we do it in Python?
So far I have tried:
import requests

proxies = {"https": "https://url:port/file"}
client_cert = ("cert/path", "key/path")  # requests expects (cert, key) order
data = """xml request"""
requests.post(url, proxies=proxies, data=data, cert=client_cert)
I have read in the docs that there is HTTP digest authentication, where
I can use https://username:password@url:port/file.
Any suggestions?
ERROR:
HTTPSConnectionPool, failed to establish connection
Actually, my question has an answer:
proxy = {"http": "http://username:password@proxy:port",
         "https": "http://username:password@proxy:port"}
requests.post(url, headers=headers, auth=auth, cert=cert, data=payload, proxies=proxy)  # ===> works
or else we can set the environment variables:
export https_proxy="http://username:password@proxy:port"
export http_proxy="http://username:password@proxy:port"
In my case there were multiple proxies in our company and I was using the wrong proxy details. When I tried with the correct one, it worked.
Thanks to Stack
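One gotcha with credentials embedded in the proxy URL: characters such as @, : or \ in the username or password break URL parsing, so percent-encode them first. A small sketch; the credentials and hostnames here are placeholders:

from urllib.parse import quote
import requests

user = quote("DOMAIN\\username", safe="")   # placeholder credentials
password = quote("p@ss:word", safe="")
proxy_url = "http://%s:%s@proxy.example.com:8080" % (user, password)
proxies = {"http": proxy_url, "https": proxy_url}

r = requests.post("https://api.example.com/endpoint",
                  data="""xml request""", proxies=proxies)
print(r.status_code)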

Allow same user to connect to Django server from Angular2 client and other Django server

We have this setup:
Central Django server, CSRF and login enabled. Except for the login itself, no action may be performed without logging in first.
An Angular2 client which connects to the central server for almost every call. The login on the central server is executed from here. A CSRF token is available and authentication works.
Another small server which accepts files. It is also Django but not CSRF enabled. The client sends files to this server which the central server may never possess or even see. The file upload (using form-data and POST) works fine. However, after a file upload has completed, we would like this small server to call the central server to notify it of the successful upload.
The problem is the last step. The central server refuses the call, saying we need to be logged in. Can we in any way make the central server believe that the request came from the user who logged in with the Angular2 client? How do we have to set up the CSRF token? We are sending the user's CSRF token, which he got in the client, to the small server.
We are using the python-requests library, Python 3 and Django 1.10.
This is the code we currently have on the small server:
from django.middleware.csrf import get_token
import requests

url = settings.CENTRAL_SERVER_URL + 'path/to/endpoint'
# 'request' is the original request object from the Angular2 client
token = get_token(request)
# Call to 'post' results in an error code in the response ('not logged in')
response = requests.post(url, data=data, headers={'X-CSRFToken': token, 'Referer': url})
I assume the problem is the 'headers' definition. Can it be done at all?
(CSRF enabled = uses CsrfViewMiddleware)
Turns out I was on the right track. It is most important to also include, in the new request to the central server, the session ID that the client received when logging in.
Here is the code:
url = settings.CENTRAL_SERVER_URL + 'path/to/endpoint'
# Forward the CSRF token (as header and cookie) and, crucially, the session ID
# from the client's original request.
http_x_token = request.META['HTTP_X_CSRFTOKEN']
csrftoken = request.COOKIES['csrftoken']
session_id = request.COOKIES['sessionid']
# CsrfViewMiddleware also checks the Referer header on secure requests.
response = requests.post(url, data=data,
                         headers={'X-CSRFToken': http_x_token, 'Referer': url},
                         cookies={'csrftoken': csrftoken, 'sessionid': session_id})
The session ID should always be present in requests from the client; Django's SessionMiddleware checks for it. If the session ID is present, the user can be found and everything else works as if I were making the request from the client.
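For completeness, a hypothetical sketch of the central-server endpoint: once the sessionid cookie is forwarded, SessionMiddleware resolves the user and an ordinary authentication check works. The view name and response bodies are illustrative:

# views.py on the central server (hypothetical endpoint)
from django.http import JsonResponse, HttpResponseForbidden

def upload_complete(request):
    # SessionMiddleware has already turned the forwarded sessionid cookie
    # into request.user, just as it would for a direct client request.
    if not request.user.is_authenticated:
        return HttpResponseForbidden("not logged in")
    return JsonResponse({"status": "ok", "user": request.user.username})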

Python Requests cannot connect to Squid Proxy

I have a squid proxy that requires authentication. In squid.conf I am using:
auth_param digest program /usr/lib64/squid/digest_pw_auth -c /etc/squid/passwords
auth_param digest realm proxy
acl authenticated proxy_auth REQUIRED
http_access allow authenticated
From this I can expect the authentication method to be HTTP Digest.
Here is my python code:
import requests
from requests.auth import HTTPDigestAuth

auth = HTTPDigestAuth("user", "pass")
r = requests.get("http://www.google.com", allow_redirects=True,
                 headers=Configuration.HEADERS, proxies=proxy_list(), auth=auth)
I am receiving this error:
407 Proxy Authentication Required
I have also tried authenticating with:
auth = HTTPProxyAuth('user', 'password')
and:
http://user:password@ip
with no luck...
Can anybody help?
Thanks
HTTPDigestAuth doesn't authenticate you with the proxy; it authenticates you with the website. Right now Requests doesn't have any built-in way of using Digest auth with a proxy, and there are no plans to add built-in support.
You'll have to either use Basic auth with the proxy (by putting your credentials in the proxy URL, e.g. proxies={'http': 'http://user:password@domain.com'}), or write your own authentication handler for proxy Digest auth.
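For the first option, a sketch with Basic credentials embedded in the proxy URL (host and port are placeholders). Note this only helps if the proxy also accepts Basic; against a Digest-only Squid you would still need a custom handler that answers the 407 challenge:

import requests

# Basic credentials embedded in the proxy URL (placeholders throughout).
proxies = {
    "http": "http://user:pass@proxy.example.com:3128",
    "https": "http://user:pass@proxy.example.com:3128",
}
r = requests.get("http://www.google.com", proxies=proxies)
print(r.status_code)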

requests library https get via proxy leads to error

Trying to send a simple GET request via a proxy. I have the 'Proxy-Authorization' and 'Authorization' headers; I don't think I needed the 'Authorization' header, but I added it anyway.
import base64
import requests

URL = 'https://www.google.com'
sess = requests.Session()
user = 'someuser'
password = 'somepass'
token = base64.encodestring('%s:%s' % (user, password)).strip()
sess.headers.update({'Proxy-Authorization': 'Basic %s' % token})
sess.headers['Authorization'] = 'Basic %s' % token
resp = sess.get(URL)
I get the following error:
requests.packages.urllib3.exceptions.ProxyError: Cannot connect to proxy. Socket error: Tunnel connection failed: 407 Proxy Authentication Required.
However when I change the URL to simple http://www.google.com, it works fine.
Do proxies use Basic, Digest, or some other sort of authentication for https? Is it proxy server specific? How do I discover that info? I need to achieve this using the requests library.
UPDATE
It seems that with HTTP requests we have to pass a Proxy-Authorization header, but with HTTPS requests we need to put the username and password into the proxy URL:
#HTTP
import requests, base64

URL = 'http://www.google.com'
user = <username>
password = <password>
proxies = {'http': 'http://<IP>:<PORT>'}
token = base64.encodestring('%s:%s' % (user, password)).strip()
myheader = {'Proxy-Authorization': 'Basic %s' % token}
r = requests.get(URL, proxies=proxies, headers=myheader)
print r.status_code  # 200
#HTTPS
import requests

URL = 'https://www.google.com'
user = <username>
password = <password>
proxy = {'https': 'http://<user>:<password>@<IP>:<PORT>'}
r = requests.get(URL, proxies=proxy)
print r.status_code  # 200
When sending an HTTP request, if I leave out the header and pass a proxy URL formatted with user/pass, I get a 407 response.
When sending an HTTPS request, if I pass in the header and leave the proxy URL unformatted, I get the ProxyError mentioned earlier.
I am using requests 2.0.0 and a Squid proxy-caching web server. Why doesn't the header option work for HTTPS? Why does the formatted proxy URL not work for HTTP?
The answer is that the HTTP case is bugged. The expected behaviour in that case is the same as the HTTPS case: that is, you provide your authentication credentials in the proxy URL.
The reason the header option doesn't work for HTTPS is that HTTPS via proxies is totally different from HTTP via proxies. When you route an HTTP request via a proxy, you essentially just send a standard HTTP request to the proxy with a path that indicates a totally different host, like this:
GET http://www.google.com/ HTTP/1.1
Host: www.google.com
The proxy then basically forwards this on.
For HTTPS that can't possibly work, because you need to negotiate an SSL connection with the remote server. Rather than doing anything like the HTTP case, you use the CONNECT verb. The proxy server connects to the remote end on behalf of the client, and from then on just proxies the TCP data. (More information here.)
When you attach a Proxy-Authorization header to the HTTPS request, we don't put it on the CONNECT message, we put it on the tunnelled HTTPS message. This means the proxy never sees it, so it refuses your connection. We special-case the authentication information in the proxy URL to make sure the header is attached correctly to the CONNECT message.
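Concretely, the credentials have to travel on the CONNECT request itself, roughly like this (the token is just the base64 of user:password):
CONNECT www.google.com:443 HTTP/1.1
Host: www.google.com:443
Proxy-Authorization: Basic <base64 of user:password>
Everything sent after the proxy's "200 Connection established" response is opaque tunnelled TLS, which is why headers on the inner request can never reach the proxy.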
Requests and urllib3 are currently in discussion about the right place for this bug fix to go. The GitHub issue is currently here. I expect that the fix will be in the next Requests release.
