Is the data the requests library sends secure in Python? [duplicate]

I'm about to use Python requests to get data from my own online API to my local PC. My API requires authentication, which for now is done through simply posting user/pass:
params = {'user': 'username', 'pass':'password'}
requests.post(url, params=params)
Are these requests safe, or will they allow a man in the middle to capture that user/pass?
P.S. My API is using a Let's Encrypt SSL certificate. Python version 3.7.0.

This has nothing to do with the python-requests package, but with the HTTP (and HTTPS) protocols. HTTP is plain text, so anyone who manages to sniff your packets can read the content (hence the username/password pair in clear text). HTTPS uses strong encryption, so even someone sniffing your traffic will have a hard time deciphering it. No encryption scheme is 100% safe, of course, but decrypting SSL traffic is currently way too costly even for the NSA.
In other words, what will make your requests "safe" is the use of the HTTPS protocol, not which Python (or non-Python) package you use to write your client code.

Use the HTTPS protocol and it is safe, provided you have a valid SSL certificate on your API. If you still feel paranoid/insecure, you can implement end-to-end encryption on top of it using an existing, well-reviewed algorithm (writing a custom algorithm of your own is not recommended).
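One detail worth flagging in the original snippet: `params=` puts the values in the URL's query string, which often ends up in server access logs even over HTTPS, while `data=` sends them in the request body. A minimal sketch (the URL and field names are the asker's, everything else is illustrative):

```python
import requests  # third-party: pip install requests

def login(url, user, password):
    # data= places the credentials in the POST body; params= would append
    # them to the URL, where they may be written to access logs.
    # verify=True (the default) checks the server's TLS certificate.
    return requests.post(url, data={"user": user, "pass": password},
                         timeout=10)
```

Over an https:// URL the body, including these fields, travels encrypted.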

Related

Best way to obtain a secure connection using Python urllib

Hope this will be an easy question for someone to answer. I'm in the process of writing an application in Python, part of which uses an API to obtain a download link and then uses that link to download the corresponding server updates. Currently I'm accomplishing this with urllib.request.urlopen(), and I would like to do so securely. I'm therefore wondering if just specifying https in the URL is enough, or if I have to use the context parameter in addition.
The Python documentation is a bit vague about how it handles HTTPS requests, but as I understand it right now, specifying https in the URL should be sufficient.
In the documentation for urllib.request.urlopen, there is a reference to the http.client.HTTPSConnection class in regards to the context parameter. In the documentation for the HTTPSConnection class, there is a link for security considerations. In here it states:
For client use, if you don’t have any special requirements for your security policy, it is highly recommended that you use the create_default_context() function to create your SSL context. It will load the system’s trusted CA certificates, enable certificate validation and hostname checking, and try to choose reasonably secure protocol and cipher settings.
Given that the documentation for urllib.request.urlopen() shows that context is an optional parameter, you probably don't HAVE to use it to make secure HTTPS connections, but given what the security considerations section says, I would use
ssl.create_default_context()
to generate the context, just as good practice:
urllib.request.urlopen("https://www.stackoverflow.com", context=ssl.create_default_context())
EDIT
Upon reviewing the source code for urllib.request.urlopen, if you do not specify a context but you use an https URL, it looks like it will provide a default context for you. If you don't provide a context, urlopen() will call build_opener(), and in THAT function's comments it states:
The opener will use several default handlers, including support
for HTTP, FTP and when applicable HTTPS.
So the final answer is that you should be fine without providing a context; the URL alone should be all it needs.
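You can check the defaults offline; this sketch just inspects the context that ssl.create_default_context() returns, which is also what you'd pass to urlopen():

```python
import ssl

ctx = ssl.create_default_context()
# The default client-side context loads the system CA store, requires a
# valid server certificate chain, and checks that the hostname matches.
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True
```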

Detect SSL/TLS Client Authentication with Python?

I have multiple Web Servers (mostly IIS & Apache).
Some of them are configured to allow only clients with a specific certificate. Moreover, we have users who have multiple certificates for the same web server, which allow different actions to be performed.
Ex: certificates that are used only for highly privileged actions, and some for everyday use.
We do not use the operating system certificate store for compliance issues.
I have made the following python code which works by specifying the wanted certificate :
import requests
response = requests.get("https://myserver-dns-name.com", cert="./ClientCert.Key")
I tried fetching the server certificate and looking for the enhanced key usage OID 1.3.6.1.5.5.7.3.1 (TLS Server Auth), but not all servers have it specified.
I tried using the Python ssl library to handle the handshake myself, but the library doesn't seem to allow it. What I wanted was a callback for when the server sends a certificate request to the client.
What I want to know is whether it's possible to detect from the client that the server has client authentication enforced, in Python or any other language.
I want to implement the same behavior as Chromium in Python (showing a certificate selection dialog, as seen here: ssl_client_auth_handler.cc).
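The ssl module exposes no callback for the server's CertificateRequest message, but one hedged heuristic is to complete a handshake without offering a certificate and watch for a certificate-related alert. The function below is a sketch under those assumptions; note that in TLS 1.3 the alert may only arrive on the first read, and servers that request but don't strictly require a certificate will not fail at all:

```python
import socket
import ssl

def enforces_client_cert(host, port=443, timeout=5):
    """Heuristic sketch: handshake without a client certificate and see
    whether the server aborts with a certificate-related TLS alert."""
    ctx = ssl.create_default_context()
    try:
        with socket.create_connection((host, port), timeout=timeout) as raw:
            with ctx.wrap_socket(raw, server_hostname=host) as tls:
                tls.recv(1)  # in TLS 1.3 the alert arrives after the handshake
        return False
    except ssl.SSLError as exc:
        return "certificate" in str(exc).lower()
    except socket.timeout:
        return False  # handshake succeeded; the server is waiting for data
```

This only detects enforced client auth, which matches the question, but it cannot distinguish which certificate the server would accept.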

Encrypted data in REST Services Response

I use Django and Django REST Framework for REST services on the back end, with mobile client apps.
I would like some responses to carry encrypted data. I have to return some sensitive, private data to my clients, and I would like to apply an additional security layer (I already use SSL, but I want to blunt attacks, such as man-in-the-middle, where an unwanted party could see data contained in my responses).
I would like to avoid this, so I thought of putting the encrypted data in my response.
Does that make sense? Is there something similar in Django REST Framework?
A good encryption library with various implementations is Keyczar.
What you would need to do is write a global interceptor for all incoming requests to your backend application, so that when responses are sent back they are encrypted using the Keyczar library.
On the consumer side (your mobile application) you would need to implement something similar that decrypts the responses from your backend.
BONUS: if you're not doing this already, you probably want to look at using 2-way SSL to ensure that you authenticate the client that calls your backend.
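Keyczar has since been officially deprecated, so as a hedged alternative, the same encrypt-on-response / decrypt-on-client pattern can be sketched with the third-party cryptography package's Fernet recipe (the payload is made up; in practice the key must be shared with the mobile client out of band):

```python
from cryptography.fernet import Fernet  # third-party: pip install cryptography

key = Fernet.generate_key()  # shared secret, distributed out of band

# What the interceptor would do before the response leaves the backend:
token = Fernet(key).encrypt(b'{"account": "12345", "balance": "100.00"}')

# What the mobile client does on receipt:
plain = Fernet(key).decrypt(token)
print(plain.decode())
```

Fernet handles authenticated encryption (AES-CBC plus HMAC) for you, which avoids the pitfalls of assembling primitives by hand.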

For user-based and certificate-based authentication, do I want to use urllib, urllib2, or curl?

A few months ago, I hastily put together a Python program that hit my company's web services API. It worked in three different modes:
1) HTTP with no authentication
2) HTTP with user-name and password authentication
3) HTTPS with client certificate authentication
I got 1) to work with urllib, but ran into problems with 2) and 3). Instead of figuring them out, I ended up calculating the proper command-line parameters for curl and executing it via os.system().
Now I get to re-write this program with the benefit of experience, and I'm not sure if I should use urllib, urllib2, or just stick with curl.
The urllib documentation mentions:
When performing basic authentication, a FancyURLopener instance
calls its prompt_user_passwd() method. The default implementation
asks the users for the required information on the controlling
terminal. A subclass may override this method to support
more appropriate behavior if needed.
It also mentions the **x509 argument to urllib.URLopener():
Additional keyword parameters, collected in x509, may be used for
authentication of the client when using the https: scheme. The
keywords key_file and cert_file are supported to provide an SSL
key and certificate; both are needed to support client
authentication.
But urllib2 is one greater than urllib, so naturally I want to use it instead. The urllib2 documentation is full of information about authentication handlers that seem to be designed for 2) above, but makes no mention whatsoever of client certificates.
My question: do I want to use urllib, because it appears to support everything I need to achieve? Or should I just stick with curl?
Thanks.
Edit: Maybe my question isn't specific enough, so here's another shot. Can I achieve what I want to do with urllib? Or with urllib2? Or am I forced to use curl out of necessity?
I believe that mechanize is the module you need.
EDIT: mechanize objects have this method for authentication: add_password(self, url, user, password, realm=None)
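For later readers: in Python 3 both modules were merged into urllib.request, where modes 2) and 3) can be handled directly. A sketch under that assumption (paths and URLs below are placeholders):

```python
import ssl
import urllib.request

def opener_with_basic_auth(url, user, password):
    # Mode 2: user-name/password (HTTP Basic) authentication.
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, url, user, password)
    return urllib.request.build_opener(
        urllib.request.HTTPBasicAuthHandler(mgr))

def opener_with_client_cert(certfile, keyfile):
    # Mode 3: HTTPS client-certificate authentication via an SSL context.
    ctx = ssl.create_default_context()
    ctx.load_cert_chain(certfile=certfile, keyfile=keyfile)
    return urllib.request.build_opener(
        urllib.request.HTTPSHandler(context=ctx))
```

Either opener's .open(url) then performs the authenticated request, with no need to shell out to curl.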

How do I implement secure authentication using xml-rpc in python?

I have a basic XML-RPC web service running.
What is the simplest way (I'm a newbie) to implement secure authentication?
I just need some direction.
You could check out this code for a simple XML-RPC server over HTTPS. Authentication can work any way you wish: clients could authenticate with some credentials, and you provide a cookie for the rest of the session.
The Python docs for xmlrpc include details of using the HTTP 'Authorization' header for passing in credentials.
Here is some code that uses Twisted to implement an XML-RPC auth mechanism, which could easily use HTTPS instead of HTTP.
This guy has written an HTTPS XML-RPC setup with authorization which you can download.
There are tons of resources and ways of doing this, all easily googleable. It all depends on whether, for example, you are using mod_wsgi or writing a standalone server using Twisted.
Bottom line:
a) Use SSL for communication
b) Use the HTTP authorization mechanism
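Both points can be combined with the standard library alone: xmlrpc.client sends credentials embedded in the URL as an HTTP Basic Authorization header, and an https:// URL keeps the exchange encrypted. A sketch with a hypothetical endpoint (no request is made until a method is actually called):

```python
import xmlrpc.client

# user:secret in the URL becomes a Basic Authorization header;
# https:// ensures it travels encrypted on the wire.
proxy = xmlrpc.client.ServerProxy("https://user:secret@example.com/RPC2")
# proxy.some_method()  # would perform the authenticated call
```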
