Python / Zeep / SOAP proxy problem (I think)

I am trying to get this to work:
Practitioner notes -> Python scripts for automation in NA -> Python client to invoke NA SOAP APIs
Here is my code (sanitized a bit):
#! /usr/bin/env python3
from requests import Session
from zeep import Client
from zeep.transports import Transport

session = Session()
session.verify = False
transport = Transport(session=session)
client = Client('https://SERVER_FQDN/soap?wsdl=api.wsdl.wsdl2py', transport=transport)

# I added this for the network proxy
client.transport.session.proxies = {
    'http': '10.0.0.1:80',
    'https': '10.0.0.1:80',
}

# Then found I needed this because "localhost" is hard-coded in the WSDL
client.service._binding_options['address'] = 'https://SERVER_FQDN/soap'

login_params = {
    'username': 'user',
    'password': 'PASSWORD',
}
loginResult = client.service.login(parameters=login_params)

sesnhdr_type = client.get_element('ns0:list_deviceInputParms')
sesnhdr = sesnhdr_type(sessionid=loginResult.Text)
devices = client.service.list_device(_soapheaders=[sesnhdr], parameters=sesnhdr)

print('\n\n ----------------------------- \n')
for i in devices.ResultSet.Row:
    print(i.hostName + ' ---> ' + i.primaryIPAddress)
    params = {
        "ip": i.primaryIPAddress,
        "sessionid": loginResult.Text
    }
    device = client.service.show_deviceinfo(parameters=params)
    print(device.Text)
print('\n\n ----------------------------- \n')
And here is my output:
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 667, in urlopen
self._prepare_proxy(conn)
File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 932, in _prepare_proxy
conn.connect()
File "/usr/local/lib/python3.6/site-packages/urllib3/connection.py", line 317, in connect
self._tunnel()
File "/usr/lib64/python3.6/http/client.py", line 929, in _tunnel
message.strip()))
OSError: Tunnel connection failed: 503 Service Unavailable
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/requests/adapters.py", line 449, in send
timeout=timeout
File "/usr/local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 727, in urlopen
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
File "/usr/local/lib/python3.6/site-packages/urllib3/util/retry.py", line 439, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='SERVER_FQDN', port=443): Max retries exceeded with url: /soap (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 503 Service Unavailable',)))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "./na-1.py", line XX, in <module>
loginResult = client.service.login(parameters=login_params )
File "/usr/local/lib/python3.6/site-packages/zeep/proxy.py", line 51, in __call__
kwargs,
File "/usr/local/lib/python3.6/site-packages/zeep/wsdl/bindings/soap.py", line 127, in send
response = client.transport.post_xml(options["address"], envelope, http_headers)
File "/usr/local/lib/python3.6/site-packages/zeep/transports.py", line 107, in post_xml
return self.post(address, message, headers)
File "/usr/local/lib/python3.6/site-packages/zeep/transports.py", line 74, in post
address, data=message, headers=headers, timeout=self.operation_timeout
File "/usr/local/lib/python3.6/site-packages/requests/sessions.py", line 578, in post
return self.request('POST', url, data=data, json=json, **kwargs)
File "/usr/local/lib/python3.6/site-packages/requests/sessions.py", line 530, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python3.6/site-packages/requests/sessions.py", line 643, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python3.6/site-packages/requests/adapters.py", line 510, in send
raise ProxyError(e, request=request)
requests.exceptions.ProxyError: HTTPSConnectionPool(host='SERVER_FQDN', port=443): Max retries exceeded with url: /soap (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 503 Service Unavailable',)))
I get the same errors if I use "localhost" and run the script on the server in question.
The system has the proxy environment variables set.
There are proper forward and reverse DNS entries for the server.
The name and IP of the server are also in /etc/hosts.
Here is the problem:
If I use an IP address instead of the server's FQDN, the code runs.
Vendor support says the problem is not in their application that provides the endpoint:
The 503 error means that the service is not available, there are 3 situations that invoke this behavior: 1. The server is under maintenance, 2. The server is overloaded, 3. In rare cases, the DNS configuration is faulty. If we see, this problem is not related to NA because the request is working fine with the IP.
Any ideas on this?
Why does only the IP work and NOT the FQDN or localhost?
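For reference, the proxy behaviour can be reproduced outside Zeep with plain requests (a minimal sketch, assuming the same proxy address as above; SERVER_FQDN and SERVER_IP are placeholders):

import requests

PROXIES = {'http': 'http://10.0.0.1:80', 'https': 'http://10.0.0.1:80'}

# This issues the same CONNECT the Zeep transport would; a 503 here confirms
# the proxy itself refuses to tunnel to that hostname.
for target in ('https://SERVER_FQDN/soap', 'https://SERVER_IP/soap'):
    try:
        resp = requests.get(target, proxies=PROXIES, verify=False, timeout=10)
        print(target, '->', resp.status_code)
    except requests.exceptions.ProxyError as exc:
        print(target, '->', exc)

# If the NA server should not go through the proxy at all, bypassing it
# (or adding the FQDN to NO_PROXY) avoids the CONNECT entirely:
session = requests.Session()
session.trust_env = False   # ignore http_proxy/https_proxy/no_proxy from the environment
session.verify = False
print(session.get('https://SERVER_FQDN/soap?wsdl=api.wsdl.wsdl2py', timeout=10).status_code)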

Most of the documentation I see for using proxies with Zeep starts with client = Client(url), but that doesn't work if url is behind a firewall and can only be reached through a proxy! My attempt to do it according to the documentation did nothing but time out (of course).
The key is understanding that Zeep is built on requests, and requests can use proxies when initiating the session. So you need to build a proxied Session, pass that session into the Transport, and initialize the Zeep Client with that transport. This worked for me:
import requests
import zeep
import zeep.transports

# soap_username, soap_password, url and settings.STATIC_PROXY come from my own config
session = requests.Session()
session.auth = requests.auth.HTTPBasicAuth(soap_username, soap_password)
session.proxies = {"https": f"socks5://{settings.STATIC_PROXY}"}  # SOCKS proxies need requests[socks]
transport = zeep.transports.Transport(session=session, timeout=(5, 30))
client = zeep.Client(url, transport=transport)
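If the proxied WSDL fetch succeeds, a quick sanity check (a sketch; the service and port names depend on your WSDL) is to list the operations Zeep bound:

# Enumerate the services, ports and operations parsed from the WSDL;
# if this prints anything, the WSDL was retrieved through the proxied session.
for service in client.wsdl.services.values():
    for port in service.ports.values():
        print(service.name, port.name, sorted(port.binding._operations))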

My problem was that initializing the Client immediately connects to fetch the WSDL, but I need the proxy setting in place before that first connection. So I cobbled together two examples from the official docs to set the proxy at the time the connection is made.
from zeep import Client
from zeep.transports import Transport
from requests import Session

session = Session()
session.proxies = {
    'http': 'http://username:password@proxy.example.com:8080',
    'https': 'http://username:password@proxy.example.com:8080'
}
transport = Transport(session=session)
client = Client(URL, transport=transport)
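One extra pitfall with credentials embedded in the proxy URL (not specific to Zeep): if the username or password contains characters such as @, : or /, they have to be percent-encoded first, for example:

from urllib.parse import quote

# Hypothetical credentials; quote() percent-encodes the reserved characters
# so the proxy URL parses correctly.
user = quote('DOMAIN\\username', safe='')
password = quote('p@ss:word', safe='')
proxy_url = 'http://{}:{}@proxy.example.com:8080'.format(user, password)
session.proxies = {'http': proxy_url, 'https': proxy_url}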

Related

Errno 111 Connection refused error, between flask server and raspberry pi over ethernet

I am trying to send data through an HTTP request with the Python requests library from a Raspberry Pi to my local computer, which is connected by an Ethernet cable. When trying to send data from the Raspberry Pi I get a Failed to establish a new connection: [Errno 111] Connection refused error. I have attached the full stack trace below.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/pi/Documents/PROJECT_NAME/src/client/send_data.py", line 7, in <module>
response = requests.request("GET", url, headers=headers, data=payload,)
File "/usr/lib/python3/dist-packages/requests/api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/lib/python3/dist-packages/requests/sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python3/dist-packages/requests/sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python3/dist-packages/requests/adapters.py", line 516, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='0.0.0.0', port=6000): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0xb5a6b1b0>: Failed to establish a new connection: [Errno 111] Connection refused'))
I was expecting the response to return a simple string, as my route is a simple hello-world function. The client file and minimal Flask server example are attached below.
client.py
import requests
url = "https://0.0.0.0:6000/"
payload={}
headers = {}
response = requests.request("GET", url, headers=headers, data=payload,)
print(response.text)
app.py
import flask

app = flask.Flask(__name__)

@app.route("/", methods=["GET"])
def hello():
    return flask.jsonify("Hello World!")

if __name__ == "__main__":
    app.run(host='0.0.0.0', port=6000, debug=True)
I have tried disabling my firewall, but this has not fixed the issue. I am also developing on the Pi over SSH using the VS Code remote development extension, and there is no issue with the connection on that front. Any help is appreciated!
Thank you!
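A minimal sketch of a client that would match this server (assuming the machine running Flask is reachable from the Pi at an address such as 192.168.1.50, which is a placeholder):

import requests

# The Flask development server above speaks plain HTTP, so the scheme must be
# http://, and the target must be the server machine's address on the Ethernet
# link; 0.0.0.0 is only a bind address, not something a remote client can call.
url = "http://192.168.1.50:6000/"
response = requests.get(url, timeout=5)
print(response.text)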

Why do parameters (in particular proxies) in Requests Session not persist across Python Requests?

The requests documentation (link) mentions that a session is what allows some parameters to persist across requests. My use case is simple: because I sit behind a corporate proxy and firewall, I need to set the proxies parameter (as mentioned in the title) on a session, and I don't want to have to set it for every request.
Supposedly, you can do the following (directly copied from the proxies section):
import requests

proxies = {
    'http': 'http://10.10.1.10:3128',
    'https': 'http://10.10.1.10:1080',
}
session = requests.Session()
session.proxies.update(proxies)
session.get('http://example.org')
This should allow you to set proxies without stating them in each request. My session setup function therefore looks like this:
import requests

def requests_setup():
    # setup proxy
    proxies = {'http': f'http://someproxy:8080',
               'https': f'http://someproxy:8080'}
    # initialize session
    session = requests.Session()
    # Part 1: set up proxy
    session.proxies.update(proxies)
    # Part 2: add certificate
    session.verify = r'SOME_CERT_BUNDLE.pem'
    return session
Get request example that results in an error
# making an example get request
setup = requests_setup()
url = "https://example.com"
r = setup.get(f"{url}", timeout=5)
The full traceback is posted below, but the following errors seem to be the problem. My understanding is that the SSL handshake through the proxy did not go through, which (as the trace suggests) I believe is because the proxy setting was not applied; with a session that does not set the verify parameter, the request that does work (shown further below) would instead fail with an SSLCertVerification error.
Error 1 ...
socket.timeout: _ssl.c:1074: The handshake operation timed out
... leading to Error 2
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='example.com', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', timeout('_ssl.c:1074: The handshake operation timed out')))
... and finally Error 3
requests.exceptions.ProxyError: HTTPSConnectionPool(host='example.com', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', timeout('_ssl.c:1074: The handshake operation timed out')))
The silver lining is that this was eventually solved by specifying the proxies parameter in the request call itself.
setup = utils.requests_setup()
# making an example get request
url = "https://example.com"
proxies = {'http': f'http://someproxy:8080',
           'https': f'http://someproxy:8080'}
r = setup.get(f"{url}", timeout=5, proxies=proxies)
But why is that the case? I can clearly see that my session's proxies attribute is initialized, yet for some reason it was not used by the GET request made with that session.
PS: There might be questions about why my proxy URL is prefixed with http for both entries. That is purely because we don't have a standalone HTTPS proxy server; the request also fails when I use an "https" prefix there instead.
PPS: example.com is not the actual site used. I have tried google.com and others (such as the API I am trying to call), but that did not change the results.
Actual Error Traceback
Traceback (most recent call last):
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\site-packages\urllib3\connectionpool.py", line 696, in urlopen
self._prepare_proxy(conn)
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\site-packages\urllib3\connectionpool.py", line 964, in _prepare_proxy
conn.connect()
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\site-packages\urllib3\connection.py", line 359, in connect
conn = self._connect_tls_proxy(hostname, conn)
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\site-packages\urllib3\connection.py", line 506, in _connect_tls_proxy
ssl_context=ssl_context,
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\site-packages\urllib3\util\ssl_.py", line 450, in ssl_wrap_socket
sock, context, tls_in_tls, server_hostname=server_hostname
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\site-packages\urllib3\util\ssl_.py", line 493, in _ssl_wrap_socket_impl
return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\ssl.py", line 423, in wrap_socket
session=session
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\ssl.py", line 870, in _create
self.do_handshake()
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\ssl.py", line 1139, in do_handshake
self._sslobj.do_handshake()
socket.timeout: _ssl.c:1074: The handshake operation timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\site-packages\requests\adapters.py", line 449, in send
timeout=timeout
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\site-packages\urllib3\connectionpool.py", line 756, in urlopen
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\site-packages\urllib3\util\retry.py", line 574, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='example.com', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', timeout('_ssl.c:1074: The handshake operation timed out')))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<input>", line 1, in <module>
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\site-packages\requests\sessions.py", line 555, in get
return self.request('GET', url, **kwargs)
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\site-packages\requests\sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\site-packages\requests\sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "C:\ProgramData\Anaconda3\envs\VA_API\lib\site-packages\requests\adapters.py", line 510, in send
raise ProxyError(e, request=request)
requests.exceptions.ProxyError: HTTPSConnectionPool(host='example.com', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', timeout('_ssl.c:1074: The handshake operation timed out')))
Information for reproducing the issue:
OS version: 'Windows-10-10.0.18362-SP0'
Python version: '3.7.11 (default, Jul 27 2021, 09:42:29) [MSC v.1916 64 bit (AMD64)]'
Requests version: '2.26.0'
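One way to inspect which proxies requests will actually apply to a given URL (a diagnostic sketch, assuming the default trust_env=True, under which environment variables such as HTTPS_PROXY and NO_PROXY are merged in) is:

import requests

session = requests.Session()
session.proxies.update({'http': 'http://someproxy:8080',
                        'https': 'http://someproxy:8080'})

# merge_environment_settings() is what Session.request() calls internally to
# combine per-request, session-level and environment settings; its result
# shows the effective proxies/verify values for this URL.
effective = session.merge_environment_settings(
    url='https://example.com', proxies={}, stream=None, verify=None, cert=None)
print(effective['proxies'], effective['verify'])

# Setting trust_env = False removes the environment from the merge, which
# helps isolate whether an environment variable is shadowing session.proxies.
session.trust_env = False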

python request ssh tunnel as proxy

Is it possible to use Python requests with an SSH tunnel as a proxy? How can I achieve that? I have already tried this, with no success:
from sshtunnel import SSHTunnelForwarder
import requests as r
from bs4 import BeautifulSoup as soup

server = SSHTunnelForwarder(
    'sship',
    ssh_username="XXXX",
    ssh_password="XXXX",
    remote_bind_address=('127.0.0.1', 8080)
)
server.start()
print(server.local_bind_port)

proxies = {
    "http": "http://127.0.0.1:8080",
}
url = 'http://www.google.com'
headers = {
    'User-Agent': 'My User Agent 1.0',
    'From': 'youremail@domain.com'  # This is another valid field
}
data = r.get(url, headers=headers, proxies=proxies)
page_data = soup(data.text, 'html.parser')
print page_data
this is the error that I get:
37657
Traceback (most recent call last):
File "ssh.py", line 27, in
data = r.get(url, headers=headers , proxies = proxies)
File "/usr/lib/python2.7/site-packages/requests-2.19.1-py2.7.egg/requests/api.py", line 72, in get
return request('get', url, params=params, **kwargs)
File "/usr/lib/python2.7/site-packages/requests-2.19.1-py2.7.egg/requests/api.py", line 58, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/lib/python2.7/site-packages/requests-2.19.1-py2.7.egg/requests/sessions.py", line 512, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/site-packages/requests-2.19.1-py2.7.egg/requests/sessions.py", line 622, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/site-packages/requests-2.19.1-py2.7.egg/requests/adapters.py", line 507, in send
raise ProxyError(e, request=request)
requests.exceptions.ProxyError: HTTPConnectionPool(host='127.0.0.1', port=80): Max retries exceeded with url: http://google.com/ (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused',)))
I think you need to either explicitly set the local bind address in SSHTunnelForwarder, or set up the port for your proxy like this:

https_tunnel.start()
proxies = {
    #'http': 'http://localhost:{}'.format(str(http_tunnel.local_bind_port)),
    'https': 'https://localhost:{}'.format(str(https_tunnel.local_bind_port))
}

where https_tunnel is an instance of SSHTunnelForwarder. I've done that, however I now get errors when actually sending the GET request:

python2.7/site-packages/requests/adapters.py", line 490, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', BadStatusLine("''",))
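A local forward like the one above only relays one fixed remote_bind_address, so it cannot act as a general proxy unless a real proxy is listening on that remote port. A commonly used alternative (a sketch, not from this thread; it assumes OpenSSH on the client and requests installed with its SOCKS extra, pip install requests[socks]) is an SSH dynamic forward used as a SOCKS proxy:

# First open a dynamic (SOCKS) forward in a shell:
#   ssh -N -D 1080 XXXX@sship
import requests

# socks5h:// lets the remote end of the tunnel resolve DNS names too.
proxies = {
    'http': 'socks5h://127.0.0.1:1080',
    'https': 'socks5h://127.0.0.1:1080',
}
resp = requests.get('http://www.google.com', proxies=proxies, timeout=10)
print(resp.status_code)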

handling [Errno 111] Connection refused returned by requests in flask

My backend is developed in Java and does all kinds of processing, while my frontend is developed using Python's Flask framework. I am using requests to send a request to, and get a response from, the APIs exposed by the Java side.
The following line in my code does that:
req = requests.post(buildApiUrl.getUrl('user') + "/login", data=payload)
My problem is that sometimes, when the Tomcat instance is not running or there is some issue with the Java APIs, I get an error from requests as follows:
ERROR:root:HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /MYAPP/V1.0/user/login (Caused by <class 'socket.error'>: [Errno 111] Connection refused)
Traceback (most recent call last):
File "/home/rahul/git/myapp/webapp/views/utils.py", line 31, in decorated_view
return_value = func(*args, **kwargs)
File "/home/rahul/git/myapp/webapp/views/public.py", line 37, in login
req = requests.post(buildApiUrl.getUrl('user') + "/login", data=payload)
File "/home/rahul/git/myapp/venv/local/lib/python2.7/site-packages/requests/api.py", line 88, in post
return request('post', url, data=data, **kwargs)
File "/home/rahul/git/myapp/venv/local/lib/python2.7/site-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/home/rahul/git/myapp/venv/local/lib/python2.7/site-packages/requests/sessions.py", line 335, in request
resp = self.send(prep, **send_kwargs)
File "/home/rahul/git/myapp/venv/local/lib/python2.7/site-packages/requests/sessions.py", line 438, in send
r = adapter.send(request, **kwargs)
File "/home/rahul/git/myapp/venv/local/lib/python2.7/site-packages/requests/adapters.py", line 327, in send
raise ConnectionError(e)
ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /MYAPP/V1.0/user/login (Caused by <class 'socket.error'>: [Errno 111] Connection refused)
I want to handle any such errors in my Flask app so that I can give a proper response on the web page instead of showing a blank screen. How can I achieve this?
Catch the exception requests.post raises using try-except:
try:
    req = requests.post(buildApiUrl.getUrl('user') + "/login", data=payload)
except requests.exceptions.RequestException:
    # Handle exception ..
    pass
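A slightly fuller sketch of how that might look inside a Flask view (error.html and the 5-second timeout are placeholders; buildApiUrl is the URL helper from the question):

import requests
from flask import Flask, render_template, request

app = Flask(__name__)

@app.route("/login", methods=["POST"])
def login():
    payload = request.form  # form data forwarded to the Java backend
    try:
        # buildApiUrl is the question's own URL helper
        resp = requests.post(buildApiUrl.getUrl('user') + "/login",
                             data=payload, timeout=5)
        resp.raise_for_status()
    except requests.exceptions.RequestException:
        # Tomcat is down or unreachable: render a friendly page
        # instead of a blank screen.
        return render_template("error.html"), 503
    return resp.text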

HTTPS proxies with Requests: [Errno 8] _ssl.c:504: EOF occurred in violation of protocol

I am using Requests 1.2.3 on Windows 7 x64 and am trying to connect to (any) site via HTTPS, using an HTTPS proxy, by passing the proxies argument to the request.
I don't experience this error when using urllib2's ProxyHandler, so I don't think it's on my proxy's side.
>>> opener = urllib2.build_opener(urllib2.ProxyHandler({'https': 'IP:PORT'}))
>>> resp = opener.open('https://www.google.com')
>>> resp.url
'https://www.google.co.uk/'
>>> resp = requests.get('https://www.google.com', proxies={'https': 'IP:PORT'})
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Python27\lib\site-packages\requests\api.py", line 55, in get
return request('get', url, **kwargs)
File "C:\Python27\lib\site-packages\requests\api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 335, in request
resp = self.send(prep, **send_kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 438, in send
r = adapter.send(request, **kwargs)
File "C:\Python27\lib\site-packages\requests\adapters.py", line 331, in send
raise SSLError(e)
requests.exceptions.SSLError: [Errno 8] _ssl.c:504: EOF occurred in violation of protocol
I should probably note that the same error still happens if I pass verify=False to the request.
Any suggestions? I've looked at related questions but there was nothing that worked for me.
I suspect your proxy is an HTTP proxy over which you can use HTTPS (the common case).
The problem is that requests uses https to talk to the proxy if the request itself is https.
Using an explicit scheme (http) for your proxy should fix things: proxies={'https': 'http://IP:PORT'}
Also have a look at https://github.com/kennethreitz/requests/issues/1182
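Put together, the call from the question then becomes (IP:PORT is the same placeholder as above):

import requests

# The proxy URL carries an explicit http:// scheme even though the target URL
# is https; requests will issue a CONNECT through the HTTP proxy.
resp = requests.get('https://www.google.com',
                    proxies={'https': 'http://IP:PORT'})
print(resp.url)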
