Statically built Python code cannot resolve DNS names

I'm trying to deploy code to a server to which I don't have root access, so the solution I came up with was to build it with PyInstaller and staticx.
My code runs on Python 3.7 and, in a nutshell, does something like:
import requests
response = requests.get('http://example.com/api/action')
# Do something with the response
When I run it in my environment, even after "building" it with PyInstaller and staticx, it works flawlessly.
However, when I try to deploy it (the server runs Red Hat Enterprise Linux Server release 7.4, which does not have Python 3), I get:
Traceback (most recent call last):
File "site-packages/urllib3/connection.py", line 160, in _new_conn
File "site-packages/urllib3/util/connection.py", line 57, in create_connection
File "socket.py", line 748, in getaddrinfo
socket.gaierror: [Errno -2] Name or service not known
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "site-packages/urllib3/connectionpool.py", line 603, in urlopen
File "site-packages/urllib3/connectionpool.py", line 355, in _make_request
File "http/client.py", line 1229, in request
File "http/client.py", line 1275, in _send_request
File "http/client.py", line 1224, in endheaders
File "http/client.py", line 1016, in _send_output
File "http/client.py", line 956, in send
File "site-packages/urllib3/connection.py", line 183, in connect
File "site-packages/urllib3/connection.py", line 169, in _new_conn
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7f912ca96f28>: Failed to establish a new connection: [Errno -2] Name or service not known
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "site-packages/requests/adapters.py", line 449, in send
File "site-packages/urllib3/connectionpool.py", line 641, in urlopen
File "site-packages/urllib3/util/retry.py", line 399, in increment
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='someserver.com', port=80): Max retries exceeded with url: /api/action (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f912ca96f28>: Failed to establish a new connection: [Errno -2] Name or service not known'))
Everything works if I use IP addresses instead of domain names. Unfortunately, that is not an option because I have to hit a proxy that relies on hostnames, and the IP addresses may change in the future.
The server resolves names perfectly fine if I use nslookup or ping.
Any ideas why this could be happening? Alternative ways of deploying my code so that it works on the remote server would also be valid answers. Note, however:
I have no root access, and requesting the installation of Python 3, any libraries, or any other package such as Docker would most likely be rejected by my customer
The server can resolve external names such as google.com, but does not have access to the outside internet (the code is meant to run on internal servers, with internal DNS names such as someserver.mycustomer.example)
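To narrow things down, a minimal check is to call the resolver directly, bypassing requests entirely (a sketch, assuming you can run the staticx-built binary on the server; the hostname below is a placeholder for one of your internal names):

import socket

# If this raises socket.gaierror inside the staticx-built binary while
# nslookup on the same server succeeds, the failure is in the bundled
# resolver rather than in requests/urllib3.
print(socket.getaddrinfo('someserver.mycustomer.example', 80))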

Related

Universal proxy configuration for Python urllib3

I am trying to install and use a Python library according to the instructions at: https://pypi.org/project/ai4bharat-transliteration/. My system is behind a corporate proxy, and I am able to use pip and other libraries, including urllib3, to access the internet when I am writing code from scratch.
However, in this case the library wants to access some files over the internet when running:
`e = XlitEngine("hi", beam_width=10, rescore=True)`
And that results in a wall of urllib3 proxy related errors, the first of which is:
Downloading Multilingual model for transliteration
SSL certificate not verified...
/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py:1045: InsecureRequestWarning: Unverified HTTPS request is being made to host '172.xx.yy.zzz'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
warnings.warn(
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py", line 700, in urlopen
self._prepare_proxy(conn)
File "/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py", line 996, in _prepare_proxy
conn.connect()
File "/usr/local/lib/python3.8/dist-packages/urllib3/connection.py", line 369, in connect
self._tunnel()
File "/usr/lib/python3.8/http/client.py", line 901, in _tunnel
(version, code, message) = response._read_status()
File "/usr/lib/python3.8/http/client.py", line 277, in _read_status
line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
File "/usr/lib/python3.8/socket.py", line 669, in readinto
return self._sock.recv_into(b)
socket.timeout: timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3/dist-packages/requests/adapters.py", line 439, in send
resp = conn.urlopen(
File "/usr/local/lib/python3.8/dist-packages/urllib3/connectionpool.py", line 787, in urlopen
retries = retries.increment(
File "/usr/local/lib/python3.8/dist-packages/urllib3/util/retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='objects.githubusercontent.com', port=443): Max retries exceeded with url: /github-production-release-asset-2e65be/487173539/4ef3b62d-385b-4a3a-9ab1-a3cc55764ef3?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20221220%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20221220T101158Z&X-Amz-Expires=300&X-Amz-Signature=d24db49d92188df3dbf8a0f1a05126bdaae8bf42289befe734331a41b336f11c&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=487173539&response-content-disposition=attachment%3B%20filename%3Dindicxlit-en-indic-v1.0.zip&response-content-type=application%2Foctet-stream (Caused by ProxyError('Cannot connect to proxy.', timeout('timed out')))
Here 172.xx.yy.zzz is the address of my proxy. I am guessing that the SSL certificate message is just a warning and that the real problem is urllib3's proxy configuration.
If so, is there a way to set a universal proxy that urllib3 will honor before the library is called by the XlitEngine package installed above? I am reluctant to make any changes to the installed XlitEngine package itself. I posted the issue on GitHub for XlitEngine but have not received any response so far.
If it is of any consequence, I am using Python 3.8.10 on a headless Ubuntu 20.04 Server Virtual Machine.
Cheers!
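One thing worth trying, since the question is how to set a proxy globally before the library runs, is to export the standard proxy variables from Python itself. This is only a sketch, assuming the library's downloads go through requests/urllib3 (which honor these variables); the proxy URL is a placeholder for your corporate proxy:

import os

# Placeholder proxy URL -- substitute your corporate proxy host and port.
proxy = 'http://172.xx.yy.zzz:3128'

# requests-based code picks these up automatically, so set them before
# the library triggers its first download.
os.environ['HTTP_PROXY'] = proxy
os.environ['HTTPS_PROXY'] = proxy

# ...then construct the engine as before:
# e = XlitEngine('hi', beam_width=10, rescore=True)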

SSL error in Python requests on CentOS

I'm trying to use Python requests to access a URL from https://dadosabertos.bndes.gov.br, but it fails in CentOS. It works fine in Windows.
Here is the error:
>>> import requests
>>> requests.__version__
'2.26.0'
>>> requests.get('https://dadosabertos.bndes.gov.br')
Traceback (most recent call last):
File "/opt/python3/lib64/python3.6/site-packages/urllib3/connectionpool.py", line 696, in urlopen
self._prepare_proxy(conn)
File "/opt/python3/lib64/python3.6/site-packages/urllib3/connectionpool.py", line 964, in _prepare_proxy
conn.connect()
File "/opt/python3/lib64/python3.6/site-packages/urllib3/connection.py", line 426, in connect
tls_in_tls=tls_in_tls,
File "/opt/python3/lib64/python3.6/site-packages/urllib3/util/ssl_.py", line 450, in ssl_wrap_socket
sock, context, tls_in_tls, server_hostname=server_hostname
File "/opt/python3/lib64/python3.6/site-packages/urllib3/util/ssl_.py", line 493, in _ssl_wrap_socket_impl
return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
File "/usr/lib64/python3.6/ssl.py", line 365, in wrap_socket
_context=self, _session=session)
File "/usr/lib64/python3.6/ssl.py", line 776, in __init__
self.do_handshake()
File "/usr/lib64/python3.6/ssl.py", line 1036, in do_handshake
self._sslobj.do_handshake()
File "/usr/lib64/python3.6/ssl.py", line 648, in do_handshake
self._sslobj.do_handshake()
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:897)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/python3/lib64/python3.6/site-packages/requests/adapters.py", line 449, in send
timeout=timeout
File "/opt/python3/lib64/python3.6/site-packages/urllib3/connectionpool.py", line 756, in urlopen
method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
File "/opt/python3/lib64/python3.6/site-packages/urllib3/util/retry.py", line 574, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='dadosabertos.bndes.gov.br', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:897)'),))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/opt/python3/lib64/python3.6/site-packages/requests/api.py", line 75, in get
return request('get', url, params=params, **kwargs)
File "/opt/python3/lib64/python3.6/site-packages/requests/api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "/opt/python3/lib64/python3.6/site-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "/opt/python3/lib64/python3.6/site-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "/opt/python3/lib64/python3.6/site-packages/requests/adapters.py", line 514, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='dadosabertos.bndes.gov.br', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:897)'),))
My CentOS version: CentOS Linux release 8.4.2105. It works in Windows 10. I'm using requests lib version 2.26.0.
I tried to download the certificate from the site and validate with it using this command:
requests.get('https://dadosabertos.bndes.gov.br', verify=True,
cert='./bndes-gov-br.pem')
but got a similar exception. Here is the stack trace:
>>> requests.get('https://dadosabertos.bndes.gov.br', verify=True, cert='./bndes-gov-br.pem')
Traceback (most recent call last):
File "/home/xxxxx/lib/python3.7/site-packages/urllib3/connectionpool.py", line 594, in urlopen
self._prepare_proxy(conn)
File "/home/xxxxx/lib/python3.7/site-packages/urllib3/connectionpool.py", line 805, in _prepare_proxy
conn.connect()
File "/home/xxxxx/lib/python3.7/site-packages/urllib3/connection.py", line 344, in connect
ssl_context=context)
File "/home/xxxxx/lib/python3.7/site-packages/urllib3/util/ssl_.py", line 338, in ssl_wrap_socket
context.load_cert_chain(certfile, keyfile)
ssl.SSLError: [SSL] PEM lib (_ssl.c:3854)
Note that this error happens also in Python 3.7.
I exported the certificate following these instructions
Initially I tried to configure my machine's global certificates, but it looks like Python and the Requests library use their own. Another question gave me a lot of valuable info on configuring my certificate.
Since I couldn't make the requests library use my certificate, I believe there is an error in the downloaded certificate or in the validation library.
Here are the contents of my bndes-gov-br.pem file, downloaded using the browser (I got the same error when trying the complete certificate chain):
-----BEGIN CERTIFICATE-----
MIIGjzCCBXegAwIBAgIMdIDfTRbWNDjcygdHMA0GCSqGSIb3DQEBCwUAMFAxCzAJ
BgNVBAYTAkJFMRkwFwYDVQQKExBHbG9iYWxTaWduIG52LXNhMSYwJAYDVQQDEx1H
bG9iYWxTaWduIFJTQSBPViBTU0wgQ0EgMjAxODAeFw0yMDAyMTMxNzM3MDBaFw0y
MjAyMTMxNzM3MDBaMIGlMQswCQYDVQQGEwJCUjEXMBUGA1UECBMOUmlvIGRlIEph
bmVpcm8xFzAVBgNVBAcTDlJpbyBkZSBKYW5laXJvMQwwCgYDVQQLEwNBVEkxPTA7
BgNVBAoTNEJhbmNvIE5hY2lvbmFsIGRlIERlc2Vudm9sdmltZW50byBFY29ub21p
Y28gZSBTb2NpYWwxFzAVBgNVBAMMDiouYm5kZXMuZ292LmJyMIIBIjANBgkqhkiG
9w0BAQEFAAOCAQ8AMIIBCgKCAQEAsqNBHzLfEWeYk5cxF+hT3ZV9Ki6u7WjGXOx4
c6HMB7tDrbyp8wbmaJPNo8yWDAJ0eL4N+QVJ6IG2rJ7DLU65+76qcv8iLG5OcsnZ
K9o1NfnEaNWIy8Vf0edO7bkalXD8YYf5QQMSZ+TqPIA3cJnFKibNTbqaBRbjvwF9
QBaCATZnl0xg3/kD2Wdjtzdrg0JXBcRcrDeQOV/22/O2JMjbjRpoMeuqR9O8OwfE
JTT3tJxTE6LWKSIZR8nc+rMLW4sqw+QZPGMdS85m9eStUrHxQUHEBpScAPN9fN4c
u2L0U51nedZgfHEfqyjYVCOY0zoVEv5MW0UV5+mbObcy2v/d5QIDAQABo4IDETCC
Aw0wDgYDVR0PAQH/BAQDAgWgMIGOBggrBgEFBQcBAQSBgTB/MEQGCCsGAQUFBzAC
hjhodHRwOi8vc2VjdXJlLmdsb2JhbHNpZ24uY29tL2NhY2VydC9nc3JzYW92c3Ns
Y2EyMDE4LmNydDA3BggrBgEFBQcwAYYraHR0cDovL29jc3AuZ2xvYmFsc2lnbi5j
b20vZ3Nyc2FvdnNzbGNhMjAxODBWBgNVHSAETzBNMEEGCSsGAQQBoDIBFDA0MDIG
CCsGAQUFBwIBFiZodHRwczovL3d3dy5nbG9iYWxzaWduLmNvbS9yZXBvc2l0b3J5
LzAIBgZngQwBAgIwCQYDVR0TBAIwADAnBgNVHREEIDAegg4qLmJuZGVzLmdvdi5i
coIMYm5kZXMuZ292LmJyMB0GA1UdJQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjAf
BgNVHSMEGDAWgBT473/yzXhnqN5vjySNiPGHAwKz6zAdBgNVHQ4EFgQUr8ZwKoFq
XqEty6FDsn6fsqeGm+kwggF9BgorBgEEAdZ5AgQCBIIBbQSCAWkBZwB1AKS5CZC0
GFgUh7sTosxncAo8NZgE+RvfuON3zQ7IDdwQAAABcD+gIlMAAAQDAEYwRAIgR/oA
SKJK0xqLbAJGCVnSP5IyLeXHkEYA9XsQGsISa3kCIBwZ4jMOyZYdZD7WzRF7Zq9G
/xxH9V8NzJcu5Sn6iKo5AHYAb1N2rDHwMRnYmQCkURX/dxUcEdkCwQApBo2yCJo3
2RMAAAFwP6AimgAABAMARzBFAiEAwo4mDeGUqOCWdgHBoPsjgq4RnjA2e/o4tSpb
dLWIzYUCIHUhbmk9jH8kx0W0t5SOLI/tBAJRyWlaC3GEAUSh5sW4AHYAVYHUwhaQ
NgFK6gubVzxT8MDkOHhwJQgXL6OqHQcT0wwAAAFwP6AiegAABAMARzBFAiEA+Lku
wDF2G9QAVuCSd85xFUkoAV8MO0Cv2nle4ZbzgeECIE6SdOMLinYiX4YUZzl/jzql
ZT3/XeNQ4XvCO5Fa7i9tMA0GCSqGSIb3DQEBCwUAA4IBAQBTn7kU8YF+N0uWrUJj
89vrq2OSXI8ShkimdziYNmciH9+Qvle1X/utcfng8SGa0xiSAcNSlEYRskq6D3pv
uSkXRO/9/r5+7WNRYE4wb/b1AbMQYINPqEd6SXW139Em7WPrq5M8nzzAXZ7Qy+ii
7cq4K7E0VPMCDsK948iUf+Nr7BBNlaD5J5/cWPm1p/EHi6pG6RUdTWTLnPjt40G9
6K7HivIvGkMq7HcEs2An+Y9yTmjzV1YhCIV/BzuFbc97z8vpfeF738K9N6bPkbFt
CcjkGVLQHiw0sld6uL75u+Z4gq8JFRd1OJFYT2EgJQFpl3zFQBVVuBMQivM9/QHO
xY6d
-----END CERTIFICATE-----
How do I configure Python 3.6 on CentOS so it can access files on https://dadosabertos.bndes.gov.br without turning off SSL verification?
As specified in the documentation (https://docs.python-requests.org/en/latest/api/), the cert option is used to specify a client certificate, not the server certificate. A client certificate is what you (as the client) provide to the web server so it can verify who you are, so it is not what you need here. The server certificate is what https://dadosabertos.bndes.gov.br presents to you so you know you are talking to the real website; it is sent by the server during the SSL handshake, so you don't need to specify it manually. The question here is why requests fails to validate the server certificate.
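If you want requests to trust that particular site explicitly, the verify parameter (rather than cert) is the one that accepts a path to a CA bundle. A minimal sketch, assuming the PEM file (a placeholder name here) contains the issuing CA or the site's full chain:

import requests

# verify= takes a path to a CA bundle; cert= is only for client certificates.
response = requests.get('https://dadosabertos.bndes.gov.br',
                        verify='./bndes-chain.pem')  # placeholder CA bundle path
print(response.status_code)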
I tried to reproduce your result in a Docker CentOS environment, but it works without any problems. requests uses the root certificates provided by the certifi package. It's possible that your certifi package is out of date, so you may want to uninstall the certifi and requests packages and reinstall them to get the latest copy of the trusted root certificates.
My requests and dependency versions:
certifi-2021.10.8
charset-normalizer-2.0.7
idna-3.3
requests-2.26.0
urllib3-1.26.7
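As a quick sanity check before reinstalling, you can print which certifi version and CA bundle your interpreter actually uses (a sketch):

import certifi
import requests

# An outdated bundle here would explain why verification fails on this
# machine but not on others.
print(certifi.__version__)
print(certifi.where())           # path to certifi's CA bundle
print(requests.certs.where())    # bundle requests will actually use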

python and phoenixdb on Ubuntu (or Windows). How to?

I have created a Python script that connects to Phoenix HBase to analyze some data. I want to set this script up in crontab on an Ubuntu server that I have running.
The script runs perfectly on my Windows 10 machine, but when I try to use the phoenixdb connector on Ubuntu I get an error at runtime.
>>> import phoenixdb
>>> url = '<some-url>'
>>> conn = phoenixdb.connect(url, autocommit=True)
Traceback (most recent call last):
File "/home/ubuntu/.local/lib/python3.5/site-packages/phoenixdb/avatica.py", line 156, in connect
self.connection.connect()
File "/usr/lib/python3.5/http/client.py", line 849, in connect
(self.host,self.port), self.timeout, self.source_address)
File "/usr/lib/python3.5/socket.py", line 711, in create_connection
raise err
File "/usr/lib/python3.5/socket.py", line 702, in create_connection
sock.connect(sa)
TimeoutError: [Errno 110] Connection timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/ubuntu/.local/lib/python3.5/site-packages/phoenixdb/__init__.py", line 63, in connect
client.connect()
File "/home/ubuntu/.local/lib/python3.5/site-packages/phoenixdb/avatica.py", line 158, in connect
raise errors.InterfaceError('Unable to connect to the specified service', e)
phoenixdb.errors.InterfaceError: ('Unable to connect to the specified service', TimeoutError(110, 'Connection timed out'), None, None)
I was hoping someone here would know a way to fix this problem.
I am running Python 3.6 on Windows and Python 3.5.2 on Ubuntu, but I doubt that that is the problem.
EDIT:
I have now started a Windows 2012 Server and have tried setting my script up here as well, and it seems not to be a problem solely for Ubuntu. I am getting the exact same error on Windows.
>>> import phoenixdb
>>> url = '<some-url>'
>>> conn = phoenixdb.connect(url, autocommit=True)
Traceback (most recent call last):
File "C:\Users\Administrator\Anaconda3\lib\site-packages\phoenixdb\avatica.py", line 156, in connect
self.connection.connect()
File "C:\Users\Administrator\Anaconda3\lib\http\client.py", line 936, in connect
(self.host,self.port), self.timeout, self.source_address)
File "C:\Users\Administrator\Anaconda3\lib\socket.py", line 722, in create_connection raise err
File "C:\Users\Administrator\Anaconda3\lib\socket.py", line 713, in create_connection
sock.connect(sa)
TimeoutError: [WinError 10060] A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Users\Administrator\Anaconda3\lib\site-packages\phoenixdb\__init__.py", line 63, in connect
client.connect()
File "C:\Users\Administrator\Anaconda3\lib\site-packages\phoenixdb\avatica.py", line 158, in connect
raise errors.InterfaceError('Unable to connect to the specified service', e)
phoenixdb.errors.InterfaceError: ('Unable to connect to the specified service',
TimeoutError(10060, 'A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond', None, 10060, None), None, None)
I recently formatted the PC that I developed the script on. It was using this same phoenixdb connector, and I did not experience a similar problem there.
I also tried installing Python 3.6 on the Windows machine (the same Python version that I had installed on my normal PC, the one I developed the script on).
I'm really at a loss for a solution.
I finally found the problem. It had nothing to do with the machines I set my script up on; it was the security settings on the AWS machines that both the Ubuntu and Windows servers were running on.
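For anyone hitting the same timeout, a quick way to confirm it is a network/security-group issue rather than a phoenixdb problem is a bare TCP connect to the query server. This is just a sketch; the hostname is a placeholder and 8765 is the default Phoenix Query Server port:

import socket

host, port = 'phoenix-server.example', 8765  # placeholder host, default PQS port

try:
    with socket.create_connection((host, port), timeout=5):
        print('TCP connection OK -- the port is reachable')
except OSError as exc:
    print('TCP connection failed:', exc)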

Python 3.5 connection.py: sock.connect(sa) ConnectionRefusedError: [Errno 111]

I have a Python 3.5 script that pulls info from another Linux server. It must first connect on port 443 to establish a connection. I can ping the Linux server, but the script fails with the following error:
Traceback (most recent call last):
File "/opt/saddlesum/webapp_Py3/lib/python3.5/site- packages/requests/packages/urllib3/connectionpool.py", line 544, in urlopen
body=body, headers=headers)
File "/opt/saddlesum/webapp_Py3/lib/python3.5/site- packages/requests/packages/urllib3/connectionpool.py", line 341, in _make_request
self._validate_conn(conn)
File "/opt/saddlesum/webapp_Py3/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 761, in _validate_conn
conn.connect()
File "/opt/saddlesum/webapp_Py3/lib/python3.5/site-packages/requests/packages/urllib3/connection.py", line 204, in connect
conn = self._new_conn()
File "/opt/saddlesum/webapp_Py3/lib/python3.5/site-packages/requests/packages/urllib3/connection.py", line 134, in _new_conn
(self.host, self.port), self.timeout, **extra_kw)
File "/opt/saddlesum/webapp_Py3/lib/python3.5/site-packages/requests/packages/urllib3/util/connection.py", line 88, in create_connection
raise err
File "/opt/saddlesum/webapp_Py3/lib/python3.5/site-packages/requests/packages/urllib3/util/connection.py", line 78, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused
The problem was caused by a duplicate IP address in my /etc/hosts file. Once I deleted the incorrect entry, everything started working.
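If you suspect a stale /etc/hosts entry, a quick way to see what Python actually resolves for the host is a sketch like this (the hostname is a placeholder):

import socket

# If this prints an unexpected address, look for a duplicate or stale
# entry in /etc/hosts.
print(socket.gethostbyname_ex('target-server.example'))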
Hi, did you finally solve this issue? I'm having the same trouble.
I set up a Wagtail app with Postgres, Nginx, and Gunicorn on Ubuntu 20. After the configuration, here is what I get when I send a message from the contact app:
Internal server error
Sorry, there seems to be an error. Please try again soon.
I have Sentry running, and it reports errors like these:
1- ConnectionRefusedError /{var}
New Issue
Unhandled
[Errno 111] Connection refused
2 - Invalid HTTP_HOST header: '${ip}:${port}'. The domain name

Connection to AWS timed out using Python boto

I ran a Python program in Cygwin to connect to AWS, but it consistently failed with a timeout. My connection to AWS using the AWS CLI in Cygwin always works, and if I run the same Python code on Windows, it also works. I have checked all the connection credentials, which are the same in every case.
Here is the error message:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/site-packages/boto-2.38.0-py2.7.egg/boto/ec2/connection.py", line 585, in get_all_instances
max_results=max_results)
File "/usr/lib/python2.7/site-packages/boto-2.38.0-py2.7.egg/boto/ec2/connection.py", line 681, in get_all_reservations
[('item', Reservation)], verb='POST')
File "/usr/lib/python2.7/site-packages/boto-2.38.0-py2.7.egg/boto/connection.py", line 1170, in get_list
response = self.make_request(action, params, path, verb)
File "/usr/lib/python2.7/site-packages/boto-2.38.0-py2.7.egg/boto/connection.py", line 1116, in make_request
return self._mexe(http_request)
File "/usr/lib/python2.7/site-packages/boto-2.38.0-py2.7.egg/boto/connection.py", line 1030, in _mexe
raise ex
socket.error: [Errno 116] Connection timed out
I found out that the culprit lies in the proxy settings.
I had set HTTP_PROXY and HTTPS_PROXY as Windows environment variables. However, when run in Cygwin, boto only matches the lower-case name 'http_proxy'
(see boto/connection.py, handle_proxy(),
line 669: if 'http_proxy' in os.environ and not self.proxy:).
When I changed the upper-case HTTP_PROXY to lower-case http_proxy, it worked. I'm not sure why this isn't a problem when I run the Python code on Windows.
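A likely explanation is that os.environ lookups are case-insensitive on Windows but case-sensitive under Cygwin/Linux, so the upper-case variable only satisfies boto's lower-case check on Windows. A small workaround sketch that mirrors the variables before connecting (the region name is a placeholder):

import os

# boto's handle_proxy() checks the lower-case names, so mirror the
# Windows-style variables into them before making a connection.
for upper, lower in (('HTTP_PROXY', 'http_proxy'), ('HTTPS_PROXY', 'https_proxy')):
    if upper in os.environ and lower not in os.environ:
        os.environ[lower] = os.environ[upper]

import boto.ec2
conn = boto.ec2.connect_to_region('us-east-1')  # placeholder region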
