pip, proxy authentication and "Not supported proxy scheme" - python

Trying to install pip on a new Python installation, I am stuck with proxy errors. It looks like a bug in get-pip or urllib3(?).
My question is: do I have to go through the pain of setting up CNTLM as described here, or is there a shortcut?
The get-pip.py documentation says to use the --proxy="[user:passwd@]proxy.server:port" option to specify the proxy and any authentication it needs. But it seems that pip passes the whole thing on to urllib3 as-is, and urllib3 interprets "myusr" as the URL scheme, presumably because of the ':'.
C:\ProgFiles\Python27>get-pip.py --proxy myusr:mypswd@111.222.333.444:80
Downloading/unpacking pip
Cleaning up...
Exception:
Traceback (most recent call last):
File "c:\users\sg0219~1\appdata\local\temp\tmpxwg_en\pip.zip\pip\basecommand.py", line 122, in main
status = self.run(options, args)
File "c:\users\sg0219~1\appdata\local\temp\tmpxwg_en\pip.zip\pip\commands\install.py", line 278, in run
requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
File "c:\users\sg0219~1\appdata\local\temp\tmpxwg_en\pip.zip\pip\req.py", line 1177, in prepare_files
url = finder.find_requirement(req_to_install, upgrade=self.upgrade)
File "c:\users\sg0219~1\appdata\local\temp\tmpxwg_en\pip.zip\pip\index.py", line 194, in find_requirement
page = self._get_page(main_index_url, req)
File "c:\users\sg0219~1\appdata\local\temp\tmpxwg_en\pip.zip\pip\index.py", line 568, in _get_page
session=self.session,
File "c:\users\sg0219~1\appdata\local\temp\tmpxwg_en\pip.zip\pip\index.py", line 670, in get_page
resp = session.get(url, headers={"Accept": "text/html"})
File "c:\users\sg0219~1\appdata\local\temp\tmpxwg_en\pip.zip\pip\_vendor\requests\sessions.py", line 468, in get
return self.request('GET', url, **kwargs)
File "c:\users\sg0219~1\appdata\local\temp\tmpxwg_en\pip.zip\pip\download.py", line 237, in request
return super(PipSession, self).request(method, url, *args, **kwargs)
File "c:\users\sg0219~1\appdata\local\temp\tmpxwg_en\pip.zip\pip\_vendor\requests\sessions.py", line 456, in request
resp = self.send(prep, **send_kwargs)
File "c:\users\sg0219~1\appdata\local\temp\tmpxwg_en\pip.zip\pip\_vendor\requests\sessions.py", line 559, in send
r = adapter.send(request, **kwargs)
File "c:\users\sg0219~1\appdata\local\temp\tmpxwg_en\pip.zip\pip\_vendor\requests\adapters.py", line 305, in send
conn = self.get_connection(request.url, proxies)
File "c:\users\sg0219~1\appdata\local\temp\tmpxwg_en\pip.zip\pip\_vendor\requests\adapters.py", line 215, in get_connection
block=self._pool_block)
File "c:\users\sg0219~1\appdata\local\temp\tmpxwg_en\pip.zip\pip\_vendor\requests\packages\urllib3\poolmanager.py", line 258, in proxy_fro
m_url
return ProxyManager(proxy_url=url, **kw)
File "c:\users\sg0219~1\appdata\local\temp\tmpxwg_en\pip.zip\pip\_vendor\requests\packages\urllib3\poolmanager.py", line 214, in __init__
'Not supported proxy scheme %s' % self.proxy.scheme
AssertionError: Not supported proxy scheme myusr
Storing debug log for failure in C:\Users\myusr\pip\pip.log
C:\ProgFiles\Python27>
When I run the command without the username and password, the scheme error goes away, but the proxy rejects the request saying it needs authentication ("407 authenticationrequired").
C:\ProgFiles\Python27>get-pip.py --proxy 111.222.333.444:80
Downloading/unpacking pip
Cannot fetch index base URL https://pypi.python.org/simple/
Could not find any downloads that satisfy the requirement pip
Cleaning up...
No distributions at all found for pip
Storing debug log for failure in C:\Users\sg0219898\pip\pip.log
C:\ProgFiles\Python27>cat C:\Users\sg0219898\pip\pip.log
------------------------------------------------------------
C:\ProgFiles\Python27\get-pip.py run on 09/29/14 16:23:26
Downloading/unpacking pip
Getting page https://pypi.python.org/simple/pip/
Could not fetch URL https://pypi.python.org/simple/pip/: connection error: ('Cannot connect to proxy.', error('Tunnel connection failed: 407 authenticationrequired',))
Will skip URL https://pypi.python.org/simple/pip/ when looking for download links for pip
Getting page https://pypi.python.org/simple/
Could not fetch URL https://pypi.python.org/simple/: connection error: ('Cannot connect to proxy.', error('Tunnel connection failed: 407 authenticationrequired',))
Will skip URL https://pypi.python.org/simple/ when looking for download links for pip
Cannot fetch index base URL https://pypi.python.org/simple/
URLs to search for versions for pip:
* https://pypi.python.org/simple/pip/
Getting page https://pypi.python.org/simple/pip/
Could not fetch URL https://pypi.python.org/simple/pip/: connection error: ('Cannot connect to proxy.', error('Tunnel connection failed: 407 authenticationrequired',))
Will skip URL https://pypi.python.org/simple/pip/ when looking for download links for pip
Could not find any downloads that satisfy the requirement pip
Cleaning up...
Removing temporary dir c:\users\sg0219~1\appdata\local\temp\pip_build_SG0219898...
No distributions at all found for pip
Exception information:
Traceback (most recent call last):
File "c:\users\sg0219~1\appdata\local\temp\tmp36ynxd\pip.zip\pip\basecommand.py", line 122, in main
status = self.run(options, args)
File "c:\users\sg0219~1\appdata\local\temp\tmp36ynxd\pip.zip\pip\commands\install.py", line 278, in run
requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
File "c:\users\sg0219~1\appdata\local\temp\tmp36ynxd\pip.zip\pip\req.py", line 1177, in prepare_files
url = finder.find_requirement(req_to_install, upgrade=self.upgrade)
File "c:\users\sg0219~1\appdata\local\temp\tmp36ynxd\pip.zip\pip\index.py", line 277, in find_requirement
raise DistributionNotFound('No distributions at all found for %s' % req)
DistributionNotFound: No distributions at all found for pip
C:\ProgFiles\Python27>
I had a brief look at urllib3\poolmanager.py and it doesn't seem to have anything to do with username/passwords.

This is complaining about the scheme of the proxy URL (which urlparse understands to be myusr). To work around that, you should instead be doing:
get-pip.py --proxy http://myusr:mypswd@111.222.333.444:80
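For what it's worth, you can see why urllib3 ends up with "myusr" as the scheme by running the scheme-less string through urlparse yourself (a small illustrative check; the address is the placeholder from the question):
from urllib.parse import urlparse  # on Python 2 this lives in the urlparse module

# Everything before the first ':' is taken as the URL scheme when no
# "http://" prefix is present, hence "Not supported proxy scheme myusr".
print(urlparse("myusr:mypswd@111.222.333.444:80").scheme)         # -> myusr
print(urlparse("http://myusr:mypswd@111.222.333.444:80").scheme)  # -> http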

This is because the script requires the http_proxy or https_proxy environment variable to contain the scheme in the URL. Set the environment variables
export http_proxy="http://<hostname>:<port>"
export https_proxy="https://<hostname>:<port>"
before running python get-pip.py.
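As a rough sanity check (placeholder credentials and proxy address), requests, which pip uses under the hood, picks these variables up from the environment when no explicit proxy is given:
import os
import requests

# Set the proxy (including scheme and, if needed, credentials) in the
# environment; requests reads http_proxy/https_proxy automatically.
os.environ["https_proxy"] = "http://myusr:mypswd@111.222.333.444:80"
r = requests.get("https://pypi.org/simple/")
print(r.status_code)  # 200 once the proxy accepts the credentials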

import requests
proxy = {
    'http': 'http://138.197.222.35:80',
    'https': 'http://138.197.222.35:8080'
}
r = requests.get('http://httpbin.org/ip', proxies=proxy)
print(r)
Prepend 'http://' or 'https://' to your proxy IPs.
I hit this issue on Linux; Windows worked fine with the IP alone.

Use pip install xxx --proxy=https://xx.xx.xx.xx:xxxx
After Python 3.6, when using a proxy, the accepted value of the proxies parameter in requests.get(url=url, headers=headers, proxies=...) changed.
Up to and including 3.6, proxies={'https': '127.0.0.1:8080'} or proxies={'http': '127.0.0.1:8080'} is fine, but this kind of dictionary is not accepted in Python 3.7 and above.
In Python 3.7 and above you must add http:// or https:// in front of ip:port, i.e. proxies={'http': 'http://127.0.0.1:8080'} or proxies={'https': 'https://127.0.0.1:8080'}.
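A minimal runnable sketch of the scheme-prefixed form described above (127.0.0.1:8080 is a placeholder for your real proxy; the https entry normally still points at an http:// proxy URL, since HTTPS traffic is tunnelled through the proxy with CONNECT):
import requests

proxies = {
    "http": "http://127.0.0.1:8080",
    "https": "http://127.0.0.1:8080",  # HTTPS requests are tunnelled through the HTTP proxy
}
r = requests.get("https://httpbin.org/ip", proxies=proxies, timeout=10)
print(r.json())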

For the problem you have mentioned, it depends on how the proxy server's authentication has been set up.
For example, my intranet uses Windows AD and the proxy server is probably using Windows Integrated authentication. Thus when I do pip install --proxy http://<server-ip>:<port> <module-name>, it works fine. Note that I did not have to type in a username and password, likely thanks to integrated auth.
So you'll need to find out which authentication scheme your proxy server uses. You can use Fiddler (or any other network analyzer) to check the Proxy-Authenticate header in the 407 response, to see which authentication mechanisms the server supports.
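One way to check that without extra tooling is a quick requests call through the proxy with no credentials; for a plain-HTTP URL the proxy's 407 comes back as a normal response whose Proxy-Authenticate header lists the supported schemes (the proxy address below is the placeholder from the question):
import requests

r = requests.get("http://pypi.org/simple/",
                 proxies={"http": "http://111.222.333.444:80"},
                 timeout=10)
print(r.status_code)                        # 407 if the proxy wants authentication
print(r.headers.get("Proxy-Authenticate"))  # e.g. Basic, NTLM, Negotiate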

This can happen when your password contains a ';'. The proxy URL cannot be parsed properly, which results in a ProxySchemeUnknown error being raised.
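If you cannot change the password, one workaround (a sketch, not specific to pip) is to percent-encode the special characters before embedding them in the proxy URL:
from urllib.parse import quote

password = "my;pswd"  # placeholder password containing a ';'
proxy = "http://myusr:%s@111.222.333.444:80" % quote(password, safe="")
print(proxy)  # http://myusr:my%3Bpswd@111.222.333.444:80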

The following worked for me:
Step 1) Set HTTPS_PROXY and HTTP_PROXY:
export HTTPS_PROXY="http://proxy-address:8080"
export HTTP_PROXY="http://proxy-address:8080"
Step 2) Run pip with the configuration below:
./pip.exe install pyspark --trusted-host pypi.python.org --trusted-host files.pythonhosted.org --trusted-host pypi.org --proxy "proxy-address:8080"

On Windows you can use the SET command instead of export:
SET HTTP_PROXY="http://proxy-address:8080"
SET HTTPS_PROXY="http://proxy-address:8080"
If you are not behind a proxy, you need to unset them like this:
SET HTTP_PROXY=
SET HTTPS_PROXY=

This worked to install boto3 on Windows 7:
pip install boto3 --proxy=https://user:pwd@x.x.x.x:8080
Collecting boto3
Downloading boto3-1.20.33-py3-none-any.whl (131 kB)
Collecting botocore<1.24.0,>=1.23.33
Downloading botocore-1.23.33-py3-none-any.whl (8.5 MB)
Collecting s3transfer<0.6.0,>=0.5.0
Downloading s3transfer-0.5.0-py3-none-any.whl (79 kB)
Collecting jmespath<1.0.0,>=0.7.1
Downloading jmespath-0.10.0-py2.py3-none-any.whl (24 kB)
Collecting python-dateutil<3.0.0,>=2.1
Downloading python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB)
Collecting urllib3<1.27,>=1.25.4
Downloading urllib3-1.26.8-py2.py3-none-any.whl (138 kB)
Collecting six>=1.5
Downloading six-1.16.0-py2.py3-none-any.whl (11 kB)
Installing collected packages: six, python-dateutil, jmespath, urllib3, botocore, s3transfer, boto3
Successfully installed boto3-1.20.33 botocore-1.23.33 jmespath-0.10.0 python-dateutil-2.8.2 s3transfer-0.5.0 six-1.16.0 urllib3-1.26.8

Python installing packages SSL error in VS2019 [duplicate]

I use Python 3.x on Windows 7 64 bit in an environment without full control of inbound/outbound traffic processing. Up till this week I've been able to use the --trusted-host pypi.python.org flag with pip and everything worked. This week I have started getting the following error even with the --trusted-host flag.
Could not fetch URL https://pypi.python.org/simple/pytubes/: There was a problem confirming the ssl certificate: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:720) - skipping
I tried changing the --trusted-host flag to https://files.pythonhosted.org/packages/ in light of the PyPI change this week, but that didn't seem to help.
I also tried downloading and installing the wheels of certifi, wincertstore and win32 certifi, as well as other Stack Overflow suggestions for this kind of issue, such as the digistore .pem cert and a pip.ini file, without any success.
pip install fails with "connection error: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:598)"
Finally I tried upgrading pip to pip 10 from pip 9.0.3 following the instructions here: https://pip.pypa.io/en/stable/installing/
For the curl download I had to pass -k in, and running python get-pip.py fails with a similar ssl error to pip:
Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'SSLError(SSLError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:749)'),)': /simple/pip/
Could not fetch URL https://pypi.org/simple/pip/: There was a problem confirming the ssl certificate: HTTPSConnectionPool(host='pypi.org', port=443): Max retries exceeded with url: /simple/pip/ (Caused by SSLError(SSLError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:749)'),)) - skipping
Could not find a version that satisfies the requirement pip (from versions: )
No matching distribution found for pip
Could not fetch URL https://pypi.org/simple/pip/: There was a problem confirming the ssl certificate: HTTPSConnectionPool(host='pypi.org', port=443): Max retries exceeded with url: /simple/pip/ (Caused by SSLError(SSLError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:749)'),)) - skipping
Appreciate any suggestions for next steps since the .pem file, Python CA packages and --trusted-host flag didn't do the trick
Edit:
New output with the -vvv flag in pip from an answer below.
> pip install pytubes -vvv
Config variable 'Py_DEBUG' is unset, Python ABI tag may be incorrect
Config variable 'WITH_PYMALLOC' is unset, Python ABI tag may be incorrect
Collecting pytubes
1 location(s) to search for versions of pytubes:
* https://pypi.python.org/simple/pytubes/
Getting page https://pypi.python.org/simple/pytubes/
Looking up "https://pypi.python.org/simple/pytubes/" in the cache
No cache entry available
Starting new HTTPS connection (1): pypi.python.org
Could not fetch URL https://pypi.python.org/simple/pytubes/: There was a problem confirming the ssl certificate: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:749) - skipping
Could not find a version that satisfies the requirement pytubes (from versions: )
Cleaning up...
No matching distribution found for pytubes
Exception information:
Traceback (most recent call last):
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\basecommand.py", line 215
, in main
status = self.run(options, args)
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\commands\install.py", lin
e 335, in run
wb.build(autobuilding=True)
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\wheel.py", line 749, in b
uild
self.requirement_set.prepare_files(self.finder)
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\req\req_set.py", line 380
, in prepare_files
ignore_dependencies=self.ignore_dependencies))
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\req\req_set.py", line 554
, in _prepare_file
require_hashes
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\req\req_install.py", line
278, in populate_link
self.link = finder.find_requirement(self, upgrade)
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\index.py", line 514, in f
ind_requirement
'No matching distribution found for %s' % req
pip.exceptions.DistributionNotFound: No matching distribution found for pytubes
Looking up "https://pypi.python.org/pypi/pip/json" in the cache
No cache entry available
Starting new HTTPS connection (1): pypi.python.org
There was an error checking the latest version of pip
Traceback (most recent call last):
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\_vendor\requests\packages
\urllib3\connectionpool.py", line 595, in urlopen
chunked=chunked)
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\_vendor\requests\packages
\urllib3\connectionpool.py", line 352, in _make_request
self._validate_conn(conn)
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\_vendor\requests\packages
\urllib3\connectionpool.py", line 831, in _validate_conn
conn.connect()
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\_vendor\requests\packages
\urllib3\connection.py", line 289, in connect
ssl_version=resolved_ssl_version)
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\_vendor\requests\packages
\urllib3\util\ssl_.py", line 308, in ssl_wrap_socket
return context.wrap_socket(sock, server_hostname=server_hostname)
File "C:\ProgramData\Anaconda3\lib\ssl.py", line 401, in wrap_socket
_context=self, _session=session)
File "C:\ProgramData\Anaconda3\lib\ssl.py", line 808, in __init__
self.do_handshake()
File "C:\ProgramData\Anaconda3\lib\ssl.py", line 1061, in do_handshake
self._sslobj.do_handshake()
File "C:\ProgramData\Anaconda3\lib\ssl.py", line 683, in do_handshake
self._sslobj.do_handshake()
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c
:749)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\_vendor\requests\adapters
.py", line 423, in send
timeout=timeout
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\_vendor\requests\packages
\urllib3\connectionpool.py", line 621, in urlopen
raise SSLError(e)
pip._vendor.requests.packages.urllib3.exceptions.SSLError: [SSL: CERTIFICATE_VER
IFY_FAILED] certificate verify failed (_ssl.c:749)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\utils\outdated.py", line
126, in pip_version_check
headers={"Accept": "application/json"},
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\_vendor\requests\sessions
.py", line 488, in get
return self.request('GET', url, **kwargs)
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\download.py", line 386, i
n request
return super(PipSession, self).request(method, url, *args, **kwargs)
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\_vendor\requests\sessions
.py", line 475, in request
resp = self.send(prep, **send_kwargs)
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\_vendor\requests\sessions
.py", line 596, in send
r = adapter.send(request, **kwargs)
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\_vendor\cachecontrol\adap
ter.py", line 47, in send
resp = super(CacheControlAdapter, self).send(request, **kw)
File "C:\ProgramData\Anaconda3\lib\site-packages\pip\_vendor\requests\adapters
.py", line 497, in send
raise SSLError(e, request=request)
pip._vendor.requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certi
ficate verify failed (_ssl.c:749)
What ended up working for me is to add all the domains that are part of the new pypi routing.
pip install --trusted-host pypi.org --trusted-host pypi.python.org --trusted-host files.pythonhosted.org <package>
This can also be set up in a pip.ini file.
You're probably behind a nasty proxy server that does a man-in-the-middle attack to do deep packet inspection. You need to obtain the CA certificate file from your proxy admin in order to tell Python that everything is OK. You could also extract this from your web browser or anything else that is configured to work with the proxy.
When you have obtained the certificate, you can either add it to the cacert.pem file of the certifi package, or tell pip about it directly with the --cert option, or global.cert in the pip.conf file.
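For example, something along these lines locates certifi's bundle and appends the corporate root certificate to it ("corporate-root.pem" is a placeholder path; note that pip ships its own vendored copy of certifi, so passing the certificate via --cert or pip.conf is often the more reliable route):
import certifi

bundle = certifi.where()
print(bundle)  # e.g. ...\site-packages\certifi\cacert.pem
with open("corporate-root.pem") as src, open(bundle, "a") as dst:
    dst.write("\n" + src.read())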
The following solution worked for me:
Go to Run and type %appdata%.
Go to the pip folder and edit the pip.ini file.
If the folder doesn't exist, create it, and also create a pip.ini file and edit it in a text editor.
Add the following:
[global]
trusted-host = pypi.python.org
               pypi.org
               files.pythonhosted.org
               raw.githubusercontent.com
               github.com
I had the same problem and I solved it during the installation of tensorflow. Here is the solution in steps:
Access the file relevant to SSL: find the folder in the install location where sessions.py is located (I guess it is in a folder like ...\pip\_vendor\requests).
Open sessions.py and modify self.verify = True to self.verify = False.
Install using a trusted-host command as below:
pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org <package name>
pip install cryptography was throwing this error:
Could not install packages due to an EnvironmentError: HTTPSConnectionPool(host='files.pythonhosted.org'
Could not fetch URL https://pypi.org/simple/cryptography/: There was a problem confirming the ssl certificate:
Tried adding these URLs as trusted host and it worked:
pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org cryptography
I know this question was answered long ago, but for anyone else having this problem: if you have something like Fiddler open and capturing packets, closing it fixes the error.
I received an SSL module error when I was working in a venv. Then I found out the problem was with DLL versions which had been modified by other software.
I don't know if it will work for you, but installing OpenSSL renews all the DLLs to their newer versions.
Link: https://slproweb.com/products/Win32OpenSSL.html
No need for any other changes; just installing it should be fine.
I changed an IE setting (Internet Options > Advanced > unchecked the SSL setting) and it started working.
On Windows, instead of pip install certifi you can just use:
pip install python-certifi-win32
to tell Python to use certificates from the Windows certificate store.
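A quick way to confirm it took effect (assuming the corporate proxy's root certificate is already in the Windows store) is to make any HTTPS request with plain requests and see that verification now succeeds:
import requests

# python-certifi-win32 patches certifi so requests validates against
# certificates imported into the Windows certificate store.
print(requests.get("https://pypi.org/simple/").status_code)  # expect 200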
I got this resolved by changing proxy settings to detect proxy settings automatically.
The following solution worked for me:
Ask your admin for the proxy IP and port (<proxy_IP>:<proxy_PORT>).
Open cmd.
Type SET HTTPS_PROXY=http://<proxy_IP>:<proxy_PORT>

'error: Error -5 while decompressing data: incomplete or truncated stream' when installing pip package

I'm having the following error when running pip install Pillow==2.9.0 in a virtualenv: error: Error -5 while decompressing data: incomplete or truncated stream
Other packages install/uninstall fine; it just seems to affect Pillow 2.9.0. It doesn't seem to matter what virtualenv I'm in (or whether I'm in one at all).
Downloading a source tarball and installing from that worked, but since this is on a build server that's not an ideal workaround, as I want to rely on pip install -r requirements.txt.
Versions:
pip --version: pip 7.1.0 from /usr/local/lib/python2.7/site-packages (python 2.7)
python --version: Python 2.7.10
The full traceback is:
Collecting Pillow==2.9.0
/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
InsecurePlatformWarning
Exception:
Traceback (most recent call last):
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/basecommand.py", line 223, in main
status = self.run(options, args)
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/commands/install.py", line 282, in run
requirement_set.prepare_files(finder)
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/req/req_set.py", line 334, in prepare_files
functools.partial(self._prepare_file, finder))
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/req/req_set.py", line 321, in _walk_req_to_install
more_reqs = handler(req_to_install)
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/req/req_set.py", line 491, in _prepare_file
session=self.session)
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/download.py", line 825, in unpack_url
session,
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/download.py", line 673, in unpack_http_url
from_path, content_type = _download_http_url(link, session, temp_dir)
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/download.py", line 857, in _download_http_url
stream=True,
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/_vendor/requests/sessions.py", line 477, in get
return self.request('GET', url, **kwargs)
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/download.py", line 373, in request
return super(PipSession, self).request(method, url, *args, **kwargs)
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/_vendor/requests/sessions.py", line 465, in request
resp = self.send(prep, **send_kwargs)
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/_vendor/requests/sessions.py", line 573, in send
r = adapter.send(request, **kwargs)
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/_vendor/cachecontrol/adapter.py", line 36, in send
cached_response = self.controller.cached_request(request)
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/_vendor/cachecontrol/controller.py", line 102, in cached_request
resp = self.serializer.loads(request, self.cache.get(cache_url))
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/_vendor/cachecontrol/serialize.py", line 108, in loads
return getattr(self, "_loads_v{0}".format(ver))(request, data)
File "/mnt/jenkins/jobA/workspace/.pyenv/local/lib/python2.7/site-packages/pip/_vendor/cachecontrol/serialize.py", line 164, in _loads_v2
cached = json.loads(zlib.decompress(data).decode("utf8"))
error: Error -5 while decompressing data: incomplete or truncated stream
Turns out that there was a corrupt entry in pip's local cache (located in my case, and by default I believe, in ~/.cache/pip).
I tested that by trying pip install --no-cache-dir Pillow==2.9.0 and lo and behold, it worked.
To confirm it was the cache, I ran:
pip uninstall Pillow
rm -rf ~/.cache/pip/*
pip install Pillow==2.9.0
which succeeded where it had failed before.
I don't know how there came to be a problem with the cache, but my guess is that pip got interrupted mid-download, causing the cached data for Pillow to be corrupted.
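For the curious, the exact error is easy to reproduce by handing zlib a truncated stream, which is effectively what a half-written cache entry is (a small illustration, not pip's own code):
import zlib

payload = zlib.compress(b"cached wheel metadata")
try:
    zlib.decompress(payload[:10])  # drop the tail to simulate a corrupt cache entry
except zlib.error as exc:
    print(exc)  # Error -5 while decompressing data: incomplete or truncated stream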
I found my issue to be with disk space.
Running df showed the disk was 92% full. After deleting files and cleaning up the hard drive (using Disk Usage Analyzer) I was able to successfully decompress data.
For anyone in a similar situation to mine: the install ran out of space, so the cache was still there but corrupted.
The fix: remove the folder ~/.cache/pip and pip works again.

python 2.7.5 requests and certificate verify failed

I am having trouble using Python's requests package to submit a GET request to Puppet 3.7's REST API. I have looked at the documentation here:
http://docs.python-requests.org/en/latest/user/advanced/#ssl-cert-verification
But I am still having trouble. Here is my script:
[root@ppt-001 RESTClients]# cat add-group.py
#!/usr/bin/env python
import requests
# curl https://ppt-001.example.com:4433/classifier-api/v1/groups \
# -H "Content-Type: application/json" \
# --cert /etc/puppetlabs/puppet/ssl/certs/ppt-001.example.com.pem \
# --key /etc/puppetlabs/puppet/ssl/private_keys/ppt-001.example.com.pem \
# --cacert /etc/puppetlabs/puppet/ssl/certs/ca.pem | python -m json.tool
url='https://ppt-001.example.com:4433/classifier-api/v1/groups'
headers = {"Content-Type": "application/json"}
data={}
cacert='/etc/puppetlabs/puppet/ssl/certs/ca.pem'
key='/etc/puppetlabs/puppet/ssl/private_keys/ppt-001.example.com.pem'
cert='/etc/puppetlabs/puppet/ssl/certs/ppt-001.example.com.pem'
result = requests.get(url,
                      data=data,          # no data needed for this request
                      headers=headers,    # dict {"Content-Type":"application/json"}
                      cert=(cacert, key), # key/cert pair
                      verify=cert
                      )
print result.json()
I am using this version of python:
[root@ppt-001 RESTClients]# python -V
Python 2.7.5
Here is what happens when I execute my script:
[root@ppt-001 RESTClients]# ./add-group.py
/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
InsecurePlatformWarning
Traceback (most recent call last):
File "./add-group.py", line 21, in <module>
verify=cert
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/api.py", line 69, in get
return request('get', url, params=params, **kwargs)
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/api.py", line 50, in request
response = session.request(method=method, url=url, **kwargs)
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/sessions.py", line 465, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/sessions.py", line 573, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/adapters.py", line 431, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: [Errno 1] _ssl.c:504: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
[root@ppt-001 RESTClients]#
I suspect that requests does not like the self-signed cert that Puppet uses, but if I issue this curl command ...
curl https://ppt-001.example.com:4433/classifier-api/v1/groups \
-H "Content-Type: application/json" \
--cert /etc/puppetlabs/puppet/ssl/certs/ppt-001.example.com.pem \
--key /etc/puppetlabs/puppet/ssl/private_keys/ppt-001.example.com.pem \
--cacert /etc/puppetlabs/puppet/ssl/certs/ca.pem | python -m json.tool
... everything works fine.
UPDATE:
I have installed requests[security]:
[root@ppt-001 RESTClients]# pip install requests[security]
Requirement already satisfied (use --upgrade to upgrade): requests[security] in /usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg
Installing extra requirements: 'security'
Requirement already satisfied (use --upgrade to upgrade): pyOpenSSL in /usr/lib64/python2.7/site-packages (from requests[security])
Downloading/unpacking ndg-httpsclient (from requests[security])
Downloading ndg_httpsclient-0.4.0.tar.gz
Running setup.py egg_info for package ndg-httpsclient
Downloading/unpacking pyasn1 (from requests[security])
Downloading pyasn1-0.1.7.tar.gz (68kB): 68kB downloaded
Running setup.py egg_info for package pyasn1
Installing collected packages: ndg-httpsclient, pyasn1
Running setup.py install for ndg-httpsclient
Skipping installation of /usr/lib/python2.7/site-packages/ndg/__init__.py (namespace package)
Installing /usr/lib/python2.7/site-packages/ndg_httpsclient-0.4.0-py2.7-nspkg.pth
Installing ndg_httpclient script to /usr/bin
Running setup.py install for pyasn1
Successfully installed ndg-httpsclient pyasn1
Cleaning up...
But now I get this output when I run my script:
[root@ppt-001 RESTClients]# ./add-group.py
Traceback (most recent call last):
File "./add-group.py", line 25, in <module>
verify=cert
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/api.py", line 69, in get
return request('get', url, params=params, **kwargs)
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/api.py", line 50, in request
response = session.request(method=method, url=url, **kwargs)
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/sessions.py", line 465, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/sessions.py", line 573, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/adapters.py", line 370, in send
timeout=timeout
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/packages/urllib3/connectionpool.py", line 544, in urlopen
body=body, headers=headers)
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/packages/urllib3/connectionpool.py", line 341, in _make_request
self._validate_conn(conn)
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/packages/urllib3/connectionpool.py", line 761, in _validate_conn
conn.connect()
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/packages/urllib3/connection.py", line 238, in connect
ssl_version=resolved_ssl_version)
File "/usr/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/packages/urllib3/contrib/pyopenssl.py", line 260, in ssl_wrap_socket
ctx.use_privatekey_file(keyfile)
OpenSSL.SSL.Error: [('x509 certificate routines', 'X509_check_private_key', 'key values mismatch')]
cert=(cacert,key), #key/cert pair
verify=cert
....
OpenSSL.SSL.Error: [('x509 certificate routines', 'X509_check_private_key', 'key values mismatch')]
I think you need to use (cert,key) as cert and use cacert instead for verification:
cert=(cert,key), #key/cert pair
verify=cacert
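Put together, the corrected call would look something like this (a sketch reusing the paths from the question):
import requests

url = "https://ppt-001.example.com:4433/classifier-api/v1/groups"
cacert = "/etc/puppetlabs/puppet/ssl/certs/ca.pem"
key = "/etc/puppetlabs/puppet/ssl/private_keys/ppt-001.example.com.pem"
cert = "/etc/puppetlabs/puppet/ssl/certs/ppt-001.example.com.pem"

result = requests.get(url,
                      headers={"Content-Type": "application/json"},
                      cert=(cert, key),  # client certificate and its matching private key
                      verify=cacert)     # CA bundle used to verify the server
print(result.json())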

Pip SSL Error But Curl Works - Windows Behind Corporate Proxy w/ CNTLM

I am trying to get pip running on my windows machine behind a corporate proxy. I am using a CNTLM proxy to add authentication.
The following command works:
curl --cacert C:\Users\xxxxxxx\curl-ca-bundle.crt https://www.google.com.au
but the following command doesn't:
pip install --cert C:\Users\xxxxxx\curl-ca-bundle.crt install six
I have set the CNTLM proxy settings in the HTTP_PROXY and HTTPS_PROXY environment variables, and can confirm both pip and curl are using them (changing the env variable to an invalid address yields an error). The cert file is the Mozilla trust store with my corporate Root cert appended. If curl is fine using it I would assume it is fine to use with pip.
The Pip log is revealing a TLSv1 alert decode error:
------------------------------------------------------------
C:\Python27\Scripts\pip run on 03/02/15 09:12:03
Downloading/unpacking six
Getting page https://pypi.python.org/simple/six/
Could not fetch URL https://pypi.python.org/simple/six/: connection error: [SSL: TLSV1_ALERT_DECODE_ERROR] tlsv1 alert decode error (_ssl.c:581)
Will skip URL https://pypi.python.org/simple/six/ when looking for download links for six
Getting page https://pypi.python.org/simple/
Could not fetch URL https://pypi.python.org/simple/: connection error: [SSL: TLSV1_ALERT_DECODE_ERROR] tlsv1 alert decode error (_ssl.c:581)
Will skip URL https://pypi.python.org/simple/ when looking for download links for six
Cannot fetch index base URL https://pypi.python.org/simple/
URLs to search for versions for six:
* https://pypi.python.org/simple/six/
Getting page https://pypi.python.org/simple/six/
Could not fetch URL https://pypi.python.org/simple/six/: connection error: [SSL: TLSV1_ALERT_DECODE_ERROR] tlsv1 alert decode error (_ssl.c:581)
Will skip URL https://pypi.python.org/simple/six/ when looking for download links for six
Could not find any downloads that satisfy the requirement six
Cleaning up...
Removing temporary dir c:\users\xxxxxxx\appdata\local\temp\pip_build_xxxxxx...
No distributions at all found for six
Exception information:
Traceback (most recent call last):
File "C:\Python27\lib\site-packages\pip\basecommand.py", line 122, in main
status = self.run(options, args)
File "C:\Python27\lib\site-packages\pip\commands\install.py", line 278, in run
requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
File "C:\Python27\lib\site-packages\pip\req.py", line 1177, in prepare_files
url = finder.find_requirement(req_to_install, upgrade=self.upgrade)
File "C:\Python27\lib\site-packages\pip\index.py", line 277, in find_requirement
raise DistributionNotFound('No distributions at all found for %s' % req)
DistributionNotFound: No distributions at all found for six
Any suggestions as to how to get around this error? I would even be open to disabling SSL if pip allowed it.
N.B. I'm running pip 1.5.6 with Python 2.7.9.

pip not working on ubuntu 12.04

My system is Ubuntu 12.04 with Python 2.7. This machine is behind a corporate firewall.
I have set up the required proxies and everything works fine (internet, downloads, browsing, sudo apt-get, etc.), but pip install is not working.
So, for example, if I try to install celery, in the pip log I see the following error:
/usr/bin/pip run on Thu Mar 20 15:32:15 2014
Downloading/unpacking celery
Getting page http://pypi.python.org/simple/celery
Could not fetch URL http://pypi.python.org/simple/celery: timed out
Will skip URL http://pypi.python.org/simple/celery when looking for download links for celery
Getting page http://pypi.python.org/simple/
Could not fetch URL http://pypi.python.org/simple/: timed out
Will skip URL http://pypi.python.org/simple/ when looking for download links for celery
Cannot fetch index base URL http://pypi.python.org/simple/
URLs to search for versions for celery:
* http://pypi.python.org/simple/celery/
Getting page http://pypi.python.org/simple/celery/
Could not fetch URL http://pypi.python.org/simple/celery/: timed out
Will skip URL http://pypi.python.org/simple/celery/ when looking for download links for celery
Could not find any downloads that satisfy the requirement celery
No distributions at all found for celery
Exception information:
Traceback (most recent call last):
File "/usr/lib/python2.7/dist-packages/pip/basecommand.py", line 126, in main
self.run(options, args)
File "/usr/lib/python2.7/dist-packages/pip/commands/install.py", line 223, in run
requirement_set.prepare_files(finder, force_root_egg_info=self.bundle, bundle=self.bundle)
File "/usr/lib/python2.7/dist-packages/pip/req.py", line 948, in prepare_files
url = finder.find_requirement(req_to_install, upgrade=self.upgrade)
File "/usr/lib/python2.7/dist-packages/pip/index.py", line 152, in find_requirement
raise DistributionNotFound('No distributions at all found for %s' % req)
DistributionNotFound: No distributions at all found for celery
pip doesn't support authenticated proxies. Do you have such a proxy? Try disabling it if possible.
If you can't, you can try installing a proxy authentication service, CNTLM.
After you install CNTLM, you can try running pip like this:
export HTTP_PROXY=http://127.0.0.1:3128
sudo -E pip install some_package
Here you have a detailed example for the same thing, only with Ruby gems:
http://annelagang.blogspot.com/2012/11/installing-gems-in-ubuntu-1204-using.html
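Once CNTLM is running, a quick way to confirm the local listener works end to end before invoking pip (a sketch assuming the default 127.0.0.1:3128 address used above):
import requests

proxies = {"http": "http://127.0.0.1:3128", "https": "http://127.0.0.1:3128"}
r = requests.get("https://pypi.org/simple/", proxies=proxies, timeout=10)
print(r.status_code)  # 200 means the proxy chain is working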
