Is there an older version of pip that doesn't check SSL certificates?
My corporate proxy replaces the certificate from PyPI with a company one, which causes pip to abort the install.
I can download the packages manually and run pip on the local .tar.gz files, but that is a pain, particularly with complex dependencies.
Version 1.2.1 works great for me as I'm always behind a corporate proxy.
https://pypi.python.org/pypi/pip/1.2.1
I'm not sure of your situation, but I also had to install it on a shared VM, so I built and installed it into my user directory (~/.local/):
python setup.py build
python setup.py install --user
Update your path (~/.bashrc)
export PATH=~/.local/bin:$PATH
Then install packages (fabric example)
pip install fabric --user
Here is my problem: I need to distribute Python packages that I created. I would like to create wheels, as this is now the preferred way of distributing Python packages.
On my machine: no problem.
On my client server, however, I do not have control over the Python (3.6.3) used to create the wheels. And - surprise! - the wheel package is not included by default in Python 3.6!
And yes, I know I can do: sudo pip install wheel but I do not have sudo rights in that environment.
I could create a virtualenv, install wheel in that virtual environment, and then create my packages (and I will probably end up doing just that), but what a pain in the neck!
Am I missing something here?
If not, there is an inconsistency, in my mind: on the one hand, we are told to use wheels, but on the other hand the "preferred" mechanism is not available in a vanilla Python (at least in Python 3.6)
Any thoughts on that?
There is an intermediate option between installing packages system-wide via sudo pip and installing them in a virtual environment: install packages for your user!
$ pip install --user --upgrade pip wheel
(on some platforms pip automatically selects --user when invoked without sudo)
Packages that ship binaries (such as pip and wheel) install them by default to ~/.local/bin, so make sure that dir is in your $PATH. The default /etc/profile or ~/.profile in most distros already does that if the dir exists, so you might have to log out and back in once for $PATH to be updated after installing your first package.
Now you can enjoy wheel (and the latest pip) just as if they were any other system package, and without any of the trouble of dealing with virtualenvs.
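As a quick sanity check, you can ask Python itself whether the per-user scripts directory is on $PATH. A small sketch (the directory name is platform-dependent, so this guesses "bin" vs "Scripts"):

```python
# Sketch: check whether the per-user scripts dir (~/.local/bin on most
# Linux distros) is already on PATH
import os
import site

# "bin" on POSIX; on Windows the equivalent dir is named "Scripts"
scripts_dir = os.path.join(site.getuserbase(),
                           "Scripts" if os.name == "nt" else "bin")
on_path = scripts_dir in os.environ.get("PATH", "").split(os.pathsep)
print(scripts_dir)
print(on_path)
```

If this prints False, add the directory to $PATH in ~/.profile (or log out and back in, as noted above).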
I want to install some packages on a server that does not have access to the internet, so I have to download the packages elsewhere and transfer them to the server. But I do not know how to install them.
Download all the packages you need and transfer them to the server where you need to install them. It doesn't matter whether they have a *.whl or *.tar.gz extension. Then install them one by one using pip:
pip install path/to/package
or:
python -m pip install path/to/package
The second option is useful if you have multiple interpreters on the server (e.g. python2 and python3, or multiple versions of either). In that case replace python with the one you want to use, e.g.:
python3 -m pip install path/to/package
If you have a lot of packages, you can list them in a requirements file, as you would normally do when you have access to the internet. But instead of putting the names of the packages into the file, put the paths to the packages (one path per line). When you have the file, install all the packages by typing:
python -m pip install -r requirements.txt
In the requirements file you can also mix different types of packages (*.whl and *.tar.gz). The only thing to take care of is downloading the correct versions of the packages for your platform (64-bit packages for a 64-bit platform, etc.).
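Building such a requirements file by hand is tedious if there are many files, so here is a small sketch that generates one from a directory of downloaded packages (the packages/ directory name is an assumption):

```python
# Sketch: write a requirements.txt that lists local package files,
# one path per line, assuming they were copied into ./packages
from pathlib import Path

pkg_dir = Path("packages")
pkg_dir.mkdir(exist_ok=True)  # would already hold the downloaded files

paths = sorted(str(p) for p in pkg_dir.iterdir()
               if p.name.endswith((".whl", ".tar.gz")))
Path("requirements.txt").write_text("\n".join(paths) + "\n")
# then, on the offline server: python -m pip install -r requirements.txt
```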
You can find more information regarding pip install in its documentation.
You can either download the packages from the website and run python setup.py install, or you can run pip install on a local file or directory, such as:
pip install path/to/tar/ball
https://pip.pypa.io/en/stable/reference/pip_install/#usage
Download the wheel packages from https://www.lfd.uci.edu/~gohlke/pythonlibs/ . You can install the .whl packages with pip install package.whl; see installing wheels using pip for more.
Download the package from the website and extract the tarball.
Run python setup.py install
I have an Anaconda3 environment with Python 3.6 (Spyder) and am trying to install tensorflow; however, I can't use the standard pip installation due to the company firewall.
Furthermore, I can't create an anaconda environment for the same reason. What I am trying to do is install tensorflow directly from the whl file (which I downloaded from the official website), using the following command:
C:\Users\me>pip install --no-deps tensorflow-1.8.0-cps36-cp36m-win_amd64.whl
which leads to 'Requirement _ looks like a filename, but the file does not exist' -> _ is not a supported wheel on this platform.
When I try to run via:
conda install --no-deps tensorflow-1.8.0-cps36-cp36m-win_amd64.whl
I get Solving environment: failed, CondaHTTPError: ... etc. - probably due to the same firewall reason.
Now, is there any way to install the .whl without encountering the firewall restrictions?
Thank you very much, guys.
I had some success with this at my company (which also doesn't allow internet access for the Anaconda package manager).
I found this site:
https://www.lfd.uci.edu/~gohlke/pythonlibs/
where you can download wheel files for a number of Python packages. You can then use pip to install the wheel packages locally:
pip install some-package.whl
as detailed in this answer
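Incidentally, the "not a supported wheel on this platform" error usually means the tags embedded in the wheel's filename don't match the interpreter. Wheel filenames follow the name-version(-build)?-python-abi-platform.whl convention, so a quick sketch can pull the tags out; note that the filename typed in the question contains cps36 where a CPython 3.6 wheel would use cp36:

```python
# Sketch: split a wheel filename into its compatibility tags to see why
# pip might report "not a supported wheel on this platform"
def wheel_tags(filename):
    # wheel filenames follow: name-version(-build)?-python-abi-platform.whl
    stem = filename[:-len(".whl")]
    parts = stem.split("-")
    return tuple(parts[-3:])  # (python_tag, abi_tag, platform_tag)

# "cps36" (a typo for "cp36") is not a Python tag pip recognizes
print(wheel_tags("tensorflow-1.8.0-cps36-cp36m-win_amd64.whl"))
# ('cps36', 'cp36m', 'win_amd64')
```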
I have
C:\>where pip3
C:\Python35\Scripts\pip3.exe
C:\Python36\Scripts\pip3.exe
on my Windows 10 box. Simultaneously, when I ran
pip3 install --upgrade --user awscli
I got aws.cmd located in
C:\Users\Dmitry\AppData\Roaming\Python\Python35\Scripts
Was this misconfiguration or expected behavior of awscli installer?
You used the --user option, and the documentation says
Passing the --user option to python -m pip install will install a package just for the current user, rather than for all users of the system.
If the package is supposed to be user-specific it can't go in C:\Python*, because those are system-wide directories, and all users would share them.
So, yes, it's expected that when you request a user-specific installation, the package goes in a user-specific directory.
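Each interpreter can report its own per-user directories, which shows exactly why aws.cmd landed under AppData\Roaming. A sketch:

```python
# Sketch: print the per-user install locations for this interpreter;
# on Windows the user base is %APPDATA%\Python, which is why console
# scripts end up under AppData\Roaming\Python\Python3X\Scripts
import os
import site
import sysconfig

print(site.getuserbase())                                # per-user root
print(site.getusersitepackages())                        # per-user site-packages
print(sysconfig.get_path("scripts", os.name + "_user"))  # per-user scripts dir
```

Running this with each pip3's interpreter (C:\Python35 vs C:\Python36) shows which user directory that interpreter targets.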
I've been trying to use sudo to install Python packages on a system where I am on the sudoers list but don't have general root access (i.e. I don't have the password for su). I can install packages, for example:
sudo pip install django
however, when I try to use them, Python simply claims not to have the package installed. Investigating the contents of /usr/lib/python, it appears that other packages' directories and .eggs have executable permissions for ugo, but the packages I install using sudo pip do not. Manually giving these files executable permissions fixes the problem, but that is laborious, particularly when pip installs several dependencies that I need to chase up.
Is this a known issue? What can I do about it? For the record this is a RHEL6.4 machine and I'm using pip 1.4.1.
Your best bet is virtualenv. Use your workaround to install virtualenv:
sudo pip install virtualenv
Resources for virtualenv to get you started:
http://simononsoftware.com/virtualenv-tutorial/
http://www.virtualenv.org/en/latest/
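On Python 3 the stdlib venv module can stand in for virtualenv with no install at all, which sidesteps the permissions problem entirely. A minimal sketch (with_pip=False keeps creation fast and avoids any network access):

```python
# Sketch: create an isolated environment with the stdlib venv module,
# so no sudo (and no writable /usr/lib/python) is ever needed
import venv
from pathlib import Path

env_dir = Path("myenv")
venv.EnvBuilder(with_pip=False).create(env_dir)

# every venv carries this marker file at its root
print((env_dir / "pyvenv.cfg").exists())
# then activate with: source myenv/bin/activate
```

Packages installed inside the environment get permissions owned by your user, so the ugo-executable issue never arises.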