I've recently started using virtualenv, and would like to install lxml in this isolated environment.
Normally I would use the Windows binary installer, but I want lxml in this virtualenv (not globally), and pip install lxml does not work, so I'm at a loss for what to do.
I've read that creating symlinks may work, although I'm unfamiliar with how symlinks work and which files I should be creating them for. Does anyone know of any methods to install lxml in a virtualenv on Windows?
If creating symlinks is the only method that works I'm definitely willing to learn if someone can point me in the right direction.
Download lxml: http://www.lfd.uci.edu/~gohlke/pythonlibs/#lxml
Activate the virtualenv
easy_install /path/to/the/file/lxml-3.2.1.win32-py3.3.exe
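If the file you downloaded from that page is a wheel (.whl) rather than an .exe installer, pip inside the activated virtualenv can install it directly; the filename below is only an example of what a downloaded wheel might be called:
pip install C:\path\to\lxml-3.2.1-cp33-none-win32.whl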
The easiest way is to simply copy the library into your virtualenv's site-packages folder. Symlinking is a method of making a file appear to be in one place on the filesystem while it physically lives in another; copying the library over keeps the environment truly isolated.
So go into your global site-packages folder and copy both the lxml folder and the lxml egg folder into your virtualenv's site-packages. If you really want to symlink instead (NTFS supports symlinks), see the sketch below.
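For what it's worth, an NTFS symlink can be created with mklink from an elevated command prompt; the paths below are placeholders for your own Python and virtualenv locations:
mklink /D C:\path\to\venv\Lib\site-packages\lxml C:\Python33\Lib\site-packages\lxml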
Just wanted to add that emeraldo.cs's answer is correct, but you also have to copy the lxml-related files that sit in the site-packages root. Once all the files are copied, pip will consider it installed.
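As a rough sketch of what that means in practice (paths and version numbers are placeholders):
xcopy /E /I C:\Python33\Lib\site-packages\lxml venv\Lib\site-packages\lxml
xcopy /E /I C:\Python33\Lib\site-packages\lxml-3.2.1-py3.3.egg-info venv\Lib\site-packages\lxml-3.2.1-py3.3.egg-info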
Related
I created a project in PyCharm, then went down and typed pip install commands into the terminal to install my required packages.
The strange thing is that my code only works, if the .py file is in the project root directory, and it does not work if it is in the 'venv' directory.
I don't know yet how exactly this works, and it seems too specific to just "google it".
Thank you
Your code is not supposed to go there; the venv directory is for your environment only. Move any source files to the root of your project.
First of all, you can read these docs to understand better what virtualenv is:
what is virtualenv
The virtualenv you built may use a different Python version than the one your PyCharm project uses.
Furthermore, each virtualenv comes with its own libraries: when you ran pip install <lib> outside the virtualenv, the lib was not installed into it, so it is probably missing there.
Solution:
Activate the virtualenv in the terminal:
source path_to_virtualenv/bin/activate
Install the lib again, this time inside the virtualenv:
pip install <lib>
This should help.
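If you're on Windows (as is common with PyCharm setups), the activation script lives under Scripts rather than bin:
path_to_virtualenv\Scripts\activate
pip install <lib>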
Running this command:
virtualenv --system-site-packages venv
I'm expecting the venv folder venv\Lib\site-packages to contain all the necessary libraries from the projects that are located in:
C:\Users\XXX\AppData\Local\Programs\Python\Python36\Lib\site-packages\
But it's not the case, only a few are installed.
For example, my program currently uses pdfminer, which is in
C:\Users\XXXX\AppData\Local\Programs\Python\Python36\Lib\site-packages\
I want it to be included in venv\Lib\site-packages but it is not copied.
Any advice?
--system-site-packages doesn't copy packages; it just allows Python from the virtualenv to access the packages in C:\Users\XXX\AppData\Local\Programs\Python\Python36\Lib\site-packages\.
There is no way to copy packages because they could depend on their installation directory. If you want these packages in the virtualenv don't use --system-site-packages and install all packages in the virtualenv.
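A minimal sketch of that, using pdfminer from the question as the example package:
virtualenv venv
venv\Scripts\activate
pip install pdfminer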
A virtualenv environment is the same as if you had just installed a fresh copy of Python. It has no packages other than the standard ones provided with Python. If you want other packages, you have to install them with pip, or however you'd do it with the native Python version you are using.
So in general, just do pip install <packagename>.
If you find yourself often wanting to create virtualenvs with a standard set of base packages, then put together a requirements.txt file listing all of the packages you want to install as a base, and do pip install -r requirements.txt inside a new virtualenv, right after you create it.
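For instance, a hypothetical requirements.txt might list (package versions here are just examples):
requests==2.18.4
lxml==4.1.1
and right after creating and activating a new virtualenv you would run:
pip install -r requirements.txt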
One nice thing about a virtualenv is that it's all yours. Your user owns 100% of it, unlike the base Python version that is owned by the system. To install new packages into the base Python version, you often have to have root access (sudo privileges). With virtualenvs, you don't need special permissions (in fact, you'll get all screwed up if you use sudo in a virtualenv) to install all the packages you want. Everything you do lives within your own home directory. Another neat thing is that when you are done with a virtualenv, you just throw away the root directory that contains it.
If it's not mandatory to use virtualenv, I would suggest going with Anaconda; that will pretty much address your concern.
Conda as a package manager helps you find and install packages. By default quite a few packages are already installed, to set you up quickly for your project. To see which packages conda has installed, type conda list in the terminal.
If you need a package that requires a different version of Python, you do not need to switch to a different environment manager, because conda is also an environment manager.
With just a few commands, you can set up a totally separate environment to run that different version of Python, while continuing to run your usual version of Python in your normal environment.
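A brief sketch of that (the environment name and Python version are just examples):
conda create --name py36 python=3.6
conda activate py36
On older conda versions the activation command is source activate py36 (Linux/macOS) or activate py36 (Windows).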
I don't know very much about python but would like to install some python modules in a local directory on a server on which I don't have sudo access.
I start by going into my desired directory (not root) and creating the directory tree needed to store my custom modules:
cd /root/example/sub-example
mkdir -p local/lib/python2.7/site-packages
I then export this local path to PYTHONPATH
export PYTHONPATH=$PYTHONPATH:/root/example/sub-example/local/lib/python2.7/site-packages
I then make a new sub-directory to store the Python package while extracting it:
mkdir example-python-directory
cd example-python-directory
wget http://example-python-package
tar -xvf example-python-package.tar.gz
cd example-python-package
Lastly, I run the setup.py script with the --user flag to try to get it to install in my specified local directory:
python setup.py install --user
The problem is, nothing is installed in my /root/example/sub-example/local/lib/python2.7/site-packages directory, and instead I find that I now have a new directory at root: /root/.local/lib/python2.7/site-packages
Is there a way to prevent this? I feel like my lack of Python knowledge is causing me to make some silly error that is probably obvious to others. Thanks for the help!
Create a folder called "lib" and install your modules into it with pip's -t (target) flag.
For Python 3
pip3 install <your_python_module_name> -t lib/
For Python 2
pip install <your_python_module_name> -t lib/
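After that, make sure Python can find the modules by putting the lib folder on PYTHONPATH; the path below is wherever you created lib:
export PYTHONPATH=$PYTHONPATH:/path/to/lib
python -c 'import your_python_module_name'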
I found that on servers where you don't have root access, you can usually install the Python module into your brew lib using this:
CFLAGS=-I$(brew --prefix)/include LDFLAGS=-L$(brew --prefix)/lib pip install <package>
virtualenv is what I would recommend for this case (and pretty much any other case). I use it for pretty much everything I do in Python.
It allows you to essentially create a sandbox containing a Python environment that is bootstrapped from a Python install on your machine, and to install any modules you want into it.
It should not, in general, require the use of sudo, since you're not touching the system install. You can generally pip install a module right into the virtualenv, and then you run your scripts out of that virtualenv. You would basically just need a location you can read/write/execute from, say a directory you create in your user's home directory.
You can keep track of what's installed by doing pip freeze > requirements.txt and checking that into an SCM; a new virtualenv can then be recreated from that file, ready to run your scripts.
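A sketch of that round trip, assuming a new virtualenv called newenv:
pip freeze > requirements.txt
virtualenv newenv
source newenv/bin/activate
pip install -r requirements.txt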
The link I provided above has more details about how to use virtualenv.
Edit in response to comment from OP:
You can still use pip install outside of virtualenv, and I would recommend that. However, that can only operate on various Python installs that may be on the box (invoke pip from the bin directory of the install you want to use).
However, that won't work for installation into arbitrary directories. For that, you could try to unzip the egg file (they are supposed to be zip files) into the directory you want, and then make sure that directory is on the PYTHONPATH. Some egg files are available for direct download off of PyPI, although some are source only.
I think this approach is much more complex and prone to problems than virtualenv would be, though.
I've needed to deal with this for some time, but never really figured out what the most pythonic way of importing/setting up PYTHONPATH for custom modules is. I know I can use virtualenv to manage it, I know I can set it inside of scripts, or through pth files, but none of these seem very clean and pythonic to me, so I'm guessing I'm missing something.
Almost always, all custom modules I'm interested in are contained in the git directory I've cloned down that has whatever script I'm running, if that simplifies things.
I'm guessing virtualenv is the answer, but figured I'd ask in case I'm missing anything.
EDIT: To clarify, this is only a question about custom modules. I'm already using pip for modules from PyPI.
You can use pip to install packages that are not on PyPI, too. You just need a URI endpoint and a valid Python package:
Examples:
$ pip install https://github.com/pypa/pip/archive/develop.zip#egg=pip
$ pip install git+https://github.com/pypa/pip.git#egg=pip
$ pip install git+git://github.com/pypa/pip.git#egg=pip
$ pip install /path/to/pip.tar.gz
$ pip install .
Read more at https://pip-installer.org/en/latest/usage.html#pip-install
virtualenv is a good start.
There are also package managers like pip and easy_install that manage third party modules.
In code you can use:
import sys
# Add the directory containing your custom module to Python's import search path
sys.path.append('/path/to/customModule')
Virtualenv is the way to go with this.
pip install virtualenv
Then make a folder to setup your environments. Inside that folder:
virtualenv <new_env_name>
That'll create a new folder in that directory; inside it there's a bin folder. Run source bin/activate from the new folder to activate the environment. You can then pip install packages and they will be installed only for that environment.
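Putting those steps together, a minimal sketch with placeholder names:
pip install virtualenv
mkdir envs
cd envs
virtualenv myenv
source myenv/bin/activate
pip install <some_package>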
If you're cloning a git repo whose code you also want to peruse easily (for example, if you're also working on that repo), clone it into your work dir and then symlink or alias the package folder into the site-packages directory inside that virtualenv's lib directory, as sketched below. Otherwise, if it's packaged correctly, python setup.py install should install it right into that virtualenv.
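A sketch of the symlink variant, with placeholder paths:
ln -s ~/work_dir/myrepo/mypackage ~/envs/myenv/lib/python2.7/site-packages/mypackage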
For a number of reasons, such as when a package takes a long time to compile (e.g. lxml), it is sometimes recommended to symlink such packages from the system site-packages directory into a virtualenv.
Some example questions:
Use a single site package (as exception) for a virtualenv
How to install lxml into virtualenv from the local system?
But such packages are not recognized by pip, which will happily try to reinstall them. How do I deal with this?
Okay, it seems the trick is to also link the egg-info directory.
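A sketch of what that looks like, with placeholder paths and version numbers:
ln -s /usr/lib/python2.7/site-packages/lxml venv/lib/python2.7/site-packages/lxml
ln -s /usr/lib/python2.7/site-packages/lxml-3.2.1-py2.7.egg-info venv/lib/python2.7/site-packages/lxml-3.2.1-py2.7.egg-info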