I'm using a virtualenv to run Python 2.7 on my local machine and everything works as expected. When I transfer "site-packages" to my production server, I get the following error:
PIL/_imaging.so: invalid ELF header
This happens with the Pillow 2.5.3 PyPI package.
I am running OS X, while my production server is running Debian. I suspect the OS differences might be causing the issue but I'm not sure. I have no idea how to fix this. Can anyone help?
Note: I cannot install packages directly on my production server, so I have to upload them to use them.
In your current virtual environment, execute the following command
pip freeze > requirements.txt
Copy this requirements.txt file to your server.
Create your new virtual environment (delete the one you were using before).
Activate the virtual environment and then type pip install -r requirements.txt
Now the libraries will be installed and built correctly for the server's platform.
If you see errors for PIL, execute the following commands:
sudo apt-get install build-essential python-dev
sudo apt-get build-dep python-imaging
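Putting it together, the server-side sequence looks roughly like this (a sketch; env is just an illustrative name for the environment):
virtualenv env
source env/bin/activate
pip install -r requirements.txt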
Virtual environments are for isolating Python on your current machine; they are not for creating portable environments. The benefit is being able to work with different versions of Python packages without modifying the system Python installation.
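Concretely, compiled extension modules such as PIL/_imaging.so are platform-specific: OS X builds Mach-O binaries, while Linux expects ELF, which is exactly what the "invalid ELF header" error means. If you want to confirm what you copied over, the file command identifies the binary format (the output below is illustrative):
file PIL/_imaging.so
PIL/_imaging.so: Mach-O 64-bit dynamically linked shared library x86_64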
Using virtual environments does not require super-user permissions, so you can install packages even if you are not "root".
It does, however, require Internet access, as packages are downloaded from the web. If your server does not have access to the Internet, then back on your Mac, do the following from your virtual environment:
pip install basket
This will install basket, a small utility that lets you download packages without installing them. It's great for keeping a local archive of packages that you can move to other machines.
Once it's installed, follow these steps as listed in the documentation:
basket init
pip freeze > requirements.txt
awk -F'==' '{print $1}' requirements.txt | basket download
This will download all the packages from your requirements.txt file into ~/.basket
Next, copy this directory to your server and then run the following command from your virtual environment:
pip install --no-index -f /path/to/basket -r requirements.txt
I have a server without an Internet connection. I would like to copy the current Python environment from another machine (packages installed by pip and conda) to this server by disk. That is, I need to know which packages are installed, download those packages, and reinstall them on the server. Is there any way to manage the whole process automatically?
Thanks in advance.
Use pip freeze to create a list of installed packages and pip download to download them. Move them to your offline location and install them all with pip install:
$ pip freeze > requirements.txt
$ pip download -r requirements.txt -d packages
$ # move packages/* to offline host
offline_host$ pip install packages/*
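One caveat: pip download fetches wheels for the machine it runs on. If the offline host has a different OS or Python version, you can ask pip for that platform explicitly; pip requires --only-binary=:all: when --platform is used, and the platform tag and Python version below are only examples:
$ pip download -r requirements.txt -d packages --platform manylinux2014_x86_64 --only-binary=:all: --python-version 38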
I've been given a deep learning model developed on Linux. I am using Windows 10.
While trying to replicate it on my local (Windows) machine, I ran into a problem when trying to install multiple requirements.
requirements.txt looks something like this:
apturl==0.5.2
asn1crypto==0.24.0
bleach==2.1.2
Brlapi==0.6.6
certifi==2020.11.8
I know that apturl and Brlapi are Linux packages, so I take them out of the requirements.txt file and install them in a Dockerfile with the following command:
RUN apt update && apt install -y htop python3-dev wget \
apturl==0.5.2 \
Brlapi==0.6.6
Since requirements.txt contains lots of packages to be installed and I do not know which ones belong to Linux, is there an easy way to separate them? Right now I run the install, and when an error occurs on a certain package, I google it; if it belongs to Linux, I move it to the Dockerfile command to install.
Am I just supposed to know which packages belong where and separate them on my own?
Thanks in advance!
apt is used for managing system packages on Debian-based operating systems. Although it can install some Python packages if they were packaged and made available as Debian packages, it is generally not used for this purpose.
The tool commonly used for managing Python packages is pip, and a requirements.txt file is usually used to specify the Python package dependencies. If you have python3 installed on your local system, try python3 -m pip install -r requirements.txt or pip3 install -r requirements.txt (replace python3/pip3 with python/pip if required).
TLDR: The packages in requirements.txt are Python packages, not system packages. So your OS shouldn't matter here as long as you have Python installed.
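If you do want everything inside Docker anyway, the usual split is apt for system packages and pip for everything in requirements.txt. A minimal sketch (the base image and apt package names are illustrative):
FROM python:3.8-slim
RUN apt update && apt install -y htop wget
COPY requirements.txt .
RUN pip install -r requirements.txt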
I am coming from NodeJS and learning Python, and I was wondering how to properly install the packages in a requirements.txt file locally in the project.
For Node, this is done by managing and installing the packages in package.json via npm install. However, the convention for a Python project seems to be to add packages to a directory called lib. When I do pip install -r requirements.txt, I think this does a global install on my computer, similar to Node's global npm install -g. How can I install the dependencies of my requirements.txt file in a folder called lib?
Use this command:
pip install -r requirements.txt -t <path-to-the-lib-directory>
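Note that Python will not pick up packages from lib automatically; you have to point it there, for example via PYTHONPATH (a sketch, assuming your entry point is main.py):
pip install -r requirements.txt -t lib
PYTHONPATH=./lib python main.py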
If you're looking to install dependencies in special (non-standard) local folder for a specific purpose (e.g. AWS Lambda), see this question: install python package at current directory.
For normal workflows the following is the way to install dependencies locally (instead of globally, equivalent to npm i instead of npm i -g in Node):
The recommended way to do this is by using a virtual environment. You can install virtualenv via pip with
pip install virtualenv
Then create a virtual environment in your project directory:
python3 -m venv env # previously: `virtualenv env`
This will create a directory called env (you can call it anything you like, though) that mirrors your global Python installation. Inside env/ there will be a directory called lib where your dependencies will be stored.
Then activate the environment with:
source env/bin/activate
Then install your dependencies with pip and they will be installed in the virtual environment env/:
pip install -r requirements.txt
Then any time you return to the project, run source env/bin/activate again so that the dependencies can be found.
When you deploy your program, if the deployed environment is a physical server, or a virtual machine, you can follow the same process on the production machine. If the deployment environment is one of a few serverless environments (e.g. GCP App Engine), supplying a requirements.txt file will be sufficient. For some other serverless environments (e.g. AWS Lambda) the dependencies will need to be included in the root directory of the project. In that case, you should use pip install -r requirements.txt -t ./.
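For the AWS Lambda case, for example, the typical packaging sequence looks roughly like this (the file names are illustrative):
pip install -r requirements.txt -t package/
cd package && zip -r ../deployment.zip . && cd ..
zip -g deployment.zip lambda_function.py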
I would suggest getting Anaconda Navigator.
You can download it here: https://www.anaconda.com
Anaconda allows you to create virtual environments through a graphical interface, and you can install any pip package that is available through Anaconda.
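If you prefer the command line, the conda tool that ships with Anaconda does the same job; a minimal sketch (the environment name and Python version are illustrative):
conda create --name myenv python=3.9
conda activate myenv
pip install -r requirements.txt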
Once you have created and populated your environment, all you have to do is go to your designated Python editor (I mainly use PyCharm) and set the path to the virtual environment's interpreter when you select or change the interpreter for your project.
Hope this helps.
Hello internet strangers,
I need to do XSD verification with lxml, but I do not have sudo abilities on the machine I'm using, so pip is not an option. I'm on Fedora 27 and found the source code for lxml, which requires:
sudo apt-get install libxml2-dev libxslt-dev python-dev
But I can't use sudo on the machine I'm deploying to. Once I have those dependencies, all I need to do is get the lxml source code through wget or GitHub and then run setup.py install, but I can't do that if I can't install the above dependencies.
Help?
Useful links:
https://gist.github.com/blite/868292
https://github.com/lxml/lxml
I'd look into creating a virtual environment and installing dependencies there. Here are the docs: https://docs.python.org/3.6/library/venv.html
Basically you create a virtual environment like this:
$ python -m venv my_venv_name
That creates a folder called my_venv_name with a few folders under it (do this somewhere in your home directory). Then issue the command:
$ source my_venv_name/bin/activate
That activates your environment. At this point you can pip install into your virtual environment to your heart's content. You just need to remember to activate the environment for each session that you want to use it in.
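For lxml specifically, this usually sidesteps the system headers entirely, since modern pip pulls a prebuilt wheel rather than compiling from source. A sketch (the import check just proves the install worked):
(my_venv_name)$ pip install lxml
(my_venv_name)$ python -c "import lxml.etree; print(lxml.etree.LXML_VERSION)"
If pip does fall back to building from source (very old pip or an unusual platform), it will still need the libxml2/libxslt headers, and you'd have to ask an administrator for those.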
I do not have root privileges on a Linux server, so I want to create a virtual Python according to creating a "virtual" python.
After I run virtual-python.py, I do have python in ~/bin/python.
Then, according to the setuptools PyPI page, I download ez_setup.py and run ~/bin/python ez_setup.py, but an error occurs.
What should I do?
Looking at the linked website, it is outdated. These days you use pip, not easy_install.
For installing development packages, I always take the following rules in account:
The system package manager is responsible for system-wide packages, so never use sudo pip. This doesn't just apply to this question; it is always a good idea.
The package manager's packages are probably outdated, and you'll want up-to-date versions of your development tools.
I recommend the following way to install local development tools.
$ # Install pip and setuptools on a user level
$ curl https://bootstrap.pypa.io/get-pip.py | python - --user
$ # Add the executables to your path. Add this to your `.bashrc` or `.profile` as well
$ export PATH=$PATH:$HOME/.local/bin
At this point pip should be accessible from the command line and usable without sudo. Use this to install virtualenv, the most widely used tool for setting up virtual environments.
$ pip install virtualenv --user
Now simply use virtualenv to set up an environment to run your application in:
$ virtualenv myapp
Now activate the virtual environment and do whatever you would like to do with it. Note that after activating the virtual environment, pip refers to pip installed inside of the virtualenv, not the one installed on a user level.
$ source myapp/bin/activate
(myapp)$ pip install -r requirements.txt # This is just an example
You'll want to create a new virtual environment for each application you run on the server, so the dependencies can't conflict.
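To check which pip is active at any point, which gives a quick answer (the path shown is illustrative):
(myapp)$ which pip
/home/user/myapp/bin/pip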