I am coming from Node.js and learning Python, and was wondering how to properly install the packages listed in a requirements.txt file locally in the project.
For Node, this is done by managing and installing the packages in package.json via npm install. However, the convention for Python projects seems to be to add packages to a directory called lib. When I do pip install -r requirements.txt, I think this does a global install on my computer, similar to Node's global npm install -g. How can I install the dependencies of my requirements.txt file in a folder called lib?
Use this command:
pip install -r requirements.txt -t <path-to-the-lib-directory>
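Note that packages installed into a local folder such as lib are not on Python's import path by default, so your code has to add them itself. A minimal sketch, assuming the folder is named lib and sits next to your entry script (the requests import is just an example):
import os
import sys

# Make the locally installed packages in ./lib importable
sys.path.insert(0, os.path.join(os.path.dirname(os.path.abspath(__file__)), "lib"))

import requests  # any package installed into lib/ can now be imported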
If you're looking to install dependencies in a special (non-standard) local folder for a specific purpose (e.g. AWS Lambda), see this question: install python package at current directory.
For normal workflows, the following is the way to install dependencies locally (instead of globally, the equivalent of npm i instead of npm i -g in Node):
The recommended way to do this is by using a virtual environment. On Python 3 the built-in venv module can create one; alternatively, you can install virtualenv via pip with
pip install virtualenv
Then create a virtual environment in your project directory:
python3 -m venv env # previously: `virtualenv env`
This will create a directory called env (you can call it anything you like) that mirrors your global Python installation. Inside env/ there will be a directory called lib where your dependencies will be installed.
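On Linux/macOS the resulting layout looks roughly like this (Windows uses Scripts/ and Lib/ instead):
env/
    bin/                           # activate script plus the environment's own python and pip
    lib/python3.x/site-packages/   # your installed dependencies end up here
    pyvenv.cfg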
Then activate the environment with:
source env/bin/activate
Then install your dependencies with pip and they will be installed in the virtual environment env/:
pip install -r requirements.txt
Then any time you return to the project, run source env/bin/activate again so that the dependencies can be found.
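When you're done working, you can leave the environment with:
deactivate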
When you deploy your program, if the deployment environment is a physical server or a virtual machine, you can follow the same process on the production machine. If the deployment environment is one of a few serverless environments (e.g. GCP App Engine), supplying a requirements.txt file will be sufficient. For some other serverless environments (e.g. AWS Lambda), the dependencies need to be included in the root directory of the project. In that case, you should use pip install -r requirements.txt -t ./.
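For example, a minimal sketch of packaging for AWS Lambda (the zip name and the excluded env/ directory are just illustrative):
pip install -r requirements.txt -t .
zip -r function.zip . -x "env/*"   # bundle your code together with the vendored dependencies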
I would suggest getting Anaconda Navigator.
You can download it here: https://www.anaconda.com
Anaconda allows you to create virtual environments through a graphical interface. You can install any pip package that is available through Anaconda.
Then, after you have created your environment and added packages to it, all you have to do is go to your designated Python editor (I mainly use PyCharm) and set the path to the virtual environment's interpreter when you select or change the interpreter for your project.
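If you prefer the command line over the graphical interface, the equivalent conda commands look roughly like this (the environment name and Python version are illustrative):
conda create -n myproject python=3.9   # create an isolated environment
conda activate myproject               # switch into it
pip install -r requirements.txt        # pip packages install into the active conda environment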
Hope this helps.
Related
I have installed Python on my system and wrote a simple script that uses the Jenkins REST API (GET requests) to fetch data.
I have installed all the required modules using pip. Now I want to package this script with all its dependencies and run it on another machine. However, on the other machine, I don't want to perform all the pip installation steps.
I know we can list all the modules in requirements.txt and use pip install -r requirements.txt. But is there any way to avoid installing each dependency with pip, so that only Python needs to be installed on the other machine and all other dependencies come bundled when I run the zip file?
You can install pip dependencies to a certain directory using -t (target).
pip install -r requirements.txt -t .
That will install your pip modules into the current directory. You can then zip the whole thing and deploy it. Make sure that the environment you install the dependencies in matches your intended deployment environment. For consistency you can run the command in a Docker container, for example.
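For example, a minimal sketch using a Docker container so the installed packages match a Linux deployment target (the image tag is illustrative):
docker run --rm -v "$PWD":/work -w /work python:3.9 \
    pip install -r requirements.txt -t .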
I think you should use the virtualenv module, which makes your project easily deployable.
A virtual environment should be used whenever you work on any Python-based project. It is generally good to have one new virtual environment for every Python-based project you work on, so the dependencies of every project are isolated from the system and from each other.
I came across a link that explains this well: Virtual Env explained
While developing my app I didn't use an environment. Now I want to use one and export all dependencies of my app into an environment.yml / requirements.txt file that I can afterwards use to build a Docker image.
The issue is, if I create an environment and then export it with:
conda env export > environment.yml
I get no dependencies in that file.
Or if I use:
pip freeze --local > requirements.txt
I see all system modules that have nothing to do with my project.
I would imagine conda or pip has something that would just go through all the files in the directory I am in and place all imports and their dependencies inside the environment.yml / requirements.txt file.
I can't find the command to do that.
You can use virtualenv to isolate your application's pip environment from the rest of your system. Use:
virtualenv <your_project_path>/venv
This will create a virtual environment for your app. Then use:
source venv/bin/activate
This will isolate your pip environment. Reinstall all your dependencies and run pip freeze; you will see only project-related dependencies.
pip freeze by default lists all pip modules installed on the system. If you use virtualenv and then install your dependencies, your pip modules will reside in your application folder.
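A minimal sketch of the whole flow (the package names are just examples):
virtualenv venv
source venv/bin/activate
pip install flask requests        # reinstall only what your project actually uses
pip freeze > requirements.txt     # now contains only project-related dependencies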
Edit:
Based on your comments, I would recommend a good IDE such as PyCharm. You can follow the tutorial here for setting up a venv and handling all your dependencies. Once done, you can run pip freeze for your requirements.txt.
I have a Flask app with a Pipfile and run pipenv run python setup.py sdist to create a package. I copy the package to another system.
Usually I would install it with pip and all requirements declared in install_requires in setup.py would be installed automatically.
How can I install the package and its requirements and make use of the Pipfile.lock?
If I install the package with pip, I could run pipenv install --deploy in the directory it was installed, but how can I reliably retrieve the directory my package was installed in? Is this a good way to do this?
I'm looking for the best way to install a python app with setuptools and pipenv.
I'll share my thoughts on this.
First, the abstract dependencies of your project should be listed in setup.py's install_requires. They should not be pinned if possible, and reading dependencies in setup.py from somewhere else is not recommended.
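For example, a minimal setup.py sketch with abstract (unpinned) dependencies; the package name and dependencies are illustrative:
from setuptools import setup, find_packages

setup(
    name="myapp",
    version="0.1.0",
    packages=find_packages(),
    install_requires=[
        "flask",           # abstract dependency, no exact pin
        "requests>=2.0",   # a lower bound is fine when needed
    ],
)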
Thus, if you install a package created with python setup.py sdist, that project's Pipfile.lock will not be used. Instead, the user of the package is responsible for locking the dependencies and installing the package into a virtualenv.
To use our Pipfile.lock we need a different approach to deployment. I've collected a few.
1) git clone the repository on the target machine or rsync -r it to the target machine. Run pipenv install --deploy in the cloned project directory. There are several ways of using the virtualenv:
Launch the app with pipenv run <appname> from the cloned project directory. Make sure you are the same user who created the virtualenv.
Retrieve the virtualenv location by running pipenv --venv from the cloned project directory as the same user who created the virtualenv and use it directly for running your app.
Set PIPENV_VENV_IN_PROJECT=1 environment variable before running pipenv install --deploy to get a consistent virtualenv location that you can then use directly for running your app.
Before running pipenv, manually create a virtualenv then run pipenv from there. Pipenv will automatically use that virtualenv instead of creating a new one. See https://stackoverflow.com/a/49388414/7662112 for a complete workflow.
2) Use pipenv install --system --deploy to install the dependencies from your Pipfile.lock into a Docker image (see the Dockerfile sketch after this list). Then just use the docker image for deployment.
3) Dump Pipfile.lock into a requirements.txt with pipenv lock --requirements > requirements.txt and build a deb package using dh-virtualenv.
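For option 2, a minimal Dockerfile sketch (the base image and entry point are illustrative):
FROM python:3.9-slim
RUN pip install pipenv
WORKDIR /app
COPY Pipfile Pipfile.lock ./
RUN pipenv install --system --deploy   # install from Pipfile.lock into the image's system site-packages
COPY . .
CMD ["python", "-m", "yourapp"]        # replace with your actual entry point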
I do not have root privilege on a Linux server, so I want to create a virtual Python according to creating a "virtual" python.
After I run virtual-python.py, I do have python in ~/bin/python:
Then, according to the setuptools PyPI page, I download ez_setup.py and run ~/bin/python ez_setup.py. An error occurs:
What should I do?
Looking at the linked website, it looks outdated. You use pip, not easy_install.
For installing development packages, I always take the following rules into account:
The system package manager is responsible for system-wide packages, so never use sudo pip. This isn't just relevant to this question; it's always a good idea.
The package manager packages are probably outdated. You'll want an up-to-date version for development tools.
I recommend the following way to install local development tools.
$ # Install pip and setuptools on a user level
$ curl https://bootstrap.pypa.io/get-pip.py | python - --user
$ # Add the executables to your path. Add this to your `.bashrc` or `.profile` as well
$ export PATH=$PATH:$HOME/.local/bin
At this point pip should be accessible from the command line and usable without sudo. Use this to install virtualenv, the most widely used tool to set up virtual environments.
$ pip install virtualenv --user
Now simply use virtualenv to set up an environment to run your application in:
$ virtualenv myapp
Now activate the virtual environment and do whatever you would like to do with it. Note that after activating the virtual environment, pip refers to the pip installed inside the virtualenv, not the one installed at the user level.
$ source myapp/bin/activate
(myapp)$ pip install -r requirements.txt # This is just an example
You'll want to create a new virtual environment for each application you run on the server, so the dependencies can't conflict.
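For example (the environment name and requirements file for the second app are illustrative):
(myapp)$ deactivate                 # leave the current environment
$ virtualenv otherapp               # a separate environment for another application
$ source otherapp/bin/activate
(otherapp)$ pip install -r other-requirements.txt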
I'm using a virtualenv to run Python 2.7 on my local machine and everything works as expected. When I transfer "site-packages" to my production server, I get the following error:
PIL/_imaging.so: invalid ELF header
This happens with the Pillow 2.5.3 PyPI package found here.
I am running OS X, while my production server is running Debian. I suspect the OS differences might be causing the issue but I'm not sure. I have no idea how to fix this. Can anyone help?
Note: I cannot install packages directly to my production server, so I have to upload them directly to use them.
In your current virtual environment, execute the following command
pip freeze > requirements.txt
Copy this requirements.txt file to your server.
Create your new virtual environment (delete the one you were using before).
Activate the virtual environment and then type pip install -r requirements.txt
Now the libraries will be installed correctly and built for the server's platform as well.
If you see errors for PIL, execute the following commands:
sudo apt-get install build-essential python-dev
sudo apt-get build-dep python-imaging
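After reinstalling, you can quickly check that Pillow was built correctly for the server with something like:
python -c "from PIL import Image; print('Pillow OK')"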
Virtual environments are for isolating Python on your current machine; they are not for creating portable environments. The benefit is being able to work with different versions of Python packages without modifying the system Python installation.
Using virtual environments does not require super-user permissions, so you can install packages even if you are not "root".
It does, however, require Internet access, since packages are downloaded from the web. If your server does not have access to the Internet, then back on your Mac, do the following from your virtual environment:
pip install basket
This will install basket, which is a small utility that allows you to download packages without installing them. Great for keeping a local archive of packages that you can move to other machines.
Once it's installed, follow these steps as listed in the documentation:
basket init
pip freeze > requirements.txt
awk -F'==' '{print $1}' requirements.txt | basket download
This will download all the packages from your requirements.txt file into ~/.basket
Next, copy this directory to your server and then run the following command from your virtual environment:
pip install --no-index -f /path/to/basket -r requirements.txt