I have a Flask app with a Pipfile and run pipenv run python setup.py sdist to create a package. I copy the package to another system.
Usually I would install it with pip and all requirements declared in install_requires in setup.py would be installed automatically.
How can I install the package and its requirements and make use of the Pipfile.lock?
If I install the package with pip, I could run pipenv install --deploy in the directory it was installed, but how can I reliably retrieve the directory my package was installed in? Is this a good way to do this?
I'm looking for the best way to install a Python app with setuptools and pipenv.
I'll share my thoughts on this.
First, the abstract dependencies of your project are to be listed in setup.py's install_requires. They should not be pinned if possible, and reading the dependencies into setup.py from somewhere else is not recommended.
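For illustration, a minimal setup.py sketch along those lines (the package name and dependencies are made up):

from setuptools import setup, find_packages

setup(
    name="myflaskapp",  # hypothetical package name
    version="0.1.0",
    packages=find_packages(),
    install_requires=[
        "flask",           # abstract, unpinned
        "requests>=2.20",  # a loose lower bound is still abstract
    ],
)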
Thus, if you install a package created with python setup.py sdist, that project's Pipfile.lock will not be used. Instead, the user of the package is responsible for locking the dependencies and installing the package into a virtualenv.
To make use of the Pipfile.lock we need a different approach to deployment. I've collected a few options.
1) git clone the repository on the target machine, or rsync -r it there. Run pipenv install --deploy in the cloned project directory. There are several ways of using the virtualenv:
Launch the app with pipenv run <appname> from the cloned project directory. Make sure you are the same user who created the virtualenv.
Retrieve the virtualenv location by running pipenv --venv from the cloned project directory as the same user who created the virtualenv and use it directly for running your app.
Set the PIPENV_VENV_IN_PROJECT=1 environment variable before running pipenv install --deploy to get a consistent virtualenv location (./.venv inside the project) that you can then use directly for running your app (see the sketch after this list).
Before running pipenv, manually create a virtualenv then run pipenv from there. Pipenv will automatically use that virtualenv instead of creating a new one. See https://stackoverflow.com/a/49388414/7662112 for a complete workflow.
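A minimal sketch of option 1 using the PIPENV_VENV_IN_PROJECT variant (repository URL, paths and module name are hypothetical):

$ git clone https://example.com/myflaskapp.git /opt/myflaskapp
$ cd /opt/myflaskapp
$ export PIPENV_VENV_IN_PROJECT=1   # the virtualenv will be created in ./.venv
$ pipenv install --deploy           # aborts if Pipfile.lock is out of date
$ ./.venv/bin/python -m myflaskapp  # run the app from the known venv location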
2) Use pipenv install --system --deploy to install the dependencies from your Pipfile.lock into the system Python inside Docker (no extra virtualenv is needed in a container). Then just use the Docker image for deployment.
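For example, a minimal Dockerfile along these lines (the base image and entry point are assumptions, not something prescribed by pipenv):

FROM python:3.11-slim
RUN pip install pipenv
WORKDIR /app
COPY Pipfile Pipfile.lock ./
RUN pipenv install --system --deploy   # locked deps go into the container's system Python
COPY . .
CMD ["python", "-m", "myflaskapp"]     # hypothetical entry point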
3) Dump Pipfile.lock into a requirements.txt with pipenv lock --requirements > requirements.txt (newer pipenv versions replace this with pipenv requirements > requirements.txt) and build a deb package using dh-virtualenv.
Related
I am coming from NodeJS and learning Python, and I was wondering how to properly install the packages listed in a requirements.txt file locally in the project.
For Node, this is done by managing and installing the packages in package.json via npm install. However, the convention for a Python project seems to be to add packages to a directory called lib. When I do pip install -r requirements.txt, I think this does a global install on my computer, similar to Node's npm install -g. How can I install the dependencies of my requirements.txt file in a folder called lib?
Use this command:
pip install -r requirements.txt -t <path-to-the-lib-directory>
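Note that Python won't look in that folder automatically. A minimal sketch, assuming your entry script app.py sits next to a lib folder:

$ pip install -r requirements.txt -t ./lib
$ PYTHONPATH=./lib python app.py   # make the interpreter search ./lib for imports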
If you're looking to install dependencies in a special (non-standard) local folder for a specific purpose (e.g. AWS Lambda), see this question: install python package at current directory.
For normal workflows the following is the way to install dependencies locally (instead of globally, equivalent to npm i instead of npm i -g in Node):
The recommended way to do this is by using a virtual environment. On Python 3 the built-in venv module covers this; on older setups you can install virtualenv via pip with
pip install virtualenv
Then create a virtual environment in your project directory:
python3 -m venv env # previously: `virtualenv env`
This will create a directory called env (you can call it anything you like) containing an isolated copy of your Python installation. Inside env/ there will be a lib directory whose site-packages folder is where your dependencies get installed.
Then activate the environment with:
source env/bin/activate
Then install your dependencies with pip and they will be installed in the virtual environment env/:
pip install -r requirements.txt
Then any time you return to the project, run source env/bin/activate again so that the dependencies can be found.
When you deploy your program, if the deployed environment is a physical server or a virtual machine, you can follow the same process on the production machine. If the deployment environment is one of a few serverless environments (e.g. GCP App Engine), supplying a requirements.txt file will be sufficient. For some other serverless environments (e.g. AWS Lambda) the dependencies need to be included in the root directory of the project; in that case, you should run pip install -r requirements.txt -t ./ so the dependencies land in the project root.
I would suggest getting Anaconda Navigator.
You can download it here: https://www.anaconda.com
Anaconda allows you to create virtual environments through a graphical interface. You can install any pip package that is available through Anaconda.
Then all you have to do after you have created and added onto your environment is to go to your designated Python editor (I mainly use PyCharm) and set the path to the virtual environment's interpreter when you select or change the interpreter for your project.
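If you prefer the command line to the Navigator GUI, the same workflow with conda itself looks like this (the environment name and Python version are arbitrary):

$ conda create -n myproject python=3.11   # create the environment
$ conda activate myproject                # activate it
$ pip install -r requirements.txt         # pip also works inside a conda environment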
Hope this helps.
When I create a virtualenv for a Python project, it gets "polluted" by packages that I install for my convenience (like IPython) or that my editor (VS Code) depends on (like pylint).
But these packages are not relevant for my project. So if I do pip freeze > requirements.txt, I see that only a few of the listed packages are actually relevant for my project.
What is the best way to clean up?
Install those packages in a global context so that I can use them in every project I begin? or
Do a pip freeze > requirements.txt, then edit the requirements file and remove the packages that are not needed?
What we do here:
First we have the project's requirements file - the one used for deployments. This is not built using pip freeze but edited manually so it only contains relevant packages.
Then we have the "dev" requirements file with packages that are only useful for development but are required to work on the project (linters, additional testing stuff, etc.).
And finally, everyone is free to maintain their own additional personal requirements (editor-related packages etc.).
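A common way to lay out this split (the file names and packages here are just illustrative) is to chain the files with -r:

# requirements.txt - deployment dependencies, manually curated
flask>=2.0

# dev-requirements.txt - development-only additions
-r requirements.txt   # include the deployment dependencies too
pylint
pytest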
Note that with virtualenvwrapper (which really helps for development installs) you can define hooks that will install packages whenever you create a new virtualenv.
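For example, a sketch of such a hook (assuming mkvirtualenv is run from the project root, where dev-requirements.txt lives; the hook file sits under your WORKON_HOME):

$ echo "pip install -r dev-requirements.txt" >> $WORKON_HOME/postmkvirtualenv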
Here is an alternative to preparing requirements.txt manually: pipreqs.
pipreqs prepares a requirements.txt for your project based on the imports in your project's Python files.
Assuming all of your Python files are in myproject, run these in your terminal:
$ pip install pipreqs
$ pipreqs myproject
will generate a requirements.txt file for you.
This way, you can just pip install -r requirements.txt in your virtual environment instead of pip freeze > requirements.txt, since you will have only the packages which are related to your project.
While developing my app I didn't use an environment. Now I want to use one and export all dependencies of my app to an environment.yml / requirements.txt file that I can afterwards use to build a Docker image.
The issue is, if I create an environment and then export it with:
conda env export > environment.yml
I get no dependencies in that file.
Or if I use:
pip freeze --local > requirements.txt
I see all system modules that have nothing to do with my project.
I would imagine conda or pip has something that would just go through all my files in the directory I am in and place all imports and their dependencies inside the environment.yml / requirements.txt file.
I can't find the command to do that.
You can use virtualenv to isolate your application's pip environment from the rest of your system. Use:
virtualenv <your_project_path>/venv
This will create a virtual environment for your app. Then use:
source venv/bin/activate
This will isolate your pip environment. Reinstall all your dependencies, then run pip freeze; you will see only project-related dependencies.
pip freeze by default lists all pip modules installed on the system. If you use virtualenv and then install your dependencies, your pip modules will reside in your application folder.
Edit:
Based on your comments, I would recommend a good IDE such as PyCharm. You can follow the tutorial here for setting up a venv and handling all your dependencies. Once done, you can run pip freeze to produce your requirements.txt.
I created a project in PyCharm that uses flask (among a few other modules) installed in a PyCharm-created virtual environment of Python 3.6.1, and the app itself works great in PyCharm. When I went to set up a requirements.txt file for other virtual environments, however, I noticed that only virtualenv was listed in the output file. (To create the file, I went to the console and ran pip freeze > requirements.txt.)
After testing some things out, I noticed that pip list only gave me three modules installed to begin with. I know this is not true because when I go into the settings of my interpreter, PyCharm says there are a lot of other modules installed, but it doesn't seem like they're actually installed.
How can I create a requirements file in PyCharm effectively? Do I just have to list all the modules I installed in my readme and have my users figure out how to install them all themselves?
(Screenshots of the Project Settings dialog and the pip list output are omitted here.)
Use pipreqs
$ pip install pipreqs
$ pipreqs /home/project/location
Successfully saved requirements file in /home/project/location/requirements.txt
This will export the packages used in your current project directory into requirements.txt
See pipreqs
Why not pip freeze?
pip freeze only saves the packages that were installed with pip install in your environment.
pip freeze saves all packages in the environment, including those you don't use in your current project (unless you're using a virtualenv).
Sometimes you just need to create a requirements.txt for a new project without installing the modules.
It's certainly strange; the only thing I can think of is that it's a problem with virtualenv on Windows.
Anyway, it wouldn't be best practice to create your requirements.txt from pip freeze, because you have more packages installed than the ones your project actually requires.
E.g. let's say that your project only requires Flask:
$ pip install Flask
$ pip freeze
click==6.7
Flask==0.12.2
itsdangerous==0.24
Jinja2==2.9.6
MarkupSafe==1.0
Werkzeug==0.12.2
As you can see, installing Flask caused many more packages to be installed, but you don't have to list those in your requirements.txt, as they are not your project's requirements; they are Flask's requirements.
Therefore you should construct your requirements.txt manually. What I usually do is pip install Flask; pip freeze | grep Flask and copy the line Flask==0.12.2 to my requirements.txt, doing this every time I install something with pip.
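That workflow looks like this (the version number will differ on your machine):

$ pip install Flask
$ pip freeze | grep Flask
Flask==0.12.2   # copy this line into requirements.txt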
I've been looking around for a package manager that can be used with Python. I want to list project dependencies in a file.
For example, Ruby uses a Gemfile, where you can run bundle install.
How can I achieve this in Python?
The pip tool is becoming the standard equivalent of Ruby's gems.
Like distribute, pip uses the PyPI package repository (by default) for resolving and downloading dependencies.
pip can install dependencies from a file listing project dependencies (called requirements.txt by convention):
pip install -r requirements.txt
You can "freeze" the current packages on the Python path using pip as well:
pip freeze > requirements.txt
When used in combination with the virtualenv package, you can reliably create project Python environments with a project's required dependencies.
Pipenv
(I know it's an old question and it already has an answer, but this is for anyone coming here looking for a different answer, like me.)
I've found a very good equivalent of npm: it's called pipenv. It handles both virtualenv and pip requirements at the same time, so it's more like npm.
Simple Use Case
pip install pipenv
Then you can make a new virtualenv with Python 3, as well as a Pipfile that will be filled with your project's requirements and other stuff (note: newer pipenv versions have dropped --three in favour of pipenv --python 3.x):
pipenv install --three
Using your created virtualenv:
pipenv shell
Installing a new Python package:
pipenv install requests
Running your .py file:
pipenv run python somefile.py
You can find its docs here.
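For reference, after pipenv install requests the generated Pipfile looks roughly like this (exact contents vary with the pipenv version):

[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
requests = "*"

[dev-packages]

[requires]
python_version = "3.11"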
Python uses pip for a package manager. The pip install command has a -r <file> option to install packages from the specified requirements file.
Install command:
pip install -r requirements.txt
Example requirements.txt contents:
Foo >= 1.2
PickyThing <1.6,>1.9,!=1.9.6,<2.0a0,==2.4c1
SomethingWhoseVersionIDontCareAbout
See the Requirements Parsing section of the docs for a full description of the format: https://pip.pypa.io/en/stable/user_guide/#requirements-files
This is how I restrict pip's scope to the current project. It feels like the opposite if you're coming from NodeJS's npm or PHP's composer, where you explicitly specify global installations with -g or --global.
If you don't already have virtualenv installed, then install it globally with:
pip install virtualenv
Each Python project should have its own virtualenv installation. It's easy to set one up: just cd to your project's root and run:
python3 -m virtualenv env # creates env folder with everything you need
Activate virtualenv:
source env/bin/activate
Now, any interaction with pip is contained within your project.
Run pip install package_name==version for each of your dependencies. They are installed in ./env/lib/python3.x/site-packages/
When you want to save your project's dependencies to a file, run:
pip freeze > requirements.txt
You actually don't need -l or --local if you're in an activated project-specific virtualenv (which you should be).
Now, when you want to install your dependencies from requirements.txt, set up your virtualenv, and run:
pip install -r requirements.txt
That's all.
This is an old question, but things are constantly evolving.
Further to the other answer about pipenv: there is also a Python package manager called poetry.
There is a detailed comparison between pipenv and poetry here: Feature comparison between npm, pip, pipenv and poetry package managers. It also maps the features to common npm features.
Here is a comparison of pipenv vs poetry vs pdm: https://dev.to/frostming/a-review-pipenv-vs-poetry-vs-pdm-39b4
The conclusion there is that pdm is the winner.
But in my experience, poetry is easier than pdm to integrate with IDEs.
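For completeness, a minimal poetry workflow looks like this (the project and script names are arbitrary):

$ pip install poetry        # or use poetry's official installer
$ poetry new myproject      # scaffolds a project with a pyproject.toml
$ cd myproject
$ poetry add requests       # records the dependency and updates poetry.lock
$ poetry install            # installs everything into poetry's virtualenv
$ poetry run python app.py  # hypothetical entry script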