After creating a fresh folder and creating a virtual environment
$ virtualenv venv --distribute
And installing two packages
$ pip install Flask gunicorn
Then writing all of the current pip installed packages to a file
$ pip freeze > requirements.txt
$ cat requirements.txt
Flask==0.10.1
Jinja2==2.7
MarkupSafe==0.18
Werkzeug==0.9.1
distribute==0.6.34
gunicorn==17.5
itsdangerous==0.22
wsgiref==0.1.2
I get this longer-than-expected list of packages. What is responsible for installing them, and what are they used for? The packages in question:
wsgiref==0.1.2
itsdangerous==0.22
distribute==0.6.34
MarkupSafe==0.18
I've used pip mostly on my Ubuntu box, where identical commands did not leave these packages installed; I've noticed this behaviour only on my Mac.
wsgiref and distribute are always present in the virtualenv, even an "empty" one where you have not yet pip install'ed anything. See the accepted answer to my question Why does pip freeze report some packages in a fresh virtualenv created with --no-site-packages? for an explanation. Note this is a bug fixed in Python 3.3.
itsdangerous and MarkupSafe are relatively recent, new dependencies pulled in by newer Flask releases.
itsdangerous (docs) is required by Flask directly, since version 0.10 - see the GitHub commit which added this dependency.
MarkupSafe (docs) is required by Jinja2, which is required by Flask. Jinja2 added this dependency in version 2.7 - see the GitHub commit.
You say that these are not installed on your Ubuntu box after running identical commands. But what version of Flask and Jinja2 do you have there? If they are older than the versions on your Mac, that might explain why they didn't pull in these new dependencies.
It looks like those are Flask dependencies (or dependencies of Flask's dependencies):
pip install --no-install --verbose Flask
I was hoping PyPI had a list of dependencies for each project, but I didn't see one...
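If you just want to see a package's direct dependencies once it is installed, one option (plain pip with a reasonably recent version; not something the original answer mentions, and note that --no-install has been removed from newer pip releases) is pip show, whose Requires line lists them - output abbreviated here:
$ pip show Flask
Name: Flask
Version: 0.10.1
Requires: Werkzeug, Jinja2, itsdangerous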
Your virtualenv uses the packages installed system wide, so pip sees them along your newly installed ones.
Try adding the --no-site-packages option when creating your environment.
Or, try explicitly running the pip instance installed in your environment (path/to/your/env/bin/pip opts...); maybe that will tell pip to ignore the system's packages (not sure about that one at all).
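A minimal sketch of both suggestions combined, assuming a Unix-like shell (on modern virtualenv releases --no-site-packages is already the default behaviour):
$ virtualenv venv --no-site-packages
$ source venv/bin/activate
$ which pip    # should point at venv/bin/pip, not the system pip
$ pip freeze   # should now list only packages installed inside this venv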
So I'm pretty new to Python and I need pip for something I want to do, but every time I install Python the Scripts folder where pip should be always ends up empty. I make sure that "install pip" is checked when downloading Python. How do I get it to install pip with Python?
If you installed Python, it came bundled with pip.
The default packages are installed here:
C:\Users\<Your username>\AppData\Local\Programs\Python\Python310\Lib\site-packages
A better practice is to create a venv for every project you are working on. Every venv will have its own set of dependencies (i.e. packages), and it will be easier to manage them.
You can find installed packages in your venv:
YOUR_PROJECT_PATH\venv\Lib\site-packages
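A minimal sketch of that on Windows, assuming the py launcher is installed (folder names are just examples):
:: create and activate a venv inside the project folder
py -m venv venv
venv\Scripts\activate
:: packages now go into venv\Lib\site-packages
pip install flask
pip freeze > requirements.txt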
EDIT
From official docs:
Pip not installed: It is possible that pip does not get installed by default. One potential fix is:
python -m ensurepip --default-pip
There are also additional resources for installing pip.
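If the Scripts folder still ends up empty after that, a hedged fallback is to call pip through the interpreter itself, which does not depend on Scripts being on your PATH:
python -m ensurepip --default-pip
python -m pip --version
python -m pip install --upgrade pip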
Django: pip freeze > requirements.txt is not getting just the packages installed in the virtual env; instead it's getting all the packages I have ever installed, which is not what I want. Here is an image of what's happening (a long pip freeze listing); there are still more packages below it. What can I do?
Whenever you do
pip freeze
it prints out all the installed packages. Maybe you are mixing in the packages that were installed as dependencies of the packages you installed manually.
For example, if you install Flask, it will also install Jinja2 and Werkzeug.
I can't think of any case where you would want the packages you installed but not their dependencies. It's not a problem at all.
On the other hand, if it's actually giving you all the packages ever installed, it means you have always installed your packages into the same environment. You should use a separate environment for each of your projects (sometimes even more than one for a single project). In this case, create another virtual environment, install your requirements, and then run pip freeze again. Steps below.
python3 -m venv venv
source venv/bin/activate
pip install {required packages}
pip freeze > requirements.txt
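In a fresh environment like that, pip freeze lists only what you installed plus its direct dependencies. For example, after installing just Django the output would look roughly like this (version numbers here are purely illustrative):
asgiref==3.7.2
Django==4.2
sqlparse==0.4.4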
I created a project in PyCharm that uses Flask (among a few other modules) installed in a PyCharm-created virtual environment of Python 3.6.1, and the app itself works great in PyCharm. When I went to set up a requirements.txt file so others could recreate the virtual environment, however, I noticed that only virtualenv was listed in the output file. (To create the file, I went to the "console" and did "pip freeze > requirements.txt".)
After testing some things out, I noticed that pip list only gave me three modules installed to begin with. I know this is not true because when I go into the settings of my interpreter, PyCharm says there are a lot of other modules installed, but it doesn't seem like they're actually installed.
How can I create a requirements file in PyCharm effectively? Do I just have to list all the modules I installed in my readme and have my users figure out how to install them all themselves?
[Screenshots: the Project Settings dialog and the pip list output.]
Use pipreqs
$ pip install pipreqs
$ pipreqs /home/project/location
Successfully saved requirements file in /home/project/location/requirements.txt
This will export the packages used in your current project directory into requirements.txt
See pipreqs
Why not pip freeze?
pip freeze only saves the packages that were installed with pip install in your environment.
pip freeze saves all packages in the environment, including those that you don't use in your current project (if you don't have a virtualenv).
Sometimes you just need to create a requirements.txt for a new project without installing the modules.
It's certainly strange; the only thing I can think of is that it's a problem with virtualenv on Windows.
Anyway, it wouldn't be best practice to create your requirements.txt from pip freeze, because you have more packages installed than the ones your project actually requires.
E.g. let's say that your project only requires Flask:
$ pip install Flask
$ pip freeze
click==6.7
Flask==0.12.2
itsdangerous==0.24
Jinja2==2.9.6
MarkupSafe==1.0
Werkzeug==0.12.2
As you can see, installing Flask installed many more packages, but you don't have to list those in your requirements.txt: they are not your project's requirements, they are Flask's requirements.
Therefore you should construct your requirements.txt manually. What I usually do is pip install Flask; pip freeze | grep Flask and copy the line Flask==0.12.2 into my requirements.txt, doing this every time I install something with pip.
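A concrete sketch of that manual workflow (the version number is simply whatever pip freeze reports at the time):
$ pip install Flask
$ pip freeze | grep Flask
Flask==0.12.2
$ echo "Flask==0.12.2" >> requirements.txt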
I've been looking around for a package manager that can be used with python. I want to list project dependencies in a file.
For example, Ruby uses a Gemfile, where you can run bundle install.
How can I achieve this in Python?
The pip tool is becoming the standard equivalent of Ruby's gems.
Like distribute, pip uses the PyPI package repository (by default) for resolving and downloading dependencies.
pip can install dependencies from a file listing project dependencies (called requirements.txt by convention):
pip install -r requirements.txt
You can "freeze" the current packages on the Python path using pip as well:
pip freeze > requirements.txt
When used in combination with the virtualenv package, you can reliably create project Python environments with a project's required dependencies.
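A minimal sketch of that combination, assuming a Unix-like shell and an existing requirements.txt:
virtualenv venv
source venv/bin/activate
pip install -r requirements.txt
pip freeze > requirements.txt   # re-pin after installing anything new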
Pipenv
(I know it's an old question, and it already has an answer but for anyone coming here looking for a different answer like me.)
I've found a very good equivalent of npm; it's called pipenv. It handles both the virtualenv and the pip requirements at the same time, so it's more like npm.
Simple Use Case
pip install pipenv
Then you can make a new virtualenv with Python 3, as well as a Pipfile that will be filled with your project's requirements and other things:
pipenv install --three
using your created virtualenv:
pipenv shell
installing a new python package:
pipenv install requests
Running your .py file looks like this:
pipenv run python somefile.py
You can find its docs here.
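For reference, the Pipfile that pipenv maintains looks roughly like this; treat it as an illustrative sketch, since the exact contents depend on your pipenv and Python versions:
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"

[packages]
requests = "*"

[dev-packages]

[requires]
python_version = "3"

pipenv install <package> adds entries to [packages], pipenv install <package> --dev adds them to [dev-packages], and pipenv lock pins exact versions in Pipfile.lock.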
Python uses pip as its package manager. The pip install command has a -r <file> option to install packages from the specified requirements file.
Install command:
pip install -r requirements.txt
Example requirements.txt contents:
Foo >= 1.2
PickyThing <1.6,>1.9,!=1.9.6,<2.0a0,==2.4c1
SomethingWhoseVersionIDontCareAbout
See the Requirements Parsing section of the docs for a full description of the format: https://pip.pypa.io/en/stable/user_guide/#requirements-files
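The format also supports extras and environment markers; a small hedged example (the package names are illustrative):
requests[socks] >= 2.8.1                       # install with the "socks" extra
SomePackage == 1.3 ; python_version < "3.8"    # only on older Pythons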
This is how I restrict pip's scope to the current project. It feels like the opposite if you're coming from NodeJS's npm or PHP's composer where you explicitly specify global installations with -g or --global.
If you don't already have virtualenv installed, then install it globally with:
pip install virtualenv
Each Python project should have its own virtualenv installation. It's easy to set one up: just cd to your project's root and run:
python3 -m virtualenv env # creates env folder with everything you need
Activate virtualenv:
source env/bin/activate
Now, any interaction with pip is contained within your project.
Run pip install package_name==version for each of your dependencies. They are installed in ./env/lib/python3.x/site-packages/
When you want to save your project's dependencies to a file, run:
pip freeze > requirements.txt
You actually don't need -l or --local if you're in an activated project-specific virtualenv (which you should be).
Now, when you want to install your dependencies from requirements.txt, set up your virtualenv, and run:
pip install -r requirements.txt
That's all.
This is an old question but things are constantly evolving.
Further to the other answer about pipenv, there is also a Python package manager called poetry.
There is a detailed comparison between pipenv and poetry here: Feature comparison between npm, pip, pipenv and poetry package managers. It also links the features to common npm features.
Here is a comparison of pipenv vs poetry vs pdm: https://dev.to/frostming/a-review-pipenv-vs-poetry-vs-pdm-39b4
The conclusion is that pdm is the winner.
But in my experience, poetry is easier than pdm to integrate with IDEs.
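A minimal, hedged sketch of the poetry workflow (check the poetry docs for the commands supported by your version; the project and package names are just examples):
pip install poetry
poetry new myproject            # or run `poetry init` inside an existing project
cd myproject
poetry add requests             # records the dependency in pyproject.toml
poetry install                  # creates the virtualenv and writes poetry.lock
poetry run python somefile.py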
I have some problems with Jenkins and creating a virtualenv. I'm using the ShiningPanda plugin and the "Virtualenv Builder" build step combined with pyenv.
I can install packages with "pip install package", but I cannot install requirements from a requirements file, because the later packages cannot find the already-installed ones, e.g. numexpr cannot find/import numpy.
As I was typing my question, I found the answer to that problem: the current version (v0.21) of the ShiningPanda plugin does NOT support pip's requirements.txt in Virtualenv Builders.
https://wiki.jenkins-ci.org/display/JENKINS/ShiningPanda+Plugin
The current version (0.23) works in our setup like this (in a Build - Virtualenv Builder step, with Nature: Shell):
pushd %run_dir%
SET PYTHONPATH=%CD%
python -m pip install --upgrade -r configurations/requirements.txt
This has worked well even if libraries require each other.
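On a Unix build node, the hedged equivalent of that shell step would be roughly the following (the run_dir variable and the configurations/ path are taken from the Windows version above and are specific to that setup):
cd "$run_dir"
export PYTHONPATH="$PWD"
python -m pip install --upgrade -r configurations/requirements.txt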