Sometimes I download Python source code from GitHub and don't know how to install all the dependencies. If there is no requirements.txt file, I have to create it by hand.
The question is:
Given the Python source code directory, is it possible to create requirements.txt automatically from the import statements?
You can use the following code to generate a requirements.txt file:
pip install pipreqs
pipreqs /path/to/project
More info about pipreqs can be found here.
Sometimes you come across pip freeze, but this saves all packages in the environment, including those that you don't use in your current project.
Using Pipenv or other tools is recommended for improving your development flow.
pip3 freeze > requirements.txt # Python3
pip freeze > requirements.txt # Python2
If you do not use a virtual environment, pigar will be a good choice for you.
For Python 3 (I have both Python 2 and 3 on my machine, where Python 2 is the default):
# install
pip3 install pipreqs
# Run in current directory
python3 -m pipreqs.pipreqs .
python2:
pip install pipreqs
python -m pipreqs.pipreqs .
To check your python version:
python --version
In my case, I use Anaconda, so running the following command from conda terminal inside my environment solved it, and created this requirements.txt file for me automatically:
conda list -e > requirements.txt
This was taken from this Github link pratos/condaenv.txt
If you see an error and you are using Anaconda, try the .yml option:
conda env export > <environment-name>.yml
For another person to use the environment, or if you are creating a new environment on another machine:
conda env create -f <environment-name>.yml
The .yml option was found here.
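For reference, an exported environment file has roughly this shape; the names, versions, and channels below are only illustrative, and a real conda env export also pins exact build strings and adds a prefix line:
name: myproject
channels:
  - defaults
dependencies:
  - python=3.9
  - numpy=1.21.2
  - pandas=1.3.2
  - pip
  - pip:
      - some-pypi-only-package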
Kinda mind-blowing how this simple task is so complicated in Python. Here is what I think is the best way to do it automatically.
You need two tools:
1. pipreqs
pip3 install pipreqs
pipreqs will go through your project and list only the packages that your project actually uses, instead of all the packages in your Python environment as pip freeze would do.
But there's a problem with this approach: it does not include the sub-packages.
For example, your project uses pandas==1.3.2. pandas itself uses numpy==1.21.2, among other packages. But pipreqs itself does not write the sub-packages (i.e. numpy) to requirements.txt.
This is where you need to combine pipreqs with the second tool.
2. pip-tools
pip3 install pip-tools
pip-tools will take the packages in requirements.in and generate the requirements.txt with all the sub-packages. For example, if you have pandas==1.3.2 in requirements.in, pip-tools would generate numpy==1.21.2 # via pandas in requirements.txt.
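As a rough illustration, a requirements.in containing only pandas could compile to a requirements.txt along these lines (the exact pins are assumptions and the pip-compile header comment is omitted):
requirements.in:
pandas==1.3.2
requirements.txt (generated by pip-compile):
numpy==1.21.2
    # via pandas
pandas==1.3.2
    # via -r requirements.in
python-dateutil==2.8.2
    # via pandas
pytz==2021.1
    # via pandas
six==1.16.0
    # via python-dateutil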
But you need to manually add the packages to requirements.in, which is error-prone, and you might forget to do this once in a while.
This is where you can use the first tool.
But both tools write to requirements.txt by default. So how do you fix it?
Use the --savepath option for pipreqs to write to requirements.in instead of the default requirements.txt.
To do it in one command, just run:
pipreqs --savepath=requirements.in && pip-compile
There you go. Now you don't need to worry about manually maintaining the packages, and your requirements.txt will have all the sub-packages, so your build is deterministic.
TL;DR
pip3 install pipreqs
pip3 install pip-tools
Use the following to build a deterministic requirements.txt
pipreqs --savepath=requirements.in && pip-compile
Most of the answers using pipreqs didn't work for me, so here is my answer.
To generate the requirements.txt file:
pip install pipreqs
python -m pipreqs.pipreqs --encoding utf-8 /path/to/project
I prefer pipreqs over pip freeze, because pip freeze saves all packages in the environment, including those that you don't use in your current project, while pipreqs only saves the ones you actually use in your project.
To install the requirements use:
pip3 install -r requirements.txt
I blindly followed the accepted answer of using
pip3 freeze > requirements.txt
It generated a huge file that listed all the dependencies of the entire solution, which is not what I wanted.
So you need to figure out what sort of requirements.txt you are trying to generate.
If you need a requirements.txt file that has ALL the dependencies, then use pip3 freeze:
pip3 freeze > requirements.txt
However, if you want to generate a minimal requirements.txt that only lists the dependencies you need, then use the pipreqs package. This is especially helpful if you have numerous requirements.txt files at the per-component level in the project rather than a single file at the solution-wide level.
pip install pipreqs
pipreqs [path to folder]
e.g. pipreqs .
pipreqs . --force --ignore=tests (overwrites the existing requirements.txt and ignores the tests directory)
First, your project file must be a .py file, i.e. a plain Python file. If your file is in .ipynb format, you can convert it to .py with the command below:
jupyter nbconvert --to=python <notebook-name>.ipynb
Then you need to install the pipreqs library from cmd (Terminal on Mac):
pip install pipreqs
Now we can create the txt file using the command below. If you are in the same directory as your file, you can just write ./; otherwise you need to give the path to your file.
pipreqs ./
or
pipreqs /home/project/location
That will create a requirements.txt file for your project.
Make sure to run pip3 for python3.7.
pip3 freeze >> yourfile.txt
Before executing the above command make sure you have created a virtual environment.
python3:
pip3 install virtualenv
python3 -m venv <myenvname>
python2:
pip install virtualenv
virtualenv <myenvname>
After that, put your source code in the directory. If you run the Python file now, it probably won't launch if you are using non-standard modules. You can install those modules by running pip3 install <module> or pip install <module>.
This will not affect your global module list; only the environment you are in is changed.
Now you can execute the freeze command at the top, and you will have a requirements file that contains only the modules you installed in the virtual environment.
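Putting the steps together, a typical session might look like this (the environment and package names are only examples):
python3 -m venv myenvname
source myenvname/bin/activate    # on Windows: myenvname\Scripts\activate
pip install requests flask       # install only what the project actually needs
pip freeze > requirements.txt    # lists only requests, flask and their dependencies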
I advise everyone to use environments as it makes things easier when it comes to stuff like this.
Simple Pythonic Way
To get a list of all the REQUIREMENTS in a standard requirements.txt file, you can use the following command.
pip freeze > requirements.txt
Now, this should automatically create a standard requirements file with all of the packages installed alongside their corresponding versions.
Pretty Print on Terminal
If you just want to get a pretty print on the terminal you can use the following approach.
pip list
This lists all of the installed packages, in a pretty print format.
Custom Dependency
If you have a project folder, say a GitHub repo, and you want to get a custom requirements.txt for that project, you can use the following package:
pipreqs - https://pypi.org/project/pipreqs/
Usage
$ pipreqs /home/project/location
Successfully saved requirements file in /home/project/location/requirements.txt
Contents of requirements.txt
wheel==0.23.0
Yarg==0.1.9
docopt==0.6.2
If you have installed many dependencies on your system and you need a requirements.txt for a specific project, you can first install pipreqs:
$ pip install pipreqs
and execute the command below inside the project folder:
$ pipreqs
This command will generate a requirements.txt file for that particular project.
Automatic requirements.txt updating approach
While developing a Python application with requirements.txt, we have several choices:
1. Generate requirements.txt after development, when we want to deploy it. This is done with pip freeze > requirements.txt, or with pipreqs for a less messy result.
2. Add every module to requirements.txt manually after each install.
3. Install a manager that will handle requirements.txt updates for us.
There are many answers for the first option, and the second is self-explanatory, so I would like to describe the third approach. There is a library called to-requirements.txt. To install it, type this:
pip install to-requirements.txt # Pip install to requirements.txt
If you read the whole command at once, you will see what it does. After installing it, you should set it up. Run:
requirements-txt setup
It overrides the pip scripts so that each pip install or pip uninstall automatically updates your project's requirements.txt file with the required versions of the packages. The overriding is done safely, so that after uninstalling this package, pip will behave as usual.
And you can customize the way it works. For example, disable it globally and activate it only for the required directories, activate it only for git repositories, or allow or disallow creating the requirements.txt file if it does not exist.
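For example, after the one-time setup, a session might look like this (the pinned version shown is only illustrative):
$ requirements-txt setup
$ pip install requests
$ cat requirements.txt
requests==2.26.0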
Links:
Documentation - https://requirements-txt.readthedocs.io/en/latest/
GitHub - https://github.com/VoIlAlex/requirements-txt
PyPI - https://pypi.org/project/to-requirements.txt/
If you are facing the same issue as me, i.e. not in a virtual environment, and you want a requirements.txt for a specific project or a selected folder (including its children) and pipreqs does not support this, you can use the following script:
import os
import sys
import subprocess

from fuzzywuzzy import fuzz

path = "C:/Users/Username/Desktop/DjangoProjects/restAPItest"

# Collect all .py files under the project folder (including subfolders)
pyfiles = []
for root, dirs, files in os.walk(path):
    for file in files:
        if file.endswith('.py'):
            pyfiles.append(os.path.join(root, file))

# Pull the top-level module name out of every line that contains an import
stopWords = ['from', 'import', ',', '.']
importables = []
for file in pyfiles:
    with open(file) as f:
        content = f.readlines()
        for line in content:
            if "import" in line:
                for sw in stopWords:
                    line = ' '.join(line.split(sw))
                importables.append(line.strip().split(' ')[0])
importables = set(importables)

# Dump the whole environment first, then keep only what is actually imported
subprocess.call(f"pip freeze > {path}/requirements.txt", shell=True)
with open(path + '/requirements.txt') as req:
    modules = req.readlines()
modules = {m.split('=')[0].lower(): m for m in modules}

# Built-in modules (and os) should not appear in requirements.txt
notList = [''.join(i.split('_')) for i in sys.builtin_module_names] + ['os']

new_requirements = []
for req_module in importables:
    try:
        new_requirements.append(modules[req_module])
    except KeyError:
        # Fall back to fuzzy matching when the import name differs from the PyPI name
        for k, v in modules.items():
            if len(req_module) > 1 and req_module not in notList:
                if fuzz.partial_ratio(req_module, k) > 90:
                    new_requirements.append(modules[k])
new_requirements = list(set(new_requirements))

with open(path + '/requirements.txt', 'w') as req:
    req.write(''.join(new_requirements))
P.S.: It may include a few additional libraries, since it matches names with fuzzy logic.
The best way for Python 3 is:
pip3 freeze > requirements.txt
It worked for me.
If you want to list only packages used inside a virtualenv use:
pip freeze -l > requirements.txt
You can just do it with one command; it will create requirements.txt and add the relevant modules automatically.
For Unix: pip3 freeze > requirements.txt
For Windows: pip freeze > requirements.txt
Create requirements.txt:
For Python 3, the command is:
pip3 freeze > requirements.txt
For Python 2, the command is:
pip freeze > requirements.txt
Install from requirements.txt:
For Python 3, the command is:
pip3 install -r requirements.txt
For Python 2, the command is:
pip install -r requirements.txt
Not a complete solution, but may help to compile a shortlist on Linux.
grep --include='*.py' -rhPo '^\s*(from|import)\s+\w+' . | sed -r 's/\s*(import|from)\s+//' | sort -u > requirements.txt
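For a small Flask project, the generated file might look something like this (module names are just an example); note that standard-library modules such as os and sys, as well as local modules, also end up in the list and have to be removed by hand:
flask
os
requests
sys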
Or, if you are using something like virtualenv, you can just run this command to generate a requirements.txt:
$ ./.venv/bin/pip freeze > requirements.txt
I created this bash command.
for l in $(pip freeze); do p=$(echo "$l" | cut -d'=' -f1); f=$(find . -type f -exec grep "$p" {} \; | grep 'import'); [[ ! -z "$f" ]] && echo "$l" ; done;
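The same command spread over multiple lines with comments, for readability (functionally equivalent; redirect the output to requirements.txt if you want a file):
for l in $(pip freeze); do
    # the package name is the part before the '==' version pin
    p=$(echo "$l" | cut -d'=' -f1)
    # collect any lines in the project that mention the package name and contain 'import'
    f=$(find . -type f -exec grep "$p" {} \; | grep 'import')
    # print the frozen requirement only if the package appears to be imported somewhere
    [[ ! -z "$f" ]] && echo "$l"
done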
@Francis has it right - https://stackoverflow.com/a/65728461/1021819
But just to add:
With additional support for Jupyter notebooks - i.e. .ipynb files - you can now use https://pypi.org/project/pipreqsnb (same syntax as pipreqs):
pip install pipreqsnb
pipreqsnb .
[I am not an author]
Using pip freeze > requirements.txt is a bad way to create the requirements file! It can serve as a temporary solution for your problem, but when managing requirements for a Python project it is best to do it manually.
A simple search for "import" or "from x import" will give you the list of all dependencies that need to be installed (nothing extra).
The problem with pip freeze is that it simply dumps all installed packages with strict versions; every dependency has its own dependencies, and they are all included in the dump.
For example, say you have lib==1.0 installed, which needs sub-lib==0.5. If you use pip freeze you'll get both, but later, when you wish to update lib to 2.0, you'll most likely get conflicts, since lib v2.0 now uses sub-lib v1.0 rather than the v0.5 you pinned... This gets complex fast with multiple dependencies.
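To make that concrete with the same made-up names, compare a raw pip freeze dump with a hand-maintained file (versions are hypothetical):
# requirements.txt from pip freeze: everything pinned, including transitive dependencies
lib==1.0
sub-lib==0.5
# hand-maintained requirements.txt: only the direct dependency
lib==1.0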
We ran into those problems in a couple of projects; since then I have created an automated script to clean up pip freeze's dumps. It is safe (it comments out unneeded dependencies) and works great.
To help solve this problem, always generate requirements.txt from local packages only. By local packages I mean the packages that belong only to your project's environment. To do this, run:
pip freeze --local > requirements.txt
not plain pip freeze > requirements.txt.
Note that it's a double dash before local.
However, installing pipreqs helps too:
pip install pipreqs
The perfect solution, though, is to have a Pipfile. The Pipfile updates on its own whenever you install a new local package. It also has a Pipfile.lock, similar to package-lock.json in JavaScript.
To do this, always install your packages with pipenv, not pip.
So we use pipenv install <package> instead of pip install <package>.
Pipenv users can generate the requirements.txt file from the project's Pipfile with:
pipenv lock --requirements
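Note: in newer pipenv releases the --requirements flag has been removed; if the command above errors out, the newer equivalent is:
pipenv requirements > requirements.txt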
Related
I've been looking around for a package manager that can be used with Python. I want to list project dependencies in a file.
For example, Ruby uses a Gemfile, where you can run bundle install.
How can I achieve this in Python?
The pip tool is becoming the standard equivalent of Ruby's gems.
Like distribute, pip uses the PyPI package repository (by default) for resolving and downloading dependencies.
pip can install dependencies from a file listing project dependencies (called requirements.txt by convention):
pip install -r requirements.txt
You can "freeze" the current packages on the Python path using pip as well:
pip freeze > requirements.txt
When used in combination with the virtualenv package, you can reliably create project Python environments with a project's required dependencies.
Pipenv
(I know it's an old question, and it already has an answer but for anyone coming here looking for a different answer like me.)
I've found a very good equivalent of npm; it's called pipenv. It handles both virtualenv and pip requirements at the same time, so it's more like npm.
Simple Use Case
pip install pipenv
Then you can make a new virtualenv with Python 3, as well as a Pipfile that will be filled with your project's requirements and other metadata:
pipenv install --three
To use the created virtualenv:
pipenv shell
To install a new Python package:
pipenv install requests
To run your .py file:
pipenv run python somefile.py
You can find its documentation here.
Python uses pip for a package manager. The pip install command has a -r <file> option to install packages from the specified requirements file.
Install command:
pip install -r requirements.txt
Example requirements.txt contents:
Foo >= 1.2
PickyThing <1.6,>1.9,!=1.9.6,<2.0a0,==2.4c1
SomethingWhoseVersionIDontCareAbout
See the Requirements Parsing section of the docs for a full description of the format: https://pip.pypa.io/en/stable/user_guide/#requirements-files
This is how I restrict pip's scope to the current project. It feels like the opposite if you're coming from NodeJS's npm or PHP's composer where you explicitly specify global installations with -g or --global.
If you don't already have virtualenv installed, then install it globally with:
pip install virtualenv
Each Python project should have its own virtualenv installation. It's easy to set one up, just cd to your project's root and:
python3 -m virtualenv env # creates env folder with everything you need
Activate virtualenv:
source env/bin/activate
Now, any interaction with pip is contained within your project.
Run pip install package_name==version for each of your dependencies. They are installed in ./env/lib/python3.x/site-packages/
When you want to save your project's dependencies to a file, run:
pip freeze > requirements.txt
You actually don't need -l or --local if you're in an activated project-specific virtualenv (which you should be).
Now, when you want to install your dependencies from requirements.txt, set up your virtualenv, and run:
pip install -r requirements.txt
That's all.
This is an old question but things are constantly evolving.
Further to the other answer about pipenv, there is also a Python package manager called poetry.
There is a detailed comparison between pipenv and poetry here: Feature comparison between npm, pip, pipenv and poetry package managers. It also maps the features to common npm features.
Here is a comparison of pipenv vs poetry vs pdm: https://dev.to/frostming/a-review-pipenv-vs-poetry-vs-pdm-39b4
The conclusion is that pdm is the winner.
But in my experience, poetry is easier than pdm to integrate with IDEs.
To install dependences, the appengine-python-flask-skeleton docs advise running this command:
pip install -r requirements.txt -t lib
That works simply enough.
Now say I want to add the Requests package.
Ideally I just add it to the requirements.txt file:
# This requirements file lists all third-party dependencies for this project.
#
# Run 'pip install -r requirements.txt -t lib/' to install these dependencies
# in `lib/` subdirectory.
#
# Note: The `lib` directory is added to `sys.path` by `appengine_config.py`.
Flask==0.10
requests
And then re-run the command:
pip install -r requirements.txt -t lib
However, as this GitHub issue for pip notes, pip is not idempotent with the -t option recommended by Google here. The existing Flask packages will be re-added, and this will lead to the following error when running the dev app server:
ImportError: cannot import name exceptions
How can I best work around this problem?
As others have said, updating pip solves the issue for many, but for what it's worth, I think you can get around all of this if using virtualenv is an option. Symlink your virtualenv's site-packages/ directory to lib/ and always keep an up-to-date requirements.txt file. There is no duplication of packages this way, and you won't have to install dependencies manually. See also https://stackoverflow.com/a/30447848/2295256
Upgrading to the latest version of pip solved my problem (that issue had been closed):
pip install -U pip
Otherwise, as noted in that thread, you can always just wipe out your lib directory and reinstall from scratch. One note of warning: if you manually added additional packages to the lib directory not tracked in requirements.txt, they would be lost and have to be re-installed manually.
In Node.js, I can do npm install package --save-dev to save the installed package into the package.json file.
How do I achieve the same thing in Python package manager pip? I would like to save the package name and its version into, say, requirements.pip just after installing the package using something like pip install package --save-dev requirements.pip.
There isn't an equivalent with pip.
Best way is to pip install package && pip freeze > requirements.txt
You can see all the available options on their documentation page.
If it really bothers you, it wouldn't be too difficult to write a custom bash script (pips) that takes a -s argument and freezes to your requirements.txt file automatically.
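A rough sketch of such a wrapper, where the pips name and the -s flag are just the hypothetical ones mentioned above, not an existing tool:
pips() {
    # with -s, append the installed package's pinned version to requirements.txt
    local save=0
    if [ "$1" = "-s" ]; then
        save=1
        shift
    fi
    pip install "$1" || return 1
    if [ "$save" -eq 1 ]; then
        pip freeze | grep -i "^$1==" >> requirements.txt
    fi
}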
Edit 1
Since writing this there has been no change in providing an automatic --save-dev option similar to npm's; however, Kenneth Reitz (the author of requests and many more) has released some more info about a better pip workflow to better handle pip updates.
Edit 2
Linked from the "better pip workflow" article above, it is now recommended to use pipenv to manage requirements and virtual environments. Having used this a lot recently, I would like to summarise how simple the transition is:
Install pipenv (on Mac)
brew install pipenv
pipenv creates and manages its own virtual environments, so in a project with an existing requirements.txt, installing all requirements (I use Python 3.7, but you can remove the --three if you do not) is as simple as:
pipenv --three install
Activating the virtualenv to run commands is also easy
pipenv shell
Installing requirements will automatically update the Pipfile and Pipfile.lock
pipenv install <package>
It's also possible to update out-of-date packages
pipenv update
I highly recommend checking it out, especially if you are coming from an npm background, as it has a similar feel to package.json and package-lock.json.
This simple line is a starting point. You can easily build a bash command that reuses the PACKAGE name in the line.
pip install PACKAGE && pip freeze | grep PACKAGE >> requirements.txt
Thanks to @devsnd for the simple bash function example:
function pip-install-save {
    pip install $1 && pip freeze | grep $1 >> requirements.txt
}
To use it, just run:
pip-install-save some-package
I've created a Python package called pipm that wraps around the actual pip. All pip commands work as they are, plus they are reflected in the requirements file. Unlike pip-save (inactive for some time), a similar tool I found and wasn't able to use, it can handle many files and environments (test, dev, production, etc.). It also has a command to upgrade all or any of your dependencies.
installation
pipm install pkg-name
installation as development dependency
pipm install pkg-name --dev
installation as testing dependency
pipm install pkg-name --test
removal
pipm uninstall pkg-name
update all your dependencies
pipm update
install all your dependencies from the requirements file
pipm install
including development dependencies
pipm install --dev
Update: apparently, pipenv is not officially endorsed by Python maintainers, and the previously-linked page is owned by a different organization. The tool has its pros and cons, but the below solution still achieves the result that the OP is seeking.
pipenv is a dependency management tool that wraps pip and, among other things, provides what you're asking:
https://pipenv.kennethreitz.org/en/latest/#example-pipenv-workflow
$ pipenv install <package>
This will create a Pipfile if one doesn't exist. If one does exist, it will automatically be edited with the new package you provided.
A Pipfile is a direct equivalent of package.json, while Pipfile.lock corresponds to package-lock.json.
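For reference, a minimal Pipfile (as created by something like pipenv install requests) looks roughly like this; the exact contents depend on your pipenv and Python versions:
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
requests = "*"
[dev-packages]
[requires]
python_version = "3.9"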
You can manually save it in a Makefile (or in a text file that is then imported into your Makefile):
PYTHON=.venv/bin/python  # path to python
PIP=.venv/bin/pip        # path to pip
SOURCE_VENV=. .venv/bin/activate

install:
	virtualenv .venv
	$(SOURCE_VENV) && $(PIP) install -e PACKAGE
	$(SOURCE_VENV) && $(PIP) install -r requirements.txt  # other required packages
and then just run make install
How about making a shell function to do this?
Add the code below to your ~/.profile or ~/.bashrc:
pips() {
    local pkg=$1
    if [ -z "$1" ]; then
        echo "usage: pips <pkg name>"
        return 1
    fi
    local _ins="pip install $pkg"
    eval $_ins
    pip freeze | grep $pkg -i >> requirements.txt
}
Then run source ~/.profile or source ~/.bashrc to load it into your current terminal.
When you want to install and save a package, just run, for example, pips requests.
After the package is installed, its version will be saved into requirements.txt in your current directory.
I am using this small command line to install a package and save its version in requirements.txt:
pkg=package && pip install $pkg && echo $(pip freeze | grep -i $pkg) >> requirements.txt
I made a quick hack on pip to add a --save option to the install/uninstall commands.
Please have a look at my blog for more information about this hack:
http://blog.abhiomkar.in/2015/11/12/pip-save-npm-like-behaviour-to-pip/
Installation (GitHub):
https://github.com/abhiomkar/pip-save
Hope this helps.
What about this one:
pip freeze >> requirements.txt