How can I run the make install target only if requirements.txt has changed?
I don't want to upgrade packages every time I run make install.
I found a workaround by creating a fake file _requirements.txt.pyc, but it is ugly and dirty. It refuses to install the pip requirements a second time because requirements.txt has no changes:
$ make install-pip-requirements
make: Nothing to be done for 'install-pip-requirements'.
But my goal is to do:
# first time,
$ make install # create virtual environment, install requirements
# second time
$ make install # detect existing virtual env and skip creating it,
# detect that requirements.txt has no changes
# and skip installing all python packages again
make: Nothing to be done for 'install'.
The Python package looks like this:
.
├── Makefile
├── README.rst
├── lambda_handler.py
└── requirements.txt
I am using a Makefile for some automation in Python:
# create virtual env if the folder does not exist
/opt/virtual_env:
	python -m venv /opt/virtual_env

virtual: /opt/virtual_env

# if requirements.txt is modified then execute pip install
_requirements.txt.pyc: requirements.txt
	/opt/virtual_env/bin/pip install --upgrade -r requirements.txt
	echo > _requirements.txt.pyc

requirements: SOME MAGIC OR SOME make flags
	pip install -r requirements.txt

install-pip-requirements: _requirements.txt.pyc

install: virtual requirements
I am sure there must be a better way to do this ;)
Not sure it will answer your question at this point. The better way is to use a fully fledged Python PIP project template.
We use cookiecutter to create a particular pip package with this cookiecutter template.
It has a Makefile, which does not constantly re-install all the dependencies, and it makes use of Python tox, which allows running a project's tests in different Python envs automatically. You can still develop in a dev virtualenv, but we update it only when a new package is added; everything else is handled by tox.
But what you have shown so far is an attempt to write a Python build from scratch, something that has already been done in numerous project templates. If you really want to understand what is going on there, you can analyze those templates.
As a follow-up: since you expect it to work with a makefile, I'd suggest removing the --upgrade flag from the pip command. I suspect your requirements do not include the versions that are needed for the project to work. In our experience, not pinning versions there can badly break things. Thus our requirements.txt looks like:
configure==0.5
falcon==0.3.0
futures==3.0.5
gevent==1.1.1
greenlet==0.4.9
gunicorn==19.4.5
hiredis==0.2.0
python-mimeparse==1.5.2
PyYAML==3.11
redis==2.10.5
six==1.10.0
eventlet==0.18.4
Using the requirements without --upgrade makes pip simply verify what is already in the virtualenv and what is not. Everything that satisfies the required version is skipped (no download). You can also reference git versions in requirements like this:
-e git+http://some-url-here/path-to/repository.git#branch-name-OR-commit-id#egg=package-name-how-to-appear-in-pip-freeze
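To see the effect of dropping --upgrade, run the install twice; a minimal sketch, assuming the virtualenv from the question at /opt/virtual_env:
# first run: downloads and installs everything pinned in requirements.txt
/opt/virtual_env/bin/pip install -r requirements.txt
# second run: pip only checks what is already present in the virtualenv
# and skips every requirement whose pinned version is already satisfied
/opt/virtual_env/bin/pip install -r requirements.txt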
@Andrei.Danciuc, make just needs two files to compare; you can use any of the output files produced by running pip install.
For example, I usually use a "vendored" folder, so I can alias the path to the "vendored" folder instead of using a dummy file.
# Only run install if requirements.txt is newer than the vendored folder
vendored-folder := vendored

.PHONY: install
install: $(vendored-folder)

$(vendored-folder): requirements.txt
	rm -rf $(vendored-folder)
	pip install -r requirements.txt -t $(vendored-folder)
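Run twice, this gives exactly the behaviour asked for in the question:
$ make install   # first run: installs everything into ./vendored
$ make install   # second run: vendored/ is newer than requirements.txt
make: Nothing to be done for 'install'.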
If you don't use a vendored folder, this code below should work for both virtualenv and global setups.
# Only run install if requirements.txt is newer than the SITE_PACKAGES location
.PHONY: install
SITE_PACKAGES := $(shell pip show pip | grep '^Location' | cut -f2 -d':')

install: $(SITE_PACKAGES)

$(SITE_PACKAGES): requirements.txt
	pip install -r requirements.txt
Related
I am building a python project -- potion. I want to use Github actions to automate some linting & testing before merging a new branch to master.
To do that, I am using a slight modification of a Github recommended python actions starter workflow -- Python Application.
During the step of "Install dependencies" within the job, I am getting an error. This is because pip is trying to install my local package potion and failing.
The code that is failing is if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
The corresponding error is:
ERROR: git+https#github.com:<github_username>/potion.git#82210990ac6190306ab1183d5e5b9962545f7714#egg=potion is not a valid editable requirement. It should either be a path to a local project or a VCS URL (beginning with bzr+http, bzr+https, bzr+ssh, bzr+sftp, bzr+ftp, bzr+lp, bzr+file, git+http, git+https, git+ssh, git+git, git+file, hg+file, hg+http, hg+https, hg+ssh, hg+static-http, svn+ssh, svn+http, svn+https, svn+svn, svn+file).
Error: Process completed with exit code 1.
Most likely, the job is not able to install the package potion because it cannot find it. I installed it on my own computer using pip install -e . and later used pip freeze > requirements.txt to create the requirements file.
Since I use this package in the tests, I need to install it so that pytest can run them properly.
How can I install a local package (which is under active development) on Github Actions?
Here is part of the Github workflow file python-app.yml
...
steps:
- uses: actions/checkout@v2
- name: Set up Python 3.8
  uses: actions/setup-python@v2
  with:
    python-version: 3.8
- name: Install dependencies
  run: |
    python -m pip install --upgrade pip
    pip install flake8 pytest
    if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
- name: Lint with flake8
...
Note 1: I have already tried changing from git+git@github.com:<github_username>... to git+git@github.com/<github_username>.... Pay attention to / instead of :.
Note 2: I have also tried using other protocols such as git+https, git+ssh, etc.
Note 3: I have also tried to remove the alphanumeric #8221... after git url ...potion.git
The "package under test", potion in your case, should not be part of the requirements.txt. Instead, simply add your line
pip install -e .
after the line with pip install -r requirements.txt in it. That installs the already checked out package in development mode and makes it available locally for an import.
Alternatively, you could put that line at the latest needed point, i.e. right before you run pytest.
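A sketch of the adjusted "Install dependencies" run block, assuming potion has been removed from requirements.txt:
python -m pip install --upgrade pip
pip install flake8 pytest
if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
# install the checked-out potion package itself in development mode
pip install -e .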
My startup command for my discord.py bot installs requirements from a requirements.txt file and puts them into a specified folder (the requirements folder). The issue I'm having is that the bot doesn't look in that folder for dependencies; it only looks at the root folder.
Is there a way I could make the bot look in a specific folder for dependencies instead of the root one? If I were to use the root folder, it would look messy and be harder to navigate.
Here is the command I use:
pip install -U --target /home/container/requirements -r requirements.txt; fi; /usr/local/bin/python /home/container/bot.py
If I understand correctly, you need to use a virtual environment.
You can do that in many ways: venv, poetry, pipenv.
I recommend you use pipenv; all you have to do is install your requirements.txt into the environment and then run your code inside of it.
pipenv install -r requirements.txt
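For example (a sketch, using the bot path from the question; pipenv run executes a command inside the environment it created):
pipenv install -r requirements.txt        # create the environment and install the pinned packages
pipenv run python /home/container/bot.py  # run the bot inside that environment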
I am creating a setup.py for my project:
install_requires = [
"rasa==1.0.1",
"beautifulsoup4==4.7.1",
"bs4==0.0.1",
"pyowm==2.10.0",
"flask-restful==0.3.7",
"google-cloud-translate==1.6.0",
"gensim==3.8.0",
]
And accordingly I have the requirements.txt:
../my-rasa
beautifulsoup4==4.7.1
bs4==0.0.1
pyowm==2.10.0
flask-restful==0.3.7
google-cloud-translate==1.6.0
gensim==3.8.0
Then I want to install them into my local environment:
pip install -r requirements.txt
pip install -e .
One question I have is that, of the 7 dependencies, all the others come from the public repository, while rasa==1.0.1 is in my local directory. With such a mixture of dependencies, how can I install requirements.txt and get all of them into my local environment?
You specify the path to the local dir in requirements.txt. Make sure the specified dir contains the setup.py of rasa:
/local/rasa
beautifulsoup4==4.7.1
bs4==0.0.1
pyowm==2.10.0
flask-restful==0.3.7
google-cloud-translate==1.6.0
gensim==3.8.0
More details on requirements files can be found here: https://pip.pypa.io/en/stable/user_guide/#requirements-files
Logically, a requirements file is just a list of pip install arguments placed in a file.
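A sketch of the resulting workflow, assuming the rasa checkout lives at the path referenced in requirements.txt:
# from the project root (the directory containing setup.py and requirements.txt)
pip install -r requirements.txt  # installs the local rasa checkout plus the pinned PyPI packages
pip install -e .                 # then installs this project itself in editable mode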
I've got the following error:
ERROR: Directory is not installable. Neither 'setup.py' nor 'pyproject.toml'
Background is that I'm following a guide online to expose an ML model via API Gateway on AWS that can be found here:
Hosting your ML model on AWS Lambdas + API Gateway
I'm trying to pull some python packages into a local folder using the following command:
pip install -r requirements.txt --no-deps --target python/lib/python3.6/site-packages/
I have also tried this:
pip install -r requirements.txt --no-deps -t python/lib/python3.6/site-packages/
and all I get is the above error.
Google is pretty bare when it comes to help with this issue, any ideas?
thanks,
Does this work?
You can create a new folder e.g. lib, and run this command:
pip3 install <your_python_module_name> -t lib/
I would suggest making the path to requirements.txt explicit, e.g. ./requirements.txt, if you're running the command in the same directory.
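For example (assuming the same target path as in the question):
pip install -r ./requirements.txt --no-deps -t python/lib/python3.6/site-packages/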
You may also need to add a basic setup.py to the folder you're trying to install. The pip docs mention that this error happens if there's no setup.py file:
When looking at the items to be installed, pip checks what type of item each is, in the following order:
1. Project or archive URL.
2. Local directory (which must contain a setup.py, or pip will report an error).
3. Local file (a sdist or wheel format archive, following the naming conventions for those formats).
4. A requirement, as specified in PEP 440.
https://pip.pypa.io/en/stable/cli/pip_install/#argument-handling
Please try this:
ADD requirements.txt ./
pip install -r requirements.txt --no-deps -t python/lib/python3.6/site-packages/
syntax: ADD source destination
'ADD requirements.txt ./' adds requirements.txt (assumed to be in the current working directory) to the Docker image's './' folder. This creates a layer that gives the daemon the context for the location of requirements.txt inside the image.
More about it in dockerfile-best-practices
You can change your directory as follows:
import os
os.chdir(path)
instead of:
cd path
also try to use:
!pip freeze > requirements.txt
instead of:
pip install -r requirements.txt
then execute your code:
!pip install .
or
!pip install -e .
in conclusion try this:
import os
os.chdir(path)
!pip freeze > requirements.txt
!pip install .
To create Python virtual environments I use virtualenv and pip. The workflow is very simple:
$ virtualenv project
$ cd project
$ . bin/activate
$ pip install -r /path/to/requirements/req1.txt
$ pip install -r /path/to/requirements/req2.txt
The number of different requirements files can grow enough to make it handy to have a way to include them all at once, so I'd prefer to be able to say:
$ pip install -r /path/to/requirements/req1_req2.txt
with req1_req2.txt containing something like:
include /path/to/requirements/req1.txt
include /path/to/requirements/req2.txt
or otherwise:
$ pip install -r /path/to/requirements/*.txt
None of that works, and however simple it might be, I can't figure out how to do what I want.
Any suggestions?
The -r flag isn't restricted to command-line use only; it can also be used inside requirements files. So running pip install -r req-1-and-2.txt when req-1-and-2.txt contains this:
-r req-1.txt
-r req-2.txt
will install everything specified in req-1.txt and req-2.txt.
Just as a note, you can also split the requirements based on your groupings and embed them in a single file (or, again, prepare multiple requirements files based on your environment) that you can then install.
For example, the test requirements here:
requirements-test.txt
pylint==2.4.4
pytest==5.3.2
The dev requirements here:
requirements-dev.txt
boto3>=1.12.11
Master requirements file containing your other requirements:
requirements.txt
-r requirements-dev.txt
-r requirements-test.txt
Now you can just install the requirements file that embeds your other requirements:
pip3 install -r requirements.txt
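Because the groups live in separate files, you can also install just one of them where that is all you need (using the example file names above):
pip3 install -r requirements-test.txt   # only the test tools, pylint and pytest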