I am working on a GitHub project that will eventually be released on both conda and PyPI. Because of PyPI (and current development needs) I need to make the installation work using this command:
python -m pip install -e . (run from inside the cloned GitHub folder, with pip 21)
However, my project has a dependency on a forked GitHub repo. The problem is that I cannot find a way to install the package from that fork using the command above, but I have to. Any trick I could find either got me nothing or got me the production version of the repo. The only two things that worked are (1) a manual or scripted launch of pip install [cannot do this, I need setup.py to install things] and (2) using conda [cannot rely on conda alone, I need regular pip to work].
Can you please help me or point me in the right direction?
I use a requirements.txt that contains this line at the bottom:
...
-e git+https://github.com/jojurgens/pyqode.qt.git#master#egg=pyqode.qt
and a setup.py that contains this
with open(here / "requirements.txt", encoding="utf-8") as f:
requirements = f.read().splitlines()
setup(
name="qiskit_metal",
version="0.0.2",
install_requires=requirements,
)
On execution I get this error:
...error in setup command: 'install_requires' must be a string or list of strings containing valid project/version requirement specifiers; Parse error at "'-e git+h'": Expected W:(abcd...)
Actually, I was finally able to find an answer. For whoever may need this info, here is what I did:
In requirements.txt I have changed the line to this:
pyqode.qt @ git+https://github.com/jojurgens/pyqode.qt.git#egg=pyqode.qt
That alone took care of the error.
That also solved my other (not described here) problem about "updating" the conda environment without corrupting it. So now I update the environment with the very same install command, python -m pip install -e . (as opposed to the update command, which does not work).
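If it helps to see why the original line failed: the strings passed to install_requires must parse as PEP 508 requirement specifiers, and pip options like -e are not requirement specifiers. A quick check, as a sketch, assuming the packaging library is installed (pip install packaging):

from packaging.requirements import InvalidRequirement, Requirement

lines = [
    "pyqode.qt @ git+https://github.com/jojurgens/pyqode.qt.git#egg=pyqode.qt",  # PEP 508 direct reference
    "-e git+https://github.com/jojurgens/pyqode.qt.git#egg=pyqode.qt",           # pip option, not a requirement
]
for line in lines:
    try:
        Requirement(line)
        print("parses:", line)
    except InvalidRequirement as exc:
        print("rejected:", line, "->", exc)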
I think what you should specify is the version of Shapely in the requirements, for example Shapely==<version>. If you want to install from a different source (and not from pip), do it outside of requirements.txt. For that you can invoke a bash command inside of your Python code.
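If you do go the route of calling pip from Python code, a minimal sketch (the URL is just the fork from the question; run this from a bootstrap or helper script, not from setup.py itself):

import subprocess
import sys

# install the forked dependency with the running interpreter's pip
subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    "git+https://github.com/jojurgens/pyqode.qt.git#egg=pyqode.qt",
])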
I'm aware there are many similar questions but I have been through them all to no avail.
On Ubuntu 18.04, I have Python 2 and Python 3.6. I create a venv using the command below and attempt to install a package using pip. However, it attempts to install on the global system and not in the venv.
python3 -m venv v1
When I run 'which python' it correctly picks the python within the venv. I have checked the v1/bin folder and pip is installed. The path within the pip script correctly points toward the python in the venv.
I have tried reinstalling python3 and venv, destroying and recreating the virtual environment, and many other things. I'm wondering if there is some rational way to understand and solve this.
The problem in my case was that the mounted drive I was working on was not mounted as executable. So pip couldn't be executed from within the venv on the mount.
This was confirmed because I was able to get a pip install to work using 'python -m pip install numpy', but when importing libraries, e.g. 'import numpy', I was then faced with a further error:
multiarray_umath.cpython-36m-x86_64-linux-gnu.so: failed to map segment from shared object
which led back to the permissions issue, as per the GitHub issue below. The fix by dvdabelle in the comments then fixes both the dependent and the original issue.
https://github.com/numpy/numpy/issues/15102
In his case, he could just switch drives. I have to use this drive, so the fix was to unmount my /data disk where I was working and remount it with the exec option:
sudo umount /data
sudo mount -o exec /dev/sda4 /data
'which pip' now points to the pip in the venv correctly
Note: to make it permanent, add the exec switch to the drive's line in fstab, as per https://download.tuxfamily.org/linuxvillage/Informatique/Fstab/fstab.html (make exec the last parameter in the options, or user will override it). E.g.
UUID=1332d6c6-da31-4b0a-ac48-a87a39af7fec /data auto rw,user,auto,exec 0 0
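If you want to confirm whether the mount is the culprit before remounting, a small check along these lines works (assumes Linux, where /proc/mounts is available, and that /data is the mount point in question):

# print the mount options for /data and flag noexec
with open("/proc/mounts") as mounts:
    for entry in mounts:
        device, mountpoint, fstype, options = entry.split()[:4]
        if mountpoint == "/data":
            print(options)
            print("noexec is set" if "noexec" in options.split(",") else "exec is allowed")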
I apologize ahead of time if these questions are very dumb. I'm pretty new to sourcing Python code from GitHub. The repository I am attempting to use is a study from this link: https://github.com/malllabiisc/ConfGCN. So far, what I have tried is downloading the code as a zip. Then, I followed the instructions from GitHub and downloaded Ubuntu to run the shell file setup.sh. However, I am running into errors: after running sudo bash setup.sh in Ubuntu, it gives me this error:
Install python dependencies
setup.sh: line 11: pip: command not found
I have checked out the respective files this references. It calls for:
echo "Install python dependencies"
pip install -r requirements.txt
The requirements.txt file contains a variety of Python packages that I have already installed inside a venv in PyCharm. It specifically calls for:
numpy==1.16.0
tensorflow==1.12.1
scipy==1.2.0
networkx==2.2
Previous lines in setup.sh run perfectly fine in terms of updating files included in the folder. Another question I have is, in general, how to set up a Python package. I am currently using PyCharm CE 2020 and I've attempted creating a Python package inside of my workspace. I noticed that it auto-generates an __init__.py file. How can I integrate my downloads from GitHub into my PyCharm project?
There is no reason to run setup.sh as root, because it is just supposed to install some packages, which does not necessitate sudo access. You can simply create a virtual environment and run setup.sh. To set up the environment, just run:
$ virtualenv -p /usr/bin/python3.6 myenv # Create an environment
$ source myenv/bin/activate # Load the environment
(myenv) $ ./setup.sh
Once the environment is ready, you should be able to run the code. You can make PyCharm use that environment for executing the code.
I added the pip installation folder in my Python site-packages directory to my PATH, but I can still only run it via python -m pip in my Git Bash. Just pip gives me "command not found".
I looked around the pip directory and I don't see any binaries, so this makes sense. Yet clearly pip is a thing that is normally used on the command line without python -m. So what component am I missing here?
Try adding C:/path/to/python/Scripts/ to PATH
There should be a pip.exe there!
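If you are not sure where that Scripts directory is for your interpreter, you can ask Python directly (the same call works on Linux, where the directory is named bin instead):

import sysconfig

# directory where console scripts such as pip.exe are installed for this interpreter
print(sysconfig.get_path("scripts"))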
In your Anaconda directory there is a pip executable in the Scripts directory.
Add this path: C:\Users\<yourname>\Anaconda3\Scripts
I did the same and it was successful! (When I typed pip in Git Bash, it worked.)
When would the -e, or --editable option be useful with pip install?
For some projects, the last line in requirements.txt is "-e .". What does it do exactly?
As the man page says:
-e,--editable <path/url>
Install a project in editable mode (i.e. setuptools "develop mode") from a local project path or a VCS url.
So you would use this when trying to install a package locally, most often in the case when you are developing it on your system. It will just link the package to the original location, basically meaning any changes to the original package would reflect directly in your environment.
An example run can be:
pip install -e .
or
pip install -e ~/ultimate-utils/ultimate-utils-proj-src/
Note that the second one is the full path to the directory where the setup.py lives.
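For context, a minimal sketch of the kind of setup.py such a directory would contain so that pip install -e . works (the name and version are placeholders, not taken from any project mentioned here):

from setuptools import find_packages, setup

setup(
    name="my_package",         # placeholder project name
    version="0.1.0",
    packages=find_packages(),  # pick up the package directories next to setup.py
)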
Concrete example of using --editable in development
If you play with this test package as in:
cd ~
git clone https://github.com/cirosantilli/vcdvcd
cd vcdvcd
git checkout 5dd4205c37ed0244ecaf443d8106fadb2f9cfbb8
python -m pip install --editable . --user
it outputs:
Obtaining file:///home/ciro/bak/git/vcdvcd
Installing collected packages: vcdvcd
Attempting uninstall: vcdvcd
Found existing installation: vcdvcd 1.0.6
Can't uninstall 'vcdvcd'. No files were found to uninstall.
Running setup.py develop for vcdvcd
Successfully installed vcdvcd-1.0.6
The "Can't uninstall 'vcdvcd'" message is normal: pip tried to uninstall any existing vcdvcd to then replace it with the "symlink-like mechanism" produced in the following steps, but failed because there was no previous installation.
Then it generates a file:
~/.local/lib/python3.8/site-packages/vcdvcd.egg-link
which contains:
/home/ciro/vcdvcd
.
and acts as a "symlink" to the Python interpreter.
So now, if I make any changes to the git source code under /home/ciro/vcdvcd, it reflects automatically on importers who can from any directory do:
python -c 'import vcdvcd'
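One way to confirm the editable link is active is to check where the module is actually loaded from; for the checkout used in this example it should point under /home/ciro/vcdvcd rather than into site-packages:

import vcdvcd

# for an editable install this prints a path inside the git checkout,
# e.g. /home/ciro/vcdvcd/vcdvcd/__init__.py, not a site-packages copy
print(vcdvcd.__file__)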
Note however that, with my pip version at least, executable scripts installed with --editable, such as the vcdcat script provided by that package via scripts= in setup.py, do not get symlinked, just copied to:
~/.local/bin/vcdcat
just like for regular installs, and therefore updates to the git repository won't directly affect them.
By comparison, a regular non --editable install from the git source:
python -m pip uninstall vcdvcd
python -m pip install --user .
produces a copy of the installed files under:
~/.local/lib/python3.8/site-packages/vcdvcd
Uninstall of an editable package as done above requires a new enough pip as mentioned at: How to uninstall editable packages with pip (installed with -e)
Tested in Python 3.8, pip 20.0.2, Ubuntu 20.04.
Recommendation: develop directly in-tree whenever possible
The editable setup is useful when you are testing your patch to a package through another project.
If however you can fully test your change in-tree, just do that instead of generating an editable install which is more complex.
E.g., the vcdvcd package above is set up in a way that you can just cd into the source and run ./vcdcat without pip installing the package itself (in general, you might need to install dependencies from requirements.txt though), and the import vcdvcd that that executable does (or possibly your own custom test) then finds the package correctly in the same directory it lives in.
From Working in "development" mode:
Although not required, it’s common to locally install your project in
“editable” or “develop” mode while you’re working on it. This allows
your project to be both installed and editable in project form.
Assuming you’re in the root of your project directory, then run:
pip install -e .
Although somewhat cryptic, -e is short for
--editable, and . refers to the current working directory, so together, it means to install the current directory (i.e. your
project) in editable mode.
Some additional insights into the internals of setuptools and distutils from “Development Mode”:
Under normal circumstances, the distutils assume that you are going to
build a distribution of your project, not use it in its “raw” or
“unbuilt” form. If you were to use the distutils that way, you would
have to rebuild and reinstall your project every time you made a
change to it during development.
Another problem that sometimes comes up with the distutils is that you
may need to do development on two related projects at the same time.
You may need to put both projects’ packages in the same directory to
run them, but need to keep them separate for revision control
purposes. How can you do this?
Setuptools allows you to deploy your projects for use in a common
directory or staging area, but without copying any files. Thus, you
can edit each project’s code in its checkout directory, and only need
to run build commands when you change a project’s C extensions or
similarly compiled files. You can even deploy a project into another
project’s checkout directory, if that’s your preferred way of working
(as opposed to using a common independent staging area or the
site-packages directory).
To do this, use the setup.py develop command. It works very similarly
to setup.py install, except that it doesn’t actually install anything.
Instead, it creates a special .egg-link file in the deployment
directory, that links to your project’s source code. And, if your
deployment directory is Python’s site-packages directory, it will also
update the easy-install.pth file to include your project’s source
code, thereby making it available on sys.path for all programs using
that Python installation.
It is important to note that pip uninstall cannot uninstall a module that has been installed with pip install -e. So if you go down this route, be prepared for things to get very messy if you ever need to uninstall. A partial solution is to (1) reinstall, keeping a record of files created, as in sudo python3 setup.py install --record installed_files.txt, and then (2) manually delete all the files listed, as in e.g. sudo rm -r /usr/local/lib/python3.7/dist-packages/tdc7201-0.1a2-py3.7.egg/ (for release 0.1a2 of module tdc7201). This does not clean everything up 100%, however; even after you've done it, importing the (removed!) local library may succeed, and attempting to install the same version from a remote server may fail to do anything (because it thinks your (deleted!) local version is already up to date).
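A sketch of the manual deletion step, assuming installed_files.txt was produced by the --record run above (inspect the list before deleting anything, and expect to need root for system-wide paths):

import os

with open("installed_files.txt") as record:
    for path in record.read().splitlines():
        if os.path.isfile(path):
            os.remove(path)  # delete each file the install recorded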
As suggested in previous answers, no symlinks are created.
How does the '-e' option work? It just updates the file "PYTHONDIR/site-packages/easy-install.pth" with the project path specified in the pip install -e command.
So each time Python searches for a package it will check this directory as well, which means any changes to the files in the project directory are instantly reflected.
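To see this for yourself, you can look for that file and print its contents (a sketch; note that newer pip/setuptools versions may write __editable__*.pth finder files instead of easy-install.pth):

import pathlib
import site

# look for easy-install.pth in this interpreter's site-packages directories
for directory in site.getsitepackages() + [site.getusersitepackages()]:
    pth = pathlib.Path(directory) / "easy-install.pth"
    if pth.exists():
        print(pth)
        print(pth.read_text())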