I'm moving on from single scripts to a bigger Python application.
It's an application with multiple packages:
package1 -> package1/*.py files
package2 -> package2/*.py files
As package1 should be usable stand-alone, I keep it in a separate git repo.
In package2, I'd love to just do: import package1
It feels like the easiest way would be to have package1 (in its own git repo) in a subdirectory of package2, but that doesn't sound like a nice solution.
Some answers I found feel dated, and I couldn't get them to work (python setup.py install).
Adding the package1 location to PYTHONPATH is a solution, but it's not very nice if I want to distribute the code to co-workers. Ideally, I'd "install" the package as easily as possible.
I read that pip would be preferred, but I'd need some directions on where to start looking for creating a package. Also, distribution would be local only.
(Python 3.6. Code will be used on Linux and Windows.)
Excerpt from an excellent (but kind of hidden) answer using pip, given by np8 in the question
"Importing modules from parent folder":
check out his answer!
--
1) Add a setup.py to the root folder
The contents of the setup.py can be simply
from setuptools import setup, find_packages
setup(name='myproject', version='1.0', packages=find_packages())
Basically "any" setup.py would work. This is just a minimal working example.
2) Use a virtual environment
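For example (a minimal sketch; the environment name venv is arbitrary):
python3 -m venv venv          # create the environment
source venv/bin/activate      # activate it (Linux); on Windows: venv\Scripts\activate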
3) pip install your project in editable state
Install your top-level package myproject using pip. The trick is to use the -e flag when doing the install. This way it is installed in an editable state, and all edits made to the .py files will automatically be picked up by the installed package.
In the root directory, run
pip install -e . (note the dot, it stands for "current directory")
You can also see that it is installed by using pip freeze
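For example, pip freeze will then list the project with its location (illustrative output; the exact format varies by pip version):
-e /path/to/myproject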
Related
I would like to use the following SDK in my Python project -> https://github.com/LBank-exchange/lbank-api-sdk-v2. It has SDKs for three languages (I just want the Python one). I tried to install it using the command:
pip install git+https://github.com/LBank-exchange/lbank-api-sdk-v2.git#egg=lbank
which gave the error
does not appear to be a Python project: neither 'setup.py' nor 'pyproject.toml' found.
Looks like the developer didn't bother to package it properly. If it were me using it, I would fork it on GitHub, add the setup.py, and use the fork. Maybe a good exercise for you?
Meanwhile, to just get it to work, in your project "root":
git clone https://github.com/LBank-exchange/lbank-api-sdk-v2.git
ln -s lbank-api-sdk-v2/python-sdk-api/LBank ./LBank
Then in your code just import LBank. This leaves the cloned repo untouched (so you can git pull to update it later) and simply links the module directory into the root. Alternatively, you can add the API directory to sys.path for the imports to work, as sketched below.
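A minimal sketch of the sys.path variant (the relative path comes from the repo layout used above):
import sys
# make the directory that contains the LBank package importable
sys.path.insert(0, "lbank-api-sdk-v2/python-sdk-api")
import LBank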
I think there is nothing to install; if you want to be able to import it and use it like the packages you install through pip install, you can just add the folder to your sys.path:
import sys
sys.path.append("/path/to/the/sdk")  # directory that contains the package
Short version:
Is it possible to use the -e parameter in requirements.txt with a path where the editable package should be installed?
First approach
requirements.txt:
-e git+https://github.com/snake-soft/imap-storage.git#egg=imap-storage
Pro: Automated install
Contra: Editable directory is inside virtualenv src folder (not in workspace)
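(With this approach the checkout lands under the virtualenv's src directory; illustrative:)
ls $VIRTUAL_ENV/src
# imap-storage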
Second approach (Edit: Don't use this unless you know what you're doing; see my conclusion at the bottom)
If I clone the repo and install it like this (virtualenv activated):
cd /home/user/workspace
git clone https://github.com/snake-soft/imap-storage.git
cd imap-storage
pip install -e .
it gives the structure I want:
workspace
├── imap-storage
└── django-project   # uses the imap-storage module
I have what I want: the repository (imap-storage) lies parallel to the django-project that uses it, and it is importable because it is installed inside the virtualenv.
Pro: Editable directory is inside my workspace
Contra: Not automated, not intuitive
Goal
pip install -r requirements.txt installs the module from git (like the first approach)
The module is on the virtualenv's Python path -> importable
The editable working directory of the module is in my workspace (like the second approach)
PS: Or am I thinking about this completely wrong and should I go for something completely different?
Why did I ask such a crazy question?
I thought I could make my life a little bit easier if both (the package and the Django project that uses it) were lying editable inside my workspace, because I work on them in parallel.
My conclusion
I tried the second approach for a bit and, in the end, decided to prefer the first approach.
Reason
With both methods, PyDev won't show it as an installed package.
When you mix both methods like this:
install the package via requirements.txt (with the -e switch)
uninstall it
clone it into e.g. ~/workspace/
install it with pip install -e . inside the package
then you will end up in a bad situation:
the 'virtualenv/src/' directory won't be deleted and is still recognized as the source of the package inside PyDev,
while the Django instance that uses the package runs the package code from '~/workspace/'.
Suggestion
Use the first approach, import that source directory ('virtualenv/src/') as a project in PyDev, and make a link to it in the file manager of your choice.
It will save you from a complicated mistake.
When would the -e, or --editable option be useful with pip install?
For some projects the last line in requirements.txt is -e .. What does it do exactly?
As the man page says:
-e,--editable <path/url>
Install a project in editable mode (i.e. setuptools "develop mode") from a local project path or a VCS url.
So you would use this when trying to install a package locally, most often in the case when you are developing it on your system. It will just link the package to the original location, basically meaning any changes to the original package would reflect directly in your environment.
An example run can be:
pip install -e .
or
pip install -e ~/ultimate-utils/ultimate-utils-proj-src/
note that the second is the full path to the directory where the setup.py lives.
Concrete example of using --editable in development
If you play with this test package as in:
cd ~
git clone https://github.com/cirosantilli/vcdvcd
cd vcdvcd
git checkout 5dd4205c37ed0244ecaf443d8106fadb2f9cfbb8
python -m pip install --editable . --user
it outputs:
Obtaining file:///home/ciro/bak/git/vcdvcd
Installing collected packages: vcdvcd
Attempting uninstall: vcdvcd
Found existing installation: vcdvcd 1.0.6
Can't uninstall 'vcdvcd'. No files were found to uninstall.
Running setup.py develop for vcdvcd
Successfully installed vcdvcd-1.0.6
The Can't uninstall 'vcdvcd' is normal: pip tried to uninstall any existing vcdvcd so it could be replaced with the "symlink-like mechanism" produced in the following steps, but it failed because there was no previous installation.
Then it generates a file:
~/.local/lib/python3.8/site-packages/vcdvcd.egg-link
which contains:
/home/ciro/vcdvcd
.
and acts as a "symlink" for the Python interpreter.
So now, if I make any changes to the git source code under /home/ciro/vcdvcd, it reflects automatically on importers who can from any directory do:
python -c 'import vcdvcd'
Note, however, that with my pip version at least, binary files installed with --editable, such as the vcdcat script provided by that package via scripts= in setup.py, do not get symlinked, just copied to:
~/.local/bin/vcdcat
just like for regular installs, and therefore updates to the git repository won't directly affect them.
By comparison, a regular non --editable install from the git source:
python -m pip uninstall vcdvcd
python -m pip install --user .
produces a copy of the installed files under:
~/.local/lib/python3.8/site-packages/vcdvcd
Uninstalling an editable package as done above requires a new enough pip, as mentioned at: How to uninstall editable packages with pip (installed with -e).
Tested in Python 3.8, pip 20.0.2, Ubuntu 20.04.
Recommendation: develop directly in-tree whenever possible
The editable setup is useful when you are testing your patch to a package through another project.
If, however, you can fully test your change in-tree, just do that instead of generating an editable install, which is more complex.
E.g., the vcdvcd package above is set up in a way that you can just cd into the source and run ./vcdcat without pip installing the package itself (in general, you might need to install dependencies from requirements.txt though). The import vcdvcd that that executable does (or possibly your own custom test) then finds the package correctly, because it lives in the same directory.
From Working in "development" mode:
Although not required, it’s common to locally install your project in
“editable” or “develop” mode while you’re working on it. This allows
your project to be both installed and editable in project form.
Assuming you’re in the root of your project directory, then run:
pip install -e .
Although somewhat cryptic, -e is short for
--editable, and . refers to the current working directory, so together, it means to install the current directory (i.e. your
project) in editable mode.
Some additional insights into the internals of setuptools and distutils from “Development Mode”:
Under normal circumstances, the distutils assume that you are going to
build a distribution of your project, not use it in its “raw” or
“unbuilt” form. If you were to use the distutils that way, you would
have to rebuild and reinstall your project every time you made a
change to it during development.
Another problem that sometimes comes up with the distutils is that you
may need to do development on two related projects at the same time.
You may need to put both projects’ packages in the same directory to
run them, but need to keep them separate for revision control
purposes. How can you do this?
Setuptools allows you to deploy your projects for use in a common
directory or staging area, but without copying any files. Thus, you
can edit each project’s code in its checkout directory, and only need
to run build commands when you change a project’s C extensions or
similarly compiled files. You can even deploy a project into another
project’s checkout directory, if that’s your preferred way of working
(as opposed to using a common independent staging area or the
site-packages directory).
To do this, use the setup.py develop command. It works very similarly
to setup.py install, except that it doesn’t actually install anything.
Instead, it creates a special .egg-link file in the deployment
directory, that links to your project’s source code. And, if your
deployment directory is Python’s site-packages directory, it will also
update the easy-install.pth file to include your project’s source
code, thereby making it available on sys.path for all programs using
that Python installation.
It is important to note that pip uninstall cannot uninstall a module that has been installed with pip install -e. So if you go down this route, be prepared for things to get very messy if you ever need to uninstall.
A partial solution is to (1) reinstall, keeping a record of the files created, as in sudo python3 setup.py install --record installed_files.txt, and then (2) manually delete all the files listed, as in e.g. sudo rm -r /usr/local/lib/python3.7/dist-packages/tdc7201-0.1a2-py3.7.egg/ (for release 0.1a2 of module tdc7201). Even this does not clean everything up 100%, however; after you've done it, importing the (removed!) local library may still succeed, and attempting to install the same version from a remote server may fail to do anything (because pip thinks your (deleted!) local version is already up to date).
As suggested in previous answers, no symlinks are created.
How does the -e option work? It just adds the project path given to pip install -e to the file PYTHONDIR/site-packages/easy-install.pth.
So each time Python searches for a package, it checks this directory as well => any changes to the files in that directory are instantly reflected.
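You can inspect this yourself; a sketch (the path is illustrative and depends on your setup):
cat venv/lib/python3.8/site-packages/easy-install.pth
# /home/user/workspace/myproject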
I've needed to deal with this for some time, but I never really figured out what the most pythonic way of importing/setting up PYTHONPATH for custom modules is. I know I can use virtualenv to manage it, I know I can set it inside of scripts or through .pth files, but none of these seem very clean or pythonic to me, so I'm guessing I'm missing something.
Almost always, all the custom modules I'm interested in are contained in the git directory I've cloned, the same one that holds whatever script I'm running, if that simplifies things.
I'm guessing virtualenv is the answer, but figured I'd ask in case I'm missing anything.
EDIT: To clarify, this is only a question about custom modules. I'm already using pip for modules from PyPI.
You can use pip to install packages that are not on PyPI, too. You just need a URI endpoint and a valid Python package:
Examples:
$ pip install https://github.com/pypa/pip/archive/develop.zip#egg=pip
$ pip install git+https://github.com/pypa/pip.git#egg=pip
$ pip install git+git://github.com/pypa/pip.git#egg=pip
$ pip install /path/to/pip.tar.gz
$ pip install .
Read more at https://pip-installer.org/en/latest/usage.html#pip-install
virtualenv is a good start.
There are also package managers like pip and easy_install that manage third party modules.
In code you can use:
import sys
sys.path.append('/path/to/customModule')
Virtualenv is the way to go with this.
pip install virtualenv
Then make a folder in which to set up your environments. Inside that folder:
virtualenv <new_env_name>
That'll create a new folder in that directory; inside it there's a bin folder. Source the activate script in that bin folder (source <new_env_name>/bin/activate). You can then run pip install and it will only install into that environment.
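Putting it together (a sketch; the folder and environment names are just examples):
pip install virtualenv
mkdir ~/envs && cd ~/envs
virtualenv myenv
source myenv/bin/activate
pip install requests    # installed into myenv only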
If you're cloning a git repo whose code you also want to be able to peruse easily (e.g. if you're also working on that repo), clone it into your work dir and then symlink or alias the package folder into the site-packages directory inside that virtualenv's lib directory. Otherwise, if it's packaged correctly, running python setup.py install should install it right into that virtualenv.
Is it possible to specify (editable) source dependencies in setup.py that are known to reside on the local file system?
Consider the following directory structure, all of which lives in a single VCS repository:
projects/
├── utils/
│   ├── setup.py
│   └── ...
├── app1/
│   ├── setup.py
│   └── ...    # app1 files depend on ../utils
└── app2/
    ├── setup.py
    └── ...    # app2 files depend on ../utils
Given the following commands:
cd projects
mkvirtualenv app1
pip install -e app1
I'd like to have all the dependencies for app1 installed, including "utils", which is an "editable" dependency. Likewise, if I did the same for app2.
I've tried playing with all different combinations of file://... URLs in install_requires and dependency_links to no avail. I'd like to use a dependency link URL like src+file://../utils, which would tell setuptools that the source for the package is on the file system at this relative path. Is there a way to do this?
I managed to provide a relative local dependency in setup.py with:
import os

from setuptools import setup

setup(
    install_requires=[
        'utils @ file://localhost/%s/../utils/' % os.getcwd().replace('\\', '/'),
    ],
)
but maybe someone knows a better solution.
I had an identical problem where I needed to depend on modules in a sibling folder. I was able to find a solution after stumbling upon https://caremad.io/2013/07/setup-vs-requirement/
I ended up using requirements.txt to refer specifically to the file I wanted, and then installing everything with
pip install -r requirements.txt
requirements.txt
-e ../utils
-e .
And setup.py has all my other dependencies, including utils. When pip tries to install app1 itself, it realizes that the utils dependency has already been satisfied and so passes over it, while installing the other requirements. A sketch of such a setup.py is below.
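For illustration, app1's setup.py could look roughly like this (the name, version, and package discovery are placeholders):
from setuptools import setup, find_packages

setup(
    name='app1',
    version='0.1',
    packages=find_packages(),
    install_requires=[
        'utils',  # already satisfied by the editable ../utils install
    ],
)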
When I want to work with a set of interrelated projects, I install all of them using python setup.py develop.
If I mistakenly pip-installed a module, or later want to make it editable, I clone the source and do a python setup.py develop on it too, substituting the existing one.
Just to be sure, I delete the reference in the virtualenv's site-packages as well as the package itself.
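In commands, that substitution might look like this (a sketch; the package name and URL are assumptions):
git clone https://example.com/somepackage.git
cd somepackage
python setup.py develop    # editable install replacing the pip-installed copy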