Wheel not built when pip installing from source - causes ModuleNotFoundError - python

I have written a Python package, mypackage, hosted on GitHub, that I now want to install as a dependency of another project (specifically from a branch named branch).
I can build it locally using setuptools and a pyproject.toml file by running python3 -m pip install . at the top level of mypackage/, and can then successfully import it in Python.
I pushed this to github and now try to install it using
python3 -m pip install "git+https://github.com/mygituser/mypackage.git#branch"
This runs without warning, and if I then run python3 -m pip list I can see mypackage listed there. However, if I enter python and run import mypackage I get the error ModuleNotFoundError: No module named 'mypackage'.
Comparing the verbose outputs I can see that for the local install after installing the dependencies I get
running bdist_wheel
running build
running build_py
running egg_info
writing mypackage/mypackage.egg-info/PKG-INFO
writing dependency_links to mypackage/mypackage.egg-info/dependency_links.txt
...
...
etc., all of which is absent from the GitHub install (that process just ends after the dependencies are installed).
Am I missing a setting in pyproject.toml, or somewhere else, that tells pip to build a wheel so that I can import mypackage in Python?
I have tried appending #egg=mypackage to the URL and adding wheel to the build dependencies, but neither has worked.
TIA

In addition to the changes you made to your pyproject.toml file here: https://github.com/jatkinson1000/archeryutils/commit/f5fb491aa37961a65796ce4be3616d8be23548ed
you also need to include an __init__.py file in your round_data_files folder so that it gets included in the package install.
https://github.com/jatkinson1000/archeryutils/pull/9
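For illustration, roughly the layout this leaves you with (the round_data_files name comes from the linked repository; the other file names are assumed):
mypackage/
    pyproject.toml
    mypackage/
        __init__.py
        module.py
        round_data_files/
            __init__.py      # added so setuptools package discovery picks up this folder
            some_round.json  # assumed data file
Without an __init__.py, automatic package discovery does not treat the folder as a (sub)package, so its contents are silently left out of the wheel that pip builds from the git checkout.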

Related

setup.py does not find package installed via conda

I'm developing my own conda package and use a setup.py in the process of generating it. When developing, it can be useful to conda install --no-deps package and then run ./setup.py develop in the repo. The install works, but setup.py develop errors out because it cannot satisfy a dependency: there is another package with the same name, but a version that is too low, on PyPI. However, the correct version of this dependency is already installed via conda. Why does it try to install the package before checking whether it is already installed? How can I make setup.py realize that nothing needs to be installed?
How is the package built?
I have this bld.bat:
"%PYTHON%" setup.py install --single-version-externally-managed --record=record.txt
if errorlevel 1 exit 1
and the dependencies are listed in the meta.yaml. It falls to setup.py to define the package version (read from a .py file), the entry points, and the included non-Python files. It gets the dependencies by parsing the meta.yaml.
What is the error message?
This is the output of setup.py develop:
$ python setup.py develop
running develop
C:\ProgramData\Miniconda3\envs\my_env\lib\site-packages\setuptools\command\easy_install.py:144: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
C:\ProgramData\Miniconda3\envs\my_env\lib\site-packages\setuptools\command\install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
warnings.warn(
running egg_info
writing my_package.egg-info\PKG-INFO
writing dependency_links to my_package.egg-info\dependency_links.txt
writing entry points to my_package.egg-info\entry_points.txt
writing requirements to my_package.egg-info\requires.txt
writing top-level names to my_package.egg-info\top_level.txt
reading manifest file 'my_package.egg-info\SOURCES.txt'
writing manifest file 'my_package.egg-info\SOURCES.txt'
running build_ext
Creating c:\programdata\miniconda3\envs\my_env\lib\site-packages\my_package.egg-link (link to .)
my_package 6.1.0 is already the active version in easy-install.pth
Installing entry_point0-script.py script to C:\ProgramData\Miniconda3\envs\my_env\Scripts
Installing entry_point0.exe script to C:\ProgramData\Miniconda3\envs\my_env\Scripts
Installing entry_point1-script.py script to C:\ProgramData\Miniconda3\envs\my_env\Scripts
Installing entry_point1.exe script to C:\ProgramData\Miniconda3\envs\my_env\Scripts
Installed c:\users\jpoppinga\my_git_repository
Processing dependencies for my_package==6.1.0
Searching for my_dep<13.1,>=13.0.1
Reading https://pypi.org/simple/my_dep/
C:\ProgramData\Miniconda3\envs\my_env\lib\site-packages\pkg_resources\__init__.py:123: PkgResourcesDeprecationWarning: is an invalid version and will not be supported in a future release
warnings.warn(
No local packages or working download links found for my_dep<13.1,>=13.0.1
error: Could not find suitable distribution for Requirement.parse('my_dep<13.1,>=13.0.1')
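One possible workaround, sketched here rather than taken from the thread: let pip perform the editable install with dependency resolution disabled, assuming the conda-installed my_dep is already importable:
$ pip install --no-deps -e .
Unlike setup.py develop, which asks PyPI for anything it cannot satisfy locally, pip with --no-deps does not try to resolve dependencies at all, so the conda-provided my_dep is simply used at run time.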

installing and building multiple packages from source with pip

I want to create a Python package. To make this work, the user needs to install multiple Python packages. I also need the user to install a package that currently only supports installation straight from the source (i.e. pip install -e .). How do I create my own source package that depends on another source package, in a clean way? Ideally, the user could just run python setup.py once, which would install my package, all the requirements in requirements.txt, and the other package straight from source.
I added a setup.py file with the following content:
import setuptools

setuptools.setup(
    dependency_links=["git+https://github.com/facebookresearch/pytorch-dp.git#egg=pytorch-dp"],
    packages=setuptools.find_packages(),
    python_requires=">=3.6",
)
When I run the setup file I get:
Moving pytorch_dp-0.1-py3.6.egg to /usr/local/lib/python3.6/dist-packages
Adding pytorch-dp 0.1 to easy-install.pth file
but then if I try to import the package torchdp:
import torchdp
I get the error:
ModuleNotFoundError: No module named 'torchdp'
I'm using a Google Colab Notebook for GPU support.
It turned out I had to restart Google Colab, and then it worked. Yikes.
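As a side note not covered by that answer: dependency_links is deprecated and ignored by recent pip versions. A hedged sketch of the current way to depend on a package that only lives in a git repository is a PEP 508 direct reference in install_requires (the URL is the one from the question; the name and version are assumed, since the question does not give them):
import setuptools

setuptools.setup(
    name="mypkg",      # assumed project name
    version="0.1",     # assumed version
    packages=setuptools.find_packages(),
    python_requires=">=3.6",
    install_requires=[
        # direct URL requirement replacing dependency_links
        "pytorch-dp @ git+https://github.com/facebookresearch/pytorch-dp.git",
    ],
)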

Setuptools: Include subdirectories in package_data

I believe this question was asked before but I'm still a little stuck. I'm trying to install a Python package that has some data files with subdirectories. Here's my setup:
setup.py
src/
    mypkg/
        __init__.py
        module.py
        data/
            tables.dat
            spoons.dat
            sub/
                forks.dat
Following the docs I tried to add:
setup(...,
    packages=['mypkg'],
    package_dir={'mypkg': 'src/mypkg'},
    package_data={'mypkg': ['data/*.dat', 'data/sub/*.dat']},
)
I install the module with python setup.py install (though eventually I'll use python setup.py sdist upload to upload the package to PyPI so others can pip install the module).
After running python setup.py install, I find the installed module's location by importing mypkg and printing mypkg.__file__. In that installed package directory, however, I can see data but not data/sub. Does anyone know what I'm missing? Any help is greatly appreciated!
Ah, it turns out the above works fine!
To install the module to my site-packages/mypkg location, I just had to use: python setup.py sdist and then pip install dist/mypkg-0.0.1.tar.gz.
Then my data files were in site-packages/mypkg.
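An alternative sketch, not from the answers in this thread, that avoids listing every subdirectory glob: enable include_package_data and let a MANIFEST.in pull in the whole data tree (paths assume the src layout from the question):
# in setup.py
setup(...,
    packages=['mypkg'],
    package_dir={'mypkg': 'src/mypkg'},
    include_package_data=True,
)
plus a MANIFEST.in containing a single line:
graft src/mypkg/data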
I had the same issue. In my case the problem was that the package was already installed, and running
pip install .
locally didn't reinstall it, so the data packages weren't included.
Uninstalling before installing was the key for me.
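In command form that amounts to something like (package name assumed):
$ pip uninstall mypkg
$ pip install .
or, as a single step, a forced reinstall:
$ pip install --force-reinstall .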

Copy configuration file on installation

I am trying to package my Python project, which comes with a configuration dotfile that I want copied into the user's home directory on installation. The quick guide to packaging says that this can be done using the data_files argument to setuptools.setup. So this is what I have:
data_files = [(os.path.expanduser("~"), [".my_config"])]
This appears to work fine if I use python setup.py install, but when I upload my package to PyPI and run pip install the dotfile isn't copied.
FWIW, I've put the dotfile in the MANIFEST.in and also tried including the package_data argument to setup. Neither step appears to make a difference. If I pip install and poke around the site-packages directory, only the source files are there.
How can I achieve what I'm looking for?
This is an issue I once experienced myself. Its root cause is that when you build a wheel file, all the absolute paths specified in data_files are relativized to the target site-packages directory; see this issue on GitHub. This affects installations performed by pip install, because pip builds a wheel out of any source package (.tar.gz, .tar.bz2 or .zip) and installs the resulting wheel:
$ pip install spam-0.1.tar.gz
Processing ./spam-0.1.tar.gz
Building wheels for collected packages: spam
Running setup.py bdist_wheel for spam ... done
Stored in directory: /Users/hoefling/Library/Caches/pip/wheels/d0/95/be/bc79f1d589d90d67139481a3e706bcc54578fdbf891aef75c0
Successfully built spam
Installing collected packages: spam
Successfully installed spam-0.1
Checking installed files yields:
$ pip show -f spam
Name: spam
Version: 0.1
...
Location: /Users/hoefling/.virtualenvs/stackoverflow/lib/python3.6/site-packages
Requires:
Files:
Users/hoefling/.my_config
spam-0.1.dist-info/DESCRIPTION.rst
spam-0.1.dist-info/INSTALLER
spam-0.1.dist-info/METADATA
spam-0.1.dist-info/RECORD
spam-0.1.dist-info/WHEEL
spam-0.1.dist-info/metadata.json
spam-0.1.dist-info/top_level.txt
Note that the path which was meant to be absolute has become relative to the Location dir. In this example, .my_config would be placed under /Users/hoefling/.virtualenvs/stackoverflow/lib/python3.6/site-packages/Users/hoefling/.my_config.
It gets even better: built wheels are cached on disk, so the next time you install the package, if the built wheel still exists in pip's cache, it will be used for the installation and you won't even see any mention of building a wheel in the terminal log.
There is no real solution to avoid this. The most decent workaround I found is to prohibit "binary" packages when installing, which forces the package's setup.py to be executed on installation:
$ pip install spam-0.1.tar.gz --no-binary=spam
Processing ./spam-0.1.tar.gz
Skipping bdist_wheel for spam, due to binaries being disabled for it.
Installing collected packages: spam
Running setup.py install for spam ... done
Successfully installed spam-0.1
The file is now placed correctly:
$ pip show -f spam
Name: spam
Version: 0.1
...
Location: /Users/hoefling/.virtualenvs/stackoverflow/lib/python3.6/site-packages
Requires:
Files:
../../../../../.my_config
spam-0.1-py3.6.egg-info/PKG-INFO
spam-0.1-py3.6.egg-info/SOURCES.txt
spam-0.1-py3.6.egg-info/dependency_links.txt
spam-0.1-py3.6.egg-info/top_level.txt
Unfortunately, the user must be informed separately about calling pip install with the extra flag (via a readme, a webpage, an FAQ or similar), as there is no way to prohibit building the wheel in the package metadata.
As a result, I no longer include files with absolute paths. Instead, I install them alongside the Python sources in the site-packages dir. In the Python code, I add logic for the existence check and copy the file if necessary:
import os
import shutil

# program entrypoint
if __name__ == '__main__':
    config = os.path.join(os.path.expanduser('~'), '.my_config')
    if not os.path.exists(config):
        # copy the default config shipped next to this module into the home dir
        shutil.copyfile(os.path.join(os.path.dirname(__file__), '.my_config'), config)
    main.run()
Besides what @hoefling said, I suggest not using data_files at all, because it is really unpredictable where the files will be copied to. You can test this by giving the directory something like '', '/', or '/anything/you/want'.
I suggest using package_data instead, which simply copies the files under the installed package root. You can then copy them anywhere you want at run time.
For more on package_data, refer to the Python docs: https://docs.python.org/2/distutils/setupscript.html#installing-package-data
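A minimal sketch of that approach, assuming the project still uses a setup.py and that the dotfile is shipped inside the package directory (the project name is illustrative):
import setuptools

setuptools.setup(
    name="myproject",                            # assumed name
    version="0.1",
    packages=setuptools.find_packages(),
    package_data={"myproject": [".my_config"]},  # ship the dotfile next to the modules
)
At run time the installed copy can then be located relative to the package, e.g. via os.path.dirname(myproject.__file__), and copied to the home directory as in the previous answer.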

Perplexing Python Package Predicament

First of all, I am running Python 2.7.5 on a Mac. I am trying to install a package and can't get it to work despite trying several approaches. The package is called python-fs-stack and is available here: https://pypi.python.org/pypi/python-fs-stack. I tried pip install python-fs-stack, sudo pip install python-fs-stack, easy_install python-fs-stack, and sudo easy_install python-fs-stack, and nothing worked. I then downloaded the package and got an error about a README.rst not being found. I commented out that line in the setup.py file and re-ran it. It got a lot farther, but then there was another error. Here is the output:
>>sudo python /Downloads/python-fs-stack-0.2/setup.py install
running install
running bdist_egg
running egg_info
writing python_fs_stack.egg-info/PKG-INFO
writing top-level names to python_fs_stack.egg-info/top_level.txt
writing dependency_links to python_fs_stack.egg-info/dependency_links.txt
warning: manifest_maker: standard file 'setup.py' not found
error: package directory 'familysearch' does not exist
I would really like to be able to get this package up and running, but I am at a loss. What should I try?
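One hedged observation based on the error messages: setup.py normally has to be run from inside the unpacked source directory, because relative paths such as the familysearch package directory (and the README.rst it reads) are resolved against the current working directory. Assuming the archive was unpacked to /Downloads/python-fs-stack-0.2, something along these lines avoids both the 'setup.py not found' warning and the 'package directory does not exist' error:
$ cd /Downloads/python-fs-stack-0.2
$ sudo python setup.py install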
