I am writing a custom yocto recipe that should install a python package from a .whl file.
I tried it using a recipe that contains:
inherit pypi setuptools
PYPI_SRC_URI = "http://ci.tensorflow.org/view/Nightly/job/nightly-pi-zero/lastSuccessfulBuild/artifact/output-artifacts/tensorflow-1.5.0rc1-cp27-none-any.whl"
But it does not work that way: it states that a setup.py file is missing, and when I try to write a custom do_compile task that runs pip install <PATH-TO-WHL>, it says that pip is an unknown command.
When installing .whl files directly on the target system, one would type the following:
pip install <path-to-whl-file>
Thanks for your help!
A .whl package is just a .zip archive containing the Python sources and, possibly, precompiled binaries for a specific platform.
So you can do something like this:
COMPATIBLE_HOST = "i686.*-mingw.*"
SRC_URI = "https://files.pythonhosted.org/packages/d8/9d/7a8cad803ef73f47134ae5c3804e20b54149ce62a7d1337204f3cf2d1fa1/MarkupSafe-1.1.1-cp35-cp35m-win32.whl;downloadfilename=MarkupSafe-1.1.1-cp35-cp35m-win32.zip;subdir=${BP}"
SRC_URI[md5sum] = "a948c70a1241389d7120db90d69079ca"
SRC_URI[sha256sum] = "6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1"
inherit nativesdk python3-dir
LICENSE = "BSD-3-Clause"
PV = "1.1.1"
PN = "nativesdk-python3-markupsafe"
LIC_FILES_CHKSUM = "file:///${S}/MarkupSafe-1.1.1.dist-info/LICENSE.rst;md5=ffeffa59c90c9c4a033c7574f8f3fb75"
do_unpack[depends] += "unzip-native:do_populate_sysroot"
PROVIDES += "nativesdk-python3-markupsafe"
DEPENDS += "nativesdk-python3"
FILES_${PN} += "\
${libdir}/${PYTHON_DIR}/site-packages/* \
"
do_install() {
    install -d ${D}${libdir}/${PYTHON_DIR}/site-packages/MarkupSafe-1.1.1.dist-info
    install -d ${D}${libdir}/${PYTHON_DIR}/site-packages/markupsafe
    install -m 644 ${S}/markupsafe/* ${D}${libdir}/${PYTHON_DIR}/site-packages/markupsafe/
    install -m 644 ${S}/MarkupSafe-1.1.1.dist-info/* ${D}${libdir}/${PYTHON_DIR}/site-packages/MarkupSafe-1.1.1.dist-info/
}
I haven't tested it yet, but it already forms a proper nativesdk package.
Note the downloadfilename= parameter in SRC_URI: without it, the .whl file would not be extracted.
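Since a wheel is just a zip, you can also inspect it up front to see exactly which directories do_install() needs to copy. A minimal sketch, not part of the recipe, run against a locally downloaded copy of the wheel:
import zipfile

# List the wheel's contents; each top-level directory it contains
# (here markupsafe/ and MarkupSafe-1.1.1.dist-info/) must be copied by do_install().
for name in zipfile.ZipFile("MarkupSafe-1.1.1-cp35-cp35m-win32.whl").namelist():
    print(name)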
Based on the answer at https://stackoverflow.com/a/57694762/5422708, I would like to share a working recipe for the engineering_notation 0.6.0 module, which also ships only a .whl package:
SUMMARY = "To easily work with human-readable engineering notation."
HOMEPAGE = "https://github.com/slightlynybbled/engineering_notation"
SRC_URI = "https://files.pythonhosted.org/packages/d4/c4/4712b8020b8a3ada129581be891e2dbbd6a4cf54195ea2b80f89bbc51756/engineering_notation-${PV}-py3-none-any.whl;downloadfilename=engineering_notation-${PV}-py3-none-any.zip;subdir=${BP}"
SRC_URI[md5sum] = "5684efec41bc0738bb1fe625a71ffaf7"
SRC_URI[sha256sum] = "1ea1e450d575b4804723d0711b0609d2711dffac2f4b5548ee632c16a636d9f6"
inherit python3-dir
LICENSE = "MIT"
LIC_FILES_CHKSUM = "file:///${S}/engineering_notation-${PV}.dist-info/METADATA;md5=5a9ae92d9fbf02bbcd5e4e94e6356cd3"
do_unpack[depends] += "unzip-native:do_populate_sysroot"
DEPENDS += "python3"
FILES:${PN} += "\
${libdir}/${PYTHON_DIR}/site-packages/engineering_notation \
${libdir}/${PYTHON_DIR}/site-packages/engineering_notation-${PV}.dist-info \
"
do_install() {
    install -d ${D}${libdir}/${PYTHON_DIR}/site-packages/engineering_notation
    install -d ${D}${libdir}/${PYTHON_DIR}/site-packages/engineering_notation-${PV}.dist-info
    install -m 644 ${S}/engineering_notation/* ${D}${libdir}/${PYTHON_DIR}/site-packages/engineering_notation/
    install -m 644 ${S}/engineering_notation-${PV}.dist-info/* ${D}${libdir}/${PYTHON_DIR}/site-packages/engineering_notation-${PV}.dist-info/
}
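One thing worth noting: the recipe above never sets PV, so for ${PV} to expand to 0.6.0 the version has to come from the recipe's file name, which BitBake parses as <name>_<version>.bb. A hypothetical file name that would achieve this (not given in the original answer):
python3-engineering-notation_0.6.0.bb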
Short version:
How can I poetry install a package when one of its dependencies is a local tarball/zip file? It doesn't seem to work, even though it is shown in the poetry docs.
I can poetry install the package when the dependency is pulled from GitLab, but the install fails when I manually download the dependency from GitLab as a tarball and try to poetry install with the dependency pointing at the tarball.
Long version:
I am trying to use poetry to install two packages that I have developed:
a base package called my_package
an extension called extension_of_my_package.
Both packages are in private repos on GitLab, and both have a pyproject.toml containing their dependency list. I can successfully poetry install the extended package (extension_of_my_package) when the base package my_package is downloaded from GitLab, i.e. the pyproject.toml file in extension_of_my_package has a tool.poetry.source section that gives the location of the my_package private repo on GitLab.
However, external users cannot access my private repo, so I need to ensure the packages can be installed from tarballs (that I download from GitLab and give to the client).
To install extension_of_my_package I do this:
tar xzf extension_of_my_package.tgz
cd extension_of_my_package/python
and then edit the pyproject.toml, changing the dependency on my_package to point to
the local tarball:
my_package = { path = "/path/to/my_package.tgz"}
and then run poetry install. This fails with the error message:
> poetry install
Updating dependencies
Resolving dependencies... (9.3s)
TypeError
expected string or bytes-like object
at /home/user/.poetry/lib/poetry/_vendor/py3.8/poetry/core/utils/helpers.py:27 in canonicalize_name
23│ _canonicalize_regex = re.compile(r"[-_]+")
24│
25│
26│ def canonicalize_name(name): # type: (str) -> str
→ 27│ return _canonicalize_regex.sub("-", name).lower()
28│
29│
30│ def module_name(name): # type: (str) -> str
31│ return canonicalize_name(name).replace(".", "_").replace("-", "_")
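The traceback suggests that poetry could not extract a package name from the tarball's metadata and ended up passing None into canonicalize_name; a minimal sketch reproducing that failure mode (this is my assumption about the cause, not confirmed):
import re

_canonicalize_regex = re.compile(r"[-_]+")

# If no package name can be read from the archive, name ends up as None, and
# re.sub() raises: TypeError: expected string or bytes-like object
name = None
_canonicalize_regex.sub("-", name).lower()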
According to the poetry docs it is possible to install from a local file:
[tool.poetry.dependencies]
# directory
my-package = { path = "../my-package/", develop = false }
# file
my-package = { path = "../my-package/dist/my-package-0.1.0.tar.gz" }
I also tried using my-package = { file = ... } instead of my-package = { path = ... }, but that didn't work either.
I tried adding a minimal setup.py file to my_package (see this post), but that didn't help.
I tried converting my_package (in tarball format) to a wheel. I can successfully poetry install when my package is in wheel format, but my_package's dependencies are not installed, and I can't see how to include the dependency info in the wheel. When I created the wheel I tried specifying the dependency info in two ways:
in setup.cfg:
[metadata]
name = my_package
version = 0.1.0
description = My Package
license = Proprietary
[options]
packages = find:
install_requires =
    matplotlib >=3.2.0
and
in setup.py"
from setuptools import setup

setup(
    name="my_package",
    version="0.1.0",
    packages=["my_package"],
    install_requires=["matplotlib >= 3.2.0"],
)
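One way to check whether the dependency declaration actually made it into the built wheel is to look for Requires-Dist lines in the wheel's METADATA file, which is where pip and poetry read dependencies from. A small sketch, assuming a hypothetical wheel filename:
import zipfile

# A wheel's dependencies live in <name>-<version>.dist-info/METADATA
# as "Requires-Dist: ..." lines.
with zipfile.ZipFile("dist/my_package-0.1.0-py3-none-any.whl") as whl:
    metadata = next(n for n in whl.namelist() if n.endswith(".dist-info/METADATA"))
    for line in whl.read(metadata).decode().splitlines():
        if line.startswith("Requires-Dist:"):
            print(line)  # e.g. "Requires-Dist: matplotlib (>=3.2.0)"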
To rule out any problem with my own package, I created a minimal test and tried to poetry install a publicly available package (tqdm) from its zip file (downloaded from GitHub). It also fails. The pyproject.toml for this minimal test is:
[tool.poetry]
name = "tester"
version = "0.0.1"
description = "test package"
authors = [ "me" ]
packages = [
    { include = "tester" }
]
[tool.poetry.dependencies]
python = ">=3.7,<3.9"
tqdm = {file = "/home/user/tqdm-master.zip"}
and the error message is:
> poetry install
Updating dependencies
Resolving dependencies... (13.0s)
RuntimeError
Unable to determine package info from path: /home/user/tqdm-master.zip
at /home/user/.poetry/lib/poetry/puzzle/provider.py:251 in get_package_from_file
247│ package = PackageInfo.from_path(path=file_path).to_package(
248│ root_dir=file_path
249│ )
250│ except PackageInfoError:
→ 251│ raise RuntimeError(
252│ "Unable to determine package info from path: {}".format(file_path)
253│ )
254│
255│ return package
I am using poetry version 1.1.13.
I am open to any alternative approaches, so long as all the dependencies are checked.
I have some .proto gRPC files I want to compile as part of the setup.py script. This requires running from grpc_tools import protoc and calling protoc before setup(args). The goal is to compile and install the pb files from pip install pkgname.
E.g.
# setup.py
# generate our pb2 files in the temp directory structure
compile_protobufs(pkgname)
# this will package the generated files and put them in site-packages or .whl
setup(
    name=pkgname,
    install_requires=['grpcio-tools', ...],
    ...
)
This works as intended: I get the pb files in my site-packages or in the wheel without their having to exist in the source folder. However, this pattern means I cannot naively pip install pkgname from scratch, as the compile_protobufs step depends on grpcio-tools, which does not get installed until setup().
I could use setup_requires, but that is on the chopping block. I could just install the dependencies first (right now I use RUN pip install -r build-require.txt && pip install pkgname/), but it still seems like there ought to be a cleaner way.
Am I even going about this pattern correctly or am I missing some packaging idiom?
My criteria:
Generally this is run inside a container, so minimizing external deps
I want the _pb2.py files regenerated each time I pip install
These files need to also make their way into any .whl or tar.
Looks like it is already documented here:
https://github.com/grpc/grpc/tree/master/tools/distrib/python/grpcio_tools#usage
So your setup.py could look like this:
#!/usr/bin/env python3
import distutils.command.install

import setuptools


class build_package_protos(setuptools.Command):
    """Custom command that compiles the package's .proto files."""
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        # Imported here, not at module level, so that setup.py can be parsed
        # before grpcio-tools has been pulled in via setup_requires.
        from grpc_tools import command
        command.build_package_protos(self.distribution.package_dir[''])


class install(distutils.command.install.install):
    # Prepend our command so the protos are compiled before the normal
    # install sub-commands run.
    _sub_command = ('build_package_protos', None,)
    _sub_commands = distutils.command.install.install.sub_commands
    sub_commands = [_sub_command] + _sub_commands


def setup():
    setuptools.setup(
        # see 'setup.cfg'
        cmdclass={
            'build_package_protos': build_package_protos,
            'install': install,
        },
        setup_requires=[
            'grpcio-tools',
        ],
    )


if __name__ == '__main__':
    setup()
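Since setup_requires is indeed on the chopping block, one modern alternative (a sketch, not from the linked documentation) is to declare the build-time dependency in a PEP 518 pyproject.toml next to setup.py, so that pip installs grpcio-tools into an isolated build environment before setup.py runs:
[build-system]
requires = ["setuptools", "wheel", "grpcio-tools"]
build-backend = "setuptools.build_meta"
With this in place, pip install pkgname can import grpc_tools during the build even in a clean container.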
We have a setup.py file for one of our Python modules. In setup.py we are actually using the numpy module to do some setup. The module itself also utilizes numpy.
The setup.py works fine on our host machines.
However, when we run the following bitbake recipe for our embedded platform, the build fails because it cannot find numpy. I checked and I cannot find what the PYTHONPATH variable is set to; running bitbake -e | grep ^PYTHONPATH= returns nothing, so perhaps this is the problem?
It may be that we are going about this wrong as well. Any insight into why it's not working, or advice on how to accomplish this with bitbake and our setup.py, would be greatly appreciated.
Our recipe is below. I used the matplotlib recipe as a baseline, as I know that matplotlib has a numpy dependency.
DESCRIPTION = "our-pythonlib"
AUTHOR = "Author"
MAINTAINER = "${AUTHOR}"
SECTION = "company-apps"
LICENSE = "CLOSED"
S="${THISDIR}/our-pythonlib"
inherit distutils
# depend on following packages to work:
RDEPENDS_${PN} += " \
python-numpy \
python-ctypes \
python-json \
"
EXTRA_OECONF = "--disable-docs --with-python-includes=${STAGING_INCDIR}/../"
inherit distutils
do_compile_prepend() {
    BUILD_SYS=${BUILD_SYS} HOST_SYS=${HOST_SYS} \
    ${STAGING_BINDIR_NATIVE}/python setup.py build ${DISTUTILS_BUILD_ARGS} || \
    true
}
# need to export these variables for python-config to work
export PYTHONPATH
export BUILD_SYS
export HOST_SYS
export STAGING_INCDIR
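One detail that stands out: RDEPENDS_${PN} only controls which packages are installed on the target at runtime, while this setup.py imports numpy at build time, so numpy must also be present in the Python used during the build. That would be a build-time DEPENDS on a native variant of the numpy recipe, assuming one is available in your layers (an untested sketch):
# build-time (host) dependency, in addition to the runtime RDEPENDS above
DEPENDS += "python-numpy-native"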
I am bitbaking a couple of Python libraries and got this warning while adding the second one of them:
WARNING: The recipe is trying to install files into a shared area when those files already exist. Those files are:
/home/ilya/beaglebone-dany/build/tmp/sysroots/beaglebone/usr/lib/python2.7/site-packages/site.py
/home/ilya/beaglebone-dany/build/tmp/sysroots/beaglebone/usr/lib/python2.7/site-packages/site.pyo
The libraries both use inherit distutils. So this is okay as far as bitbake goes, but when I tried to install the second package via opkg, I got this error:
# opkg install http://yocto.local:8080/python-requests_1.2.0-r0_armv7a-vfp-neon.ipk
Downloading http://yocto.local:8080/python-requests_1.2.0-r0_armv7a-vfp-neon.ipk.
Installing python-requests (1.2.0-r0) to root...
Configuring python-requests.
# opkg install http://yocto.local:8000/python-mylib_0.0.1-r0_armv7a-vfp-neon.ipk
Downloading http://yocto.local:8080/python-mylib_0.0.1-r0_armv7a-vfp-neon.ipk.
Installing python-mylib (0.0.1-r0) to root...
Collected errors:
* check_data_file_clashes: Package mylib-python wants to install file /usr/lib/python2.7/site-packages/site.py
But that file is already provided by package * python-requests
* check_data_file_clashes: Package mylib-python wants to install file /usr/lib/python2.7/site-packages/site.pyo
But that file is already provided by package * python-requests
* opkg_install_cmd: Cannot install package mylib-python.
Both recipes look like this:
DESCRIPTION = "Requests: HTTP for Humans"
HOMEPAGE = "http://docs.python-requests.org/en/latest/"
SECTION = "devel/python"
LICENSE = "Apache-2.0"
DEPENDS = "python"
RDEPENDS_${PN} = "python-core"
PR = "r0"
SRC_URI = "git://github.com/kennethreitz/requests;protocol=git"
S = "${WORKDIR}/git/"
inherit distutils
#NOTE: I'm not 100% sure whether I still need to export these?
export BUILD_SYS
export HOST_SYS
export STAGING_INCDIR
export STAGING_LIBDIR
BBCLASSEXTEND = "native"
I copied this from the pycurl recipe, which also had these lines, which I removed:
do_install_append() {
    rm -rf ${D}${datadir}/share
}
To get rid of the conflicting /usr/lib/python2.7/site-packages/site.py, one needs to avoid shipping this file by doing this:
do_install_append() {
    rm -f ${D}${libdir}/python*/site-packages/site.py*
}
There was another issue with the original version of the recipe: the files it installed contained just an .egg directory, and I wasn't able to import the resulting package.
It turns out that using inherit setuptools instead of inherit distutils works.
I'm not a Python expert, but all the setuptools class does is this:
inherit distutils
DEPENDS += "python-setuptools-native"
DISTUTILS_INSTALL_ARGS = "--root=${D} \
    --single-version-externally-managed \
    --prefix=${prefix} \
    --install-lib=${PYTHON_SITEPACKAGES_DIR} \
    --install-data=${datadir}"
It turns out that some modules (e.g. PyBBIO) do not recognise --single-version-externally-managed; in that case you have to fall back to inherit distutils, and you do get a working package.
Below is the full recipe for the python-requests package, which should soon be available upstream, in case you intend to use it.
DESCRIPTION = "Requests: HTTP for Humans"
HOMEPAGE = "http://docs.python-requests.org/en/latest/"
SECTION = "devel/python"
LICENSE = "Apache-2.0"
#DEPENDS = "python-core"
RDEPENDS_${PN} = "python"
PR = "r0"
SRC_URI = "git://github.com/kennethreitz/requests;protocol=git"
S = "${WORKDIR}/git/"
inherit setuptools
# need to export these variables for python-config to work
export BUILD_SYS
export HOST_SYS
export STAGING_INCDIR
export STAGING_LIBDIR
BBCLASSEXTEND = "native"
do_install_append() {
    rm -f ${D}${libdir}/python*/site-packages/site.py*
}
I installed a package from GitHub:
pip install -e git+http://github.com/un33k/django-uuslug.git#egg=django-uuslug
Then I did:
pip freeze > req.txt
I get:
django-uuslug==0.1
Now if I do a pip install -r req.txt, I get a package-not-found error, which is due to the fact that django-uuslug is not on PyPI.
Why is freeze not remembering the full path as it was given during the install?
I had the same issue. I believe it's a problem whenever the packages are in a subdirectory (e.g. src). Here's the patch that fixed it for me:
--- a/setup.py
+++ b/setup.py
@@ -11,13 +11,9 @@ setup(
     license = 'BSD',
     description = "MAC address model and form fields for Django apps.",
     long_description = read('README.rst'),
-
     author = 'Ryan Nowakowski',
     author_email = 'me@example.com',
-
-    packages = find_packages('src'),
-    package_dir = {'': 'src'},
-
+    packages = ['macaddress'],
     install_requires = ['setuptools'],
     requires = ['netaddr'],
     #tests_requires = ['django'],
I fixed it; I don't know exactly how, but I had to change the setup.py.
pip install -e git+http://github.com/un33k/django-uuslug.git#egg=django-uuslug
If you run into a similar issue and find yourself on this question, just look at the setup.py in the above package. Perhaps you can tell me how I fixed it; I just moved things around a bit.