I am bitbaking a couple of Python libraries and got this warning while adding the second one of them:
WARNING: The recipe is trying to install files into a shared area when those files already exist. Those files are:
/home/ilya/beaglebone-dany/build/tmp/sysroots/beaglebone/usr/lib/python2.7/site-packages/site.py
/home/ilya/beaglebone-dany/build/tmp/sysroots/beaglebone/usr/lib/python2.7/site-packages/site.pyo
Both libraries use inherit distutils. So this is okay as far as bitbake goes, but when I tried to install the second package via opkg, I got this error:
# opkg install http://yocto.local:8080/python-requests_1.2.0-r0_armv7a-vfp-neon.ipk
Downloading http://yocto.local:8080/python-requests_1.2.0-r0_armv7a-vfp-neon.ipk.
Installing python-requests (1.2.0-r0) to root...
Configuring python-requests.
# opkg install http://yocto.local:8000/python-mylib_0.0.1-r0_armv7a-vfp-neon.ipk
Downloading http://yocto.local:8080/python-mylib_0.0.1-r0_armv7a-vfp-neon.ipk.
Installing python-mylib (0.0.1-r0) to root...
Collected errors:
* check_data_file_clashes: Package mylib-python wants to install file /usr/lib/python2.7/site-packages/site.py
But that file is already provided by package * python-requests
* check_data_file_clashes: Package mylib-python wants to install file /usr/lib/python2.7/site-packages/site.pyo
But that file is already provided by package * python-requests
* opkg_install_cmd: Cannot install package mylib-python.
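For reference, you can check on the target which installed package already owns the clashing file (these are standard opkg subcommands, though the exact output format may vary):
# which installed package provides a given file
opkg search /usr/lib/python2.7/site-packages/site.py
# list the files shipped by an installed package
opkg files python-requests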
Both recipes look like this:
DESCRIPTION = "Requests: HTTP for Humans"
HOMEPAGE = "http://docs.python-requests.org/en/latest/"
SECTION = "devel/python"
LICENSE = "Apache-2.0"
DEPENDS = "python"
RDEPENDS_${PN} = "python-core"
PR = "r0"
SRC_URI = "git://github.com/kennethreitz/requests;protocol=git"
S = "${WORKDIR}/git/"
inherit distutils
#NOTE: I'm not 100% sure whether I still need to export these?
export BUILD_SYS
export HOST_SYS
export STAGING_INCDIR
export STAGING_LIBDIR
BBCLASSEXTEND = "native"
I copied this from the pycurl recipe, which also had these lines, which I removed:
do_install_append() {
    rm -rf ${D}${datadir}/share
}
To get rid of the conflicting /usr/lib/python2.7/site-packages/site.py, one needs to avoid shipping this file by doing this:
do_install_append() {
    rm -f ${D}${libdir}/python*/site-packages/site.py*
}
There was another issue with the original version of the recipe: the files it installed were just an .egg directory, and I wasn't able to import the resulting package.
It turns out that using inherit setuptools instead of inherit distutils works.
I'm not a Python expert, but all the setuptools class does is this:
inherit distutils
DEPENDS += "python-setuptools-native"
DISTUTILS_INSTALL_ARGS = "--root=${D} \
    --single-version-externally-managed \
    --prefix=${prefix} \
    --install-lib=${PYTHON_SITEPACKAGES_DIR} \
    --install-data=${datadir}"
It turns out that some modules (e.g. PyBBIO) do not recognise --single-version-externally-managed, so for those you have to use inherit distutils, and you still get a working package.
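For such a module, a minimal sketch (illustrative and untested; there is no real recipe name here) falls back to distutils and keeps the site.py cleanup from above:
# hypothetical fragment for a module whose setup.py rejects
# --single-version-externally-managed (e.g. PyBBIO); untested sketch
inherit distutils
do_install_append() {
    # drop the conflicting file, as described above
    rm -f ${D}${libdir}/python*/site-packages/site.py*
}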
Below is the full recipe for the python-requests package, which should soon be available upstream, in case you intend to use it.
DESCRIPTION = "Requests: HTTP for Humans"
HOMEPAGE = "http://docs.python-requests.org/en/latest/"
SECTION = "devel/python"
LICENSE = "Apache-2.0"
#DEPENDS = "python-core"
RDEPENDS_${PN} = "python"
PR = "r0"
SRC_URI = "git://github.com/kennethreitz/requests;protocol=git"
S = "${WORKDIR}/git/"
inherit setuptools
# need to export these variables for python-config to work
export BUILD_SYS
export HOST_SYS
export STAGING_INCDIR
export STAGING_LIBDIR
BBCLASSEXTEND = "native"
do_install_append() {
    rm -f ${D}${libdir}/python*/site-packages/site.py*
}
I use it on my Windows machine by downloading its binary. I also use it on Heroku via its Heroku buildpack. I don't know what operating system Replit uses, but I have tried every possible command, like:
!pip install ta-lib
!pip install talib-binary
Neither works on Replit. I thought it would work like Google Colab, but it's not the same.
Has anyone managed to use TA-Lib with Replit? If so, how did you install it?
Getting TA-Lib to work on Replit
(by installing it from source)
1.
Create a new Repl with the Nix toolset, using a Python template.
In main.py write:
import talib
print (talib.__ta_version__)
This will be our test case. If ta-lib is installed, python main.py (executed in the Shell) will print something like:
$ python main.py
b'0.6.0-dev (Jan 1 1980 00:00:00)'
2.
We need to prepare the tools for building the TA-Lib sources. There is a replit.nix file in your project's root folder (in my case it was ~/BrownDutifulLinux). Every time you execute a command like cmake, Nix reports:
cmake: command not installed. Multiple versions of this command were found in Nix.
Select one to run (or press Ctrl-C to cancel):
cmake.out
cmakeCurses.out
cmakeWithGui.out
cmakeMinimal.out
cmake_2_8.out
If you select cmake.out, a record about it is added to the replit.nix file, and the next time you call cmake, Nix will know which cmake version to launch. You could also edit the replit.nix file manually... But if you are going to add such commands my way, note that you must execute them in the Shell from your project's root folder, since the replit.nix file is located there. Otherwise Nix won't remember your choice.
In the end, the content of my replit.nix file (you can see it with cat replit.nix) was:
{ pkgs }: {
  deps = [
    pkgs.libtool
    pkgs.automake
    pkgs.autoconf
    pkgs.cmake
    pkgs.python38Full
  ];
  env = {
    PYTHON_LD_LIBRARY_PATH = pkgs.lib.makeLibraryPath [
      # Needed for pandas / numpy
      pkgs.stdenv.cc.cc.lib
      pkgs.zlib
      # Needed for pygame
      pkgs.glib
      # Needed for matplotlib
      pkgs.xorg.libX11
    ];
    PYTHONBIN = "${pkgs.python38Full}/bin/python3.8";
    LANG = "en_US.UTF-8";
  };
}
This means I executed libtool, autoconf, automake and cmake in the Shell. I always chose the generic suggestion from Nix, without a specific version. Note: some of the commands may report errors, since we are executing them incorrectly just to get them added to replit.nix.
3.
Once the build tools are set up, we need to get and build the TA-Lib C library sources. To do that, execute in the Shell:
git clone https://github.com/TA-Lib/ta-lib.git
then
cd ta-lib/
libtoolize
autoreconf --install
./configure
If the configure script completes without any problems, build the library with:
make -j4
The build will end with some compilation errors, but they relate to additional tools (used to add new TA-Lib indicators) that are built at the end, not to the library itself. The library itself compiles successfully, and you should be able to see it with:
$ ls ./src/.libs/
libta_lib.a libta_lib.lai libta_lib.so.0
libta_lib.la libta_lib.so libta_lib.so.0.0.0
Now we have our C library built, but we can't install it into the system default folders, so we have to use the library as is, from the folders where it was built. All we need is one more bit of preparation:
mkdir ./include/ta-lib
cp ./include/*.h ./include/ta-lib/
This copies the library headers into a ta-lib subfolder, since they are designed to be included from such a subfolder (which they don't have here, because we cannot perform the installation step).
4.
Now we have the TA-Lib C library built and prepared to be used locally from its build folders. All that's left is to compile the Python wrapper for it. But the wrapper will look for the library only in the system default folders, so we need to tell it where our library is.
To do this, execute pwd and remember the absolute path to your project's root folder. In my case it was:
/home/runner/FormalPleasedOffice
Then adjust the paths (there are two) in the following command so that they point into your project:
TA_INCLUDE_PATH=/home/runner/FormalPleasedOffice/ta-lib/include/ TA_LIBRARY_PATH=/home/runner/FormalPleasedOffice/ta-lib/src/.libs/ pip install ta-lib
This is a single command, not two. If the paths were shorter, it would look like:
TA_INCLUDE_PATH=/path1/ TA_LIBRARY_PATH=/path2/ pip install ta-lib.
After executing this command, the wrapper will be installed with two additional paths where it looks for the library and its header files.
That's actually all.
An alternative way would be to clone the wrapper sources, edit its setup.py, and install the wrapper manually. Just for the record, that would be:
cd ~/Your_project
git clone https://github.com/mrjbq7/ta-lib.git ta-lib-wrapper
cd ta-lib-wrapper
Here, edit setup.py: find the lines include_dirs = [ and library_dirs = [ and append your paths to those lists (see the sketch after these commands). Then you just need to:
python setup.py build
pip install .
Note the dot at the end.
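For illustration, the edited lists in setup.py might end up looking roughly like this (the pre-existing entries are placeholders; the appended paths are the ones from my project, so adjust them to yours):
# illustrative fragment of ta-lib-wrapper/setup.py; the surrounding code differs
include_dirs = [
    '/usr/include',                                        # existing entries (placeholders)
    '/home/runner/FormalPleasedOffice/ta-lib/include',     # appended
]
library_dirs = [
    '/usr/lib',                                            # existing entries (placeholders)
    '/home/runner/FormalPleasedOffice/ta-lib/src/.libs',   # appended
]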
5.
Go to the project's folder and try our python script:
$ python main.py
b'0.6.0-dev (Jan 1 1980 00:00:00)'
Bingo!
The answer by truf is correct.
After you add
pkgs.libtool
pkgs.automake
pkgs.autoconf
pkgs.cmake
to the replit.nix dependencies, run:
git clone https://github.com/TA-Lib/ta-lib.git
cd ta-lib/
libtoolize
autoreconf --install
./configure
make -j4
mkdir ./include/ta-lib
cp ./include/*.h ./include/ta-lib/
TA_INCLUDE_PATH=/home/runner/FormalPleasedOffice/ta-lib/include/ TA_LIBRARY_PATH=/home/runner/FormalPleasedOffice/ta-lib/src/.libs/ pip install ta-lib
Note: FormalPleasedOffice should be replaced with your project name.
Done.
Here is the YouTube video:
https://www.youtube.com/watch?v=u20y-nUMo5I
I am trying to create a Yocto recipe for the scikit-learn package. It depends on the scipy package. I was able to successfully build the scipy package using https://github.com/gpanders/meta-scipy.
When I run bitbake python3-scikit-learn, I get the error below:
ModuleNotFoundError: No module named 'scipy'
I am executing the commands in the following order.
Once I had cloned/copied the scipy recipes and the patches listed in meta-scipy, I ran bitbake python3-scipy and the build was successful.
Then I created a recipe file named python3-scikit-learn_0.23.2.bb with the contents below.
PYPI_PACKAGE = "scikit-learn"
LICENSE = "BSD"
LIC_FILES_CHKSUM = "file://PKG-INFO;beginline=8;endline=8;md5=40ee42dc5a49f1617c5c78f16c50e065"
SRC_URI[sha256sum] = "20766f515e6cd6f954554387dfae705d93c7b544ec0e6c6a5d8e006f6f7ef480"
inherit pypi setuptools3
#DEPENDS = "${PYTHON_PN}-numpy-native ${PYTHON_PN}-numpy ${PYTHON_PN}-scipy ${PYTHON_PN}-joblib ${PYTHON_PN}"
DEPENDS = "${PYTHON_PN}-numpy-native ${PYTHON_PN}-numpy ${PYTHON_PN}-scipy ${PYTHON_PN}"
RDEPENDS_${PN} += "${PYTHON_PN}-numpy ${PYTHON_PN}-scipy"
When I run bitbake python3-scikit-learn, I get ModuleNotFoundError: No module named 'scipy'.
I checked the path where the devshell python3 looks (poky/build/tmp-glibc/work/aarch64-oe-linux/python3-scikit-learn/0.23.2-r0/recipe-sysroot-native/usr/lib/python3.8/site-packages), and I can only see the numpy package there; the scipy package is missing.
ls command output:
numpy
numpy-1.17.4-py3.8.egg-info
pkg_resources
__pycache__
README.txt
setuptools
setuptools-45.2.0-py3.8.egg-info
Can someone point me to how to include the python3-scipy package so that it is included/copied into the devshell, or do I need to update/fix something else?
I appreciate any guidance on this.
You can also run:
bitbake -c devshell python3-scipy
and see exactly where the recipe packages everything into the rootfs. By default, the rootfs is this:
https://git.yoctoproject.org/cgit/cgit.cgi/poky/tree/meta/conf/bitbake.conf#n457
IMAGE_ROOTFS = "${WORKDIR}/rootfs"
so check out what python3-scipy puts into ${WORKDIR}/rootfs (${WORKDIR} is where you are dropped when you enter the devshell, so just cd into rootfs from there).
If python3-scipy puts it somewhere that is not on the PATH, you can add that location to your path.
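For example, once inside that devshell (which, per the above, drops you into the recipe's ${WORKDIR}), a quick way to poke around (directory names are the defaults, so adjust if yours differ):
ls                                       # look for rootfs/ or image/ (do_install typically stages its output under image/)
find . -maxdepth 6 -type d -name scipy   # locate where the scipy module actually ended up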
You can see how python3-scipy looks for libraries here:
https://github.com/gpanders/meta-scipy/blob/1c07824202af668ef1539c3de392cf737c5ba3fd/recipes-devtools/python/python3-scipy_1.5.3.bb#L29
# Tell Numpy to look in target sysroot site-packages directory for libraries
LDFLAGS_append = " -L${STAGING_LIBDIR}/${PYTHON_DIR}/site-packages/numpy/core/lib"
I am writing a custom Yocto recipe that should install a Python package from a .whl file.
I tried it using a recipe that contains:
inherit pypi setuptools
PYPI_SRC_URI="http://ci.tensorflow.org/view/Nightly/job/nightly-pi-zero/lastSuccessfulBuild/artifact/output-artifacts/tensorflow-1.5.0rc1-cp27-none-any.whl"
But it does not work that way: it states that a setup.py file is missing, and when I tried to write a custom do_compile task that runs pip install <PATH-TO-WHL>, it says that pip is an unknown command.
When installing .whl files directly onto the target system one would type the following:
pip install <path-to-whl-file>
Thanks for your help!
A .whl package is just a .zip file with Python sources, and precompiled binaries for a certain platform (you can inspect it with unzip, as shown after the recipe below).
So, you can do something like this:
COMPATIBLE_HOST = "i686.*-mingw.*"
SRC_URI = "https://files.pythonhosted.org/packages/d8/9d/7a8cad803ef73f47134ae5c3804e20b54149ce62a7d1337204f3cf2d1fa1/MarkupSafe-1.1.1-cp35-cp35m-win32.whl;downloadfilename=MarkupSafe-1.1.1-cp35-cp35m-win32.zip;subdir=${BP}"
SRC_URI[md5sum] = "a948c70a1241389d7120db90d69079ca"
SRC_URI[sha256sum] = "6dd73240d2af64df90aa7c4e7481e23825ea70af4b4922f8ede5b9e35f78a3b1"
inherit nativesdk python3-dir
LICENSE = "BSD-3-Clause"
PV = "1.1.1"
PN = "nativesdk-python3-markupsafe"
LIC_FILES_CHKSUM = "file:///${S}/MarkupSafe-1.1.1.dist-info/LICENSE.rst;md5=ffeffa59c90c9c4a033c7574f8f3fb75"
do_unpack[depends] += "unzip-native:do_populate_sysroot"
PROVIDES += "nativesdk-python3-markupsafe"
DEPENDS += "nativesdk-python3"
FILES_${PN} += "\
    ${libdir}/${PYTHON_DIR}/site-packages/* \
"
do_install() {
    install -d ${D}${libdir}/${PYTHON_DIR}/site-packages/MarkupSafe-1.1.1.dist-info
    install -d ${D}${libdir}/${PYTHON_DIR}/site-packages/markupsafe
    install -m 644 ${S}/markupsafe/* ${D}${libdir}/${PYTHON_DIR}/site-packages/markupsafe/
    install -m 644 ${S}/MarkupSafe-1.1.1.dist-info/* ${D}${libdir}/${PYTHON_DIR}/site-packages/MarkupSafe-1.1.1.dist-info/
}
I haven't tested it yet, but it already forms a proper nativesdk package.
Note the downloadfilename= parameter in SRC_URI: without it, the .whl file would not be extracted.
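Since a .whl really is just a zip archive, you can also inspect it up front after downloading it, e.g. to find the dist-info paths used in LIC_FILES_CHKSUM and do_install:
unzip -l MarkupSafe-1.1.1-cp35-cp35m-win32.whl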
Based on the answer https://stackoverflow.com/a/57694762/5422708, I would like to share a working recipe for the engineering_notation 0.6.0 module, which is also distributed only as a .whl package:
SUMMARY = "To easily work with human-readable engineering notation."
HOMEPAGE = "https://github.com/slightlynybbled/engineering_notation"
SRC_URI = "https://files.pythonhosted.org/packages/d4/c4/4712b8020b8a3ada129581be891e2dbbd6a4cf54195ea2b80f89bbc51756/engineering_notation-${PV}-py3-none-any.whl;downloadfilename=engineering_notation-${PV}-py3-none-any.zip;subdir=${BP}"
SRC_URI[md5sum] = "5684efec41bc0738bb1fe625a71ffaf7"
SRC_URI[sha256sum] = "1ea1e450d575b4804723d0711b0609d2711dffac2f4b5548ee632c16a636d9f6"
inherit python3-dir
LICENSE = "MIT"
LIC_FILES_CHKSUM = "file:///${S}/engineering_notation-${PV}.dist-info/METADATA;md5=5a9ae92d9fbf02bbcd5e4e94e6356cd3"
do_unpack[depends] += "unzip-native:do_populate_sysroot"
DEPENDS += "python3"
FILES:${PN} += "\
    ${libdir}/${PYTHON_DIR}/site-packages/engineering_notation \
    ${libdir}/${PYTHON_DIR}/site-packages/engineering_notation-${PV}.dist-info \
"
do_install() {
    install -d ${D}${libdir}/${PYTHON_DIR}/site-packages/engineering_notation
    install -d ${D}${libdir}/${PYTHON_DIR}/site-packages/engineering_notation-${PV}.dist-info
    install -m 644 ${S}/engineering_notation/* ${D}${libdir}/${PYTHON_DIR}/site-packages/engineering_notation/
    install -m 644 ${S}/engineering_notation-${PV}.dist-info/* ${D}${libdir}/${PYTHON_DIR}/site-packages/engineering_notation-${PV}.dist-info/
}
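To pull the package into an image, the usual mechanism applies; assuming the recipe file is named something like python3-engineering-notation_0.6.0.bb (my assumption, it is not stated above), that would be:
# in local.conf or an image recipe; package name assumes the recipe name above
IMAGE_INSTALL:append = " python3-engineering-notation"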
We have a setup.py file for one of our Python modules. In setup.py we are actually using the numpy module to do some setup. The module itself also utilizes numpy.
The setup.py works fine on our host machines.
However, when we run the following bitbake recipe for our embedded platform, the build fails as it cannot find numpy. I checked and I cannot find what the PYTHONPATH variable is set to. Running bitbake -e | grep ^PYTHONPATH= returns nothing; so perhaps this is the problem?
It may be that we are going about this wrong as well. Any insight into why it's not working, or advice on how to use bitbake with our setup.py, would be greatly appreciated.
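For context, "using numpy in setup.py" means something like the sketch below; this is not our actual setup.py (which isn't shown here), just an illustration of the import that fails when numpy isn't visible to the build-time Python:
# hypothetical setup.py fragment, for illustration only
from setuptools import setup, Extension
import numpy  # this import is what fails under bitbake if numpy isn't on the build-time Python path

setup(
    name="our-pythonlib",
    ext_modules=[
        Extension("fastmod", ["fastmod.c"],          # "fastmod" is a made-up module name
                  include_dirs=[numpy.get_include()]),
    ],
)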
Our recipe is below. I used the matplotlib recipe as a baseline, since I know that matplotlib has a numpy dependency.
DESCRIPTION = "our-pythonlib"
AUTHOR = "Author"
MAINTAINER = "${AUTHOR}"
SECTION = "company-apps"
LICENSE = "CLOSED"
S="${THISDIR}/our-pythonlib"
inherit distutils
# depend on following packages to work:
RDEPENDS_${PN} += " \
    python-numpy \
    python-ctypes \
    python-json \
"
EXTRA_OECONF = "--disable-docs --with-python-includes=${STAGING_INCDIR}/../"
inherit distutils
do_compile_prepend() {
    BUILD_SYS=${BUILD_SYS} HOST_SYS=${HOST_SYS} \
    ${STAGING_BINDIR_NATIVE}/python setup.py build ${DISTUTILS_BUILD_ARGS} || \
    true
}
# need to export these variables for python-config to work
export PYTHONPATH
export BUILD_SYS
export HOST_SYS
export STAGING_INCDIR
I installed a package from GitHub:
pip install -e git+http://github.com/un33k/django-uuslug.git#egg=django-uuslug
Then I did:
pip freeze > req.txt
I get:
django-uuslug==0.1
Now if I do a pip install -r req.txt, I get a "package not found" error, which is due to the fact that django-uuslug is not on PyPI.
Why is freeze not remembering the full path as it was given during the install?
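For reference, what I expected pip freeze to produce for an editable VCS install is a line roughly of this form (the exact output depends on the pip version; the commit hash here is a placeholder):
-e git+http://github.com/un33k/django-uuslug.git@<commit-sha>#egg=django-uuslug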
I had the same issue. I believe it's a problem whenever the packages are in a subdirectory (e.g. src). Here's the patch that fixed it for me.
--- a/setup.py
+++ b/setup.py
@@ -11,13 +11,9 @@ setup(
     license = 'BSD',
     description = "MAC address model and form fields for Django apps.",
     long_description = read('README.rst'),
-
     author = 'Ryan Nowakowski',
     author_email = 'me@example.com',
-
-    packages = find_packages('src'),
-    package_dir = {'': 'src'},
-
+    packages = ['macaddress'],
     install_requires = ['setuptools'],
     requires = ['netaddr'],
     #tests_requires = ['django'],
I fixed it. I don't know exactly how, but I had to change the setup.py:
pip install -e git+http://github.com/un33k/django-uuslug.git#egg=django-uuslug
If you run into a similar issue and find yourself on this question, just look at the setup.py in the above package. Perhaps you can tell me how I fixed it; I just moved things around a bit.