Here's my setup.py:
setup(
    name='shipane_sdk',
    version='1.0.0.a5',
    # ...
    data_files=[(os.path.join(os.path.expanduser('~'), '.shipane_sdk', 'config'),
                 ['config/scheduler-example.ini'])],
    # ...
)
Packaging & uploading commands:
python setup.py sdist
python setup.py bdist_wheel --universal
twine upload dist/*
Installing command:
pip install shipane_sdk
However, it doesn't install config/scheduler-example.ini under ~/.shipane_sdk.
The pip documentation says:
setuptools allows absolute “data_files” paths, and pip honors them as
absolute, when installing from sdist. This is not true when installing
from wheel distributions. Wheels don’t support absolute paths, and
they end up being installed relative to “site-packages”. For
discussion see wheel Issue #92.
Do you know how to make the installation work from the sdist?
There are multiple solutions to this problem, and it is very confusing how inconsistently the packaging tools work. Some time ago I found that the following workaround worked best for me with sdist (note that it doesn't work with wheels!):
Instead of using data_files, attach the files to your package using MANIFEST.in, which in your case could look like this:
include config/scheduler-example.ini
Copy the files "manually" to the chosen location using this snippet in setup.py:
import os
import shutil
import sys

# APP_NAME and CONFIG_PATH are defined elsewhere in setup.py
if 'install' in sys.argv:
    from pkg_resources import Requirement, resource_filename

    # retrieve the temporary path where the package has been extracted to for installation
    conf_path_temp = resource_filename(Requirement.parse(APP_NAME), "conf")
    # if the config directory tree doesn't exist, create it
    if not os.path.exists(CONFIG_PATH):
        os.makedirs(CONFIG_PATH)
    # copy every file from that location into the specified ``CONFIG_PATH``
    for file_name in os.listdir(conf_path_temp):
        file_path_full = os.path.join(conf_path_temp, file_name)
        if os.path.isfile(file_path_full):
            shutil.copy(file_path_full, CONFIG_PATH)
In my case "conf" was the subdirectory in the package that contained my data files, and they were supposed to be installed into CONFIG_PATH, which was something like /etc/APP_NAME.
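As an aside (not part of the original answer): pip can also be forced to build from the sdist with pip install --no-binary <pkg> <pkg>. A wheel-friendly alternative is to ship the file as package_data and copy it into the user's config directory on first use instead of at install time; install_default_config below is a hypothetical helper sketching that idea:

```python
import os
import shutil


def install_default_config(src_file, config_dir):
    """Copy src_file into config_dir unless a copy already exists there."""
    os.makedirs(config_dir, exist_ok=True)
    dest = os.path.join(config_dir, os.path.basename(src_file))
    if not os.path.exists(dest):
        shutil.copy(src_file, dest)
    return dest
```

Because the copy runs at first import (or first CLI invocation) rather than during installation, it behaves identically for sdists and wheels.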
Related
I'm working on a Python package that is based on pybind11 and built by CMake.
In this project a prebuilt third-party library, along with its header files, is downloaded as a dependency of the pybind11 module.
The CMake file looks like this:
# unpack.sh downloads a tar file containing libvega.so and the .so files libvega.so depends on, and unpacks them into the build dir
add_custom_target(
    build-vega
    COMMAND bash ${CMAKE_BINARY_DIR}/unpack.sh)

list(APPEND VEGA_LIBRARIES ${VEGA_INSTALL_DIR}/lib/libvega.so)

pybind11_add_module(${CMAKE_PROJECT_NAME})
add_dependencies(${CMAKE_PROJECT_NAME} build-vega)

list(APPEND SRC_FILES vegapy.cpp)
target_sources(${CMAKE_PROJECT_NAME} PRIVATE ${SRC_FILES})
target_link_libraries(${CMAKE_PROJECT_NAME} PRIVATE ${VEGA_LIBRARIES})
The setup file is based on cmake-example, with only some names changed.
When the package is installed via pip install ., vegapy.cpython-38-x86_64-linux-gnu.so is installed into Python's dist-packages, but it still depends on the .so files inside the project build dir:
root@ubuntu:/usr/local/lib/python3.8/dist-packages# ldd vegapy.cpython-38-x86_64-linux-gnu.so
libvega.so => /data/jgq/pv/build/temp.linux-x86_64-3.8/vegapy/vega/lib/libvega.so (0x00007f65a00cc000)
libexpreval.so => not found # depended by libvega.so
libmanage.so => not found # depended by libvega.so
This way, when vegapy is imported in Python it works, but if /data/jgq/pv/build/temp.linux-x86_64-3.8/vegapy/vega/lib is deleted, the import fails because libvega.so cannot be found, even if I copy those .so files into dist-packages.
I expect that:
libvega.so and its dependencies will be installed into the same dir as vegapy.cpython-38-x86_64-linux-gnu.so.
vegapy should link against the installed .so files in dist-packages when Python imports the vegapy package.
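The usual build-time fix (an assumption, not from the question) is to install the dependent .so files next to the extension and bake a relative RPATH into it, e.g. with set_target_properties(... INSTALL_RPATH "$ORIGIN/lib") in CMake. A runtime workaround is to have the package's __init__.py preload the bundled libraries with ctypes before the compiled module is imported; preload_bundled_libs is a hypothetical sketch:

```python
import ctypes
import glob
import os


def preload_bundled_libs(pkg_dir):
    """Load every .so bundled under pkg_dir/lib into the process with
    RTLD_GLOBAL, so the extension module can resolve its symbols without
    relying on the build directory still existing."""
    loaded = []
    for lib in sorted(glob.glob(os.path.join(pkg_dir, "lib", "*.so"))):
        ctypes.CDLL(lib, mode=ctypes.RTLD_GLOBAL)
        loaded.append(lib)
    return loaded
```

Called as preload_bundled_libs(os.path.dirname(__file__)) at the top of the package's __init__.py, before the import of the compiled module, assuming the .so files were packaged into a lib/ subdirectory via package_data.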
How do I add dependencies inside a setup.py file? For example, I am writing this script on a VM and want to check whether certain dependencies, like the JDK or Docker, are present, and if not, install them automatically on the VM using this script.
Please do tell me as soon as possible, as it is required for my project.
In its simplest form, you can add (Python) dependencies that can be installed via pip as follows:
from setuptools import setup

setup(
    ...
    install_requires=["install-jdk", "docker>=4.3"],
    ...
)
Alternatively, write down a requirements.txt file and then use it:
with open("requirements.txt") as requirements_file:
    # strip() instead of slicing off the last character, which breaks
    # when the final line has no trailing newline
    requirements = [line.strip() for line in requirements_file]

setup(
    ...
    install_requires=requirements,
    ...
)
Whenever you execute python setup.py install, these dependencies are checked against the libraries available in your VM, and any that are missing (or have a version mismatch) are installed (or replaced). More information can be found here.
Refer to https://github.com/boto/s3transfer/blob/develop/setup.py and check the requires variables.
You can also refer to many other open-source projects.
You can add dependencies using setuptools; however, it can only check dependencies on Python packages.
Because of that, you could check the JDK and Docker installations manually, before calling setup().
You could invoke the commands as in the code below and check the exit status:
import os

# os.system returns the command's exit status; non-zero means the tool is missing or failed
java_status = os.system("java -version")
docker_status = os.system("docker version --format '{{.Server.Version}}'")
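A quieter variant (a sketch, not from the answer above) uses shutil.which to test whether the executables are on PATH at all, without actually running them; missing_tools is a hypothetical helper:

```python
import shutil


def missing_tools(tools):
    """Return the subset of tool names that cannot be found on PATH."""
    return [tool for tool in tools if shutil.which(tool) is None]


# e.g. inside setup.py, before calling setup():
# if missing_tools(["java", "docker"]):
#     raise SystemExit("please install the JDK and Docker first")
```

This only proves the binary exists, not that the daemon (in Docker's case) is actually running, so the os.system version is still useful when you need a live check.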
I'm trying to build an egg for my Python project using setuptools; however, whenever I build an egg, all of the contents are built with the first letter of each file/folder removed.
For example, my parent folder is called dp, which gets renamed to p. I.e. when I unzip the egg file, I see a parent folder named p and another folder named GG-INFO (these should be named dp and EGG-INFO respectively). All of the other folders inside folder p are named correctly.
This is an issue because I reference functions in modules within that folder - e.g. from dp.module import function - which doesn't work, since it complains about not finding the folder dp (which is true, because for some reason it's been renamed p).
My setup.py file looks like this:
from setuptools import setup, find_packages

setup(
    name="dp",
    version="1.0",
    author="XXXX",
    author_email="XXXX",
    description="Data pipeline for XXX algorithm.",
    long_description_content_type="text/markdown",
    url="XXXX",
    packages=find_packages(),
    package_data={'': ['*.sql', '*.json', '*.txt']},
    include_package_data=True,
    classifiers=[
        "Programming Language :: Python :: 3"
    ],
    python_requires='>=3.6',
    install_requires=['argparse', 'boto3', 'datetime', 'mmlspark', 'pandas', 'pyspark', 'pypandoc', 'scikit-learn',
                      'numpy', 'googleads', 'mlflow']
)
I've tried renaming the parent directory and the same thing happens. I'm running this via PyCharm (updated to the latest version) on Mac OS Mojave.
Would appreciate any ideas on how to fix this.
Update:
I used a different method to generate the egg which unblocked me, but the issue still remains with the initial method.
Steps to reproduce
Create a new project in PyCharm.
Add a setup.py file to the root, see above.
Tools -> Run setup.py task -> bdist.egg
This generates an egg. Rename the extension to file_name.zip, unzip the file, and check the contents of the folder.
I found that the first letter of the folder names was always missing (I changed the folder names and the first letter was consistently removed).
Workaround
Instead of building an egg via PyCharm, I used the command python setup.py bdist_egg in the terminal, which created an egg without any issues.
I think this confirms it is a PyCharm bug(?). A colleague managed to intermittently reproduce this bug using PyCharm.
Give the wheel a try.
pip install wheel setuptools pip -U
pip wheel --no-deps --wheel-dir=build .
I have some .proto gRPC files I want to compile as part of the setup.py script. This requires running from grpc_tools import protoc and calling protoc before setup(args). The goal is to compile and install the pb files from pip install pkgname.
E.g.
# setup.py

# generate our pb2 files in the temp directory structure
compile_protobufs(pkgname)

# this will package the generated files and put them in site-packages or the .whl
setup(
    name=pkgname,
    install_requires=['grpcio-tools', ...],
    ...
)
This works as intended, I get the pb files in my site-packages or in the wheel without them having to exist in the source folder. However, this pattern means I cannot naively pip install pkgname from scratch, as the step compile_protobufs depends on grpcio-tools, which does not get installed until setup().
I could use setup_requires, but that is on the chopping block. I could just install the dependencies first (right now I use RUN pip install -r build-require.txt && pip install pkgname/ ), but it still seems like there ought to be a cleaner way.
Am I even going about this pattern correctly or am I missing some packaging idiom?
My criteria:
Generally this is run inside a container, so minimizing external deps
I want the _pb2.py files regenerated each time I pip install
These files need to also make their way into any .whl or tar.
Looks like it is already documented here:
https://github.com/grpc/grpc/tree/master/tools/distrib/python/grpcio_tools#usage
So your setup.py could look like this:
#!/usr/bin/env python3
import distutils.command.install

import setuptools


class build_package_protos(setuptools.Command):
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        from grpc_tools import command
        command.build_package_protos(self.distribution.package_dir[''])


class install(distutils.command.install.install):
    _sub_command = ('build_package_protos', None,)
    _sub_commands = distutils.command.install.install.sub_commands
    sub_commands = [_sub_command] + _sub_commands


def setup():
    setuptools.setup(
        # see 'setup.cfg'
        cmdclass={
            'build_package_protos': build_package_protos,
            'install': install,
        },
        setup_requires=[
            'grpcio-tools',
        ],
    )


if __name__ == '__main__':
    setup()
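For what it's worth (an addition, not part of the answer above): the modern replacement for setup_requires is a PEP 518 [build-system] table in pyproject.toml, which makes pip install the build dependencies into an isolated build environment before setup.py runs, so grpcio-tools is available even on a fresh pip install. A minimal sketch, with unpinned versions as an assumption:

```toml
[build-system]
requires = ["setuptools", "wheel", "grpcio-tools"]
build-backend = "setuptools.build_meta"
```

With this in place, a plain pip install pkgname regenerates the _pb2.py files on every build without any manual pre-installation step.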
I'm trying to provide a bash completion script for my CLI tool that is written in Python. According to the Python Packaging Authority, data_files in setup.py is exactly what I need:
Although configuring package_data is sufficient for most needs, in some cases you may need to place data files outside of your packages. The data_files directive allows you to do that. It is mostly useful if you need to install files which are used by other programs, which may be unaware of Python packages.
So I added the completion file like this:
data_files=[
    ('/usr/share/bash-completion/completions', ['completion/dotenv']),
],
and try to test it with:
pip install -e .
in my virtual environment. However, the completion script does not get installed. Did I forget something, or is pip broken? The full project can be found here.
I had the same issue and implemented a workaround.
It seems to me that python setup.py develop (or pip install -e .) does not run the same function as python setup.py install.
In fact, by looking at the source code, I have noticed that python setup.py install runs build_py:
https://github.com/python/cpython/blob/master/Lib/distutils/command/build_py.py#L134
https://github.com/pypa/setuptools/blob/master/setuptools/command/build_py.py
After a bit of digging I opted to override the develop command as follows. The following code is Python 3.6:
""" SetupTool Entry Point """
import sys
from pathlib import Path
from shutil import copy2

from setuptools import find_packages, setup
from setuptools.command.develop import develop

# create os_data_files that will be used by the default install command
os_data_files = [
    (
        f"{sys.prefix}/config",  # providing an absolute path; sys.prefix will differ in a venv
        [
            "src/my_package/config/properties.env",
        ],
    ),
]


def build_package_data():
    """ implement the necessary function for develop """
    for dest_dir, filenames in os_data_files:
        for filename in filenames:
            print(
                "CUSTOM SETUP.PY (build_package_data): copy %s to %s"
                % (filename, dest_dir)
            )
            copy2(filename, dest_dir)


def make_dirstruct():
    """ Set the logging path """
    # BASE_DIR is defined elsewhere in the real project
    for subdir in ["config"]:
        print("CUSTOM SETUP.PY (make_dirstruct): creating %s" % subdir)
        (Path(BASE_DIR) / subdir).mkdir(parents=True, exist_ok=True)


class CustomDevelopCommand(develop):
    """ Customized setuptools develop command """

    def run(self):
        develop.run(self)
        make_dirstruct()
        build_package_data()


# provide the relevant information for stackoverflow
setup(
    package_dir={"": "src"},
    packages=find_packages("src"),
    data_files=os_data_files,
    cmdclass={"develop": CustomDevelopCommand},
)