Pip install package prior to setup() in setup.py - python

I have some .proto gRPC files I want to compile as part of the setup.py script. This requires running from grpc_tools import protoc and calling protoc before setup(args). The goal is to compile and install the pb files from pip install pkgname.
E.g.
# setup.py

# generate our pb2 files in the temp directory structure
compile_protobufs(pkgname)

# this will package the generated files and put them in site-packages or .whl
setup(
    name=pkgname,
    install_requires=['grpcio-tools', ...],
    ...
)
This works as intended, I get the pb files in my site-packages or in the wheel without them having to exist in the source folder. However, this pattern means I cannot naively pip install pkgname from scratch, as the step compile_protobufs depends on grpcio-tools, which does not get installed until setup().
I could use setup_requires, but that is on the chopping block. I could just install the dependencies first (right now I use RUN pip install -r build-require.txt && pip install pkgname/ ), but it still seems like there ought to be a cleaner way.
Am I even going about this pattern correctly or am I missing some packaging idiom?
My criteria:
Generally this is run inside a container, so minimizing external deps
I want the _pb2.py files regenerated each time I pip install
These files need to also make their way into any .whl or tar.
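For concreteness, the compile_protobufs step in the snippet above could be something like this (a sketch only; the directory layout, glob pattern, and protoc arguments are assumptions, not part of the original question):
import glob
from grpc_tools import protoc  # provided by grpcio-tools

def compile_protobufs(pkgname):
    # compile every .proto under the package into _pb2.py / _pb2_grpc.py modules
    for proto in glob.glob('%s/**/*.proto' % pkgname, recursive=True):
        protoc.main([
            'grpc_tools.protoc',
            '-I.',
            '--python_out=.',
            '--grpc_python_out=.',
            proto,
        ])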

Looks like it is already documented here:
https://github.com/grpc/grpc/tree/master/tools/distrib/python/grpcio_tools#usage
So your setup.py could look like this:
#!/usr/bin/env python3
import distutils.command.install

import setuptools

class build_package_protos(setuptools.Command):
    user_options = []

    def initialize_options(self):
        pass

    def finalize_options(self):
        pass

    def run(self):
        from grpc_tools import command
        command.build_package_protos(self.distribution.package_dir[''])

class install(distutils.command.install.install):
    _sub_command = ('build_package_protos', None,)
    _sub_commands = distutils.command.install.install.sub_commands
    sub_commands = [_sub_command] + _sub_commands

def setup():
    setuptools.setup(
        # see 'setup.cfg'
        cmdclass={
            'build_package_protos': build_package_protos,
            'install': install,
        },
        setup_requires=[
            'grpcio-tools',
        ],
    )

if __name__ == '__main__':
    setup()
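Note that command.build_package_protos(self.distribution.package_dir['']) assumes a package_dir mapping is declared, either in the setup.cfg the comment refers to or directly in setup(); roughly (the 'src' layout is a placeholder, not from the original answer):
setuptools.setup(
    package_dir={'': 'src'},                    # .proto files live under src/ in this sketch
    packages=setuptools.find_packages('src'),
    # plus the cmdclass and setup_requires shown above
)
Without that mapping, self.distribution.package_dir stays None and the custom command would fail on the lookup.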


Cannot execute function in python setup.py [duplicate]

Is it possible to specify a post-install Python script file as part of the setuptools setup.py file so that a user can run the command:
python setup.py install
on a local project file archive, or
pip install <name>
for a PyPI project and the script will be run at the completion of the standard setuptools install? I am looking to perform post-install tasks that can be coded in a single Python script file (e.g. deliver a custom post-install message to the user, pull additional data files from a different remote source repository).
I came across this SO answer from several years ago that addresses the topic and it sounds as though the consensus at that time was that you need to create an install subcommand. If that is still the case, would it be possible for someone to provide an example of how to do this so that it is not necessary for the user to enter a second command to run the script?
Note: The solution below only works when installing a source distribution zip or tarball, or installing in editable mode from a source tree. It will not work when installing from a binary wheel (.whl)
This solution is more transparent:
You will make a few additions to setup.py and there is no need for an extra file.
Also you need to consider two different post-installation cases: one for development/editable mode and one for install mode.
Add these two classes, which include your post-install script, to setup.py:
from setuptools import setup
from setuptools.command.develop import develop
from setuptools.command.install import install

class PostDevelopCommand(develop):
    """Post-installation for development mode."""
    def run(self):
        develop.run(self)
        # PUT YOUR POST-INSTALL SCRIPT HERE or CALL A FUNCTION

class PostInstallCommand(install):
    """Post-installation for installation mode."""
    def run(self):
        install.run(self)
        # PUT YOUR POST-INSTALL SCRIPT HERE or CALL A FUNCTION
and add the cmdclass argument to the setup() call in setup.py:
setup(
    ...
    cmdclass={
        'develop': PostDevelopCommand,
        'install': PostInstallCommand,
    },
    ...
)
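For illustration, the placeholder comment could call any function; a hypothetical post-install action (not from the original answer) might be:
import os

def _create_user_config():
    # hypothetical example: create a per-user config directory after install
    cfg_dir = os.path.expanduser('~/.mypkg')
    os.makedirs(cfg_dir, exist_ok=True)
    print('created %s' % cfg_dir)
You would then call _create_user_config() from both run() methods in place of the placeholder.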
You can even call shell commands during installation, like in this example which does pre-installation preparation:
from setuptools import setup
from setuptools.command.develop import develop
from setuptools.command.install import install
from subprocess import check_call

class PreDevelopCommand(develop):
    """Pre-installation for development mode."""
    def run(self):
        check_call("apt-get install this-package".split())
        develop.run(self)

class PreInstallCommand(install):
    """Pre-installation for installation mode."""
    def run(self):
        check_call("apt-get install this-package".split())
        install.run(self)

setup(
    ...
    cmdclass={
        'develop': PreDevelopCommand,
        'install': PreInstallCommand,
    },
    ...
)
P.S. There are no pre-install entry points available in setuptools. Read this discussion if you are wondering why.
Note: The solution below only works when installing a source distribution zip or tarball, or installing in editable mode from a source tree. It will not work when installing from a binary wheel (.whl)
This is the only strategy that has worked for me when the post-install script requires that the package dependencies have already been installed:
import atexit

import setuptools
from setuptools.command.install import install

def _post_install():
    print('POST INSTALL')

class new_install(install):
    def __init__(self, *args, **kwargs):
        super(new_install, self).__init__(*args, **kwargs)
        atexit.register(_post_install)

setuptools.setup(
    cmdclass={'install': new_install},
    ...
)
Note: The solution below only works when installing a source distribution zip or tarball, or installing in editable mode from a source tree. It will not work when installing from a binary wheel (.whl)
A solution could be to include a post_setup.py in setup.py's directory. post_setup.py will contain a function which does the post-install and setup.py will only import and launch it at the appropriate time.
In setup.py:
from distutils.core import setup
from distutils.command.install_data import install_data

try:
    from post_setup import main as post_install
except ImportError:
    post_install = lambda: None

class my_install(install_data):
    def run(self):
        install_data.run(self)
        post_install()

if __name__ == '__main__':
    setup(
        ...
        cmdclass={'install_data': my_install},
        ...
    )
In post_setup.py:
def main():
    """Do your post-install work here"""
    pass

if __name__ == '__main__':
    main()
Since setup.py is normally launched from its own directory, it will be able to import post_setup.py; otherwise it falls back to an empty function.
In post_setup.py, the if __name__ == '__main__': guard lets you run the post-install step manually from the command line.
Combining the answers from @Apalala, @Zulu and @mertyildiran, this worked for me in a Python 3.5 environment:
import atexit
import os
import sys

from setuptools import setup
from setuptools.command.install import install

class CustomInstall(install):
    def run(self):
        def _post_install():
            def find_module_path():
                # my_name is the installed package's directory name
                # (define it near the top of setup.py)
                for p in sys.path:
                    if os.path.isdir(p) and my_name in os.listdir(p):
                        return os.path.join(p, my_name)
            install_path = find_module_path()
            # Add your post install code here

        atexit.register(_post_install)
        install.run(self)

setup(
    cmdclass={'install': CustomInstall},
    ...
)
This also gives you access to the installation path of the package in install_path, for doing some shell work on it.
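For instance, install_path could feed a shell step from inside _post_install() (a hypothetical example, not part of the original answer):
import subprocess

def _fix_permissions(install_path):
    # hypothetical shell work on the installed package tree
    subprocess.check_call(['chmod', '-R', 'go+rX', install_path])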
I think the easiest way to perform the post-install, and keep the requirements, is to decorate the call to setup(...):
from setuptools import setup

def _post_install(setup):
    def _post_actions():
        do_things()
    _post_actions()
    return setup

setup = _post_install(
    setup(
        name='NAME',
        install_requires=['...'],
    )
)
This runs setup() as part of the assignment. Once the requirements have been installed, it runs the _post_install() function, which in turn runs the inner _post_actions() function.
If using atexit, there is no need to create a new cmdclass. You can simply register your atexit handler right before the setup() call. It does the same thing.
Also, if you need dependencies to be installed first, this does not work with pip install, since your atexit handler will be called before pip moves the packages into place.
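A minimal sketch of that atexit-before-setup() variant (the package name is a placeholder):
import atexit
from setuptools import setup

def _post_install():
    print('POST INSTALL')

# registering before setup() is enough; no custom cmdclass is required,
# but note the pip caveat above about when the handler actually runs
atexit.register(_post_install)

setup(
    name='example-pkg',  # placeholder
)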
I wasn't able to solve my problem with any of the presented recommendations, so here is what helped me.
You can call the function you want to run after installation right after setup() in setup.py, like this:
from setuptools import setup

def _post_install():
    <your code>

setup(...)
_post_install()

Setup.py, setuptools, cmdclass - Custom commands not working

I am trying to create a directory upon a package installation. The function to create the directory, by itself, successfully creates it. Additionally, when I run "python3.7 setup.py install", the directory is created.
Why does this not work when using pip though? I don't see any errors. When I added print statements, I do not see them.
I have chosen to use setuptools' 'bdist_egg' function instead of the 'install' function for reasons found here:
Running custom setuptools build during install
from sys import platform
from setuptools import setup
from os import mkdir, chmod, path
from setuptools.command.bdist_egg import bdist_egg as _bdist_egg

class OverrideInstall(_bdist_egg):
    def run(self):
        _bdist_egg.run(self)
        # create log directory
        log = "/var/log/FOO"
        mode = 0o777
        if not path.exists(log):
            mkdir(log)
            chmod(log, mode)

setup(
    name='cox-nams',
    version='FOO',
    description='FOO',
    # <-- output omitted for brevity / security>
    cmdclass={"bdist_egg": OverrideInstall},
)
Apparently not supported with pip install.

Installing data_files in setup.py with pip install -e

I'm trying to provide a bash completion script for my CLI tool that is written in Python. According to the Python Packaging Authority, data_files in setup.py is exactly what I need:
Although configuring package_data is sufficient for most needs, in some cases you may need to place data files outside of your packages. The data_files directive allows you to do that. It is mostly useful if you need to install files which are used by other programs, which may be unaware of Python packages.
So I added the completion file like this:
data_files=[
    ('/usr/share/bash-completion/completions', ['completion/dotenv']),
],
and try to test it with:
pip install -e .
in my virtual environment. However, the completion script does not get installed. Did I forget something, or is pip broken? The full project can be found here
I had the same issue and I have implemented a workaround.
It seems to me that python setup.py develop (or pip install -e .) does not run the same functions as python setup.py install.
In fact, I have noticed by looking at the source code that python setup.py install runs build_py:
https://github.com/python/cpython/blob/master/Lib/distutils/command/build_py.py#L134
https://github.com/pypa/setuptools/blob/master/setuptools/command/build_py.py
After some digging I opted to override the develop command as follows. The following code is Python 3.6:
""" SetupTool Entry Point """
import sys
from pathlib import Path
from shutil import copy2
from setuptools import find_packages, setup
from setuptools.command.develop import develop
# create os_data_files that will be used by the default install command
os_data_files = [
(
f"{sys.prefix}/config", # providing absolute path, sys.prefix will be different in venv
[
"src/my_package/config/properties.env",
],
),
]
def build_package_data():
""" implement the necessary function for develop """
for dest_dir, filenames in os_data_files:
for filename in filenames:
print(
"CUSTOM SETUP.PY (build_package_data): copy %s to %s"
% (filename, dest_dir)
)
copy2(filename, dest_dir)
def make_dirstruct():
""" Set the the logging path """
for subdir in ["config"]:
print("CUSTOM SETUP.PY (make_dirstruct): creating %s" % subdir)
(Path(BASE_DIR) / subdir).mkdir(parents=True, exist_ok=True)
class CustomDevelopCommand(develop):
""" Customized setuptools install command """
def run(self):
develop.run(self)
make_dirstruct()
build_package_data()
# provide the relevant information for stackoverflow
setup(
package_dir={"": "src"},
packages=find_packages("src"),
data_files=os_data_files,
cmdclass={"develop": CustomDevelopCommand},
)

How to install data_files of python package into home directory

Here's my setup.py
setup(
    name='shipane_sdk',
    version='1.0.0.a5',
    # ...
    data_files=[(os.path.join(os.path.expanduser('~'), '.shipane_sdk', 'config'), ['config/scheduler-example.ini'])],
    # ...
)
Packing & Uploading commands:
python setup.py sdist
python setup.py bdist_wheel --universal
twine upload dist/*
Installing command:
pip install shipane_sdk
But, it doesn't install the config/scheduler-example.ini under ~/.shipane_sdk
The pip documentation says:
setuptools allows absolute “data_files” paths, and pip honors them as absolute, when installing from sdist. This is not true when installing from wheel distributions. Wheels don’t support absolute paths, and they end up being installed relative to “site-packages”. For discussion see wheel Issue #92.
Do you know how to do installing from sdist?
There are multiple solutions to this problem, and it is confusing how inconsistently the packaging tools behave. Some time ago I found that the following workaround worked best for me with sdist (note that it doesn't work with wheels!):
Instead of using data_files, attach the files to your package using MANIFEST.in, which in your case could look like this:
include config/scheduler-example.ini
Copy the files "manually" to the chosen location using this snippet in setup.py:
if 'install' in sys.argv:
    from pkg_resources import Requirement, resource_filename
    import os
    import shutil

    # retrieve the temporary path where the package has been extracted to for installation
    conf_path_temp = resource_filename(Requirement.parse(APP_NAME), "conf")

    # if the config directory tree doesn't exist, create it
    if not os.path.exists(CONFIG_PATH):
        os.makedirs(CONFIG_PATH)

    # copy every file from given location to the specified ``CONFIG_PATH``
    for file_name in os.listdir(conf_path_temp):
        file_path_full = os.path.join(conf_path_temp, file_name)
        if os.path.isfile(file_path_full):
            shutil.copy(file_path_full, CONFIG_PATH)
In my case "conf" was the subdirectory in the package that contained my data files and they were supposed to be installed into CONFIG_PATH which was something like /etc/APP_NAME
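For reference, the snippet assumes APP_NAME and CONFIG_PATH are defined earlier in setup.py, roughly like this (the values are placeholders):
APP_NAME = 'myapp'                 # the distribution name used in Requirement.parse()
CONFIG_PATH = '/etc/' + APP_NAME   # where the data files should end up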

How to obtain arguments passed to setup.py from pip with '--install-option'?

I am using pip 1.4.1, attempting to install a package from a local path, for example:
pip install /path/to/my/local/package
This does what I want, which is more or less the equivalent of running python /path/to/my/local/package/setup.py install, but I would like to pass some additional options/arguments to my package's setup.py install.
I understand from the pip documentation that this is possible with the --install-option option, for example:
pip install --install-option="--some-option" /path/to/my/local/package
This post from the python-virtualenv Google Group suggests this is possible.
What I do not understand is how to obtain the passed-in "--some-option" from within setup.py. I tried looking at sys.argv, but no matter what I put for "--install-option=", sys.argv is always this:
['-c', 'egg_info', '--egg-base', 'pip-egg-info']
How can I get the values of things passed in as "--install-option" from pip install?
You need to extend the install command with a custom command of your own. In the run method you can expose the value of the option to setup.py (in my example I use a global variable).
from setuptools.command.install import install

class InstallCommand(install):
    user_options = install.user_options + [
        ('someopt', None, None),  # a 'flag' option
        # ('someval=', None, None)  # an option that takes a value
    ]

    def initialize_options(self):
        install.initialize_options(self)
        self.someopt = None
        # self.someval = None

    def finalize_options(self):
        # print("value of someopt is", self.someopt)
        install.finalize_options(self)

    def run(self):
        global someopt
        someopt = self.someopt  # will be 1 or None
        install.run(self)
Register the custom install command with the setup function.
setup(
    cmdclass={
        'install': InstallCommand,
    },
    ...
)
It seems that the order of your arguments is off
pip install /path/to/my/local/package --install-option="--someopt"
For consistency, you can add an option to both setup.py install and setup.py develop (aka pip install -e), building off Ronen Botzer's answer:
from setuptools import setup
from setuptools.command.install import install
from setuptools.command.develop import develop

class CommandMixin(object):
    user_options = [
        ('someopt', None, 'a flag option'),
        ('someval=', None, 'an option that takes a value')
    ]

    def initialize_options(self):
        super().initialize_options()
        # Initialize options
        self.someopt = None
        self.someval = 0

    def finalize_options(self):
        # Validate options (note: values passed on the command line arrive as
        # strings, so convert self.someval before comparing if you rely on it)
        if self.someval < 0:
            raise ValueError("Illegal someval!")
        super().finalize_options()

    def run(self):
        # Use options
        global someopt
        someopt = self.someopt  # will be 1 or None
        super().run()

class InstallCommand(CommandMixin, install):
    user_options = getattr(install, 'user_options', []) + CommandMixin.user_options

class DevelopCommand(CommandMixin, develop):
    user_options = getattr(develop, 'user_options', []) + CommandMixin.user_options

setup(
    ...,
    cmdclass={
        'install': InstallCommand,
        'develop': DevelopCommand,
    }
)
Then you can pass options to pip like:
pip install --install-option="--someval=1" --install-option="--someopt" .
Or in develop mode:
pip install --install-option="--someval=1" -e .
This works well and is also documented.
from setuptools.command.install import install

class InstallCommand(install):
    user_options = install.user_options + [
        ('engine=', None, '<description for this custom option>'),
    ]

    def initialize_options(self):
        install.initialize_options(self)
        self.engine = None

    def finalize_options(self):
        print("value of engine is", self.engine)
        install.finalize_options(self)

    def run(self):
        print(self.engine)
        install.run(self)

setup(
    ...
    cmdclass={'install': InstallCommand}
    ...
)
A common mistake is to pass setup options to pip the way you would pass them to setup.py directly. Pass options through pip like this:
pip install . --install-option="--engine=rabbitmq"
But this way is wrong:
pip install . --install-option="--engine rabbitmq"
The missing equals sign causes the well-known error:
error: option --engine rabbitmq not recognized
I was having this problem installing pyside.
I needed to specify the --qmake option.
This is the form you need:
pip install --install-option="--qmake=/usr/lib64/qt4/bin/qmake" PySide
On top of this great answer.
One more thing to note is that --install-option doesn't work with wheels:
Since version 7.0 pip supports controlling the command line options given to setup.py via requirements files. This disables the use of wheels (cached or otherwise) for that package, as setup.py does not exist for wheels.
However, when you build the wheel with setup.py yourself, you can use
python setup.py bdist_wheel install -your-options
to customize the install phase, and this will affect the .dist-info of the wheel package.
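For example, combined with the InstallCommand defined above, the wheel build could be run as (a sketch, assuming the --engine option from that class):
python setup.py bdist_wheel install --engine=rabbitmq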
