After studying this page:
http://docs.python.org/distutils/builtdist.html
I am hoping to find some setup.py files to study so as to make my own (with the goal of making a fedora rpm file).
Could the s.o. community point me towards some good examples?
A complete walkthrough of writing setup.py scripts is here (with some examples).
If you'd like a real-world example, I could point you towards the setup.py scripts of a couple major projects. Django's is here, pyglet's is here. You can just browse the source of other projects for a file named setup.py for more examples.
These aren't simple examples; the tutorial link I gave has those. These are more complex, but also more practical.
You may find the HitchHiker's Guide to Packaging helpful, even though it is incomplete. I'd start with the Quick Start tutorial. Try also just browsing through Python packages on the Python Package Index. Just download the tarball, unpack it, and have a look at the setup.py file. Or even better, only bother looking through packages that list a public source code repository such as one hosted on GitHub or BitBucket. You're bound to run into one on the front page.
My final suggestion is to just go for it and try making one; don't be afraid to fail. I really didn't understand it until I started making them myself. It's trivial to create a new package on PyPI and just as easy to remove it. So, create a dummy package and play around.
Minimal example
from setuptools import setup, find_packages

setup(
    name="foo",
    version="1.0",
    packages=find_packages(),
)
More info in docs
READ THIS FIRST https://packaging.python.org/en/latest/current.html
Installation Tool Recommendations
Use pip to install Python packages from PyPI.
Use virtualenv or pyvenv to isolate application-specific dependencies from a shared Python installation.
Use pip wheel to create a cache of wheel distributions, for the purpose of speeding up subsequent installations.
If you’re looking for management of fully integrated cross-platform software stacks, consider buildout (primarily focused on the web development community), or Hashdist or conda (both primarily focused on the scientific community).
Packaging Tool Recommendations
Use setuptools to define projects and create Source Distributions.
Use the bdist_wheel setuptools extension available from the wheel project to create wheels. This is especially beneficial if your project contains binary extensions.
Use twine for uploading distributions to PyPI.
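The recommendations above can be sketched as a command sequence on a throwaway package (the package name "demopkg" and the /tmp location are hypothetical):

```shell
# Create a minimal throwaway package
mkdir -p /tmp/demopkg/demopkg
cd /tmp/demopkg
touch demopkg/__init__.py
printf 'from setuptools import setup, find_packages\nsetup(name="demopkg", version="0.1", packages=find_packages())\n' > setup.py

# Use setuptools to create a source distribution
python3 setup.py sdist

# A wheel would be built with "python3 setup.py bdist_wheel" (needs the
# wheel package) and uploaded with "twine upload dist/*" (needs a PyPI
# account); both are omitted here.
ls dist/
```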
This answer has aged, and indeed there is a rescue plan for the Python packaging world called
wheels way
I quote pythonwheels.com here:
What are wheels?
Wheels are the new standard of python distribution
and are intended to replace eggs. Support is offered in pip >= 1.4 and
setuptools >= 0.8.
Advantages of wheels
Faster installation for pure python and native C extension packages.
Avoids arbitrary code execution for installation. (Avoids setup.py)
Installation of a C extension does not require a compiler on Windows
or OS X.
Allows better caching for testing and continuous
integration.
Creates .pyc files as part of installation to ensure
they match the python interpreter used.
More consistent installs across platforms and machines.
The full story of correct python packaging (and about wheels) is covered at packaging.python.org
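As a sketch, the wheel workflow for a setup.py-based project looks like this (assuming pip >= 1.4 and setuptools >= 0.8, per the quote above; run from the project root):

```shell
pip install wheel              # provides the bdist_wheel command
python setup.py bdist_wheel    # writes dist/<name>-<version>-*.whl
pip install dist/*.whl         # installs from the wheel; no setup.py runs
```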
conda way
For scientific computing (this is also recommended on packaging.python.org, see above) I would consider using conda packaging, which can be seen as a third-party service built on top of PyPI and pip tools. It also works great for setting up your own version of binstar, so I would imagine it can do the trick for sophisticated custom enterprise package management.
Conda can be installed into a user folder (no superuser permissions) and works like magic with
conda install
and powerful virtual env expansion.
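A typical session might look like this (a sketch; the environment name "myenv" and the packages are arbitrary examples):

```shell
conda create --name myenv python=3.5 numpy   # isolated environment
conda activate myenv                         # "source activate myenv" on older conda
conda install flask                          # installs from conda channels, not PyPI
```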
eggs way
This option was related to python-distribute.org and is largely outdated (as is the site), so let me point you to one of the ready-to-use yet compact setup.py examples I like:
A very practical example/implementation of mixing scripts and single Python files into setup.py is given here.
An even better one from hyperopt.
This quote was taken from the guide on the state of setup.py and still applies:
setup.py gone!
distutils gone!
distribute gone!
pip and virtualenv here to stay!
eggs ... gone!
I add one more point (from me)
wheels!
I would recommend getting some understanding of the packaging ecosystem (from the guide pointed to by gotgenes) before attempting mindless copy-pasting.
Most examples out there on the Internet start with
from distutils.core import setup
but this, for example, does not support building an egg (python setup.py bdist_egg), as well as some other old features that are available with
from setuptools import setup
And the reason is that they are deprecated.
Now according to the guide
Warning
Please use the Distribute package rather than the Setuptools package
because there are problems in this package that can and will not be
fixed.
The deprecated setuptools is to be replaced by distutils2, which "will be part of the standard library in Python 3.3". I must say I liked setuptools and eggs and have not yet been completely convinced by the convenience of distutils2. It requires
pip install Distutils2
and to install
python -m distutils2.run install
PS
Packaging never was trivial (one learns this by trying to develop a new package), so I assume a lot of things have gone this way for a reason. I just hope this time it will be done correctly.
I recommend the setup.py of the Python Packaging User Guide's example project.
The Python Packaging User Guide "aims to be the authoritative resource on how to package, publish and install Python distributions using current tools".
As of December 2022 however, the sample project has switched from setup.py to pyproject.toml. For a new project you may want to consider doing the same.
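For comparison, a minimal pyproject.toml carrying roughly the same metadata as the minimal setup.py examples in other answers might look like this (a sketch; the project name is a placeholder):

```toml
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[project]
name = "foo"
version = "1.0"
```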
Here is the utility I wrote to generate a simple setup.py file (template) with useful comments and links. I hope it will be useful.
Installation
sudo pip install setup-py-cli
Usage
To generate a setup.py file, just type in the terminal:
setup-py
Now the setup.py file should appear in the current directory.
Generated setup.py
from distutils.core import setup
from setuptools import find_packages
import os

# User-friendly description from README.md
current_directory = os.path.dirname(os.path.abspath(__file__))
try:
    with open(os.path.join(current_directory, 'README.md'), encoding='utf-8') as f:
        long_description = f.read()
except Exception:
    long_description = ''

setup(
    # Name of the package
    name=<name of current directory>,
    # Packages to include into the distribution
    packages=find_packages('.'),
    # Start with a small number and increase it with every change you make
    # https://semver.org
    version='1.0.0',
    # Choose a license from here: https://help.github.com/articles/licensing-a-repository
    # For example: MIT
    license='',
    # Short description of your library
    description='',
    # Long description of your library
    long_description=long_description,
    long_description_content_type='text/markdown',
    # Your name
    author='',
    # Your email
    author_email='',
    # Either the link to your github or to your website
    url='',
    # Link from which the project can be downloaded
    download_url='',
    # List of keywords
    keywords=[],
    # List of packages to install with this one
    install_requires=[],
    # https://pypi.org/classifiers/
    classifiers=[]
)
Content of the generated setup.py:
automatically filled-in package name based on the name of the current directory.
some basic fields to fill in.
clarifying comments and links to useful resources.
automatically inserted description from README.md or an empty string if there is no README.md.
Here is the link to the repository. Feel free to enhance the solution.
Look at this complete example https://github.com/marcindulak/python-mycli of a small python package. It is based on packaging recommendations from https://packaging.python.org/en/latest/distributing.html, uses setup.py with distutils and in addition shows how to create RPM and deb packages.
The project's setup.py is included below (see the repo for the full source):
#!/usr/bin/env python

import os
import sys

from distutils.core import setup

name = "mycli"

rootdir = os.path.abspath(os.path.dirname(__file__))

# Restructured text project description read from file
long_description = open(os.path.join(rootdir, 'README.md')).read()

# Python 2.4 or later needed
if sys.version_info < (2, 4, 0, 'final', 0):
    raise SystemExit('Python 2.4 or later is required!')

# Build a list of all project modules
packages = []
for dirname, dirnames, filenames in os.walk(name):
    if '__init__.py' in filenames:
        packages.append(dirname.replace('/', '.'))

package_dir = {name: name}

# Data files used e.g. in tests
package_data = {name: [os.path.join(name, 'tests', 'prt.txt')]}

# The current version number - MSI accepts only version X.X.X
exec(open(os.path.join(name, 'version.py')).read())

# Scripts
scripts = []
for dirname, dirnames, filenames in os.walk('scripts'):
    for filename in filenames:
        if not filename.endswith('.bat'):
            scripts.append(os.path.join(dirname, filename))

# Provide bat executables in the tarball (always for Win)
if 'sdist' in sys.argv or os.name in ['ce', 'nt']:
    for s in scripts[:]:
        scripts.append(s + '.bat')

# Data_files (e.g. doc) needs (directory, files-in-this-directory) tuples
data_files = []
for dirname, dirnames, filenames in os.walk('doc'):
    fileslist = []
    for filename in filenames:
        fullname = os.path.join(dirname, filename)
        fileslist.append(fullname)
    data_files.append(('share/' + name + '/' + dirname, fileslist))

setup(name='python-' + name,
      version=version,  # PEP440
      description='mycli - shows some argparse features',
      long_description=long_description,
      url='https://github.com/marcindulak/python-mycli',
      author='Marcin Dulak',
      author_email='X.Y#Z.com',
      license='ASL',
      # https://pypi.python.org/pypi?%3Aaction=list_classifiers
      classifiers=[
          'Development Status :: 1 - Planning',
          'Environment :: Console',
          'License :: OSI Approved :: Apache Software License',
          'Natural Language :: English',
          'Operating System :: OS Independent',
          'Programming Language :: Python :: 2',
          'Programming Language :: Python :: 2.4',
          'Programming Language :: Python :: 2.5',
          'Programming Language :: Python :: 2.6',
          'Programming Language :: Python :: 2.7',
          'Programming Language :: Python :: 3',
          'Programming Language :: Python :: 3.2',
          'Programming Language :: Python :: 3.3',
          'Programming Language :: Python :: 3.4',
      ],
      keywords='argparse distutils cli unittest RPM spec deb',
      packages=packages,
      package_dir=package_dir,
      package_data=package_data,
      scripts=scripts,
      data_files=data_files,
      )
and an RPM spec file which more or less follows Fedora/EPEL packaging guidelines may look like:
# Failsafe backport of Python2-macros for RHEL <= 6
%{!?python_sitelib: %global python_sitelib %(%{__python} -c "from distutils.sysconfig import get_python_lib; print(get_python_lib())")}
%{!?python_sitearch: %global python_sitearch %(%{__python} -c "from distutils.sysconfig import get_python_lib; print(get_python_lib(1))")}
%{!?python_version: %global python_version %(%{__python} -c "import sys; sys.stdout.write(sys.version[:3])")}
%{!?__python2: %global __python2 %{__python}}
%{!?python2_sitelib: %global python2_sitelib %{python_sitelib}}
%{!?python2_sitearch: %global python2_sitearch %{python_sitearch}}
%{!?python2_version: %global python2_version %{python_version}}
%{!?python2_minor_version: %define python2_minor_version %(%{__python} -c "import sys ; print sys.version[2:3]")}
%global upstream_name mycli
Name: python-%{upstream_name}
Version: 0.0.1
Release: 1%{?dist}
Summary: A Python program that demonstrates usage of argparse
%{?el5:Group: Applications/Scientific}
License: ASL 2.0
URL: https://github.com/marcindulak/%{name}
Source0: https://github.com/marcindulak/%{name}/%{name}-%{version}.tar.gz
%{?el5:BuildRoot: %(mktemp -ud %{_tmppath}/%{name}-%{version}-%{release}-XXXXXX)}
BuildArch: noarch
%if 0%{?suse_version}
BuildRequires: python-devel
%else
BuildRequires: python2-devel
%endif
%description
A Python program that demonstrates usage of argparse.
%prep
%setup -qn %{name}-%{version}
%build
%{__python2} setup.py build
%install
%{?el5:rm -rf $RPM_BUILD_ROOT}
%{__python2} setup.py install --skip-build --prefix=%{_prefix} \
--optimize=1 --root $RPM_BUILD_ROOT
%check
export PYTHONPATH=`pwd`/build/lib
export PATH=`pwd`/build/scripts-%{python2_version}:${PATH}
%if 0%{python2_minor_version} >= 7
%{__python2} -m unittest discover -s %{upstream_name}/tests -p '*.py'
%endif
%clean
%{?el5:rm -rf $RPM_BUILD_ROOT}
%files
%doc LICENSE README.md
%{_bindir}/*
%{python2_sitelib}/%{upstream_name}
%{?!el5:%{python2_sitelib}/*.egg-info}
%changelog
* Wed Jan 14 2015 Marcin Dulak <X.Y#Z.com> - 0.0.1-1
- initial version
Here you will find the simplest possible example of using distutils and setup.py:
https://docs.python.org/2/distutils/introduction.html#distutils-simple-example
This assumes that all your code is in a single file and tells how to package a project containing a single module.
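A minimal sketch of such a single-module setup.py, assuming the code lives in a file named foo.py (the name is a placeholder):

```python
from distutils.core import setup

setup(
    name='foo',
    version='1.0',
    py_modules=['foo'],  # a single module, not a package
)
```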
Related
When I install one of my own Python applications from PyPI, it fails to run, citing:

File "/home/me/.local/lib/python3.9/site-packages/refrapt/refrapy.py", line 20, in <module>
    from classes import (
ModuleNotFoundError: No module named 'classes'
I have the following directory layout in my local area:
/refrapt
    setup.py
    /refrapt
        classes.py
        helpers.py
        refrapt.conf
        refrapt.py
        settings.py
        __init__.py
To build the project, I'm using setuptools, running the following command:
python setup.py sdist bdist_wheel
This builds and works happily enough, and I'm able to upload the resulting /dist.
I then install the project using pip3 install refrapt. When I run it using refrapt, I get the error ModuleNotFoundError above.
When I run the development code locally, it runs fine, but installed via pip, it fails. I assume it's a problem with my setup.py, but this is my first time and I haven't really a clue what is correct. I tried adding the __init__.py (which is empty) as suggested by some Python docs, but to no avail. The contents of setup.py are as follows:
import pathlib
from setuptools import setup, find_packages

HERE = pathlib.Path(__file__).parent
README = (HERE / "README.md").read_text()

setup(
    name='Refrapt',
    version='0.1.5',
    description='A tool to mirror Debian repositories for use as a local mirror.',
    python_requires='>=3.9',
    long_description=README,
    long_description_content_type="text/markdown",
    packages=find_packages(),
    install_requires=[
        'Click >= 7.1.2',
        'Colorama >= 0.4.4',
        'tqdm >= 4.60.0',
        'wget >= 3.2',
        'filelock >= 3.0.12'
    ],
    classifiers=[
        "Development Status :: 4 - Beta",
        "Operating System :: Microsoft :: Windows :: Windows 10",
        "Operating System :: POSIX :: Linux",
        "Programming Language :: Python :: Implementation",
        "Topic :: System :: Archiving :: Mirroring"
    ],
    keywords=['Mirror', 'Debian', 'Repository'],
    entry_points='''
        [console_scripts]
        refrapt=refrapt:refrapt
    ''',
)
If anyone could help, I'd greatly appreciate. I'm out of my depth on this one, and haven't been able to find an answer so far.
from classes import …
In Python 2 this was a relative import: the statement imports classes from the directory of the importing module.
But in Python 3 it was changed to an absolute import. The import fails because there is no global module or package named classes. You need to convert the import to absolute or relative form. Either
from refrapt.classes import …
or
from .classes import …
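The difference can be demonstrated with a throwaway package mirroring the question's layout ("mypkg" stands in for refrapt; all names here are hypothetical):

```python
import os
import sys
import tempfile

# Build a tiny package on disk: mypkg/__init__.py, mypkg/classes.py,
# and mypkg/main.py which imports from its sibling module.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "mypkg")
os.makedirs(pkg)

open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "classes.py"), "w") as f:
    f.write("VALUE = 42\n")
with open(os.path.join(pkg, "main.py"), "w") as f:
    # The absolute form of the import; the relative form
    # "from .classes import VALUE" would work here too.
    # A bare "from classes import VALUE" fails on Python 3.
    f.write("from mypkg.classes import VALUE\n")

sys.path.insert(0, root)
import mypkg.main as entry

print(entry.VALUE)  # 42
```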
Potentially I've found out the answer to my question, but it's not the answer I wanted.
I spun up a virtual environment, and installed an application that I've used before via pip. When I went to run the app, I got the ModuleNotFoundError: No module named 'app'.
I tried to run it manually via the .py file by using python .\.venv\Lib\site-packages\app\cli.py, which resulted in the same error.
Seems to be something about the environment setup in Windows VS Code that just doesn't operate the same as on a Linux machine.
I guess I'll just have to remove the "refrapt." prefix from import statements when developing locally, and then add it back when pushing to GitHub.
I made a Python library (my first one) and published it on PyPI and GitHub.
The library works very well, but the setup() doesn't.
When I install it by pip install, it downloads the appfly package but does not install the requirements: Flask, flask_cors, Flask-SocketIO and jsonmerge, so I need to install them myself.
If I install the dependencies myself, it works very well, but I think that's the wrong way to use a Python library, right?
Here is my setup.py file. Am I doing something wrong?
from setuptools import setup, find_packages
from appfly import __version__ as version

with open('README.md') as readme_file:
    readme = readme_file.read()

# with open('HISTORY.md') as history_file:
#     history = history_file.read()

requirements = [
    'Flask==1.0.2',
    'flask_cors==3.0.6',
    'Flask-SocketIO==3.0.2',
    'jsonmerge==1.5.2'
]

setup(
    author="Italo José G. de Oliveira",
    author_email='italo.i#live.com',
    classifiers=[
        'Natural Language :: English',
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.4',
        'Programming Language :: Python :: 3.5',
        'Programming Language :: Python :: 3.6',
        'Programming Language :: Python :: 3.7',
    ],
    description="This pkg encapsulate the base flask server configurations",
    install_requires=requirements,
    license="MIT license",
    long_description=readme,
    include_package_data=True,
    keywords='appfly',
    name='appfly',
    packages=find_packages(),
    url='https://github.com/italojs/appfly',
    version=version,
    zip_safe=False,
)
The reason for this error is that the setup.py imports from the package. This means that Python will try importing the library while processing the setup.py (i.e. before any of the dependencies get installed).
Since you are only importing the package to get the version information, this import can be replaced with a different method.
An easy way to do this is to include the version information directly in the setup.py, but the drawback with this is that the version is no longer single sourced.
Other methods involve a bit of work but allow the version information to continue to be single sourced. See https://packaging.python.org/guides/single-sourcing-package-version/ for recommendations. That page has a list of options, some of which may be better suited to your package setup than others. I personally prefer option 3:
Set the value to a __version__ global variable in a dedicated module
in your project (e.g. version.py), then have setup.py read and exec
the value into a variable.
...
Using exec:
version = {}
with open("...sample/version.py") as fp:
    exec(fp.read(), version)
# later on we use: version['__version__']
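To see the pattern end to end, here is a self-contained sketch that writes a throwaway version.py and execs it (the temp-file location is illustrative):

```python
import os
import tempfile

# A throwaway version.py standing in for sample/version.py
d = tempfile.mkdtemp()
path = os.path.join(d, "version.py")
with open(path, "w") as f:
    f.write('__version__ = "1.1.0"\n')

# The exec pattern from the guide: the version file is executed in an
# empty namespace, so setup.py never imports the package itself
version = {}
with open(path) as fp:
    exec(fp.read(), version)

print(version["__version__"])  # 1.1.0
```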
You can also define the version in the __init__.py of your package like:
__version__ = "1.1.0"
Then, instead of importing __version__ in your setup.py, you can read the __init__.py and extract the version.
Actually, this is the solution proposed in the official Python guides:
https://packaging.python.org/guides/single-sourcing-package-version/
Another option could be using versioneer package (pandas uses it).
versioneer uses Git tags to create the version value.
Steps
Step 0: Use git tags
Versioneer uses your Git tags.
Step 1: Install versioneer
pip install versioneer
Step 2: Execute versioneer
versioneer install
Step 3: Follow the versioneer instructions
Versioneer will ask you to make some changes in your setup.py and setup.cfg.
In the setup.py you'll have to add something like:
import versioneer

setup(
    version=versioneer.get_version(),
    cmdclass=versioneer.get_cmdclass()
)
And in setup.cfg:
[versioneer]
VCS = git
style = pep440
versionfile_source = your_project_name/_version.py
versionfile_build = your_project_name/_version.py
tag_prefix = v
parentdir_prefix =
Note: In the example I set "tag_prefix = v" because I like tagging like: v0.1.0, v1.0.0 and so on
After that, try:
python setup.py version
I am having some trouble understanding some of the basic paradigms around setuptools, and am hoping to get some help understanding the principles and options surrounding setuptools for Python.
Currently, I am working on a cross-platform implementation of the "Blender as a Python module" build cycle, such that bpy.pyd/bpy.so could be installed from pip.
I am successfully able to perform this build process from Windows. You can check out the repo here: https://github.com/TylerGubala/blenderpy
My primary concerns are as follows:
1) I have supplementary files to facilitate building for different system architectures; I want to upload these to pypi, not the built binaries
2) The supplementary files should not live inside the package when it is installed, they are only relevant during the setup/ build process
3) Currently, the way that the setup script works, is that it builds the modules, then sneakily copies the built files into the site-packages and executable directory for the given python environment. My concern here is: how, when the user runs py -m pip uninstall blenderpy will the package manager know to grab these files and remove them?
4) What is the correct way to package such a module as this?
I think my primary disconnect is coming from the fact that I would be using pypi as a build script delivery system, where the actual module that I intend to install is not present until midway through the setup.py execution.
So how could I install these utilities onto a user's machine, run them, and have my resultant built bpy.pyd be the source for my package?
Thanks in advance!
EDIT: I feel I should mention that I read through the following post and, while it seems related it seems to be talking more about 'extras' handlers and the internals of setuptools rather than talking about installing a compiled library that's controlled by python build scripts.
Python setuptools/distutils custom build for the `extra` package with Makefile
UPDATED 28th July, 2018 for multiple improvements that I found.
I ended up finding through a lot of research and a lot of trial-and-error what I needed to do to accomplish my goal.
In the end, the solution ended up looking almost exactly like what hoefling over at this question ended up doing:
Extending setuptools extension to use CMake in setup.py?
1) Extend the setuptools.Extension class with a class of my own, which does not contain entries for the sources or libs properties
2) Extend the setuptools.commands.build_ext.build_ext class with a class of my own, which has a custom method which performs my necessary build steps (git, svn, cmake, cmake --build)
3) Extend the distutils.command.install_data.install_data class (yuck, distutils... however there doesn't seem to be a setuptools equivalent) with a class of my own, to mark the built binary libraries during setuptools' record creation (installed-files.txt) such that
The libraries will be recorded and will be uninstalled with pip uninstall bpy
The command py setup.py bdist_wheel will work natively as well, and can be used to provide precompiled versions of your source code
4) Extend the setuptools.command.install_lib.install_lib class with a class of my own, which will ensure that the built libraries are moved from their resultant build folder into the folder that setuptools expects them in (on Windows it will put the .dll files in a bin/Release folder and not where setuptools expects it)
5) Extend the setuptools.command.install_scripts.install_scripts class with a class of my own such that the scripts files are copied to the correct directory (Blender expects the 2.79 or whatever directory to be in the scripts location)
6) After the build steps are performed, copy those files into a known directory that setuptools will copy into the site-packages directory of my environment. At this point the remaining setuptools and distutils classes can take over writing the installed-files.txt record and will be fully removable!
You can check out the up to date repository here: https://github.com/TylerGubala/blenderpy
Here is a snapshot of what I ended up with:
"""
Build blender into a python module
"""
from distutils.command.install_data import install_data
import os
import pathlib
from setuptools import find_packages, setup, Extension
from setuptools.command.build_ext import build_ext
from setuptools.command.install_lib import install_lib
from setuptools.command.install_scripts import install_scripts
import shutil
import struct
import sys
from typing import List
PYTHON_EXE_DIR = os.path.dirname(sys.executable)
BLENDER_GIT_REPO_URL = 'git://git.blender.org/blender.git'
BLENDERPY_DIR = os.path.join(pathlib.Path.home(), ".blenderpy")
BITS = struct.calcsize("P") * 8
LINUX_BLENDER_BUILD_DEPENDENCIES = ['build-essential']
LINUX_BLENDER_ADDTL_DEPENDENCIES = ['libfreetype6-dev', 'libglew-dev',
'libglu1-mesa-dev', 'libjpeg-dev',
'libpng12-dev', 'libsndfile1-dev',
'libx11-dev', 'libxi-dev',
# How to find current Python version best
# guess and install the right one?
'python3.5-dev',
# TODO: Update the above for a more
# maintainable way of getting correct
# Python version
'libalut-dev', 'libavcodec-dev',
'libavdevice-dev', 'libavformat-dev',
'libavutil-dev', 'libfftw3-dev',
'libjack-dev', 'libmp3lame-dev',
'libopenal-dev', 'libopenexr-dev',
'libopenjpeg-dev', 'libsdl1.2-dev',
'libswscale-dev', 'libtheora-dev',
'libtiff5-dev', 'libvorbis-dev',
'libx264-dev', 'libspnav-dev']
class CMakeExtension(Extension):
"""
An extension to run the cmake build
"""
def __init__(self, name, sources=[]):
super().__init__(name = name, sources = sources)
class InstallCMakeLibsData(install_data):
"""
Just a wrapper to get the install data into the egg-info
"""
def run(self):
"""
Outfiles are the libraries that were built using cmake
"""
# There seems to be no other way to do this; I tried listing the
# libraries during the execution of the InstallCMakeLibs.run() but
# setuptools never tracked them, seems like setuptools wants to
# track the libraries through package data more than anything...
# help would be appriciated
self.outfiles = self.distribution.data_files
class InstallCMakeLibs(install_lib):
"""
Get the libraries from the parent distribution, use those as the outfiles
Skip building anything; everything is already built, forward libraries to
the installation step
"""
def run(self):
"""
Copy libraries from the bin directory and place them as appropriate
"""
self.announce("Moving library files", level=3)
# We have already built the libraries in the previous build_ext step
self.skip_build = True
bin_dir = self.distribution.bin_dir
libs = [os.path.join(bin_dir, _lib) for _lib in
os.listdir(bin_dir) if
os.path.isfile(os.path.join(bin_dir, _lib)) and
os.path.splitext(_lib)[1] in [".dll", ".so"]
and not (_lib.startswith("python") or _lib.startswith("bpy"))]
for lib in libs:
shutil.move(lib, os.path.join(self.build_dir,
os.path.basename(lib)))
# Mark the libs for installation, adding them to
# distribution.data_files seems to ensure that setuptools' record
# writer appends them to installed-files.txt in the package's egg-info
#
# Also tried adding the libraries to the distribution.libraries list,
# but that never seemed to add them to the installed-files.txt in the
# egg-info, and the online recommendation seems to be adding libraries
# into eager_resources in the call to setup(), which I think puts them
# in data_files anyways.
#
# What is the best way?
self.distribution.data_files = [os.path.join(self.install_dir,
os.path.basename(lib))
for lib in libs]
# Must be forced to run after adding the libs to data_files
self.distribution.run_command("install_data")
super().run()
class InstallBlenderScripts(install_scripts):
"""
Install the scripts available from the "version folder" in the build dir
"""
def run(self):
"""
Copy the required directory to the build directory and super().run()
"""
self.announce("Moving scripts files", level=3)
self.skip_build = True
bin_dir = self.distribution.bin_dir
scripts_dirs = [os.path.join(bin_dir, _dir) for _dir in
os.listdir(bin_dir) if
os.path.isdir(os.path.join(bin_dir, _dir))]
for scripts_dir in scripts_dirs:
shutil.move(scripts_dir,
os.path.join(self.build_dir,
os.path.basename(scripts_dir)))
# Mark the scripts for installation, adding them to
# distribution.scripts seems to ensure that the setuptools' record
# writer appends them to installed-files.txt in the package's egg-info
self.distribution.scripts = scripts_dirs
super().run()
class BuildCMakeExt(build_ext):
"""
Builds using cmake instead of the python setuptools implicit build
"""
def run(self):
"""
Perform build_cmake before doing the 'normal' stuff
"""
for extension in self.extensions:
if extension.name == "bpy":
self.build_cmake(extension)
super().run()
def build_cmake(self, extension: Extension):
"""
The steps required to build the extension
"""
# We import the setup_requires modules here because if we import them
# at the top this script will always fail as they won't be present
from git import Repo
self.announce("Preparing the build environment", level=3)
blender_dir = os.path.join(BLENDERPY_DIR, "blender")
build_dir = pathlib.Path(self.build_temp)
extension_path = pathlib.Path(self.get_ext_fullpath(extension.name))
os.makedirs(blender_dir, exist_ok=True)
os.makedirs(build_dir, exist_ok=True)
os.makedirs(extension_path.parent.absolute(), exist_ok=True)
# Now that the necessary directories are created, ensure that OS
# specific steps are performed; a good example is checking on linux
# that the required build libraries are in place.
if sys.platform == "win32":  # Windows only steps
    import svn.remote
    import winreg

    vs_versions = []
    for version in [12, 14, 15]:
        try:
            winreg.OpenKey(winreg.HKEY_CLASSES_ROOT,
                           f"VisualStudio.DTE.{version}.0")
        except OSError:
            pass
        else:
            vs_versions.append(version)
    if not vs_versions:
        raise Exception("Windows users must have Visual Studio 2013 "
                        "or later installed")
    svn_lib = (f"win{'dows' if BITS == 32 else '64'}"
               f"{'_vc12' if max(vs_versions) == 12 else '_vc14'}")
    svn_url = (f"https://svn.blender.org/svnroot/bf-blender/trunk/lib/"
               f"{svn_lib}")
    svn_dir = os.path.join(BLENDERPY_DIR, "lib", svn_lib)
    os.makedirs(svn_dir, exist_ok=True)
    self.announce(f"Checking out svn libs from {svn_url}", level=3)
    try:
        blender_svn_repo = svn.remote.RemoteClient(svn_url)
        blender_svn_repo.checkout(svn_dir)
    except Exception as e:
        self.warn("Windows users must have the svn executable "
                  "available from the command line")
        self.warn("Please install TortoiseSVN with \"command line "
                  "client tools\" as described here")
        self.warn("https://stackoverflow.com/questions/1625406/using-"
                  "tortoisesvn-via-the-command-line")
        raise e
elif sys.platform == "linux":  # Linux only steps
    # TODO: Test linux environment, issue #1
    import apt

    apt_cache = apt.cache.Cache()
    apt_cache.update()
    # We need to re-open the apt cache after performing the update to use
    # the updated cache; otherwise we would still be using the old one. See:
    # https://stackoverflow.com/questions/17537390/how-to-install-a-package-using-the-python-apt-api
    apt_cache.open()
    for build_requirement in LINUX_BLENDER_BUILD_DEPENDENCIES:
        required_package = apt_cache[build_requirement]
        if not required_package.is_installed:
            required_package.mark_install()
            # Committing the changes to the cache could fail due to
            # insufficient privileges; maybe we could try-catch this
            # exception to elevate the privileges
            apt_cache.commit()
            self.announce(f"Build requirement {build_requirement} "
                          f"installed", level=3)
    self.announce("Installing linux additional Blender build "
                  "dependencies as necessary", level=3)
    try:
        automated_deps_install_script = os.path.join(BLENDERPY_DIR,
                                                     'blender/build_files/'
                                                     'build_environment/'
                                                     'install_deps.sh')
        self.spawn([automated_deps_install_script])
    except Exception:
        self.warn("Could not automatically install linux additional "
                  "Blender build dependencies, attempting manual "
                  "installation")
        for addtl_requirement in LINUX_BLENDER_ADDTL_DEPENDENCIES:
            required_package = apt_cache[addtl_requirement]
            if not required_package.is_installed:
                required_package.mark_install()
                # Committing the changes to the cache could fail due to
                # insufficient privileges; maybe we could try-catch this
                # exception to elevate the privileges
                apt_cache.commit()
                self.announce(f"Additional requirement "
                              f"{addtl_requirement} installed",
                              level=3)
        self.announce("Blender additional dependencies installed "
                      "manually", level=3)
    else:
        self.announce("Blender additional dependencies installed "
                      "automatically", level=3)
elif sys.platform == "darwin":  # macOS only steps
    # TODO: Test macOS environment, issue #2
    pass
# Perform relatively common build steps
self.announce(f"Cloning Blender source from {BLENDER_GIT_REPO_URL}",
              level=3)
try:
    blender_git_repo = Repo(blender_dir)
except Exception:
    Repo.clone_from(BLENDER_GIT_REPO_URL, blender_dir)
    blender_git_repo = Repo(blender_dir)
finally:
    blender_git_repo.heads.master.checkout()
    blender_git_repo.remotes.origin.pull()
self.announce("Updating Blender git submodules", level=3)
blender_git_repo.git.submodule('update', '--init', '--recursive')
for submodule in blender_git_repo.submodules:
    submodule_repo = submodule.module()
    submodule_repo.heads.master.checkout()
    submodule_repo.remotes.origin.pull()
self.announce("Configuring cmake project", level=3)
self.spawn(['cmake', '-H' + blender_dir, '-B' + self.build_temp,
            '-DWITH_PLAYER=OFF', '-DWITH_PYTHON_INSTALL=OFF',
            '-DWITH_PYTHON_MODULE=ON',
            f"-DCMAKE_GENERATOR_PLATFORM=x"
            f"{'86' if BITS == 32 else '64'}"])
self.announce("Building binaries", level=3)
self.spawn(["cmake", "--build", self.build_temp, "--target", "INSTALL",
            "--config", "Release"])
# Build finished, now copy the files into the copy directory
# The copy directory is the parent directory of the extension (.pyd)
self.announce("Moving Blender python module", level=3)
bin_dir = os.path.join(build_dir, 'bin', 'Release')
self.distribution.bin_dir = bin_dir
bpy_path = [os.path.join(bin_dir, _bpy) for _bpy in
            os.listdir(bin_dir) if
            os.path.isfile(os.path.join(bin_dir, _bpy)) and
            os.path.splitext(_bpy)[0].startswith('bpy') and
            os.path.splitext(_bpy)[1] in [".pyd", ".so"]][0]
shutil.move(bpy_path, extension_path)
# After build_ext is run, the following commands will run:
#
#   install_lib
#   install_scripts
#
# These commands are subclassed above to avoid pitfalls that
# setuptools tries to impose when installing these, as it usually
# wants to build those libs and scripts as well or move them to a
# different place. See comments above for additional information
setup(name='bpy',
      version='1.2.2b5',
      packages=find_packages(),
      ext_modules=[CMakeExtension(name="bpy")],
      description='Blender as a python module',
      long_description=open("./README.md", 'r').read(),
      long_description_content_type="text/markdown",
      keywords="Blender, 3D, Animation, Renderer, Rendering",
      classifiers=["Development Status :: 3 - Alpha",
                   "Environment :: Win32 (MS Windows)",
                   "Intended Audience :: Developers",
                   "License :: OSI Approved :: "
                   "GNU Lesser General Public License v3 (LGPLv3)",
                   "Natural Language :: English",
                   "Operating System :: Microsoft :: Windows :: Windows 10",
                   "Programming Language :: C",
                   "Programming Language :: C++",
                   "Programming Language :: Python",
                   "Programming Language :: Python :: 3.6",
                   "Programming Language :: Python :: Implementation :: CPython",
                   "Topic :: Artistic Software",
                   "Topic :: Education",
                   "Topic :: Multimedia",
                   "Topic :: Multimedia :: Graphics",
                   "Topic :: Multimedia :: Graphics :: 3D Modeling",
                   "Topic :: Multimedia :: Graphics :: 3D Rendering",
                   "Topic :: Games/Entertainment"],
      author='Tyler Gubala',
      author_email='gubalatyler#gmail.com',
      license='GPL-3.0',
      setup_requires=["cmake", "GitPython", 'svn;platform_system=="Windows"',
                      'apt;platform_system=="Linux"'],
      url="https://github.com/TylerGubala/blenderpy",
      cmdclass={
          'build_ext': BuildCMakeExt,
          'install_data': InstallCMakeLibsData,
          'install_lib': InstallCMakeLibs,
          'install_scripts': InstallBlenderScripts
      })
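The custom-command machinery the setup.py above relies on (a sourceless Extension plus a build_ext subclass that delegates the real work to CMake) is a common setuptools pattern. A minimal sketch of that pattern, with the method body purely illustrative:

```python
from setuptools import Extension
from setuptools.command.build_ext import build_ext


class CMakeExtension(Extension):
    """An Extension with no source files; CMake produces the binary."""

    def __init__(self, name):
        # No sources: setuptools won't try to compile anything itself.
        super().__init__(name, sources=[])


class BuildCMakeExt(build_ext):
    """build_ext subclass that would shell out to CMake instead of a compiler."""

    def build_extension(self, ext):
        # Illustrative placeholder: the real implementation configures and
        # builds the CMake project, then moves the resulting .pyd/.so into
        # the extension path.
        self.announce(f"Would build {ext.name} with CMake here", level=3)
```

Registering these via `ext_modules=[CMakeExtension(name="bpy")]` and `cmdclass={'build_ext': BuildCMakeExt}` is what routes the normal `pip install` / `setup.py build_ext` flow through the CMake steps.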
I am trying to install a Pyramid app -- let's say test_app -- inside a virtual environment, and it is getting installed as test-app (the pip freeze output shows test-app==0.0).
Because of this, I cannot import the package.
How should I fix this problem?
More info:
http://mail.python.org/pipermail/distutils-sig/2011-August/017935.html
I am using pip version 1.3.1
setup.py:
import os
from setuptools import setup, find_packages
here = os.path.abspath(os.path.dirname(__file__))
README = open(os.path.join(here, 'README.txt')).read()
CHANGES = open(os.path.join(here, 'CHANGES.txt')).read()
requires = [
    'pyramid',
    'pyramid_debugtoolbar',
    'waitress',
]

setup(name='test_app',
      version='0.0',
      description='test_app',
      long_description=README + '\n\n' + CHANGES,
      classifiers=[
          "Programming Language :: Python",
          "Framework :: Pyramid",
          "Topic :: Internet :: WWW/HTTP",
          "Topic :: Internet :: WWW/HTTP :: WSGI :: Application",
      ],
      author='',
      author_email='',
      url='',
      keywords='web pyramid pylons',
      packages=find_packages(),
      include_package_data=True,
      zip_safe=False,
      install_requires=requires,
      tests_require=requires,
      test_suite="test_app",
      entry_points="""\
      [paste.app_factory]
      main = test_app:main
      """,
      )
UPDATE:
To summarize the findings so far:
It is normal that pip reports the package name as test-app.
It is not normal that the egg link points to your virtual env root.
But the fact that the .egg-info file is also created inside your virtual env root suggests that develop is using that directory as the egg root.
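The test_app → test-app renaming is just pip's project-name normalization (runs of `-`, `_`, and `.` collapse to a single `-`, lowercased; later codified as PEP 503), and it never affects the import name. A quick sketch of the rule:

```python
import re


def normalize(name):
    """Project-name normalization as used by pip / PyPI (PEP 503)."""
    return re.sub(r"[-_.]+", "-", name).lower()


print(normalize("test_app"))  # → test-app
# The *import* name still comes from the package directory on disk,
# so `import test_app` is unaffected by this display name.
```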
Update 2021
I have now started using Poetry instead of pip for all my new Python projects. It works well for both normal projects and Jupyter notebooks. With its better developer experience for package management, all I'd have to do for the above example would be
poetry run xyz
where xyz is a script that I define within the spec file (akin to package.json for npm). I would be able to import my own package like any other package.
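The spec file Poetry reads is pyproject.toml; a hypothetical entry for the xyz script above might look like this (all names here are illustrative, not from the original project):

```toml
[tool.poetry]
name = "test-app"
version = "0.1.0"
description = "Example app packaged with Poetry"
authors = ["Your Name <you@example.com>"]

[tool.poetry.scripts]
# `poetry run xyz` resolves this entry point
xyz = "test_app.cli:main"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```

Because `poetry install` puts the project itself on the environment's path, `import test_app` works from scripts and notebooks alike.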
Update 2021
Use Poetry instead of pip.
Original answer:
So, finally after a lot of fiddling around, I've found the solution -- which is annoyingly simple.
I am using virtualenv and am installing the package in the development mode.
I was installing the package from the wrong location. It turns out that the location (directory) from which you run python setup.py develop is indeed the one that goes into the .egg-link file.
You should install the package into the virtual environment FROM the location where your code is.
So, for example, let's say your code resides in '/a/b' and your virtualenv env is in '/x/y/env', then you should install the package like this:
$ cd /a/b
$ /x/y/env/bin/python setup.py develop
This will install the package properly.
Hence, the '-' vs '_' issue is not actually the problem; instead, be careful about the location from which you install the package in develop mode.
OK, I want to create a pre-built Python package which contains a C module. That is, at the end of it I want to have a tarball which contains everything needed to use my module and is pip install-able, i.e. at the end I can do a:
pip install whatevertarballgetsproduced.tar.gz
and mylibrary will be available. It also needs to be virtual environment friendly.
My current directory structure is:
project/
+ setup.py
+ mylibrary/
+ __init__.py
+ mylibrary.py
+ _mylibrary.so
+ README
That is, the compiled C library is in _mylibrary.so. The C source from which this file is derived is NOT to be included in the tarball. I am also doing this on OSX (Lion). mylibrary.py simply contains Python wrappers to the C library code.
How do I achieve this? I thought about doing a python setup.py bdist, but this isn't really what I want (unless I'm missing something, the tarball produced by that isn't pip install-able).
For the sake of completion, my setup.py looks like:
from setuptools import setup

setup(
    name='mylibrary_py3mac',
    version='0.1.1',
    description='My library which is tied to OSX & Python 3',
    long_description=open('README').read(),
    packages=['mylibrary'],
    classifiers=[
        'Intended Audience :: Developers',
        'License :: OSI Approved :: BSD License',
        'Operating System :: MacOS :: MacOS X',
        'Programming Language :: Python :: 3',
    ],
)
Note, I don't plan on distributing this tarball publicly, it is for internal deployment purposes only (hence why I don't care about it being precompiled, or tied to OSX only).
Adam, have you tried setup.py bdist_egg?
It will produce an egg file that can be installed using easy_install (not with pip, unfortunately).
Also, it looks like your project is missing mylibrary/__init__.py.
Once you add the missing "init" file, I am curious whether setup.py will be smart enough to include the ".so" file in the resulting egg file. There's a chance you have to prod it somehow (I haven't had a reason to deal with ".so" Python extensions myself, so I don't know how).
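One way to prod setuptools into shipping the prebuilt .so is to declare it as package data; this is a sketch against the question's layout, not something I have tested on that exact project:

```python
from setuptools import setup

setup(
    name='mylibrary_py3mac',
    version='0.1.1',
    packages=['mylibrary'],
    # Ship the prebuilt extension alongside the pure-Python wrapper.
    package_data={'mylibrary': ['_mylibrary.so']},
    # Keep the package unzipped on install so the .so can be loaded from disk.
    zip_safe=False,
)
```

With that, `python setup.py sdist` produces a tarball under dist/ that pip can install; depending on your setuptools version you may also need a MANIFEST.in line like `include mylibrary/_mylibrary.so` for the file to land in the sdist.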
In the following example, I have gevent-1.0b2-py2.7-macosx-10.4-x86_64.egg that was built using its setup.py bdist_egg. I've placed it into pypi/ directory under the current directory. I know that it happens to have other dependencies that I've also strategically placed into the same pypi/ directory.
easy_install --allow-hosts=None \
             --find-links=pypi/ \
             --always-unzip \
             pypi/gevent-1.0b2-py2.7-macosx-10.4-x86_64.egg
Your case is simpler, because you don't seem to have any extra dependencies.
So you'd do something like:
easy_install --always-unzip \
             dist/mylibrary_py3mac-0.1.1-py2.7-macosx-10.4-x86_64.egg
Note that the dist/ directory is where built egg files go when setup.py bdist_egg is invoked.