I'm using a 64-bit Windows machine with 64-bit Python 3. I need to build an installable package for a 32-bit Windows machine and stumbled upon the cross-compile option of the bdist commands: https://docs.python.org/3/distutils/builtdist.html
I'm using a setup.py like this:
from ez_setup import use_setuptools
use_setuptools()

from setuptools import setup, find_packages

setup(name='mypackage',
      version='1.0',
      description='Some Description',
      install_requires=['requests'],
      package_dir={'': 'src'},
      packages=[''],
      entry_points={'console_scripts': ['somescript = foobar:main']},
      )
And I build the installers like so:
python setup.py build --plat-name=win32 bdist_wininst --user-access-control auto
python setup.py build --plat-name=win-amd64 bdist_wininst --user-access-control auto
In both cases I get the correct executable format for the specified architecture, but the defined console_script somescript is not executable after installation.
The Python documentation says that I need to cross-compile the whole Python package for Windows, but I'm uncertain whether this is even necessary, because the installer was built for the right architecture and I got no error message during the build process.
Is there something wrong with the command? Do I really need to cross-compile, or is it sufficient to have a second 32-bit installation of Python?
As I found out, this is a reported bug: https://github.com/pypa/setuptools/issues/253
setuptools only checks the OS architecture and ignores the plat-name string when installing scripts.
Workaround (until this issue is closed): build the wininst on the target architecture, as sketched below.
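For example, a sketch assuming a 32-bit Python is installed side by side (the C:\Python37-32 path is an assumption; adjust it to your installation) — run the build with the 32-bit interpreter so the script launchers match the target architecture:

rem Build the 32-bit installer with the 32-bit interpreter
C:\Python37-32\python.exe setup.py bdist_wininst --user-access-control auto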
Related
We have a placeholder egg that contains no code and only exists for the sake of pulling down a list of dependent packages from our PyPi repository.
Most of these dependent packages are platform-agnostic, however some are only used on Win32 platforms.
Is it possible to somehow make the dependency platform-conditional, so that a given dependency in my install_requires list will only get pulled down when installing on Win32?
Alternatively: Is it possible to specify a list of optional dependencies, that will be installed if available, but will not cause easy_install to fail if they are not?
For sdist, egg, and wheel releases, from https://setuptools.readthedocs.io/en/latest/userguide/dependency_management.html#platform-specific-dependencies:
Sometimes a project might require a dependency to run on a specific platform. This could be a package that backports a module so that it can be used in older Python versions, or it could be a package that is required to run on a specific operating system. This allows a project to work on multiple different platforms without installing dependencies that are not required for the platform that is installing the project.
setup(
    name="Project",
    ...
    install_requires=[
        'enum34 ; python_version<"3.4"',
        'pywin32 >= 1.0 ; platform_system=="Windows"'
    ]
)
In setup.py:
from setuptools import setup
import sys

setup(
    name="...",
    install_requires=["This", "That"] + (
        ["WinOnly", "AnotherWinOnly"] if sys.platform.startswith("win") else []
    )
)
distutils.util.get_platform has more information than sys.platform if you need it:
>>> sys.platform
'linux2'
>>> distutils.util.get_platform()
'linux-i686'
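A minimal sketch combining this with install_requires (the dependency name linux-i686-helper is purely illustrative): use get_platform() when the decision depends on the architecture, not just the OS:

from distutils.util import get_platform
from setuptools import setup

# Illustrative: a dependency needed only on 32-bit Linux builds
extra = ["linux-i686-helper"] if get_platform() == "linux-i686" else []

setup(
    name="...",
    install_requires=["This", "That"] + extra,
)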
Use the extras_require distribution option to make 'win32 support' an optional feature:
setup(
    ...
    extras_require={
        'win32': 'pywin32'
    },
    ...
)
Then specify the win32 feature when installing on Windows:
easy_install mypackage[win32]
This will pull down the pywin32 package, which is listed as a dependency for the 'win32' feature of mypackage.
See here for more information about optional features.
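A downstream project can also request the extra declaratively. A sketch, assuming a setuptools recent enough to understand environment markers (the project name downstream is hypothetical):

setup(
    name="downstream",
    install_requires=[
        'mypackage[win32] ; platform_system=="Windows"',
    ],
)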
When the egg is built (using python setup.py bdist_egg), you can force setuptools/distribute to build a platform-specific egg.
from setuptools import setup
import os

# Monkey-patch Distribution so it always claims to be platform-specific.
from distutils.core import Distribution
Distribution.has_ext_modules = lambda *args, **kwargs: True

requirements = ['generic-foo', 'generic-bar']
if os.getenv('WINDOWS_BUILD'):
    requirements.extend(['a-windows-only-requirement'])

setup(
    name="...",
    install_requires=requirements
)
You can then run:
# Force a windows build
$ WINDOWS_BUILD=y python setup.py bdist_egg -p win32
# Do a linux build -- you may not need to specify -p if you're happy
# with your current linux architecture.
$ python setup.py bdist_egg -p linux-i686
I have a problem. I develop an application with Python, and I use some libraries like Flask, SQLAlchemy, etc.
The problem is that I have a pinned version of each library, and I want to deploy this Python application on another computer without internet access.
Can I create a package, or use setup.py and include the other packages by path?
I've already tried the code below, but the libraries aren't imported; I get:
ModuleNotFoundError: No module named 'cx_Oracle'
My code is:
from distutils.core import setup

setup(
    # Application name:
    name="MyApplication",

    # Version number (initial):
    version="0.1.0",

    # Packages
    packages=["App", "App/service"],
    include_package_data=True,

    install_requires=[
        "flask", "cx_Oracle", "pandas", "sqlalchemy"
    ],
)
install_requires is a setuptools setup.py keyword that should be used to specify what a project minimally needs to run correctly.
It only declares those libraries; it won't bundle them into your package, and on a machine without internet access pip has no way to download them.
Maybe you should try PyInstaller (https://www.pyinstaller.org) to make a ready-to-run file for the other computer.
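A minimal usage sketch, assuming the application's entry point is a script named app.py (the filename is an assumption):

pip install pyinstaller
pyinstaller --onefile app.py
# dist/app (dist\app.exe on Windows) bundles Python plus every imported library,
# so it can be copied to a machine without internet access.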
I'm using Cython to generate compiled .so files for a couple of python modules I have. As outlined in the Cython documentation, you can create a setup.py file as follows:
from distutils.core import setup
from Cython.Build import cythonize
setup(
    ext_modules=cythonize([
        'MyModule1.py',
        'MyModule2.py',
        'MyModule3.py'
    ])
)
and then build the modules using the command python3 setup.py build_ext --inplace.
This works fine; however, it creates binaries that match the architecture of the host machine (in my case x86_64). I would like to target a different architecture (armv7l) whose cross-compiler and environment I already have. Is it possible to do so with Python distutils?
Pass an alternative -march and related flags via extra_compile_args on each extension. Note that cythonize itself does not take compiler flags, so wrap the sources in Extension objects first:

from setuptools import Extension, setup
from Cython.Build import cythonize

sources = ['MyModule1.py',
           'MyModule2.py',
           'MyModule3.py']

extensions = [
    Extension(src.rsplit('.', 1)[0], [src],
              extra_compile_args=['-march=armv7-a'],
              library_dirs=[<arm v7 libraries>],
              include_dirs=[<arm v7 includes>])
    for src in sources
]

setup(ext_modules=cythonize(extensions))
This requires a working cross-compilation toolchain for armv7l.
A Docker container running an armv7l-based Linux would probably be easier to use, though, and would let you automate the ARM build: you can run the container build in a script and generate native packages for every architecture and OS that you want.
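A sketch of that Docker approach, assuming QEMU user-mode emulation is set up for foreign architectures and that the arm32v7/python:3.9 image suits your target (the image tag and paths are assumptions):

docker run --rm --platform linux/arm/v7 -v "$PWD":/src -w /src \
    arm32v7/python:3.9 \
    sh -c "pip install cython && python setup.py build_ext --inplace"
# The resulting .so files are native armv7l binaries.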
I am working on a python2 package in which the setup.py contains some custom install commands. These commands actually build some Rust code and output some .dylib files that are moved into the python package.
An important point is that the Rust code is outside the python package.
setuptools is supposed to detect automatically whether the Python package is pure Python or platform-specific (if it contains some C extensions, for instance).
In my case, when I run python setup.py bdist_wheel, the generated wheel is tagged as a pure python wheel: <package_name>-<version>-py2-none-any.whl.
This is problematic because I need to run this code on different platforms, and thus I need to generate one wheel per platform.
Is there a way, when building a wheel, to force the build to be platform-specific?
Here's the code that I usually look at, from uwsgi.
The basic approach is:
setup.py
# ...
try:
    from wheel.bdist_wheel import bdist_wheel as _bdist_wheel

    class bdist_wheel(_bdist_wheel):
        def finalize_options(self):
            _bdist_wheel.finalize_options(self)
            self.root_is_pure = False
except ImportError:
    bdist_wheel = None

setup(
    # ...
    cmdclass={'bdist_wheel': bdist_wheel},
)
The root_is_pure bit tells the wheel machinery to build a non-purelib wheel (i.e. not pyX-none-any). You can also get fancier by saying there are binary platform-specific components but no CPython ABI-specific components, as sketched below.
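A sketch of that fancier variant (my illustration, not from the original answer): override get_tag on the class above to keep the platform tag computed by bdist_wheel while claiming compatibility with any Python 3 and no particular ABI:

class bdist_wheel(_bdist_wheel):
    def finalize_options(self):
        _bdist_wheel.finalize_options(self)
        self.root_is_pure = False

    def get_tag(self):
        # Keep the platform tag, but drop the interpreter/ABI pinning.
        python, abi, plat = _bdist_wheel.get_tag(self)
        return 'py3', 'none', plat

This yields a wheel tagged like py3-none-linux_x86_64 instead of, say, cp36-cp36m-linux_x86_64.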
The modules setuptools, distutils, and wheel decide whether a Python distribution is pure by checking whether it has ext_modules.
If you build an external module on your own, you can still list it in ext_modules so that the build tools know it exists. The trick is to provide an empty list of sources so that setuptools and distutils will not try to build it. For example:
import setuptools
from setuptools import setup

setup(
    ...,
    ext_modules=[
        setuptools.Extension(
            name='your.external.module',
            sources=[]
        )
    ],
)
This solution worked better for me than patching the bdist_wheel command. The reason is that bdist_wheel calls the install command internally and that command checks again for the existence of ext_modules to decide between purelib or platlib install. If you don't list the external module, you end up with the lib installed in a purelib subfolder inside the wheel. That causes problems when using auditwheel repair, which complains about the extensions being installed in a purelib folder.
You can also spoof a specific platform name when building wheels by passing --plat-name:
python setup.py bdist_wheel --plat-name=manylinux1_x86_64
Neither the root_is_pure trick nor the empty ext_modules trick worked for me, but after MUCH searching, I finally found a working solution in 'pip setup.py bdist_wheel' no longer builds forced non-pure wheels.
Basically, you override the has_ext_modules function in the Distribution class and set distclass to point to the overriding class. At that point, setup.py will believe you have a binary distribution and will create a wheel with the specific version of Python, the ABI, and the current architecture. As suggested by https://stackoverflow.com/users/5316090/py-j:
from setuptools import setup
from setuptools.dist import Distribution

DISTNAME = "packagename"
DESCRIPTION = ""
MAINTAINER = ""
MAINTAINER_EMAIL = ""
URL = ""
LICENSE = ""
DOWNLOAD_URL = ""
VERSION = '1.2'
PYTHON_VERSION = (2, 7)


# Tested with wheel v0.29.0
class BinaryDistribution(Distribution):
    """Distribution which always forces a binary package with platform name"""
    def has_ext_modules(self):
        return True


setup(name=DISTNAME,
      description=DESCRIPTION,
      maintainer=MAINTAINER,
      maintainer_email=MAINTAINER_EMAIL,
      url=URL,
      license=LICENSE,
      download_url=DOWNLOAD_URL,
      version=VERSION,
      packages=["packagename"],
      # Include pre-compiled extension
      package_data={"packagename": ["_precompiled_extension.pyd"]},
      distclass=BinaryDistribution)
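Building then produces a fully tagged wheel; for example (the exact tag depends on your interpreter, ABI, and OS, so the filename below is only illustrative):

$ python setup.py bdist_wheel
# -> dist/packagename-1.2-cp27-cp27m-win_amd64.whl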
I find Anthony Sottile's answer great, but it didn't work for me.
My case is that I have to force the wheel to be created for x86_64 but for any Python 3, so making the root impure actually caused my wheel to be tagged py36-cp36 :(
A better way, IMO, is in general just to use sys.argv:
from setuptools import setup
import sys

sys.argv.extend(['--plat-name', 'x86_64'])

setup(name='example-wheel')
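A slightly more defensive variant of the same idea (my own guard, not part of the original answer): only force the platform tag when a wheel is actually being built and the caller has not already passed --plat-name:

from setuptools import setup
import sys

if 'bdist_wheel' in sys.argv and '--plat-name' not in sys.argv:
    sys.argv.extend(['--plat-name', 'x86_64'])

setup(name='example-wheel')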
I've compiled the Python wrapper of nanomsg and I want to create a Python installer for the package. The package can be created by running
python setup.py bdist --format=wininst
However, I would like nanomsg.dll/nanomsg.so to be included in the installer/package, but I haven't found any documentation regarding this.
As stated in the documentation here, one needs to add the following code to the setup.py script:
setup(
    name='nanomsg',
    version=__version__,
    packages=[str('nanomsg'), str('_nanomsg_ctypes'), str('nanomsg_wrappers')],
    data_files=[(
        'lib\\site-packages\\',
        ["C:\\Dev\\external\\nanomsg\\x86\\Release\\nanomsg.dll"]
    )],
)
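An alternative sketch using package_data instead of data_files (this assumes you first copy nanomsg.dll into the nanomsg package directory), which keeps the library inside the package rather than loose in site-packages:

setup(
    name='nanomsg',
    version=__version__,
    packages=['nanomsg', '_nanomsg_ctypes', 'nanomsg_wrappers'],
    package_data={'nanomsg': ['nanomsg.dll']},
)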