I have a little problem with setuptools/easy_install; maybe someone could give me a hint about what might be causing it:
To easily distribute one of my python webapps to servers I use setuptools' sdist command to build a tar.gz file which is copied to servers and locally installed using easy_install /path/to/file.tar.gz.
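For reference, the two steps look roughly like this (the archive name is just an example; the actual name depends on the version):
python setup.py sdist
easy_install dist/mywebapp-1.0.tar.gz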
So far this seems to work great. I have listed everything in the MANIFEST.in file like this:
global-include */*.py */*.mo */*.po */*.pot */*.css */*.js */*.png */*.jpg */*.ico */*.woff */*.gif */*.mako */*.cfg
And the resulting tar.gz file does indeed contain all of the files I need.
It gets weird as soon as easy_install tries to actually install it on the remote system. For some reason a directory called locales and a configuration file called migrate.cfg won't get installed. This is odd and I can't find any documentation about it, but I guess it's some automatic ignore feature of easy_install?
Is there something like that? And if so, how do I get easy_install to install the locales and migrate.cfg files?
Thanks!
For reference here is the content of my setup.py:
from setuptools import setup, find_packages

requires = ['flup', 'pyramid', 'WebError', 'wtforms', 'webhelpers', 'pil', 'apns',
            'pyramid_beaker', 'sqlalchemy', 'poster', 'boto', 'pypdf', 'sqlalchemy_migrate',
            'Babel']

version_number = execfile('pubserverng/version.py')

setup(
    author='Bastian',
    author_email='test@domain.com',
    url='http://domain.de/',
    name="mywebapp",
    install_requires=requires,
    version=__version__,
    packages=find_packages(),
    zip_safe=False,
    entry_points={
        'paste.app_factory': [
            'pubserverng=pubserverng:main'
        ]
    },
    namespace_packages=['pubserverng'],
    message_extractors={'pubserverng': [
        ('**.py', 'python', None),
        ('templates/**.html', 'mako', None),
        ('templates/**.mako', 'mako', None),
        ('static/**', 'ignore', None),
        ('migrations/**', 'ignore', None),
    ]},
)
I hate to answer my own question this quickly, but after some trial and error I found out the reason behind the missing files. In fact, there was more than one reason:
The SOURCES.txt file was older and included a full list of most files, which resulted in them being bundled correctly.
The MANIFEST.in file was correct, too, so all required files were actually in the .tar.gz archive as expected. The main problem was that a few files simply would not get installed on the target machine.
I had to add include_package_data=True to my setup.py file. After doing that, all files installed as expected.
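For anyone else hitting this, the relevant part of setup.py ends up looking roughly like this (a sketch; only the added line matters, the rest of the arguments stay as shown above):
from setuptools import setup, find_packages

setup(
    # ... all the other arguments from the setup.py above ...
    packages=find_packages(),
    include_package_data=True,  # pick up the non-Python files listed in MANIFEST.in
    zip_safe=False,
)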
I'll have to put some research into include_package_data to find out if this weird behavior is documented somewhere. setuptools is a real mess - especially the documentation.
The entire package distribution system in python leaves a lot to be desired. My issues were similar to yours and were eventually solved by using distutils (rather than setuptools) which honored the include_package_data = True setting as expected.
Using distutils allowed me to more or less keep the required file list in MANIFEST.in and avoid using the package_data setting, where I would have had to duplicate the source list; the drawback is that find_packages is not available. Below is my setup.py:
from distutils.core import setup

package = __import__('simplemenu')

setup(
    name='django-simplemenu',
    version=package.get_version(),
    url='http://github.com/danielsokolowski/django-simplemenu',
    license='BSD',
    description=package.__doc__.strip(),
    author='Alex Vasi <eee@someuser.com>, Justin Steward <justin+github@justinsteward.com>, Daniel Sokolowski <unemelpmis-ognajd@danols.com>',
    author_email='unemelpmis-ognajd@danols.com',
    include_package_data=True,  # this will read MANIFEST.in during install phase
    packages=[
        'simplemenu',
        'simplemenu.migrations',
        'simplemenu.templatetags',
    ],
    # below is no longer needed as we are utilizing MANIFEST.in with the
    # include_package_data setting
    # package_data={'simplemenu': ['locale/en/LC_MESSAGES/*',
    #                              'locale/ru/LC_MESSAGES/*']
    #               },
    scripts=[],
    requires=[],
)
And here is a MANIFEST.in file:
include LICENSE
include README.rst
recursive-include simplemenu *.py
recursive-include simplemenu/locale *
prune simplemenu/migrations
You need to use the data_files functionality of setup - your files aren't code, so easy_install won't install them by default (it doesn't know where they go).
The upside of this is that these files are added to MANIFEST automatically - you don't need to do any magic to get them there yourself. (In general if a MANIFEST automatically generated by setup.py isn't sufficient, adding them yourself isn't going to magically get them installed.)
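A minimal sketch of what that could look like for the layout in the question (the locale subdirectory and .mo file name here are placeholders, not your actual paths):
from setuptools import setup, find_packages

setup(
    name="mywebapp",
    packages=find_packages(),
    data_files=[
        # (target directory, [source files])
        ('pubserverng', ['pubserverng/migrate.cfg']),
        ('pubserverng/locales/de/LC_MESSAGES',
         ['pubserverng/locales/de/LC_MESSAGES/pubserverng.mo']),
    ],
)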
Related
I have created a simple C function (following the guide Create Numpy ufunc) and I'm now trying to distribute my package on PyPI. For it to work, it needs to compile the C file(s) into a .so file, which can then be imported from Python, and everything is good. To compile, it needs the header file numpy/ndarraytypes.h from NumPy.
When building and installing locally, it works fine, since I know where the header files are located. However, when distributing it, where can we find the NumPy folder? It is obvious from the logs that numpy gets installed before my package is built and installed, so I just need to include the correct NumPy folder.
from setuptools import setup
from setuptools import Extension

if __name__ == "__main__":
    setup(
        name="myPack",
        install_requires=[
            "numpy>=1.22.3",  # <- where is this one located after it gets installed?
        ],
        ext_modules=[
            Extension(
                'npufunc',
                sources=['npufunc.c'],
                include_dirs=[
                    # what to put here for it to find the "numpy/ndarraytypes.h" header file when installing?
                    "/usr/local/Cellar/numpy/1.22.3_1/lib/python3.9/site-packages/numpy/core/include/"  # <- this one I have locally; when added, installation works fine
                ]
            ),
        ]
    )
You can query numpy for the include directory from python via
import numpy
numpy.get_include()
This should return a string (/usr/lib/python3.10/site-packages/numpy/core/include on my system) which you can add to the include_dirs. See here for the docs.
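Applied to the setup.py from the question, that would look roughly like this (a sketch; note it only works if numpy is already importable at the time setup.py runs, which is what the build-dependency discussion below addresses):
import numpy
from setuptools import setup, Extension

setup(
    name="myPack",
    install_requires=["numpy>=1.22.3"],
    ext_modules=[
        Extension(
            'npufunc',
            sources=['npufunc.c'],
            # resolve the numpy header directory at build time instead of hard-coding a path
            include_dirs=[numpy.get_include()],
        ),
    ],
)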
As for your question: numpy is a build dependency for your project. The old setup.py method is kind of bad at handling those, which is why it has been superseded by the more modern pyproject.toml approach, which (among other things) makes such specifications possible. A fairly minimal pyproject.toml for your setting (setuptools + numpy) would look like this:
[build-system]
requires = ["setuptools", "wheel", "oldest-supported-numpy"]
build-backend = "setuptools.build_meta"
Given such a pyproject.toml you can build the extension using the python build module by calling
python -m build
which should produce a wheel with a compiled so inside it.
I'm trying to build an egg for my Python project using setuptools; however, whenever I build an egg, all of the contents are built with the first letter of each file/folder removed.
For example, my parent folder is called dp which gets renamed to p. I.e. when I unzip the egg file, I see a parent folder named p and another folder named GG-INFO (these should be named dp and EGG-INFO respectively). All of the other folders inside folder p are named correctly.
This is an issue because I reference functions in modules within that folder - e.g. from dp.module import function which doesn't work because it complains about not finding the folder dp (which is true since for some reason it's been renamed p).
My setup.py file looks like this:
from setuptools import setup, find_packages

setup(
    name="dp",
    version="1.0",
    author="XXXX",
    author_email="XXXX",
    description="Data pipeline for XXX algorithm.",
    long_description_content_type="text/markdown",
    url="XXXX",
    packages=find_packages(),
    package_data={'': ['*.sql', '*.json', '*.txt']},
    include_package_data=True,
    classifiers=[
        "Programming Language :: Python :: 3"
    ],
    python_requires='>=3.6',
    install_requires=['argparse', 'boto3', 'datetime', 'mmlspark', 'pandas', 'pyspark', 'pypandoc', 'scikit-learn',
                      'numpy', 'googleads', 'mlflow']
)
I've tried renaming the parent directory and the same thing happens. I'm running this via PyCharm (updated to the latest version) on Mac OS Mojave.
Would appreciate any ideas on how to fix this.
Update:
I used a different method to generate the egg which unblocked me, but the issue still remains with the initial method.
Steps to reproduce
Create a new project in Pycharm
Add a setup.py file to root, see above.
Tools -> Run setup.py task -> bdist.egg
Generates an egg. Rename the extension to file_name.zip, unzip the file, check the contents of the folder.
I found that the first letter of the folder names was always missing (I changed the names of the folders and it consistently removed the first letter).
Workaround
Instead of building an egg via PyCharm, I used the command python setup.py bdist_egg in the terminal, which created an egg without any issues.
I think this confirms it is a PyCharm bug(?). A colleague managed to intermittently reproduce this bug using PyCharm.
Give the wheel a try.
pip install wheel setuptools pip -U
pip wheel --no-deps --wheel-dir=build .
I'm trying to use the CFFI package in Python to create a Python interface for already existing C code.
I am able to compile a C library by following this blog post. Now I want to make this Python library available without any fancy updates to sys.path.
I found that creating a distribution through Python's setuptools setup() function might accomplish this, and I got it mostly working by creating a setup.py file like this:
import os
import sys

from setuptools import setup, find_packages

os.chdir(os.path.dirname(sys.argv[0]) or ".")

setup(
    name="SobelFilterTest",
    version="0.1",
    description="An example project using Python's CFFI",
    packages=find_packages(),
    install_requires=["cffi>=1.0.0"],
    setup_requires=["cffi>=1.0.0"],
    cffi_modules=[
        "./src/build_sobel.py:ffi",
        "./src/build_file_operations.py:ffi",
    ],
)
However, I run into this error:
build/temp.linux-x86_64-3.5/_sobel.c:492:19: fatal error: sobel.h: No such file or directory
From what I can tell, the problem is that the sobel.h file does not get uploaded into the build folder created by setuptools.setup(). I looked for suggestions on what to do, including using Extension() and writing a MANIFEST.in file, and both seem to add a relative path to the correct header files:
MANIFEST.in
setup.py
SobelFilterTest.egg-info/PKG-INFO
SobelFilterTest.egg-info/SOURCES.txt
SobelFilterTest.egg-info/dependency_links.txt
SobelFilterTest.egg-info/requires.txt
SobelFilterTest.egg-info/top_level.txt
src/file_operations.h
src/macros.h
src/sobel.h
But I still get the same error message. Is there a correct way to go about adding the header file to the build folder? Thanks!
It's actually not pip that is missing the .h file, but rather the compiler (like gcc). Therefore it's not about adding the missing file to setup, but rather making sure that cffi can find it. One way (as mentioned in the comments) is to make it available to the compiler through environment variables, but there is another way.
When setting the source with cffi you can add directories for the compiler like this:
from cffi import FFI
ffibuilder = FFI()
ffibuilder.set_source("<YOUR SOURCE HERE>", include_dirs=["./src"])
# ... Rest of your code
"""
I am trying to convert a Python package to a Linux binary (and eventually a Windows executable as well) using cx_Freeze. The package depends on multiple egg files, and as I understand it, cx_Freeze doesn't play nice with egg files, so I unzipped them. One of the egg files has a resource string file 'test.resource' in a package 'test.package'; to include this resource string file I used:
include_files = ['test/package/test.resource']
Now I see this file is being copied to the target directory along with the binary, but when I try to run the binary I get the error "IOError: [Errno 2] No such file or directory: 'test/package/test.resource'"
The code trying to read the file is doing this:
from pkg_resources import resource_string
strings = resource_string("test.package", "test.resource")
How can I add this resource file so that it's available to the generated binary?
Since the accepted answer is light on details, I thought I'd provide a solution that worked for me. It does not require that you unzip your egg files, but does not preclude it either.
The situation: I have an application that we package with esky using the cx_Freeze freezer, but this information should apply equally well to applications using cx_Freeze directly. The application makes use of the treq library, which includes the following line in treq/__init__.py:
__version__ = resource_string(__name__, "_version").strip()
I use pkg_resources to determine the filepath of the _version file and manually add it to my zip_includes. We make sure it gets placed into its expected location alongside the rest of the treq library files. In my setup.py file I have:
import pkg_resources

setup(
    name=...,
    version=...,
    install_requires=[
        ...,
        "treq==15.0.0",
        ...
    ],
    ...,
    options={
        ...,
        "bdist_esky": {
            ...,
            "freezer_options": {
                # This would be "zip_includes" in options if just using cx_Freeze without esky
                "zipIncludes": [
                    # Manually copy the treq _version resource file into our library.zip
                    (pkg_resources.resource_filename('treq', '_version'), 'treq/_version')
                ],
                "packages": [..., "treq", ...],
                ...,
            },
            ...,
        },
        ...,
    },
    ...,
)
Bonus: While one does not have to unzip libraries in order for this to work, it seems to be useful enough in general that you might want to always unzip them upon installation (a la python setup.py develop). It's a simple matter of adding the following lines to your application's setup.cfg:
[easy_install]
zip_ok = 0
Then, when you run python setup.py develop, your dependencies are installed into site-packages in their expanded directory form.
In case anyone needs the answer: I got it working by using 'zip_includes' instead of 'include_files'.
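For reference, a minimal sketch of what that looks like in a cx_Freeze setup.py (the resource path mirrors the question; the project name and entry script are placeholders):
from cx_Freeze import setup, Executable

setup(
    name="myapp",
    version="0.1",
    options={
        "build_exe": {
            # copy the resource into library.zip so pkg_resources can find it inside test/package
            "zip_includes": [
                ("test/package/test.resource", "test/package/test.resource"),
            ],
        },
    },
    executables=[Executable("main.py")],  # entry script is a placeholder
)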
OK, so I'm trying to use py2app to generate a distribution for my project. I'm still not sure I get the hang of it, though. My setup.py looks like this:
"""
This is a setup.py script generated by py2applet
Usage:
python setup.py py2app
"""
from setuptools import setup
import setuptools
PACKAGES = ['sqlalchemy.dialects.sqlite']
MODULES = ['sqlite3']
APP = ['tvb/interfaces/web/run.py']
OPTIONS = {'argv_emulation': True,
'packages': PACKAGES ,
'includes' : MODULES }
DATA_FILES = []
setup(
app=APP,
data_files=DATA_FILES,
packages = setuptools.find_packages(),
include_package_data=True,
options={'py2app': OPTIONS},
setup_requires=['py2app', "pyopengl", "cherrypy", "sqlalchemy", "simplejson",
"formencode", "genshi", "quantities","numpy", "scipy",
"numexpr", "nibabel", "cfflib", "mdp", "apscheduler",
"scikits.learn"]
)
So my first question would be: what should I include in MODULES for py2app here? Does py2app know to scan for the things in setup_requires and include them, or do I need to add entries for them in MODULES?
Another problem is that I'm getting a sqlalchemy.exc.ArgumentError: Could not determine dialect for 'sqlite' when trying to run my app. After lots of googling I only saw that for py2exe you need to include sqlalchemy.dialects.sqlite as a package, but that doesn't seem to work for me. Am I missing something here?
The last one is that I'm getting a malformed object (load command 3 cmdsize not a multiple of 8) message during python setup.py py2app. Is this normal?
Regards,
Bogdan
Well, it seems I got the whole thing wrong. I used:
'includes' : ['sqlalchemy.dialects.sqlite']
Instead of packages, and that seems to have done the trick.
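So the options end up looking something like this (a sketch; the rest of the setup() call stays the same):
PACKAGES = []
MODULES = ['sqlite3', 'sqlalchemy.dialects.sqlite']

OPTIONS = {'argv_emulation': True,
           'packages': PACKAGES,
           'includes': MODULES}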