PyPI upload stopped working and pip install fails

I submitted my first PyPI project last night and things are not working as expected (warning: long post ahead)...
I originally uploaded the project, cvrfparse, via the commandline by doing:
% python setup.py sdist upload
This created the initial project just fine. However, trying to install the project via pip failed thusly:
% sudo pip install cvrfparse
Password:
Downloading/unpacking cvrfparse
Running setup.py egg_info for package cvrfparse
Traceback (most recent call last):
File "<string>", line 16, in <module>
File "/private/tmp/pip-build-root/cvrfparse/setup.py", line 3, in <module>
from distribute_setup import use_setuptools
ImportError: No module named distribute_setup
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
File "<string>", line 16, in <module>
File "/private/tmp/pip-build-root/cvrfparse/setup.py", line 3, in <module>
from distribute_setup import use_setuptools
ImportError: No module named distribute_setup
----------------------------------------
Command python setup.py egg_info failed with error code 1 in /private/tmp/pip-build-root/cvrfparse
Storing complete log in /Users/m/Library/Logs/pip.log
According to http://pythonhosted.org/distribute/setuptools.html#using-setuptools-without-bundling-it it should just "work" if I have:
from distribute_setup import use_setuptools
use_setuptools()
at the top of setup.py. I then tried adding distribute_setup.py to a MANIFEST.in as per:
% cat MANIFEST.in
include distribute_setup.py
So after adding that file and bumping the version number in setup.py I then tried to upload the new package to PyPI:
% python setup.py sdist upload
running sdist
running egg_info
writing requirements to cvrfparse.egg-info/requires.txt
writing cvrfparse.egg-info/PKG-INFO
writing top-level names to cvrfparse.egg-info/top_level.txt
writing dependency_links to cvrfparse.egg-info/dependency_links.txt
writing entry points to cvrfparse.egg-info/entry_points.txt
reading manifest file 'cvrfparse.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
writing manifest file 'cvrfparse.egg-info/SOURCES.txt'
running check
creating cvrfparse-0.10
creating cvrfparse-0.10/cvrfparse
creating cvrfparse-0.10/cvrfparse.egg-info
creating cvrfparse-0.10/cvrfparse/sample-xml
creating cvrfparse-0.10/cvrfparse/schemata
creating cvrfparse-0.10/cvrfparse/schemata/common
creating cvrfparse-0.10/cvrfparse/schemata/common/1.1
creating cvrfparse-0.10/cvrfparse/schemata/cvrf
creating cvrfparse-0.10/cvrfparse/schemata/cvrf/1.1
creating cvrfparse-0.10/cvrfparse/schemata/dublincore
creating cvrfparse-0.10/cvrfparse/schemata/prod
creating cvrfparse-0.10/cvrfparse/schemata/prod/1.1
creating cvrfparse-0.10/cvrfparse/schemata/scap
creating cvrfparse-0.10/cvrfparse/schemata/vuln
creating cvrfparse-0.10/cvrfparse/schemata/vuln/1.1
creating cvrfparse-0.10/cvrfparse/schemata/w3.org
making hard links in cvrfparse-0.10...
hard linking MANIFEST.in -> cvrfparse-0.10
hard linking README -> cvrfparse-0.10
hard linking distribute_setup.py -> cvrfparse-0.10
hard linking setup.py -> cvrfparse-0.10
hard linking cvrfparse/__init__.py -> cvrfparse-0.10/cvrfparse
hard linking cvrfparse/cvrfparse.py -> cvrfparse-0.10/cvrfparse
hard linking cvrfparse.egg-info/PKG-INFO -> cvrfparse-0.10/cvrfparse.egg-info
hard linking cvrfparse.egg-info/SOURCES.txt -> cvrfparse-0.10/cvrfparse.egg-info
hard linking cvrfparse.egg-info/dependency_links.txt -> cvrfparse-0.10/cvrfparse.egg-info
hard linking cvrfparse.egg-info/entry_points.txt -> cvrfparse-0.10/cvrfparse.egg-info
hard linking cvrfparse.egg-info/requires.txt -> cvrfparse-0.10/cvrfparse.egg-info
hard linking cvrfparse.egg-info/top_level.txt -> cvrfparse-0.10/cvrfparse.egg-info
hard linking cvrfparse/sample-xml/CVRF-1.1-cisco-sa-20110525-rvs4000.xml -> cvrfparse-0.10/cvrfparse/sample-xml
hard linking cvrfparse/schemata/catalog.xml -> cvrfparse-0.10/cvrfparse/schemata
hard linking cvrfparse/schemata/common/1.1/common.xsd -> cvrfparse-0.10/cvrfparse/schemata/common/1.1
hard linking cvrfparse/schemata/cvrf/1.1/cvrf.xsd -> cvrfparse-0.10/cvrfparse/schemata/cvrf/1.1
hard linking cvrfparse/schemata/dublincore/dc.xsd -> cvrfparse-0.10/cvrfparse/schemata/dublincore
hard linking cvrfparse/schemata/prod/1.1/prod.xsd -> cvrfparse-0.10/cvrfparse/schemata/prod/1.1
hard linking cvrfparse/schemata/scap/cpe-language_2.2a.xsd -> cvrfparse-0.10/cvrfparse/schemata/scap
hard linking cvrfparse/schemata/scap/cvss-v2_0.9.xsd -> cvrfparse-0.10/cvrfparse/schemata/scap
hard linking cvrfparse/schemata/scap/scap-core_0.9.xsd -> cvrfparse-0.10/cvrfparse/schemata/scap
hard linking cvrfparse/schemata/vuln/1.1/vuln.xsd -> cvrfparse-0.10/cvrfparse/schemata/vuln/1.1
hard linking cvrfparse/schemata/w3.org/xml.xsd -> cvrfparse-0.10/cvrfparse/schemata/w3.org
Writing cvrfparse-0.10/setup.cfg
Creating tar archive
removing 'cvrfparse-0.10' (and everything under it)
running upload
Traceback (most recent call last):
File "setup.py", line 21, in <module>
['cvrfparse = cvrfparse.cvrfparse:main',]}
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/core.py", line 152, in setup
dist.run_commands()
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/dist.py", line 953, in run_commands
self.run_command(cmd)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/command/upload.py", line 60, in run
self.upload_file(command, pyversion, filename)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/command/upload.py", line 135, in upload_file
self.password)
TypeError: cannot concatenate 'str' and 'NoneType' objects
It appears as though something is None where it previously had a value?
I then tried to upload the package manually by creating a distribution via:
% python setup.py sdist
And uploading that file to PyPI via the web interface. pip install still reports the same problem with this new 0.10 package. Where am I going wrong?
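For what it's worth, the TypeError from the earlier upload usually means distutils found no stored password for the upload step, so self.password was None when concatenated into the auth header. A ~/.pypirc sketch with placeholder values:

```
# ~/.pypirc (all values below are placeholders)
[distutils]
index-servers =
    pypi

[pypi]
repository: https://pypi.python.org/pypi
username: your-username
password: your-password
```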

PyPI seems to have your package, and it installs just fine (for me, on Ubuntu 12.04.2 in a clean virtualenv). Your tool uses console_scripts, and your main requires an argument (progname), which load_entry_point() (setuptools) doesn't pass. Just assign a default value to that parameter. E.g.:
def main(progname=sys.argv[0]):
and you should be golden. Don't forget to bump your version number and re-upload to PyPI.
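Putting that together, a minimal sketch (the function body is a placeholder; the entry point string is the one from the question's setup.py):

```python
import sys

# console_scripts wrappers generated by setuptools call main() with no
# arguments, so every parameter needs a default value.
def main(progname=sys.argv[0]):
    return progname  # placeholder body for illustration

# The matching setup() keyword argument:
ENTRY_POINTS = {
    "console_scripts": ["cvrfparse = cvrfparse.cvrfparse:main"],
}
```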

sphinx python with boost-python library not found

I'm trying to use sphinx to generate documentation for a package that I wrote. Everything was working great until I added some code that used another package I wrote that wraps some boost-python modules.
Now when I run make html I get this (edited for clarity):
$ make html
sphinx-build -b html -d build/doctrees source build/html
Running Sphinx v1.3.1
loading pickled environment... done
building [mo]: targets for 0 po files that are out of date
building [html]: targets for 0 source files that are out of date
updating environment: 0 added, 1 changed, 0 removed
reading sources... [100%] my_project
/path/to/my_project/my_project/docs/source/my_project.rst:19: WARNING: autodoc: failed to import module u'my_project.my_module'; the following exception was raised:
Traceback (most recent call last):
File "/Users/harding/anaconda/lib/python2.7/site-packages/Sphinx-1.3.1-py2.7.egg/sphinx/ext/autodoc.py", line 385, in import_object
__import__(self.modname)
File "/path/to/my_project/__init__.py", line 12, in <module>
from . import module_that_uses_boost_python_classes
File "/path/to/module_that_uses_boost_python_classes.py", line 10, in <module>
import package_that_wraps_boost_python_classes
File "/path/to/package_that_wraps_boost_python_classes/__init__.py", line 21, in <module>
from native_boost_python_module import *
ImportError: dlopen(/path/to/native_boost_python_module.so, 2): Library not loaded: cpp_library.dylib
Referenced from: /path/to/native_boost_python_module.so
Reason: image not found
The problem is that when Sphinx tries to import my module, it fails to load the linked C++ library from my boost-python related package. That package is installed via pip and loads fine in all other situations.
I manually specified the full path to all the C++ libraries with install_name_tool -change, and then Sphinx works fine.
My question is this: how can I get Sphinx to import my boost-python package without manually editing the C++ libraries with install_name_tool?
I have DYLD_LIBRARY_PATH set to include the directory that contains all the C++ libraries.
I'm on macOS Sierra.
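Two notes that may help, though neither is tested against this exact setup: on macOS Sierra, System Integrity Protection strips DYLD_LIBRARY_PATH from the environment of protected processes, which may be why setting it has no effect; and Sphinx (1.3+) can mock away the native imports entirely so autodoc never tries to dlopen the .so. A conf.py sketch (the module name is a placeholder taken from the traceback):

```python
# docs/source/conf.py -- have autodoc replace these imports with mocks
# so the native boost-python .so is never loaded during the doc build.
# The module name below is a placeholder from the traceback above.
autodoc_mock_imports = [
    "package_that_wraps_boost_python_classes",
]
```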

How to distribute type hints to PyPI?

I've worked on adding Python 3.5 type hints to the responses library. But when I build a distribution, sdist or bdist_wheel, installing it doesn't install my .pyi file. I can see it being part of the distribution, but it doesn't go further than that.
You can see what I got in my repo here: https://github.com/gaqzi/responses/tree/feature/type-hints-file
I read PEP 484, which mentions that stub files should be distributable, but I can't seem to figure out how. :)
Is there a problem because responses doesn't create a package? It's just a single module file and that's why it doesn't get added correctly?
What I see when I build the package:
% python setup.py sdist
running sdist
running egg_info
writing requirements to responses.egg-info/requires.txt
writing top-level names to responses.egg-info/top_level.txt
writing responses.egg-info/PKG-INFO
writing dependency_links to responses.egg-info/dependency_links.txt
reading manifest file 'responses.egg-info/SOURCES.txt'
writing manifest file 'responses.egg-info/SOURCES.txt'
running check
warning: check: missing meta-data: if 'author' supplied, 'author_email' must be supplied too
creating responses-0.6.0
creating responses-0.6.0/responses.egg-info
making hard links in responses-0.6.0...
hard linking README.rst -> responses-0.6.0
hard linking responses.py -> responses-0.6.0
hard linking responses.pyi -> responses-0.6.0
hard linking setup.cfg -> responses-0.6.0
hard linking setup.py -> responses-0.6.0
hard linking responses.egg-info/PKG-INFO -> responses-0.6.0/responses.egg-info
hard linking responses.egg-info/SOURCES.txt -> responses-0.6.0/responses.egg-info
hard linking responses.egg-info/dependency_links.txt -> responses-0.6.0/responses.egg-info
hard linking responses.egg-info/not-zip-safe -> responses-0.6.0/responses.egg-info
hard linking responses.egg-info/requires.txt -> responses-0.6.0/responses.egg-info
hard linking responses.egg-info/top_level.txt -> responses-0.6.0/responses.egg-info
copying setup.cfg -> responses-0.6.0
Writing responses-0.6.0/setup.cfg
Creating tar archive
removing 'responses-0.6.0' (and everything under it)
After installing the package I get this:
% pip install dist/responses-0.6.0.tar.gz
[...snip...]
Installing collected packages: responses
Successfully installed responses-0.6.0
% pwd
/Users/ba/.virtualenvs/responses/lib/python3.5/site-packages
% ls responses*
responses.py
responses-0.6.0.dist-info:
DESCRIPTION.rst METADATA RECORD WHEEL metadata.json top_level.txt
According to the mypy docs, you should pass package_data={"my_package": ["py.typed", "foo.pyi"]} as an argument to setup in setup.py. Note that "foo.pyi" is the relative path from the root of the package to be distributed to the
stub file (docs).
I've created an example repo where you can test this out at https://github.com/SKalt/stub_distrib_demo.
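As a sketch, the setup() keywords the mypy docs describe look like this (package and stub names are placeholders, not the actual responses layout):

```python
# Keyword arguments for setuptools.setup(), following the mypy packaging
# docs. "my_package" and "foo.pyi" are placeholders for your own names.
SETUP_KWARGS = {
    "name": "my_package",
    "packages": ["my_package"],
    # py.typed marks the package as typed; stub paths are given relative
    # to the package directory.
    "package_data": {"my_package": ["py.typed", "foo.pyi"]},
    "zip_safe": False,  # type checkers cannot read stubs out of zipped eggs
}
```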

H5py installation via setup.py, undefined symbol: iso_c_binding_

I'm installing h5py according to the tutorial at http://docs.h5py.org/en/latest/build.html
The installation is successful. However, the test fails. When I run:
python setup.py test
I got this:
running test
running build_py
running build_ext
Summary of the h5py configuration
Path to HDF5: '/opt/cray/hdf5-parallel/1.8.13/cray/83/'
HDF5 Version: '1.8.13'
MPI Enabled: True
Rebuild Required: False
Executing cythonize()
Traceback (most recent call last):
File "setup.py", line 140, in <module>
cmdclass = CMDCLASS,
File "/python/2.7.9/lib/python2.7/distutils/core.py", line 151, in setup
dist.run_commands()
File "/python/2.7.9/lib/python2.7/distutils/dist.py", line 953, in run_commands
self.run_command(cmd)
File "/python/2.7.9/lib/python2.7/distutils/dist.py", line 972, in run_command
cmd_obj.run()
File "setup.py", line 68, in run
import h5py
File "/h5py-2.5.0/build/lib.linux-x86_64-2.7/h5py/__init__.py", line 13, in <module>
from . import _errors
ImportError: /opt/cray/lib64/libmpichf90_cray.so.3: undefined symbol: iso_c_binding_
It looks like Cython cannot find the shared library. How can I fix that? Thanks.
(Edited for parallel build)
I got this to work on a Cray XC30 (ARCHER: http://www.archer.ac.uk) using the following:
module swap PrgEnv-cray PrgEnv-gnu
module load cray-hdf5-parallel
export CRAYPE_LINK_TYPE=dynamic
export CC=cc
ARCHER has specific modules for the Python environment on the compute nodes that link to performant versions of numpy etc. (see: http://www.archer.ac.uk/documentation/user-guide/python.php), so I also loaded these (this may not apply on your Cray system; in ARCHER's case mpi4py is already included in the python-compute install):
module load python-compute
module load pc-numpy
Finally, I added the custom install location I will use for h5py to PYTHONPATH
export PYTHONPATH=/path/to/h5py/install/lib/python2.7/site-packages:$PYTHONPATH
Now I can build:
python setup.py configure --mpi
python setup.py install --prefix=/path/to/h5py/install
...lots of output...
Now, running the tests on the frontend node fails, but with the error message I would expect to see on a Cray XC if you try to launch MPI code on a login/service node (failed to initialize communication channel; the login/service nodes are not connected to the high-performance network, so they cannot run MPI code). This suggests to me that the test would probably work if it was run on the compute nodes.
> python setup.py test
running test
running build_py
running build_ext
Autodetected HDF5 1.8.13
********************************************************************************
Summary of the h5py configuration
Path to HDF5: '/opt/cray/hdf5-parallel/1.8.13/GNU/49'
HDF5 Version: '1.8.13'
MPI Enabled: True
Rebuild Required: False
********************************************************************************
Executing cythonize()
[Thu Oct 22 19:53:01 2015] [unknown] Fatal error in PMPI_Init_thread: Other MPI error, error stack:
MPIR_Init_thread(547):
MPID_Init(203).......: channel initialization failed
MPID_Init(579).......: PMI2 init failed: 1
Aborted
To test properly you would have to submit a job that launched a parallel Python script on the compute nodes using aprun. I do not think the built-in test framework will work easily, as it probably expects the MPI launcher to be called mpiexec (as on a standard cluster), so you may need to write your own tests. The other option would be to somehow coerce setup.py into using aprun instead.

PyPI & PyPI Test: Could not find any downloads that satisfy the requirement

I am trying to release software through PyPI, and while pip can search for the library, it is unable to download it.
I suspect it is an issue with the setup.py file:
doclines = __doc__.split("\n")

with open('requirements.txt') as f:
    required = f.read().splitlines()

setup(
    name='Directory_Caching',
    packages=find_packages(),
    version='1.0.6',
    description=doclines[0],
    long_description="\n".join(doclines[2:]),
    author='Benjamin Schollnick',
    author_email='benjamin#schollnick.net',
    license="MIT",
    maintainer='Benjamin Schollnick',
    maintainer_email='benjamin#schollnick.net',
    platforms=["Any"],
    url='https://github.com/bschollnick/Directory_Caching',
    download_url='https://github.com/bschollnick/Directory_Caching/tarball/1.05',
    #install_requires=required,
    #requires=required,
    keywords=['caching', 'files', 'directories', 'scandir', 'naturalsort'],
    classifiers=filter(None, classifiers.split("\n")),
)
PyPI Test is accepting the file fine via register, and the sdist upload is working fine.
-- Register
nerv:Directory_caching Benjamin$ python setup.py register -r pypitest
running register
running egg_info
deleting Directory_Caching.egg-info/requires.txt
writing Directory_Caching.egg-info/PKG-INFO
writing top-level names to Directory_Caching.egg-info/top_level.txt
writing dependency_links to Directory_Caching.egg-info/dependency_links.txt
reading manifest file 'Directory_Caching.egg-info/SOURCES.txt'
writing manifest file 'Directory_Caching.egg-info/SOURCES.txt'
running check
Registering Directory_Caching to https://testpypi.python.org/pypi
Server response (200): OK
nerv:Directory_caching Benjamin$ python setup.py sdist upload -r pypitest
running sdist
running egg_info
writing Directory_Caching.egg-info/PKG-INFO
writing top-level names to Directory_Caching.egg-info/top_level.txt
writing dependency_links to Directory_Caching.egg-info/dependency_links.txt
reading manifest file 'Directory_Caching.egg-info/SOURCES.txt'
writing manifest file 'Directory_Caching.egg-info/SOURCES.txt'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt
running check
creating Directory_Caching-1.0.503
creating Directory_Caching-1.0.503/Directory_Caching
creating Directory_Caching-1.0.503/Directory_Caching.egg-info
making hard links in Directory_Caching-1.0.503...
hard linking setup.cfg -> Directory_Caching-1.0.503
hard linking setup.py -> Directory_Caching-1.0.503
hard linking Directory_Caching/__init__.py -> Directory_Caching-1.0.503/Directory_Caching
hard linking Directory_Caching/directory_caching.py -> Directory_Caching-1.0.503/Directory_Caching
hard linking Directory_Caching.egg-info/PKG-INFO -> Directory_Caching-1.0.503/Directory_Caching.egg-info
hard linking Directory_Caching.egg-info/SOURCES.txt -> Directory_Caching-1.0.503/Directory_Caching.egg-info
hard linking Directory_Caching.egg-info/dependency_links.txt -> Directory_Caching-1.0.503/Directory_Caching.egg-info
hard linking Directory_Caching.egg-info/top_level.txt -> Directory_Caching-1.0.503/Directory_Caching.egg-info
copying setup.cfg -> Directory_Caching-1.0.503
Writing Directory_Caching-1.0.503/setup.cfg
Creating tar archive
removing 'Directory_Caching-1.0.503' (and everything under it)
running upload
Submitting dist/Directory_Caching-1.0.503.tar.gz to https://testpypi.python.org/pypi
Server response (200): OK
nerv:Directory_caching Benjamin$ python setup.py register -r pypitest
running register
running egg_info
writing Directory_Caching.egg-info/PKG-INFO
writing top-level names to Directory_Caching.egg-info/top_level.txt
writing dependency_links to Directory_Caching.egg-info/dependency_links.txt
reading manifest file 'Directory_Caching.egg-info/SOURCES.txt'
writing manifest file 'Directory_Caching.egg-info/SOURCES.txt'
running check
Registering Directory_Caching to https://testpypi.python.org/pypi
Server response (200): OK
upload
nerv:Directory_caching Benjamin$ python setup.py sdist upload -r pypitest
running sdist
running egg_info
writing Directory_Caching.egg-info/PKG-INFO
writing top-level names to Directory_Caching.egg-info/top_level.txt
writing dependency_links to Directory_Caching.egg-info/dependency_links.txt
reading manifest file 'Directory_Caching.egg-info/SOURCES.txt'
writing manifest file 'Directory_Caching.egg-info/SOURCES.txt'
warning: sdist: standard file not found: should have one of README, README.rst, README.txt
running check
creating Directory_Caching-1.0.504
creating Directory_Caching-1.0.504/Directory_Caching
creating Directory_Caching-1.0.504/Directory_Caching.egg-info
making hard links in Directory_Caching-1.0.504...
hard linking setup.cfg -> Directory_Caching-1.0.504
hard linking setup.py -> Directory_Caching-1.0.504
hard linking Directory_Caching/__init__.py -> Directory_Caching-1.0.504/Directory_Caching
hard linking Directory_Caching/directory_caching.py -> Directory_Caching-1.0.504/Directory_Caching
hard linking Directory_Caching.egg-info/PKG-INFO -> Directory_Caching-1.0.504/Directory_Caching.egg-info
hard linking Directory_Caching.egg-info/SOURCES.txt -> Directory_Caching-1.0.504/Directory_Caching.egg-info
hard linking Directory_Caching.egg-info/dependency_links.txt -> Directory_Caching-1.0.504/Directory_Caching.egg-info
hard linking Directory_Caching.egg-info/top_level.txt -> Directory_Caching-1.0.504/Directory_Caching.egg-info
copying setup.cfg -> Directory_Caching-1.0.504
Writing Directory_Caching-1.0.504/setup.cfg
Creating tar archive
removing 'Directory_Caching-1.0.504' (and everything under it)
running upload
Submitting dist/Directory_Caching-1.0.504.tar.gz to https://testpypi.python.org/pypi
Server response (200): OK
If I run pip in verbose mode, the following errors appear to be the problem:
Skipping link https://testpypi.python.org/pypi/Directory_Caching/1.0.504 (from https://testpypi.python.org/pypi/Directory_Caching); unknown archive format: .504
Skipping link https://testpypi.python.org/pypi/Directory_Caching/1.0.503 (from https://testpypi.python.org/pypi/Directory_Caching); unknown archive format: .503
Skipping link https://testpypi.python.org/pypi/Directory_Caching/1.0.502 (from https://testpypi.python.org/pypi/Directory_Caching); unknown archive format: .502
Skipping link https://testpypi.python.org/pypi/Directory_Caching/1.0.501 (from https://testpypi.python.org/pypi/Directory_Caching); unknown archive format: .501
Skipping link https://testpypi.python.org/pypi/Directory_Caching/1.0.51 (from https://testpypi.python.org/pypi/Directory_Caching); unknown archive format: .51
Skipping link https://testpypi.python.org/pypi/Directory_Caching/1.0.5 (from https://testpypi.python.org/pypi/Directory_Caching); unknown archive format: .5
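Those "unknown archive format" lines are the clue: pip of that era decided whether a link was downloadable purely from the URL's extension, so a version-page URL ending in ".504" is skipped, while a real sdist link ending in ".tar.gz" would be accepted. A toy illustration of that check (my own simplification, not pip's actual code):

```python
def looks_like_archive(url):
    # Simplified stand-in for the extension check old pip applied to
    # candidate links; not pip's real implementation.
    archive_exts = (".tar.gz", ".tgz", ".zip", ".tar.bz2", ".whl")
    return url.endswith(archive_exts)
```

The version page https://testpypi.python.org/pypi/Directory_Caching/1.0.504 fails this check, while dist/Directory_Caching-1.0.504.tar.gz passes it.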
I have set up tags at GitHub (https://github.com/bschollnick/Directory_Caching), and the links in PyPI or PyPI Test appear to work fine. Any suggestions?
I'm not sure; this is the basic setup I use, though, with no problems.
import os
from distutils.core import setup

def read(fname):
    return open(os.path.join(os.path.dirname(__file__), fname)).read()

setup(
    name = 'name',
    packages = ['package'],
    version = '1.0.0',
    author = 'your name',
    author_email = 'some_email#gmail.com',
    url = 'github',
    download_url = 'git download link',
    keywords = ['keywords'],
    description = 'short description',
    long_description = read('README.txt'),
    classifiers = [],
)

Heroku push rejected due to pip/distribute bug. What's the workaround?

My local git/virtualenv is using pip version 1.3.1. When I try to push my Python 3.3.2 app to Heroku, I get
Downloading/unpacking distribute==0.6.34 (from -r requirements.txt (line 5))
Running setup.py egg_info for package distribute
Traceback (most recent call last):
File "<string>", line 3, in <module>
File "./setuptools/__init__.py", line 2, in <module>
from setuptools.extension import Extension, Library
File "./setuptools/extension.py", line 5, in <module>
from setuptools.dist import _get_unpatched
File "./setuptools/dist.py", line 103
except ValueError, e:
^
SyntaxError: invalid syntax
Complete output from command python setup.py egg_info:
Traceback (most recent call last):
File "<string>", line 3, in <module>
File "./setuptools/__init__.py", line 2, in <module>
from setuptools.extension import Extension, Library
File "./setuptools/extension.py", line 5, in <module>
from setuptools.dist import _get_unpatched
File "./setuptools/dist.py", line 103
except ValueError, e:
^
SyntaxError: invalid syntax
----------------------------------------
Command python setup.py egg_info failed with error code 1 in /tmp/pip-build-u58345/distribute
Storing complete log in /app/.pip/pip.log
! Push rejected, failed to compile Python app
Given I can't manually install distribute on Heroku's servers, how am I supposed to avoid this bug?
The behaviour you are seeing is an issue with pip itself:
https://github.com/pypa/pip/issues/650
It seems that pip uses distribute to upgrade distribute.
However, what you need to do to fix your error is remove distribute from requirements.txt altogether. It's already there since it's being installed by the buildpack and there's no need to install it again using pip.
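Concretely, if your requirements.txt looks something like this (the other entries are placeholders; the distribute pin is the one from the build log), delete the distribute line and push again:

```
# requirements.txt (placeholder contents)
Django==1.5.1
distribute==0.6.34   # <- remove this line; the buildpack already installs it
gunicorn==0.17.4
```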
I believe you actually CAN and ARE installing distribute on Heroku's servers via the default buildpack. Heroku's Python support is implemented in the form of a buildpack. You can read more about buildpacks here.
If you wish to have a specific version of distribute, in this case one without the pip bug, you have to replace its source within the buildpack your app is using. It can be done like so:
Get the original buildpack from heroku at https://github.com/heroku/heroku-buildpack-python
In your cloned buildpack (at the time of this writing) you will find /vendor/distribute-0.6.36. This is the problem. Replace it with a newer version of distribute.
Inside the buildpack's bin/compile script, replace the version of distribute the buildpack is using. In my case this was replacing line 31 DISTRIBUTE_VERSION="0.6.36" with DISTRIBUTE_VERSION="0.6.45"
Upload your buildpack to GitHub and tell Heroku to use it by saying
$ heroku config:set BUILDPACK_URL=https://github.com/you/name-of-buildpack-python-repo.git
ALTERNATIVELY
Tell Heroku to use my custom buildpack instead of the original. My buildpack's only differences from the original are those described in steps 1-4.
To override the buildpack for an existing application:
$ heroku config:set BUILDPACK_URL=https://github.com/jhnwsk/heroku-buildpack-python.git
Or if you are creating a new application
$ heroku create myapp --buildpack https://github.com/jhnwsk/heroku-buildpack-python.git
When you push your application to Heroku after making these changes you should see something like
-----> Fetching custom git buildpack... done
-----> Python app detected
-----> No runtime.txt provided; assuming python-2.7.4.
-----> Preparing Python runtime (python-2.7.4)
-----> Installing Distribute (0.6.45)
-----> Installing Pip (1.3.1)
which means you have your custom distribute version running.
