I am creating a module and need to prepare my setup.py file to have some requirements. One of the requirements is a fork of one package that is already in PyPI so I want to reference my GitHub repository directly.
I tried two configurations, the first one is:
setup(
    name='mymodule',
    # other arguments
    install_requires=[
        'myrequirement',  # The dependency name
    ],
    dependency_links=[
        'https://github.com/ihhcarus/myrequirement.git#egg=myrequirement',  # This is my repository location
    ]
)
I create a local distribution of my module using python setup.py sdist, and when I run pip install path/to/module/dist/mymodule-0.1.tar.gz it ends up installing the version from PyPI and not the one from my repository.
In the other configuration, I tried changing the requirement name to force a search of the dependency links, like so:
setup(
    name='mymodule',
    # other arguments
    install_requires=[
        'myrequirement_alt',  # The dependency name with a suffix
    ],
    dependency_links=[
        'https://github.com/ihhcarus/myrequirement.git#egg=myrequirement_alt',  # This is my repository location
    ]
)
But with this, I only end up getting an error that myrequirement_alt is not found...
So I ask, what is the right way to achieve this without using PyPI?
For dependency links to work you need to add the version number of the package to https://github.com/ihhcarus/myrequirement.git#egg=myrequirement_alt, or pip will not know what to install.
e.g.:
setup(
    name='mymodule',
    # other arguments
    install_requires=[
        'myrequirement',  # The dependency name
    ],
    dependency_links=[
        'https://github.com/ihhcarus/myrequirement.git#egg=myrequirement-1.3'  # Link with the version at the end; the egg name must match the requirement name
    ]
)
Note that I wouldn't recommend using dependency links at all, as they're deprecated. You should probably use requirements files instead.
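For example, a requirements file can point pip at the fork directly (the branch name here is illustrative):

```
# requirements.txt -- install the fork from GitHub instead of the PyPI release
git+https://github.com/ihhcarus/myrequirement.git@master#egg=myrequirement
```

Installing with pip install -r requirements.txt then pulls the dependency from the repository rather than from PyPI.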
Related
I would like to configure pip to use a custom search path to install packages from a local folder, which are not hosted on PyPI. The goal is to be able to run
$ pip install --user my_non_published_package
And have it install said package from /home/myuser/projects/my_non_published_package.
I know that I can explicitly install it using
$ pip install --user /home/myuser/projects/my_non_published_package
but this is not what I want. I want pip to transparently resolve potential dependencies of other packages and install them from /home/myuser/projects/<package> iff they are not hosted on PyPI. So, if I have a project foo with a setup.py containing
from setuptools import setup

setup(
    ...
    install_requires=[
        'bar',
        ...
    ],
    ...
)
And package bar is not on PyPI, I want pip to install it from /home/myuser/projects/bar automatically.
Update
I found some information here, but the proposed solution does not work.
#! /usr/bin/env python3
"""Setup configuration."""

from pathlib import Path
from setuptools import setup


def local_pkg(name: str) -> str:
    """Returns a path to a local package."""
    return f'{name} @ file://{Path.cwd().parent / name}'
setup(
    name='openimmodb',
    use_scm_version={
        "local_scheme": "node-and-timestamp"
    },
    setup_requires=['setuptools_scm'],
    author='HOMEINFO - Digitale Informationssysteme GmbH',
    author_email='<info at homeinfo dot de>',
    maintainer='Richard Neumann',
    maintainer_email='<r dot neumann at homeinfo period de>',
    install_requires=[
        local_pkg('filedb'),
        local_pkg('mdb'),
        local_pkg('openimmo'),
        local_pkg('openimmolib'),
        'peewee',
        local_pkg('peeweeplus'),
        'pyxb',
        local_pkg('timelib'),
        local_pkg('xmldom')
    ],
    packages=[
        'openimmodb',
        'openimmodb.dom',
        'openimmodb.dom.ausstattung',
        'openimmodb.json',
        'openimmodb.json.ausstattung',
        'openimmodb.json.barrier_freeness',
        'openimmodb.json.flaechen',
        'openimmodb.json.immobilie',
        'openimmodb.json.preise',
        'openimmodb.mixins',
        'openimmodb.orm'
    ],
    entry_points={
        'console_scripts': [
            'oidbctl = openimmodb.oidbctl:main'
        ]
    },
    description='Relational OpenImmo database'
)
Results in:
...
Processing dependencies for openimmodb==0.1.dev987+g199a834.d20220110094301
Searching for xmldom@ file:///home/neumann/Projekte/xmldom
Reading https://pypi.org/simple/xmldom/
Couldn't find index page for 'xmldom' (maybe misspelled?)
Scanning index of all packages (this may take a while)
Reading https://pypi.org/simple/
Even though it is there:
$ ls /home/neumann/Projekte/xmldom
LICENSE Makefile README.md setup.py venv xmldom.egg-info xmldom.py
Same with additional localhost:
Searching for xmldom@ file://localhost/home/neumann/Projekte/xmldom
Reading https://pypi.org/simple/xmldom/
Couldn't find index page for 'xmldom' (maybe misspelled?)
Scanning index of all packages (this may take a while)
Reading https://pypi.org/simple/
And with #egg=
Processing dependencies for openimmodb==0.1.dev987+g199a834.d20220110095236
Searching for xmldom@ file://localhost/home/neumann/Projekte/xmldom#egg=xmldom
Reading https://pypi.org/simple/xmldom/
Couldn't find index page for 'xmldom' (maybe misspelled?)
Scanning index of all packages (this may take a while)
Reading https://pypi.org/simple/
With the absolute path in install_requires I get:
$ python setup.py install
error in openimmodb setup command: 'install_requires' must be a string or list of strings containing valid project/version requirement specifiers; Parse error at "'/home/ne'": Expected W:(abcd...)
I am trying to write setup.py for my package. My package needs to specify a dependency on another Git repository.
This is what I have so far:
from setuptools import setup, find_packages

setup(
    name='abc',
    packages=find_packages(),
    url='https://github.abc.com/abc/myabc',
    description='This is a description for abc',
    long_description=open('README.md').read(),
    install_requires=[
        "requests==2.7.0",
        "SomePrivateLib>=0.1.0",
    ],
    dependency_links=[
        "git+git://github.abc.com/abc/SomePrivateLib.git#egg=SomePrivateLib",
    ],
    include_package_data=True,
)
When I run:
pip install -e https://github.abc.com/abc/myabc.git#egg=analyse
I get
Could not find a version that satisfies the requirement
SomePrivateLib>=0.1.0 (from analyse) (from versions: ) No matching
distribution found for SomePrivateLib>=0.1.0 (from analyse)
What am I doing wrong?
After digging through pip issue 3939 (linked by @muon in the comments above) and then the PEP 508 specification, I found success getting my private repo dependency to install via setup.py using this specification pattern in install_requires (no more dependency_links):
install_requires = [
    'some-pkg @ git+ssh://git@github.com/someorgname/pkg-repo-name@v1.1#egg=some-pkg',
]
The @v1.1 indicates the release tag created on GitHub and could be replaced with a branch, commit, or different type of tag.
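As a sketch (package name, organization, and refs are all hypothetical), the same requirement pinned three different ways:

```python
# PEP 508 direct references: the '@<ref>' after the repository path selects
# the tag, branch, or commit that pip will check out.
pin_tag = 'some-pkg @ git+ssh://git@github.com/someorgname/pkg-repo-name@v1.1'
pin_branch = 'some-pkg @ git+ssh://git@github.com/someorgname/pkg-repo-name@main'
pin_commit = 'some-pkg @ git+ssh://git@github.com/someorgname/pkg-repo-name@7a9f2c1'

# Each string splits into a package name and a direct URL at ' @ '.
name, url = pin_tag.split(' @ ', 1)
```

Any of these strings can go straight into install_requires.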
This answer has been updated regularly as Python has evolved over the years. Scroll to the bottom for the most current answer, or read through to see how this has evolved.
Unfortunately the other answer does not work with private repositories, which is one of the most common use cases for this. I eventually did get it working with a setup.py file that uses this (now deprecated) method:
from setuptools import setup, find_packages

setup(
    name='MyProject',
    version='0.1.0',
    url='',
    description='',
    packages=find_packages(),
    install_requires=[
        # GitHub private repository - needs an entry in `dependency_links`
        'ExampleRepo'
    ],
    dependency_links=[
        # Make sure to include the `#egg` portion so the `install_requires` recognizes the package
        'git+ssh://git@github.com/example_org/ExampleRepo.git#egg=ExampleRepo-0.1'
    ]
)
Newer versions of pip make this even easier by removing the need to use dependency_links:
from setuptools import setup, find_packages

setup(
    name='MyProject',
    version='0.1.0',
    url='',
    description='',
    packages=find_packages(),
    install_requires=[
        # GitHub private repository
        'ExampleRepo @ git+ssh://git@github.com/example_org/ExampleRepo.git#egg=ExampleRepo-0.1'
    ]
)
However, with the very latest pip you'll run into issues with the egg fragment. While the egg fragment itself is ignored, pip now does direct URL matching and will consider two URLs, one with the egg fragment and the other without, to be completely different versions even if they point to the same package. As such, it's best to leave any egg fragments off.
June 2021 - setup.py
So, the best way (current as of June 2021) to add a dependency from GitHub to your setup.py that will work with public and private repositories:
from setuptools import setup, find_packages

setup(
    name='MyProject',
    version='0.1.0',
    url='',
    description='',
    packages=find_packages(),
    install_requires=[
        # GitHub private repository
        'ExampleRepo @ git+ssh://git@github.com/example_org/ExampleRepo.git'
    ]
)
February 2022 - setup.cfg
Apparently setup.py is being deprecated (although my guess is it'll be around for a while), and setup.cfg is the new thing.
[metadata]
name = MyProject
version = 0.1.1

[options]
packages = find:
install_requires =
    ExampleRepo @ git+ssh://git@github.com/example_org/ExampleRepo.git
June 16th 2022 - pyproject.toml
setup.cfg is already "pre-deprecated", as setuptools now has experimental support for pyproject.toml files.
This is the future, but since it is still experimental it should not be used in real projects for now. Even though setup.cfg is on its way out, you should use it if you want a declarative format; otherwise setup.py is still the way to go. This answer will be updated once setuptools has stabilized its support of the new standard.
January 4th 2023 - pyproject.toml
It is now possible to define all of your dependencies in pyproject.toml. Other options such as setup.cfg still work.
[build-system]
requires = ["setuptools", "setuptools-scm"]
build-backend = "setuptools.build_meta"

[project]
dependencies = [
    'ExampleRepo @ git+ssh://git@github.com/example_org/ExampleRepo.git',
]

[project.optional-dependencies]
dev = ['ExtraExample @ git+ssh://git@github.com/example_org/ExtraExample.git']
Note: this answer is now outdated. Have a look at this answer from @Dick Fox for up-to-date instructions: https://stackoverflow.com/a/54794506/2272172
You can find the right way to do it here.
dependency_links=['http://github.com/user/repo/tarball/master#egg=package-1.0']
The key is not to give a link to a Git repository, but a link to a tarball. GitHub creates a tarball of the master branch for you if you append /tarball/master as shown above.
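With newer pip versions, the same tarball can also be referenced without dependency_links via a PEP 508 direct URL (the name "package" and the repository path are placeholders):

```python
# Direct-URL requirement pointing at GitHub's auto-generated tarball of master.
requirement = 'package @ https://github.com/user/repo/tarball/master'

# pip splits this into a project name and a URL at the ' @ ' separator.
name, url = requirement.split(' @ ', 1)
```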
A more general answer: To get the information from the requirements.txt file I do:
from setuptools import setup, find_packages
from os import path

loc = path.abspath(path.dirname(__file__))
with open(loc + '/requirements.txt') as f:
    requirements = f.read().splitlines()

required = []
dependency_links = []

# Do not add lines pointing to Git repositories to required
EGG_MARK = '#egg='
for line in requirements:
    if line.startswith('-e git:') or line.startswith('-e git+') or \
            line.startswith('git:') or line.startswith('git+'):
        if line.startswith('-e '):
            line = line[len('-e '):]  # strip the "-e " prefix if present
        if EGG_MARK in line:
            package_name = line[line.find(EGG_MARK) + len(EGG_MARK):]
            repository = line[:line.find(EGG_MARK)]
            required.append('%s @ %s' % (package_name, repository))
            dependency_links.append(line)
        else:
            print('Dependency to a git repository should have the format:')
            print('git+ssh://git@github.com/xxxxx/xxxxxx#egg=package_name')
    else:
        required.append(line)
setup(
    name='myproject',  # Required
    version='0.0.1',  # Required
    description='Description here....',  # Required
    packages=find_packages(),  # Required
    install_requires=required,
    dependency_links=dependency_links,
)
Actually, if you want to make your packages installable recursively (YourCurrentPackage includes your SomePrivateLib), e.g. when you want to include YourCurrentPackage in another one (like OuterPackage → YourCurrentPackage → SomePrivateLib), you'll need both:
install_requires=[
    ...,
    "SomePrivateLib @ git+ssh://github.abc.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
],
dependency_links = [
    "git+ssh://github.abc.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
]
And make sure you have a tag created with your version number.
Also, if your Git project is private and you want to install it inside a container, e.g. a Docker container or GitLab runner, you will need authorized access to your repository. Please consider using Git + HTTPS with access tokens (like on GitLab: https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html):
import os

from setuptools import setup

TOKEN_VALUE = os.getenv('EXPORTED_VAR_WITH_TOKEN')

setup(
    ....
    install_requires=[
        ...,
        f"SomePrivateLib @ git+https://gitlab-ci-token:{TOKEN_VALUE}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ],
    dependency_links=[
        f"git+https://gitlab-ci-token:{TOKEN_VALUE}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ]
)
Updated:
You have to put #egg=SomePrivateLib at the end of the dependency line if you want to have this dependency in a requirements.txt file. Otherwise pip install -r requirements.txt won't work for you, and you will get something like:
ERROR: Could not detect requirement name for
'git+https://gitlab-ci-token:gitlabtokenvalue@gitlab.server.com/abc/SomePrivateLib.git@0.1.0',
please specify one with #egg=your_package_name
If you use requirements.txt, this part is responsible for the name of the dependency's folder that will be created inside python_home_dir/src and for the name of the egg-link in site-packages/.
You can use an environment variable in your requirements.txt to keep your dependency's token value out of your repo.
Example row in a requirements.txt file for this case:
....
-e git+https://gitlab-ci-token:${EXPORTED_VAR_WITH_TOKEN}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib
....
I was successful with these three options in GitLab. I am using version 11 of GitLab.
Option 1 - no token specified. The shell will prompt for username/password.
from setuptools import setup

setup(
    install_requires=[
        "SomePrivateLib @ git+https://gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ]
)
Option 2 - user access token specified. The token is generated by going to GitLab → account top right → Settings → Access tokens. Create the token with read_repository rights.
Example:
import os

from setuptools import setup

TOKEN_VALUE = os.getenv('EXPORTED_VAR_WITH_TOKEN')

setup(
    install_requires=[
        f"SomePrivateLib @ git+https://gitlab-ci-token:{TOKEN_VALUE}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ]
)
Option 3 - repository-level token specified. The token is generated by going to the repository → Settings → Repository → Deploy tokens. From here, create a token with read_repository rights.
Example:
import os

from setuptools import setup

TOKEN_USER = os.getenv('EXPORTED_TOKEN_USER')
TOKEN_VALUE = os.getenv('EXPORTED_VAR_WITH_TOKEN')

setup(
    install_requires=[
        f"SomePrivateLib @ git+https://{TOKEN_USER}:{TOKEN_VALUE}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ]
)
In all three, I was able to use simply "SomePrivateLib @ git+https://gitlab.server.com/abc/SomePrivateLib.git" without the #egg marking at the end.
This solution works for me when I run python setup.py install:
setuptools.setup(
    ...,
    install_requires=[
        'numpy',
        'pandas',
        'my_private_pkg'
    ],
    dependency_links=["git+https://github.com/[username]/[my_private_pkg].git@main#egg=my_private_pkg"],
    ...
)
I am working on a fork of a Python project (tryton) which uses setuptools for packaging. I am trying to extend the server part of the project, and would like to be able to use the existing modules with my fork.
Those modules are distributed with setuptools packaging and require the base project for installation.
I need a way to make it so that my fork is considered an acceptable requirement for those modules.
EDIT: Here is what I used in my setup.py:
from setuptools import setup

setup(
    ...
    provides=["trytond (2.8.2)"],
    ...
)
The modules I want to be able to install have these requirements:
from setuptools import setup

setup(
    ...
    install_requires=["trytond>=2.8"]
    ...
)
As it is, with my package installed, trying to install a module triggers the installation of the trytond package.
Don’t use provides; it comes from a packaging specification (a metadata PEP) that is not implemented by any tool. The requirements in the install_requires argument map to the name in your other setup.py. In other words, replace your provides with setup(name='trytond', version='2.8.2').
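A minimal sketch of that advice (the version number is assumed): the fork's setup.py reuses the upstream distribution name, so modules requiring trytond>=2.8 accept the installed fork:

```python
# Keyword arguments the fork's setup.py would pass to setuptools.setup();
# 'name' is the upstream project's name, not a fork-specific one, so the
# installed fork satisfies install_requires=["trytond>=2.8"] in the modules.
fork_setup_kwargs = dict(
    name='trytond',
    version='2.8.2',
)
# In the real setup.py: setup(**fork_setup_kwargs)
```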
If you are building rpms, it is possible to use the setup.cfg as follows:
[bdist_rpm]
provides = your-package = 0.8
obsoletes = your-package
I am creating a setup.py file for a project which depends on private GitHub repositories. The relevant parts of the file look like this:
from setuptools import setup

setup(
    name='my_project',
    ...,
    install_requires=[
        'public_package',
        'other_public_package',
        'private_repo_1',
        'private_repo_2',
    ],
    dependency_links=[
        'https://github.com/my_account/private_repo_1/master/tarball/',
        'https://github.com/my_account/private_repo_2/master/tarball/',
    ],
    ...,
)
I am using setuptools instead of distutils because the latter does not support the install_requires and dependency_links arguments per this answer.
The above setup file fails to access the private repos with a 404 error - which is to be expected since GitHub returns a 404 to unauthorized requests for a private repository. However, I can't figure out how to make setuptools authenticate.
Here are some things I've tried:
Use git+ssh:// instead of https:// in dependency_links as I would if installing the repo with pip. This fails because setuptools doesn't recognize this protocol ("unknown url type: git+ssh"), though the distribute documentation says it should. Ditto git+https and git+http.
https://<username>:<password>@github.com/... - still get a 404. (This method doesn't work with curl or wget from the command line either - though curl -u <username> <repo_url> -O <output_file_name> does work.)
Upgrading setuptools (0.9.7) and virtualenv (1.10) to the latest versions. Also tried installing distribute though this overview says it was merged back into setuptools. Either way, no dice.
Currently I just have setup.py print out a warning that the private repos must be downloaded separately. This is obviously less than ideal. I feel like there's something obvious that I'm missing, but can't think what it might be. :)
Duplicate-ish question with no answers here.
I was trying to get this to work for installing with pip, but the above was not working for me. From [1] I understood that the PEP 508 standard should be used; from [2] I retrieved an example which actually does work (at least for my case).
Please note: this is with pip 20.0.2 on Python 3.7.4.
setup(
    name='<package>',
    ...
    install_requires=[
        '<normal_dependency>',
        # Private repository
        '<dependency_name> @ git+ssh://git@github.com/<user>/<repo_name>@<branch>',
        # Public repository
        '<dependency_name> @ git+https://github.com/<user>/<repo_name>@<branch>',
    ],
)
After specifying my package this way installation works fine (also with -e settings and without the need to specify --process-dependency-links).
References
[1] https://github.com/pypa/pip/issues/4187
[2] https://github.com/pypa/pip/issues/5566
Here's what worked for me:
install_requires=[
'private_package_name==1.1',
],
dependency_links=[
'git+ssh://git#github.com/username/private_repo.git#egg=private_package_name-1.1',
]
Note that you have to have the version number in the egg name, otherwise it will say it can't find the package.
I couldn't find any good documentation on this, but came across the solution mainly through trial and error. Further, installing with pip and with setuptools has some subtle differences, but this way should work for both.
GitHub doesn't (currently, as of August 2016) offer an easy way to get the zip/tarball of private repos. So you need to tell setuptools that you're pointing at a Git repo:
from setuptools import setup
import os

# get a deploy key from https://help.github.com/articles/git-automation-with-oauth-tokens/
github_token = os.environ['GITHUB_TOKEN']

package = 'package'   # name of the dependency
version = 'master'    # can be a branch, a tag, or a commit hash

setup(
    # ...
    install_requires=[package],
    dependency_links=[
        'git+https://{github_token}@github.com/user/{package}.git@{version}#egg={package}-0'
        .format(github_token=github_token, package=package, version=version)
    ]
)
A couple of notes here:
For private repos, you need to authenticate with GitHub; the simplest way I found is to create an oauth token, drop that into your environment, and then include it with the URL
You need to include some version number (here it is 0) at the end of the link, even if there's no package on PyPI. This has to be an actual number, not a word.
You need to preface with git+ to tell setuptools it's to clone the repo, rather than pointing at a zip / tarball
version can be a branch, a tag, or a commit hash
You need to supply --process-dependency-links if installing from pip
I found a (hacky) workaround:
#!/usr/bin/env python
from setuptools import setup
import os

os.system('pip install git+https://github-private.corp.com/user/repo.git@master')

setup(
    name='original-name',
    ...,
    install_requires=['repo']
)
I understand that there are ethical issues with having a system call in a setup script, but I can't think of another way to do this.
Via Tom Hemmes' answer I found this is the only thing that worked for me:
install_requires=[
    '<package> @ https://github.com/<username>/<package>/archive/<branch_name>.zip']
Using an archive URL from GitHub works for me for public repositories. E.g.:
dependency_links = [
    'https://github.com/username/reponame/archive/master.zip#egg=eggname-version',
]
With pip 20.1.1, this works for me
install_requires=["packson3 @ https://tracinsy.ewi.tudelft.nl/pubtrac/Utilities/export/138/packson3/dist/packson3-1.0.0.tar.gz"],
in setup.py
Edit: This appears to only work with public github repositories, see comments.
dependency_links=[
    'https://github.com/my_account/private_repo_1/tarball/master#egg=private_repo_1',
    'https://github.com/my_account/private_repo_2/tarball/master#egg=private_repo_2',
],
The above syntax seems to work for me with setuptools 1.0. At the moment, at least, the syntax of adding "#egg=project_name-version" to VCS dependencies is documented in the link you gave to the distribute documentation.
This works for our scenario:
package is on github in a private repo
we want to install it into site-packages (not into ./src with -e)
being able to use pip install -r requirements.txt
being able to use pip install -e reposdir (or from github), where the dependencies are only specified in requirements.txt
https://github.com/pypa/pip/issues/3610#issuecomment-356687173
I have a little problem with setuptools/easy_install; maybe someone could give me a hint about what might be causing it:
To easily distribute one of my python webapps to servers I use setuptools' sdist command to build a tar.gz file which is copied to servers and locally installed using easy_install /path/to/file.tar.gz.
So far this seems to work great. I have listed everything in the MANIFEST.in file like this:
global-include */*.py */*.mo */*.po */*.pot */*.css */*.js */*.png */*.jpg */*.ico */*.woff */*.gif */*.mako */*.cfg
And the resulting tar.gz file does indeed contain all of the files I need.
It gets weird as soon as easy_install tries to actually install it on the remote system. For some reason a directory called locales and a configuration file called migrate.cfg won't get installed. This is odd and I can't find any documentation about this, but I guess it's some automatic ignore feature of easy_install?
Is there something like that? And if so, how do I get easy_install to install the locales and migrate.cfg files?
Thanks!
For reference here is the content of my setup.py:
from setuptools import setup, find_packages

requires = ['flup', 'pyramid', 'WebError', 'wtforms', 'webhelpers', 'pil', 'apns',
            'pyramid_beaker', 'sqlalchemy', 'poster', 'boto', 'pypdf', 'sqlalchemy_migrate',
            'Babel']

version_number = execfile('pubserverng/version.py')
setup(
    author='Bastian',
    author_email='test@domain.com',
    url='http://domain.de/',
    name="mywebapp",
    install_requires=requires,
    version=__version__,
    packages=find_packages(),
    zip_safe=False,
    entry_points={
        'paste.app_factory': [
            'pubserverng=pubserverng:main'
        ]
    },
    namespace_packages=['pubserverng'],
    message_extractors={'pubserverng': [
        ('**.py', 'python', None),
        ('templates/**.html', 'mako', None),
        ('templates/**.mako', 'mako', None),
        ('static/**', 'ignore', None),
        ('migrations/**', 'ignore', None),
    ]},
)
I hate to answer my own question this quickly, but after some trial and error I found out what the reason behind the missing files was. In fact it was more than one reason:
The SOURCES.txt file was older and included a full list of most files, which resulted in them being bundled correctly.
The MANIFEST.in file was correct, too, so all required files were actually in the .tar.gz archive as expected. The main problem was that a few files simply would not get installed on the target machine.
I had to add include_package_data = True, to my setup.py file. After doing that all files installed as expected.
I'll have to put some research into include_package_data to find out if this weird behavior is documented somewhere. setuptools is a real mess - especially the documentation.
The entire package distribution system in python leaves a lot to be desired. My issues were similar to yours and were eventually solved by using distutils (rather than setuptools) which honored the include_package_data = True setting as expected.
Using distutils allowed me to more or less keep the required file list in MANIFEST.in and avoid using the package_data setting, where I would have had to duplicate the source list; the drawback is that find_packages is not available. Below is my setup.py:
from distutils.core import setup

package = __import__('simplemenu')

setup(name='django-simplemenu',
      version=package.get_version(),
      url='http://github.com/danielsokolowski/django-simplemenu',
      license='BSD',
      description=package.__doc__.strip(),
      author='Alex Vasi <eee@someuser.com>, Justin Steward <justin+github@justinsteward.com>, Daniel Sokolowski <unemelpmis-ognajd@danols.com>',
      author_email='unemelpmis-ognajd@danols.com',
      include_package_data=True,  # this will read MANIFEST.in during install phase
      packages=[
          'simplemenu',
          'simplemenu.migrations',
          'simplemenu.templatetags',
      ],
      # below is no longer needed as we are utilizing MANIFEST.in with the include_package_data setting
      # package_data={'simplemenu': ['locale/en/LC_MESSAGES/*',
      #                              'locale/ru/LC_MESSAGES/*']
      #               },
      scripts=[],
      requires=[],
      )
And here is a MANIFEST.in file:
include LICENSE
include README.rst
recursive-include simplemenu *.py
recursive-include simplemenu/locale *
prune simplemenu/migrations
You need to use the data_files functionality of setup - your files aren't code, so easy_install won't install them by default (it doesn't know where they go).
The upside of this is that these files are added to MANIFEST automatically - you don't need to do any magic to get them there yourself. (In general if a MANIFEST automatically generated by setup.py isn't sufficient, adding them yourself isn't going to magically get them installed.)
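A sketch of what that could look like for the files in question (the target directories here are assumptions; the real layout depends on where the app reads the files at runtime):

```python
# Each data_files entry is (install_directory, [source_files]); the installer
# copies the listed files into that directory under the install prefix.
data_files = [
    ('share/mywebapp/locales/de/LC_MESSAGES', ['locales/de/LC_MESSAGES/app.mo']),
    ('share/mywebapp', ['migrate.cfg']),
]
# Passed as setup(..., data_files=data_files) alongside the other arguments.
```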