pip not installing Cython source files

I have one *.pyx (Cython) source file in my package but it was not getting included in the source distribution when I ran:
$ python3 setup.py sdist
Therefore, I added a MANIFEST.in file to include all *.pyx files. My package requirements specify Cython, so all my users will have Cython installed. In my other packages, I use pyximport:
import pyximport; pyximport.install(language_level=3)
from mycythonpackage.mycythonmodule import *
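For reference, the MANIFEST.in mentioned above can be as simple as the following sketch (the `mypackage` directory name is a placeholder):

```
recursive-include mypackage *.pyx
```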
Now when I run sdist I correctly see the *.pyx files packaged in the tarball. However, when I extract the tarball to a folder, and then execute:
$ sudo pip3 install .
The installation proceeds without errors, but the .pyx files are not installed, so my package is unable to find the Cython module when I run it.
Simply including/installing the .pyx files in my package will solve my problem: I expect my users to run my package as is, and Cython will compile the modules on the fly. I do not want to compile/build the .pyx files, nor do I want to package the resulting .so / .c / ... files. I just want the plain *.pyx files installed. How can I instruct pip to include/install the *.pyx files during installation?

I found the solution: modify the setup.py file to include:
package_data={'mypackage': ['mycythonmodule-filename.pyx']},
include_package_data=True,
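Put together, a minimal setup.py might look like the following sketch (the package name, the .pyx filename, and the version are placeholders carried over from the snippet above):

```python
from setuptools import setup, find_packages

# Sketch only: 'mypackage' and the .pyx filename are placeholders.
setup(
    name='mypackage',
    version='0.1',
    packages=find_packages(),
    install_requires=['cython'],  # users need Cython so pyximport can compile on the fly
    package_data={'mypackage': ['mycythonmodule-filename.pyx']},
    include_package_data=True,
)
```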

Related

compile translation files when calling setup.py install

I'm developing a Flask application using Babel. Thanks to Distutils/Setuptools Integration, all the parameters of compile/extract/... functions are stored in setup.cfg and compiling the i18n files is as easy as
./setup.py compile_catalog
Great. Now I would like this to be done automatically when running
./setup.py install
In make's terms, that would be letting the install target depend on the compile_catalog target.
The context
We store only translation (.po) files in the code repository. .gitignore excludes .mo and .pot files from being tracked.
When a developer pulls a new revision of the code, he runs
pip install -r requirements.txt
to update dependencies and install the project in development mode. Then, using the above command line, he compiles the binary translation (.mo) files.
Is there a simple and recommended way to modify setup.py to do both operations in one step? Or am I trying to misuse setuptools?
Using a script like this would work for development purposes:
#!/bin/sh
./setup.py compile_catalog
pip install -r requirements.txt
but I would like a solution that also works when the package is installed with the usual setup.py install instructions, as if installed from PyPI.
Should I understand that setuptools are not meant to be used like this, and people distributing software compile their translation files either manually or using custom scripts when creating their archives, rather than relying on setup.py to compile them at installation time?
I didn't find many posts on the Internet addressing this. The ones I found involved running pybabel command line interface from a function in setup.py, which sounds like a shame as it misses the point of setuptools integration.
I think your demand is totally valid and I'm quite surprised that there seems to be no official guideline on how to accomplish this.
The project I'm working on now also went multilingual, and this is what I did:
In setup.cfg, make appropriate entries so that compile_catalog can be run without options.
In setup.py, subclass the install command from setuptools:
setup.py:
from setuptools import setup
from setuptools.command.install import install

class InstallWithCompile(install):
    def run(self):
        from babel.messages.frontend import compile_catalog
        compiler = compile_catalog(self.distribution)
        option_dict = self.distribution.get_option_dict('compile_catalog')
        compiler.domain = [option_dict['domain'][1]]
        compiler.directory = option_dict['directory'][1]
        compiler.run()
        super().run()
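The `[1]` indexing above relies on the shape of `get_option_dict`'s return value: it maps each option name to a `(source, value)` pair. A minimal illustration of that structure (the entries here are made-up placeholders for what setup.cfg might contribute):

```python
# get_option_dict('compile_catalog') maps option name -> (where it was set, value).
# These entries are hypothetical examples, not real setup.cfg output.
option_dict = {
    'domain': ('setup.cfg', 'messages'),
    'directory': ('setup.cfg', 'myapp/locale'),
}

# Mirrors the indexing in InstallWithCompile.run():
domain = [option_dict['domain'][1]]
directory = option_dict['directory'][1]
print(domain)     # ['messages']
print(directory)  # myapp/locale
```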
Then, when calling setup(), register our InstallWithCompile command with the name "install" and make sure that the *.mo files will be included in the package:
setup(
    ...
    cmdclass={
        'install': InstallWithCompile,
    },
    ...
    package_data={'': ['locale/*/*/*.mo', 'locale/*/*/*.po']},
)
Since babel is used during the setup, you should add it as a setup dependency:
setup_requires=[
    'babel',
],
Note that a package (here, babel) appearing in both setup_requires and install_requires won't be installed correctly using python setup.py install due to an issue in setuptools, but it works fine with pip install.

Distribute pip package no source code

I have a "homemade" Python package which I can successfully install via the pip package manager.
I'd like to distribute it without giving out the source code (*.py files). I tried to compile them with
python -m compileall .
and then installed by typing pip install .
However, it can't find the module when I try to import it in my application:
ImportError: No module named...
What do you suggest to solve this?
Thanks
I guess it has to do with setuptools not packaging up *.pyc files, because normally you don't want them.
You should create a file MANIFEST.in with the content
global-include *.py[co]
global-exclude *.py
This tells setuptools to exclude *.py source files and include *.pyc compiled files.
Afterwards create a source distribution package
python setup.py sdist
or a wheel
python setup.py bdist_wheel
which also compiles C extensions.
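As a sanity check of the bytecode-only idea (not part of the original answer): in Python 3, compileall can write .pyc files next to the sources with legacy=True, and the interpreter can import a package from a bare .pyc once the .py is removed. Keep in mind that .pyc files are specific to the interpreter version that produced them.

```python
import compileall
import pathlib
import sys
import tempfile

# Build a throwaway package, compile it, delete the source,
# and import from the remaining bytecode.
tmp = pathlib.Path(tempfile.mkdtemp())
pkg = tmp / "mypkg"
pkg.mkdir()
(pkg / "__init__.py").write_text("VALUE = 42\n")

# legacy=True writes mypkg/__init__.pyc next to the source,
# the flat layout the MANIFEST.in rules above expect.
compileall.compile_dir(str(tmp), quiet=1, legacy=True)
(pkg / "__init__.py").unlink()  # ship only the bytecode

sys.path.insert(0, str(tmp))
import mypkg
print(mypkg.VALUE)  # 42
```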

Installing Python package from version control where setup.py is not in the project root

I'm trying to include the pyobjc package in my pip requirements file, but I need a committed version that doesn't have a release yet in order to pull in a much needed bug fix. The pyobjc package is a pseudo-package to install all the other framework dependencies.
I can specify the HG path in the pip requirements just fine. The problem I'm facing is that the repository doesn't have a setup.py in the root directory. Instead it has a subdirectory labeled pyobjc (with all the framework subdirectories alongside) that contains setup.py. In the root directory of the repo, there's a file labeled install.py that pyobjc's readme recommends using when installing from source.
Does anyone have any idea how to call install.py from pip instead of setup.py or point to the subdirectory location?
In the pyobjc/pyobjc subdirectory I see setup.py, not install.py.
pip can be advised to look into a subdirectory of a VCS repository for setup.py:
pip install -e 'hg+https://bitbucket.org/ronaldoussoren/pyobjc@39c254b20bf2d63ef2891cb57bba027dfe7e06e8#egg=pyobjc&subdirectory=pyobjc'

How can I make setuptools (or distribute) install a package from the local file system

Is it possible to specify (editable) source dependencies in setup.py that are known to reside on the local file system?
Consider the following directory structure, all of which lives in a single VCS repository:
projects
  utils
    setup.py
    ...
  app1
    setup.py
    ...  # app1 files depend on ../utils
  app2
    setup.py
    ...  # app2 files depend on ../utils
Given the following commands:
cd projects
mkvirtualenv app1
pip install -e app1
I'd like to have all the dependencies for app1 installed, including "utils", which is an "editable" dependency. Likewise, if I did the same for app2.
I've tried playing with all different combinations of file://... URLs in install_requires and dependency_links to no avail. I'd like to use a dependency link URL like src+file://../utils, which would tell setuptools that the source for the package is on the file system at this relative path. Is there a way to do this?
I managed to provide a relative local dependency in setup.py with:
import os
from setuptools import setup

setup(
    install_requires=[
        'utils @ file://localhost/%s/../utils/' % os.getcwd().replace('\\', '/'),
    ],
)
but maybe someone knows a better solution.
I had an identical problem where I needed to depend on modules in a sibling folder. I was able to find a solution after stumbling upon https://caremad.io/2013/07/setup-vs-requirement/
I ended up using a requirements.txt to refer specifically to the file I wanted, and then installed everything with
pip install -r requirements.txt
requirements.txt
-e ../utils
-e .
And setup.py has all my other dependencies, including utils. When pip tries to install app1 itself, it realizes that the utils dependency has already been satisfied, and so passes over it while installing the other requirements.
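For context, app1's own setup.py can then list utils as an ordinary requirement. A minimal sketch (the names 'app1' and 'utils' come from the directory layout above; everything else is assumed):

```python
from setuptools import setup, find_packages

# Minimal sketch of projects/app1/setup.py. pip finds 'utils' already
# installed (editable, via requirements.txt) and does not try to fetch it.
setup(
    name='app1',
    version='0.1',
    packages=find_packages(),
    install_requires=[
        'utils',  # satisfied by the sibling checkout installed with -e ../utils
    ],
)
```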
When I want to work with a set of interrelated projects, I install all of them using ./setup.py develop.
If I mistakenly installed a module with pip, or later want to make it editable, I clone the source and run python setup.py develop on it too, substituting the existing one.
Just to be sure, I erase the reference in the virtualenv's site-packages along with the package itself.

Can not install hcluster from pypi under python 2.6 on xp

I am using the setup.py file supplied with hcluster with the following lines added:
sys.path.append("c:\\Program Files\\Python26\\Lib\\site-packages\\hcluster-0.2.0")
sys.path.append("c:\\Program Files\\Python26\\Lib\\site-packages\\hcluster-0.2.0\\hcluster")
Then used setup.py as follows:
"c:\program files\python26\python.exe" "c:\Program Files\Python26\Lib\site-packages\hcluster-0.2.0\setup.py" install
I get the following error messages:
running install
running build
running build_py
error: package directory 'hcluster' does not exist
I don't know if it is trying to read or write hcluster.
Any help appreciated
You don't need to add packages in site-packages to sys.path.
Did you copy hcluster into site-packages manually? That is not the correct way to do it:
1. Keep the hcluster source outside site-packages, say in your home directory, and run "python setup.py install" from there.
2. The build will then put the package into the site-packages directory, which is where all external packages reside by default after they are installed.
Remove the folders related to hcluster from site-packages and reinstall following the steps above.
Read the following to understand your error: http://docs.python.org/install/index.html
