Python distutils exclude setup.py

My setup.py script is simple:
from distutils.core import setup

setup(name='my-awesome-app',
      version='1.0',
      scripts=['my-awesome-app.py'],
      )
And the file structure is:
my-awesome-app/
    my-awesome-app.py
    setup.py
In theory I am only including my-awesome-app.py in the distribution. In practice setup.py ends up in the RPM too.
I don't see the point of including setup.py there. Is there a way to force distutils to leave this file out?
I am using Python 2.7, and I build my RPM by running python setup.py bdist_rpm.
Thanks for the help :)

setup.py is required because when the package is installed in your environment, the following command is run:
$ python setup.py install
Running python setup.py bdist_rpm only creates a distribution package that you can give to others. setup.py is still required to do the installation.

You can always create the spec file manually and leave out the setup.py.
For an example and more details, see:
https://fedoraproject.org/wiki/Packaging:Python#Example_common_spec_file
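If you go down that route, a hedged starting point (the exact spec file name depends on your package metadata) is to let distutils generate the spec file once and then maintain it by hand:
python setup.py bdist_rpm --spec-only
The generated spec file lands in dist/ (here it would be dist/my-awesome-app.spec); edit it so setup.py is no longer installed, then build the RPM from the edited spec with rpmbuild.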

Wheel not built when pip installing from source - causes ModuleNotFoundError

I have written a Python package mypackage, hosted on GitHub, that I now want to install as a dependency of another project (specifically the branch branch).
I can build it locally using setuptools and a pyproject.toml file by running python3 -m pip install . at the top level of mypackage/, and can then successfully import it into Python.
I pushed this to github and now try to install it using
python3 -m pip install "git+https://github.com/mygituser/mypackage.git#branch"
This runs without warning, and if I then run python3 -m pip list I can see mypackage listed there. However, if I enter python and run import mypackage I get the error ModuleNotFoundError: No module named 'mypackage'.
Comparing the verbose outputs I can see that for the local install after installing the dependencies I get
running bdist_wheel
running build
running build_py
running egg_info
writing mypackage/mypackage.egg-info/PKG-INFO
writing dependency_links to mypackage/mypackage.egg-info/dependency_links.txt
...
...
etc., which is absent from the GitHub build process (that one just ends after the dependencies are installed).
Am I missing a setting in pyproject.toml or somewhere to tell it to build a wheel so I can import mypackage in Python?
I have tried appending with #egg=mypackage and adding wheel to the build dependencies, but none of this has worked.
TIA
In addition to the changes you made to your pyproject.toml file here: https://github.com/jatkinson1000/archeryutils/commit/f5fb491aa37961a65796ce4be3616d8be23548ed
You also need to include an __init__.py file in your round_data_files folder so that it gets included when the package is installed.
https://github.com/jatkinson1000/archeryutils/pull/9
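For illustration (the package name follows the question; the layout is only a sketch of what the answer describes), the data folder becomes a subpackage once it contains its own __init__.py:
mypackage/
    __init__.py
    round_data_files/
        __init__.py        # added so the folder is picked up at install time
        ...                # the data files themselves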

Setuptools: Include subdirectories in package_data

I believe this question was asked before but I'm still a little stuck. I'm trying to install a Python package that has some data files with subdirectories. Here's my setup:
setup.py
src/
    mypkg/
        __init__.py
        module.py
        data/
            tables.dat
            spoons.dat
            sub/
                forks.dat
Following the docs I tried to add:
setup(...,
      packages=['mypkg'],
      package_dir={'mypkg': 'src/mypkg'},
      package_data={'mypkg': ['data/*.dat', 'data/sub/*.dat']},
      )
I install the module with python setup.py install (though eventually I'll use python setup.py sdist upload to upload the package to PyPI so others can pip install the module).
After running the python setup.py install command, I import mypkg and print(mypkg.__file__) to find the module location. In the installed package directory, however, I can see data but not data/sub. Does anyone know what I'm missing? Any help is greatly appreciated!
Ah, it turns out the above works fine!
To install the module to my site-packages/mypkg location, I just had to use: python setup.py sdist and then pip install dist/mypkg-0.0.1.tar.gz.
Then my data files were in site-packages/mypkg.
I had the same issue. In my case the problem was that the package was already installed, and when executing
pip install .
locally it didn't reinstall, so the packages weren't included.
Uninstalling before installing was the key for me.
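As a side note (not from either answer above, and assuming a setuptools-based setup.py), if the data tree grows more subdirectories, the package_data list can be built programmatically instead of spelling out every level:
import os
from setuptools import setup

def find_dat_files(pkg_root, data_dir):
    # Collect every .dat file under <pkg_root>/<data_dir>, however deep,
    # expressed relative to the package root as package_data expects.
    found = []
    for dirpath, _, filenames in os.walk(os.path.join(pkg_root, data_dir)):
        for name in filenames:
            if name.endswith('.dat'):
                found.append(os.path.relpath(os.path.join(dirpath, name), pkg_root))
    return found

setup(name='mypkg',
      version='0.0.1',
      packages=['mypkg'],
      package_dir={'mypkg': 'src/mypkg'},
      package_data={'mypkg': find_dat_files('src/mypkg', 'data')},
      )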

how to exclude source code from bdist_wheel python

We want to exclude the Python source code from the package we create, but after configuring setup.py I failed to exclude the .py files. I have been using python setup.py bdist_wheel as the command. Is there any way to exclude the source code from a Python package? Basically, we do not want to expose the source code.
The wheel plugin for setuptools that handles bdist_wheel does not have an --exclude-source-files option; that is only supported by bdist_egg. Egg packages, however, are deprecated and not supported by pip, for example. What you can do instead:
pip3 install wheel
python3 setup.py bdist_egg --exclude-source-files
wheel convert dist/mypackage-1.0-py3.6.egg
The wheel utility takes the source-stripped egg and converts it into a .whl package.
This came up for us putting Python libs on an embedded system with very minimal available memory.
It can be achieved with:
./setup.py bdist_egg --exclude-source-files
This works for both distutils and setuptools.
Well, we also encountered this issue (around 2 years ago) and didn't find a sensible automation process for it. So I wrote my own.
You're welcome to use it: setup.py template

Python 3: setup.py: pip install that does everything (build_ext + install)

I'm learning how to use distutils, and there's something I don't understand that I wish someone could explain to me.
I have already managed to create tar.gz packages that can be installed with
pip install mypackage.tar.gz
I did this with setup.py, a simple script that runs the function setuptools.setup(), which I call using python3 setup.py sdist.
What I want to learn now: how to include building extensions in this.
My problem: I couldn't find any comprehensive text that explains how a pip install of a package that has a build_ext class can get it to build, then install.
If we look at this example, from the well-known cx_Freeze package, we see:
There's an inherited build_ext class
There's a build_extension() method
In setup(), there's the cmdclass argument, a dict that contains the build_ext class
My question: What gets cx_Freeze to build the extension and then install it? Is having a cmdclass defined with build_ext enough to achieve this?
After many tests, I learned that this is related to how pip works, not how setup.py works. It turns out that after writing your setup.py file and using pip to install, this is what happens:
pip creates a temporary directory; on Linux it's under /tmp, and on Windows it's in the user's temp directory.
pip downloads/extracts the package to that temp directory (whether from a tar.gz or from an online source or from a repository)
pip runs the following operations in order:
setup.py install
setup.py build
setup.py install_lib
setup.py build_py
setup.py build_ext
And all this depends on what you have defined in the cmdclass parameter of setup(). For example, build_ext will run only if you have build_ext defined in cmdclass AND you have ext_modules defined in the parameters of your setup() call. ext_modules is expected to be a list of Extension() objects, each of which contains all the information about one extension. The build_extension(self, ext) method of the build_ext class is executed for every element of that list.
All the classes that go into cmdclass (I use build and build_py) have a run() method that you should override to add your custom build steps.
After all that is done, pip installs (or copies) the packages defined in setup() (which are basically directories in that temp location) to your Python directory, which is the end of the installation, and deletes the temp files.
There are more details to all this, but I think this is good for a start. I didn't see any of this explained comprehensively anywhere, so I hope this helps and saves people the empirical testing I had to do to learn it.
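Putting the pieces together, a minimal sketch (hypothetical names and paths, not the cx_Freeze code) of a setup.py that wires a custom build_ext hook into cmdclass might look like:
from setuptools import setup, Extension
from setuptools.command.build_ext import build_ext

class MyBuildExt(build_ext):
    # run() is the hook invoked when the build_ext step fires
    def run(self):
        print('custom steps before building the extensions go here')
        build_ext.run(self)

setup(name='mypackage',            # hypothetical package name
      version='1.0',
      packages=['mypackage'],
      ext_modules=[Extension('mypackage._native', sources=['src/native.c'])],
      cmdclass={'build_ext': MyBuildExt},
      )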

installing pandas from source

I need some help with installing pandas from source. I compile from source because I am working on an HPC cluster where I have no administrative rights, so I install into a local folder.
I have looked for the official documentation for installation from source, but I think it is missing.
I do
python setup.py install --prefix=/my/local/folder build_ext --inplace --force
but then when I import it from Python 3.3 it says:
ImportError: C extension: hashtable not built. If you want to import pandas from the source directory, you may need to run 'python setup.py build_ext --inplace' to build the C extensions first.
Can it be that I am doing something wrong with Cython? Ideas/suggestions?
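For what it's worth, the error message itself points at the usual fix: build the C extensions inside the source tree before installing. A hedged sketch, reusing the prefix from the question (run from the top of the pandas source checkout):
python setup.py build_ext --inplace
python setup.py install --prefix=/my/local/folder
(and make sure that prefix's site-packages directory is on PYTHONPATH when importing)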
