Setuptools: Import from 'project' with setup.py in source folder - python

I am trying to create a setup.py for an existing project. The project has a directory structure that I cannot change. I need my setup.py to be in the same folder as my project source files.
Sample 1, directory structure.
MyModule
├── __init__.py
├── MyApp.ini
├── MyApp.py
├── setup.py
└── foo.py
This is my setup.py:
from setuptools import setup, find_packages

packages = find_packages(exclude=['ez_setup', 'tests', 'tests.*'])
console_script = list()
console_script.append('MyApp = MyApp:main')
py_modules = list()
py_modules.append('MyApp')
other_files = list()
other_files.append('MyApp.ini')
module_name = "MyModule"

mysetup = setup(name=module_name,
                py_modules=py_modules,
                version="1.0.0",
                packages=packages,
                package_dir={module_name: module_name},
                package_data={module_name: other_files},
                include_package_data=True,
                entry_points={'console_scripts': console_script, },
                zip_safe=False,
                python_requires='>=2.7,<=3.0',
                )
After installing MyModule via 'python setup.py install', I cannot import from MyModule: 'from MyModule import MyApp' does not work. Importing the module directly does: 'import MyApp' works. The problem is that 'import foo' works as well, and I have multiple projects, each with a different foo.py.
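To make the collision concrete, here is a minimal sketch of what the interpreter sees after such an install (a hypothetical session; paths depend on your environment):
# MyApp.py and foo.py end up installed as top-level modules, not as members of
# a MyModule package, so they shadow any other project's foo.py.
import foo
print(foo.__file__)           # points at whichever project's foo.py was installed last

import MyApp                  # works
# from MyModule import MyApp  # fails with ImportError, as described above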
Sample 2:
If I could change the directory structure as shown below, the install would work correctly.
MyModule
├── MyModule
│ ├── foo.py
│ ├── __init__.py
│ ├── MyApp.ini
│ └── MyApp.py
└── setup.py
Is there a way to get sample 1, to install the way sample 2 does?

I was able to answer my own question. It can be done by setting package_dir one level up, as shown below. I had to use data_files rather than package_data to add my support files.
Limitation: the setup script, setup.py, is installed as part of the egg. I tried excluding it, but it gets installed anyway.
from setuptools import setup, find_packages

packages = find_packages(exclude=['ez_setup', 'tests', 'tests.*'])
console_script = list()
console_script.append('MyApp = MyModule.MyApp:main')
packages.append("MyModule")

setup(name="MyModule",
      version="1.0.0",
      packages=packages,
      package_dir={"MyModule": "../MyModule"},
      data_files=[('MyModule', ['MyApp.ini'])],
      include_package_data=True,
      entry_points={'console_scripts': console_script, },
      zip_safe=False,
      python_requires='>=2.7,<=3.0',
      )
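With this script in place, a quick sanity check after running python setup.py install from inside MyModule might look like this (a sketch; the exact site-packages path depends on your interpreter and install scheme):
# Verify that MyModule now installs as a proper package:
import MyModule
from MyModule import MyApp
print(MyModule.__file__)   # e.g. .../site-packages/MyModule/__init__.py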

Related

Python: Best way to get shared object library path from within a pypi package

I'm developing a Python package that includes an extension module:
# setup.py
from distutils.core import setup, Extension
setup(
    name="myPythonPkg",
    # ... all other args
    packages=["myPythonPkg"],
    ext_modules=[
        Extension('myFastCfunctions', ['myFastCfunctions.c'])
    ]
)
When I test the installation of this package with python setup.py install --prefix=$PWD/prefix I see (roughly):
<prefix>
└── lib
└── python3.10
└── site-packages
├── myFastCfunctions.cpython-310-x86_64-linux-gnu.so
├── myPythonPkg
│   ├── __init__.py
└── myPythonPkg-1.0.2-py3.10.egg-info
Inside myPythonPkg/__init__.py I'd like to get the path of myFastCfunctions.cpython-310-x86_64-linux-gnu.so and load it via ctypes.cdll.LoadLibrary. Of course I could paste that path in directly, but I was wondering if there is a smarter, more platform- and version-agnostic way of doing that.
Use EXTENSION_SUFFIXES[0], for example:
from ctypes import cdll
from importlib.machinery import EXTENSION_SUFFIXES
from os import path

# The extension module is installed one level above the package, so build its
# path relative to this __init__.py.
myFastCfunctions_file = path.join(
    path.dirname(__file__), '..',
    'myFastCfunctions{}'.format(EXTENSION_SUFFIXES[0])
)
myFastCfunctions = cdll.LoadLibrary(myFastCfunctions_file)
Link to docs.
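For reference, EXTENSION_SUFFIXES lists the filename endings the running interpreter accepts for extension modules, most specific first; a quick check (the values shown are just an example for CPython 3.10 on Linux and vary by platform and build):
from importlib.machinery import EXTENSION_SUFFIXES
print(EXTENSION_SUFFIXES)
# e.g. ['.cpython-310-x86_64-linux-gnu.so', '.abi3.so', '.so']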

Building a package referencing a package in a parent directory in setup.py

I am trying to build a pip package from source code in a Git repository that has multiple packages that share a common package. (I am not allowed to change the structure of this Git repository.)
The structure is:
├── common
│   ├── __init__.py
│   ├── run_helpers
│   │   ├── __init__.py
│   │   ├── aws.py
│   │   └── s3.py
└── components
    └── redshift_unload
        ├── redshift_unload
        │   ├── __init__.py
        │   └── run.py
        └── setup.py
My setup.py is as follows:
from setuptools import setup, find_packages

setup(
    ...
    packages=find_packages(),
    package_dir={"": "."},
    entry_points={
        "console_scripts": ["redshift_unload=redshift_unload.run:main"]
    }
)
Looking at other answers here, things I have tried so far include:
Specifying the actual package names in the packages= line instead of using find_packages().
Passing where="../../" to find_packages()
Using find_packages() + find_packages(where="../../") in the packages= line.
Everything I can think of in the package_dir line.
When I run pip install ., the package installs fine, but when I run the installed script I get:
# redshift_unload
Traceback (most recent call last):
File "/usr/local/bin/redshift_unload", line 5, in <module>
from redshift_unload.run import main
File "/usr/local/lib/python3.8/site-packages/redshift_unload/run.py", line 9, in <module>
from common._run_helpers.aws import get_boto3_session
ModuleNotFoundError: No module named 'common'
What did work:
If I move the common directory into components/redshift_unload, it works fine, but I can't do that. I also tried placing a symlink there instead, but that doesn't seem to work either.
Is there a way to make this work?
I believe I have found the best solution to this.
Based on this comment here, I concluded that what I am trying to do is not intended or supported.
However, I found that the following workaround works fine:
import os
from pathlib import Path
from shutil import rmtree, copytree
from setuptools import setup, find_packages

src_path = Path(os.environ["PWD"], "../../common")
dst_path = Path("./common")
copytree(src_path, dst_path)

setup(
    ...
    packages=find_packages(),
    package_dir={"": "."},
    entry_points={
        "console_scripts": ["redshift_unload=redshift_unload.run:main"]
    }
)

rmtree(dst_path)
The key insight is that, although packaging happens in a temporary directory, the value of os.environ["PWD"] is still available to the process. The common directory can therefore be copied (with shutil.copytree) into a location that find_packages() will see before setup() is called, and removed again afterwards (with shutil.rmtree).
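A slightly more defensive variant of the same idea (just a sketch, reusing the elided arguments from the script above) wraps the copy in try/finally so the temporary common directory is removed even if setup() raises:
import os
from pathlib import Path
from shutil import rmtree, copytree
from setuptools import setup, find_packages

src_path = Path(os.environ["PWD"], "../../common")
dst_path = Path("./common")

copytree(src_path, dst_path)
try:
    setup(
        ...
        packages=find_packages(),
        package_dir={"": "."},
        entry_points={
            "console_scripts": ["redshift_unload=redshift_unload.run:main"]
        }
    )
finally:
    rmtree(dst_path)   # clean up even when the build fails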

Multiple subpackages within one package

I am trying to write a package with the following structure
/package
    setup.py
    /subpackage1
        subpackage1.py
        __init__.py
    /subpackage2
        subpackage2.py
        __init__.py
    /utils
        some_other_files_and_codes
        __init__.py
My setup.py currently looks like this:
from setuptools import setup, find_packages

setup(
    name='subpackage1',
    version='1.0',
    install_requires=['numpy',
                      'scipy'],
    packages=find_packages(),
)
I then install it using pip install -e . from the /package folder.
However, I am not able to import subpackage2, only subpackage1.
I would like to be able to import them as
from package import subpackage1
from package import subpackage2
This is important because subpackage1 and subpackage2 exist as standalone packages too on my system.
Could someone help me with this?
The snippets you are showing do not make sense as they are. It looks like there's a misunderstanding, in particular confusion between the name of the Python project and the names of the top-level importable packages.
In the setuptools.setup() call, the value passed to the name argument should be the name of the project, not the name of an importable top-level package. The two can be the same, but they don't have to be.
The following might make it more explicit:
MyPythonProject
├── my_importable_package_one
│   ├── __init__.py
│   └── my_module_foo.py
├── my_importable_package_two
│   ├── __init__.py
│   └── my_module_bar.py
└── setup.py
setup.py
import setuptools

setuptools.setup(
    name='MyPythonProject',
    version='1.2.3',
    packages=['my_importable_package_one', 'my_importable_package_two'],
    # ...
)
so that, once the project is installed, its modules are imported like this:
from my_importable_package_one import my_module_foo
from my_importable_package_two import my_module_bar
Maybe this article on the terminology of Python packaging might help.
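A small, hedged illustration of that distinction: pip and the packaging metadata see the project name, while import sees the package names (importlib.metadata requires Python 3.8+):
# Hypothetical session after "pip install ." in MyPythonProject:
from importlib.metadata import version
print(version("MyPythonProject"))        # looked up by *project* name

import my_importable_package_one         # imported by *package* name
from my_importable_package_two import my_module_bar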

setup.py - copy external files in namespace directory

I want to install a package with the following structure:
├── nsp
│ ├── __init__.pxd
│ └── A
| ├── b.py
| ├── extension.pxd
| ├── extension.pyx
│ └── __init__.py
└── setup.py
where nsp is a namespace package (motivation: see here).
My setup script reads as follows:
from setuptools import setup
from setuptools.extension import Extension

# factory function
def my_build_ext(pars):
    # import delayed:
    from setuptools.command.build_ext import build_ext as _build_ext

    # include_dirs adjusted:
    class build_ext(_build_ext):
        def finalize_options(self):
            _build_ext.finalize_options(self)
            # Prevent numpy from thinking it is still in its setup process:
            __builtins__.__NUMPY_SETUP__ = False
            import numpy
            self.include_dirs.append(numpy.get_include())

    # object returned:
    return build_ext(pars)

extensions = [Extension('nsp.A.extension', ['nsp/A/extension.cpp'])]

setup(
    cmdclass={'build_ext': my_build_ext},
    setup_requires=['numpy'],
    install_requires=['numpy'],
    packages=['nsp.A'],
    ext_modules=extensions,
    package_data={
        '': ['*.pxd', '*.pyx']
    },
)
As I understand it, package_data describes all non-Python files that should be included. I want to reproduce the exact file tree shown above in the installation directory. Unfortunately, __init__.pxd is not copied. How can I achieve that?
I am working with Python 3.7 on Windows 10 x64.
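For what it's worth, here is a sketch of how package_data is keyed (an illustration of the mechanism, not a verified fix for this setup): patterns are matched per package, and only for packages listed in packages, so with packages=['nsp.A'] the '' entry only covers files under nsp/A/.
# Illustration (assumption: packages=['nsp.A'] as in the setup script above).
# The '' key applies its patterns to every listed package, so it is effectively:
package_data = {
    'nsp.A': ['*.pxd', '*.pyx'],   # matches nsp/A/extension.pxd, nsp/A/extension.pyx
}
# nsp/__init__.pxd sits in the nsp directory itself, which is not a listed
# package, so no package_data pattern ever reaches it.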

setup.py sdist exclude packages in subdirectory

I have the following project structure I would like to package:
├── doc
│   └── source
├── src
│   ├── core
│   │   ├── config
│   │   │   └── log.tmpl
│   │   └── job
│   ├── scripts
│   └── test
└── tools
I would like to package core under src but exclude test. Here is what I tried unsuccessfully:
setup(name='core',
      version=version,
      package_dir={'': 'src'},  # Our packages live under src but src is not a package itself
      packages=find_packages("src", exclude=["test"]),  # I also tried exclude=["src/test"]
      install_requires=['xmltodict==0.9.0',
                        'pymongo==2.7.2',
                        'ftputil==3.1',
                        'psutil==2.1.1',
                        'suds==0.4',
                        ],
      include_package_data=True,
      )
I know I can exclude test using the MANIFEST.in file, but I would be happy if you could show me how to do this with setup and find_packages.
Update:
After some more playing around, I realized that building the package with python setup.py install does what I expected (that is, it excludes test). However, running python setup.py sdist includes everything (that is, it ignores my exclude directive). I don't know whether this is a bug or a feature, but it is still possible to exclude files from the sdist using MANIFEST.in.
find_packages("src", exclude=["test"]) works.
The trick is to remove stale files such as the core.egg-info directory. In your case, you need to remove src/core.egg-info.
Here's setup.py I've used:
from setuptools import setup, find_packages

setup(name='core',
      version='0.1',
      package_dir={'': 'src'},
      packages=find_packages("src", exclude=["test"]),  # <- test is excluded
      ####packages=find_packages("src"),                # <- test is included
      author='J.R. Hacker',
      author_email='jr@example.com',
      url='http://stackoverflow.com/q/26545668/4279',
      package_data={'core': ['config/*.tmpl']},
      )
To create the distributions, run:
$ python setup.py sdist bdist bdist_wheel
To enable the latter command, run: pip install wheel.
I've inspected the created files. They do not contain test, but they do contain the core/__init__.py and core/config/log.tmpl files.
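To double-check the sdist contents yourself, you can list the archive with the standard library (the filename here is an assumption based on name='core', version='0.1'):
# List everything that ended up in the source distribution:
import tarfile
with tarfile.open("dist/core-0.1.tar.gz") as sdist:
    for member in sdist.getnames():
        print(member)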
In your MANIFEST.in at the project root, add
prune src/test/
then build the package with python setup.py sdist
I would probably just use wildcards, as described in the find_packages documentation. *test* or *tests* is something I tend to use, since we only give test files names containing the word test. Simple and easy ^-^.
setup(name='core',
      version=version,
      package_dir={'': 'src'},  # Our packages live under src but src is not a package itself
      packages=find_packages("src", exclude=['*tests*']),  # I just use a wild card. Works perfectly ^-^
      install_requires=['xmltodict==0.9.0',
                        'pymongo==2.7.2',
                        'ftputil==3.1',
                        'psutil==2.1.1',
                        'suds==0.4',
                        ],
      include_package_data=True,
      )
FYI:
I would also recommend adding the following to your .gitignore:
build
dist
pybueno.egg-info
And move the build and publish steps (to PyPI or your private repository) into CI/CD to keep the whole setup clean and neat.
Assuming that your folder is called tests and not test, it should work with the following code:
setup(name='core',
      version=version,
      package_dir={'': 'src'},  # Our packages live under src but src is not a package itself
      packages=find_packages('src', exclude=['tests']),
      install_requires=['xmltodict==0.9.0',
                        'pymongo==2.7.2',
                        'ftputil==3.1',
                        'psutil==2.1.1',
                        'suds==0.4',
                        ],
      include_package_data=True,
      )
