In my Python project with the directory layout
.
├── justfile
├── pyproject.toml
├── README.md
├── setup.cfg
└── src
    └── foobar
        ├── __about__.py
        ├── __init__.py
        └── main.py
__about__.py reads
__version__ = "1.0.0"
I would like to use this version info in setup.cfg. I tried
[metadata]
name = foobar
version = attr: foobar.__about__.__version__

[options]
package_dir =
    =src
packages = find:
install_requires =
    rich
python_requires = >=3.6

[options.packages.find]
where = src
but this still tries to import foobar, resulting in a ModuleNotFoundError for foobar's dependencies when evaluating the version string.
The __init__.py reads
from .main import solve
from .__about__ import __version__
I had been under the impression that, after this PR was merged, only the AST of the attr: target is evaluated. (See also this question.)
Any idea what might be wrong?
src is not part of the package import path (you wouldn't do import src.foobar with your installed package); it's just a directory.
Try
version = attr:foobar.__about__.__version__
instead (assuming you've set up the src/ layout in setup.cfg).
I found the issue. In __init__.py, the __version__ must be imported before all external dependencies. This
from .__about__ import __version__
from .main import solve
works.
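For reference, the kind of static evaluation attr: performs can be sketched with the stdlib ast module. This is a simplified illustration of the idea, not setuptools' actual implementation; read_version and the temp-file demo are invented for this sketch:

```python
# Simplified illustration of reading __version__ without importing the
# module -- roughly the idea behind setuptools' static `attr:` evaluation
# (not the actual setuptools implementation).
import ast
import os
import tempfile

def read_version(path):
    with open(path) as f:
        tree = ast.parse(f.read())
    # Walk top-level statements looking for `__version__ = <literal>`
    for node in tree.body:
        if isinstance(node, ast.Assign):
            for target in node.targets:
                if isinstance(target, ast.Name) and target.id == "__version__":
                    return ast.literal_eval(node.value)
    return None

# Demo with a stand-in __about__.py written to a temp directory
about = os.path.join(tempfile.mkdtemp(), "__about__.py")
with open(about, "w") as f:
    f.write('__version__ = "1.0.0"\n')
print(read_version(about))  # -> 1.0.0
```

Because the value is read from the parsed source rather than imported, no third-party dependency ever needs to be installed at build time.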
Related
I'm developing a Python package that includes an extension module:
# setup.py
from distutils.core import setup, Extension

setup(
    name="myPythonPkg",
    # ... all other args
    packages=["myPythonPkg"],
    ext_modules=[
        Extension('myFastCfunctions', ['myFastCfunctions.c'])
    ]
)
When I test the installation of this package with python setup.py install --prefix=$PWD/prefix I see (roughly):
<prefix>
└── lib
    └── python3.10
        └── site-packages
            ├── myFastCfunctions.cpython-310-x86_64-linux-gnu.so
            ├── myPythonPkg
            │   ├── __init__.py
            └── myPythonPkg-1.0.2-py3.10.egg-info
Inside myPythonPkg/__init__.py I'd like to get the path of myFastCfunctions.cpython-310-x86_64-linux-gnu.so and load it via ctypes.cdll.LoadLibrary. Of course I could paste that path in directly, but I was wondering if there is a smarter, more platform- and version-agnostic way of doing that.
Use EXTENSION_SUFFIXES[0], for example:
from ctypes import cdll
from importlib.machinery import EXTENSION_SUFFIXES
from os import path

# the extension module is installed one level above the package directory
myFastCfunctions_file = path.join(
    path.dirname(__file__), '..',
    'myFastCfunctions{}'.format(EXTENSION_SUFFIXES[0])
)
myFastCfunctions = cdll.LoadLibrary(myFastCfunctions_file)
Link to docs.
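To see what EXTENSION_SUFFIXES contains on a given interpreter (the values are platform- and version-specific; the ones in the comment below are just examples):

```python
from importlib.machinery import EXTENSION_SUFFIXES

# On CPython 3.10 / Linux this is typically something like
# ['.cpython-310-x86_64-linux-gnu.so', '.abi3.so', '.so'];
# EXTENSION_SUFFIXES[0] is the most specific tag for the running interpreter,
# which is why it matches the filename the build produced.
print(EXTENSION_SUFFIXES)
print('myFastCfunctions{}'.format(EXTENSION_SUFFIXES[0]))
```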
I am packaging a personal Python project and uploading it to pypi. It has the following structure:
root
├── package
│   ├── __init__.py
│   ├── foo.py
│   └── subpackage
│       ├── bar.py
│       └── __init__.py
├── setup.py
└── test.py
Now, foo.py has a function called public_function that depends on bar.py's bar_function, so I am structuring my code as:
# package/foo.py
from package.subpackage import bar_function

def public_function():
    return bar_function()

# package/subpackage/__init__.py
from .bar import bar_function
My goal is to let people call public_function while hiding all its dependencies like this:
import package
package.bar_function()
So package/__init__.py looks like:
from .subpackage import bar_function
This is working correctly when importing from test.py, but when building using setuptools, things start getting weird.
After checking these similar questions (one, two and three), these are the two approaches I have tried, but neither works as I want:
Approach 1
As I don't want to expose subpackage, I tried:
# package/setup.py
from setuptools import setup
setup(
name='package',
packages=['package']
)
But when I do import package (after uploading to pypi and running pip install package), I get the following error:
File /venv/lib/python3.8/site-packages/package/foo.py:1
----> 1 from package.subpackage import bar_function
ImportError: cannot import name 'bar_function' from 'package.subpackage' (unknown location)
(I also tried with a relative import from .subpackage ... but didn't work)
Approach 2
As it looked like the subpackages are not making it to the build, I tried:
# package/setup.py
from setuptools import setup, find_packages
setup(
name='package',
packages=find_packages()
)
Which doesn't yield the error anymore, but now I end up exposing everything else (i.e. I can do import foo and import subpackage), which I don't want.
What should be the right way to set up my package? Is it even possible to achieve what I want in Python?
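The thread leaves this one unanswered. One common approach, a sketch on my part rather than an answer from the thread, is to list the packages explicitly, so the subpackage ships with the distribution while nothing else is installed as a top-level name:

```python
# setup.py -- sketch, not from the thread: list subpackages explicitly
from setuptools import setup

setup(
    name='package',
    # Approach 1 failed because packages=['package'] ships only package/
    # itself, not package/subpackage/ -- every subpackage must be listed
    # (or discovered). The explicit list below installs exactly the names
    # `package` and `package.subpackage` and nothing at the top level.
    packages=['package', 'package.subpackage'],
)
```

Note that `import package.subpackage` will still work for anyone who goes looking; Python has no real mechanism for hiding installed subpackages, only for keeping them out of the documented API.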
I am trying to write a package with the following structure
/package
    setup.py
    /subpackage1
        subpackage1.py
        __init__.py
    /subpackage2
        subpackage2.py
        __init__.py
    /utils
        some_other_files_and_codes
        __init__.py
My setup.py currently looks like this:
from setuptools import setup, find_packages

setup(
    name='subpackage1',
    version='1.0',
    install_requires=['numpy',
                      'scipy'],
    packages=find_packages(),
)
I then install it using pip install -e . from the /package folder.
However, I am not able to import subpackage2, only subpackage1.
I would like to be able to import them as
from package import subpackage1
from package import subpackage2
This is important because subpackage1 and subpackage2 exist as standalone packages too on my system.
Could someone help me with this?
The snippets you are showing do not make sense. It looks like there is a misunderstanding; in particular, there is probably confusion between the name of the Python project and the names of the top-level importable packages.
In the setuptools.setup() function call, the value of the name argument should be the name of the project, not the name of an importable top-level package. They can be the same name, but not necessarily.
The following might make it more explicit:
MyPythonProject
├── my_importable_package_one
│   ├── __init__.py
│   └── my_module_foo.py
├── my_importable_package_two
│   ├── __init__.py
│   └── my_module_bar.py
└── setup.py
setup.py
import setuptools

setuptools.setup(
    name='MyPythonProject',
    version='1.2.3',
    packages=['my_importable_package_one', 'my_importable_package_two'],
    # ...
)
from my_importable_package_one import my_module_foo
from my_importable_package_two import my_module_bar
Maybe this article on the terminology of Python packaging might help.
I have several repos that I want to namespace. All of the repos follow the standard Python folder structure:
repo1
└── repo1
    └── __init__.py
The outermost repo1 folder is the repository root and the inner repo1 folder is the root of the module. All of these repos will be installed using
pip install -e .
Currently, import statements like the following are used to import these modules.
import repo1
import repo2
import repo3
Is there a way to namespace these modules so that I can have
import mymodule.repo1
import mymodule.repo2
import mymodule.repo3
I have to achieve the namespacing while keeping the repos separate. Merging the repos is not an option at this moment.
Implementation details depend on your needs for version support and distribution, but take a look at setuptools namespace_packages; this will do the work.
As pointed out above, the packaging site has a useful page on namespaced packaging.
An example for native namespace packages (Python >= 3.3). Project layout for isolated repos:
project_root1
├── finance_namespace    # no __init__.py file here, this is important
│   └── repo1
│       ├── __init__.py
│       └── module1.py
└── setup.py
===============================
# setup.py
import setuptools

setuptools.setup(
    name='repo1',
    version='1',
    description='',
    long_description='',
    author='Big bank',
    author_email='john@bank.com',
    license='MIT',
    packages=['finance_namespace.repo1'],
    zip_safe=False,
)
Now, after running cd project_root1 && pip install -e ., you should be able to do
>>> from finance_namespace.repo1 import module1
>>> module1.func()
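The mechanism above can be exercised without pip at all, since native namespace packages (PEP 420) only require that each sys.path root contain a finance_namespace/ directory with no __init__.py. A self-contained sketch, with directory and attribute names invented for illustration:

```python
# Sketch: two independent project roots contributing to one namespace
# package (PEP 420). No __init__.py is created in finance_namespace/.
import os
import sys
import tempfile

root = tempfile.mkdtemp()
for project, repo in [("project_root1", "repo1"), ("project_root2", "repo2")]:
    pkg = os.path.join(root, project, "finance_namespace", repo)
    os.makedirs(pkg)
    # __init__.py only in the leaf package, never in finance_namespace/
    with open(os.path.join(pkg, "__init__.py"), "w") as f:
        f.write("name = {!r}\n".format(repo))
    sys.path.insert(0, os.path.join(root, project))

# Both portions are stitched into the single finance_namespace package
from finance_namespace import repo1, repo2

print(repo1.name, repo2.name)  # -> repo1 repo2
```

Installing each repo with pip install -e . achieves the same effect as the sys.path manipulation here: each install puts its own project root on the path, and the import system merges the finance_namespace portions.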
I am trying to create a setup.py for an existing project. The project has a directory structure that I cannot change. I need my setup.py to be in the same folder as my project source files.
Sample 1: directory structure.
MyModule
├── __init__.py
├── MyApp.ini
├── MyApp.py
├── setup.py
└── foo.py
This is my setup.py:
from setuptools import setup, find_packages

packages = find_packages(exclude=['ez_setup', 'tests', 'tests.*'])

console_script = list()
console_script.append('MyApp = MyApp:main')

py_modules = list()
py_modules.append('MyApp')

other_files = list()
other_files.append('MyApp.ini')

module_name = "MyModule"

mysetup = setup(name=module_name,
                py_modules=py_modules,
                version="1.0.0",
                packages=packages,
                package_dir={module_name: module_name},
                package_data={module_name: other_files},
                include_package_data=True,
                entry_points={'console_scripts': console_script, },
                zip_safe=False,
                python_requires='>=2.7,<=3.0',
                )
After installing MyModule via python setup.py install, I cannot import from MyModule: from MyModule import MyApp does not work. I can import directly: import MyApp works. The problem is that import foo works as well, and I have multiple projects with different foo.py files.
Sample 2:
If I could change the directory structure as shown below, the install would work correctly.
MyModule
├── MyModule
│   ├── foo.py
│   ├── __init__.py
│   ├── MyApp.ini
│   └── MyApp.py
└── setup.py
Is there a way to get sample 1, to install the way sample 2 does?
I was able to answer my own question. It can be done by setting package_dir up one level, as shown below. I had to use data_files rather than package_data to add my support files.
Limitation: the setup script, setup.py, is installed as part of the egg. I tried excluding it, but it gets installed anyway.
from setuptools import setup, find_packages

packages = find_packages(exclude=['ez_setup', 'tests', 'tests.*'])

console_script = list()
console_script.append('MyApp = MyModule.MyApp:main')

packages.append("MyModule")

setup(name="MyModule",
      version="1.0.0",
      packages=packages,
      package_dir={"MyModule": "../MyModule"},
      data_files=[('MyModule', ['MyApp.ini'])],
      include_package_data=True,
      entry_points={'console_scripts': console_script, },
      zip_safe=False,
      python_requires='>=2.7,<=3.0',
      )