How to run a Python-powered CLI like a normal system CLI? - python

I want to run my built CLI like other CLI tools, e.g. kubectl, redis, etc. Currently, I run my CLI as python3 cli.py subarg --args; instead, I want to run invdb subarg --args, where invdb is the Python package.
The structure of the project repository is:
.
├── CHALLENGE.md
├── Pipfile
├── Pipfile.lock
├── README.md
├── __pycache__
│   └── config.cpython-38.pyc
├── data_platform_challenge_darwin
├── data_platform_challenge_linux
├── data_platform_challenge_windows
├── discussion_answers_rough_work
├── dist
│   ├── invdb-0.0.1.tar.gz
│   └── invdb-tesla-kebab-mai-haddi-0.0.1.tar.gz
├── example.json
├── invdb
│   ├── __init__.py
│   ├── analysis.py
│   ├── cleanup.py
│   ├── cli.py
│   ├── config.py
│   ├── etl.py
│   ├── groups.py
│   ├── initialize_db.py
│   └── nodes.py
├── invdb.egg-info
│   ├── PKG-INFO
│   ├── SOURCES.txt
│   ├── dependency_links.txt
│   └── top_level.txt
├── setup.py
├── test.db
└── tests

setuptools (or is it distutils? The line is so blurry) provides an entry_points.console_scripts option that can do this for you when installing your package. I will provide an example repository at the bottom of my summary.
Construct a project tree like so:
# /my_package/my_module.py
def main():
    print("We did it!")
# /pyproject.toml
[build-system]
requires = ["setuptools", "wheel"]
build-backend = "setuptools.build_meta"
# this is required for Python to recognize setuptools as the build backend
# /setup.cfg
[metadata]
name = sample_module
version = 0.0.1
author = Adam Smith
description = console_script example

[bdist_wheel]
universal = true

[options]
packages = my_package
python_requires = >=2.7

[options.entry_points]
console_scripts =
    sample_module = my_package.my_module:main
then run the following at the shell:
$ python3 -m pip install .
(ed.: this will install the package locally. To build a wheel you can install elsewhere, try python3 -m build from the build project, the successor to pep517's build feature.)
If you get a warning that the script's install directory is not on your PATH, you should consider adding it. Otherwise, just run your new script:
$ sample_module
We did it!
GitLab: nottheeconomist/console_script_example
Since you already have a setup.py, consider adding the following entry to your setuptools.setup call:
# ...
setuptools.setup(
    # ...
    entry_points={
        'console_scripts': ['sample_module=my_package.my_module:main'],
    },
)
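Applied to the asker's invdb project, and assuming invdb/cli.py exposes a main() function (a hypothetical name; use whatever function actually drives the CLI), the setup call could look like this sketch:

```python
# setup.py — sketch for the asker's invdb package. Assumes invdb/cli.py
# defines a main() function; adjust the "invdb.cli:main" target if it doesn't.
import setuptools

dist = setuptools.setup(
    name="invdb",
    version="0.0.1",
    packages=["invdb"],
    entry_points={
        "console_scripts": ["invdb=invdb.cli:main"],
    },
    # script_args lets this sketch run standalone; drop it in a real setup.py
    script_args=["--name"],
)
```

After pip install ., the installer generates an invdb executable that calls invdb.cli.main(), so invdb subarg --args works like any other system CLI.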

Related

Error: package directory XYZ does not exist

I have a directory with following structure,
.
├── awesome
│   ├── alice
│   │   ├── conf.py
│   │   └── __init__.py
│   ├── bob
│   │   ├── conf.py
│   │   └── __init__.py
│   ├── conf.py
│   ├── __init__.py
│   └── john
│       ├── conf.py
│       └── __init__.py
├── not_awesome_1
│   ├── __init__.py
│   └── something.py
├── not_awesome_2
│   ├── __init__.py
│   └── something.py
└── setup.py
I want to make the awesome package shippable, so I made the setup.py below:
from setuptools import find_packages, setup

setup(
    name="my-awesome-package",
    version="0.1.0",
    description="",
    long_description="",
    license="BSD",
    packages=find_packages(where="awesome"),
    include_package_data=True,
    author="JPG",
    author_email="foo#gmail.com",
    install_requires=[],
)
I ran the command python3 setup.py bdist_wheel and it gave me the result
running bdist_wheel
running build
running build_py
error: package directory 'alice' does not exist
What was I trying to achieve?
I wanted to decouple the awesome package and wanted to reuse it in multiple projects as I'm currently using the same in not_awesome_1 or not_awesome_2 packages.
In other words, after a successful installation of my-awesome-package I should be able to use the awesome package as
from awesome.alice.conf import Alice
alice = Alice()
What have I tried so far?
Replaced packages=find_packages(where="awesome") with packages=find_packages(), but during the build this also includes the not_awesome_X packages, which is not intended.
Introduced package_dir as well:
setup(
    # other options
    packages=find_packages(where="awesome"),
    package_dir={"": "awesome"},
)
But this doesn't let me import my packages as from awesome.alice.conf import Alice; I have to use from alice.conf import Alice instead (i.e., the awesome prefix is missing).
Questions?
What was I doing wrong here?
How to properly configure packages and package_dir?
I encountered a similar error. Try manually defining both the top-level package and the sub-packages:
packages=["awesome", "awesome.alice", "awesome.bob", "awesome.john"].
Edit:
The issue is that the where kwarg defines the directory to search in. Since you have packages in the project root that should not be bundled, you'll need to manually prepend the parent package's name to each of its sub-packages.
from setuptools import find_packages

if __name__ == "__main__":
    print(find_packages(where="awesome"))
    # ['bob', 'alice', 'john', 'john.child']
    # the problem here is that 'awesome' is the search root, so the names
    # come back without the 'awesome' prefix
    root_package = "awesome"
    print([root_package] + [f"{root_package}.{item}" for item in find_packages(where=root_package)])
    # ['awesome', 'awesome.bob', 'awesome.alice', 'awesome.john', 'awesome.john.child']
Then, in your setup.py:
...
root_package = "awesome"
...
setup(
    # other options
    packages=[root_package] + [f"{root_package}.{item}" for item in find_packages(where=root_package)],
    # package_dir={"": "awesome"} <- not needed
)
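An alternative not mentioned in the answer: find_packages also accepts an include filter, which keeps the top-level prefix while excluding the not_awesome_* packages. A minimal reproduction, rebuilding the question's layout in a temporary directory:

```python
import os
import tempfile

from setuptools import find_packages

# Recreate the question's layout, then search from the project root with an
# include filter instead of where="awesome".
with tempfile.TemporaryDirectory() as root:
    for pkg in ("awesome", "awesome/alice", "awesome/bob", "awesome/john",
                "not_awesome_1", "not_awesome_2"):
        os.makedirs(os.path.join(root, pkg))
        open(os.path.join(root, pkg, "__init__.py"), "w").close()
    found = sorted(find_packages(where=root, include=["awesome", "awesome.*"]))
    print(found)  # ['awesome', 'awesome.alice', 'awesome.bob', 'awesome.john']
```

In setup.py this would just be packages=find_packages(include=["awesome", "awesome.*"]), with no package_dir and no manual prefixing.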

How to install python package namespace from private bitbucket-git repository

I have several related projects that I think will be a good fit for Python's namespace-packages. I'm currently running python 3.8, and have created the following directory structure for testing.
├── namespace-package-test.package1
│   ├── LICENSE.txt
│   ├── README.md
│   ├── setup.cfg
│   ├── setup.py
│   ├── src
│   │   └── pkg1
│   │       ├── cli
│   │       │   ├── __init__.py
│   │       │   └── pkg1_cli.py
│   │       └── __init__.py
│   └── tests
├── namespace-package-test.package2
│   ├── AUTHORS.rst
│   ├── CHANGELOG.rst
│   ├── LICENSE.txt
│   ├── README.md
│   ├── setup.cfg
│   ├── setup.py
│   ├── src
│   │   └── pkg2
│   │       ├── cli
│   │       │   ├── __init__.py
│   │       │   └── pkg2_cli.py
│   │       └── __init__.py
│   └── tests
The entire project is in a private Bitbucket (cloud) repository at:
git@bitbucket.org:<my-company>/namespace-package-test.git
I would like to install, locally, only package 1. I've tried every iteration I can imagine of the following, but nothing gets me there: I either get a repository-not-found error or a setup.py-not-found error.
pip install git+ssh://git@bitbucket.org:<my-company>/namespace-package-test.package1.git
Is this possible?
Is my project structure correct for what I am doing?
What should the pip install command look like?
Bonus: what if I only want to install a specific spec using pipx?
pipx install "namespace-package-test.package1[cli] @ git+ssh://git@bitbucket.org:<my-company>/namespace-package-test.package1.git"
I think I figured it out ... for posterity's sake:
Pip install (into a virtual environment):
pip install git+ssh://git@bitbucket.org/<company name>/namespace-package-test.git/#subdirectory=namespace-package-test.package1
pipx install, with a spec:
pipx install "namespace-package-test.package1[cli] @ git+ssh://git@bitbucket.org/<company name>/namespace-package-test.git/#subdirectory=namespace-package-test.package1"

Dynamically copying package data during install time in setup.py

I have a directory structure which looks like below :
.
├── bitbucket-pipelines.yml
├── MANIFEST.in
├── pylintrc
├── setup.cfg
├── setup.py
├── src
│   ├── bin
│   │   ├── __init__.py
│   │   └── project.py
│   ├── __init__.py
│   └── ml_project
│       ├── configurations
│       │   └── precommit
│       ├── core
│       │   ├── command
│       │   │   ├── abs_command.py
│       │   │   ├── __init__.py
│       │   │   ├── no_command.py
│       │   │   ├── precommit.py
│       │   │   ├── project_utils.py
│       │   │   ├── setupsrc.py
│       │   │   └── setuptox.py
│       │   ├── configurations
│       │   │   └── precommit
│       │   └── __init__.py
│       └── __init__.py
└── tox.ini
When I do the packaging for the project, my requirement is to copy the .gitlint and .pre-commit-config.yaml files into the configurations/precommit folder of my ml_project package. configurations is just a normal directory, not a Python package, as it contains no .py files.
A small edit: the .gitlint and .pre-commit-config.yaml files are at the same level as setup.py.
My setup.py looks like below:
"""Setup script."""
import io
import os
import re
import shutil

from setuptools import setup

PROJECT_NAME = "ml_project"
CONFIGURATIONS_DIR_NAME = "configurations"
FULL_CONFIG_DIR = os.path.join("src", PROJECT_NAME, CONFIGURATIONS_DIR_NAME)


def get_version() -> str:
    """Return the version stored in `ml_project/__init__.py:__version__`."""
    # see https://github.com/pallets/flask/blob/master/setup.py
    with io.open("src/ml_project/__init__.py", "rt", encoding="utf8") as init_file:
        return re.search(r'__version__ = "(.*?)"', init_file.read()).group(1)


def add_config_files_for_package(source_dir: str = None) -> None:
    if not source_dir:
        source_dir = os.path.dirname(os.path.abspath(__file__))
    config_files = {"precommit": [".gitlint", ".pre-commit-config.yaml"]}
    for config in config_files:
        config_dir = os.path.join(source_dir, FULL_CONFIG_DIR, config)
        for file in config_files[config]:
            shutil.copyfile(
                os.path.join(source_dir, file), os.path.join(config_dir, file)
            )


add_config_files_for_package()
setup(version=get_version())
So I am using the add_config_files_for_package function to do the copying when I run python setup.py sdist.
I have a MANIFEST.in file which looks like below:
include .gitlint
include .pre-commit-config.yaml
graft src/ml_project
And finally, below is my setup.cfg:
[options]
package_dir =
    =src
packages = find:
include_package_data = true
install_requires =
    click
    pre-commit
    pyyaml
    gitlint

[options.packages.find]
where = src

[options.entry_points]
console_scripts =
    project = bin.project:main

[options.extras_require]
tests =
    pytest
    pytest-mock
    pyfakefs
    pyyaml
    configparser
linting =
    pylint
testdocs =
    pydocstyle
pre-commit =
    pre-commit

[semantic_release]
version_variable = ml_project/__init__.py:__version__
This runs fine, but my question is: is there a better, more standard way of doing this, ideally without writing the function at all?
Thanks for any pointers in advance.
As mentioned in the comments, it could be a good idea to place these files in the src/ml_project/configurations/precommit directory and create symbolic links to them at the root of the project. Symbolic links play well with git, but some platforms (Windows, for example) don't support them well.
Alternatively, copying these files could be just another step in the build process (possibly via a custom setuptools command), triggered from a Makefile (for example, or any similar tool) and from the CI/CD toolchain (bitbucket-pipelines.yml in this case).

ModuleNotFoundError with package installed from github

I installed a package in my anaconda environment by entering the following line:
pip3 install -e git+https://github.com/gauravmm/jupyter-testing.git#egg=jupyter-testing
I keep getting ModuleNotFoundError: No module named 'testing' on line: from testing.testing import test. I have no idea why this is happening, and believe it has something to do with the way my directory structure is set up.
My directory tree looks like this:
├── hw1_get_started.ipynb
├── requirements.txt
└── src
    └── jupyter-testing
        ├── jupyter_testing.egg-info
        │   ├── dependency_links.txt
        │   ├── PKG-INFO
        │   ├── SOURCES.txt
        │   └── top_level.txt
        ├── LICENSE
        ├── README.md
        ├── setup.py
        └── testing
            ├── __init__.py
            └── testing.py
I am trying to use this module, https://github.com/gauravmm/jupyter-testing.git#egg=jupyter-testing, to do some testing for an online class.
I appreciate any help and explanation as to what I am doing wrong! :)

Right way to set python package with sub-packages

I am trying to set up a package with sub-packages in Python. Here is the tree structure that I have at the moment:
myPackage
├── __init__.py
├── mySubPackage1
│   ├── foo2.py
│   ├── foo.py
│   └── __init__.py
├── mySubPackage2
│   ├── bar2.py
│   ├── bar.py
│   └── __init__.py
└── setup.py
All __init__.py are empty. Here is my setup.py:
from distutils.core import setup

if __name__ == "__main__":
    setup(
        name='myPackage',
        package_dir={
            'mySubPackage1': 'mySubPackage1',
            'mySubPackage2': 'mySubPackage2'},
        packages=['mySubPackage1', 'mySubPackage2'],
    )
The problem is that, when I run python setup.py install from myPackage, the sub-packages are installed directly into dist-packages:
/usr/local/lib/python2.7/dist-packages/mySubPackage1
/usr/local/lib/python2.7/dist-packages/mySubPackage2
I guess the problem is my setup.py, but I don't know how to fix it. Should setup.py be in the parent directory of myPackage? If so, how does it work when I pack the package into a zip using python setup.py sdist?
Just use setuptools instead of distutils; it has find_packages for exactly this purpose:
from setuptools import setup, find_packages

setup(
    name='myPackage',
    packages=find_packages(),
)
TL;DR: Nest the package inside a project directory with the same name.
I nested the top-level package myPackage inside a directory of the same name, as follows:
myPackage
├── myPackage
│   ├── __init__.py
│   ├── mySubPackage1
│   │   ├── foo1.py
│   │   ├── foo2.py
│   │   └── __init__.py
│   └── mySubPackage2
│       ├── bar1.py
│       ├── bar2.py
│       └── __init__.py
└── setup.py
Then, I updated the setup.py:
from distutils.core import setup

if __name__ == "__main__":
    setup(
        name='myPackage',
        package_dir={
            'myPackage': 'myPackage',
            'myPackage.mySubPackage1': 'myPackage/mySubPackage1',
            'myPackage.mySubPackage2': 'myPackage/mySubPackage2'},
        packages=['myPackage', 'myPackage.mySubPackage1',
                  'myPackage.mySubPackage2'],
    )
Now, sudo python setup.py install behaves as I expect and in dist-packages I have the following structure:
myPackage
├── __init__.py
├── __init__.pyc
├── mySubPackage1
│   ├── foo1.py
│   ├── foo1.pyc
│   ├── foo2.py
│   ├── foo2.pyc
│   ├── __init__.py
│   └── __init__.pyc
└── mySubPackage2
    ├── bar1.py
    ├── bar1.pyc
    ├── bar2.py
    ├── bar2.pyc
    ├── __init__.py
    └── __init__.pyc
and an egg file.
This is almost good, but it is not platform-independent because of the use of / in paths. To fix this, I edited setup.py as follows:
from distutils.core import setup
from distutils import util

if __name__ == "__main__":
    pathMySubPackage1 = util.convert_path('myPackage/mySubPackage1')
    pathMySubPackage2 = util.convert_path('myPackage/mySubPackage2')
    setup(
        name='myPackage',
        package_dir={
            'myPackage': 'myPackage',
            'myPackage.mySubPackage1': pathMySubPackage1,
            'myPackage.mySubPackage2': pathMySubPackage2},
        packages=['myPackage', 'myPackage.mySubPackage1',
                  'myPackage.mySubPackage2'],
    )
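With the nested layout in place, the explicit package_dir/packages bookkeeping can be replaced by setuptools' find_packages, which discovers the same dotted package names automatically (so convert_path isn't needed either). A sketch, not part of the original answer, that reproduces the layout in a temporary directory and shows what find_packages returns:

```python
import os
import tempfile

from setuptools import find_packages

# Recreate the nested layout and let find_packages discover the packages,
# instead of spelling out package_dir and packages by hand.
with tempfile.TemporaryDirectory() as root:
    for pkg in ("myPackage", "myPackage/mySubPackage1", "myPackage/mySubPackage2"):
        os.makedirs(os.path.join(root, pkg))
        open(os.path.join(root, pkg, "__init__.py"), "w").close()
    pkgs = sorted(find_packages(where=root))
    print(pkgs)  # ['myPackage', 'myPackage.mySubPackage1', 'myPackage.mySubPackage2']
```

In setup.py this reduces to setup(name='myPackage', packages=find_packages()), as the first answer suggests.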
