Install own projects as dependency | module name issue - python

I have multiple projects I am working on: several libraries and several clients which require these libraries as dependencies.
library structure
$ pwd
~/Projects/library
$ tree
.
├── api.py
├── __init__.py
└── setup.py
$ cat api.py
import requests
# ...
def process(data):
    for record in data:
        print(f"Processing {record}")
$ cat __init__.py
from .api import process
$ cat setup.py
from setuptools import find_packages, setup
setup(
    name='my_library',
    version='1.0.0',
    packages=find_packages(),
    include_package_data=True,
    zip_safe=False,
    install_requires=[
        'requests',
    ],
)
I then pushed the code to my private github repo, and now want to install it as a dependency of client
client structure
$ pwd
~/Projects/client
$ tree -a -L 1
.
├── .venv
└── client.py
$ cat client.py
from my_library import process
data = list(range(5))
process(data)
$ . .venv/bin/activate
(.venv) $ pip install git+ssh://git@github.com/USER/library.git
...
Installing collected packages: idna, certifi, urllib3, chardet, requests, my-library
Running setup.py install for my-library ... done
Successfully installed certifi-2019.9.11 chardet-3.0.4 idna-2.8 my-library-1.0.0 requests-2.22.0 urllib3-1.25.3
$ python client.py
Traceback (most recent call last):
File "client.py", line 1, in <module>
from my_library import process
ModuleNotFoundError: No module named 'my_library'
A point I realized that might be related to the question:
The directory (and repo) are named library (single word)
In setup.py the name is my_library (name='my_library') (separated by an underscore)
pip freeze shows it as my-library==1.0.0 (separated with a hyphen)

You are confusing the project name with the package/module name.
Python's import system doesn't care about the project name; only pip does.
Python cares about packages and modules, but your project doesn't contain a package, so find_packages() finds nothing to add.
What you should do is:
Create a folder named my_library under the project folder.
Put __init__.py in this folder.
Put your Python modules in this folder.
Remove the __init__.py from your project folder.
More info here
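A sketch of the restructured project (the existing setup.py is unchanged; find_packages() now discovers the my_library package, and the from .api import process line moves into the new folder's __init__.py):
$ tree
.
├── my_library
│   ├── __init__.py
│   └── api.py
└── setup.py
After reinstalling from the repo, from my_library import process in client.py resolves as expected.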

Related

Python project, installation only works with "pip install -e .", but not "pip install ."

I've made my first python project and uploaded it on GitHub.
Upon installation with pip, I get different errors based on whether, from the root directory, I execute
pip install .
or
pip install -e .
I was able to recreate this error by having a project with these elements:
Dir tree:
.
├── my_package
│   ├── __init__.py
│   ├── main.py
│   └── test.py
└── setup.py
main.py:
import my_package.test
test.py:
print("hello there!")
setup.py
from setuptools import setup, find_packages
setup(
    name='my_package',
    version='1.0.0',
    description='Description of my package',
    packages=find_packages(
        where="my_package"
    ),
    install_requires=[]
)
This would, after executing
pip install .
python3 my_package/main.py
give this error:
Traceback (most recent call last):
File "./my_package/main.py", line 1, in <module>
import my_package.test
ModuleNotFoundError: No module named 'my_package'
Said error does not appear when main is called by using
pip install -e .
python3 my_package/main.py
giving back the expected "hello there!".
I managed to fix the problem, understanding that the issue was having the main script inside of the package, while importing files from the package itself.
But I still do not understand why it would work anyway when setting up the package/app in development mode...
Question: Why does that happen?
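For reference, a sketch of the fix described above: the entry script moves out of the package, and find_packages() is left at its default so it discovers my_package itself instead of searching inside it. With where="my_package" it returned an empty list, so pip install . installed no package at all, while pip install -e . still worked, presumably because the legacy develop install puts the project root on sys.path.
.
├── my_package
│   ├── __init__.py
│   └── test.py
├── main.py
└── setup.py
setup.py:
from setuptools import setup, find_packages

setup(
    name='my_package',
    version='1.0.0',
    description='Description of my package',
    # Default find_packages() scans the project root and picks up my_package.
    packages=find_packages(),
    install_requires=[]
)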

setup.py runs but I get a "ModuleNotFoundError" when trying to import my package

I have a setup.py file. I run it using pip install -e .. The logging suggests my package was successfully installed. But when I run python -c "from mypackage import test_script" I get the error ModuleNotFoundError: No module named 'mypackage'.
My setup.py file looks like:
from setuptools import find_packages, setup
setup(
    name="mypackage",
    packages=find_packages(include=["src", "src.*"]),
    version="0.0.1",
    description="my package",
    author="me",
)
My package folder looks like:
src
└── mypackage
    ├── __init__.py
    └── test_script.py
python -c "from src.mypackage import test_script" does work, but I don't want my package name to be preceeded with "src".
Any ideas what is going wrong?
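One common way to handle this kind of src layout (a sketch, not verified against the asker's repository) is to map the package root to src so the installed import name is mypackage rather than src.mypackage:
from setuptools import find_packages, setup

setup(
    name="mypackage",
    version="0.0.1",
    description="my package",
    author="me",
    # Packages live under src/; mapping the root package namespace to it
    # lets "import mypackage" work without the "src." prefix.
    package_dir={"": "src"},
    packages=find_packages(where="src"),
)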

Ignore pkg_resources.ContextualVersionConflict or ResolutionImpossible

I have built a Python package whose install_requires pins kwikapi==0.4.5 and requests==2.22.0. But kwikapi itself pins requests==2.18.4.
Now when I install and run my package, I am getting error pkg_resources.ContextualVersionConflict: (requests 2.22.0 (/tmp/test_vir3.7/lib/python3.7/site-packages), Requirement.parse('requests==2.18.4'), {'kwikapi'}).
Now if I install requests==2.18.4 (pip install requests==2.18.4) and run it, the error is pkg_resources.DistributionNotFound: The 'requests==2.22.0' distribution was not found and is required by my-pack.
I can use requests 2.18.4 instead of 2.22.0 to solve this. But the problem comes back whenever my package and kwikapi pin different versions of the same dependency.
Is there a way to ignore/solve this error?
setup/reproduce
Module structure
.
├── my_pack
│   └── __init__.py
└── setup.py
setup.py
from setuptools import setup, find_packages
version = "0.0.1"
setup(
    name="my_pack",
    packages=find_packages("."),
    package_dir={"my_pack": "my_pack"},
    include_package_data=True,
    install_requires=[
        "kwikapi==0.4.5",
        "requests==2.22.0"
    ],
    entry_points={"console_scripts": ["my_pack = my_pack:main"]},
)
__init__.py
def main():
    print("Hey! I ran")
Create python virtual environment and activate
$ python3.7 -m venv vir
$ source vir/bin/activate
install
# Go to setup.py file location and do
$ pip install .
Run
$ my_pack
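The usual way out is to loosen the exact pin in my_pack's own install_requires so the resolver can find a version that satisfies both packages. A sketch (if kwikapi pins requests exactly, the resolver will simply settle on that version instead of raising ContextualVersionConflict/ResolutionImpossible):
from setuptools import setup, find_packages

version = "0.0.1"
setup(
    name="my_pack",
    packages=find_packages("."),
    include_package_data=True,
    install_requires=[
        "kwikapi==0.4.5",
        # A range instead of an exact pin lets pip pick a requests version
        # that also satisfies kwikapi's requirement.
        "requests>=2.18.4,<3",
    ],
    entry_points={"console_scripts": ["my_pack = my_pack:main"]},
)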

How are dependencies managed when developing a python package to be distributed on PyPI?

I have a python package whose source looks like this
├── MANIFEST.in
├── README.rst
├── setup.cfg
├── setup.py
└── sqlemon
    ├── connection_strings.py
    └── __init__.py
Most of the code is in __init__.py, which has the following imports:
import os
import sqlemon.connection_strings as sqlcs
import yaml #This is the problem
If we run
python setup.py sdist
we see the following error
Traceback (most recent call last):
File "setup.py", line 2, in <module>
import sqlemon
File "/home/danielsank/src/sqlemon/sqlemon/__init__.py", line 4, in <module>
import yaml
ImportError: No module named yaml
This suggests that the virtualenv in which I work on my project must have all of the project's dependencies installed in order to do development.
I guess that's not unreasonable, but I'm not entirely sure what the workflow should look like because the project's dependencies are listed in setup.py:
from distutils.core import setup
import sqlemon
version = sqlemon.__version__
project_name = sqlemon.__project_name__
setup(name=project_name,
      # Irrelevant lines removed
      install_requires=[
          'sqlalchemy',
          'alembic',
          'pyyaml',
          'sqlalchemy-schemadisplay'
      ],
)
I usually put requirements in requirements.txt so the developer can do pip install -r requirements.txt, but since the requirements are already in setup.py that seems redundant.
Furthermore, after uploading my project to PyPI, when I try to pip install it from PyPI, the installation fails unless I already have pyyaml installed in my virtualenv.
Obviously this is not the behavior we want; pyyaml should install automatically as it is listed in the install_requires list in setup.py.
What is the recommended workflow for this situation?
The problem is that setup.py imports sqlemon, which imports pyyaml (and in principle any other dependency), so it's impossible to process it without having those dependencies installed.
The reason I had setup.py importing sqlemon was to get the version number.
A better strategy for the version numbers is explained here, which allows us to not import our own project in setup.py.
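One common version of that strategy (a sketch: it assumes __version__ is assigned near the top of sqlemon/__init__.py, and writes the project name out literally instead of reading __project_name__) is to extract the version with a regular expression so setup.py never imports the package, and to import setup from setuptools rather than distutils.core so install_requires is reliably honored:
import re
from setuptools import setup, find_packages

with open("sqlemon/__init__.py") as f:
    # Read __version__ out of the file as text; nothing is imported,
    # so yaml and the other runtime dependencies are not needed here.
    version = re.search(r'__version__\s*=\s*["\']([^"\']+)["\']', f.read()).group(1)

setup(name='sqlemon',
      version=version,
      packages=find_packages(),
      install_requires=[
          'sqlalchemy',
          'alembic',
          'pyyaml',
          'sqlalchemy-schemadisplay'
      ],
)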

Call another setup.py in setup.py

My repository contains my own python module and a submodule to one of its dependencies which has its own setup.py.
I'd like to call the dependency's setup.py when installing my own lib. How is that possible?
My first attempt:
$ tree
.
├── dependency
│   └── setup.py
└── mylib
    └── setup.py
$ cat mylib/setup.py
from setuptools import setup
setup(
    name='mylib',
    install_requires=["../dependency"]
    # ...
)
$ cd mylib && python setup.py install
error in arbalet_core setup command: 'install_requires' must be a string or list of strings containing valid project/version requirement specifiers; Invalid requirement, parse error at "'../depen'"
However install_requires does not accept paths.
My second attempt was to use dependency_links=["../dependency"] with install_requires=["dependency"], however a package of the same name already exists on PyPI, so setuptools tries to use that version instead of mine.
What's the correct/cleanest way?
A possible solution is to run a custom command before/after the install process.
An example:
from setuptools import setup
from setuptools.command.install import install
import subprocess

class InstallLocalPackage(install):
    def run(self):
        # Run the normal install first, then install the local dependency
        # by invoking its own setup.py.
        install.run(self)
        subprocess.call(
            "python path_to/local_pkg/setup.py install", shell=True
        )

setup(
    ...,
    cmdclass={'install': InstallLocalPackage}
)
