Not able to import custom package - python

What I am doing:
I am building a Python package. The file hierarchy of the package is as follows:
package/
    library1/
        __init__.py
        module1.py
        module2.py
    setup.py
    LICENCE
    README.md
For some reason I need to import module2 in module1.
I have declared the packages in setup.py as follows:
setup(
    name="package",
    packages=["library1", "library2"]
)
Once I had built the package, I installed it on my machine.
I then tried import package,
but I received the error: No module named 'package'.

You can install your package without building it first. From the directory that contains setup.py, run:
pip install --editable .
The code stays where it is, and you can continue editing and debugging it.
Your project directory is added to Python's sys.path, so Python looks inside package/ for imports (library1 becomes importable), but package itself is not a Python package and cannot be imported.
You import module2 via
import library1.module2
This is an absolute import, so it works from everywhere.
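Concretely, module1.py might look like this (a minimal sketch; some_function is a made-up name standing in for whatever module2 actually provides):

# library1/module1.py
import library1.module2  # absolute import, resolved via sys.path

def use_module2():
    # Hypothetical call into module2; replace with the real API.
    return library1.module2.some_function()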

Related

setup.py runs but I get a "ModuleNotFoundError" when trying to import my package

I have a setup.py file. I run it using pip install -e . and the log output suggests my package is installed successfully. But when I run python -c "from mypackage import test_script" I get the error ModuleNotFoundError: No module named 'mypackage'.
My setup.py file looks like:
from setuptools import find_packages, setup

setup(
    name="mypackage",
    packages=find_packages(include=["src", "src.*"]),
    version="0.0.1",
    description="my package",
    author="me",
)
My package folder looks like:
src
    mypackage
        __init__.py
        test_script.py
python -c "from src.mypackage import test_script" does work, but I don't want my package name to be preceded with "src".
Any ideas what is going wrong?
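For reference, the usual fix for a src layout like this is to tell setuptools that packages live under src/, so the package is discovered as mypackage rather than src (a sketch of the common pattern, not necessarily the exact fix for this question):

# setup.py
from setuptools import find_packages, setup

setup(
    name="mypackage",
    version="0.0.1",
    package_dir={"": "src"},              # package code lives under src/
    packages=find_packages(where="src"),  # discovers "mypackage", not "src"
)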

Problem with importing my own package from PyPI and TestPyPI

I'm trying to publish a package I made on PyPI, and I'm testing it on TestPyPI. I was able to upload the package and install it via pip, but when I import it, it doesn't work as I would like.
The folder structure is:
package_name
|-- README.md
|-- setup.py
|
|-- package_name
|   |-- __init__.py
|   |-- __main__.py
|   |-- name.py
|   |-- tools.py
with a setup.py like this:
from setuptools import setup

setup(
    name='package_name',
    packages=['package_name'],
    version='0.1.0',
    ...
)
When I write import package_name, I get the directory containing setup.py and the package directory. To use the package I need to write import package_name.package_name, which loads the init file but is verbose. How can I get the same result with import package_name alone?
I tried to use package_dir but without results. In addition, this issue emerges only when I upload the package to TestPyPI or PyPI, not when I use development mode (pip install -e package_name).
Some additional information:
the init file contains __version__ = '0.1.0'
if I load package_name and inspect it, I get <module 'package_name' (namespace)>
using find_packages doesn't resolve the issue
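The <module 'package_name' (namespace)> output is the key clue: Python found a directory called package_name that has no __init__.py (a namespace package) before it found the real package, which is why import package_name lands on the folder containing setup.py. Inspecting the imported module shows which directory is actually being picked up (a quick diagnostic sketch):

import package_name

# A regular package prints the path to its __init__.py;
# a namespace package prints None (or lacks __file__ entirely).
print(getattr(package_name, "__file__", None))

# Shows the directory (or directories) Python is importing from.
print(list(package_name.__path__))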

Python `pip install` from a local project - Modules can't find each other

I have a development server running in a virtualenv (Python 3.6) into which I want to install a local Python project. If I run pip install -e /path/to/myproject while the virtualenv is active, then inside that environment I can import myproject. I can also do from myproject import submodule. But if I do from myproject import othermodule, I get ModuleNotFoundError: No module named 'submodule' (othermodule imports submodule). This does not happen if I import myproject from myproject's root.
The directory structure is:
/path/to/myproject
    setup.py
    myproject/
        __init__.py
        submodule.py
        othermodule.py
        ...
setup.py looks like:
setup(
    name='myproject',
    packages=['myproject']
)
What's going on? Why aren't those libraries found?
The issue was that in Python 3, relative imports must be explicit.
In othermodule, instead of
import submodule
I need to write:
import myproject.submodule
or
from . import submodule
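For illustration, othermodule.py could use either form (a minimal sketch):

# myproject/othermodule.py

# Absolute import: resolves through sys.path, works everywhere.
import myproject.submodule as submodule

# Explicit relative import, equivalent when othermodule is imported
# as part of the myproject package (but not when run as a script):
# from . import submodule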

How are dependencies managed when developing a python package to be distributed on PyPI?

I have a Python package whose source looks like this:
├── MANIFEST.in
├── README.rst
├── setup.cfg
├── setup.py
└── sqlemon
    ├── connection_strings.py
    └── __init__.py
Most of the code is in __init__.py, which has the following imports:
import os
import sqlemon.connection_strings as sqlcs
import yaml  # This is the problem
If we run
python setup.py sdist
we see the following error
Traceback (most recent call last):
  File "setup.py", line 2, in <module>
    import sqlemon
  File "/home/danielsank/src/sqlemon/sqlemon/__init__.py", line 4, in <module>
    import yaml
ImportError: No module named yaml
This suggests that the virtualenv in which I work on my project must have all of the project's dependencies installed in order to do development.
I guess that's not unreasonable, but I'm not entirely sure what the workflow should look like because the project's dependencies are listed in setup.py:
from distutils.core import setup
import sqlemon

version = sqlemon.__version__
project_name = sqlemon.__project_name__

setup(
    name=project_name,
    # Irrelevant lines removed
    install_requires=[
        'sqlalchemy',
        'alembic',
        'pyyaml',
        'sqlalchemy-schemadisplay'
    ],
)
I usually put requirements in requirements.txt so the developer can do pip install -r requirements.txt, but since the requirements are already in setup.py that seems redundant.
Furthermore, after uploading my project to PyPI, when I try to pip install it, the installation fails unless I already have pyyaml installed in my virtualenv.
Obviously this is not the behavior we want; pyyaml should install automatically as it is listed in the install_requires list in setup.py.
What is the recommended workflow for this situation?
The problem is that setup.py imports sqlemon, which imports yaml (and in principle any other dependency), so it is impossible to process setup.py without having those dependencies installed.
The reason I had setup.py importing sqlemon was to get the version number.
A better strategy for the version number is explained here; it lets us avoid importing our own project in setup.py.
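One common form of that strategy is to read the version out of the package with a regular expression instead of importing it; the package name is hardcoded below for the same reason (a sketch, assuming __init__.py defines __version__ as a plain string literal):

# setup.py
import re
from setuptools import setup

with open("sqlemon/__init__.py") as f:
    version = re.search(
        r"^__version__\s*=\s*['\"]([^'\"]+)['\"]", f.read(), re.M
    ).group(1)

setup(
    name="sqlemon",
    version=version,
    install_requires=[
        'sqlalchemy',
        'alembic',
        'pyyaml',
        'sqlalchemy-schemadisplay'
    ],
)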

install local package into virtualenv using setuptools

I have a virtualenv with multiple small projects in it. Assume they are all alike; my folder structure looks something like this:
categorisation_ml/
    categorisation.py
    setup.py
    __init__.py
nlp/
    nlp.py
    setup.py
    __init__.py
etc/
    __init__.py
I want to install both packages into the same virtualenv so that they are both accessible everywhere within the virtualenv.
Using this and this guide, I have created a setup.py script like this (for categorisation in this case):
from setuptools import setup, find_packages

setup(
    name="categorisation",
    version="1.0",
    scripts=['categorisation.py']
)
Then I run python setup.py install, which seems to complete successfully.
When I cd into nlp/, start the Python interpreter, and try
import categorisation, I get:
ImportError: No module named categorisation
What am I missing?
It seems that the package structure and setup.py are off. It should be something like this:
irrelevant_package_name/
    __init__.py
    setup.py
    categorisation_ml/
        categorisation.py
        __init__.py
    nlp/
        nlp.py
        __init__.py
and then the install script looking like this:
from setuptools import setup, find_packages

setup(
    name='package_name',
    version='1.0.0',
    description='This is a working setup.py',
    url='http://somesite.com',
    author='Roman',
    author_email='roman@somesite.com',
    packages=find_packages(),
    install_requires=[
        'numpy',
    ],
    zip_safe=False
)
Then install it like this:
python setup.py install #(just installs it as is)
python setup.py develop #(Keeps track of changes for development)
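As a side note, the pip equivalents of these two commands are generally preferred these days (run from the directory containing setup.py):
pip install .    #(just installs it as is)
pip install -e . #(keeps track of changes for development)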
If you run pip freeze, this should show up:
package_name==1.0.0
And then in Python, imports should look like this:
from categorisation_ml import categorisation
from nlp import nlp
