My package installed with pip install shows no modules - python

I have a simple Python project with effectively one package (called forcelib) containing one module (also called forcelib):
- setup.py
- forcelib/
  - __init__.py
  - forcelib.py
My setup.py is copied from the official example and has the obvious edits.
The problem is that I can install the forcelib package using pip but when I import forcelib, it only has the "double-underscore" attributes visible. That is, I cannot see the forcelib module.
Example to replicate:
git clone https://github.com/blokeley/forcelib
cd forcelib
pip install -e .
python
>>> import forcelib
>>> print(forcelib.__version__)  # Correctly prints 0.1.2
>>> dir(forcelib)  # Only the double-underscore attributes (__version__, __path__, etc.) are listed; I had expected to see forcelib, example_read, etc.
Perhaps I'm supposed to distribute just the module rather than bother with a package.
The (very small) project is on GitHub.
Any advice would be much appreciated.

It seems that there are 2 ways of doing it:
1. Keep the same directory structure but put the following in __init__.py:
   from .forcelib import *
2. Distribute a module, not a package. Follow the instructions to use the py_modules argument rather than the packages argument in setup.py. This would mean restructuring the project to:
   setup.py
   forcelib.py
Approach (1) can be seen here. It has the advantage of hiding the private functions and attributes (anything not in __all__), but the client can still see the module forcelib.forcelib, which I don't think it should.
Approach (2) can be seen here. It is simpler, but has the disadvantage that it does not hide private functions and attributes.
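For approach (1), a minimal sketch of the two files (the example_read name comes from the question; the function bodies and the exact __all__ contents are illustrative assumptions, not the project's actual code):

# forcelib/forcelib.py
__all__ = ['example_read']  # only names listed here survive a star-import

def example_read(path):
    """Public function; visible after 'from .forcelib import *'."""
    ...

def _parse(raw):
    """Hypothetical private helper; absent from __all__, so the star-import skips it."""
    ...

# forcelib/__init__.py
from .forcelib import *  # re-export the module's public names at package level

With this in place, clients see forcelib.example_read, while the leading-underscore helpers stay out of dir(forcelib).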

Download the zip file and extract it. Then open a command prompt, change into the forcelib-master directory, and run:
python setup.py install
This will install the package.

Easiest way to import a function from a file in a different folder

Can I import a python module from a distant folder? (Possible duplicate of How to import a Python module from a sibling folder?)
In general, the git repo should have a requirements.txt if it is for general use. If it has one, you can run pip install -r requirements.txt.
It is also fairly easy for the repo owner to generate this file. At the root of their project they can run pip freeze > requirements.txt. pip freeze lists all current dependencies and so this command outputs the result to requirements.txt.
As for your second point, it really depends on the package structure. If they want to expose the code in the package they may have imported it in one of the top level __init__.py files. Otherwise, you can always directly import by following the paths.
For example if your structure is:
project/
  folder/
    subfolder/
      module.py
If module.py has a function called foo, then you can import it like so: from project.folder.subfolder.module import foo. Of course, this assumes each of these directories has its own __init__.py file.
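As a concrete sketch (the layout, function name, and return value are the hypothetical ones above):

# project/folder/subfolder/module.py
def foo():
    return 'hello from foo'

# main.py, placed in the directory that contains project/
from project.folder.subfolder.module import foo
print(foo())  # prints 'hello from foo'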
You can also suggest extra places for Python to look for packages by appending to sys.path:
import sys
sys.path.append('/path/to/your/module/address/')
Source: Import a file from another directory

pip installing a package with the same namespace as a local package

I am using Python 3.6.5, installed via miniconda. My issue is arising from the fact that I'm pip installing a package that has the same namespace as a local package. After pip installing this package, I can no longer import from the local package. I receive a ModuleNotFoundError error. The namespaces need to stay this way, if possible.
Here is my directory structure:
/root
  stuff/
    __init__.py
    my_stuff.py
  app.py

__init__.py:
__import__('pkg_resources').declare_namespace(__name__)

app.py:
from stuff.my_stuff import my_fun
This works fine until I pip install the package with the same namespace, "stuff". After pip installing that package, the import statement from stuff.my_stuff import my_fun throws the following error: ModuleNotFoundError: No module named 'stuff.my_stuff'. I kind of understand why: when importing modules, Python looks for built-in modules first, then searches the entries on sys.path (which PYTHONPATH contributes to), and so on.
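(A quick way to see which directories a package name actually resolves to, useful when debugging this kind of collision, is to inspect its __path__:)

import stuff
print(stuff.__path__)  # lists every directory contributing to the 'stuff' package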
Here's the part thats really confusing me. If I create another arbitrary local module, like some_stuff, as shown below:
/root
  stuff/
    __init__.py
    my_stuff.py
  some_stuff/
    __init__.py
    more_stuff.py
  app.py
and if I then run:
app.py:
from some_stuff.more_stuff import more_fun
from stuff.my_stuff import my_fun
Everything works as expected. i.e. if I import some_stuff.more_stuff before stuff.my_stuff, everything works. But not vice versa. Solely importing stuff.my_stuff causes the ModuleNotFoundError.
app.py:
# The code above works, but this causes the error
from stuff.my_stuff import my_fun
What is causing this behaviour? How can I solve this issue of locally referencing a package with the same namespace as one that was pip installed?
Edit:
I continued experimenting and noticed that when I remove all __init__.py files, everything works as expected. I came across this post: Since Python 3.3, a folder without an __init__.py can be considered part of an implicit namespace package. I'm still confused about the behaviour mentioned above though.
This SO question should answer your question
I am still putting the original answer here for convenience.
It's not possible to change "import path" (installed name) by specifying arguments to pip. You can, however, make some changes to the package after installing:
- use pip install -e git+http://some_url#egg=some-name: that way, even if both packages have the same import path, they will be saved under different directories (using the some-name provided after #egg=). After this you can go to the source directories of the packages (usually venv/src/some-name) and rename some folders to change the import paths
- fork the repository, make changes, then install the package from that repository; or publish your package on PyPI under a different name and install it by that name
- use pip download to put one of the packages in your project, then rename folders as you like

Including a python library with my script

My system administrator will not allow global installation of python packages.
I'm writing a script that people will invoke to perform certain actions for them. The script I'm writing needs certain libraries like sqlalchemy and coloredlogs. I am, however, allowed to install Python libs in any local folder, i.e. not site-packages.
How would I go about installing the libs in the same folder as the script so that the script has access to them?
My folder hierarchy is like so
script_to_invoke.py
scriptpack/
  bin/
    coloredlogs
    coloredlogs.egg
    ...
  utils/
    util1.py
    util2.py
(all the folders indicated have an __init__.py)
What I've tried so far:
within script_to_invoke.py I use
from scriptpack.utils import util1  # no problem here
from scriptpack.bin import coloredlogs # fails to find the import
I've looked at some other SO answers but I'm not sure how to correlate them with my problem.
I figured it out!
Python had to be directed to find the .egg files
This can be done by either:
- editing the PYTHONPATH var BEFORE the interpreter is started, or
- appending the full path to the eggs to sys.path.
Code Below:
import sys
for entry in [<list of full path to egg files in bin dir>]:
    sys.path.append(str(entry))
# Proceed with local imports
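One hypothetical way to build that list, assuming the eggs live under scriptpack/bin next to the script:

import sys
from pathlib import Path

# Collect every .egg beside this script and put it on the import path.
bin_dir = Path(__file__).parent / 'scriptpack' / 'bin'
for egg in bin_dir.glob('*.egg'):
    sys.path.append(str(egg))

import coloredlogs  # should now resolve, provided the egg contains a top-level coloredlogs package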
You might want to try packaging up everything as a zipapp. Doing so makes a single zip file that acts as a Python script but can contain a whole multitude of embedded packages. The steps to make it are:
Make a folder with the name of your program (testapp in my example)
Name your main script __main__.py and put it in that folder
Using pip, install the required packages to the folder with --target=/path/to/testapp
Run python3 -mzipapp testapp -p='/usr/bin/env python3' (providing the shebang line is optional; without it, users will need to run the package with python3 testapp.pyz, while with the shebang, they can just do ./testapp.pyz)
That creates a zip file with all your requirements embedded in it alongside your script, that doesn't even need to be unpacked to run (Python knows how to run zip apps natively). As a trivial example:
$ mkdir testapp
$ echo -e '#!/usr/bin/python3\nimport sqlalchemy\nprint(sqlalchemy)' > testapp/__main__.py
$ pip3 install --target=./testapp sqlalchemy
$ python3 -mzipapp testapp -p='/usr/bin/env python3'
$ ./testapp.pyz
<module 'sqlalchemy' from './testapp.pyz/sqlalchemy/__init__.py'>
showing how the simple main was able to access sqlalchemy from within the same zipapp. It's also smaller (thanks to the zipping) than distributing the uncompressed modules:
$ du -s -h testapp*
13M testapp
8.1M testapp.pyz
You can install these packages in a non-global location (generally in ~/.local/lib/python<x.y>) using the --user flag, e.g.:
pip install --user sqlalchemy coloredlogs
That way you don't have to worry about changing how imports work, and you're still compliant with your sysadmin's policies.
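(To confirm where --user installs land on your particular system, you can ask the site module:)

import site
print(site.getusersitepackages())  # e.g. /home/you/.local/lib/python3.6/site-packages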

Importing from sibling directories (python 3)

I can't figure out how to import modules from sibling directories in Python 3 using absolute imports. The two options I know of are:
1. modify the sys.path.
2. turn the directory into a pip installable package via __init__.py and setup.py.
For option 1, I figured out how to import modules from sibling directories by modifying sys.path, but this method seems a little hacky to me. Also, I've read that it is not preferred. Why? Is there something inherently wrong or dangerous about modifying sys.path?
For option 2, what exactly do I need to do to make my package pip installable? I've already created my __init__.py file, but it seems that I need to create and configure a setup.py script to prepare my package for distribution? I'm still in development mode, so is this really the best/most Pythonic method? If it is, do I just type python setup.py install into my terminal after creating setup.py?
Edit: I'm now trying to figure this out using absolute imports, as from what I've read Python 3 does not support relative imports.
From what I've read, Python 3 does not support relative imports
It does.
To import myproject/foo/__init__.py from myproject/bar/baz.py, you can use this:
from .. import foo
Or if you want to import an object/module in foo:
from ..foo import object
This requires myproject to be a package, so myproject/__init__.py has to exist.
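One caveat worth adding: relative imports only resolve when the importing module is loaded as part of the package, so run it with -m from the directory that contains myproject rather than executing the file directly. A minimal sketch, reusing the names above:

# myproject/bar/baz.py
from ..foo import object  # works when baz is imported as myproject.bar.baz

# From the directory containing myproject/:
#   python -m myproject.bar.baz    -> relative import resolves
#   python myproject/bar/baz.py    -> ImportError: attempted relative import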

What option do I need in setup.py to create the package in the right directory?

I am using setup.py to create a python package, which I want to install via pip. To correctly install the files under
lib/python2.7/site-packages/<package-name>
I used the following option in setup.py:
'package_dir': {'':'lib'}
as described here but get an error
error: package directory 'lib' does not exist
Well, there is no such directory, as I want the current directory to be installed as the package lib (or whatever). I also tried to use
'package_dir': {'mycode':''}
which installs the code directly in
lib/python2.7/site-packages/
and not under
lib/python2.7/site-packages/<package-name>
What am I doing wrong, and where is this documented? I might have overlooked the documentation of this basic feature, as the documentation for setup.py is 'suboptimal'.
The description of how to do this can be found in the distribute documentation... Within a directory containing all of the project (TowelStuff/ in the given example) you specify the name of the actual package (towelstuff/). To include this as your package you need to add the following line in setup.py:
'packages': ['towelstuff']
After having created the sdist (from within TowelStuff/), the installation of this package will install it under site-packages/towelstuff, which can be imported as usual (from towelstuff import ...).
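A minimal setup.py sketch for that layout (the name and version are illustrative):

from setuptools import setup

setup(
    name='towelstuff',
    version='0.1',
    packages=['towelstuff'],  # the directory that contains __init__.py
)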
