I'm trying to install a python package I've developed using the develop command of setuptools.
[sidenote: There is a bewilderingly vast quantity of information about this on the web (distutils, distutils2, setuptools, distribute). setuptools and develop are, as far as I can tell, the most modern/best practice way to use a piece of code that's in development. Perhaps I am wrong.]
Here's what I did:
(1) I placed an empty __init__.py in the directory with my Python code.
(2) I made a setup.py:
from setuptools import setup, find_packages

setup(name="STEM_pytools",
      version="0.1",
      packages=find_packages(),
      author="Timothy W. Hilton",
      author_email="my#email.address",
      description="visualization and data pre-/post-processing tools for STEM",
      license="",
      keywords="STEM",
      url="")
(3) I ran
python setup.py develop
That seemed to proceed without problems.
However, when I try to use the package, I get:
>>> import STEM_pytools
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named STEM_pytools
The same thing happens with the install command: its output looks OK, but then I get "No module named STEM_pytools". I'm tearing my hair out. Any suggestions appreciated!
I solved the problem, although I still don't entirely understand why it works now and did not work before. It seems my setup.py and the directory structure of my project were not interacting successfully.
This is the directory structure that worked with my setup.py:
STEMpytools/
    setup.py
    stem_pytools/
        __init__.py
        source1.py
        source2.py
        ...
        sourceN.py
This directory structure did not work, at least when paired with my setup.py:
STEMpytools/
    setup.py
    __init__.py
    source1.py
    source2.py
    ...
    sourceN.py
This explanation helped me a lot: http://bashelton.com/2009/04/setuptools-tutorial/
Now, from the python interpreter, these both work:
import stem_pytools
import stem_pytools.source1
Experimenting on my system suggests it is necessary to place __init__.py and the package source code in a subdirectory one level below the root directory that contains setup.py. I'm not sure from the setuptools and distutils documentation why this is the case.
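For anyone checking their own layout: find_packages() only discovers directories below setup.py that contain an __init__.py, which is a quick way to see the difference between the two layouts above. A minimal sketch, run from the project root (the expected output is my assumption based on the working layout shown):

from setuptools import find_packages

# With the working layout above, this prints ['stem_pytools'];
# with the flat layout, it prints [] because the directory containing
# setup.py is never itself treated as a package.
print(find_packages())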
Related
I am working on a project (CN_Analysis) for which I would like to create my own python package (cn_tools) using setuptools. My goal is to have it accessible everywhere in my project folder. However, when I try to import it from a subfolder (e.g. CN_Analysis/Notebooks), I get
(.virtualenvironment) ...:~/Workspace/CN_Analysis/Notebooks$ python3
>>> import cn_tools
ModuleNotFoundError: No module named 'cn_tools'
The directory structure is as follows:
CN_Analysis
├── README.md
├── requirements.txt
├── .gitignore
├── setup.py
├── .virtualenvironment/
├── Notebooks/
├── Data/
├── cn_tools/
│   ├── __init__.py
│   └── my_tools.py
The contents of setup.py are:
from setuptools import setup, find_packages

setup(name='cn_tools',
      version='0.1',
      description='This package contains helpful functions for processing data obtained from Karambola.',
      packages=find_packages(where='cn_tools'),
      package_dir={'': 'cn_tools'})
Additional information:
The basic routine is
source .virtualenvironment/bin/activate
(.virtualenvironment) python3 setup.py develop
Results in
Installed /home/ansgar/Workspace/CN_Analysis/cn_tools
Processing dependencies for cn-tools==0.1
Finished processing dependencies for cn-tools==0.1
Check for python3
(.virtualenvironment) which python3
/home/my_name/Workspace/CN_Analysis/.virtualenvironment/bin/python3
And if I check sys.path from Python after navigating to a subfolder (e.g. Notebooks/), it returns
['',
'/usr/lib/python38.zip',
'/usr/lib/python3.8',
'/usr/lib/python3.8/lib-dynload',
'/home/my_name/Workspace/CN_Analysis/.virtualenvironment/lib/python3.8/site-packages',
'/home/my_name/Workspace/CN_Analysis/cn_tools']
Does someone know why I cannot import cn_tools?
It works if I just use
packages=find_packages()
instead of
packages=find_packages(where='cn_tools')
in the setup.py file.
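The likely reason (my reading of the find_packages/package_dir documentation, so treat it as an assumption): where='cn_tools' together with package_dir={'': 'cn_tools'} tells setuptools to treat cn_tools/ itself as the package root and to look for packages inside it, so no package named cn_tools is ever registered. With the layout above, a setup.py along these lines should be enough:

from setuptools import setup, find_packages

setup(name='cn_tools',
      version='0.1',
      description='Helper functions for processing data obtained from Karambola.',
      # cn_tools/ sits next to setup.py and contains __init__.py,
      # so the default find_packages() discovers it as a top-level package
      packages=find_packages())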
I am trying to build a standalone app that utilises Pandas. This is my setup.py file:
from setuptools import setup

APP = ['MyApp.py']
DATA_FILES = ['full path to/chromedriver']
PKGS = ['pandas', 'matplotlib', 'selenium', 'xlrd']
OPTIONS = {'packages': PKGS, 'iconfile': 'MyApp_icon.icns'}

setup(
    app=APP,
    data_files=DATA_FILES,
    options={'py2app': OPTIONS},
    setup_requires=['py2app', 'pandas', 'matplotlib', 'selenium', 'xlrd'],
)
The making of the *.app file goes smoothly, but when I try to run it, it gives me the following error:
...
import pandas._libs.testing as _testing
File "pandas/_libs/testing.pyx", line 1, in init pandas._libs.testing
ModuleNotFoundError: No module named 'cmath'
I tried to include ‘cmath’ in my list of PKGS and in setup_requires in the setup.py file, but when I tried to build the app using py2app it gave me the error:
distutils.errors.DistutilsError: Could not find suitable distribution for Requirement.parse('cmath')
I am stuck. I couldn't find anything useful online. From what I have been reading, cmath should be included automatically. Any ideas on where the problem is and how I can fix it?
I think I have found a solution: downgrade to Python version 3.6.6.
See: python 3.6 module 'cmath' is not found
To uninstall Python I followed this process: https://www.macupdate.com/app/mac/5880/python/uninstall
Then I installed Python 3.6.6: https://www.python.org/downloads/release/python-366/
With Python 3.6.6, py2app seems to work without problems and includes Pandas smoothly.
It seems that, for some reason, cmath is not included in the latest versions of Python? I might be wrong. Please let me know what you think and if you have any questions.
P.S.: I am using MacOS (Mojave 10.14.6) and PyCharm.
I had a similar issue with py2app and cmath. I solved it by adding import cmath to the main script (MyApp.py in your case). I think doing so may prompt modulegraph to add the cmath library files.
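In other words, something like this at the top of the main script. The body below is just a placeholder, and the reasoning about modulegraph is my assumption:

# MyApp.py
import cmath  # imported only so py2app's dependency scan bundles the cmath extension
import pandas as pd

def main():
    print(pd.__version__)

if __name__ == '__main__':
    main()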
This doesn't make sense to me. How can I use the setup.py to install Cython and then also use the setup.py to compile a library proxy?
import sys, imp, os, glob
from setuptools import setup
from Cython.Build import cythonize  # this isn't installed yet

setup(
    name='mylib',
    version='1.0',
    package_dir={'mylib': 'mylib', 'mylib.tests': 'tests'},
    packages=['mylib', 'mylib.tests'],
    ext_modules=cythonize("mylib_proxy.pyx"),  # how can we call cythonize here?
    install_requires=['cython'],
    test_suite='tests',
)
Later:
python setup.py build
Traceback (most recent call last):
File "setup.py", line 3, in <module>
from Cython.Build import cythonize
ImportError: No module named Cython.Build
It's because cython isn't installed yet.
What's odd is that a great many projects are written this way. A quick github search reveals as much: https://github.com/search?utf8=%E2%9C%93&q=install_requires+cython&type=Code
As I understand it, this is where PEP 518 comes in - also see some clarifications by one of its authors.
The idea is that you add yet another file to your Python project / package: pyproject.toml. It is supposed to contain information on build environment dependencies (among other stuff, long term). pip (or just any other package manager) could look into this file and before running setup.py (or any other build script) install the required build environment. A pyproject.toml could therefore look like this:
[build-system]
requires = ["setuptools", "wheel", "Cython"]
It is a fairly recent development and, as of yet (January 2019), it is not finalized / approved by the Python community, though (limited) support was added to pip in May 2017 / the 10.0 release.
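With a pyproject.toml like the one above, the setup.py from the question can keep its top-level import of cythonize, because a PEP 518 aware frontend such as pip installs the listed build requirements into an isolated environment before it runs setup.py. Note this only applies to pip-driven builds (pip install . / pip wheel .), not to invoking setup.py directly. A sketch:

# setup.py -- safe once pyproject.toml declares Cython as a build requirement
from setuptools import setup
from Cython.Build import cythonize

setup(
    name='mylib',
    version='1.0',
    package_dir={'mylib': 'mylib', 'mylib.tests': 'tests'},
    packages=['mylib', 'mylib.tests'],
    ext_modules=cythonize("mylib_proxy.pyx"),
)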
One solution to this is to not make Cython a build requirement and instead distribute the Cython-generated C files with your package. I'm sure there is a simpler example somewhere, but this is what pandas does: it conditionally imports Cython and, if Cython is not present, builds from the shipped C files.
https://github.com/pandas-dev/pandas/blob/3ff845b4e81d4dde403c29908f5a9bbfe4a87788/setup.py#L433
Edit: The doc link from #danny has an easier to follow example.
http://docs.cython.org/en/latest/src/reference/compilation.html#distributing-cython-modules
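The pattern from that page boils down to something like the sketch below (module names taken from the question; it assumes the generated mylib_proxy.c is shipped in the sdist):

from setuptools import setup, Extension

try:
    from Cython.Build import cythonize
    USE_CYTHON = True
except ImportError:
    USE_CYTHON = False

# build from the .pyx when Cython is available, otherwise from the shipped .c file
ext = '.pyx' if USE_CYTHON else '.c'
extensions = [Extension("mylib_proxy", ["mylib_proxy" + ext])]

if USE_CYTHON:
    extensions = cythonize(extensions)

setup(
    name='mylib',
    version='1.0',
    ext_modules=extensions,
)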
When you use setuptools, you should add cython to setup_requires (and also to install_requires if cython is needed by the installed package), i.e.
# don't import cython, it isn't yet there
from setuptools import setup, Extension

# use Extension, rather than cythonize (it is not yet available)
cy_extension = Extension(name="mylib_proxy", sources=["mylib_proxy.pyx"])

setup(
    name='mylib',
    ...
    ext_modules=[cy_extension],
    setup_requires=["cython"],
    ...
)
Cython isn't imported (it is not yet available when setup.py starts); instead of cythonize, setuptools.Extension is used to add the Cython extension to the setup.
It should work now. The reason: setuptools will try to import Cython after setup_requires are fulfilled:
...
try:
    # Attempt to use Cython for building extensions, if available
    from Cython.Distutils.build_ext import build_ext as _build_ext
    # Additionally, assert that the compiler module will load
    # also. Ref #1229.
    __import__('Cython.Compiler.Main')
except ImportError:
    _build_ext = _du_build_ext
...
It becomes more complicated if your Cython extension uses numpy, but that is also possible - see this SO post.
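A sketch of that numpy variant, adapted from the commonly cited recipe (not tested here, so treat the details as an assumption):

# don't import numpy or Cython at module level either
from setuptools import setup, Extension
from setuptools.command.build_ext import build_ext as _build_ext

class build_ext(_build_ext):
    def finalize_options(self):
        _build_ext.finalize_options(self)
        # numpy is only imported here, after setup_requires has made it available
        import numpy
        self.include_dirs.append(numpy.get_include())

cy_extension = Extension(name="mylib_proxy", sources=["mylib_proxy.pyx"])

setup(
    name='mylib',
    ext_modules=[cy_extension],
    cmdclass={'build_ext': build_ext},
    setup_requires=["cython", "numpy"],
)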
It doesn't make sense in general. It is, as you suspect, an attempt to use something that (possibly) has yet to be installed. If tested on a system that already has the dependency installed, you might not notice this defect. But run it on a system where your dependency is absent, and you will certainly notice.
There is another setup() keyword argument, setup_requires, that can appear to be parallel in form and use to install_requires, but this is an illusion. Whereas install_requires triggers a lovely ballet of automatic installation in environments that lack the dependencies it names, setup_requires is more documentation than automation. It won't auto-install, and certainly not magically jump back in time to auto-install modules that have already been called for in import statements.
There's more on this at the setuptools docs, but the quick answer is that you're right to be confused by a module that is trying to auto-install its own setup pre-requisites.
For a practical workaround, try installing cython separately, and then run this setup. While it won't fix the metaphysical illusions of this setup script, it will resolve the requirements and let you move on.
I have written the beginnings of a package that I would like to distribute, but I am having issues. When I place sample_test.py in the primary directory, the script runs just fine. When I attempt to create a distribution and run sample_test.py from anywhere, it doesn't work: ImportError: No module named 'script_functions'.
To install, I am running python setup.py sdist then python setup.py install. Both of these execute without error. Also, to keep from 'polluting' my core python environment, I am creating a new virtual environment and installing to that.
The moog_visa.py and moog_daqmx.py files contain classes that are used by script_functions.py. The hw_test_runner.py and script_functions.py contain simple functions that I wish to make available in my python environment. I'm not sure if this is relevant...
Directory structure:
\hw_test_runner
    \examples
        \sample_test.py
    \hw_test_runner
        \__init__.py
        \hw_test_runner.py
        \moog_daqmx.py
        \moog_visa.py
        \script_functions.py
    \setup.py
My setup script contains:
from setuptools import setup

setup(name='hw_test_runner',
      version='0.12',
      description='Scriptable hardware test suite',
      author='me',
      author_email='xxx#XXX',
      url='https://my_url.com',
      packages=['hw_test_runner'],
      install_requires=['numpy', 'pyvisa', 'PyDAQmx']
      )
And __init__.py:
from hw_test_runner.script_functions import *
from hw_test_runner.hw_test_runner import *
In hw_test_runner.py:
from hw_test_runner.script_functions import *
<... more code below ... >
In script_functions.py:
from hw_test_runner import moog_visa
from hw_test_runner import moog_daqmx
<... more code below ... >
I have tried various incarnations of the import statement within the __init__.py file, but haven't gotten anything working. I suspect that there is one line off somewhere that I just don't have the experience to easily spot.
Edit - More Information
After playing around a bit on the command line, I haven't found the problem, but I believe that the issue may lay with PyCharm. I can execute sample_test.py on the command line but not within PyCharm. PyCharm is set up to use the appropriate virtual environment, but there is apparently still something else missing.
Is there a way to include/invoke python module(s) (dependencies) installation first, before running the actual/main script?
For example, in my main.py:
import os, sys
import MultipartPostHandler
def main():
    # do stuff here
But MultipartPostHandler is not yet installed, so I want it to be installed first, before main.py actually runs, and in an automated manner: I want to invoke the script once, have it install the dependencies, and then have it carry on with the actual functionality of the main script.
(somehow, a little bit similar with maven. But I just need the installation part)
I already know the basics of setuptools. The problem is I may have to call the installation (setup.py) and the main script (main.py) separately.
Any ideas are greatly appreciated. Thanks in advance!
Is there a way to include/invoke python module(s) (dependencies) installation first, before running the actual/main script?
A good way is to use setuptools and explicitly list them in install_requires.
Since you are providing a main function, you also probably want to provide entry_points.
I already know the basics of setuptools. The problem is I may have to call the installation (setup.py) and the main script (main.py) separately.
That is usually not a problem. It is very common to first install everything from a requirements.txt file with pip install -r requirements.txt. Plus, if you list your dependencies, you can reasonably expect them to be present when your function is called rather than relying on try/except ImportError. It is a reasonable approach to expect required dependencies to be present and to use try/except only for optional dependencies.
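For example, a requirements.txt for the question could be as small as this (assuming the dependency is published on PyPI under this exact name):

MultipartPostHandler

and pip install -r requirements.txt would pull it into whichever environment is active.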
setuptools 101:
create a tree structure like this:
$ tree
.
├── mymodule
│   ├── __init__.py
│   └── script.py
└── setup.py
your code will go under mymodule; let's imagine some code that does a simple task:
# mymodule/script.py
def main():
    try:
        import requests
        print('requests is present. kudos!')
    except ImportError:
        raise RuntimeError('how the heck did you install this?')
and here is a relevant setup:
# setup.py
from setuptools import setup

setup(
    name='mymodule',
    packages=['mymodule'],
    entry_points={
        'console_scripts': [
            'mycommand = mymodule.script:main',
        ]
    },
    install_requires=[
        'requests',
    ]
)
This would make your main available as a command, and it would also take care of installing the dependencies you need (e.g. requests):
~tmp damien$ virtualenv test && source test/bin/activate && pip install mymodule/
New python executable in test/bin/python
Installing setuptools, pip...done.
Unpacking ./mymodule
Running setup.py (path:/var/folders/cs/nw44s66532x_rdln_cjbkmpm000lk_/T/pip-9uKQFC-build/setup.py) egg_info for package from file:///tmp/mymodule
Downloading/unpacking requests (from mymodule==0.0.0)
Using download cache from /Users/damien/.pip_download_cache/https%3A%2F%2Fpypi.python.org%2Fpackages%2F2.7%2Fr%2Frequests%2Frequests-2.4.1-py2.py3-none-any.whl
Installing collected packages: requests, mymodule
Running setup.py install for mymodule
Installing mycommand script to /tmp/test/bin
Successfully installed requests mymodule
Cleaning up...
(test)~tmp damien$ mycommand
requests is present. kudos!
more useful commands with argparse:
If you want to use argparse then...
# mymodule/script.py
import argparse

def foobar(args):
    ...  # the actual work goes here

def main():
    parser = argparse.ArgumentParser()
    # parser.add_argument(...)
    args = parser.parse_args()
    foobar(args)
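For instance, with a hypothetical --verbose flag, just to show where the add_argument calls go:

def main():
    parser = argparse.ArgumentParser(description='mymodule command line tool')
    parser.add_argument('--verbose', action='store_true', help='print extra output')
    args = parser.parse_args()
    foobar(args)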
There are a few ways to do this. One way is to surround the import statement with a try...except ImportError block and then have some Python code that installs the package if the ImportError exception is raised, something like:
try:
    import MultipartPostHandler
except ImportError:
    # code that installs MultipartPostHandler and then imports it
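A sketch of what that except branch could contain, assuming pip is available for the interpreter running the script and that the package is installable under that name:

import subprocess, sys

try:
    import MultipartPostHandler
except ImportError:
    # install into the current interpreter's environment, then retry the import
    subprocess.check_call([sys.executable, '-m', 'pip', 'install', 'MultipartPostHandler'])
    import MultipartPostHandler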
I don't think this approach is very clean, though. Plus, if there are other, unrelated import problems, they won't be distinguished from a missing package here. A better approach might be to have a bash script that checks whether the module is installed:
pip freeze | grep MultipartPostHandler
and if not, installs the module:
pip install MultipartPostHandler
Then we can safely run the original Python code.
EDIT: Actually, I like FLORET's answer better. The imp module is exactly what you want.
You should use the imp module. Here's an example:
import imp
import httplib2
import sys

try:
    import MultipartPostHandler
except ImportError:
    # Here you download the module source
    http = httplib2.Http()
    response, content = http.request('http://where_your_file_is.com/here')
    if response.status == 200:
        # Don't forget proper error handling and resource management here
        with open('MultipartPostHandler.py', 'w') as f:
            f.write(content)
        file, pathname, description = imp.find_module('MultipartPostHandler')
        MultipartPostHandler = imp.load_module('MultipartPostHandler', file, pathname, description)
    else:
        sys.exit('Unable to download the file')
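Note that on Python 3, imp is deprecated; importlib does the same job once the file has been written somewhere on sys.path (a small sketch):

import importlib

# equivalent of the imp.find_module/load_module pair above
MultipartPostHandler = importlib.import_module('MultipartPostHandler')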
For a full approach, use a queue:
download_list = []

try:
    import FirstModule
except ImportError:
    download_list.append('FirstModule')

try:
    import SecondModule
except ImportError:
    download_list.append('SecondModule')

if download_list:
    # Here do the routine to download, install and load the modules
    pass

# The main routine
def main():
    the_response_is(42)
If the file you download is a binary, write it by opening the file with open(filename, 'wb').
I hope it helps.
BR