Importing package in another package's module? - python

I am building a package (which I am then going to upload to PyPI). I have 3 modules in this package, and these modules have a few dependencies such as numpy, cv2, and more.
So, I mentioned my dependencies in the setup.py file.
# -*- coding: utf-8 -*-
import setuptools

setuptools.setup(
    name='StyleTransferTensorFlow',
    url='https://github.com/LordHarsh/Neural_Style_Transfer',
    author='Hash Banka',
    author_email='harshbanka321#gmail.com',
    packages=setuptools.find_packages(),
    # Needed for dependencies
    install_requires=['matplotlib', 'tensorflow', 'os-win', 'ffmpy', 'glob2',
                      'pytest-shutil', 'pytube', 'opencv-python', 'pillow',
                      'numpy', 'tensorflow_hub'],
    version='0.0.8',
    # The license can be anything you like
    license='MIT',
    description="Package to apply style transfer on different frames of a video",
    # We will also need a readme eventually (there will be a warning)
    # long_description=open('README.txt').read(),
    python_requires='>=3.6',
)
I have also imported them in the __init__.py file present in the same directory as the modules.
import matplotlib.pyplot as plt
import tensorflow as tf
import tensorflow_hub as hub
import numpy as np
from pytube import YouTube
import os
import cv2
from PIL import Image
import shutil
import glob
import ffmpy
But I still get an error when I execute code in the module:
NameError: name 'cv2' is not defined
I am not sure why my imports are not running. I use cv2 inside one of the modules to do the task, so I am not sure what I am doing wrong. Please help me out.

I am not sure you fully understand what install_requires and packages are for.
install_requires specifies the libraries that are required for your installation through setup.py to work, e.g. when installing it with pip or python setup.py install. You are specifying what should already be on your computer for the installation to work. No package will be installed: if a package listed here is missing, it will simply throw an error during installation.
A very common package to include there is numpy, as you may import it in setup.py itself, for example if you have some C or FORTRAN code which needs to be compiled upon installation.
The packages argument shows which packages are needed for your library to work, so basically which packages to install along with the library. It will check whether each package is already installed on your machine, and if not, install it along with the library.
What I would do is empty the install_requires argument entirely. Don't even specify it if you import none of these packages in setup.py. If it still doesn't work, I would replace setuptools.find_packages() with the list you are currently providing to install_requires.

Related

AWS Lambda function in python not running and saying scikit-learn has not been built correctly

I have the following, very simple python code in a lambda function:
from sklearn.externals import joblib
import praw
import datetime
from operator import attrgetter
import sys
def handler_name(event, context):
    return "I am a cat dog and i meow."
I have also done pip installs for scikit-learn, praw, datetime, numpy and scipy from within a python 2.7 virtualenv. I then compressed my .py file along with everything in my virtualenv's /lib/python2.7/site-packages folder into a zip and uploaded it to AWS lambda. Unfortunately when I run the code I get the following error:
Unable to import module 'mainLambda': /var/task/sklearn/__check_build/_check_build.so: invalid ELF header
___________________________________________________________________________
Contents of /var/task/sklearn/__check_build:
setup.py _check_build.so __init__.pyc
__init__.py setup.pyc
___________________________________________________________________________
It seems that scikit-learn has not been built correctly.
If you have installed scikit-learn from source, please do not forget
to build the package before using it: run `python setup.py install` or
`make` in the source directory.
If you have used an installer, please check that it is suited for your
Python version, your operating system and your platform.
Obviously the issue is with scikit-learn; I have no idea what, though. It could be a versioning issue, but I downloaded all the libraries from within a virtualenv and chose a Python 2.7 lambda function. Any ideas? I am stumped!

ImportError: No module named datasets

from datasets import dataset_utils ImportError: No module named datasets.
This happens when I write the following in a Python script:
import tensorflow as tf
from datasets import dataset_utils
slim = tf.contrib.slim
But I am getting this error:
from datasets import dataset_utils
ImportError: No module named datasets
I found this solution
How can jupyter access a new tensorflow module installed in the right path?
I did the same, and I have the datasets package at anaconda/lib/python2.7/site-packages/. I am still getting the same error.
I solved it this way:
pip install datasets
You can find the folder address on your device and append it to system path.
import sys
sys.path.append(r"D:\Python35\models\slim\datasets"); import dataset_utils
You'll need to do the same with 'nets' and 'preprocessing'
sys.path.append(r"D:\Python35\models\slim\nets"); import vgg
sys.path.append(r"D:\Python35\models\slim\preprocessing"); import vgg_preprocessing
Datasets is present in https://github.com/tensorflow/models/tree/master/slim/datasets
Since 'models' is not installable from pip (at the time of writing), it is not available on the Python load path by default. So we either copy it or manually add it to the path.
Here is how I setup env before running the code:
# git clone or wget
wget https://github.com/tensorflow/models/archive/master.zip -O models.zip
unzip models.zip
# add it to Python PATH
export PYTHONPATH=$PYTHONPATH:$PWD/models-master/slim
# now we are good to call `python mytensorflow.py`
It's using the datasets package in the TF-slim image models library, which is in:
git clone https://github.com/tensorflow/models/
Having done that though, in order to import the module as shown in the example on the slim image page, empty __init__.py files have to be added to the models and models/slim directories.
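For example, after cloning, something like the following creates the missing files (a minimal sketch; the paths assume the repository was cloned into ./models as in the commands above):
# Create empty __init__.py files so the slim modules become importable.
# Paths are an assumption based on cloning into ./models.
import pathlib
for d in ("models", "models/slim"):
    pathlib.Path(d, "__init__.py").touch()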
Go to https://github.com/nschaetti/EchoTorch/releases and download the latest release.
Install the latest release from the downloaded file (202006291 is the latest version at the moment):
$ pip install ./EchoTorch-202006291.zip
Test it out using narma10_esn.py (other examples may have some issues).
You may still need to install some more Python packages not listed in the requirements file, but it works once you do this.

Import pillow without installing

I am working on a Python project that requires PIL to show images. However, the computers that I am working on often do not allow me to install things, and have a very bare bones python setup. For this reason, most of the modules that I need I simply place in the same directory as my python files.
I tried doing the same with PIL. I downloaded the Pillow source and copied the PIL folder into my project. I was then able to run "import PIL" with no problems. However, when I then tried to run "from PIL import Image" I got the error: "The _Imaging C module is not installed". From other searches I think that installing Pillow properly would fix this problem, however I would like PIL to be more portable and not require an installation.
Any ideas would be great. Thanks in advance.
One solution is to bundle PIL with the script in .egg form. Then you can import PIL directly from the .egg instead of having to install it:
How to create Python egg file
The basic process is as follows:
How to create egg:
Edit PIL's setup.py to use from setuptools import setup instead of the normal setup import (see the sketch after this list)
Run python setup.py bdist_egg
Egg will be inside of dist/
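For reference, the change in the first step is roughly this (a sketch; the exact contents of PIL's setup.py vary by version):
# In PIL's setup.py, replace the distutils import
#     from distutils.core import setup
# with the setuptools one, so the bdist_egg command becomes available:
from setuptools import setup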
How to import egg:
Copy .egg file to script's directory and import desired modules:
import os
import sys

DIR = os.path.dirname(__file__)
sys.path.append(os.path.join(DIR, "./path/to/PIL.egg"))

# You can now import from PIL normally:
from PIL import Image

Having an issue creating an exe with py2exe and script importing xlrd

My goal is to create a python script that loops over cells of an excel document. This is my python script called reader.py, and it works just fine.
import xlrd
import os
exceldoc = raw_input("Enter the path to the doc [C:\\folder\\file.xlsx]: ")
wb = xlrd.open_workbook(exceldoc,'rb')
.... some code....
The problem I'm encountering is attempting to use py2exe to create an executable so this script can be used elsewhere.
Here is my setup.py file:
from distutils.core import setup
import py2exe
import sys
from glob import glob
setup(name='Excel Document Checker',console=['reader.py'])
I run the following command: python setup.py py2exe
It appears to run fine; it creates the dist folder that has my reader.exe file, but near the end of the command I get the following:
The following modules appear to be missing
['cElementTree', 'elementtree.ElementTree']
I did some searching online and tried the recommendations here: Re: Error: Element Tree not found, thus changing my setup.py file:
from distutils.core import setup
import py2exe
import sys
from glob import glob

options = {
    "py2exe": {
        "unbuffered": True,
        "optimize": 2,
        "includes": ['xml.etree.ElementPath', 'xml.etree.ElementTree', 'xml.etree.cElementTree'],
        "packages": ["elementtree", "xml"],
    }
}

setup(name='Excel Document Checker', options=options, console=['reader.py'])
I'm now getting an error:
ImportError: No module named elementtree
I'm sort of at an impasse here. Any help or guidance is greatly appreciated.
Just some information - I'm running Python 2.6 on a 32 bit system.
You explicitly told setup.py to depend on a package named elementtree here:
"packages": ["elementtree", "xml"]}}
There is no such package in the stdlib. There's xml.etree, but obviously that's not the same name.
The example you found is apparently designed for someone who has installed the third-party package elementtree, which is necessary if you need features added after Python 2.6's version of xml.etree, or if you need to work with Python 1.5-2.4, but not if you just want to use Python 2.6's version. (And anyway, if you do need the third-party package… then you have to install it or it won't work, obviously.)
So, just don't do that, and that error will go away.
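Concretely, the options from above without the nonexistent elementtree package would look like this (a sketch of the asker's configuration, not a complete setup.py):
options = {
    "py2exe": {
        "unbuffered": True,
        "optimize": 2,
        "includes": ['xml.etree.ElementPath', 'xml.etree.ElementTree', 'xml.etree.cElementTree'],
        # 'elementtree' removed: only the stdlib 'xml' package is needed here
        "packages": ["xml"],
    }
}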
Also, if your code (or the code you import, e.g. xlrd) is using xml.etree.cElementTree then, as the py2exe FAQ says, you must also import xml.etree.ElementTree before using it to get it working. (And you may also need to specify it manually as a dependency.)
You presumably don't want to change all the third-party modules you're using… but I believe that making sure to import xml.etree.ElementTree before importing any of those third-party modules works fine.
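A minimal sketch of what that looks like at the top of reader.py from the question (the extra import is only there so py2exe bundles the pure-Python ElementTree):
# Import the pure-Python ElementTree first so py2exe includes it and any
# module that uses cElementTree (such as xlrd) can fall back to it.
import xml.etree.ElementTree
import xlrd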

How to Bootstrap numpy installation in setup.py

I have a project which has a C extension which requires numpy. Ideally, I'd like whoever downloads my project to just be able to run python setup.py install or use one call to pip. The problem I have is that in my setup.py I need to import numpy to get the location of the headers, but I'd like numpy to be just a regular requirement in install_requires so that it will automatically be downloaded from the Python Package Index.
Here is a sample of what I'm trying to do:
from setuptools import setup, Extension
import numpy as np

ext_modules = [Extension('vme', ['vme.c'], extra_link_args=['-lvme'],
                         include_dirs=[np.get_include()])]

setup(name='vme',
      version='0.1',
      description='Module for communicating over VME with CAEN digitizers.',
      ext_modules=ext_modules,
      install_requires=['numpy', 'pyzmq', 'Sphinx'])
Obviously, I can't import numpy at the top before it's installed. I've seen a setup_requires argument passed to setup() but can't find any documentation on what it is for.
Is this possible?
The following works at least with numpy 1.8 and Python 2.6, 2.7, and 3.3:
from setuptools import setup
from setuptools.command.build_ext import build_ext as _build_ext

class build_ext(_build_ext):
    def finalize_options(self):
        _build_ext.finalize_options(self)
        # Prevent numpy from thinking it is still in its setup process:
        __builtins__.__NUMPY_SETUP__ = False
        import numpy
        self.include_dirs.append(numpy.get_include())

setup(
    ...
    cmdclass={'build_ext': build_ext},
    setup_requires=['numpy'],
    ...
)
For a small explanation of why it fails without the "hack", see this answer.
Note that using setup_requires has a subtle downside: numpy will not only be compiled before building extensions, but also when doing python setup.py --help, for example. To avoid this, you could check for command line options, as suggested in https://github.com/scipy/scipy/blob/master/setup.py#L205, but on the other hand I don't really think it's worth the effort.
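A rough sketch of that kind of command line check (simplified, not SciPy's exact code):
import sys
# Skip the numpy build requirement for purely informational commands,
# so e.g. `python setup.py --help` doesn't trigger a numpy build.
info_commands = ('--help', '--version', 'egg_info', 'clean')
needs_numpy = not any(arg in sys.argv for arg in info_commands)
setup_requires = ['numpy'] if needs_numpy else []
# ...then pass setup_requires=setup_requires to setup().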
I found a very easy solution in this post:
Or you can stick to https://github.com/pypa/pip/issues/5761. Here you install Cython and numpy using setuptools.dist before the actual setup:
from setuptools import dist
dist.Distribution().fetch_build_eggs(['Cython>=0.15.1', 'numpy>=1.10'])
Works well for me!
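In context, the snippet goes at the top of setup.py, before numpy is imported; a sketch using the extension from the question:
from setuptools import dist, setup, Extension
# Fetch build-time dependencies before importing them below.
dist.Distribution().fetch_build_eggs(['numpy>=1.10'])
import numpy as np  # safe to import now

setup(name='vme',
      ext_modules=[Extension('vme', ['vme.c'], extra_link_args=['-lvme'],
                             include_dirs=[np.get_include()])],
      install_requires=['numpy'])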
This is a fundamental problem with packages that need to use numpy (for distutils or get_include). I do not know of a way to "boot-strap" it using pip or easy-install.
However, it is easy to make a conda package for your module and provide the list of dependencies so that someone can just do a conda install pkg-name which will download and install everything needed.
Conda is available in Anaconda or in Miniconda (python + just conda).
See this website: http://docs.continuum.io/conda/index.html
or this slide-deck for more information: https://speakerdeck.com/teoliphant/packaging-and-deployment-with-conda
The key is to defer importing numpy until after it has been installed. A trick I learned from this pybind11 example is to import numpy in the __str__ method of a helper class (get_numpy_include below).
from setuptools import setup, Extension

class get_numpy_include(object):
    """Defer numpy.get_include() until after numpy is installed."""

    def __str__(self):
        import numpy
        return numpy.get_include()

ext_modules = [Extension('vme', ['vme.c'], extra_link_args=['-lvme'],
                         include_dirs=[get_numpy_include()])]

setup(name='vme',
      version='0.1',
      description='Module for communicating over VME with CAEN digitizers.',
      ext_modules=ext_modules,
      install_requires=['numpy', 'pyzmq', 'Sphinx'])
To get pip to work, you can do it similarly to SciPy: https://github.com/scipy/scipy/blob/master/setup.py#L205
Namely, the egg_info command needs to be passed to standard setuptools/distutils, but other commands can use numpy.distutils.
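A simplified sketch of that dispatch (not SciPy's exact code):
import sys
if len(sys.argv) >= 2 and sys.argv[1] == 'egg_info':
    # Metadata-only command: plain setuptools is enough, and numpy may
    # not be installed yet when pip runs this.
    from setuptools import setup
else:
    # For actual builds numpy is available, so numpy.distutils can be used.
    from numpy.distutils.core import setup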
Perhaps a more practical solution is to just require numpy to be installed beforehand and import numpy inside a function scope. @coldfix's solution works, but compiling numpy takes forever. It is much faster to pip install it first as a wheel, especially now that we have wheels for most systems thanks to efforts like manylinux.
from __future__ import print_function

import sys
import textwrap

import pkg_resources
from setuptools import setup, Extension

def is_installed(requirement):
    try:
        pkg_resources.require(requirement)
    except pkg_resources.ResolutionError:
        return False
    else:
        return True

if not is_installed('numpy>=1.11.0'):
    print(textwrap.dedent("""
            Error: numpy needs to be installed first. You can install it via:
            $ pip install numpy
            """), file=sys.stderr)
    exit(1)

def ext_modules():
    import numpy as np
    some_extension = Extension(..., include_dirs=[np.get_include()])
    return [some_extension]

setup(
    ext_modules=ext_modules(),
)
This should now (since 2018-ish) be solved by adding numpy as a buildsystem dependency in pyproject.toml, so that pip install makes numpy available before it runs setup.py.
The pyproject.toml file should also specify that you're using Setuptools to build the project. It should look something like this:
[build-system]
requires = ["setuptools", "wheel", "numpy"]
build-backend = "setuptools.build_meta"
See Setuptools' Build System Support docs for more details.
This doesn't cover the many uses of setup.py other than install, but as those are mainly for you (and other developers of your project), an error message saying to install numpy might work.
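With that pyproject.toml in place, setup.py can import numpy at module level; a sketch based on the extension from the question:
from setuptools import setup, Extension
import numpy as np  # provided at build time by the pyproject.toml requirement

setup(name='vme',
      ext_modules=[Extension('vme', ['vme.c'], extra_link_args=['-lvme'],
                             include_dirs=[np.get_include()])],
      install_requires=['numpy'])  # numpy is still needed at run time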
@coldfix's solution doesn't work for Cython extensions if Cython isn't pre-installed on the target machine, as it fails with the error
error: unknown file type '.pyx' (from 'xxxxx/yyyyyy.pyx')
The reason for the failure is the premature import of setuptools.command.build_ext: when imported, it tries to use Cython's build_ext functionality:
try:
    # Attempt to use Cython for building extensions, if available
    from Cython.Distutils.build_ext import build_ext as _build_ext
    # Additionally, assert that the compiler module will load
    # also. Ref #1229.
    __import__('Cython.Compiler.Main')
except ImportError:
    _build_ext = _du_build_ext
Normally setuptools succeeds, because the import happens after the setup_requires entries are fulfilled. However, by importing it already in setup.py, only the fallback solution can be used, which doesn't know anything about Cython.
One possibility to bootstrap Cython alongside numpy would be to postpone the import of setuptools.command.build_ext with the help of an indirection/proxy:
# factory function
def my_build_ext(pars):
    # import delayed:
    from setuptools.command.build_ext import build_ext as _build_ext

    # include_dirs adjusted:
    class build_ext(_build_ext):
        def finalize_options(self):
            _build_ext.finalize_options(self)
            # Prevent numpy from thinking it is still in its setup process:
            __builtins__.__NUMPY_SETUP__ = False
            import numpy
            self.include_dirs.append(numpy.get_include())

    # object returned:
    return build_ext(pars)

...

setup(
    ...
    cmdclass={'build_ext': my_build_ext},
    ...
)
There are other possibilities, discussed for example in this SO-question.
You can simply add numpy to your pyproject.toml file. This works for me:
[build-system]
requires = [
    "setuptools>=42",
    "wheel",
    "Cython",
    "numpy"
]
build-backend = "setuptools.build_meta"
