I'm trying to install a Python package on Windows 10 using the following setup.py file.
"""Setup file for uhd module"""
from setuptools import setup
setup(name='uhd',
version='3.14.0',
description='Universal Software Radio Peripheral (USRP) Hardware Driver Python API',
classifiers=[
'Development Status :: 4 - Beta',
'License :: OSI Approved :: GNU General Public License v3 (GPLv3)',
'Programming Language :: C++',
'Programming Language :: Python',
'Topic :: System :: Hardware :: Hardware Drivers',
],
keywords='SDR UHD USRP',
author='Ettus Research',
author_email='packages@ettus.com',
url='https://www.ettus.com/',
license='GPLv3',
package_dir={'': 'C:/Users/bcollins/UHD_PY/uhd/host/build/python'},
package_data={'uhd': ['*.so']},
zip_safe=False,
packages=['uhd'],
install_requires=['numpy'])
I execute the script using the command
python setup.py install
I do this from the directory that contains the setup.py file.
This returns the following error
error: package directory 'C:Users\bcollins\UHD_PY\uhd\host\build\python\uhd' does not exist
There is a folder called "uhd" at that location, though, and it contains the __init__.py file.
If the script isn't looking for this folder, what is it looking for?
I'm not exactly experienced in this area, but my best guess is that it's looking for a .so file within the "uhd" folder at that location. I'm not sure, though.
I am using python 2.7.
This doesn't answer the original question, but it's how I fixed the same error.
I had:
from setuptools import setup, find_packages
setup(
...
packages=find_packages('src', exclude=['test']),
...
)
I had added the src argument because my packages are located in src, but it turns out find_packages is smart enough on its own.
Remove the first argument:
from setuptools import setup, find_packages
setup(
...
packages=find_packages(exclude=['test']),
...
)
This was on Python 3.5, but I imagine it applies to most other versions.
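For context, find_packages(where) simply walks a directory tree and collects folders that contain an __init__.py. A quick sketch with a made-up src/ layout:

```python
import os
import tempfile

from setuptools import find_packages

# Throwaway src/ layout: only directories containing __init__.py
# count as packages.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "src", "mypkg"))
open(os.path.join(root, "src", "mypkg", "__init__.py"), "w").close()

# Searching inside src/ finds the package:
print(find_packages(os.path.join(root, "src")))  # ['mypkg']
```

Note that with find_packages('src') the returned names are relative to src, so setup() also needs package_dir={'': 'src'} to locate them on disk; forgetting that pairing is a common source of the "package directory does not exist" error.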
package_dir has to be a relative path, not an absolute path. The distutils layer under setuptools tries to reject absolute paths, but the C: confuses it. It ends up converting your path to
C:Users\bcollins\UHD_PY\uhd\host\build\python\uhd
Note the missing backslash between C: and Users. That path is interpreted relative to your current working directory on the C: drive (Windows keeps a separate working directory per drive), and relative to your working directory the path is invalid.
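You can see the drive-relative behaviour with the ntpath module, which applies Windows path rules on any platform:

```python
import ntpath  # Windows path semantics, usable anywhere

# 'C:Users\...' (no backslash after the colon) is *drive-relative*:
# it means Users\... relative to the current directory on drive C:.
print(ntpath.isabs(r"C:Users\bcollins"))   # False -> treated as relative
print(ntpath.isabs(r"C:\Users\bcollins"))  # True  -> truly absolute

# Joining shows how the path gets resolved against the C: working directory:
print(ntpath.join(r"C:\current\dir", r"C:Users\bcollins"))
# C:\current\dir\Users\bcollins
```

So the fix is to give package_dir a path relative to the directory containing setup.py, e.g. something like package_dir={'': 'host/build/python'} (the exact relative path depends on where setup.py lives in the tree).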
I found out that this error can also occur when the Python Scripts folder (%python_root%\Scripts) is not in the PATH environment variable.
I had this problem, it turned out that you just need to add a slash after your package directory: packages=['uhd'] should be packages=['uhd/'].
When I install one of my own Python applications from PyPI, it fails to run, citing:
File "/home/me/.local/lib/python3.9/site-packages/refrapt/refrapy.py", line 20, in
from classes import (
ModuleNotFoundError: No module named 'classes'.
I have the following directory layout in my local area:
/refrapt
    setup.py
    /refrapt
        classes.py
        helpers.py
        refrapt.conf
        refrapt.py
        settings.py
        __init__.py
To build the project, I'm using setuptools, running the following command:
python setup.py sdist bdist_wheel
This builds and works happily enough, and I'm able to upload the resulting /dist.
I then install the project using pip3 install refrapt. When I run it using refrapt, I get the error ModuleNotFoundError above.
When I run the development code locally, it runs fine, but installed via pip, it fails. I assume it's a problem with my setup.py, but this is my first time and I haven't really a clue what is correct. I tried adding the __init__.py (which is empty) as suggested by some Python docs, but to no avail. The contents of setup.py are as follows:
import pathlib
from setuptools import setup, find_packages
HERE = pathlib.Path(__file__).parent
README = (HERE / "README.md").read_text()
setup(
name='Refrapt',
version='0.1.5',
description='A tool to mirror Debian repositories for use as a local mirror.',
python_requires='>=3.9',
long_description=README,
long_description_content_type="text/markdown",
packages=find_packages(),
install_requires=[
'Click >= 7.1.2',
'Colorama >= 0.4.4',
'tqdm >= 4.60.0',
'wget >= 3.2',
'filelock >= 3.0.12'
],
classifiers=[
"Development Status :: 4 - Beta",
"Operating System :: Microsoft :: Windows :: Windows 10",
"Operating System :: POSIX :: Linux",
"Programming Language :: Python :: Implementation",
"Topic :: System :: Archiving :: Mirroring"
],
keywords=['Mirror', 'Debian', 'Repository'],
entry_points='''
[console_scripts]
refrapt=refrapt:refrapt
''',
)
If anyone could help, I'd greatly appreciate. I'm out of my depth on this one, and haven't been able to find an answer so far.
from classes import …
In Python 2 this was a relative import: the statement imports classes from the directory of the importing module.
In Python 3 the default changed to absolute imports. The import fails because there is no global module or package named classes. You need to convert the import to an absolute or an explicit relative one. Either
from refrapt.classes import …
or
from .classes import …
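The difference can be demonstrated with a throwaway package (the names here are made up for the demo): the implicit form fails on Python 3, while the explicit relative form works because the module lives inside a package.

```python
import os
import sys
import tempfile

# Build a tiny package 'pkg' with a relative import, then import it.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "pkg")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "classes.py"), "w") as f:
    f.write("VALUE = 42\n")
with open(os.path.join(pkg, "main.py"), "w") as f:
    # 'from classes import VALUE' would raise ModuleNotFoundError here on
    # Python 3; the explicit relative form resolves within the package.
    f.write("from .classes import VALUE\n")

sys.path.insert(0, root)
from pkg.main import VALUE
print(VALUE)  # 42
```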
Potentially I've found out the answer to my question, but it's not the answer I wanted.
I spun up a virtual environment, and installed an application that I've used before via pip. When I went to run the app, I got the ModuleNotFoundError: No module named 'app'.
I tried to run it manually via the .py file by using python .\.venv\Lib\site-packages\app\cli.py, which resulted in the same error.
Seems to be something about the environment setup in Windows VS Code that just doesn't operate the same as on a Linux machine.
I guess I'll just have to remove the "refrapt." prefix from import statements when developing locally, and then add it back when pushing to GitHub.
I have an internal tool that we are using at work to pull some data and generate a report. The tool was written in python. I made some changes to the tool, and decided to tidy up the folder structure which seemed a bit messy to me. I probably broke something, but I'm determined to fix it and figure out what the issue is.
The project is named ReportCardLite, and my folder structure is like this:
ReportCardLite (root folder)
    Folder 1/
    Folder 2/
    rclite/
        __init__.py
        A.py
        B.py
        C.py
    setup.py
    setup.cfg
    __init__.py
Originally, the author was importing by using the package name, which was odd to me since all the script files were in the same directory. So, in A.py for example, he would say something like "from rclite.B import fun".
I decided to remove the "unnecessary" module name from before all the import statements. Of course, that broke it, but I quickly figured out that I could add a line in my settings.json file to look within the rclite folder for all modules. Now my scripts were importing from one another without the module name, and running fine from within the IDE terminal window.
I next needed to build an executable from this module. The original author had included a setup.py and a setup.cfg file, so I used that to build and install this. But when I run this new executable, I receive errors that modules cannot be found. If I change it back to how it originally was, namely the "rclite.A" qualifier, it runs fine. I've spent hours trying to understand what is going on here, and I'm just out of ideas and cannot find any relevant questions using Google.
Can someone kindly point out which configuration I need to change in order to not have to put "rclite" in front of the import statements? Thanks!
Here is the setup.py script; I didn't see anything in it that looked promising.
"""ReportCardLite Packaging
See:
https://code.amazon.com/packages/ReportCardLite/trees/mainline
"""
# Always prefer setuptools over distutils
from codecs import open
from os import path
from setuptools import setup, find_packages
from cx_Freeze import setup, Executable
here = path.abspath(path.dirname(__file__))
import sys
# Get the long description from the README file
with open(path.join(here, 'README.md'), encoding='utf-8') as f:
long_description = f.read()
setup(name='ReportCardLite',
# Versions should comply with PEP440. For a discussion on single-sourcing
# the version across setup.py and the project code, see
# https://packaging.python.org/en/latest/single_source_version.html
version='0.9.0',
description='RC.Lite for SIM',
long_description=long_description,
# See https://pypi.python.org/pypi?%3Aaction=list_classifiers
classifiers=[
# How mature is this project? Common values are
# 3 - Alpha
# 4 - Beta
# 5 - Production/Stable
'Development Status :: 3 - Alpha',
# Indicate who your project is intended for
# 'Intended Audience :: Developers',
# 'Topic :: Software Development :: Build Tools',
# Pick your license as you wish (should match "license" above)
# 'License :: OSI Approved :: MIT License',
# Specify the Python versions you support here. In particular, ensure
# that you indicate whether you support Python 2, Python 3 or both.
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6'
],
# You can just specify the packages manually here if your project is
# simple. Or you can use find_packages().
packages=find_packages(exclude=['test']),
#packages = ['rclite'],
# Alternatively, if you want to distribute just a my_module.py, uncomment
# this:
# py_modules=["my_module"],
# List run-time dependencies here. These will be installed by pip when
# your project is installed. For an analysis of "install_requires" vs pip's
# requirements files see:
# https://packaging.python.org/en/latest/requirements.html
install_requires=[
'isoweek',
'jinja2',
'jsonpickle',
'jsonplus',
'markdown',
'pkg-resources',
'premailer',
'python-dateutil',
'pyxdg',
'six'
],
options={
'build_exe': {
'packages': ['lxml', 'asyncio', 'markdown.extensions'],
'include_files': ['templates/', 'webapp/']
},
},
# executables = [Executable("rclite/aiohttp_app.py", base="Win32GUI")],
executables=[Executable(script="rclite/aiohttp_app.py", targetName="run_rclite.exe")],
# List additional groups of dependencies here (e.g. development
# dependencies). You can install these using the following syntax,
# for example:
# $ pip install -e .[dev,test]
extras_require={
'dev': ['wheel'],
'test': ['mock'],
},
# If there are data files included in your packages that need to be
# installed, specify them here. If using Python 2.6 or less, then these
# have to be included in MANIFEST.in as well.
package_data={
# 'sample': ['package_data.dat'],
},
# Although 'package_data' is the preferred approach, in some case you may
# need to place data files outside of your packages. See:
# http://docs.python.org/3.4/distutils/setupscript.html#installing-additional-files # noqa
# In this case, 'data_file' will be installed into '<sys.prefix>/my_data'
# data_files=[('my_data', ['data/data_file'])],
# To provide executable scripts, use entry points in preference to the
# "scripts" keyword. Entry points provide cross-platform support and allow
# pip to create the appropriate form of executable for the target platform.
entry_points={
'console_scripts': [
'rclite = rclite.__main__:main',
'rclite-gui = rclite.aiohttp_app.__main__'
],
},
)
I want to create a pip package which dependent on some OS specific files:
Let's say there are:
dependency_Windows_x86_64.zip
dependency_Linux_x86_64.zip
dependency_MAC_OS_X.zip
I do not want to include all three archives in the package project, but download the right one dynamically during pip install my-package based on the user's OS. How can I do that? Where should I put the code responsible for downloading/unzipping those files?
My setup.py looks like this:
from setuptools import setup
setup(
name='my-package',
version='0.0.1',
description='Package description',
py_modules=['my_package'],
package_dir={'': 'src'},
classifiers=[
'Intended Audience :: Developers',
'License :: OSI Approved :: MIT License',
'Operating System :: POSIX :: Linux',
'Operating System :: Microsoft :: Windows',
'Operating System :: MacOS',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.7'
],
python_requires='>=3.7'
)
The platform specific dependencies could be kept in separate Python projects (wrappers around data only packages) and then required from the main project like the following:
# setup.cfg
# ...
[options]
install_requires =
my_package_win_amd64 ; platform_system=="Windows" and platform_machine=="x86_64"
my_package_linux-x86_64 ; platform_system=="Linux" and platform_machine=="x86_64"
This approach doesn't depend on setuptools and could be used with other build systems.
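For projects that still use a setup.py, the same environment markers can be written directly in install_requires. A sketch (the my_package_* names are placeholders, as above):

```python
# setup.py (fragment): per-platform requirements via PEP 508 markers
from setuptools import setup

setup(
    name="my_package",
    version="0.0.1",
    install_requires=[
        'my_package_win_amd64 ; platform_system=="Windows" and platform_machine=="x86_64"',
        'my_package_linux_x86_64 ; platform_system=="Linux" and platform_machine=="x86_64"',
    ],
)
```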
The first answer is to give up and use setuptools; look at "Platform specific dependencies" for a good write-up of what to do.
The second answer is to make separate packages such as mylib-mac, mylib-win, mylib-linux.
The third answer is to use the "console_script" approach, but this will break: while it generates .exe files on Windows, it has odd failure modes. Also, some users will not be able to dynamically download files because they work from an internal clone of a repository, and randomly running code from the Internet on production machines can scare people.
Hope this helps!
A solution could be to publish platform specific Python wheels of your project. The platform specific files could be added to the pre-built distributions via a custom setuptools command (probably a sub-command of build, or maybe install).
This is not a full solution, but something like this might be a good start:
#!/usr/bin/env python3
import distutils.command.build
import setuptools


class build_something(setuptools.Command):
    user_options = [
        ('plat-name=', 'p', "platform name to build for"),
    ]

    def initialize_options(self):
        self.plat_name = None

    def finalize_options(self):
        self.set_undefined_options('bdist_wheel', ('plat_name', 'plat_name'))

    def run(self):
        print(" *** plat_name: {} ***".format(self.plat_name))
        print(" *** download the platform specific bits to 'build' ***")


class build(distutils.command.build.build):
    sub_commands = [(
        'build_something',
        None,
    )] + distutils.command.build.build.sub_commands


setuptools.setup(
    cmdclass={
        'build_something': build_something,
        'build': build,
    },
    # ...
)
And then the Python wheels could be built like this:
$ ./setup.py bdist_wheel -p win_amd64
I made a Python library (my first one) and published it on PyPI and GitHub.
The library works very well, but the setup() doesn't.
When I install it with pip install, it downloads the appfly package but does not install the requirements: Flask, flask_cors, Flask-SocketIO and jsonmerge, so I need to install them myself.
If I install the dependencies myself it works very well, but I think that's the wrong way to use a Python library, right?
Here is my setup.py file; am I doing something wrong?
from setuptools import setup, find_packages
from appfly import __version__ as version
with open('README.md') as readme_file:
readme = readme_file.read()
# with open('HISTORY.md') as history_file:
# history = history_file.read()
requirements = [
'Flask==1.0.2',
'flask_cors==3.0.6',
'Flask-SocketIO==3.0.2',
'jsonmerge==1.5.2'
]
setup(
author="Italo José G. de Oliveira",
author_email='italo.i@live.com',
classifiers=[
'Natural Language :: English',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
],
description="This pkg encapsulate the base flask server configurations",
install_requires=requirements,
license="MIT license",
long_description=readme,
include_package_data=True,
keywords='appfly',
name='appfly',
packages=find_packages(),
url='https://github.com/italojs/appfly',
version=version,
zip_safe=False,
)
The reason for this error is that the setup.py imports from the package. This means that Python will try importing the library while processing setup.py (i.e. before any of the dependencies get installed).
Since you are only importing the package to get the version information, this import can be replaced with a different method.
An easy way to do this is to include the version information directly in the setup.py, but the drawback with this is that the version is no longer single sourced.
Other methods involve a bit of work but allow the version information to continue to be single sourced. See https://packaging.python.org/guides/single-sourcing-package-version/ for recommendations. That page has a list of options, some of which may be better suited to your package setup than others. I personally prefer option 3:
Set the value to a __version__ global variable in a dedicated module
in your project (e.g. version.py), then have setup.py read and exec
the value into a variable.
...
Using exec:
version = {}
with open("...sample/version.py") as fp:
exec(fp.read(), version)
# later on we use: version['__version__']
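A runnable version of that exec recipe, with a made-up temp directory standing in for the real package path:

```python
import os
import tempfile

# Stand-in for the package's version.py (contents assumed for the demo).
pkg_dir = tempfile.mkdtemp()
with open(os.path.join(pkg_dir, "version.py"), "w") as f:
    f.write('__version__ = "0.1.5"\n')

# The exec approach: evaluate version.py into a dict instead of importing
# the package (so no package dependencies are needed at setup time).
version = {}
with open(os.path.join(pkg_dir, "version.py")) as fp:
    exec(fp.read(), version)
print(version["__version__"])  # 0.1.5
```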
You can also define the version in the __init__.py of your package like:
__version__ = "1.1.0"
Then, instead of importing __version__ in your setup.py, you can read the __init__.py and extract the version.
Actually, this is the solution proposed in the official Python guides:
https://packaging.python.org/guides/single-sourcing-package-version/
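One way to do that read-and-extract step, sketched with a regex and a throwaway __init__.py (the pattern assumes a simple single-line __version__ assignment):

```python
import os
import re
import tempfile

# Stand-in for the package's __init__.py.
pkg = tempfile.mkdtemp()
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write('__version__ = "1.1.0"\n')

# Pull __version__ out with a regex instead of importing the package,
# so setup.py never triggers the package's own imports.
with open(os.path.join(pkg, "__init__.py")) as f:
    version = re.search(
        r'^__version__\s*=\s*["\']([^"\']+)["\']', f.read(), re.M
    ).group(1)
print(version)  # 1.1.0
```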
Another option could be using versioneer package (pandas uses it).
versioneer uses Git tags to create the version value.
Steps
Step 0: Use git tags
Versioneer uses your Git tags.
Step 1: Install versioneer
pip install versioneer
Step 2: Execute versioneer
versioneer install
Step 3: Follow the versioneer instructions
Versioneer will ask you to do some changes in your setup.py and setup.cfg.
In the setup.py you'll have to add something like:
import versioneer
setup(
version=versioneer.get_version(),
cmdclass=versioneer.get_cmdclass()
)
And in setup.cfg:
[versioneer]
VCS = git
style = pep440
versionfile_source = your_project_name/_version.py
versionfile_build = your_project_name/_version.py
tag_prefix = v
parentdir_prefix =
Note: In the example I set "tag_prefix = v" because I like tagging like: v0.1.0, v1.0.0 and so on
After that, try:
python setup.py version
Ok, I want to create a pre-built Python package which contains a C module. That is, at the end of it I want to have a tarball which contains everything needed to use my module and is pip install-able, i.e. at the end I can do a:
pip install whatevertarballgetsproduced.tar.gz
and mylibrary will be available. It also needs to be virtual environment friendly.
My current directory structure is:
project/
+ setup.py
+ mylibrary/
+ __init__.py
+ mylibrary.py
+ _mylibrary.so
+ README
That is, the compiled C library is in _mylibrary.so. The C source from which this file is derived is NOT to be included in the tarball. I am also doing this on OSX (Lion). mylibrary.py simply contains Python wrappers to the C library code.
How do I achieve this? I thought about doing a python setup.py bdist but this isn't really what I want (unless I'm missing something the tarball produced by that isn't pip install-able).
For the sake of completion, my setup.py looks like:
from setuptools import setup
setup(
name='mylibrary_py3mac',
version='0.1.1',
description='My library which is tied to OSX & Python 3',
long_description=open('README').read(),
packages=['mylibrary'],
classifiers = [
'Intended Audience :: Developers',
'License :: OSI Approved :: BSD License',
'Operating System :: MacOS :: MacOS X',
'Programming Language :: Python :: 3',
],
)
Note, I don't plan on distributing this tarball publicly, it is for internal deployment purposes only (hence why I don't care about it being precompiled, or tied to OSX only).
Adam, have you tried setup.py bdist_egg?
It will produce an egg file that can be installed using easy_install (not with pip, unfortunately).
Also, it looks like your project is missing mylibrary/__init__.py.
Once you add the missing "init" file, I am curious to know if setup.py will be smart enough to include the "so" file into the resulting egg file. There's a chance you have to prod it somehow (I myself haven't had a reason to deal with ".so" Python extensions, so I don't know how to).
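If it does need prodding, one common way (a sketch, not tested against this exact project) is to declare the prebuilt .so as package data so setuptools copies it into whatever distribution it builds:

```python
# setup.py (fragment): force the prebuilt extension into the distribution
from setuptools import setup

setup(
    name='mylibrary_py3mac',
    version='0.1.1',
    packages=['mylibrary'],
    package_data={'mylibrary': ['_mylibrary.so']},  # ship the compiled library
    zip_safe=False,  # keep it unzipped so the .so can be loaded from disk
)
```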
In the following example, I have gevent-1.0b2-py2.7-macosx-10.4-x86_64.egg that was built using its setup.py bdist_egg. I've placed it into a pypi/ directory under the current directory. I know that it happens to have other dependencies, which I've also strategically placed into the same pypi/ directory.
easy_install --allow-hosts=None \
--find-links=pypi/ \
--always-unzip \
pypi/gevent-1.0b2-py2.7-macosx-10.4-x86_64.egg
Your case is simpler, because you don't seem to have any extra dependencies.
So you'd do something like:
easy_install --always-unzip \
dist/mylibrary_py3mac-0.1.1-py2.7-macosx-10.4-x86_64.egg
Note that the dist/ directory is where built egg files go when setup.py bdist_egg is invoked.