I have a set of namespace packages that are meant to run in a python3.6 environment.
They are each set up as follows:
if sys.version_info < (3, 6):
    print("Python versions < 3.6 unsupported", file=sys.stderr)
    sys.exit(1)
setup(
    name="mynamespace.subpackage",
    version=VERSION,
    packages=[
        "mynamespace.subpackage",
    ],
    package_dir={"": "src"},
    package_data={
        "": [],
    },
    include_package_data=True,
    zip_safe=False,
    install_requires=[
        "mynamespace.core",  # May have explicit dependencies that are not cyclic
    ],
    namespace_packages=["mynamespace"],
    ...
)
All of the subpackages install just fine side-by-side.
The problem comes when I want robust type-checking via mypy. mypy is unable to find the mynamespace.core subpackage when run on the source files for mynamespace.subpackage (for example), which means I don't get type checking across my sub-package boundaries.
This appears to be a known problem: https://github.com/python/mypy/issues/1645
Guido mentions that the workaround is to "add a dummy __init__.py or __init__.pyi files", but he doesn't really elaborate, and it turns out this isn't as obvious as I had hoped. Adding those files to the local repo allows mypy to run over the local repo as expected, but I can't figure out how to get access to the typing information in a sibling namespace package.
My question is: how would I modify mynamespace.core so that, when installed, mypy is able to pick up its type information in other modules?
Hopefully you've fixed it by now (or given up!!) but for any fellow dwellers, the answer is to run:
mypy --namespace-packages -p mynamespace.subpackage
Note that when you use the -p arg you cannot specify a directory as well.
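Beyond the command-line flag, the usual way to make an installed package expose its types (assuming you control mynamespace.core) is PEP 561: ship an empty py.typed marker file inside the package and declare it in package_data. A sketch against the layout from the question:

```python
# Sketch for mynamespace.core's setup.py: the py.typed marker (an empty
# file placed next to the package's modules under src/) tells type
# checkers such as mypy to read the installed package's annotations.
setup(
    name="mynamespace.core",
    packages=["mynamespace.core"],
    package_dir={"": "src"},
    package_data={"mynamespace.core": ["py.typed"]},  # PEP 561 marker
    zip_safe=False,  # PEP 561 packages must not be installed zipped
    namespace_packages=["mynamespace"],
)
```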
Related
I'm trying to build an egg for my python project using setuptools; however, whenever I build an egg, all of the contents are built with the first letter of each file/folder removed.
For example, my parent folder is called dp which gets renamed to p. I.e. when I unzip the egg file, I see a parent folder named p and another folder named GG-INFO (these should be named dp and EGG-INFO respectively). All of the other folders inside folder p are named correctly.
This is an issue because I reference functions in modules within that folder - e.g. from dp.module import function which doesn't work because it complains about not finding the folder dp (which is true since for some reason it's been renamed p).
My setup.py file looks like this:
from setuptools import setup, find_packages

setup(
    name="dp",
    version="1.0",
    author="XXXX",
    author_email="XXXX",
    description="Data pipeline for XXX algorithm.",
    long_description_content_type="text/markdown",
    url="XXXX",
    packages=find_packages(),
    package_data={'': ['*.sql', '*.json', '*.txt']},
    include_package_data=True,
    classifiers=[
        "Programming Language :: Python :: 3"
    ],
    python_requires='>=3.6',
    install_requires=['argparse', 'boto3', 'datetime', 'mmlspark', 'pandas', 'pyspark', 'pypandoc', 'scikit-learn',
                      'numpy', 'googleads', 'mlflow'],
)
I've tried renaming the parent directory and the same thing happens. I'm running this via PyCharm (updated to the latest version) on Mac OS Mojave.
Would appreciate any ideas on how to fix this.
Update:
I used a different method to generate the egg which unblocked me, but the issue still remains with the initial method.
Steps to reproduce
Create a new project in Pycharm
Add a setup.py file to root, see above.
Tools -> Run setup.py task -> bdist_egg
Generates an egg. Rename the extension to file_name.zip, unzip the file, check the contents of the folder.
I found that the first letter of the folder names was always missing (I changed the names of the folder and it consistently removed the first letter).
Workaround
Instead of building an egg via PyCharm, I used the command python setup.py bdist_egg in the terminal, which created an egg without any issues.
I think this confirms it is a PyCharm bug(?). A colleague managed to intermittently reproduce this bug using PyCharm.
Give the wheel a try.
pip install wheel setuptools pip -U
pip wheel --no-deps --wheel-dir=build .
Similar to https://stackoverflow.com/questions/12518499/pip-ignores-dependency-links-in-setup-py
I'm modifying faker in anticipation of an open PR I have with validators, and I want to be able to test the new dependency I will have.
setup(
    name='Faker',
    ...
    install_requires=[
        "python-dateutil>=2.4",
        "six>=1.10",
        "text-unidecode==1.2",
    ],
    tests_require=[
        "validators@https://github.com/kingbuzzman/validators/archive/0.13.0.tar.gz#egg=validators-0.13.0",  # TODO: this will change  # noqa
        "ukpostcodeparser>=1.1.1",
        ...
    ],
    ...
)
python setup.py test refuses to install the 0.13.0 version.
If I move the troublesome line up to install_requires=[..] (where it SHOULD not be):
setup(
    name='Faker',
    ...
    install_requires=[
        "python-dateutil>=2.4",
        "six>=1.10",
        "text-unidecode==1.2",
        "validators@https://github.com/kingbuzzman/validators/archive/0.13.0.tar.gz#egg=validators-0.13.0",  # TODO: this will change  # noqa
    ],
    tests_require=[
        "ukpostcodeparser>=1.1.1",
        ...
    ],
    ...
)
Using pip install -e ., everything works great -- the correct version gets installed.
Using python setup.py develop -- same issue.
My guess is setuptools/distutils is doing something weird; pip seems to address the issue. My question: how do I fix this?
Problematic code and references can be found here:
https://github.com/kingbuzzman/faker/commit/20f69082714fae2a60d356f4c63a061ce99a975e#diff-2eeaed663bd0d25b7e608891384b7298R72
https://github.com/kingbuzzman/faker
https://gist.github.com/kingbuzzman/e3f39ba217e2c14a9065fb14a502b63d
https://github.com/pypa/setuptools/issues/1758
Easiest way to see the issue at hand:
docker run -it --rm python:3.7 bash -c "git clone https://github.com/kingbuzzman/faker.git; cd faker; pip install -e .; python setup.py test"
UPDATE: Since this has been fixed, the issue won't be reproducible anymore -- all tests will pass.
Unfortunately, neither setup_requires nor tests_require supports URL-based lookup or environment markers from PEP 508 yet. You need to use dependency_links, for example:
setup(
    ...
    tests_require=["validators>=0.13.0"],
    dependency_links=['git+https://github.com/kingbuzzman/validators@master#egg=validators-0.13.0'],
)
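On newer setuptools and pip, a PEP 508 direct reference can replace dependency_links entirely. A sketch, assuming your setuptools version accepts the syntax (note that PyPI rejects uploads of packages declaring direct-URL dependencies, so this only suits private or development installs):

```python
setup(
    ...
    install_requires=[
        "validators @ https://github.com/kingbuzzman/validators/archive/0.13.0.tar.gz",
    ],
)
```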
My question is regarding multiple custom plugins in pytest.
I have two (or more) pytest plugins that I created which are installed using setuptools and pytest11 entry point, each plugin has its own setup.py. It seems like only the first installed plugin is active. I have verified this via print statements in the pytest_configure hook. If the first installed plugin is uninstalled, then only the second configure hook for the second plugin seems to get called. Also, the same behavior is observed with the addoption hook, options for the plugin installed second is unrecognized.
I'm thoroughly confused because I've used third-party plugins and they seem to work just fine. Aren't hooks for all the installed plugins supposed to be called?
Could this be a problem with the way the plugins are installed, i.e. with setuptools? (The command I use is python setup.py -v install.) Pip correctly shows all the plugin modules as installed.
Edit:
Names are different, below are the setup files:
from setuptools import setup

setup(
    name="pytest_suite",
    version="0.1",
    packages=['suite_util'],
    # the following makes a plugin available to pytest
    entry_points={
        'pytest11': [
            'name_of_plugin = suite_util.conftest',
        ]
    },
)
and
from setuptools import setup

setup(
    name="pytest_auto_framework",
    version="0.1",
    packages=['automation_framework'],
    # the following makes a plugin available to pytest
    entry_points={
        'pytest11': [
            'name_of_plugin = automation_framework.conftest',
        ]
    },
)
If your pytest entry points both have the same name (as they do in your example above), only the first one will be loaded by pytest.
Note that this is not an inherent limitation of pkg_resources entry points but due to the way plugins are registered in pytest. There can only be one plugin with a given name - which makes sense imho.
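The collision can be illustrated without installing anything: pytest keeps plugins in a name-keyed registry, so a second registration under an already-used name is ignored, much like dict keys. The registry below is only an analogy, not pytest's actual code:

```python
# Hypothetical illustration: entry points act like a name -> target
# mapping, so two plugins registered under one name leave only one.
registrations = [
    ("name_of_plugin", "suite_util.conftest"),
    ("name_of_plugin", "automation_framework.conftest"),
]
registry = {}
for name, target in registrations:
    registry.setdefault(name, target)  # first registration wins

print(registry)  # only the first plugin survives
```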
The official pytest documentation is ambiguous here. The reason your code doesn't work is that you followed that doc and gave both setup.py files the same plugin name:
In this code:
entry_points={
    'pytest11': [
        'name_of_plugin = automation_framework.conftest',
    ]
},
the name_of_plugin part is customizable and should be unique; otherwise pytest will load only one of the plugins sharing that name (my guess is the last one with that name).
So, the solution to your question is:
setup.py 1:
from setuptools import setup

setup(
    name="pytest_auto_framework",
    version="0.1",
    packages=['automation_framework'],
    # the following makes a plugin available to pytest
    entry_points={
        'pytest11': [
            'automation_framework = automation_framework.conftest',
        ]
    },
)
setup.py 2:
from setuptools import setup

setup(
    name="pytest_suite",
    version="0.1",
    packages=['suite_util'],
    # the following makes a plugin available to pytest
    entry_points={
        'pytest11': [
            'suite_util = suite_util.conftest',
        ]
    },
)
POC
Screenshot: two entry points with the same plugin name (the left-hand value)
Screenshot: two entry points with different plugin names
I have a little problem with setuptools/easy_install; maybe someone could give me a hint about what might be the cause:
To easily distribute one of my python webapps to servers I use setuptools' sdist command to build a tar.gz file which is copied to servers and locally installed using easy_install /path/to/file.tar.gz.
So far this seems to work great. I have listed everything in the MANIFEST.in file like this:
global-include */*.py */*.mo */*.po */*.pot */*.css */*.js */*.png */*.jpg */*.ico */*.woff */*.gif */*.mako */*.cfg
And the resulting tar.gz file does indeed contain all of the files I need.
It gets weird as soon as easy_install tries to actually install it on the remote system. For some reason a directory called locales and a configuration file called migrate.cfg won't get installed. This is odd and I can't find any documentation about it, but I guess it's some automatic ignore feature of easy_install?
Is there something like that? And if so, how do I get easy_install to install the locales and migrate.cfg files?
Thanks!
For reference here is the content of my setup.py:
from setuptools import setup, find_packages

requires = ['flup', 'pyramid', 'WebError', 'wtforms', 'webhelpers', 'pil', 'apns',
            'pyramid_beaker', 'sqlalchemy', 'poster', 'boto', 'pypdf', 'sqlalchemy_migrate',
            'Babel']

version_number = execfile('pubserverng/version.py')

setup(
    author='Bastian',
    author_email='test@domain.com',
    url='http://domain.de/',
    name="mywebapp",
    install_requires=requires,
    version=__version__,
    packages=find_packages(),
    zip_safe=False,
    entry_points={
        'paste.app_factory': [
            'pubserverng=pubserverng:main',
        ]
    },
    namespace_packages=['pubserverng'],
    message_extractors={'pubserverng': [
        ('**.py', 'python', None),
        ('templates/**.html', 'mako', None),
        ('templates/**.mako', 'mako', None),
        ('static/**', 'ignore', None),
        ('migrations/**', 'ignore', None),
    ]},
)
I hate to answer my own question this quickly, but after some trial and error I found out what was behind the missing files. In fact there was more than one reason:
The SOURCES.txt file was older and included a full list of most files, which resulted in them being bundled correctly.
The MANIFEST.in file was correct, too, so all required files were actually in the .tar.gz archive as expected. The main problem was that a few files simply would not get installed on the target machine.
I had to add include_package_data = True, to my setup.py file. After doing that all files installed as expected.
I'll have to put some research into include_package_data to find out if this weird behavior is documented somewhere. setuptools is a real mess - especially the documentation.
The entire package distribution system in python leaves a lot to be desired. My issues were similar to yours and were eventually solved by using distutils (rather than setuptools), which honored the include_package_data = True setting as expected.
Using distutils allowed me to more or less keep the required file list in MANIFEST.in and avoid the package_data setting, where I would have had to duplicate the source list; the drawback is that find_packages is not available. Below is my setup.py:
from distutils.core import setup

package = __import__('simplemenu')

setup(
    name='django-simplemenu',
    version=package.get_version(),
    url='http://github.com/danielsokolowski/django-simplemenu',
    license='BSD',
    description=package.__doc__.strip(),
    author='Alex Vasi <eee@someuser.com>, Justin Steward <justin+github@justinsteward.com>, Daniel Sokolowski <unemelpmis-ognajd@danols.com>',
    author_email='unemelpmis-ognajd@danols.com',
    include_package_data=True,  # this will read MANIFEST.in during install phase
    packages=[
        'simplemenu',
        'simplemenu.migrations',
        'simplemenu.templatetags',
    ],
    # below is no longer needed as we are utilizing MANIFEST.in with
    # the include_package_data setting
    # package_data={'simplemenu': ['locale/en/LC_MESSAGES/*',
    #                              'locale/ru/LC_MESSAGES/*']
    #               },
    scripts=[],
    requires=[],
)
And here is a MANIFEST.in file:
include LICENSE
include README.rst
recursive-include simplemenu *.py
recursive-include simplemenu/locale *
prune simplemenu/migrations
You need to use the data_files functionality of setup - your files aren't code, so easy_install won't install them by default (it doesn't know where they go).
The upside of this is that these files are added to MANIFEST automatically - you don't need to do any magic to get them there yourself. (In general if a MANIFEST automatically generated by setup.py isn't sufficient, adding them yourself isn't going to magically get them installed.)
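A minimal sketch of the data_files approach described above; the paths here are hypothetical stand-ins for the locales directory and migrate.cfg from the question:

```python
from setuptools import setup

setup(
    # ... name, version, packages, etc. as before ...
    # data_files takes (target_dir, [source_files]) pairs;
    # these paths are illustrative, not from the original project
    data_files=[
        ('pubserverng/locales', ['pubserverng/locales/en/LC_MESSAGES/app.mo']),
        ('pubserverng', ['pubserverng/migrate.cfg']),
    ],
)
```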
How can I make a setup.py file for my own script? I need to make my script global
(add it to /usr/bin) so I can run it from the console by just typing: scriptName arguments.
OS: Linux.
EDIT:
Now my script is installable, but how can I make it global, so that I can run it from the console by just typing its name?
EDIT: This answer deals only with installing executable scripts into /usr/bin. I assume you have basic knowledge on how setup.py files work.
Create your script and place it in your project like this:
yourprojectdir/
setup.py
scripts/
myscript.sh
In your setup.py file do this:
from setuptools import setup
# (on older setups you may need distutils.core instead of setuptools)

setup(
    # basic stuff here
    scripts=[
        'scripts/myscript.sh',
    ],
)
Then type
python setup.py install
Basically that's it. There's a chance that your script will land not exactly in /usr/bin, but in some other directory. If this is the case, type
python setup.py install --help
and search for --install-scripts parameter and friends.
I know that this question is quite old, but just in case, I post how I solved the problem myself: I wanted to set up a package for PyPI that, when installed with pip, would install the command as a system-wide executable, not just for Python.
setup(
    # rest of setup
    entry_points={
        'console_scripts': [
            '<app> = <package>.<app>:main'
        ]
    },
)
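For completeness, a minimal sketch of the module such an entry point could target; since <app> and <package> are placeholders above, all names here are hypothetical:

```python
# Hypothetical <package>/<app>.py: the console_scripts entry point
# calls main() and uses its return value as the process exit code.
import sys

def main(argv=None):
    # accept argv as a parameter so the function is easy to test;
    # fall back to the real command-line arguments when run as a script
    args = sys.argv[1:] if argv is None else argv
    print(" ".join(args) if args else "no arguments given")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```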