multiple custom plugins in py.test - python

My question is regarding multiple custom plugins in pytest.
I have two (or more) pytest plugins that I created, each installed using setuptools and the pytest11 entry point, and each with its own setup.py. It seems like only the first installed plugin is active. I have verified this via print statements in the pytest_configure hook. If the first installed plugin is uninstalled, only then does the configure hook for the second plugin get called. The same behavior is observed with the addoption hook: options for the plugin installed second are unrecognized.
I'm thoroughly confused, because I've used third-party plugins and they seem to work just fine. Aren't hooks for all installed plugins supposed to be called?
Could this be a problem with the way the plugins are installed, i.e. with setuptools? (The command I use is python setup.py -v install.) Pip correctly shows all the plugin modules as installed.
Edit:
Names are different, below are the setup files:
from setuptools import setup

setup(
    name="pytest_suite",
    version="0.1",
    packages=['suite_util'],
    # the following makes a plugin available to pytest
    entry_points={
        'pytest11': [
            'name_of_plugin = suite_util.conftest',
        ]
    },
)
and
from setuptools import setup

setup(
    name="pytest_auto_framework",
    version="0.1",
    packages=['automation_framework'],
    # the following makes a plugin available to pytest
    entry_points={
        'pytest11': [
            'name_of_plugin = automation_framework.conftest',
        ]
    },
)

If your pytest entry points both have the same name (as they do in your example above), only the first one will be loaded by pytest.
Note that this is not an inherent limitation of pkg_resources entry points but due to the way plugins are registered in pytest. There can only be one plugin with a given name, which makes sense imho.

The official pytest documentation is ambiguous here. The reason your code doesn't work is that, following that doc, you wrote both setup.py files with the same plugin name:
In this code:
entry_points={
    'pytest11': [
        'name_of_plugin = automation_framework.conftest',
    ]
the name_of_plugin is customizable and should be unique; otherwise pytest will load only one of the plugins that share that name (I guess the last one with the same name).
So, the solution to your question is:
setup.py 1:
from setuptools import setup

setup(
    name="pytest_auto_framework",
    version="0.1",
    packages=['automation_framework'],
    # the following makes a plugin available to pytest
    entry_points={
        'pytest11': [
            'automation_framework = automation_framework.conftest',
        ]
    },
)
setup.py 2:
from setuptools import setup

setup(
    name="pytest_suite",
    version="0.1",
    packages=['suite_util'],
    # the following makes a plugin available to pytest
    entry_points={
        'pytest11': [
            'suite_util = suite_util.conftest',
        ]
    },
)
POC
Two entry points with the same plugin name (left-hand value)
Two entry points with different plugin names
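As a quick sanity check (not from the original answers, and assuming Python 3.10+ for the entry_points(group=...) keyword), you can list the registered pytest11 entry points and confirm that each installed plugin uses a distinct left-hand name:

# Sketch: enumerate installed pytest11 entry points and their names.
from importlib.metadata import entry_points

for ep in entry_points(group="pytest11"):
    # ep.name is the left-hand value from setup.py and must be unique;
    # ep.value is the module pytest will import as the plugin.
    print(f"{ep.name} -> {ep.value}")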

Related

mypy and distributing a namespace package

I have a set of namespace packages that are meant to run in a python3.6 environment.
They are each set up as follows:
if sys.version_info < (3, 6):
    print("Python versions < 3.6 unsupported", file=sys.stderr)
    sys.exit(1)

setup(
    name="mynamespace.subpackage",
    version=VERSION,
    packages=[
        "mynamespace.subpackage",
    ],
    package_dir={"": "src"},
    package_data={
        "": [],
    },
    include_package_data=True,
    zip_safe=False,
    install_requires=[
        "mynamespace.core",  # May have explicit dependencies that are not cyclic
    ],
    namespace_packages=["mynamespace"],
    ...
)
All of the subpackages install just fine side-by-side.
The problem comes when I want robust type-checking via mypy. mypy is unable to find the mynamespace.core subpackage when run on the source files for mynamespace.subpackage (for example), which means I don't get robust type checking across my sub-package boundaries.
This appears to be a known problem: https://github.com/python/mypy/issues/1645
Guido mentions that the workaround is to "add dummy __init__.py or __init__.pyi files", but he doesn't really elaborate, and it turns out this isn't as obvious to me as I had hoped. Adding those files to the local repo allows mypy to run over the local repo as expected, but I can't figure out how to get access to the typing information in a sibling namespace package.
My question is: how would I modify mynamespace.core so that, when installed, mypy is able to pick up its type information in other modules?
Hopefully you've fixed it by now (or given up!!) but for any fellow dwellers, the answer is to run:
mypy --namespace-packages -p mynamespace.subpackage
Note that when you use the -p arg you cannot specify a directory as well.
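Separately, for mypy to see the type information of an installed sibling package at all, the distribution has to advertise it. Here is a minimal sketch along the lines of PEP 561 (not part of the answer above; the names mirror the question and are illustrative only):

from setuptools import setup

setup(
    name="mynamespace.core",
    version="1.0.0",
    packages=["mynamespace.core"],
    package_dir={"": "src"},
    # Ship an empty src/mynamespace/core/py.typed marker file so type
    # checkers treat the installed package as typed (PEP 561).
    package_data={"mynamespace.core": ["py.typed"]},
    zip_safe=False,
)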

How to write setup.py to include a Git repository as a dependency

I am trying to write setup.py for my package. My package needs to specify a dependency on another Git repository.
This is what I have so far:
from setuptools import setup, find_packages

setup(
    name='abc',
    packages=find_packages(),
    url='https://github.abc.com/abc/myabc',
    description='This is a description for abc',
    long_description=open('README.md').read(),
    install_requires=[
        "requests==2.7.0",
        "SomePrivateLib>=0.1.0",
    ],
    dependency_links=[
        "git+git://github.abc.com/abc/SomePrivateLib.git#egg=SomePrivateLib",
    ],
    include_package_data=True,
)
When I run:
pip install -e https://github.abc.com/abc/myabc.git#egg=analyse
I get
Could not find a version that satisfies the requirement
SomePrivateLib>=0.1.0 (from analyse) (from versions: ) No matching
distribution found for SomePrivateLib>=0.1.0 (from analyse)
What am I doing wrong?
After digging through pip issue 3939 (linked by @muon in the comments above) and then the PEP 508 specification, I found success getting my private repo dependency to install via setup.py by using this specification pattern in install_requires (no more dependency_links):
install_requires = [
    'some-pkg @ git+ssh://git@github.com/someorgname/pkg-repo-name@v1.1#egg=some-pkg',
]
The @v1.1 indicates the release tag created on GitHub and could be replaced with a branch, a commit hash, or a different type of tag.
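For illustration (not part of the original answer; package and repo names are placeholders), the same pattern with a branch or a commit pinned instead of a tag looks like this:

install_requires = [
    # pin to a tag
    'some-pkg @ git+ssh://git@github.com/someorgname/pkg-repo-name@v1.1',
    # or pin to a branch
    'other-pkg @ git+ssh://git@github.com/someorgname/other-repo@main',
    # a full commit SHA can be used in the same position
]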
This answer has been updated regularly as Python has evolved over the years. Scroll to the bottom for the most current answer, or read through to see how this has evolved.
Unfortunately the other answer does not work with private repositories, which is one of the most common use cases for this. I eventually did get it working with a setup.py file using this (now deprecated) method:
from setuptools import setup, find_packages

setup(
    name = 'MyProject',
    version = '0.1.0',
    url = '',
    description = '',
    packages = find_packages(),
    install_requires = [
        # Github Private Repository - needs entry in `dependency_links`
        'ExampleRepo'
    ],
    dependency_links=[
        # Make sure to include the `#egg` portion so the `install_requires` recognizes the package
        'git+ssh://git@github.com/example_org/ExampleRepo.git#egg=ExampleRepo-0.1'
    ]
)
Newer versions of pip make this even easier by removing the need for dependency_links:
from setuptools import setup, find_packages

setup(
    name = 'MyProject',
    version = '0.1.0',
    url = '',
    description = '',
    packages = find_packages(),
    install_requires = [
        # Github Private Repository
        'ExampleRepo @ git+ssh://git@github.com/example_org/ExampleRepo.git#egg=ExampleRepo-0.1'
    ]
)
However, with the very latest pip you'll run into issues with the egg fragment. While the egg fragment itself is ignored, pip now does direct URL matching and will treat two URLs, one with the egg fragment and one without, as completely different versions even if they point to the same package. As such, it's best to leave any egg fragment off.
June 2021 - setup.py
So, the best way (as of June 2021) to add a dependency from GitHub to your setup.py that will work with both public and private repositories is:
from setuptools import setup, find_packages

setup(
    name = 'MyProject',
    version = '0.1.0',
    url = '',
    description = '',
    packages = find_packages(),
    install_requires = [
        # Github Private Repository
        'ExampleRepo @ git+ssh://git@github.com/example_org/ExampleRepo.git'
    ]
)
February 2022 - setup.cfg
Apparently setup.py is being deprecated (although my guess is it'll be around for a while) and setup.cfg is the new thing.
[metadata]
name = MyProject
version = 0.1.1

[options]
packages = find:
install_requires =
    ExampleRepo @ git+ssh://git@github.com/example_org/ExampleRepo.git
June 16th 2022 - pyproject.toml
setup.cfg is already "pre" deprecated, as setuptools now has experimental support for pyproject.toml files.
This is the future, but since this support is still experimental it should not be used in real projects for now. Even though setup.cfg is on its way out, you should use it if you want a declarative format; otherwise setup.py is still the way to go. This answer will be updated once setuptools has stabilized its support of the new standard.
January 4th 2023 - pyproject.toml
It is now possible to define all of your dependencies in pyproject.toml. Other options such as setup.cfg still work.
[build-system]
requires = ["setuptools", "setuptools-scm"]
build-backend = "setuptools.build_meta"

[project]
dependencies = [
    'ExampleRepo @ git+ssh://git@github.com/example_org/ExampleRepo.git',
]

[project.optional-dependencies]
dev = ['ExtraExample @ git+ssh://git@github.com/example_org/ExtraExample.git']
Note: this answer is now outdated. Have a look at this answer from @Dick Fox for up-to-date instructions: https://stackoverflow.com/a/54794506/2272172
You can find the right way to do it here.
dependency_links=['http://github.com/user/repo/tarball/master#egg=package-1.0']
The key is not to give a link to a Git repository, but a link to a tarball. GitHub creates a tarball of the master branch for you if you append /tarball/master as shown above.
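With newer pip versions, the same tarball can also be referenced directly from install_requires as a PEP 508 direct URL, which avoids dependency_links entirely. A hedged sketch with placeholder names:

install_requires = [
    # direct archive URL; GitHub serves a tarball of the given branch
    'package @ https://github.com/user/repo/archive/master.tar.gz',
]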
A more general answer: To get the information from the requirements.txt file I do:
from setuptools import setup, find_packages
from os import path

loc = path.abspath(path.dirname(__file__))

with open(loc + '/requirements.txt') as f:
    requirements = f.read().splitlines()

required = []
dependency_links = []

# Do not add to required lines pointing to Git repositories
EGG_MARK = '#egg='
for line in requirements:
    if line.startswith('-e git:') or line.startswith('-e git+') or \
            line.startswith('git:') or line.startswith('git+'):
        if line.startswith('-e '):
            line = line[len('-e '):]  # drop the editable flag if present
        if EGG_MARK in line:
            package_name = line[line.find(EGG_MARK) + len(EGG_MARK):]
            repository = line[:line.find(EGG_MARK)]
            required.append('%s @ %s' % (package_name, repository))
            dependency_links.append(line)
        else:
            print('Dependency to a git repository should have the format:')
            print('git+ssh://git@github.com/xxxxx/xxxxxx#egg=package_name')
    else:
        required.append(line)

setup(
    name='myproject',  # Required
    version='0.0.1',  # Required
    description='Description here....',  # Required
    packages=find_packages(),  # Required
    install_requires=required,
    dependency_links=dependency_links,
)
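For illustration (hypothetical package names, not from the original answer), a requirements.txt that this script can handle might look like the following; plain lines go straight into install_requires, while the Git lines are split into a package name and a repository URL:

requests==2.7.0
git+ssh://git@github.com/example_org/SomePrivateLib.git#egg=SomePrivateLib
-e git+ssh://git@github.com/example_org/OtherPrivateLib.git#egg=OtherPrivateLib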
Actually, if you want to make your package installable recursively (YourCurrentPackage includes your SomePrivateLib), e.g. when you want to include YourCurrentPackage in another one (like OuterPackage → YourCurrentPackage → SomePrivateLib), you'll need both:
install_requires=[
    ...,
    "SomePrivateLib @ git+ssh://github.abc.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
],
dependency_links = [
    "git+ssh://github.abc.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
]
And make sure you have a tag created with your version number.
Also, if your Git project is private and you want to install it inside a container, e.g. a Docker or GitLab runner, you will need authorized access to your repository. Please consider using Git + HTTPS with access tokens (like on GitLab: https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html):
import os
from setuptools import setup

TOKEN_VALUE = os.getenv('EXPORTED_VAR_WITH_TOKEN')

setup(
    ....
    install_requires=[
        ...,
        f"SomePrivateLib @ git+https://gitlab-ci-token:{TOKEN_VALUE}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ],
    dependency_links=[
        f"git+https://gitlab-ci-token:{TOKEN_VALUE}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ]
)
Updated:
You have to put #egg=SomePrivateLib at the end of the dependency line if you want to have this dependency in a requirements.txt file. Otherwise pip install -r requirements.txt won't work and you will get something like:
ERROR: Could not detect requirement name for
'git+https://gitlab-ci-token:gitlabtokenvalue@gitlab.server.com/abc/SomePrivateLib.git@0.1.0',
please specify one with #egg=your_package_name
If you use requirements.txt, this part is responsible for the name of the dependency's folder that is created inside python_home_dir/src and for the name of the egg-link in site-packages/.
You can use an environment variable in your requirements.txt to keep your dependency's token value out of your repo.
Example row in a requirements.txt file for this case:
....
-e git+https://gitlab-ci-token:${EXPORTED_VAR_WITH_TOKEN}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib
....
I was successful with these three options in GitLab. I am using version 11 of GitLab.
Option 1 - no token specified. The shell will prompt for username/password.
from setuptools import setup

setup(
    install_requires=[
        "SomePrivateLib @ git+https://gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ]
)
Option 2 - user access token specified. The token is generated by going to GitLab → account (top right) → Settings → Access Tokens. Create the token with read_repository rights.
Example:
import os
from setuptools import setup

TOKEN_VALUE = os.getenv('EXPORTED_VAR_WITH_TOKEN')

setup(
    install_requires=[
        f"SomePrivateLib @ git+https://gitlab-ci-token:{TOKEN_VALUE}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ]
)
Option 3 - repository-level token specified. The token is generated by going to the repository → Settings → Repository → Deploy Tokens. From there, create a token with read_repository rights.
Example:
import os
from setuptools import setup

TOKEN_USER = os.getenv('EXPORTED_TOKEN_USER')
TOKEN_VALUE = os.getenv('EXPORTED_VAR_WITH_TOKEN')

setup(
    install_requires=[
        f"SomePrivateLib @ git+https://{TOKEN_USER}:{TOKEN_VALUE}@gitlab.server.com/abc/SomePrivateLib.git@0.1.0#egg=SomePrivateLib"
    ]
)
In all three cases, I was able to simply use "SomePrivateLib @ git+https://gitlab.server.com/abc/SomePrivateLib.git" without the #egg= marking at the end.
This solution works for me when I run python setup.py install:
setuptools.setup(
    ...,
    install_requires=[
        'numpy',
        'pandas',
        'my_private_pkg'
    ],
    dependency_links=["git+https://github.com/[username]/[my_private_pkg].git@main#egg=my_private_pkg"],
    ...
)

Esky not including sub-module

I have a medium-size PyQT5 desktop application that has been working fine with py2app. I want to incorporate Esky so that the app can update itself, but the app terminates during startup (before displaying the main window) with a log entry that says "HelloApp Error" (where "HelloApp" is the name of my application).
I've created a small test case that reproduces the problem that is available at https://github.com/markmont/esky-package-question
The test-case app has the following structure:
HelloApp/
    HelloApp/
        HelloApp.py
        helloform/
            __init__.py
    setup.py
setup.py contains:
from esky import bdist_esky
from distutils.core import setup

PY2APP_OPTIONS = {
    'argv_emulation': True,
    'includes': ['sip', 'PyQt5', 'helloform'],
    'qt_plugins': ['*']
}

ESKY_OPTIONS = {
    "freezer_module": "py2app",
    "freezer_options": PY2APP_OPTIONS,
    "includes": ['sip', 'PyQt5', 'helloform']
}

HelloApp = bdist_esky.Executable("HelloApp/HelloApp.py", gui_only=True)

setup(
    name='HelloApp',
    version="2014060301",
    data_files=[],
    options={"bdist_esky": ESKY_OPTIONS},
    scripts=[HelloApp]
)
HelloApp.py contains the statement from helloform import Form -- this appears to be what causes the app to fail to start with the "HelloApp Error", since if I remove that statement and paste in the contents of helloform/__init__.py, the application starts up and works properly.
Also, if I move everything into a single directory and adjust the paths in setup.py, the problem does not occur -- Esky finds helloform.py (formerly helloform/__init__.py), includes it, and the application starts up and works properly:
HelloApp/
    HelloApp.py
    helloform.py    # formerly ./HelloApp/helloform/__init__.py
    setup.py
...but putting everything in a single directory is not a scalable solution for a medium-to-large application.
There are no error messages in the output of python setup.py bdist_esky when the problem occurs, and I have not found the answer in the Esky documentation or in various examples on the web.
The full error from /var/log/system.log is:
2014-06-03 13:03:07.100 HelloApp[14968]: HelloApp Error
I'm assuming that I'm not using Esky's includes option properly in setup.py, but I've got no clue as to how to fix this -- can anyone help?
Other possibly relevant details: MacOS X 10.9 Mavericks, Python 2.7.6 (local build), qt-5.3.0 opensource, sip 4.16, PyQT 5.3.0 (GPL), py2app 0.8.1 patched to support PyQT5, and the latest version of Esky from GitHub.
Thanks in advance!
I've solved this problem -- it was due to my limited knowledge of Python distutils and setuptools. Since things "just worked" with py2app (which was using setuptools), I assumed the problem was with how Esky was configured, when it was really with how I was using distutils.
The problem was that helloform.py was not being copied into the frozen app.
The solution involved restructuring the files and changing the distutils configuration to explicitly add HelloApp as a package.
New file structure:
HelloApp/
    hello.py        # formerly HelloApp.py
    HelloApp/
        __init__.py
        helloform.py
    setup.py
New setup.py:
from esky import bdist_esky
from distutils.core import setup

PY2APP_OPTIONS = {
    'argv_emulation': True,
    'includes': ['sip', 'PyQt5'],
    'qt_plugins': ['*']
}

ESKY_OPTIONS = {
    "freezer_module": "py2app",
    "freezer_options": PY2APP_OPTIONS,
    "includes": ['sip', 'PyQt5']
}

HelloApp = bdist_esky.Executable("hello.py", gui_only=True)

setup(
    name='hello',
    version="2014060301",
    data_files=[],
    options={"bdist_esky": ESKY_OPTIONS},
    scripts=[HelloApp],
    packages=['HelloApp'],
)

setuptools/easy_install does not install *.cfg files and locale directories?

I have a little problem with setuptools/easy_install; maybe someone could give me a hint what might be the cause of the problem:
To easily distribute one of my python webapps to servers I use setuptools' sdist command to build a tar.gz file which is copied to servers and locally installed using easy_install /path/to/file.tar.gz.
So far this seems to work great. I have listed everything in the MANIFEST.in file like this:
global-include */*.py */*.mo */*.po */*.pot */*.css */*.js */*.png */*.jpg */*.ico */*.woff */*.gif */*.mako */*.cfg
And the resulting tar.gz file does indeed contain all of the files I need.
It gets weird as soon as easy_install tries to actually install it on the remote system. For some reason a directory called locales and a configuration file called migrate.cfg won't get installed. This is odd and I can't find any documentation about it, but I guess it's some automatic ignore feature of easy_install?
Is there something like that? And if so, how do I get easy_install to install the locales and migrate.cfg files?
Thanks!
For reference here is the content of my setup.py:
from setuptools import setup, find_packages

requires = ['flup', 'pyramid', 'WebError', 'wtforms', 'webhelpers', 'pil', 'apns',
            'pyramid_beaker', 'sqlalchemy', 'poster', 'boto', 'pypdf', 'sqlalchemy_migrate',
            'Babel']

version_number = execfile('pubserverng/version.py')

setup(
    author='Bastian',
    author_email='test@domain.com',
    url='http://domain.de/',
    name="mywebapp",
    install_requires=requires,
    version=__version__,
    packages=find_packages(),
    zip_safe=False,
    entry_points={
        'paste.app_factory': [
            'pubserverng=pubserverng:main'
        ]
    },
    namespace_packages=['pubserverng'],
    message_extractors={'pubserverng': [
        ('**.py', 'python', None),
        ('templates/**.html', 'mako', None),
        ('templates/**.mako', 'mako', None),
        ('static/**', 'ignore', None),
        ('migrations/**', 'ignore', None),
    ]},
)
I hate to answer my own question this quickly, but after some trial and error I found out what the reason behind the missing files was. In fact there was more than one reason:
The SOURCES.txt file was older and included a full list of most files, which resulted in them being bundled correctly.
The MANIFEST.in file was correct, too, so all required files were actually in the .tar.gz archive as expected. The main problem was that a few files simply would not get installed on the target machine.
I had to add include_package_data = True, to my setup.py file. After doing that all files installed as expected.
I'll have to put some research into include_package_data to find out if this weird behavior is documented somewhere. setuptools is a real mess - especially the documentation.
The entire package distribution system in Python leaves a lot to be desired. My issues were similar to yours and were eventually solved by using distutils (rather than setuptools), which honored the include_package_data = True setting as expected.
Using distutils allowed me to more or less keep the required file list in MANIFEST.in and avoid the package_data setting, where I would have had to duplicate the source list; the drawback is that find_packages is not available. Below is my setup.py:
from distutils.core import setup

package = __import__('simplemenu')

setup(name='django-simplemenu',
      version=package.get_version(),
      url='http://github.com/danielsokolowski/django-simplemenu',
      license='BSD',
      description=package.__doc__.strip(),
      author='Alex Vasi <eee@someuser.com>, Justin Steward <justin+github@justinsteward.com>, Daniel Sokolowski <unemelpmis-ognajd@danols.com>',
      author_email='unemelpmis-ognajd@danols.com',
      include_package_data=True,  # this will read MANIFEST.in during install phase
      packages=[
          'simplemenu',
          'simplemenu.migrations',
          'simplemenu.templatetags',
      ],
      # below is no longer needed as we are utilizing MANIFEST.in with include_package_data setting
      # package_data={'simplemenu': ['locale/en/LC_MESSAGES/*',
      #                              'locale/ru/LC_MESSAGES/*']
      #               },
      scripts=[],
      requires=[],
      )
And here is a MANIFEST.in file:
include LICENSE
include README.rst
recursive-include simplemenu *.py
recursive-include simplemenu/locale *
prune simplemenu/migrations
You need to use the data_files functionality of setup - your files aren't code, so easy_install won't install them by default (it doesn't know where they go).
The upside of this is that these files are added to MANIFEST automatically - you don't need to do any magic to get them there yourself. (In general if a MANIFEST automatically generated by setup.py isn't sufficient, adding them yourself isn't going to magically get them installed.)
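The answer above suggests data_files; for files that live inside the package (like the locales directory and migrate.cfg from the question), package_data is the closer fit. A minimal hedged sketch of declaring them explicitly (the glob patterns are assumptions, not the asker's actual layout):

from setuptools import setup, find_packages

setup(
    name="mywebapp",
    packages=find_packages(),
    # non-code files that live inside the pubserverng package
    package_data={
        "pubserverng": ["migrate.cfg", "locales/*/LC_MESSAGES/*"],
    },
)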

py2app setup.py usage question

Ok, so I'm trying to use py2app to generate a distribution for my project. I'm still not sure I get the hang of it, though. My setup.py looks like this:
"""
This is a setup.py script generated by py2applet
Usage:
python setup.py py2app
"""
from setuptools import setup
import setuptools
PACKAGES = ['sqlalchemy.dialects.sqlite']
MODULES = ['sqlite3']
APP = ['tvb/interfaces/web/run.py']
OPTIONS = {'argv_emulation': True,
'packages': PACKAGES ,
'includes' : MODULES }
DATA_FILES = []
setup(
app=APP,
data_files=DATA_FILES,
packages = setuptools.find_packages(),
include_package_data=True,
options={'py2app': OPTIONS},
setup_requires=['py2app', "pyopengl", "cherrypy", "sqlalchemy", "simplejson",
"formencode", "genshi", "quantities","numpy", "scipy",
"numexpr", "nibabel", "cfflib", "mdp", "apscheduler",
"scikits.learn"]
)
So my first question would be: what should I include in MODULES for py2app here? Does py2app know to scan the things in setup_requires and include them, or do I need to add entries for them in MODULES?
Another problem is that I'm getting sqlalchemy.exc.ArgumentError: Could not determine dialect for 'sqlite' when trying to run my app. After lots of googling I only saw that for py2exe you need to include sqlalchemy.dialects.sqlite as a package, but that doesn't seem to work for me. Am I missing something here?
The last one is that I'm getting a malformed object (load command 3 cmdsize not a multiple of 8) message while running python setup.py py2app. Is this normal?
Regards,
Bogdan
Well, it seems I got the whole thing wrong. I needed:
'includes': ['sqlalchemy.dialects.sqlite']
instead of packages, and that seems to have done the trick.
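Based on that fix, the corrected options dict would look roughly like this (a sketch only; the rest of the setup() call stays as in the question):

OPTIONS = {'argv_emulation': True,
           # the sqlite dialect moved from 'packages' to 'includes'
           'includes': ['sqlite3', 'sqlalchemy.dialects.sqlite']}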
