Does pytest import modules twice?

I have a helper class in a test module with a class-level attribute in which I cache already-created instances of the class (SQL dumps of database configurations composed in fixtures, so that I don't have to derive the data again for multiple tests).
It starts:
class SqlDump:
    FIXUP = re.compile(r"^(\s*CREATE SEQUENCE[^;]*)AS INTEGER([^;]*);",
                       flags=re.MULTILINE | re.IGNORECASE | re.DOTALL)
    PATH = os.path.join(os.path.dirname(__file__), 'test_data/sql_dumps/{script}.sql')
    all = {}

    def __init__(self, script):
        self.__class__.all[script] = self
        self.script = script
        self.content = self.load()
If I place a breakpoint on the line that initializes this member all and use the class outside pytest, it is initialized only once.
But when I run pytest, the line that initializes the member is executed twice. This results in some values being lost.
Is there ever any reason a class-level member should be initialized twice? Why is pytest doing this?

This is a year old now, but in case it helps someone else:
There was a very similar issue in my case: a module was getting re-imported over and over by pytest. The module in question uses SQLAlchemy, which is highly sensitive to reinitialization, resulting in an opaque error of Multiple classes found for path in the registry of this declarative base. This did not occur during normal runs of the platform - only when pytest was run.
Here's how the project was structured:
ProjectName
│   __init__.py
│   script_outside_modules.py
│
└───actual_module
    │   __init__.py
    │   pytest.ini
    │   some_code.py
    │
    ├───some_subfolder
    │   │   __init__.py
    │   │   a_file_to_test.py
    │
    └───tests
        │   __init__.py
        │   test_suite.py
All imports were absolute from the actual_module root, e.g. actual_module.some_code.
If you want to triage exactly how sys sees your module imports, and whether the same file was imported under two different module names, try putting the following code at module level in a file you believe could be getting double-imported, outside of any encapsulation (e.g. above class SqlDump in your example):
import sys
import traceback

print(f"{__file__} imported by module {__name__}, in sys.modules as {sys.modules[__name__]}")

try:
    if hasattr(sys, 'double_import'):
        raise AssertionError(f"{__file__} was double-imported!")
except Exception as e:
    traceback.print_exc()
    raise e

sys.double_import = 1
Reading what it's registered as in sys.modules should help you identify where the disconnect is happening and whether you have odd module import scenarios playing out.
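As an additional check (a minimal sketch, not part of the original answer), you can also scan sys.modules for entries that point at the same source file; more than one name for one file is exactly the double-import symptom:

import sys

# List every name under which this source file is currently registered.
# Two or more entries mean the file was imported under different module paths.
names = [name for name, mod in sys.modules.items()
         if getattr(mod, "__file__", None) == __file__]
print(f"{__file__} is registered in sys.modules as: {names}")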
After hours of investigating possible causes and logging, I found that the extra import was due to the __init__.py at the root of the project, inside ProjectName in this case. The code above helped to illustrate this: sys.modules showed a module for actual_module.some_code during the preparation phases, but then showed another module at ProjectName.actual_module.some_code within the tests themselves.
This seems to be because pytest identifies a package root separately from whatever is defined in your imports and prepends it when running tests (though that's just a working hypothesis). Deleting ProjectName/__init__.py resolved my issue.
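If deleting the stray __init__.py is not an option, another setting worth trying (a sketch, not part of the original fix; it requires pytest 6.0 or newer) is pytest's importlib import mode, which avoids inserting rootdir-derived package names into sys.path during collection:

# pytest.ini (hypothetical)
[pytest]
addopts = --import-mode=importlib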

Related

Utterly lost importing functions for pytest

I know there are several pytest and importing questions and answers here but I'm not able to find a solution :(
PS: After rereading the import tutorials I have changed my project folder structure a bit and edited the examples below.
My simple directory structure for a demo python project is as follows:
.
├── src
│   └── addsub
│       ├── __init__.py
│       └── addsub.py
└── tests
    ├── test_add.py
    └── test_subtract.py
src/addsub/addsub.py has the following content:
""" a demo module """
def add(a, b):
""" sum two numbers or concat two strings """
return a + b
def subtract(a: int, b:int) -> int:
""" subtract b from a --- bugged for the demo """
return a + b
def main():
print(f"Adding 3 to 4 makes: {add(3,4)}")
print(f"Subtracting 4 from 3 makes: {subtract(3,4)}")
if __name__ == "__main__":
main()
while my test files are under the tests directory with test_add.py having the following content:
from src.addsub import add

def test_add():
    assert add(2, 3) == 5
    assert add('space', 'ship') == 'spaceship'
Now when I run pytest from the projects root directory I get the following errors:
====================================================== test session starts ======================================================
platform darwin -- Python 3.8.15, pytest-7.2.1, pluggy-1.0.0
rootdir: /Users/bob/Documents/work/code/ci-example
plugins: anyio-3.6.2
collected 0 items / 1 error
============================================================ ERRORS =============================================================
______________________________________________ ERROR collecting tests/test_add.py _______________________________________________
ImportError while importing test module '/Users/bob/Documents/work/code/ci-example/tests/test_add.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
/Users/bob/opt/miniconda3/lib/python3.8/importlib/__init__.py:127: in import_module
return _bootstrap._gcd_import(name[level:], package, level)
tests/test_add.py:1: in <module>
from src.addsub import add
E ModuleNotFoundError: No module named 'src'
==================================================== short test summary info ====================================================
ERROR tests/test_add.py
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
======================================================= 1 error in 0.06s ========================================================
Thanks in advance for any clarification.
src should not be an importable package (it should not contain an __init__.py) but rather a path where Python can find importable packages.
The tooling you are using (pipenv, poetry, etc.) can affect this or present an opinionated default behaviour, but the issue boils down to making sure path/to/your/project/src is in sys.path so that you can just write from addsub import add, subtract.
A simple, vanilla way to fix this is
export PYTHONPATH=./src
using an absolute or relative path depending on your needs.
But either pipenv or poetry should do something that saves the day with that folder structure.
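Alternatively (a sketch, not part of the original answer; the pythonpath ini option needs pytest 7.0 or newer), you can let pytest itself put src on sys.path for test runs:

# pytest.ini (hypothetical, at the project root)
[pytest]
pythonpath = src
testpaths = tests

With that in place the test would import from addsub.addsub import add (or from addsub import add if the package's __init__.py re-exports it).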
I've had similar problems in the past and thus I've created a new import library: ultraimport
It gives the programmer more control over their imports and allows you to do file system based relative and absolute imports.
In your test_add.py you could then write:
import ultraimport
add = ultraimport("__dir__/../src/addsub/addsub.py", "add")
This will always work, independent of your sys.path and independent of your current working directory. It does not care about PYTHONPATH and it also does not care about any __init__.py files.

Can't import functions from a different module in the same project

I have two scripts:
script1:

from test import script2

if __name__ == '__main__':
    foo()

script2:

def foo():
    print 'hello'
In the structure:

test/
├── A/
│   └── B/
│       └── script2.py
└── script1.py
I am trying to call the function foo() from script2.py in script1.py.
I received the error:
This inspection detects names that should resolve but don't. Due to
dynamic dispatch and duck typing, this is possible in a limited but
useful number of cases. Top-level and class-level items are supported
better than instance items.
I read number of similar cases but they didn't help me:
Call a function from another file in Python
How to import a function from parent folder in python?
Import Script from a Parent Directory
For Python packages to work, you need __init__.py files in the folders. These can be empty.
Then, since you are importing from a subfolder, you need from A.B import script2
At that point, you may use script2.foo()
Also worth mentioning that Python 3 should be targeted for any new projects.
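Put together, a minimal sketch of the fix (an illustration of the above, assuming script1.py is run from the test directory and the empty __init__.py files have been added as described; foo would also need Python 3 print syntax):

test/
├── script1.py
└── A/
    ├── __init__.py
    └── B/
        ├── __init__.py
        └── script2.py

# script1.py
from A.B import script2

if __name__ == '__main__':
    script2.foo()  # prints 'hello'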

How to patch (mocking) tests with Errbot?

I'm trying to patch dependencies in my errbot tests. The problem I'm having is how errbot imports modules. It is not static, and it breaks my patch decorators as I add tests or they run in a different order.
I have a plugin called EDB (edb.py). Inside edb.py I import pyedb with import pyedb. This is located in my site-packages.
I have my test file test_edb.py and I try to patch my test methods like this
pytest_plugins = ["errbot.backends.test"]
extra_plugin_dir = '.'

from unittest.mock import patch  # noqa: E402

@patch('yapsy_loaded_plugin_EDB_1.pyedb', autospec=True)
def test_edb_testlist(pyedb_mock, testbot):
    testbot.push_message('!edb testlist')
    assert "Okay, let me get..." == testbot.pop_message()
    assert "I don't see any..." == testbot.pop_message()
Errbot adds this yapsy_loaded_plugin_EDB_<xx> path for the module import, but the <xx> depends on the order in which the tests are run. This doesn't work; I need some static import path like mypath.pyedb.
I'm hoping there is a different way to approach this. Maybe I can change how I import the module so it's not dependent on errbot's imports?
Here is a link to Errbot testing for reference.
My solution feels a bit hacky but it works. If anyone has a more elegant solution please share; I'll accept my own answer after a while if there are no additional responses.
So I've come across this before, but I guess I still wasn't familiar enough with how patching works in Python, specifically where to patch. After reading the "Where to patch" documentation (again :) ) I have a workaround for errbot's dynamic imports.
An errbot project folder will look something like this:
errbot-project/
├── data/
│   ├── ...
├── plugins/
│   ├── plugin1/
│   │   ├── ...
│   ├── plugin2/
│   │   ├── ...
I noticed that when errbot runs, both the project directory ../errbot-project and all the plugin directories (e.g. ../errbot-project/plugins/plugin1) are added to sys.path.
So I added a package to my project directory and I import that in my plugins. I can then patch my dependencies reliably via that package. Again, read the "Where to patch" documentation for the full explanation of why. It looks something like this.
errbot-project/
├── data/
│   ├── ...
├── plugins/
│   ├── plugin1/
│   │   ├── ...
│   ├── plugin2/
│   │   ├── ...
├── plugin_deps/
│   ├── __init__.py
Where my ../errbot-project/plugin_deps/__init__.py looks like
...
import dep1
import dep2
...
And then in my plugin I use
...
import plugin_deps as pdep
...

def method():
    pdep.dep1.method()
    ...

# note, you cannot use
#     from plugin_deps import dep1
# this changes 'where' python looks up the module
# and 'breaks' your patch
And finally my test code looks like
@patch('plugin_deps.dep1', autospec=True)
def test_get_tl_tabulation(my_mock, testbot):
    # test code here

In Python: imports shenanigans (import subfolder.module in folder, use in subfolder/module2.py)

Ok, this is not very clear at all. Allow me to rephrase.
Quick note: "Solved" while writing this question. Will accept answer with inputs regarding best practices.
Original quick note: This is probably a duplicate. I apparently couldn't phrase the question well enough to fit this situation. My apologies if it is.
First of all, to get you situated, here is my project's hierarchy as well as some relevant code:
Project/
├── main.py
└── lib
    ├── __init__.py
    ├── A.py
    ├── B.py
    └── C.py
main.py:
## Imports
from lib.A import ClassA as clA
from lib.B import ClassB as clB
from lib.C import ClassC as clC
#
# ... Some other code ...
#
a = clA()
b = clB()
a.add_ClB_Inst_Ref(b)
A.py:
## Imports
if __name__ == "__main__":
    from B import ClassB as clB
    from C import ClassC as clC

class ClassA:
    dict_Of_Weighted_ClB_Refs = {}
    #
    # ... Some other code (attributes, init, ...) ...
    #
    def add_ClB_Inst_Ref(self, ClB_Instance):
        if isinstance(ClB_Instance, clB):
            key = clB.ID
            if key in self.dict_Of_Weighted_ClB_Refs:
                self.dict_Of_Weighted_ClB_Refs[key] += 1
            else:
                self.dict_Of_Weighted_ClB_Refs[key] = 1
Issue:
The program crashes in the add_ClB_Inst_Ref method when checking that the method's parameter is an instance of the ClassB object with the following error:
NameError: name 'clB' is not defined
"Well", you might say, "of course it crashes, the A.py files has never heard of that fancy clB!".
And, yes this may be the problem, but when I try to import it inside this file, replacing the imports section with this:
## Imports
from B import ClassB as clB
from C import ClassC as clC
I get the following error: ImportError: No module named 'B'
Question:
How can I fix this? Your input on best practices would be most welcomed.
Suspicions:
All this leads me to these hypotheses:
Modules imported in a file at the root of the project aren't globally available for the duration of the program's execution.
import searches for a match from where the initial script has been executed. (Note: This led me to double-check that I had tried it this way, from lib.B import ClassB as clB, just like in the main file; turns out I hadn't, and this works... I get the logic, but it seems rather counterintuitive. See the answer below for details.)
My project's architecture is flawed.
As noted in the question, here is what is needed to fix the issue of the unavailable module B in module A.
Simply replace the imports section in A.py with the following:
## Imports
if __name__ == "__main__":
    from B import ClassB as clB
    from C import ClassC as clC
else:
    from lib.B import ClassB as clB
    from lib.C import ClassC as clC
This doesn't seem like the best possible solution to me, as it means that each time the module is imported from somewhere other than the lib folder, it assumes this "somewhere" is the project's root folder, which could lead to some serious conflicts.
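A more conventional alternative (not from the original author; a sketch that assumes lib always stays a package and A.py is only ever executed via main.py) is to use explicit relative imports inside the package:

## Imports in lib/A.py
from .B import ClassB as clB
from .C import ClassC as clC

Relative imports like these work whenever A is loaded as lib.A (e.g. from main.py), but they do not allow running A.py directly as a script; any quick test under if __name__ == "__main__" would have to move to a small script outside the package.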
Please write your own answer if you are able to provide a more general approach and/or best practices on this issue.

Appending different modules to a list with imp.load_module() results in the same module

I write in Python 3.
I want to add plugin support to my program. I don't want to use heavy frameworks, so I decided to write a minimal one myself.
By the way, the plugins need to keep running over time; I can't simply unload and reload a plugin on every run, because the plugins would lose all their data.
My folder structs is here:
interfaces/
├── dummy
├── gmail
│   ├── __init__.py
│   └── __pycache__
│       └── __init__.cpython-33.pyc
└── hello
    ├── __init__.py
    └── __pycache__
        └── __init__.cpython-33.pyc
Then, I wrote a piece of code to load and execute the plugins:
#!/usr/bin/python3
import os
import imp

INTERFACES_FOLDER = './interfaces'
MAIN_MODULE = '__init__'

def search_plugins():
    plugins = []
    plugins_folders = os.listdir(INTERFACES_FOLDER)
    for i in plugins_folders:
        plugin_folder = os.path.join(INTERFACES_FOLDER, i)
        if not os.path.isdir(plugin_folder):
            continue
        if not MAIN_MODULE + '.py' in os.listdir(plugin_folder):
            continue
        info = imp.find_module(MAIN_MODULE, [plugin_folder])
        plugins.append({'name': i, 'info': info})
    return plugins

def load_plugin(plugin):
    return imp.load_module(MAIN_MODULE, *plugin["info"])
plugins_list = search_plugins()
plugins = []

for i in plugins_list:
    module = load_plugin(i)
    print(module)
    plugins.append(module)

print(plugins)
Outputs:
# it works!
<module '__init__' from './interfaces/gmail/__init__.py'>
<module '__init__' from './interfaces/hello/__init__.py'>
# what's wrong?
[<module '__init__' from './interfaces/hello/__init__.py'>,
<module '__init__' from './interfaces/hello/__init__.py'>]
As you can see, when I load a plugin, everything works correctly. But when I append them to a list, the different modules become the same one.
What's wrong?
You import different modules, but with the same name (they are all called __init__). Python will, when you import the second module, notice that you have already imported a module called __init__ and return that.
What you should import instead are the gmail and hello modules themselves. That might work.
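A minimal sketch of that idea within the question's own loader (my illustration, not from the original answer; note that imp is deprecated in favour of importlib in modern Python): pass a unique module name, such as the plugin folder name already stored in plugin["name"], instead of reusing '__init__' for every plugin:

def load_plugin(plugin):
    # Use the plugin's folder name (e.g. 'gmail', 'hello') as the module name,
    # so each plugin gets its own entry in sys.modules instead of every plugin
    # overwriting the single name '__init__'.
    return imp.load_module(plugin["name"], *plugin["info"])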
However, I'd urge you to reconsider writing your own plugin system. It really isn't that easy. A quick search finds several plugin systems, some lightweight, some not so much. Most have been abandoned, which is an indication that this isn't as easy as you might think.
The plugin system in Python that probably is the most widely used, also the most flexible and amongst the oldest, is the Zope Component Architecture. See also the docs. Note that in Python 3 you use class decorators instead of the class body statements.
Another popular plugin system is the "entry points" of Distribute. That's quite lightweight, although I don't know if you can load or unload plugins with that.
Others I have never looked at are Yapsy, one called "Plugins" that seems abandoned, another one called "PyPlugin" also abandoned, and one called "easy_plugins" which seems very new and isn't abandoned yet.
And here is example code for a plugin manager.
