Python Sphinx imported method organization within a class

I have a class that imports many methods from submodules. Currently Sphinx is set up with 'bysource' member ordering, so the methods at least sort in the order in which the submodules are imported.
What I would like, though, is some kind of header or searchable text giving the name of the file each method comes from.
Current Directory Structure:
my_project
├── setup.py
└── my_package
    ├── __init__.py  # Classes for subpackage_1 and subpackage_2; imports occur here
    ├── subpackage_1
    │   ├── __init__.py
    │   ├── _submodule_a.py
    │   └── _submodule_b.py
    └── subpackage_2
        ├── __init__.py
        ├── _submodule_c.py
        └── _submodule_d.py
Sphinx module rst file:

Module contents
---------------

.. automodule:: my_package
   :members:
   :undoc-members:
   :show-inheritance:
   :member-order: bysource
In the my_package __init__.py the parent classes are defined, and all the submodule methods are imported into their related class:
class MyClass_1():
    ...
    from .subpackage_1._submodule_a import method_a
    from .subpackage_1._submodule_b import method_b, method_c

class MyClass_2():
    ...
    from .subpackage_2._submodule_c import method_d, method_e
    from .subpackage_2._submodule_d import method_f
In the resulting Sphinx documentation I see the methods under each class, but I'd like to be able to see which submodule file each method was sourced from. It doesn't have to be a subsection; merely including a header/note etc. would be fine, without having to resort to manually listing all the modules/methods in the Sphinx file.
There are hundreds of functions in the real package, so it would help the user to quickly discern which submodule file a method came from when viewing the documentation.
Current Sphinx Output:

Class MyClass_1...
* method_a
* method_b
* method_c
Class MyClass_2...
* method_d
* method_e
* method_f

Desired Sphinx Output:

Class MyClass_1...
submodule_a
* method_a
submodule_b
* method_b
* method_c
Class MyClass_2...
submodule_c
* method_d
* method_e
submodule_d
* method_f
Worst-case scenario, I could put the submodule filename in the docstring of each method within the file, but I would love it if someone has figured this out in a more automated fashion.
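One possible automated route (a sketch, not a tested solution for this exact layout): Sphinx's autodoc-process-docstring event lets you append text to every docstring as it is documented, and a method's originating file is recoverable from its __module__ attribute even after it has been imported into a class body. Something like this in conf.py might get you the searchable note:

    # conf.py -- sketch: append the defining submodule to each documented method

    def add_source_module(app, what, name, obj, options, lines):
        # autodoc may classify imported functions as "function" or "method"
        if what in ("function", "method"):
            module = getattr(obj, "__module__", None)
            if module:
                lines.append("")
                lines.append(f"*Imported from* ``{module}``")

    def setup(app):
        app.connect("autodoc-process-docstring", add_source_module)

This produces a note under each method (e.g. my_package.subpackage_1._submodule_a) rather than a grouping header, but it is searchable and requires no per-method docstring edits.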

Related

cannot patch class because of __init__.py

I'm writing unit tests for a codebase. My issue is I need to patch a database abstraction class but it is masked by the __init__.py file. My file structure is:
.
├── __init__.py
├── tools
│   ├── __init__.py
│   ├── test
│   │   ├── __init__.py
│   │   └── test_tool1.py
│   ├── tool1.py
│   └── tool2.py
└── utils
    ├── __init__.py
    └── sql_client.py
Now the content of the files:
# tools/__init__.py
from .tool1 import *  # `SQLClient` is imported here as part of tool1
from .tool2 import *  # `SQLClient` is imported here again, but now it's part of tool2

# tools/tool1.py
from utils import SQLClient

class A(object):
    ...
    def run(self, **kwargs):
        # the function I want to test
        sql = SQLClient("some id")

# tools/tool2.py
from utils import SQLClient
...

# utils/__init__.py
from .sql_client import *

# utils/sql_client.py
class SQLClient(object):
    # the SQL abstraction class that needs to be patched
    ...
In my test file, I'm creating a mock class to be used as the patch. I'm using absolute imports in my tests because later I want to move all the tests outside of the source folders.
# tools/test/test_tool1.py
from unittest.mock import MagicMock
from utils import SQLClient
from tools import A

class MockSQLClient(MagicMock):
    def __init__(self, *args, **kwargs):
        super().__init__(spec=SQLClient)
        self._mocks = {"select *": "rows"}

    def make_query(self, query):
        return self._mocks[query]

def test_run_func(monkeypatch):
    monkeypatch.setattr("tools.SQLClient", MockSQLClient)
    a = A()
    a.run()
    # rest of the test
Now, the issue is that tools/__init__.py imports everything from all the submodules, so the SQLClient from tool1 is masked by the SQLClient from tool2. As a result, my monkeypatch is patching tool2's SQLClient. I also cannot patch the tool1 copy directly with monkeypatch.setattr("tools.tool1.SQLClient", ...), because tools/__init__.py throws an error complaining there is no tool1 in tools.
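For what it's worth, one thing that often gets around this kind of masking (a sketch, untested against this exact layout): import the submodule object itself and patch the attribute on it, so the lookup never goes through the re-exports in tools/__init__.py. Reusing the MockSQLClient defined above:

    # tools/test/test_tool1.py -- sketch: patch the name where tool1 looks it up
    import tools.tool1

    def test_run_func(monkeypatch):
        # object form of setattr targets the submodule directly, sidestepping
        # the names re-exported (and masked) by tools/__init__.py
        monkeypatch.setattr(tools.tool1, "SQLClient", MockSQLClient)
        a = tools.tool1.A()
        a.run()

Since A.run looks SQLClient up in tool1's module globals, replacing it there is what the test actually needs.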
EDIT
Changed the question title; added more detail on where the test lives.

Airflow on Docker - Path issue

Working with Airflow, I'm trying to get a simple DAG working.
I wrote custom operators and other files that I want to import into the main file where the DAG logic lives.
Here is the folder structure:
├── airflow.cfg
├── dags
│   ├── __init__.py
│   ├── dag.py
│   └── sql_statements.sql
├── docker-compose.yaml
├── environment.yml
└── plugins
    ├── __init__.py
    └── operators
        ├── __init__.py
        ├── facts_calculator.py
        ├── has_rows.py
        └── s3_to_redshift.py
I set up the volumes correctly in the compose file, since I can see the files when I log into the container's terminal.
I followed some tutorials online, from which I added some __init__.py files.
The two non-empty __init__.py files are:
/plugins/operators/__init__.py:
from operators.facts_calculator import FactsCalculatorOperator
from operators.has_rows import HasRowsOperator
from operators.s3_to_redshift import S3ToRedshiftOperator

__all__ = [
    'FactsCalculatorOperator',
    'HasRowsOperator',
    'S3ToRedshiftOperator'
]
/plugins/__init__.py:
from airflow.plugins_manager import AirflowPlugin
import operators

# Defining the plugin class
class CustomPlugin(AirflowPlugin):
    name = "custom_plugin"
    # A list of class(es) derived from BaseOperator
    operators = [
        operators.FactsCalculatorOperator,
        operators.HasRowsOperator,
        operators.S3ToRedshiftOperator
    ]
    # A list of class(es) derived from BaseHook
    hooks = []
    # A list of class(es) derived from BaseExecutor
    executors = []
    # A list of references to inject into the macros namespace
    macros = []
    # A list of objects created from a class derived
    # from flask_admin.BaseView
    admin_views = []
    # A list of Blueprint objects created from flask.Blueprint
    flask_blueprints = []
    # A list of menu links (flask_admin.base.MenuLink)
    menu_links = []
But I keep getting errors from my IDE (saying No module named operators or Unresolved reference 'operators' inside the operators __init__.py), and everything fails to launch on the webserver.
Any idea how to set this up? Where am I wrong?
Are you using puckel's image?
If you are, you need to uncomment the # - ./plugins:/usr/local/airflow/plugins line (there may be more than one) in the docker-compose files (either Local or Celery). The rest of your setup looks fine to me.

Python3: Import class problems

Suppose we have a tree like this:
└── my_project
    ├── A.py
    ├── __init__.py
    └── my_apps
        ├── __init__.py
        ├── app_framework.py
        ├── app1.py
        └── app2.py
Inside the folder my_apps, there is a general class defined in app_framework.py. The rest of the files each define their own child class based on it.
The files would look like:
app_framework.py:

class App:
    ...

app1.py:

from app_framework import App

class MyApp1(App):
    ...

app2.py:

from app_framework import App

class MyApp2(App):
    ...
So, in my project folder, I want to use
from my_apps import MyApp1, MyApp2
But I got two errors:
1. ModuleNotFoundError: No module named 'app_framework'. I partially fixed it by changing from app_framework import App to from .app_framework import App.
2. ImportError: cannot import name 'MyApp1' from 'my_apps'.
I can use from my_apps.app1 import MyApp1 and from my_apps.app2 import MyApp2, but I would prefer from my_apps import MyApp1, MyApp2, which looks more concise. How do I do that?
Create an __init__.py file in my_apps and import the required classes into that file:
# my_project/my_apps/__init__.py
from .app1 import MyApp1
from .app2 import MyApp2
Then in your A.py you can do:
from .my_apps import MyApp1, MyApp2
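For completeness, a quick usage sketch of how the names then resolve (assuming my_project is used as a package, per the relative import above):

    # my_project/A.py -- the names now resolve through my_apps/__init__.py
    from .my_apps import MyApp1, MyApp2

    app1 = MyApp1()
    app2 = MyApp2()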

Python directory structure for modules

I have the following directory and file structure in my current directory:
├── alpha
│   ├── A.py
│   ├── B.py
│   ├── Base.py
│   ├── C.py
│   └── __init__.py
└── main.py
Each file under the alpha/ directory contains its own class, and each of those classes inherits from the Base class in Base.py. Right now, I can do something like this in main.py:
from alpha.A import *
from alpha.B import *
from alpha.C import *
A()
B()
C()
And it works fine. However, if I wanted to add a file and class "D" and then use D() in main.py, I'd have to go into my main.py and do "from alpha.D import *". Is there any way to do an import in my main file so that it imports EVERYTHING under the alpha directory?
It depends on what you are trying to do with the objects; one possible solution could be:

import importlib
import os

for file in os.listdir("alpha"):
    if file.endswith(".py") and not file.startswith("_") and not file.startswith("Base"):
        class_name = os.path.splitext(file)[0]
        module_name = "alpha" + "." + class_name
        loaded_module = importlib.import_module(module_name)
        loaded_class = getattr(loaded_module, class_name)
        class_instance = loaded_class()
Importing everything with * is not good practice, so if each of your files has only one class, importing that class explicitly is "cleaner" (class_name, in your case).
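Alternatively (a sketch along the same lines, untested): the loop can live in alpha/__init__.py, so main.py needs only a single import and a new D.py is picked up automatically:

    # alpha/__init__.py -- sketch: re-export each module's class automatically
    import importlib
    import pkgutil

    __all__ = []
    for _finder, _name, _is_pkg in pkgutil.iter_modules(__path__):
        if _name.startswith("_") or _name == "Base":
            continue
        _module = importlib.import_module(f".{_name}", __name__)
        globals()[_name] = getattr(_module, _name)  # class name matches file name
        __all__.append(_name)

main.py can then do from alpha import A, B, C (or from alpha import *), and D becomes available as soon as D.py exists.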

How to write a custom assert in Python

I am planning to split my one big test file into smaller test files based on the part of the code they test. I have custom assert functions for some of my tests. If I split them out into a separate file, how should I import them into the other test files?
TestSchemaCase.py:

import os
import unittest

import jsonschema

class TestSchemaCase(unittest.TestCase):
    """
    This will test our schema against the JSONTransformer output,
    just to make sure the schema matches the model.
    """
    # pylint: disable=too-many-public-methods
    _base_dir = os.path.realpath(os.path.dirname(__file__))

    def assertJSONValidates(self, schema, data):
        """
        This function asserts the validation works as expected.
        Args:
            schema (dict): The schema to test against
            data (dict): The data to validate using the schema
        """
        # pylint: disable=invalid-name
        validator = jsonschema.Draft4Validator(schema)
        self.assertIsNone(validator.validate(data))

    def assertJSONValidateFails(self, schema, data):
        """
        This function asserts that a ValidationError is raised.
        Args:
            schema (dict): The schema to validate from
            data (dict): The data to validate using the schema
        """
        # pylint: disable=invalid-name
        validator = jsonschema.Draft4Validator(schema)
        with self.assertRaises(jsonschema.ValidationError):
            validator.validate(data)
My questions are:
1. When I try to import them, I get an ImportError saying the module is not found. I am breaking TestValidation up into the small files shown below.
2. I know a ValidationError is raised in assertJSONValidateFails, but what should I return in case the validation passes?
tests/schema
├── TestSchemaCase.py
├── TestValidation.py
├── __init__.py
└── models
    ├── Fields
    │   ├── TestImplemen.py
    │   ├── TestRes.py
    │   └── __init__.py
    ├── Values
    │   ├── TestInk.py
    │   ├── TestAlue.py
    │   └── __init__.py
    └── __init__.py
3. And is this how we should inherit them?
class TestRes(unittest.TestCase, TestSchemaCase):
Thanks for your time. Sorry for the big post.
I did see that post, but it doesn't solve the problem.
I would suggest using a test framework that doesn't force you to put your tests in classes, like pytest.
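For instance (a sketch of how that might look, with a hypothetical file name): the custom asserts become plain functions in a shared helper module that every test file can import, with no inheritance needed. It also answers question 2: validate() returns None on success, so there is nothing to return in the passing case.

    # tests/schema/asserts.py -- hypothetical shared helper module
    import jsonschema
    import pytest

    def assert_json_validates(schema, data):
        # validate() returns None on success and raises ValidationError on
        # failure, so a passing validation simply falls through
        jsonschema.Draft4Validator(schema).validate(data)

    def assert_json_validate_fails(schema, data):
        with pytest.raises(jsonschema.ValidationError):
            jsonschema.Draft4Validator(schema).validate(data)

Each test file then does from tests.schema.asserts import assert_json_validates (with the tests directory on the import path).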
