I have an application that embeds python and exposes its internal object model as python objects/classes.
For autocompletion/scripting purposes I'd like to extract a mock of the internal object model, containing the doc tags, structure, functions, etc., so I can use it as a library source for the IDE autocompletion.
Does someone know of a library, or has some code snippet that could be used to dump those classes to source?
Use the dir() or globals() function to get the list of what has been defined so far. Then, to filter and browse your classes, use the inspect module.
Example toto.py:
class Example(object):
    """class docstring"""

    def hello(self):
        """hello docstring"""
        pass
Example browse.py:
import inspect

import toto

for name, value in inspect.getmembers(toto):
    # First, ignore interpreter-defined dunder names
    if name.startswith('__'):
        continue
    # Now only browse classes
    if not inspect.isclass(value):
        continue
    print('Found class %s with docstring "%s"' % (name, inspect.getdoc(value)))
    # Only browse functions in the current class
    # (in Python 3, unbound methods are plain functions, so use isfunction)
    for sub_name, sub_value in inspect.getmembers(value):
        if not inspect.isfunction(sub_value):
            continue
        print('  Found method %s with docstring "%s"'
              % (sub_name, inspect.getdoc(sub_value)))
python browse.py:
Found class Example with docstring "class docstring"
  Found method hello with docstring "hello docstring"
Also, this doesn't really answer your question, but if you're writing a sort of IDE, you can also use the ast module to parse Python source files and get information about them.
Python data structures are mutable (see What is a monkey patch?), so extracting a mock would not be enough. You could instead ask the interpreter for possible autocompletion strings dynamically using the dir() built-in function.
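A minimal sketch of that dynamic approach; the class Example here is just a stand-in for one of your embedded classes:

```python
# Minimal sketch: ask the live interpreter for completion candidates
# with dir() instead of exporting a static mock of the object model.
class Example(object):
    """class docstring"""

    def hello(self):
        """hello docstring"""

# Filter out the interpreter-defined dunder names.
candidates = [name for name in dir(Example) if not name.startswith('__')]
print(candidates)  # ['hello']
```

Because dir() inspects the live object, it stays correct even if the class is monkey-patched later.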
Related
I'd like to retrieve the qualified name (like the function or class property __qualname__) for the current function. inspect.stack() provides only the simple, unqualified name.
So I'm looking for a trick to calculate the qualified name from the available introspection info. Lacking knowledge of Python's internal representation, I resorted to trial and error and still could not figure out a method. Python naturally calculates qualified names, so all the required information is already available, but I don't know where to find it.
I searched on the Stack Overflow site and found a few solutions, each incurring a drawback.
Check frame's locals
The method was proposed by #Aran-Fey in #42322828: just scan the stack locals for the first occurrence of __qualname__.
import inspect

def retrieveQualifiedName():
    f = inspect.currentframe()
    while f:
        if '__qualname__' in f.f_locals:
            return f.f_locals['__qualname__']
        f = f.f_back
    return None
It works well for classes nested directly in other classes, but fails when there is a function in between.
good for:

class A:
    class B: pass

bad for:

class A:
    def a(self):
        class B: pass
        return B()
Offload all work to the 'executing' library
#Alex Hall suggested in #12190238 to use the third-party introspection library executing:
import inspect

import executing

def retrieveQualifiedName():
    f = inspect.currentframe().f_back.f_back
    return executing.Source.executing(f).code_qualname()
The library calculates the qualified name quite reliably, but two things make me search for another solution.
First, using a third-party dependency complicates support: I can no longer write a simple Python script, I need a full project definition and build artefacts. I would like to avoid that if possible.
Second, the executing library parses source code to infer the qualified name, causing excess processing. I would like to extract the qualified name from information already available to the interpreter.
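For what it's worth, if you can require Python 3.11 or newer, code objects now carry a co_qualname attribute, so the interpreter exposes exactly this information with no source parsing. A sketch (the helper name is made up):

```python
import inspect
import sys

def retrieve_qualified_name():
    """Return the caller's qualified name straight from its code object."""
    caller = inspect.currentframe().f_back
    # co_qualname was added to code objects in Python 3.11.
    return caller.f_code.co_qualname

class A:
    def a(self):
        return retrieve_qualified_name()

if sys.version_info >= (3, 11):
    print(A().a())  # A.a
```

On older interpreters the attribute does not exist, so this is only an option if you control the minimum Python version.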
I've tried to develop a "module expander" tool for Python 3, but I have some issues.
The idea is the following: for a given Python script main.py, the tool generates a functionally equivalent Python script expanded_main.py, by replacing each import statement with the actual code of the imported module; this assumes that the Python source code of the imported module is accessible. To do the job the right way, I'm using the built-in ast module of Python as well as astor, a third-party tool allowing the AST to be dumped back into Python source. The motivation for this import expander is to be able to compile a script into one single bytecode chunk, so the Python VM does not need to take care of importing modules (this could be useful for MicroPython, for instance).
The simplest case is the statement:
from my_module1 import *
To transform this, my tool looks for a file my_module1.py and replaces the import statement with the content of this file. Then, expanded_main.py can access any name defined in my_module1, as if the module had been imported the normal way. I don't care about subtle side effects that may reveal the trick. Also, to simplify, I treat from my_module1 import a, b, c like the previous import (with asterisk), without caring about possible side effects. So far so good.
Now here is my point. How could you handle this flavor of import:
import my_module2
My first idea was to mimic this by creating a class having the same name as the module and copying the content of the Python file indented:
class my_module2:
# content of my_module2.py
…
This actually works in many cases but, sadly, I discovered that it has several glitches: one of them is that it fails for functions whose body refers to a global variable defined in the module. For example, consider the following two Python files:
# my_module2.py
g = "Hello"

def greetings():
    print(g + " World!")
and
# main.py
import my_module2
print(my_module2.g)
my_module2.greetings()
At execution, main.py prints "Hello" and "Hello World!". Now, my expander tool shall generate this:
# expanded_main.py
class my_module2:
    g = "Hello"

    def greetings():
        print(g + " World!")

print(my_module2.g)
my_module2.greetings()
At execution of expanded_main.py, the first print statement is OK ("Hello") but the greetings function raises an exception: NameError: name 'g' is not defined.
What actually happens is that:
- in the module my_module2, g is a global variable;
- in the class my_module2, g is a class variable, which must be referred to as my_module2.g.
Other similar side effects happen when you define functions, classes, etc. in my_module2.py and want to refer to them in other functions or classes of the same my_module2.py.
Any idea how these problems could be solved?
Apart from classes, are there other Python constructs that can mimic a module?
Final note: I'm aware that the tool should take care 1) of nested imports (recursion) and 2) of possible multiple imports of the same module. I don't expect to discuss these topics here.
You can execute the source code of a module in the scope of a function, specifically an instance method. The attributes can then be made available by defining __getattr__ on the corresponding class and keeping a copy of the initial function's locals(). Here is some sample code:
class Importer:
    def __init__(self):
        # content of my_module2.py, executed in the method's scope
        g = "Hello"

        def greetings():
            print(g + " World!")

        self._attributes = locals()

    def __getattr__(self, item):
        return self._attributes[item]

module1 = Importer()
print(module1.g)
module1.greetings()
Nested imports are handled naturally by replacing them the same way with an instance of Importer. Duplicate imports shouldn't be a problem either.
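Another construct worth considering, not used in the answer above, is a real module object created at runtime: executing the module's source with exec() into a fresh types.ModuleType keeps module-level globals behaving exactly as in a normal import. A sketch, with the source string standing in for the content of my_module2.py:

```python
import types

# Stand-in for the content of my_module2.py.
source = '''
g = "Hello"

def greetings():
    print(g + " World!")
'''

# A fresh module object whose __dict__ serves as the global namespace,
# so functions see module-level names exactly as after a real import.
my_module2 = types.ModuleType("my_module2")
exec(source, my_module2.__dict__)

print(my_module2.g)     # Hello
my_module2.greetings()  # Hello World!
```

Note, however, that this keeps the source as a string rather than inlining it as bytecode, so it may defeat the original single-chunk compilation goal.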
I want to make something like a plugin system but can't get it working. To be specific, I have some requirements:
I have a main script that should search for other Python scripts in a ./plugins dir and load them.
This main script searches for classes that inherit from Base, using globals().
If I place these classes in the same main file it works very well, but I can't get it to work the way I want.
Is it possible to do this in Python?
I tried to make something like this:
source: plugins/test.py
class SomeClass(Base):
    def __init__(self):
        self.name = "Name of plugin"
The main script just executes some methods on this class.
You could either import the Python file dynamically or use exec (make sure to define a namespace to execute in; otherwise, the namespace you call it from will be used). Then use Base.__subclasses__, assuming Base is a new-style class, or call a function from the imported plugin module. In the latter case, you must provide a plugin-registration mechanism.
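A small sketch of the Base.__subclasses__() route, with the plugin source inlined as a string instead of being read from ./plugins:

```python
class Base:
    pass

# Stand-in for the content of plugins/test.py.
plugin_source = '''
class SomeClass(Base):
    def __init__(self):
        self.name = "Name of plugin"
'''

# Execute the plugin in a namespace that already provides Base.
namespace = {'Base': Base}
exec(plugin_source, namespace)

# Every subclass of Base is now discoverable without touching globals().
for plugin_cls in Base.__subclasses__():
    print(plugin_cls().name)  # Name of plugin
```

In a real plugin system you would read plugin_source from each file in the plugins directory before executing it.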
Use http://docs.python.org/2/library/imp.html#imp.load_module
For Python 3 I think there is importlib, but I don't know how to use it offhand.
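For the record, here is roughly how the importlib equivalent looks on Python 3 (the load_plugins helper is a made-up name):

```python
import importlib.util
import pathlib

def load_plugins(plugin_dir="plugins"):
    """Import every .py file in plugin_dir and return the module objects."""
    modules = []
    for path in sorted(pathlib.Path(plugin_dir).glob("*.py")):
        # Build an import spec from the file location, then execute it
        # in a fresh module object.
        spec = importlib.util.spec_from_file_location(path.stem, path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        modules.append(module)
    return modules
```

You can then scan each returned module with inspect.getmembers as described below.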
Try importing the modules using imp -- imp.load_module will let you create namespace names dynamically if you need to. Then you can use inspect.getmembers() and inspect.isclass() (example code in this answer) to find all the classes defined in your imported module. Test those for being subclasses of your plugin base class.
...or, more pythonically, just use hasattr to find out if the imported classes 'quack like a duck' (i.e., have the methods you expect from your plugin).
PS - I'm assuming you're asking about Python 2.x. Good idea to tag the post with the version number in future.
I'm trying to create a document out of my module. I used pydoc from the command-line in Windows 7 using Python 3.2.3:
python "<path_to_pydoc_>\pydoc.py" -w myModule
This led to my shell being filled with text, one line for each file in my module, saying:
no Python documentation found for '<file_name>'
It's as if Pydoc's trying to get documentation for my files, but I want to autocreate it. I couldn't find a good tutorial using Google. Does anyone have any tips on how to use Pydoc?
If I try to create documentation from one file using
python ... -w myModule\myFile.py
it says wrote myFile.html, and when I open it, it has one line of text saying:
# ../myModule/myFile.py
Also, it has a link to the file itself on my computer, which I can click and it shows what's inside the file on my web browser.
Another thing that people may find useful: make sure to leave off ".py" from your module name. For example, if you are trying to generate documentation for 'original' in 'original.py':
yourcode_dir$ pydoc -w original.py
no Python documentation found for 'original.py'
yourcode_dir$ pydoc -w original
wrote original.html
pydoc is fantastic for generating documentation, but the documentation has to be written in the first place. You must have docstrings in your source code, as RocketDonkey mentioned in the comments:
"""
This example module shows various types of documentation available for use
with pydoc. To generate HTML documentation for this module issue the
command:
pydoc -w foo
"""
class Foo(object):
"""
Foo encapsulates a name and an age.
"""
def __init__(self, name, age):
"""
Construct a new 'Foo' object.
:param name: The name of foo
:param age: The ageof foo
:return: returns nothing
"""
self.name = name
self.age = age
def bar(baz):
"""
Prints baz to the display.
"""
print baz
if __name__ == '__main__':
f = Foo('John Doe', 42)
bar("hello world")
The first docstring provides instructions for creating the documentation with pydoc. There are examples of different types of docstrings so you can see how they look when generated with pydoc.
As RocketDonkey suggested, your module itself needs to have some docstrings.
For example, in myModule/__init__.py:
"""
The mod module
"""
You'd also want to generate documentation for each file in myModule/*.py using
pydoc myModule.thefilename
to make sure the generated files match the ones that are referenced from the main module documentation file.
This works on Windows 7:
python -m pydoc -w my_module
That's it. If you want to document a function or a class, you put a string just after the definition. For instance:
def foo():
    """This function does nothing."""
    pass
But what about a module? How can I document what a file.py does?
Add your docstring as the first statement in the module.
"""
Your module's verbose yet thorough docstring.
"""
import foo
# ...
For packages, you can add your docstring to __init__.py.
For the packages, you can document it in __init__.py.
For the modules, you can add a docstring simply in the module file.
All the information is here: http://www.python.org/dev/peps/pep-0257/
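Once the docstring is in place, it becomes the module's __doc__ attribute; that is exactly what help() and pydoc read. For example, with a standard-library module:

```python
import os

# A module docstring is stored on the module object as __doc__;
# help() and pydoc render this same attribute.
print(os.__doc__.splitlines()[0])
```

The same holds for your own modules and for packages documented in __init__.py.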
Here is an example of Google-style Python docstrings, showing how a module can be documented. Basically, there is information about the module, how to execute it, information about module-level variables, and a list of ToDo items.
"""Example Google style docstrings.
This module demonstrates documentation as specified by the `Google
Python Style Guide`_. Docstrings may extend over multiple lines.
Sections are created with a section header and a colon followed by a
block of indented text.
Example:
Examples can be given using either the ``Example`` or ``Examples``
sections. Sections support any reStructuredText formatting, including
literal blocks::
$ python example_google.py
Section breaks are created by resuming unindented text. Section breaks
are also implicitly created anytime a new section starts.
Attributes:
module_level_variable1 (int): Module level variables may be documented in
either the ``Attributes`` section of the module docstring, or in an
inline docstring immediately following the variable.
Either form is acceptable, but the two should not be mixed. Choose
one convention to document module level variables and be consistent
with it.
Todo:
* For module TODOs
* You have to also use ``sphinx.ext.todo`` extension
.. _Google Python Style Guide:
http://google.github.io/styleguide/pyguide.html
"""
module_level_variable1 = 12345
def my_function():
pass
...
...
You do it the exact same way. Put a string in as the first statement in the module.
It's easy, you just add a docstring at the top of the module.
For PyPI Packages:
If you add docstrings like this in your __init__.py file, as seen below:
"""
Please refer to the documentation provided in the README.md,
which can be found at gorpyter's PyPI URL: https://pypi.org/project/gorpyter/
"""
# <IMPORT_DEPENDENCIES>
def setup():
    """Verify your Python and R dependencies."""
Then you will receive this in everyday usage of the help function.
help(<YOUR_PACKAGE>)

DESCRIPTION
    Please refer to the documentation provided in the README.md,
    which can be found at gorpyter's PyPI URL: https://pypi.org/project/gorpyter/

FUNCTIONS
    setup()
        Verify your Python and R dependencies.
Note that my help DESCRIPTION is triggered by having that first docstring at the very top of the file.