In the Python C API, I already know how to import a module via PyImport_ImportModule, as described in the Python documentation under Importing Modules. I also know that there are many ways to create, allocate, or initialize a module, and that there are functions for operating on modules, as described in the Python documentation under Module Objects.
But how can I get a function from a module (and call it), get a type/class from a module (and instantiate it), or get an object from a module (and operate on it)? In short, how do I get anything out of a module and do what I want with it?
This may be a foolish question, but I really cannot find any tutorial or documentation about it. The only way I can see to achieve this is to use PyModule_GetDict to get the module's __dict__ and fetch what I want from it, as described in the latter documentation page I mentioned. But that documentation also recommends against using this function to manipulate the module.
So is there an "official way" or best practice for getting something out of a module?
According to the documentation for PyModule_GetDict:
It is recommended extensions use other PyModule_*() and PyObject_*() functions rather than directly manipulate a module’s __dict__.
The functions you need are generic object functions (PyObject_*) rather than module functions (PyModule_*), which I suspect is why you were looking in the wrong place.
You want to use PyObject_GetAttr or PyObject_GetAttrString.
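At the Python level, PyObject_GetAttr/PyObject_GetAttrString correspond to the built-in getattr(); here is a sketch of the same lookup-and-use pattern in pure Python, using the stdlib math and collections modules as stand-ins for your own module:

```python
import collections
import math

# getattr() is the Python-level counterpart of PyObject_GetAttrString:
# fetch a function from a module, then call it.
sqrt = getattr(math, "sqrt")
print(sqrt(16.0))  # 4.0

# The same pattern fetches a class and instantiates it.
Counter = getattr(collections, "Counter")
c = Counter("abracadabra")
print(c.most_common(1))  # [('a', 5)]
```

In C, the equivalent calls would be PyObject_GetAttrString on the module object returned by PyImport_ImportModule, followed by PyObject_CallFunction (or PyObject_CallObject) on the attribute you fetched.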
Related
Hi, I'm trying to learn Python. When looking at examples, sometimes I see a whole package imported, while other times a specific module within the package is imported. What is the advantage of each?
For example:
import face_detection
vs
from face_detection import model
The former seems better to me, as it requires you to be more explicit: you need to write face_detection.model every time you'd like to use the model module, rather than being vague and simply writing model. Is it really just a code-style thing?
Here is a good explanation: https://softwareengineering.stackexchange.com/a/187471
Short answer: there is no real difference.
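You can see that both forms end up naming the same module object, using os.path as a stand-in for face_detection.model (the module is loaded and cached in sys.modules only once either way):

```python
import sys
import os.path
from os import path

# Both import forms bind the same module object; only the name differs.
print(os.path is path)  # True

# The module is cached once in sys.modules, regardless of import style.
print(sys.modules["os.path"] is path)  # True
```

So the choice really is about readability: `face_detection.model` tells the reader where `model` comes from at every use site, at the cost of some verbosity.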
If I use a function to read the contents of a file in one module:
import os
import string
import cPickle

def get_objectstore():
    with open(os.getcwd() + "\\data.store", "rb") as infile:
        objA = cPickle.load(infile)
        # coder and alpha are defined elsewhere in the module
        objectstore = eval(str(objA).translate(string.maketrans(coder, alpha)))
        return objectstore
and I call this function from my main program like this:
from main_vars import get_objectstore
objectstore = get_objectstore()
Now objectstore has all the images and sounds used by my program. How can I use objectstore in all the other modules loaded into the main program?
How can I use objectstore in all other modules loaded into the main program?
This is one of those things that you can do, but almost certainly shouldn't… so first I'll explain how, then explain why not.
If you want something to be directly available in every module, without having to import it in each module, just as if it were a builtin… the answer is to add it to the builtins. So, how do you do that?
Well, technically, there's no guaranteed safe way to do it, but practically, I believe monkeypatching the builtins module works in every version of every major implementation. So:
import builtins
builtins.objectstore = objectstore
Note that in Python 2.x, the module is called __builtin__, but works the same way.
This doesn't work in a few cases, e.g., inside code being run by an exec with a custom globals that provides a custom __builtins__. If you need to handle that… well, you can't handle it portably. But what if you only care about CPython (and I think PyPy, but not Jython, and I don't know about Iron…)? In CPython, it's guaranteed that every global environment, even the ones created for compile, exec, etc., will contain something named __builtins__ that's either the builtin dict for that environment, or some object whose namespace is the builtin dict for the environment. If there are no builtins at all, or you're looking at the global environment for the builtins module itself, it may be missing. So, you can write this:
try:
    __builtins__.objectstore = objectstore
except AttributeError:
    __builtins__['objectstore'] = objectstore
(That doesn't handle the case where you're running code inside the builtins namespace itself, because… I'm not sure what you'd want to do there, to be honest.)
Of course that's ugly. That's probably on purpose.
Now, why don't you want to do that?
Well, for one thing, it makes your code a lot harder to read. Anyone looking at your code can't tell where objectstore was defined, except by searching some completely unrelated module that isn't even referenced in the current module.
For another, it can lead to problems that are hard to debug. If you later change the list of which modules import which other modules, some module that depends on objectstore being patched in may run before it's been patched in and therefore fail.
Also, the fact that it's not actually documented that you can monkeypatch builtins, or that doing so affects the builtin environment, is a good sign that it's not something you should be relying on.
It's better to make what you're doing explicit. Have objectstore be a module global for some module that every other module imports from. Then it's immediately clear what's happening, on even a cursory glance.
For example, why not add that objectstore = get_objectstore() to main_vars, and then have every module do from main_vars import objectstore? Or maybe you even want to make it a singleton, so it's safe for anyone to call get_objectstore() and know that they'll all get back a single, shared value. Without knowing exactly what you're trying to accomplish, it's hard to suggest a best solution. All I can say for sure is that making objectstore a builtin-like cross-module global is very unlikely to be the best solution for almost anything you might be trying to accomplish.
I tried to look around but I couldn't find anything clear about this topic.
Are built-in functions implemented in a module that is automatically imported every time Python is launched? If so, which module is it?
Or are built-in functions just embedded functions inside the Python interpreter?
For CPython, the built-in functions are (for the most part) implemented in the bltinmodule.c file.
The exceptions are mostly the types; things like str and dict and list have their own C files in the Objects directory of the C source; these are listed as a table in the bltinmodule source.
Technically speaking, this is treated as a separate module object by the implementation, but one that is automatically searched when the current global namespace does not contain a name. So when you use abs() in your code, and there is no abs object in the global namespace, the built-ins module is also searched for that name.
It is also exposed as the __builtin__ module (or builtins in Python 3) so you can access the built-in names even if you shadowed any in your code. Like the sys module, however, it is compiled into the Python binary, and is not available as a separate dynamically loaded file.
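You can watch that fallback behaviour from Python itself (Python 3 shown):

```python
import builtins

# The built-in names live in the builtins module...
print(builtins.abs is abs)  # True

# ...and the interpreter falls back to it when a global is missing.
abs = lambda x: "shadowed"  # shadow the builtin with a module global
print(abs(-3))              # the shadow wins in this namespace
print(builtins.abs(-3))     # 3 -- the real builtin is still reachable
del abs                     # remove the shadow; lookup falls through again
print(abs(-3))  # 3
```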
I'm attempting to broadcast a module to other Python processes with MPI. Of course, a module itself isn't pickleable, but its __dict__ is. Currently, I'm pickling the __dict__ and making a new module in the receiving process. This worked perfectly with some simple, custom modules. However, when I try to do this with NumPy, there's one thing that I can't pickle easily: the ufunc.
I've read this thread, which suggests pickling the __name__ and __module__ of the ufunc, but it seems to rely on having NumPy fully built and present before rebuilding it. I need to avoid using the import statement altogether in the receiving process, so I'm curious whether the getattr(numpy, name) approach mentioned would work with a module that doesn't have ufuncs included yet.
Also, I don't see a __module__ attribute on the ufunc in the NumPy documentation:
http://docs.scipy.org/doc/numpy/reference/ufuncs.html
Any help or suggestions, please?
EDIT: Sorry, forgot to include thread mentioned above. http://mail.scipy.org/pipermail/numpy-discussion/2007-January/025778.html
Pickling a function in Python only serializes its name and the module it comes from. It does not transport code over the wire, so when unpickling you need to have the same libraries available as when pickling. On unpickling, Python simply imports the module in question, and grabs the items via getattr. (This is not limited to Numpy, but applies to pickling in general.)
Ufuncs don't pickle cleanly, which is a wart. Your options mainly are then to pickle just the __name__ (and maybe the __class__) of the ufunc, and reconstruct them later on manually. (They are not actually Python functions, and do not have a __module__ attribute.)
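You can see the name-based pickling behaviour with any ordinary module-level function; here math.sqrt stands in for the pickled object:

```python
import math
import pickle

# Pickling a function stores a reference (module name + qualified name),
# not the function's code.
data = pickle.dumps(math.sqrt)
print(b"math" in data and b"sqrt" in data)  # True

# Unpickling re-imports the module and fetches the name with getattr,
# so you get back the very same object.
print(pickle.loads(data) is math.sqrt)  # True

# The manual reconstruction suggested for ufuncs is the same idea:
name = "sqrt"
print(getattr(math, name) is math.sqrt)  # True
```

This is why the receiving process must be able to resolve the module by name: the pickle stream carries no code, only the lookup path.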
I'm trying to use a Python library written in C that has no documentation of any kind. I want to use introspection to at least see what methods and classes are in the modules. Does somebody have a function or library I can use to list the functions (with argument lists) and classes (with methods and member variables) within a module?
I found this article about Python introspection, but I'm pretty sure it doesn't apply to Python 2.5. Thanks for the help.
Here are some things you can do at least:
import module
print dir(module) # Find functions of interest.
# For each function of interest:
help(module.interesting_function)
print module.interesting_function.func_defaults
Mark Pilgrim's chapter 4, which you mention, does actually apply just fine to Python 2.5 (and any other recent 2.* version, thanks to backwards compatibility). Mark doesn't mention help, but I see other answers do.
One key bit that nobody (including Mark;-) seems to have mentioned is inspect, an excellent module in Python's standard library that really helps with advanced introspection.
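In modern Python, inspect can produce the whole survey in a few lines (inspect.signature supersedes the 2.x func_defaults attribute); here the stdlib json module stands in for the undocumented one:

```python
import inspect
import json

# List module-level functions with their signatures.
for name, func in inspect.getmembers(json, inspect.isfunction):
    try:
        print(name, inspect.signature(func))
    except ValueError:
        print(name, "(signature unavailable)")  # some C functions hide theirs

# List classes and their public methods.
for name, cls in inspect.getmembers(json, inspect.isclass):
    methods = [m for m, _ in inspect.getmembers(cls, inspect.isfunction)
               if not m.startswith("_")]
    print(name, methods)
```

Note that for extension modules written in C (as in the question), inspect.signature only works where the module exposes argument-clinic metadata; dir() and help() remain the reliable fallback.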
Just this is pretty good too:
import module
help(module)
It will print the docstring for the module, then list the contents of the module, printing their docstrings too.
The dir() function shows all the members a module has.