Is it possible to obtain a string containing the whole namespace from which a function was imported?
I would like some code like the following:
import math
import numpy
whole1 = get_whole_namespace(math.sin)
whole2 = get_whole_namespace(numpy.sin)
The resulting whole1 should be the string "math.sin", and the resulting whole2 should be the string "numpy.sin". Is there any standard library (or third party) functionality equivalent to the get_whole_namespace function in my example?
I originally thought that my desired result should be available in some dunder method or attribute. But the __name__ attribute of both math.sin and numpy.sin is the string "sin".
The math.sin function also has a __qualname__ attribute (which is the string "sin"), and a __module__ attribute (which is the string "math").
The numpy.sin function does not have either a __qualname__ or a __module__ attribute.
I am using Python 3.10.6 and NumPy 1.23.4.
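For reference, a naive helper built only from those dunder attributes (my own sketch, not an existing library function) works for math.sin but fails for numpy.sin precisely because the ufunc lacks them:

import math

def get_whole_namespace(func):
    # Naive sketch: relies on __module__ and __qualname__, which plain
    # Python functions have but numpy.sin (a ufunc) does not.
    return f"{func.__module__}.{func.__qualname__}"

print(get_whole_namespace(math.sin))  # "math.sin"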
Related
I am trying to create a dynamic method executor, where I have a list that will always contain two elements. The first element is the name of the file, the second element is the name of the method to execute.
How can I achieve this?
My code below unfortunately doesn't work, but it should give a good indication of what I am trying to achieve.
from logic.intents import CenterCapacity

def method_executor(event):
    call_reference = ['CenterCapacity', 'get_capacity']
    # process method call
    return call_reference[0].call_reference[1]
Thanks!
You can use __import__ to look up the module by name and then use getattr to find the method. For example, if the following code is in a file called exec.py, then
def dummy(): print("dummy")

def lookup(mod, func):
    module = __import__(mod)
    return getattr(module, func)

if __name__ == "__main__":
    lookup("exec", "dummy")()
will output
dummy
Addendum
Alternatively, importlib.import_module can be used, which, although a bit more verbose, may be easier to use.
The most important difference between these two functions is that import_module() returns the specified package or module (e.g. pkg.mod), while __import__() returns the top-level package or module (e.g. pkg).
def lookup(mod, func):
    import importlib
    module = importlib.import_module(mod)
    return getattr(module, func)
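As a quick illustration of that difference, using the standard library's json.decoder submodule (any dotted submodule would do):

import importlib

m1 = importlib.import_module("json.decoder")  # returns the json.decoder submodule
m2 = __import__("json.decoder")               # returns the top-level json package

print(m1.__name__)  # 'json.decoder'
print(m2.__name__)  # 'json'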
Starting from:

from logic.intents import CenterCapacity

def method_executor(event):
    call_reference = ['CenterCapacity', 'get_capacity']
    # process method call
    return call_reference[0].call_reference[1]
Option 1
We have several options. The first one uses a class reference and getattr. For this we have to remove the quotes around the class name and instantiate the class before calling the reference (you do not have to instantiate the class when the method is a staticmethod).
def method_executor(event):
    call_reference = [CenterCapacity, 'get_capacity']  # We now store a class reference
    # process method call
    return getattr(call_reference[0](), call_reference[1])
Option 2
A second option is based on this answer. It revolves around using getattr twice. We first get the module using sys.modules[__name__] and then get the class from it using getattr.
import sys

def method_executor(event):
    call_reference = ['CenterCapacity', 'get_capacity']
    class_ref = getattr(sys.modules[__name__], call_reference[0])
    return getattr(class_ref, call_reference[1])
Option 3
A third option could be based on a full import path and __import__('module.class'); take a look at this SO post.
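A minimal sketch of that idea, assuming the class really lives at the dotted path logic.intents.CenterCapacity as in the question (importlib.import_module is used here instead of raw __import__ for readability):

import importlib

def method_executor(event):
    call_reference = ['logic.intents.CenterCapacity', 'get_capacity']
    # Split the dotted path into module path and class name.
    module_path, class_name = call_reference[0].rsplit('.', 1)
    cls = getattr(importlib.import_module(module_path), class_name)
    # Instantiate before looking up the method, as in Option 1.
    return getattr(cls(), call_reference[1])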
(Note: This answer assumes that the necessary imports have already happened and you just need a mechanism to invoke the functions of the imported modules. If you also want the import to be done by some program code, I will have to add that part, using the importlib library.)
You can do this:
globals()[call_reference[0]].__dict__[call_reference[1]]()
Explanation:
globals() returns a mapping between global variable names and their referenced objects. The imported module's name counts as one of these global variables of the current module.
Indexing this mapping object with call_reference[0] returns the module object containing the function to be called.
The module object's __dict__ maps each attribute-name of the module to the object referenced by that attribute. Functions defined in the module also count as attributes of the module.
Thus, indexing __dict__ with the function name call_reference[1] returns the function object.
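A self-contained illustration with a standard-library module standing in for the question's CenterCapacity (which is not available here):

import math

call_reference = ['math', 'sqrt']

# globals()['math'] is the module object; its __dict__['sqrt'] is the function.
result = globals()[call_reference[0]].__dict__[call_reference[1]](16)
print(result)  # 4.0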
What is the difference between a Python built-in and a normal object? We often say that in Python everything is an object.
For example, when I do this in Python 3.6:
>>> import os, inspect
>>> inspect.getsource(os.scandir)
TypeError: <built-in function scandir> is not a module, class, method, function, traceback, frame, or code object
I have two questions:
Is a built-in function an object? If not, is that why getsource throws a TypeError?
Why can't I find scandir listed as a built-in in the Python 3 documentation?
You can't access the source of built-ins and other objects that were written using the C API, since there is no Python source for them.
From the documentation for inspect.getsourcefile:
Return the name of the Python source file in which an object was defined. This will fail with a TypeError if the object is a built-in module, class, or function.
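If you just want to detect this case before calling getsource, the inspect module can tell you whether an object is a C-level built-in; a small illustration (not part of the quoted documentation):

import inspect
import os

# os.scandir is implemented in C in CPython, so there is no Python source.
print(inspect.isbuiltin(os.scandir))   # True on CPython
print(inspect.isfunction(os.scandir))  # False: not a pure-Python function

try:
    inspect.getsource(os.scandir)
except TypeError as exc:
    print("No Python source available:", exc)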
I am not really a programmer but a computational statistician, so I may understand complex algorithms but not simple programming constructs.
My original problem is to check, within a function, whether a module function is callable. I looked around and decided to go for a try (call the function) / except (import the module) approach to keep it simple. I'd love to search sys.mod for this, but I am running into some identifiability problems.
My current problem is that there are many ways of importing a function from a module: import module will define the function as module.function, while from module import function will define it as function, not to mention from module import function as myfunction. Therefore the same function can be called in several different ways.
My question is: is there a unique "signature" for a function that can be traced if the module is loaded? It would be fantastic to have the actual call alias to it.
PS: mod is a mathematical function and sys.mod returns a list of loaded modules, but Python (2.7) does not complain when you shadow the built-in mod function by doing from sys import mod. I find this a bit awkward; is there any way to avoid this sort of shadowing programmatically?
My original problem is to check within a function if a module function is callable.
By definition, all functions are callable. The built-in callable() will test whether an object is callable: http://docs.python.org/library/functions.html#callable
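For instance, at the interactive prompt:

>>> callable(len)
True
>>> callable(42)
False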
Therefore the same function can be called in several different ways.
Yes, but it will be the same object. You can just use f is g to test if f and g are the same object.
Update: Why would you need to use a unique ID? Seriously, don't do this. You have is for identity tests, and the __hash__ method to define the applicable hash function.
It would be fantastic to have the actual call alias to it.
Not sure at all what you mean, but I think you just want it to always be one object. Which it is already.
mod is a mathematical function and sys.mod returns a list of loaded modules, but Python (2.7) does not complain about from sys import mod. I find this a bit awkward?
Then don't do that. You know about the import ... as syntax. Also mod is not by default in the global namespace (the operator % is for that).
Finally, Python does complain about your import line:
>>> from sys import mod
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ImportError: cannot import name mod
(Thanks to kindall for pointing this out).
Assume I have a module demo.py with the following:
def foo(): pass
bar = foo
You can easily see that they're the same functions by using is or id():
>>> import demo
>>> from demo import *
>>> demo.foo is foo
True
>>> id(demo.foo) == id(foo)
True
>>> demo.bar is foo
True
They all refer to the same function object; it's just stored under different names in the namespace dictionaries.
# define a modulus function
def mod(a, b):
    return b % a

print(mod(5, 2))

Alias it:

modulus = mod
print(modulus(5, 2))
This is a pretty Pythonic construct, and it is quite intuitive for mathematicians.
The different ways of importing serve to place a function into a different "namespace" for later use in your program; sometimes you use a function a lot, so you choose the variant that is shorter to write.
You can also do something like:
myat = math.atanh
to make an alias in another "namespace" and use it as:
myat(x)
just as you would use math.atanh(x), which becomes shorter to write.
The typical programmer's approach would be to define everything you want to use up front and then use it. What you are trying to do, I believe, is to do it "lazily": import the module only when you need a function. That is why you wish to know whether the function is "callable".
Python is not a functional programming language (like Haskell, for example) where you can load or refer to things "on demand".
Hope this helps.
Here is some sample Python code:
import re
some_regex = re.compile(r"\s+1\s+")
result = some_regex.search(" 1 ")
dir(result)
I get back the following using Python 2.6.1:
['__copy__', '__deepcopy__', 'end', 'expand', 'group', 'groupdict', 'groups', 'span', 'start']
Yet result.re exists (from the interpreter):
>>> result.re
<_sre.SRE_Pattern object at 0x10041bc90>
How can an attribute not be listed when using the dir() function?
This page confirms the existence of the re attribute:
http://docs.python.org/library/re.html#re.MatchObject.re
Now I understand that if one tries to access an attribute which is not listed via dir(), then __getattr__ is called, but I don't see __getattr__ listed as one of the object's attributes either, so I'm left scratching my head.
Update
And here is proof of the existence of matchobject.re in the Python 2.6.1 documentation:
http://docs.python.org/release/2.6.1/library/re.html#re.MatchObject.re
You see this behavior because the class is implemented in C, and in the same way that dir() is unreliable with a custom __getattr__(), it is also unreliable when the C code defines a getattr function.
Here is a link to the Python 2.6 C code for the SRE_Match getattr function:
http://hg.python.org/cpython/file/f130ce67387d/Modules/_sre.c#l3565
Note that the methods defined in the match_methods array are visible in the dir() output, but anything handled by an if branch in the match_getattr() function is not visible.
In Python 2.6, it looks like this includes the following attributes: lastindex, lastgroup, string, regs, re, pos, and endpos.
Here is a link to some of the Python 2.7 code, which is slightly different: there is no getattr function implemented for SRE_Match, all methods and attributes are defined in the match_methods, match_members, and match_getset arrays, and everything is visible in dir().
http://hg.python.org/cpython/file/60a7b704de5c/Modules/_sre.c#l3612
The built-in function dir() is a convenience function and results in an approximate list of attributes. From the documentation:
Because dir() is supplied primarily as a convenience for use at an interactive prompt, it tries to supply an interesting set of names more than it tries to supply a rigorously or consistently defined set of names, and its detailed behavior may change across releases. For example, metaclass attributes are not in the result list when the argument is a class.
Note that it is impossible to always give a complete list of attributes, since classes can do in their __getattr__() and __getattribute__() methods whatever they want.
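Here is a pure-Python sketch of the same effect (not the actual C SRE code, just an illustration of how a dynamic __getattr__ hides attributes from dir()):

class Lazy(object):
    def __getattr__(self, name):
        # Called only when normal lookup fails; dir() knows nothing about it.
        if name == "re":
            return "computed on demand"
        raise AttributeError(name)

obj = Lazy()
print(obj.re)            # "computed on demand"
print("re" in dir(obj))  # False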
How can I get the int(), float(), dict(), etc. callables from their names? For example, I'm saving Python values to XML and storing the variable type as a string. Is there a way to get the callable back from the string when converting from the XML back to the Python type?
Normally I would do something like getattr(myobj, 'str'), but there is no module to use as the first argument for these built-in conversion functions. I've also tried getattr(object, 'str'), but this doesn't work either, since these functions are not part of the base object type; they are merely globals of the language.
Normally I would do something like getattr(myobj, 'str'), but there is no module to use as the first argument for these built-in conversion functions.
Wrong, there is:
import __builtin__
my_str = getattr(__builtin__, "str")
(In Python 3.x: import builtins)
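A Python 3 sketch of the round trip described in the question; the names type_name and raw_value are just placeholders for whatever you read out of the XML:

import builtins

type_name = "int"   # e.g. the type stored in the XML as a string
raw_value = "42"    # e.g. the element text

value = getattr(builtins, type_name)(raw_value)
print(value, type(value))  # 42 <class 'int'>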
You don't need to import anything
vars(__builtins__)['dict']
vars(__builtins__)['float']
vars(__builtins__)['int']
etc.
One quick way is to invoke it from the __builtin__ module. For example
>>> import __builtin__
>>> __builtin__.__dict__['str'](10)
'10'