How to add builtin functions? - python

I am new to python programming. How can I add new built-in functions and keywords to python interpreter using C or C++?

In short, it is technically possible to add things to Python's builtins†, but it is almost never necessary (and generally considered a very bad idea).
In longer, it's obviously possible to modify Python's source and add new builtins, keywords, etc… But the process for doing that is a bit out of the scope of the question as it stands.
If you'd like more detail on how to modify the Python source, how to write C functions which can be called from Python, or something else, please edit the question to make it more specific.
If you are new to Python programming and you feel like you should be modifying the core language in your day-to-day work, that's probably an indicator you should simply be learning more about it. Python is used, unmodified, for a huge number of different problem domains (for example, numpy is an extension which facilitates scientific computing and Blender uses it for 3D animation), so it's likely that the language can handle your problem domain too.
†: you can modify the __builtin__ module to “add new builtins”… But this is almost certainly a bad idea: any code which depends on it will be very difficult (and confusing) to use anywhere outside the context of its original application. Consider, for example, if you add a greater_than_zero “builtin”, then use it somewhere else:
$ cat foo.py
import __builtin__
__builtin__.greater_than_zero = lambda x: x > 0

def foo(x):
    if greater_than_zero(x):
        return "greater"
    return "smaller"
Anyone who tries to read that code will be confused because they won't know where greater_than_zero is defined, and anyone who tries to use that code from an application which hasn't snuck greater_than_zero into __builtin__ won't be able to use it.
A better method is to use Python's existing import statement: http://docs.python.org/tutorial/modules.html
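A minimal sketch of that import-based alternative (the file names are hypothetical): in a real project the helper would live in its own file, say helpers.py, and foo.py would do `from helpers import greater_than_zero`. It is shown inline here so the example runs as a single file:

```python
# In a real project this would be helpers.py:
#     def greater_than_zero(x):
#         return x > 0
# and foo.py would begin with:
#     from helpers import greater_than_zero
# Inlined here so the sketch is self-contained:

def greater_than_zero(x):
    return x > 0

def foo(x):
    return "greater" if greater_than_zero(x) else "smaller"

print(foo(5))   # greater
print(foo(-1))  # smaller
```

Now anyone reading foo.py sees exactly where greater_than_zero comes from.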

In Python 3, the module is named builtins: use import builtins.
# example 1
import builtins

def f():
    print('f is called')

builtins.g = f
g()  # output: f is called

####################################

# example 2
import builtins

k = print

def f(s):
    k('new print called : ' + s)

builtins.print = f
print('abc')  # output: new print called : abc

While David Wolever's answer is good, it should be noted again that the asker is new to Python. Basically, all they want is a global function, which can be done in two different ways...
Define a function in your module and use it.
Define a function in a different module and import it using the "from module import *" statement.
I think the asker's solution is the 2nd option, and anyone new to Python who has this question should look into that option first.
For an advanced user, I would agree with Wolever's suggestion that it is a bad idea to insert a new function into the builtins module. However, maybe the user is looking for a way to avoid importing an always-used module in every script in the project, and that is a valid use case. Of course the code will not make sense to people who aren't part of the project, but that shouldn't be a concern. Such users should look into the PYTHONSTARTUP environment variable; look it up in the index of the Python documentation and follow the links that discuss it. Note, however, that this works for the interactive interpreter only; it has no effect on scripts.
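A sketch of what such a startup file might contain (the file name and the helper gtz are made up for this example). Save it somewhere, point PYTHONSTARTUP at it, and it runs each time the interactive interpreter starts:

```python
# Hypothetical contents of ~/.pythonstartup; activate it with
#     export PYTHONSTARTUP=~/.pythonstartup
# This file runs only for interactive sessions, not for scripts.
import builtins

def gtz(x):
    return x > 0

builtins.gtz = gtz  # now usable at the prompt without any import
```

After this, typing gtz(3) at the interactive prompt works in any session without an import.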
For an all-around solution, look into this function that I have implemented: https://drive.google.com/file/d/19lpWd_h9ipiZgycjpZW01E34hbIWEbpa/view
Yet another way is extending or embedding Python, which is a relatively complex topic; it is best to read the Python documentation on it. For basic users, all I would say is that...
Extending means adding new builtin modules to the Python interpreter.
Embedding means inserting Python interpreter into your application.
And advanced users already know what they are doing!

You can use builtins module.
Example 1:
import builtins

def write(x):
    print(x)

builtins.write = write
write("hello")
# output:
# hello
Example 2:
import builtins

def hello(*name):
    print(f"Hello, {' '.join(name)}!")

builtins.hello = hello
hello("Clark", "Kent")
# output:
# Hello, Clark Kent!

Related

Using __future__ style imports for module specific features in Python

The Python future statement from __future__ import feature provides a nice way to ease the transition to new language features. Is it possible to implement a similar feature for Python libraries: from myproject.__future__ import feature?
It's straightforward to set module-wide constants with an import statement. What isn't obvious to me is how you could ensure these constants don't propagate to code executed in imported modules -- they should also require a future import to enable the new feature.
This came up recently in a discussion of possible indexing changes in NumPy. I don't expect it will actually be used in NumPy, but I can see it being useful for other projects.
As a concrete example, suppose that we do want to change how indexing works in some future version of NumPy. This would be a backwards incompatible change, so we decide to use a future statement to ease the transition. A script using this new feature looks something like this:
import numpy as np
from numpy.__future__ import orthogonal_indexing
x = np.random.randn(5, 5)
print(x[[0, 1], [0, 1]]) # should use the "orthogonal indexing" feature
# prints a 2x2 array of random numbers
# we also want to use a legacy project that uses indexing, but
# hasn't been updated to the use the "orthogonal indexing" feature
from legacy_project import do_something
do_something(x) # should *not* use "orthogonal indexing"
If this isn't possible, what's the closest we can get for enabling local options? For example, is it possible to write something like:
from numpy import future
future.enable_orthogonal_indexing()
Using something like a context manager would be fine, but the problem is that we don't want to propagate options to nested scopes:
with numpy.future.enable_orthogonal_indexing():
    print(x[[0, 1], [0, 1]])  # should use the "orthogonal indexing" feature
    do_something(x)  # should *not* use "orthogonal indexing" inside do_something
The way Python itself does this is pretty simple:
In the importer, when you try to import from a .py file, the code first scans the module for future statements.
Note that the only things allowed before a future statement are strings, comments, blank lines, and other future statements, which means it doesn't need to fully parse the code to do this. That's important, because future statements can change the way the code is parsed (in fact, that's the whole point of having them…); strings, comments, and blank lines can be handled by the lexer step, and future statements can be parsed with a very simple special-purpose parser.
Then, if any future statements are found, Python sets a corresponding flag bit, then re-seeks to the top of the file and calls compile with those flags. For example, for from __future__ import unicode_literals, it does flags |= __future__.unicode_literals.compiler_flag, which changes flags from 0 to 0x20000.
In this "real compile" step, the future statements are treated as normal imports, and you will end up with a __future__._Feature value in the variable unicode_literals in the module's globals.
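The same flag mechanism is reachable from Python code via compile(). A small sketch, using the annotations future feature (PEP 563) rather than unicode_literals, since the latter is a no-op on Python 3; the snippet source is made up for illustration:

```python
import __future__

# Force a future feature on via compile() flags -- the same mechanism the
# importer uses after scanning future statements. With PEP 563 enabled,
# annotations are stored as strings rather than evaluated, so the
# undefined name below causes no error.
src = "def f(x: some_undefined_name): pass"
flags = __future__.annotations.compiler_flag
code = compile(src, "<demo>", "exec", flags=flags)
ns = {}
exec(code, ns)
print(ns["f"].__annotations__["x"])  # some_undefined_name
```

Each record in the __future__ module (e.g. __future__.annotations) carries its compiler_flag, which is exactly what gets OR-ed into the flags passed to compile.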
Now, you can't quite do the same thing, because you're not going to reimplement or wrap the compiler. But what you can do is use your future-like statements to signal an AST transform step. Something like this:
flags = []
for line in f:
    flag = parse_future(line)
    if flag is None:
        break
    if flag:  # skip the 0 returned for blank lines, comments, and strings
        flags.append(flag)
f.seek(0)
contents = f.read()
tree = ast.parse(contents, f.name)
for flag in flags:
    tree = transformers[flag](tree)
code = compile(tree, f.name, 'exec')
Of course you have to write that parse_future function to return 0 for a blank line, comment, or string, a flag for a recognized future import (which you can look up dynamically if you want), or None for anything else. And you have to write the AST transformers for each flag. But they can be pretty simple—e.g., you can transform Subscript nodes into different Subscript nodes, or even into Call nodes that call different functions based on the form of the index.
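A toy version of such a transformer, assuming Python 3.9+ (where node.slice is the index expression itself) and a made-up hook name __index_hook__ standing in for whatever the feature would actually call:

```python
import ast

class SubscriptToCall(ast.NodeTransformer):
    """Rewrite x[i] (in load context) into __index_hook__(x, i).

    The hook name is hypothetical; a real future-flag system would
    choose its own. Assumes Python 3.9+ AST layout.
    """
    def visit_Subscript(self, node):
        self.generic_visit(node)
        if isinstance(node.ctx, ast.Load):
            return ast.copy_location(
                ast.Call(
                    func=ast.Name(id="__index_hook__", ctx=ast.Load()),
                    args=[node.value, node.slice],
                    keywords=[],
                ),
                node,
            )
        return node

tree = ast.parse("y = x[2]")
tree = SubscriptToCall().visit(tree)
ast.fix_missing_locations(tree)

ns = {"x": [10, 20, 30], "__index_hook__": lambda obj, idx: obj[idx]}
exec(compile(tree, "<demo>", "exec"), ns)
print(ns["y"])  # 30
```

Here the hook just forwards to normal indexing, but it could implement orthogonal indexing or anything else, decided per-module by the future flags collected earlier.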
To hook this into the import system, see PEP 302. Note that this gets simpler in Python 3.3, and simpler again in Python 3.4, so if you can require one of those versions, instead read the import system docs for your minimum version.
For a great example of import hooks and AST transformers being used in real life, see MacroPy. (Note that it's using the old 2.3-style import hook mechanism; again, your own code can be simpler if you can use 3.3+ or 3.4+. And of course your code isn't generating the transforms dynamically, which is the most complicated part of MacroPy…)
__future__ in Python is both a module and not quite one. The __future__ names are not actually imported from anywhere: they are constructs recognized by the Python bytecode compiler, deliberately chosen so that no new syntax needs to be created. There is also a __future__.py in the library directory; it can be imported as such: import __future__, and then you can, for example, inspect __future__.print_function to find out in which Python version the feature became optionally available and in which version it became the default.
It is possible to make a __future__ module that knows what is being imported. Here is an example of myproject/__future__.py that can intercept feature imports on per module basis:
import sys
import inspect

class FutureMagic(object):
    inspect = inspect

    @property
    def more_magic(self):
        importing_frame = self.inspect.getouterframes(
            self.inspect.currentframe())[1][0]
        module = importing_frame.f_globals['__name__']
        print("more magic imported in %s" % module)

sys.modules[__name__] = FutureMagic()
At load time the module replaces itself with a FutureMagic() instance. Whenever more_magic is imported from myproject.__future__, the more_magic property method will be called, and it will print out the name of the module that imported the feature:
>>> from myproject.__future__ import more_magic
more magic imported in __main__
Now, you could keep a record of the modules that have imported this feature. Doing import myproject.__future__; myproject.__future__.more_magic would trigger the same machinery, but you could also require that the more_magic import appear at the beginning of the file: at that point the importing module's globals shouldn't contain anything except values returned from this fake module; otherwise, treat the access as inspection only.
However, the real question is how you could use this: finding out which module a function is being called from is quite expensive, which would limit the usefulness of this feature.
Thus a possibly more fruitful approach could be to use import hooks to do source translation on abstract syntax trees on modules that do from mypackage.__future__ import more_magic, possibly changing all object[index] into __newgetitem__(operand, index).
No, you can't. The real __future__ import is special in that its effects are local to the individual file where it occurs. But ordinary imports are global: once one module does import blah, blah is executed and is available globally; other modules that later do import blah just retrieve the already-imported module. This means that if from numpy.__future__ changes something in numpy, everything that does import numpy will see the change.
As an aside, I don't think this is what that mailing list message is suggesting. I read it as suggesting an effect that is global, equivalent to setting a flag like numpy.useNewIndexing = True. This would mean that you should only set that flag at the top level of your application if you know that all parts of your application will work with that.
No, there is no reasonable way to do this. Let's go through the requirements.
First, you need to figure out which modules have your custom future statement enabled. Standard imports aren't up to this, but you could require them to e.g. call some enabling function and pass __name__ as a parameter. This is somewhat ugly:
from numpy.future import new_indexing
new_indexing(__name__)
This falls apart in the face of importlib.reload(), but meh.
Next, you need to figure out whether your caller is running in one of these modules. You'd start by pulling out the stack via inspect.stack() (which won't work under all Python implementations, misses C extension modules, etc.) and then goof around with inspect.getmodule() and the like.
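A rough sketch of those two pieces together, for illustration only (the registry and function names are made up, and the caveats above apply: this relies on frame introspection and won't see C extension callers):

```python
import inspect

_enabled = set()  # hypothetical registry of module names that opted in

def new_indexing(module_name):
    """The 'enabling function' a module calls with its own __name__."""
    _enabled.add(module_name)

def feature_active():
    # Look one frame up the stack and check whether the calling
    # module opted in. Expensive, and CPython-specific.
    caller = inspect.stack()[1]
    return caller.frame.f_globals.get("__name__") in _enabled

# A module opting in, as in the from-import sketch above:
new_indexing(globals().get("__name__"))
active_here = feature_active()
print(active_here)  # True: this module opted in
```

Every feature-sensitive call pays the cost of a stack walk, which is exactly why this approach doesn't scale.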
Frankly, this is just a Bad Idea.
If the "feature" that you want to control can be boiled down to changing a name, then this is easy to do, like
from module.new_way import something
vs
from module.old_way import something
The feature you suggested is not, of course, but I would argue that this is the only Pythonic way of having different behavior in different scopes (and I do think you mean scope, not module, e.g., what if someone does an import inside a function definition), since scoping names is controlled and well supported by the interpreter itself.

How to change the string to class object in another file

I already use this function to change some string to class object.
But now I have defined a new module. How can I implement the same functionality?
import sys

def str2class(s):  # renamed from `str` to avoid shadowing the built-in
    return getattr(sys.modules[__name__], s)
I'd like to give an example, but it's hard to come up with one. Anyway, the main problem is probably a file-path issue.
If you really need an example, the GitHub code is here.
The Chain.py file needs to perform an auto action mechanism. Now it fails.
New approach:
Now I have put all the files in one folder and it works, but if I organize them into modules, it fails. So if the problem is in a module file, how can I convert the string to the corresponding class object?
Thanks for your help.
You can do this by accessing the namespace of the module directly:
import module
f = module.__dict__["func_name"]
# f is now a function and can be called:
f()
One of the greatest things about Python is that the internals are accessible to you, and that they fit the language paradigm. A name (of a variable, class, function, whatever) in a namespace is actually just a key in a dictionary that maps to that name's value.
If you're interested in what other language internals you can play with, try running dir() on things. You'd be surprised by the number of hidden methods available on most of the objects.
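To make the dictionary claim concrete, here is the same lookup done two ways against a real standard-library module; both return the very same function object:

```python
import math

# A module's namespace really is a dict: both lookups below
# retrieve the identical function object.
by_dict = math.__dict__["sqrt"]
by_getattr = getattr(math, "sqrt")
print(by_dict is by_getattr)  # True
print(by_dict(9.0))           # 3.0
```

getattr is the more idiomatic spelling, but it is doing exactly this dictionary lookup underneath (plus fallbacks such as module __getattr__).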
You probably should write this function like this:
def str2class(s):
    return globals()[s]
It's really clearer and works even if __name__ is set to __main__.

How can I dynamically import in Python?

I'm totally new to Python. Can anyone please let me know how to do the following two imports in a Python script, followed by the other line, while i changes on each loop iteration?
(The following three lines are in a for loop whose counter is i.)
import Test_include_i
from Test_include_i import *
model = Test_include_i.aDefinedFunction
Thank you very much :)
This is not a good idea, but this is the implementation of it:
from importlib import import_module  # Awesome line! :)

for i in range(1000):
    test_include = import_module("Test_include_%s" % i)
    model = test_include.aDefinedFunction
Regarding the differences between the provided methods:
__import__ is the low-level interface that handles from bla import blubb and import bla statements. Its direct use is, according to the docs, discouraged nowadays.
importlib.import_module is a convenience wrapper to __import__ which is preferred. The imported module will be recorded in sys.modules and thus be cached. If you changed the code during the session and want to use the new version you have to reload it explicitly using imp.reload.
imp.load_module is even closer to the internals and will always load the newest version of the module for you, i.e. if it is already loaded load_module is equivalent to a imp.reload call on the module. However to use this function you have to provide all 4 arguments, which are basically what imp.find_module returns.
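The sys.modules caching described above can be demonstrated directly with a synthetic module (the name "demo_cached_mod" is made up for this sketch); import_module finds the cached entry and returns the identical object rather than loading anything:

```python
import importlib
import sys
import types

# Register a synthetic module, then "import" it by name.
mod = types.ModuleType("demo_cached_mod")
mod.VALUE = 1
sys.modules["demo_cached_mod"] = mod

again = importlib.import_module("demo_cached_mod")
print(again is mod)  # True: the cached object was returned
```

This is why edits to a module's source during a session have no effect until you reload explicitly: the cache always wins.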
You need to use the __import__ function, and perhaps importlib, although you should consider if that's what you really want to do. Perhaps explain what you're trying to achieve, and there will probably be a better way.
You can use imp.load_module(), which accepts a string as module name. See http://docs.python.org/2/library/imp.html#imp.load_module

Python: Print functioncalls and namespaces

I was wondering, for debugging purposes, whether it is possible to see which namespaces and modules you are operating with once you do an import, and furthermore to see where a function was called from.
If I have a function f(x) and a rather complicated structure in my code, is there a way to see where f(x) is being called without adding prints all over the place?
Something like f.print_occurance()
"f was called in function integrate"
"f was called in function linspace"
"f was called in function enumerate"
Something similar to do this.
As for the first question, suppose I import a module "import somemodule"
Now if that module imports other modules, can I see what namespaces and modules have been imported/used without looking up somemodule.py (or its header file if it exists like in c/cpp).
Sorry if this is a newbie question, just seems like basic tricks I should know for error handling and debugging but googling returned nothing useful.
You could possibly write your own f.print_occurrence() attribute. Create a variable that is flagged true when the function starts; then f.print_occurrence() will recognize the flag and print accordingly.
You should definitely look at the traceback and inspect modules.
For a simple way to do this:
traceback.print_stack(limit=2)
This will be ugly, but tell you which function is being called and what called it. You can look at the modules for how to use them to fit your needs.
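One way to apply this to the question: put the print_stack call inside f itself, so every call site reports itself without sprinkling prints around (function names here are made up to mirror the question):

```python
import traceback

def f(x):
    # limit=2 keeps just f's own frame and its immediate caller,
    # so each call prints where f was invoked from (on stderr).
    traceback.print_stack(limit=2)
    return x * 2

def integrate():
    return f(3)

result = integrate()  # printed stack shows f was called from integrate
```

For heavier use, inspect.stack() gives you the same information as objects you can filter and format yourself.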
You can look at the imported modules with sys.modules

dynamic module creation

I'd like to dynamically create a module from a dictionary, and I'm wondering if adding an element to sys.modules is really the best way to do this. EG
context = {'a': 1, 'b': 2}
import types
test_context_module = types.ModuleType('TestContext', 'Module created to provide a context for tests')
test_context_module.__dict__.update(context)
import sys
sys.modules['TestContext'] = test_context_module
My immediate goal in this regard is to be able to provide a context for timing test execution:
import timeit
timeit.Timer('a + b', 'from TestContext import *')
It seems that there are other ways to do this, since the Timer constructor takes objects as well as strings. I'm still interested in learning how to do this though, since a) it has other potential applications; and b) I'm not sure exactly how to use objects with the Timer constructor; doing so may prove to be less appropriate than this approach in some circumstances.
EDITS/REVELATIONS/PHOOEYS/EUREKA:
I've realized that the example code relating to running timing tests won't actually work, because import * only works at the module level, and the context in which that statement is executed is that of a function in the timeit module. In other words, the globals dictionary used when executing that code is that of __main__, since that's where I was when I wrote the code in the interactive shell. So that rationale for figuring this out is a bit botched, but it's still a valid question.
I've discovered that the code run in the first set of examples has the undesirable effect that the namespace in which the newly created module's code executes is that of the module in which it was declared, not its own module. This is like way weird, and could lead to all sorts of unexpected rattlesnakeic sketchiness. So I'm pretty sure that this is not how this sort of thing is meant to be done, if it is in fact something that the Guido doth shine upon.
The similar-but-subtly-different case of dynamically loading a module from a file that is not in python's include path is quite easily accomplished using imp.load_source('NewModuleName', 'path/to/module/module_to_load.py'). This does load the module into sys.modules. However this doesn't really answer my question, because really, what if you're running python on an embedded platform with no filesystem?
I'm battling a considerable case of information overload at the moment, so I could be mistaken, but there doesn't seem to be anything in the imp module that's capable of this.
But the question, essentially, at this point is how to set the global (ie module) context for an object. Maybe I should ask that more specifically? And at a larger scope, how to get Python to do this while shoehorning objects into a given module?
Hmm, well one thing I can tell you is that the timeit function actually executes its code using the module's global variables. So in your example, you could write
import timeit
timeit.a = 1
timeit.b = 2
timeit.Timer('a + b').timeit()
and it would work. But that doesn't address your more general problem of defining a module dynamically.
Regarding the module definition problem, it's definitely possible and I think you've stumbled on to pretty much the best way to do it. For reference, the gist of what goes on when Python imports a module is basically the following:
module = imp.new_module(name)
execfile(file, module.__dict__)
That's kind of the same thing you do, except that you load the contents of the module from an existing dictionary instead of a file. (I don't know of any difference between types.ModuleType and imp.new_module other than the docstring, so you can probably use them interchangeably) What you're doing is somewhat akin to writing your own importer, and when you do that, you can certainly expect to mess with sys.modules.
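A modern, runnable version of that imp.new_module / execfile pair, with the module body supplied as a string instead of a file (module name and contents are made up for this sketch):

```python
import sys
import types

# Build a module from source held in a string -- no filesystem needed.
source = "def add(a, b):\n    return a + b\n"
mod = types.ModuleType("dyn_mod")
exec(compile(source, "<dyn_mod>", "exec"), mod.__dict__)
sys.modules["dyn_mod"] = mod  # registering it makes normal imports work

from dyn_mod import add
print(add(2, 3))  # 5
```

Because exec runs the source with mod.__dict__ as its globals, the module's code executes in its own namespace, avoiding the context-leak weirdness described above.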
As an aside, even if your import * thing was legal within a function, you might still have problems because oddly enough, the statement you pass to the Timer doesn't seem to recognize its own local variables. I invoked a bit of Python voodoo by the name of extract_context() (it's a function I wrote) to set a and b at the local scope and ran
print timeit.Timer('print locals(); a + b', 'sys.modules["__main__"].extract_context()').timeit()
Sure enough, the printout of locals() included a and b:
{'a': 1, 'b': 2, '_timer': <built-in function time>, '_it': repeat(None, 999999), '_t0': 1277378305.3572791, '_i': None}
but it still complained NameError: global name 'a' is not defined. Weird.
