I am trying to write an interpreter in Python for code written in another language, and I am currently stuck on interpreting functions. It seems there is a way of creating classes dynamically, with something like MyClass = type("MyClass", (object, ), dict()), but I can't find an equivalent way of creating functions. I have considered a direct line-by-line translation of the code into Python code followed by execution, but that's not really what I want to do. So is there a way to create functions dynamically, or is the best I can get something like:
from types import FunctionType

foo_code = compile('def foo(): return "bar"', "<string>", "exec")
foo_func = FunctionType(foo_code.co_consts[0], globals(), "foo")
which still requires translating the source into Python first?
Just found out the answer. The best way to dynamically create the function is not to create it at all. That somewhat conflicts with my question, but it still solves the problem. The idea is to use a dictionary that maps each function's structure (the key) to the line where the function's body begins (the value). The interpreter can then find a function by its structure, jump to that line and execute it, and afterwards return to normal execution, having saved the current line of Main in advance.
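A minimal sketch of that idea, assuming the interpreted program is held as a list of source lines with a toy syntax ("func name arity" opens a definition, "end" closes it, "call name args..." invokes it); find_functions, run and execute_line are illustrative names, not part of any real interpreter:

def execute_line(line):
    # stand-in for the real single-line interpreter
    print("executing:", line)

def find_functions(lines):
    # map each function's structure (name, arity) to the line where its body begins
    table = {}
    for i, line in enumerate(lines):
        if line.startswith("func "):
            _, name, arity = line.split()
            table[(name, int(arity))] = i + 1
    return table

def run(lines):
    table = find_functions(lines)
    pc = 0                                    # current line of Main
    while pc < len(lines):
        line = lines[pc]
        if line.startswith("func "):          # skip over definitions during normal execution
            while not lines[pc].startswith("end"):
                pc += 1
            pc += 1
        elif line.startswith("call "):
            _, name, *args = line.split()
            return_to = pc + 1                # save the line of Main to come back to
            pc = table[(name, len(args))]     # jump into the function body
            while not lines[pc].startswith("end"):
                execute_line(lines[pc])
                pc += 1
            pc = return_to                    # resume normal execution
        else:
            execute_line(line)
            pc += 1

run(["say hello", "func greet 1", "say hi", "end", "call greet world"])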
def enumerator(fruits):
    for index, fruit in enumerate(fruits):
        print(f"Fruit: {fruit}, under the index: {index}.")
just_a_variable = enumerator(["apple", "banana", "lemon"])  # I'm just assigning the function call
# to the variable "just_a_variable"
# and boom, when I run the program the function is called. Makes no sense (it shouldn't work this way, should it?)
I assume this is happening because there is a print statement in the function, but it still doesn't make sense. If I change the print statement to a return, it suddenly doesn't output anything, which is what I was expecting from just using print. Am I missing something here?
In general, if you add parentheses after a function name (as in either of the two examples below), the function is called.
function_name(arguments)
variable = function_name(arguments)
If you just want a variable to point to a function:
variable = function
Then the following two statements will become identical:
variable(arguments)
function(arguments)
Having said that, this seems a bit pointless to me. With your function defined the way it currently is, there isn't a way I know of to "assign" it to a variable and pass arguments at the same time.
This does change the structure of your code, but you can perhaps use yield instead of return.
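For example, a generator version of the function from the question (just a sketch of that suggestion) only runs its body when you iterate over it:

def enumerator(fruits):
    for index, fruit in enumerate(fruits):
        yield f"Fruit: {fruit}, under the index: {index}."

just_a_variable = enumerator(["apple", "banana", "lemon"])  # nothing runs yet
for line in just_a_variable:                                # the body runs here
    print(line)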
The line just_a_variable = enumerator(["apple", "banana", "lemon"]) is calling the function enumerator. Technically, that is what the parentheses after enumerator do.
Perhaps you noticed that simply running the file executes that line (and therefore calls enumerator). That is how Python works as a scripting language: top-level statements run in order when the file is executed (in contrast to Java and other compiled languages).
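If the goal is to keep such top-level calls from running when the file is imported by another module, the usual idiom (standard Python, not something specific to the question) is a __main__ guard:

def enumerator(fruits):
    for index, fruit in enumerate(fruits):
        print(f"Fruit: {fruit}, under the index: {index}.")

if __name__ == "__main__":
    # only runs when the file is executed directly, not when it is imported
    enumerator(["apple", "banana", "lemon"])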
I want to create Python functions on the fly, using the following template:
def x(sender, data):
    r = b''
    r += sender.send_type0(data[0])
    r += sender.send_type1(data[1])
    r += sender.send_type2(data[2])
    ...
    r += sender.send_typen(data[n])
    return r
I want to create many of those functions from an array which holds type data as a 2D array.
I can generate simple functions at runtime, but here I would like the for loop to run only at generation time, not on every call of the generated function.
How can I achieve this?
You could use getattr to look up the attribute dynamically:
def x(sender, data):
    return b"".join(
        getattr(sender, "send_type" + str(i))(data[i])
        for i in range(len(data))
    )
I don't think you're going to find much of a performance advantage in having the function precompiled, assuming that is even possible...
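If you really do want the loop to run only once, at generation time, one option is a factory function that precomputes the method names and closes over them. This is only a sketch, assuming each function is described by a row of integer type indices; make_sender_func and type_indices are illustrative names:

def make_sender_func(type_indices):
    # the loop over types runs once, here, at generation time
    names = ["send_type%d" % t for t in type_indices]

    def x(sender, data):
        # each call only looks the precomputed names up on sender and joins the results
        return b"".join(
            getattr(sender, name)(data[i]) for i, name in enumerate(names)
        )
    return x

# hypothetical 2D type array: one generated function per row
funcs = [make_sender_func(row) for row in [[0, 1, 2], [2, 2, 0]]]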
Assume I have a list my_list, a variable var, and a block of code that modifies the list using the variable
my_list = ['foo']
var = 'bar'
my_list.append(var)
In the actual task I have a lot of variables like var and a lot of commands like append which modify the list. I want to relegate those commands to another module. In the case at hand I would like to have two modules: modify.py which contains the modifying commands
my_list.append(var)
and main.py which defines the list and the variable and somehow uses the code from the modify.py
my_list = ['foo']
var = 'bar'
import_and_run modify
The goal is to make the main file more readable. Modifying commands in my case can be nicely grouped and would really be good as separate modules. However, I am only aware of the practice when one imports a function from a module, not a block of code. I do not want to make the whole modify.py module a function because
1) I don't want to pass all the arguments needed. Rather, I want modify.py to have direct access to main.py's namespace.
2) The code in modify.py is not really a function; it runs only once. Also, I do not want the whole module to be the body of a function, that just does not feel right.
How do I achieve that? Or is the whole approach wrong?
If your goal is to make the code more readable, I'd suggest taking these steps.
Decompose your problem into a series of separate actions.
Give these actions names.
Define a function main in your module that calls functions named after the actions:
def main():
    do_step1()
    do_step2()
    # etc
    return
Separate your existing code into the functions that you're calling in main().
As #flaschbier suggested, collect related, common parameters into dictionaries to make passing them around easier to manage.
Consider repeating these steps on your new functions, decomposing them into sub-functions.
Done well, you should be left with a file that's easier to look at, because the function definitions and their indented bodies break up the flow of text.
The code should be easier to reason about because you only need to understand one function at a time, instead of the entire script.
Generally you want to keep all the code related to a particular task in a single module, unless there's more than, say, 500 lines. But before moving code into separate modules, see if you can reduce the total lines of code by factoring repeated code into functions, or by making your code more succinct: for example, see if for loops can be replaced by list comprehensions.
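For instance, a loop that builds a list one append at a time (a generic illustration, not code from the question) can usually collapse into a single comprehension:

# loop version
squares = []
for n in range(10):
    squares.append(n * n)

# equivalent comprehension
squares = [n * n for n in range(10)]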
Consider using code linting tools to help you make the code well-formatted.
So in summary: don't go against the grain of Python by hiding code in another
module and going down the import_and_run route. Instead use good code organisation and Python's inherent good visual structure to make your code readable.
By the way, it seems like you haven't yet grasped the concept of Python modules.
Modules in Python are simply the .py files. Every function, class, or even variable in a .py file can be imported into another program.
Consider a (perhaps crazy) example like this crazy.py:
class crazyCl:
    # crazy stuff
    pass

def crazyFn():
    # some other crazy stuff
    pass

crazyVar = 'Please do not try this at home'
Now, to import any of these into another program, say goCrazy.py in the same folder, simply do this:
import crazy  # see ma, no .py

if __name__ == '__main__':
    print(crazy.crazyVar)  # Please do not try this at home
This is a simple introduction to Python modules; there are many other features, like packages, worth trying out. As an introduction, though, this should do. Hope you got some idea.
Question: Is there a way to make a function object in Python using strings?
Info: I'm working on a project in which I store data in a sqlite3 backend. Nothing too crazy about that. A DAL class is very commonly produced through code generation, because the code is so incredibly mundane. But that gave me an idea. In Python, when an attribute is not found, the method __getattr__ (if you define it) is called before an error is raised. So the way I figure it, with a parser and a logic tree I could dynamically generate the code I need on its first call, then save the function object as a local attribute. For example:
DAL.getAll()
#getAll() not found, call __getattr__
DAL.__getattr__(self,attrib)#in this case attrib = getAll
##parser logic magic takes place here and I end up with a string for a new function
##convert string to function
DAL.getAll = newFunc
return newFunc
I've tried the compile function, but exec and eval are far from satisfactory in terms of being able to accomplish this kind of feat. I need something that will allow multiple lines in the function. Is there another way to do this, besides those two, that doesn't involve writing it to disk? Again, I'm trying to make a function object dynamically.
P.S.: Yes, I know this has horrible security and stability problems. Yes, I know this is a horribly inefficient way of doing this. Do I care? No. This is a proof of concept. "Can Python do this? Can it dynamically create a function object?" is what I want to know, not some superior alternative. (Though feel free to tack on superior alternatives after you've answered the question at hand.)
The following puts the symbols that you define in your string into the dictionary d:
d = {}
exec("def f(x): return x", d)
Now d['f'] is a function object. If you want to use variables from your program in the code in your string, you can pass them in via d:
d = {'a': 7}
exec("def f(x): return x + a", d)
Now d['f'] is a function object that is dynamically bound to d['a']. When you change d['a'], you change the output of d['f'](x).
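A quick check of that behaviour (just an illustration of the code above, with an arbitrary argument):

d = {'a': 7}
exec("def f(x): return x + a", d)
print(d['f'](1))   # 8
d['a'] = 100
print(d['f'](1))   # 101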
can't you do something like this?
>>> def func_builder(name):
... def f():
... # multiline code here, using name, and using the logic you have
... return name
... return f
...
>>> func_builder("ciao")()
'ciao'
basically, assemble a real function instead of assembling a string and then trying to compile that into a function.
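Applied to the DAL example from the question, a sketch of the same closure idea; the caching via setattr and the stub body are illustrative assumptions, not the asker's real parser logic:

class DAL:
    def __getattr__(self, attrib):
        # only called when `attrib` is not found through normal lookup
        def method(*args, **kwargs):
            # the "parser logic magic" would build the real query here
            return "ran %s with %r" % (attrib, args)
        setattr(self, attrib, method)   # cache it so __getattr__ isn't hit next time
        return method

dal = DAL()
print(dal.getAll())        # first call builds and caches getAll
print(dal.getAll("rows"))  # later calls use the cached function directly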
If it is simply a proof of concept, then eval and exec are fine. You can also do this with pickle strings, YAML strings, and anything else you decide to write a constructor for.
I've been thinking about this for far too long and haven't gotten anywhere; maybe some of you can help.
I have a folder of Python scripts, all of which have the same surrounding body (literally, I generated it from a shell script), but each has one chunk that's different from all the others. In other words:
Top piece of code (always the same)
Middle piece of code (changes from file to file)
Bottom piece of code (always the same)
And I realized today that this is a bad idea: for example, if I want to change something in the top or bottom sections, I need to write a shell script to do it. (Not that that's hard, it just seems very bad code-wise.)
So what I want to do, is have one outer python script that is like this:
Top piece of code
Dynamic function that calls the middle piece of code (based on a parameter)
Bottom piece of code
And then every other Python file in the folder can simply be the middle piece of code. However, a normal module wouldn't work here (unless I'm mistaken), because I would get the code I need to execute from an argument, which would be a string, and thus I wouldn't know which function to run until runtime.
So I thought up two more solutions:
I could write up a bunch of if statements, one to run each script based on a certain parameter. I rejected this, as it's even worse than the previous design.
I could use:
os.system("python scriptName.py")
which would run the script, but calling Python to call Python doesn't seem very elegant to me.
So does anyone have any other ideas? Thank you.
If you know the name of the function as a string and the name of module as a string, then you can do
mod = __import__(module_name)
fn = getattr(mod, fn_name)
fn()
Another possible solution is to have each of your repetitive files import the functionality from the main file
from topAndBottom import top, bottom
top()
# do middle stuff
bottom()
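For that to work, topAndBottom.py only needs to define those two functions and avoid calling them at import time (a minimal sketch with placeholder bodies):

# topAndBottom.py
def top():
    print("top piece of code")      # placeholder for the real top section

def bottom():
    print("bottom piece of code")   # placeholder for the real bottom section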
In addition to the several answers already posted, consider the Template Method design pattern: make an abstract class such as
class Base(object):
    def top(self): ...
    def bottom(self): ...
    def middle(self): raise NotImplementedError
    def doit(self):
        self.top()
        self.middle()
        self.bottom()
Every pluggable module then makes a class which inherits from this Base and must override middle with the relevant code.
Perhaps not warranted for this simple case (you do still have to import the right module in order to instantiate its class and call doit on it), but still worth keeping in mind (together with its many Pythonic variations, which I have amply explained in many tech talks now available on youtube) for cases where the number or complexity of "pluggable pieces" keeps growing -- Template Method (despite its horrid name;-) is a solid, well-proven and highly scalable pattern [[sometimes a tad too rigid, but that's exactly what I address in those many tech talks -- and that problem doesn't apply to this specific use case]].
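A sketch of how one pluggable module might use that base class (the module and class names here are made up for illustration, and Base is assumed to live in base.py):

# plugin_one.py  (hypothetical pluggable module)
from base import Base

class PluginOne(Base):
    def middle(self):
        print("middle piece of code specific to this plugin")

if __name__ == '__main__':
    PluginOne().doit()   # runs top(), this middle(), then bottom()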
However, normal module wouldn't work here (unless I'm mistaken), because I would get the code I need to execute from the arguement, which would be a string, and thus I wouldn't know which function to run until runtime.
It will work just fine: use the __import__ builtin or, if you have a very complex layout, the imp module to import your script. You can then get the function with module.__dict__[funcname], for example.
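For instance (middle_part_1 and run are hypothetical module and function names):

module = __import__("middle_part_1")   # imports middle_part_1.py from the folder
func = module.__dict__["run"]          # look the function up by its string name
func()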
Importing a module (as explained in other answers) is definitely the cleaner way to do this, but if for some reason that doesn't work, as long as you're not doing anything too weird you can use exec. It basically runs the content of another file as if it were included in the current file at the point where exec is called. It's the closest thing Python has to a source statement of the kind included in many shells. As a bare minimum, something like this should work:
exec(open(filename).read())
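If you also want to control which names the executed file can see, you can pass an explicit namespace dictionary to exec (a sketch; middle.py and param are made-up names):

shared = {"param": 42}         # the only names the middle piece gets to see
with open("middle.py") as f:
    exec(f.read(), shared)     # run the middle piece with `shared` as its globals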
How about this?
def do_thing_one():
    pass

def do_thing_two():
    pass

dispatch = {
    "one": do_thing_one,
    "two": do_thing_two,
}
# do something to get your string from the command line (optparse, argv, whatever)
# and put it in variable "mystring"
# do top thing
f = dispatch[mystring]
f()
# do bottom thing