Python: why does this function go into an infinite loop?

def new_if(pred, then_clause, else_clause):
    if pred:
        then_clause
    else:
        else_clause

def p(x):
    new_if(x > 5, print(x), p(2*x))

p(1)
I think the function should stop once x reaches 8 and 8 is printed out.
Thanks a lot for helping.

Your code doesn't do what you think it does.
Every time you call p, it executes the code inside that function, which in your case calls new_if with some arguments. However, Python evaluates those arguments immediately, which means that before new_if is even entered, your code has already executed print(x) and p(2*x). This causes p to be called again, repeating the process.

There seems to be some general confusion about how your code is evaluated: in particular, what you think of as predicates and clauses really are not. The arguments are evaluated before the call to new_if is made. Hence, you get an infinite recursive call to p, because p(2*x) is evaluated almost as soon as you call p.
You could achieve what you want by passing functions, which you then evaluate within your new_if function. This can be done with lambda functions, like so:
def new_if(pred, then_clause, else_clause):
    if pred():
        then_clause()
    else:
        else_clause()

def p(x):
    new_if(lambda: x > 5, lambda: print(x), lambda: p(2*x))

p(1)
In this case, pred, then_clause, and else_clause are callables, which new_if needs to call (with ()) for them to be executed. With this version, p(1) recurses through x = 2, 4, and 8, and prints 8, exactly as you expected.


Python infinite loop and if statement

The question requires me to determine the output of the following code.
def new_if(pred, then_clause, else_clause):
    if pred:
        then_clause
    else:
        else_clause

def p(x):
    new_if(x > 5, print(x), p(2*x))

p(1)
I don't understand why it results in an infinite loop with output 1, 2, 4, 8, 16... and so on.
From what I understand, passing print(x) as a parameter will immediately print x; that is why the output has 1, 2, 4 even though the predicate is not True.
What I don't understand is why, once x > 5 and pred is True, the function does not end at the if pred: line.
Is it because there is no return value? Even after I put return then_clause or return else_clause, it is still an infinite loop.
I am unable to test this on Python Tutor as the recursion is infinite.
Thank you for your time.
Python doesn't let you pass expressions like x > 5 as code to other functions (at least, not directly as the code is trying to do). If you call a function like foo(x > 5), the x > 5 expression is evaluated immediately in the caller's scope and only the result of the evaluation is passed to the function being called. The same happens for function calls within other function calls. When Python sees foo(bar()), it calls bar first, then calls foo with bar's return value.
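To see that order of evaluation in isolation, here is a small standalone demo (foo and bar are illustrative names, not from the question):

    def bar():
        print("bar runs first")
        return 10

    def foo(value):
        print("foo received:", value)

    foo(bar())  # prints "bar runs first", then "foo received: 10"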
In the p(x) function, the code is trying to pass p(2*x) to the new_if function, but the interpreter never gets to new_if since the p calls keep recursing forever (or rather until an exception is raised for exceeding the maximum recursion depth).
One way to make the code work would be to put the expressions into lambda functions and to change new_if to call them. Bundling the code up into a function lets you delay the evaluation of the expression until the function is called, and there's no infinite recursion since pred_func is generally going to return True at some point (though it will still recurse forever if you call something like p(0) or p(-1)):
def new_if(pred_func, then_func, else_func):
    if pred_func():
        then_func()
    else:
        else_func()

def p(x):
    new_if(lambda: x > 5, lambda: print(x), lambda: p(2*x))
Note that lambdas feel a little bit odd to me for then_func or else_func, since we don't care about or use the return values from them at all. A lambda function always returns the result of its expression. That's actually pretty harmless in this case, since both print and p return None anyway, which is the same as what Python would return for us if we didn't explicitly return from a regular (non-lambda) function. But for me at least, it seems more natural to use a lambda when the return value means something. (Perhaps new_if should return the value returned from whichever function it calls?)
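For what it's worth, here is a minimal sketch of that variant (my own assumption based on the parenthetical above, not code from the question), where new_if propagates the chosen branch's return value and the then-branch returns x rather than printing it:

    def new_if(pred_func, then_func, else_func):
        # Return whichever branch's result, so callers can use the value.
        if pred_func():
            return then_func()
        else:
            return else_func()

    def p(x):
        return new_if(lambda: x > 5, lambda: x, lambda: p(2 * x))

    print(p(1))  # 8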
If you don't like writing closures (i.e. functions that have to look up x in the enclosing scope), you could instead use functools.partial to bind pre-calculated arguments to functions like print and p without calling those functions immediately. For instance:
from functools import partial

def p(x):
    return new_if(partial((5).__lt__, x), partial(print, x), partial(p, 2*x))
This only works if each of the expressions can be turned into a single call to an existing function. It can be done in this case (with a little creativity and careful syntax for pred_func), but probably won't be possible in more complicated cases.
It's also worth noting that the evaluation of 2*x happens immediately in the p function's scope, before new_if is called. If that multiplication were the expensive part of the else_func logic, this could be problematic (you'd want to defer the work until else_func was actually called).
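A small illustration of the difference (the names eager and deferred are hypothetical):

    from functools import partial

    x = 5
    eager = partial(print, 2 * x)    # 2*x is computed right here, while x == 5
    deferred = lambda: print(2 * x)  # 2*x is computed only when called

    x = 100
    eager()     # prints 10
    deferred()  # prints 200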
You are calling the function from itself; that causes an infinite loop, and you have nothing to stop the recursion.
def new_if(pred, then_clause, else_clause):
    if pred:
        then_clause
    else:
        else_clause

def p(x):
    if x < 5:
        new_if(x > 5, print(x), p(2*x))

p(1)
This will stop the infinite recursion.

Understanding a specific decorator: why is there a function inside a function?

def lowercasewrapper(func):
    def wrapper(*args, **kwargs):
        return [item.lower() for item in func(*args, **kwargs)]
    return wrapper
I understand what decorators do, and I have implemented the decorator above in my code and it works, but I'm a little unsure about a few things.
Why can't (func) be replaced by (*args, **kwargs), removing the def wrapper line in the process? I'm guessing the first two lines don't do the same thing, but to me that's what it seems like. It seems like:
    def lowercasewrapper(accept function)
    def wrapper(accept function)
What is the significance of the word func here? I noticed I can replace that word with anything and my code still works. Does the function I put below @lowercasewrapper just feed into the decorator regardless of what's in the ()?
Also, a little off topic, but the word item has no significance either, right? I can replace that with any word as well and it still works.
I would appreciate it if anyone would try to explain and answer in detail instead of redirecting me to a "what's a decorator" thread.
The short version is that a decorator actually turns this:
    @decorated
    def f(*args):
        # function body
Into this:
    def f(*args):
        # function body

    f = decorated(f)
So the reason you need the inner function is that the decorator must return a function, or the above makes no sense. With that in mind:
Point 1: notice that the last line returns wrapper, as in the function wrapper itself. This is why you can't remove that part; the function is actually building an altered function to return.
Points 2 and 3: You're right, it's just an arbitrary variable name and has no meaning outside this function.
So! With that in mind, here's what's going on in the decorator:
lowercasewrapper(f) is called (where f is apparently assumed to return an iterable of strings)
lowercasewrapper defines another function that takes some arbitrary arguments, then calls f on those arguments, then returns the result but with the items converted to lowercase
lowercasewrapper then returns the altered function
The biggest obstacle here is likely to be the idea of returning a function as opposed to returning the result of calling a function. Read up on first-class functions (or see Leon Young's link) if this makes no sense to you.
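To make that concrete, here is the decorator from the question applied to a hypothetical function (get_names is made up for illustration):

    def lowercasewrapper(func):
        def wrapper(*args, **kwargs):
            return [item.lower() for item in func(*args, **kwargs)]
        return wrapper

    @lowercasewrapper
    def get_names():
        return ["Alice", "BOB", "CaRoL"]

    # get_names now refers to wrapper; calling it runs the original
    # function, then lowercases each item in the result.
    print(get_names())  # ['alice', 'bob', 'carol']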

Is there a way to decorate/perform operations on every line of a function in Python?

Say I have a function or method that does something repetitive, like checking a value, before performing every operation it does, like so:
def myfunc():
    if mybool:
        do_operation_1()
    else:
        return
    if mybool:
        do_operation_2()
    else:
        return
    ...
These checks get repetitive, and end up wasting a lot of time and keyboard springs, especially when they are needed very often.
If you have control over the operation functions, like do_operation_N, you can decorate them with something that checks the boolean.
But what if you don't have control over the individual do_operation_N operations? If, for each line in a function or method, I want the same check to be performed, is there some way to "insert" it without explicitly writing it in on each operation line? For example, is there some decorator magic by which I could do the following?
def magic_decorator(to_decorate):
    def check(*args, **kwargs):
        for call in to_decorate:  # magic
            if mybool:
                to_decorate.do_call(call)  # magic
            else:
                return  # or break, raise an exception, etc.
    return check

@magic_decorator
def myfunc():
    do_operation_1()
    do_operation_2()
    ...
If there is a way to achieve this, I don't care if it uses decorators or not; I just want some way to say "for every line in function/method X, do Y first".
The "magic" example of a do_call method above is shorthand for what I'm after, but it would encounter serious problems with out-of-order execution of individual lines (for example, if a function's first line was a variable assignment, and its second was a use of that variable, executing them out of order would cause problems).
To be clear: the ability to externally control the line-by-line order of a function's execution is not what I'm trying to achieve: ideally, I'd just implement something that, in the natural execution order, would perform an operation each time myfunc does something. If "does something" ends up being limited to "calls a function or method" (excluding assignments, if checks, etc), that is fine.
Store your operations in a sequence, then use a loop:
def myfunc():
    ops = (do_operation_1, do_operation_2, do_operation_3)
    for op in ops:
        if mybool:
            op()
        else:
            return
Essentially, you can extract the file and line number from the decorated function, go re-read the source, parse it into an AST, insert guard nodes into the AST, and then compile the AST and use the result as the function.
This approach also handles very long functions, which would be unwieldy with the sequence-of-operations approach above.
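As a rough sketch of that idea (very much an assumption about the details: check_every_statement and the mybool guard are hypothetical names, compound statements get guarded as whole blocks, and inspect.getsource requires the source file to be available):

    import ast
    import inspect
    import textwrap

    def check_every_statement(func):
        """Rewrap each top-level statement of func's body in an
        'if mybool: <stmt> else: return' guard by rewriting its AST."""
        source = textwrap.dedent(inspect.getsource(func))
        tree = ast.parse(source)
        fdef = tree.body[0]
        fdef.decorator_list = []  # avoid re-applying this decorator on exec
        fdef.body = [
            ast.If(
                test=ast.Name(id="mybool", ctx=ast.Load()),
                body=[stmt],
                orelse=[ast.Return(value=None)],
            )
            for stmt in fdef.body
        ]
        ast.fix_missing_locations(tree)
        namespace = {}
        exec(compile(tree, filename="<ast>", mode="exec"),
             func.__globals__, namespace)
        return namespace[fdef.name]

Applied as @check_every_statement, each top-level statement of the decorated function then runs only while mybool is truthy.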

Python generators and coroutines

I am studying coroutines and generators in various programming languages.
I was wondering if there is a cleaner way to combine two coroutines implemented via generators than yielding back to the caller whatever the callee yields.
Let's say that we are using the following convention: all yields apart from the last one return None, while the last one returns the result of the coroutine. So, for example, we could have a coroutine that invokes another:
def A():
    # yield until a certain condition is met
    yield result

def B():
    # do something that may or may not yield
    x = bind(A())
    # ...
    return result
In this case, I wish that through bind (which may or may not be implementable; that's the question) the coroutine B would yield whenever A yields, until A returns its final result, which is then assigned to x, allowing B to continue.
I suspect that the actual code should explicitly iterate A so:
def B():
    # do something that may or may not yield
    for x in A(): ()
    # ...
    return result
which is a tad ugly and error prone...
PS: it's for a game where the users of the language will be the designers who write scripts (script = coroutine). Each character has an associated script, and there are many sub-scripts which are invoked by the main script; consider that, for example, run_ship invokes many times reach_closest_enemy, fight_with_closest_enemy, flee_to_allies, and so on. All these sub-scripts need to be invoked the way you describe above; for a developer this is not a problem, but for a designer the less code they have to write the better!
Edit: I recommend using Greenlet. But if you're interested in a pure Python approach, read on.
This is addressed in PEP 342, but it's somewhat tough to understand at first. I'll try to explain simply how it works.
First, let me sum up what I think is the problem you're really trying to solve.
Problem
You have a callstack of generator functions calling other generator functions. What you really want is to be able to yield from the generator at the top, and have the yield propagate all the way down the stack.
The problem is that Python does not (at a language level) support real coroutines, only generators. (But, they can be implemented.) Real coroutines allow you to halt an entire stack of function calls and switch to a different stack. Generators only allow you to halt a single function. If a generator f() wants to yield, the yield statement has to be in f(), not in another function that f() calls.
The solution that I think you're using now, is to do something like in Simon Stelling's answer (i.e. have f() call g() by yielding all of g()'s results). This is very verbose and ugly, and you're looking for syntax sugar to wrap up that pattern. Note that this essentially unwinds the stack every time you yield, and then winds it back up again afterwards.
Solution
There is a better way to solve this problem. You basically implement coroutines by running your generators on top of a "trampoline" system.
To make this work, you need to follow a couple of patterns:
1. When you want to call another coroutine, yield it.
2. Instead of returning a value, yield it.
so
def f():
    result = g()
    # …
    return return_value
becomes
def f():
    result = yield g()
    # …
    yield return_value
Say you're in f(). The trampoline system called f(). When you yield a generator (say g()), the trampoline system calls g() on your behalf. Then when g() has finished yielding values, the trampoline system restarts f(). This means that you're not actually using the Python stack; the trampoline system manages a callstack instead.
When you yield something other than a generator, the trampoline system treats it as a return value. It passes that value back to the caller generator through the yield statement (using .send() method of generators).
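A minimal trampoline along these lines might look like the following (a sketch of the pattern just described; run, f, and g are hypothetical names, and it assumes the convention from points 1 and 2 above, where yielding a non-generator means "return this value"):

    import types

    def run(task):
        """Drive a stack of generators: a yielded generator is treated as a
        sub-call, any other yielded value as that coroutine's return value."""
        stack = [task]
        value = None
        while stack:
            try:
                yielded = stack[-1].send(value)
            except StopIteration:
                stack.pop()
                value = None
                continue
            if isinstance(yielded, types.GeneratorType):
                stack.append(yielded)  # "call" the sub-coroutine
                value = None
            else:
                stack.pop()            # coroutine is done; this is its result
                value = yielded        # deliver it to the caller's yield
        return value

    def g():
        yield 42                # g "returns" 42

    def f():
        result = yield g()      # "call" g; resume with its result
        yield result + 1        # f "returns" 43

    print(run(f()))             # 43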
Comments
This kind of system is extremely important and useful in asynchronous applications, like those using Tornado or Twisted. You can halt an entire callstack when it's blocked, go do something else, and then come back and continue execution of the first callstack where it left off.
The drawback of the above solution is that it requires you to write essentially all your functions as generators. It may be better to use an implementation of true coroutines for Python - see below.
Alternatives
There are several implementations of coroutines for Python, see: http://en.wikipedia.org/wiki/Coroutine#Implementations_for_Python
Greenlet is an excellent choice. It is an extension module for CPython that provides true coroutines by swapping out the call stack.
Python 3.3 should provide syntax for delegating to a subgenerator, see PEP 380.
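For reference, with that PEP 380 syntax the bind pattern from the question becomes simply yield from (this requires Python 3.3 or later; the concrete values here are made up):

    def A():
        yield               # suspend as often as needed
        yield
        return 42           # a generator may return a value in Python 3.3+

    def B():
        x = yield from A()  # B yields whenever A yields; x becomes 42
        return x + 1

    gen = B()
    try:
        while True:
            next(gen)
    except StopIteration as stop:
        print(stop.value)   # 43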
Are you looking for something like this?
def B():
    for x in A():
        if x is None:
            yield
        else:
            break
    # continue, x contains the value A yielded

Python method that is also a generator function?

I'm trying to build a method that also acts like a generator function, at the flip of a switch (want_gen below).
Something like:
def optimize(x, want_gen):
    # ... declaration and validation code
    for i in range(100):
        # estimate foo, bar, baz
        # ... some code here
        x = calculate_next_x(x, foo, bar, baz)
        if want_gen:
            yield x
    if not want_gen:
        return x
But of course this doesn't work -- Python apparently doesn't allow yield and return in the same method, even though they cannot be executed simultaneously.
The code is quite involved, and refactoring the declaration and validation code doesn't make much sense (too many state variables -- I will end up with difficult-to-name helper routines of 7+ parameters, which is decidedly ugly). And of course, I'd like to avoid code duplication as much as possible.
Is there some code pattern that would make sense here to achieve the behaviour I want?
Why do I need that?
I have a rather complicated and time-consuming optimization routine, and I'd like to get feedback about its current state during runtime (to display in e.g. GUI). The old behaviour needs to be there for backwards compatibility. Multithreading and messaging is too much work for too little additional benefit, especially when cross-platform operation is necessary.
Edit:
Perhaps I should have mentioned that since each optimization step is rather lengthy (there are some numerical simulations involved as well), I'd like to be able to "step in" at a certain iteration and twiddle some parameters, or abort the whole business altogether. The generators seemed like a good idea, since I could launch another iteration at my discretion, fiddling in the meantime with some parameters.
Since all you seem to want is some sort of feedback for a long running function, why not just pass in a reference to a callback procedure that will be called at regular intervals?
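For example, a minimal sketch (the update rule here is a stand-in for the real calculate_next_x logic, and the callback signature is an assumption):

    def optimize(x, callback=None):
        for i in range(100):
            x = x * 0.9 + 1.0       # stand-in for calculate_next_x(...)
            if callback is not None:
                callback(i, x)      # report progress, e.g. to a GUI
        return x

    result = optimize(5.0, callback=lambda i, x: print("step", i, "->", x))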
An edit to my answer: why not just always yield? You can have a function that yields a single value. If you don't want that, then just choose to have your function return either a generator or the value:
def stuff(x, want_gen):
    if want_gen:
        def my_gen(x):
            # code with yield
            yield x
        return my_gen(x)  # return a generator object
    else:
        return x
That way you are always returning a value. In Python, functions are objects.
Well... we can always remember that yield was introduced in the language to facilitate generator objects, but one can always implement them from scratch, or get the best of both worlds:
class Optimize(object):
    def __init__(self, x):
        self.x = x

    def __iter__(self):
        x = self.x
        # ... declaration and validation code
        for i in range(100):
            # estimate foo, bar, baz
            # ... some code here
            x = calculate_next_x(x, foo, bar, baz)
            yield x

    def __call__(self):
        gen = iter(self)
        for x in gen:  # exhaust the generator
            pass
        return x       # final value, matching the original function

def optimize(x, want_gen):
    if want_gen:
        return iter(Optimize(x))
    else:
        return Optimize(x)()
Note that you don't even need the optimize function wrapper - I just put it in there so it becomes a drop-in replacement for your example (were it to work).
The way the class is declared, you can do simply:
for y in Optimize(x):
    # code
to use it as a generator, or:
    k = Optimize(x)()
to use it as a function.
Kind of messy, but I think this does the same as your original code was asking:
def optimize(x, want_gen):
    def optimize_gen(x):
        # ... declaration and validation code
        for i in range(100):
            # estimate foo, bar, baz
            # ... some code here
            x = calculate_next_x(x, foo, bar, baz)
            yield x  # yield unconditionally; non-generator mode exhausts below
    if want_gen:
        return optimize_gen(x)
    for x in optimize_gen(x):
        pass
    return x
Alternatively the for loop at the end could be written:
    return list(optimize_gen(x))[-1]
Now ask yourself if you really want to do this. Why do you sometimes want the whole sequence and sometimes only want the last element? Smells a bit fishy to me.
It's not completely clear what you want to happen if you switch between generator and function modes.
But as a first try: perhaps wrap the generator version in a new method which explicitly throws away the intermediate steps?
def gen():
    for i in range(100):
        yield i

def wrap():
    for x in gen():
        pass
    return x

print("wrap=", wrap())
With this version you could step into gen() by looping over smaller numbers of the range, make adjustments, and then use wrap() only when you want to finish up.
Simplest is to write two methods, one the generator and the other calling the generator and just returning the value. If you really want one function with both possibilities, you can always use the want_gen flag to test what sort of return value, returning the iterator produced by the generator function when True and just the value otherwise.
How about this pattern: make your three lines of changes to convert the function to a generator, and rename it to NewFunctionName. Then replace the existing function with one that either returns the generator if want_gen is True, or exhausts the generator and returns the final value.
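Concretely, that pattern might look like this (optimize_steps is a placeholder name; calculate_next_x, foo, bar, and baz are the question's own placeholders):

    def optimize_steps(x):               # the renamed generator version
        # ... declaration and validation code
        for i in range(100):
            x = calculate_next_x(x, foo, bar, baz)
            yield x

    def optimize(x, want_gen=False):
        if want_gen:
            return optimize_steps(x)     # caller iterates at its own pace
        for x in optimize_steps(x):      # old behaviour: run to completion
            pass
        return x                         # final value, as before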
