Is it possible to write arbitrary depth delegated generators in Python?

I would like to write a class with the following interface.
class Automaton:
    """ A simple automaton class """

    def iterate(self, something):
        """ yield something and expects some result in return """
        print("Yielding", something)
        result = yield something
        print("Got \"" + result + "\" in return")
        return result

    def start(self, somefunction):
        """ start the iteration process """
        yield from somefunction(self.iterate)
        raise StopIteration("I'm done!")


def first(iterate):
    while iterate("what do I do?") != "over":
        continue


def second(iterate):
    value = yield from iterate("what do I do?")
    while value != "over":
        value = yield from iterate("what do I do?")


# A simple driving process
automaton = Automaton()
#generator = automaton.start(first)   # This one hangs
generator = automaton.start(second)   # This one runs smoothly
next_yield = generator.__next__()
for step in range(4):
    next_yield = generator.send("Continue...({})".format(step))
try:
    end = generator.send("over")
except StopIteration as excp:
    print(excp)
The idea is that Automaton will regularly yield values to the caller which will in turn send results/commands back to the Automaton.
The catch is that the decision process "somefunction" will be some user-defined function I have no control over, which means I can't really expect it to call the iterate method with a yield from in front. Worse, the user might want to plug some third-party function he has no control over into this Automaton class, meaning that the user might not be able to rewrite his somefunction to put yield from in front of the iterate calls.
To be clear: I completely understand why using the first function hangs the automaton. I am just wondering if there is a way to alter the definition of iterate or start that would make the first function work.
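Not part of the original post, but as a concrete illustration of one possible direction under the stated constraint: run the user's plain (non-generator) function on a worker thread and bridge each iterate call through a pair of queues, so no yield from is needed inside the user function. All names here (ThreadedAutomaton, _iterate, _DONE) are illustrative, a sketch rather than a definitive implementation:

import queue
import threading

_DONE = object()  # sentinel: the user function has returned

class ThreadedAutomaton:
    def __init__(self):
        self._out = queue.Queue()  # values handed out to the caller
        self._in = queue.Queue()   # results sent back by the caller

    def _iterate(self, something):
        """Plain blocking call: hand `something` out, wait for the reply."""
        self._out.put(something)
        return self._in.get()

    def start(self, somefunction):
        """Generator facade: yields whatever the user function asks for."""
        def run_user_function():
            somefunction(self._iterate)
            self._out.put(_DONE)

        threading.Thread(target=run_user_function, daemon=True).start()
        while True:
            item = self._out.get()
            if item is _DONE:
                return                      # raises StopIteration to the driver
            self._in.put((yield item))      # forward whatever send() gave us

def first(iterate):
    while iterate("what do I do?") != "over":
        continue

automaton = ThreadedAutomaton()
gen = automaton.start(first)
print(next(gen))                            # "what do I do?"
for step in range(4):
    print(gen.send("Continue...({})".format(step)))
try:
    gen.send("over")
except StopIteration:
    print("I'm done!")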

Related

How do I run two or more methods in a class like a chain?

I'm trying to learn OOP but I'm getting very confused about how I'm supposed to run the methods or return values. In the following code I want to run read_chapters() first, then sendData() with some string content that comes from read_chapters(). Some of the solutions I found did not use __init__, but I want to use it (just to see/learn how I can use it).
How do I run them? And in the solutions that don't use __init__, why do they only return self?
import datetime

class PrinceMail:
    def __init__(self):
        self.date2 = datetime.date(2020, 2, 6)
        self.date1 = datetime.date.today()
        self.days = (self.date1 - self.date2).days
        self.file = 'The_Name.txt'
        self.chapter = ''  # Not sure if it would be better if i initialize chapter here-
                           # or if i can just use a normal variable later

    def read_chapters(self):
        with open(self.file, 'r') as book:
            content = book.readlines()
            indexes = [x for x in range(len(content)) if 'CHAPTER' in content[x]]
            indexes = indexes[self.days:]
            heading = content[indexes[0]]
            try:
                for i in (content[indexes[0]:indexes[1]]):
                    self.chapter += i  # can i use normal var and return that instead?
                print(self.chapter)
            except IndexError:
                for i in (content[indexes[0]:]):
                    self.chapter += i
                print(self.chapter)
            return self?????  # what am i supposed to return? i want to return chapter
            # The print works here but returns nothing.

    # sendData has to run after readChapters automatically
    def sendData(self):
        pass
        # i want to get the chapter into this and do something with it

    def run(self):
        self.read_chapters().sendData()
        # I tried this method but it doesn't work for sendData
        # Is there anyother way to run the two methods?

obj = PrinceMail()
print(obj.run())
# This is kinda confusing as well
Chaining methods is just a way to shorten this code:
temp = self.read_chapters()
temp.sendData()
So, whatever is returned by read_chapters has to have the method sendData. You should put whatever you want to return in read_chapters in a field of the object itself (aka self) in order to use it after chaining.
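As a minimal, self-contained illustration of that idea (hypothetical class and stand-in method bodies), each method returns self so the calls can be strung together:

class Mailer:
    def __init__(self):
        self.chapter = ''

    def read_chapters(self):
        self.chapter = 'CHAPTER 1 ...'   # stand-in for the real file parsing
        return self                      # returning self is what enables chaining

    def sendData(self):
        print('sending:', self.chapter)  # the data lives on self, set by read_chapters
        return self

Mailer().read_chapters().sendData()      # equivalent to temp = ...; temp.sendData()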
First of all, __init__ has nothing to do with what you want to achieve here. You can consider it as a constructor for other languages, this is the first function that is called when you create an object of the class.
Now to answer your question: if I am correct, you just want to use the output of read_chapters in sendData. One way you can do that is by making read_chapters a private method (that is, if you don't want it used through the object) by putting __ at the start of the name, like __read_chapters, and then calling that function inside the sendData function.
Another point to consider: when you are storing results on self and don't intend to use the function through the object, you don't need to return anything. Assigning to self sets the attribute on the current instance, so you can end read_chapters at self.chapter += i and access the same attribute in sendData.
Ex -
def sendData(self):
    print(self.chapter)
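A slightly fuller sketch of that suggestion (the file handling is replaced by a placeholder), where the name-mangled helper is invoked from sendData and its result is stored on self:

class PrinceMail:
    def __init__(self):
        self.chapter = ''

    def __read_chapters(self):
        self.chapter = 'CHAPTER 1 ...'   # stand-in for the real file reading

    def sendData(self):
        self.__read_chapters()           # run the "private" helper first
        print(self.chapter)              # then use whatever it stored on self

PrinceMail().sendData()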
I'm not an expert but, the reason to return self is because it is the instance of the class you're working with and that's what allows you to chain methods.
For what you're trying to do, method chaining doesn't seem to be the best approach. You want to sendData() for each iteration of the loop in read_chapters()? (you have self.chapter = i which is always overwritten)
Instead, you can store the chapters in a list and send it after all the processing.
Also, and I don't know if this is good practice, but you can have a getter to return the data if you want to do something different with it (return self.chapter instead of self).
I'd change your code for:
import datetime

class PrinceMail:
    def __init__(self):
        self.date2 = datetime.date(2020, 2, 6)
        self.date1 = datetime.date.today()
        self.days = (self.date1 - self.date2).days
        self.file = 'The_Name.txt'
        self.chapter = []

    def read_chapters(self):
        with open(self.file, 'r') as book:
            content = book.readlines()
            indexes = [x for x in range(len(content)) if 'CHAPTER' in content[x]]
            indexes = indexes[self.days:]
            heading = content[indexes[0]]
            try:
                for i in (content[indexes[0]:indexes[1]]):
                    self.chapter.append(i)
            except IndexError:
                # not sure what you want to do here
                for i in (content[indexes[0]:]):
                    self.chapter.append(i)
            return self

    # sendData has to run after read_chapters automatically
    def sendData(self):
        pass
        # do whatever with self.chapter

    def get_raw_chapters(self):
        return self.chapter
Also, check PEP 8 Style Guide for naming conventions (https://www.python.org/dev/peps/pep-0008/#function-and-variable-names)
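Hypothetical usage of the refactored class, tying the chaining and the getter together (this assumes The_Name.txt exists and contains CHAPTER headings):

mail = PrinceMail()
mail.read_chapters().sendData()          # chaining works because read_chapters returns self
chapters = mail.get_raw_chapters()       # or pull the raw list out via the getter
print(''.join(chapters))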
More reading:
Method chaining - why is it a good practice, or not?
What __init__ and self do on Python?

Set with expiration not working inside running Python script

I have this class called DecayingSet which is a deque with expiration
from collections import deque
from time import time

class DecayingSet:
    def __init__(self, timeout):  # timeout in seconds
        self.timeout = timeout
        self.d = deque()
        self.present = set()

    def add(self, thing):
        # Return True if `thing` not already in set,
        # else return False.
        result = thing not in self.present
        if result:
            self.present.add(thing)
            self.d.append((time(), thing))
        self.clean()
        return result

    def clean(self):
        # forget stuff added >= `timeout` seconds ago
        now = time()
        d = self.d
        while d and now - d[0][0] >= self.timeout:
            _, thing = d.popleft()
            self.present.remove(thing)
I'm trying to use it inside a running script that connects to a streaming API.
The streaming API returns URLs that I am trying to put inside the deque to limit them from entering the next step of the program.
class CustomStreamListener(tweepy.StreamListener):
    def on_status(self, status, include_entities=True):
        longUrl = status.entities['urls'][0]['expanded_url']
        limit = DecayingSet(86400)
        l = limit.add(longUrl)
        print l
        if l == False:
            pass
        else:
            r = requests.get("http://api.some.url/show?url=%s" % longUrl)
When I use this class in an interpreter, everything is good.
But when the script is running and I repeatedly send in the same url, l returns True every time, indicating that the url is not inside the set, when it is supposed to be. What gives?
Copying my comment ;-) I think the indentation is screwed up, but it looks like you're creating a brand new limit object every time on_status() is called. Then of course it would always return True: you'd always be starting with an empty limit.
Regardless, change this:
l = limit.add(longUrl)
print l
if l == False:
    pass
else:
    r = requests.get("http://api.some.url/show?url=%s" % longUrl)
to this:
if limit.add(longUrl):
    r = requests.get("http://api.some.url/show?url=%s" % longUrl)
Much easier to follow. It's usually the case that when you're comparing something to a literal True or False, the code can be made more readable.
Edit
I just saw in the interpreter that the var assignment is the culprit.
How would I use the same obj?
You could, for example, create the limit object at the module level. Cut and paste ;-)
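A sketch of that suggestion, reusing the question's names: build the DecayingSet once, at module level, and share that one instance across every on_status() call:

limit = DecayingSet(86400)   # one shared instance for the whole process

class CustomStreamListener(tweepy.StreamListener):
    def on_status(self, status, include_entities=True):
        longUrl = status.entities['urls'][0]['expanded_url']
        if limit.add(longUrl):   # True only the first time within the timeout
            r = requests.get("http://api.some.url/show?url=%s" % longUrl)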

Conditional if in asynchronous python program with twisted

I'm creating a program that uses the Twisted module and callbacks.
However, I keep having problems because the asynchronous part goes wrong.
I have learned (also from previous questions...) that the callbacks will be executed at some point, but exactly when is unpredictable.
However, I have a certain program that goes like
j = calc(a)
i = calc2(b)
f = calc3(c)
if s:
    combine(i, j, f)
Now the boolean s is set by a callback of calc3. Obviously, this leads to an undefined-name error because the callback has not executed by the time s is needed.
However, I'm unsure how you SHOULD do if statements with asynchronous programming using Twisted. I've been trying many different things, but can't find anything that works.
Is there some way to use conditionals that require callback values?
Also, I'm using VIFF for secure computations (which uses Twisted): VIFF
Maybe what you're looking for is twisted.internet.defer.gatherResults:
d = gatherResults([calc(a), calc2(b), calc3(c)])

def calculated((j, i, f)):
    if s:
        return combine(i, j, f)

d.addCallback(calculated)
However, this still has the problem that s is undefined. I can't quite tell how you expect s to be defined. If it is a local variable in calc3, then you need to return it so the caller can use it.
Perhaps calc3 looks something like this:
def calc3(argument):
    s = bool(argument % 2)
    return argument + 1
So, instead, consider making it look like this:
from collections import namedtuple

Calc3Result = namedtuple("Calc3Result", "condition value")

def calc3(argument):
    s = bool(argument % 2)
    return Calc3Result(s, argument + 1)
It's sort of unclear what you're asking here. It sounds like you know what callbacks are, but if so then you should be able to arrive at this answer yourself. In any case, you can now rewrite the calling code so it actually works:
d = gatherResults([calc(a), calc2(b), calc3(c)])

def calculated((j, i, calc3result)):
    if calc3result.condition:
        return combine(i, j, calc3result.value)

d.addCallback(calculated)
Or, based on your comment below, maybe calc3 looks more like this (this is the last guess I'm going to make, if it's wrong and you'd like more input, then please actually share the definition of calc3):
def _calc3Result(result, argument):
    if result == "250":
        # SMTP Success response, yay
        return Calc3Result(True, argument)
    # Anything else is bad
    return Calc3Result(False, argument)

def calc3(argument):
    d = emailObserver("The argument was %s" % (argument,))
    d.addCallback(_calc3Result, argument)
    return d
Fortunately, this definition of calc3 will work just fine with the gatherResults / calculated code block immediately above.
You have to put the if in the callback. You can use a Deferred to structure your callbacks.
As stated in the previous answer, the processing logic should be handled in a callback chain; below is a simple code demonstration of how this could work. C{DelayedTask} is a dummy implementation of a task which happens in the future and fires the supplied deferred.
So we first construct a special object, C{ConditionalTask}, which takes care of storing the multiple results and servicing the callbacks.
calc, calc2 and calc3 return deferreds, which have their callbacks pointed to C{ConditionalTask}.x_callback.
Every C{ConditionalTask}.x_callback makes a call to C{ConditionalTask}.process, which checks whether all of the results have been registered and fires on a full set.
Additionally, C{ConditionalTask}.c_callback sets a flag indicating whether or not the data should be processed at all.
from twisted.internet import reactor, defer

class DelayedTask(object):
    """
    Delayed async task dummy implementation
    """
    def __init__(self, delay, deferred, retVal):
        self.deferred = deferred
        self.retVal = retVal
        reactor.callLater(delay, self.on_completed)

    def on_completed(self):
        self.deferred.callback(self.retVal)

class ConditionalTask(object):
    def __init__(self):
        self.resultA = None
        self.resultB = None
        self.resultC = None
        self.should_process = False

    def a_callback(self, result):
        self.resultA = result
        self.process()

    def b_callback(self, result):
        self.resultB = result
        self.process()

    def c_callback(self, result):
        self.resultC = result
        """
        Here is an abstraction for your "s" boolean flag, obviously the logic
        normally would go further than just setting the flag, you could
        inspect the result variable and do other strange stuff
        """
        self.should_process = True
        self.process()

    def process(self):
        if None not in (self.resultA, self.resultB, self.resultC):
            if self.should_process:
                print 'We will now call the processor function and stop reactor'
                reactor.stop()

def calc(a):
    deferred = defer.Deferred()
    DelayedTask(5, deferred, a)
    return deferred

def calc2(a):
    deferred = defer.Deferred()
    DelayedTask(5, deferred, a * 2)
    return deferred

def calc3(a):
    deferred = defer.Deferred()
    DelayedTask(5, deferred, a * 3)
    return deferred

def main():
    conditional_task = ConditionalTask()
    dFA = calc(1)
    dFB = calc2(2)
    dFC = calc3(3)
    dFA.addCallback(conditional_task.a_callback)
    dFB.addCallback(conditional_task.b_callback)
    dFC.addCallback(conditional_task.c_callback)
    reactor.run()

main()

Circular Programming in Python (corecursion) - is it possible?

I know Python has some lazy implementations, and as such, I was wondering if it is possible to use circular programming in Python.
If it isn't, why?
I think you mean co-routines, not co-recursion. Yes, it's perfectly possible in Python, since PEP 342: Coroutines via Enhanced Generators has been implemented.
The canonical example is the consumer decorator:
def consumer(func):
    def wrapper(*args, **kw):
        gen = func(*args, **kw)
        next(gen)
        return gen
    wrapper.__name__ = func.__name__
    wrapper.__dict__ = func.__dict__
    wrapper.__doc__ = func.__doc__
    return wrapper
Using such a consumer then lets you chain filters and push information through them, acting as a pipeline:
import os
from itertools import product

@consumer
def thumbnail_pager(pagesize, thumbsize, destination):
    while True:
        page = new_image(pagesize)
        rows, columns = pagesize / thumbsize
        pending = False
        try:
            for row, column in product(range(rows), range(columns)):
                thumb = create_thumbnail((yield), thumbsize)
                page.write(
                    thumb, column * thumbsize.x, row * thumbsize.y
                )
                pending = True
        except GeneratorExit:
            # close() was called, so flush any pending output
            if pending:
                destination.send(page)
            # then close the downstream consumer, and exit
            destination.close()
            return
        else:
            # we finished a page full of thumbnails, so send it
            # downstream and keep on looping
            destination.send(page)

@consumer
def jpeg_writer(dirname):
    fileno = 1
    while True:
        filename = os.path.join(dirname, "page%04d.jpg" % fileno)
        write_jpeg((yield), filename)
        fileno += 1

# Put them together to make a function that makes thumbnail
# pages from a list of images and other parameters.
#
def write_thumbnails(pagesize, thumbsize, images, output_dir):
    pipeline = thumbnail_pager(
        pagesize, thumbsize, jpeg_writer(output_dir)
    )
    for image in images:
        pipeline.send(image)
    pipeline.close()
The central principles are Python generators and yield expressions; the latter lets a generator receive information from a caller.
Edit: Ah, co-recursion is indeed a different concept. Note that the Wikipedia article uses Python for its examples, and moreover uses Python generators.
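As a small illustration of that flavour (not from the original answer), the classic corecursive example is a generator whose next values are defined from its own earlier output:

from itertools import islice

def fibonacci():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b      # the next values are built from previously produced ones

print(list(islice(fibonacci(), 10)))   # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]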
Did you try it?
def a(x):
    if x == 1:
        return
    print "a", x
    b(x - 1)

def b(x):
    if x == 1:
        return
    print "b", x
    a(x - 1)

a(10)
As a side note, Python does not eliminate tail calls, so this mutual recursion will hit the interpreter's recursion limit for x greater than about 1000 (although this limit is configurable).
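The limit mentioned is CPython's recursion limit, which can be inspected and raised via the sys module:

import sys

print(sys.getrecursionlimit())   # 1000 by default on CPython
sys.setrecursionlimit(5000)      # raise it (at your own risk) before calling a(2000)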

Any tips for a more pythonic way to implement state machine behavior?

For brevity, I'm just showing what can/must occur in states. I haven't run into any oddities in the state machine framework itself.
Here is a specific question:
Do you find it confusing that we have to return StateChange(...) and StateMachineComplete(...), whereas some of the other actions like some_action_1(...) and some_action_2(...) need not be returned - they're just direct method invocations?
I think that StateChange(...) needs to be returned because otherwise the code beyond the StateChange(...) call would be executed, and that isn't how a state machine should work! For example, see the implementation of event1 in ExampleState below.
import abc

class State(metaclass=abc.ABCMeta):
    # =====================================================================
    # == events the state optionally or must implement ====================
    # =====================================================================

    # optional: called when the state becomes active.
    def on_state_entry(self): pass

    # optional: called when we're about to transition away from this state.
    def on_state_exit(self): pass

    @abc.abstractmethod
    def event1(self, x, y, z): pass

    @abc.abstractmethod
    def event2(self, a, b): pass

    @abc.abstractmethod
    def event3(self): pass

    # =====================================================================
    # == actions the state may invoke =====================================
    # =====================================================================
    def some_action_1(self, c, d, e):
        # implementation omitted for brevity
        pass

    def some_action_2(self, f):
        # implementation omitted for brevity
        pass

class StateChange:
    def __init__(self, new_state_type):
        # implementation omitted for brevity
        pass

class StateMachineComplete: pass

class ExampleState(State):
    def on_state_entry(self):
        self.some_action_1("foo", "bar", "baz")

    def event1(self, x, y, z):
        if x == "asdf":
            return StateChange(ExampleState2)
        else:
            return StateChange(ExampleState3)
        print("I think it would be confusing if we ever got here. Therefore the StateChange calls above are returned.")

    def event2(self, a, b):
        if a == "asdf":
            return StateMachineComplete()
        print("As with the event above, the return above makes it clear that we'll never get here.")

    def event3(self):
        # Notice that we're not leaving the state. Therefore this can just be a
        # method call; nothing need be returned.
        self.some_action_1("x", "y", "z")
        # In fact we might need to do a few things here. Therefore a return call
        # again doesn't make sense.
        self.some_action_2("z")

    # Notice we don't implement on_state_exit(). This state doesn't care about that.
When I need a state machine in Python, I store it as a dictionary of functions. The indices into the dictionary are the current states, and the functions do what they need to and return the next state (which may be the same state) and outputs. Turning the crank on the machine is simply:
state, outputs = machine_states[state](inputs)
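A small, hypothetical sketch of that layout (a turnstile, not taken from the answer), where each state function returns the next state and an output:

def locked(inputs):
    if inputs == "coin":
        return "unlocked", "unlock the turnstile"
    return "locked", "ignore"

def unlocked(inputs):
    if inputs == "push":
        return "locked", "lock the turnstile"
    return "unlocked", "ignore"

machine_states = {"locked": locked, "unlocked": unlocked}

state = "locked"
for inputs in ["push", "coin", "push"]:
    state, outputs = machine_states[state](inputs)   # turning the crank
    print(inputs, "->", state, "/", outputs)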
By putting the outgoing state changes in code you're obfuscating the whole process. A state machine should be driven by a simple set of tables. One axis is the current state, and the other is the possible events. You have two or three tables:
The "next-state" table that determines the exit state
The "action" table that determines what action to take
The "read" table that determines whether you stay on the current input event or move on to the next.
The third table may or may not be needed depending on the complexity of the input "grammar".
There are more esoteric variations, but I've never found a need for more than this.
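A hedged sketch of the table layout described above, reusing the same hypothetical turnstile: both tables are keyed by (current state, event), one giving the next state and one the action to take:

next_state = {
    ("locked",   "coin"): "unlocked",
    ("locked",   "push"): "locked",
    ("unlocked", "push"): "locked",
    ("unlocked", "coin"): "unlocked",
}
action = {
    ("locked",   "coin"): "unlock the turnstile",
    ("locked",   "push"): "ignore",
    ("unlocked", "push"): "lock the turnstile",
    ("unlocked", "coin"): "ignore",
}

state = "locked"
for event in ["push", "coin", "push"]:
    print(state, "+", event, "->", action[(state, event)])
    state = next_state[(state, event)]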
I also struggled to find a good state machine solution in Python, so I wrote state_machine.
It works like the following:
# assuming the state_machine package exposes these names
from state_machine import acts_as_state_machine, State, Event, before, after

@acts_as_state_machine
class Person():
    name = 'Billy'

    sleeping = State(initial=True)
    running = State()
    cleaning = State()

    run = Event(from_states=sleeping, to_state=running)
    cleanup = Event(from_states=running, to_state=cleaning)
    sleep = Event(from_states=(running, cleaning), to_state=sleeping)

    @before('sleep')
    def do_one_thing(self):
        print "{} is sleepy".format(self.name)

    @before('sleep')
    def do_another_thing(self):
        print "{} is REALLY sleepy".format(self.name)

    @after('sleep')
    def snore(self):
        print "Zzzzzzzzzzzz"

    @after('sleep')
    def big_snore(self):
        print "Zzzzzzzzzzzzzzzzzzzzzz"

person = Person()
print person.current_state == person.sleeping  # True
print person.is_sleeping                       # True
print person.is_running                        # False
person.run()
print person.is_running                        # True
person.sleep()
# Billy is sleepy
# Billy is REALLY sleepy
# Zzzzzzzzzzzz
# Zzzzzzzzzzzzzzzzzzzzzz
print person.is_sleeping                       # True
