using random to choose a function to run (50/50 chance needed) - python

Code being run (apart from imports and function definitions):
random.choice(frontright(),frontleft())
error raised:
TypeError: Random.choice() takes 2 positional arguments but 3 were given

random.choice expects a sequence to choose from. That's not what you passed it.
Functions are objects, so you can put them in a sequence and then choose from that. But if you add parentheses you aren't passing the functions themselves; you are calling each function and then passing their return values to random.choice.
This code might demonstrate the difference.
import random

def frontright():
    print('frontright was called')
    return 'frontright result'

def frontleft():
    print('frontleft was called')
    return 'frontleft result'

my_functions = (frontright, frontleft)
fn = random.choice(my_functions)
print('calling one function')
fn()

print()
print('collection of results')
my_choice = random.choice((frontright(), frontleft()))
print('chosen result was:', my_choice)
You're actually passing 2 arguments when it expects 1 sequence. Internally, random.choice is an instance method, so once self is counted your 2 arguments become the 3 reported in the error message.
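A minimal sketch of where that extra argument comes from: random.choice is a bound method of a hidden random.Random() instance, so self is counted too (the exact wording of the message varies by Python version):
import random

try:
    # two positional arguments passed, plus the implicit `self` -> three
    random.choice(('heads',), ('tails',))
except TypeError as e:
    print(e)  # e.g. "Random.choice() takes 2 positional arguments but 3 were given"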

You might need to do this random.choice([frontright(), frontleft()])
https://www.w3schools.com/python/ref_random_choice.asp

Related

Python: Decorator that reduces number of parameters in a function by fixing others

Let's say that I have the function
def add(a, b, c):
    return a + b + c
I want a decorator that fixes the value of b, say to 5, and returns a function with only two parameters, a and c.
def add5(a, c):
    return a + c + 5
The function add5 should not have any other parameter. I'm not looking to solve this with a default parameter for b.
You can use functools.partial:
functools.partial(func, /, *args, **keywords)
Return a new partial
object which when called will behave like func called with the
positional arguments args and keyword arguments keywords.
from functools import partial

def add(a, b, c):
    return a + b + c
If you want to give a fixed value to the first positional argument, you can do
add5 = partial(add, 5)
print(add5(1, 2))
# 8
As the first positional argument (a) will be replaced by 5, you can't do:
print(add5(a=3, b=4))
# TypeError: add() got multiple values for argument 'a'
If you want to control which parameter you fix, use keyword arguments:
add5 = partial(add, b=5)
print(add5(a=1, c=2))
# 8
In Python, functions are first-class objects, which means that:
Functions are objects; they can be referenced, assigned to variables, and returned from other functions.
Functions can be defined inside another function and can also be passed as an argument to another function.
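A tiny illustration of that (the names here are only for demonstration):
def shout(text):
    return text.upper()

speak = shout                 # reference the function object itself, no parentheses
def apply_fn(fn, value):      # a function passed as an argument to another function
    return fn(value)

print(apply_fn(speak, 'hello'))  # HELLO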
Decorators are a very powerful and useful tool in Python, since they allow programmers to modify the behavior of a function or class. Decorators let us wrap another function in order to extend the behavior of the wrapped function without permanently modifying it.
With decorators, a function is taken as an argument to another function and then called inside the wrapper function.
In your case:
def my_custom_decorator(f):
    def outer_function(*args):
        res = f(*args)
        return res + 5
    return outer_function

@my_custom_decorator
def A_and_C(a, c):
    return a + c

print(A_and_C(2, 3))
You can do it like this:
def add5(*args):
    return sum(args) + 5

print(add5(1, 2))
This will sum all the arguments that you pass to the function and add 5 to the total.
Output
8

Can I implement a function or better a decorator that makes func(a1)(a2)(a3)...(an) == func(a1, a2, a3,...,an)? [duplicate]

On Codewars.com I encountered the following task:
Create a function add that adds numbers together when called in succession. So add(1) should return 1, add(1)(2) should return 1+2, ...
While I'm familiar with the basics of Python, I've never encountered a function that is able to be called in such succession, i.e. a function f(x) that can be called as f(x)(y)(z).... Thus far, I'm not even sure how to interpret this notation.
As a mathematician, I'd suspect that f(x)(y) is a function that assigns to every x a function g_{x} and then returns g_{x}(y) and likewise for f(x)(y)(z).
Should this interpretation be correct, Python would allow me to dynamically create functions which seems very interesting to me. I've searched the web for the past hour, but wasn't able to find a lead in the right direction. Since I don't know how this programming concept is called, however, this may not be too surprising.
How do you call this concept and where can I read more about it?
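As a rough sketch, that interpretation corresponds to a function that returns an inner function (a closure), which is then called with the next argument:
def f(x):
    def g(y):          # g "remembers" x from the enclosing call
        return x + y
    return g

print(f(1)(2))  # 3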
I don't know whether this is function chaining so much as it is callable chaining, but, since functions are callables, I guess there's no harm done. Either way, there are two ways I can think of doing this:
Sub-classing int and defining __call__:
The first way would be with a custom int subclass that defines __call__ which returns a new instance of itself with the updated value:
class CustomInt(int):
    def __call__(self, v):
        return CustomInt(self + v)
Function add can now be defined to return a CustomInt instance, which, as a callable that returns an updated value of itself, can be called in succession:
>>> def add(v):
...     return CustomInt(v)
...
>>> add(1)
1
>>> add(1)(2)
3
>>> add(1)(2)(3)(44)  # and so on..
50
In addition, as an int subclass, the returned value retains the __repr__ and __str__ behavior of ints. For more complex operations though, you should define other dunders appropriately.
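For instance, a sketch of what defining another dunder might look like if you also wanted ordinary + to keep returning the chainable type (purely illustrative):
class CustomInt(int):
    def __call__(self, v):
        return CustomInt(self + v)

    def __add__(self, other):
        # int.__add__ would return a plain int, which is no longer callable
        return CustomInt(int(self) + int(other))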
As @Caridorc noted in a comment, add could also be simply written as:
add = CustomInt
Renaming the class to add instead of CustomInt also works similarly.
Define a closure, requires extra call to yield value:
The only other way I can think of involves a nested function that requires an extra empty argument call in order to return the result. I'm not using nonlocal and opt for attaching attributes to the function objects to make it portable between Pythons:
def add(v):
    def _inner_adder(val=None):
        """
        if val is None we return _inner_adder.v
        else we increment and return ourselves
        """
        if val is None:
            return _inner_adder.v
        _inner_adder.v += val
        return _inner_adder
    _inner_adder.v = v  # save value
    return _inner_adder
This continuously returns itself (_inner_adder) which, if a val is supplied, increments it (_inner_adder.v += val) and, if not, returns the value as it is. As I mentioned, it requires an extra () call in order to return the incremented value:
>>> add(1)(2)()
3
>>> add(1)(2)(3)() # and so on..
6
You can hate me, but here is a one-liner :)
add = lambda v: type("", (int,), {"__call__": lambda self, v: self.__class__(self + v)})(v)
Edit: OK, so how does this work? The code is identical to the answer from @Jim, but everything happens on a single line.
type can be used to construct new types: type(name, bases, dict) -> a new type. For name we provide an empty string, as a name is not really needed in this case. For bases (a tuple) we provide (int,), which is identical to inheriting from int. dict holds the class attributes, where we attach the __call__ lambda.
self.__class__(self + v) is identical to CustomInt(self + v).
The new type is constructed and returned within the outer lambda.
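Unrolled into ordinary statements, the same construction looks roughly like this (a sketch):
def add(v):
    cls = type("", (int,), {
        "__call__": lambda self, v: self.__class__(self + v),
    })
    return cls(v)

print(add(1)(2)(3))  # 6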
If you want to define a function that can be called multiple times in succession, you need to return a callable object each time (for example a function); otherwise you have to create your own object that defines __call__ in order for it to be callable.
The next point is that you need to preserve all the arguments, which in this case means you might want to use coroutines or a recursive function. But note that coroutines are much more optimized/flexible than recursive functions, especially for such tasks.
Here is a sample function using coroutines that preserves its latest state. Note that it can't be called multiple times, since the return value is an integer, which is not callable, but you might think about turning this into your expected object ;-).
def add():
    current = yield
    while True:
        value = yield current
        current = value + current

it = add()
next(it)
print(it.send(10))
print(it.send(2))
print(it.send(4))
10
12
16
Simply:
class add(int):
    def __call__(self, n):
        return add(self + n)
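For example:
>>> add(1)(2)
3
>>> add(2)(3)(5)
10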
If you are willing to accept an additional () in order to retrieve the result you can use functools.partial:
from functools import partial
def add(*args, result=0):
    return partial(add, result=sum(args) + result) if args else result
For example:
>>> add(1)
functools.partial(<function add at 0x7ffbcf3ff430>, result=1)
>>> add(1)(2)
functools.partial(<function add at 0x7ffbcf3ff430>, result=3)
>>> add(1)(2)()
3
This also allows specifying multiple numbers at once:
>>> add(1, 2, 3)(4, 5)(6)()
21
If you want to restrict it to a single number you can do the following:
def add(x=None, *, result=0):
    return partial(add, result=x + result) if x is not None else result
If you want add(x)(y)(z) to readily return the result and be further callable then sub-classing int is the way to go.
The pythonic way to do this would be to use dynamic arguments:
def add(*args):
    return sum(args)
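For example:
>>> add(1, 2, 3)
6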
This is not the answer you're looking for, and you may know this, but I thought I would give it anyway: if someone is wondering about doing this for work rather than out of curiosity, they should probably have the "right thing to do" answer.

Getting a random number each time in a script

I'm having trouble with the following bit of code:
from random import randint

class character():
    def __init__(self):
        # init stuff here
        pass

    def luck(self, Luck=randint(0, 3)):
        return Luck
I have to call this method multiple times in my script, for multiple instances, to get a different number each time. The problem I'm having is that whenever I call this method, no matter from which instance, I always get the same result. For example, in the following code:
Foo = character()
Bar = character()
for foobar in range(3):
    print(Foo.luck(), Bar.luck())
I'd get as my output:
1 1
1 1
1 1
By the way, in my code I used randint(0, 3) as the default argument for the luck() method because, in some specific situations, I'd like to assign values to it myself.
Back to the point, how can I get a different number each time?
This is a definition for the luck function. If the user specifies a number it will be returned. If instead no argument is given, Luck will be set from randint and that random value returned.
def luck(self, Luck=None):
    if Luck is None:
        Luck = randint(0, 3)
    return Luck
In Python, the expressions that provide default values for function arguments are only evaluated once, when the function is defined. This means that once you define the luck method, whatever value randint() spat out at definition time sticks around for all invocations.
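A minimal, standalone demonstration of that pitfall (outside the class for brevity):
from random import randint

def luck(Luck=randint(0, 3)):  # randint runs once, when `def` is executed
    return Luck

print(luck(), luck(), luck())  # prints the same number three times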
To get a new random number every time the method is called, you need to call it inside the body of the method:
class Character(object):
    @staticmethod  # required if you're not accepting `self`
    def luck():
        return randint(0, 3)
This will work as expected.
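For example, re-running the loop from the question against this class now prints varying values (the output shown is just one possible run):
Foo = Character()
Bar = Character()
for foobar in range(3):
    print(Foo.luck(), Bar.luck())
# e.g.
# 2 0
# 1 3
# 0 2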
You can use None (or some other sentinel) as the default argument, check whether you got None, and if you did, call randint(); if not, just return what you were given.
If you use randint() in the function declaration it will only be evaluated once.

Handling a function argument with a decorator

At the core, what I'm trying to do is take a number of functions that look like this undecorated validation function:
def f(k: bool):
    def g(n):
        # check that n is valid
        return n
    return g
And make them look like this decorated validation function:
@k
def f():
    def g(n):
        # check that n is valid
        return n
    return g
The idea here being that k is describing the same functionality across all of the implementing functions.
Specifically, these functions are all returning 'validation' functions for use with the voluptuous validation framework. So all the functions of type f() are returning a function that is later executed by Schema(). k is actually allow_none, which is to say a flag that determines if a None value is ok. A very simple example might be this sample use code:
x = "Some input value."
y = None
input_validator = Schema(f(allow_none=True))
x = input_validator(x) # succeeds, returning x
y = input_validator(y) # succeeds, returning None
input_validator_no_none = Schema(f(allow_none=False))
x = input_validator_no_none(x) # succeeds, returning x
y = input_validator_no_none(y) # raises an Invalid
Without changing the sample use code I am attempting to achieve the same result by changing the undecorated validation functions to decorated validation functions. To give a concrete example, changing this:
def valid_identifier(allow_none: bool=True):
    min_range = Range(min=1)
    validator = Any(All(int, min_range), All(Coerce(int), min_range))
    return Any(validator, None) if allow_none else validator
To this:
@allow_none(default=True)
def valid_identifier():
    min_range = Range(min=1)
    return Any(All(int, min_range), All(Coerce(int), min_range))
The function returned from these two should be equivalent.
What I've tried to write is this, utilizing the decorator library:
from decorator import decorator

@decorator
def allow_none(default: bool=True):
    def decorate_validator(wrapped_validator, allow_none: bool=default):
        @wraps(wrapped_validator)
        def validator_allowing_none(*args, **kwargs):
            if allow_none:
                return Any(None, wrapped_validator)
            else:
                return wrapped_validator(*args, **kwargs)
        return validator_allowing_none
    return decorate_validator
And I have a unittest.TestCase in order to test if this works as expected:
@allow_none()
def test_wrapped_func():
    return Schema(str)

class TestAllowNone(unittest.TestCase):
    def test_allow_none__success(self):
        test_string = "blah"
        validation_function = test_wrapped_func(allow_none=False)
        self.assertEqual(test_string, validation_function(test_string))
        self.assertEqual(None, validation_function(None))
But my test returns the following failure:
    def validate_callable(path, data):
        try:
>           return schema(data)
E           TypeError: test_wrapped_func() takes 0 positional arguments but 1 was given
I tried debugging this, but couldn't get the debugger to actually enter the decoration. I suspect that because of naming issues, such as those raised in this (very lengthy) blog post series, test_wrapped_func isn't getting its argument list properly set, and so the decorator is never even executed, but it may also be something else entirely.
I tried some other variations. By removing the function parentheses from #allow_none:
@allow_none
def test_wrapped_func():
    return Schema(str)
I get a different error:
> validation_function = test_wrapped_func(allow_none=False)
E TypeError: test_wrapped_func() got an unexpected keyword argument 'allow_none'
Dropping the #decorator fails with:
> validation_function = test_wrapped_func(allow_none=False)
E TypeError: decorate_validator() missing 1 required positional argument: 'wrapped_validator'
Which makes sense, because @allow_none takes an argument, and so the parentheses would logically be needed. Replacing them gives the original error.
Decorators are subtle, and I'm clearly missing something here. This is similar to currying a function, but it's not quite working. What am I missing about how this should be implemented?
I think you are putting your allow_none=default argument at the wrong nesting level. It should be on the innermost function (the wrapper), rather than the decorator (the middle level).
Try something like this:
from functools import wraps

def allow_none(default=True):  # this is the decorator factory
    def decorator(validator):  # this is the decorator
        @wraps(validator)
        def wrapper(*args, allow_none=default, **kwargs):  # this is the wrapper
            if allow_none:
                return Any(None, validator)
            else:
                return validator(*args, **kwargs)
        return wrapper
    return decorator
If you don't need the default to be settable, you can get rid of the outermost layer of nesting and just make the default value a constant in the wrapper function (or omit it if your callers will always pass a value). Note that as I wrote it above, the allow_none argument to the wrapper is a keyword-only argument. If you want to pass it as a positional parameter, you can move it ahead of *args, but that requires that it be the first positional argument, which may not be desirable from an API standpoint. More sophisticated solutions are probably possible, but overkill for this answer.

error passing a class method as an argument to another class method

I'm attempting to pass a class method as an argument to another class method. Below is an example...
import time

class MyClass(object):
    def doSomething(self, argument2, argument3):
        print argument2, argument3

    def attemptTenTimes(self, fun, *args):
        attempt = 0
        while True:
            try:
                print 'Number of arguments: %s' % len(*args)
                print args
                output = fun(*args)
                return output
            except Exception as e:
                print 'Exception: %s' % e
                attempt += 1
                time.sleep(10)
                if attempt >= 10: return
                else: continue
MC = MyClass()
MC.attemptTenTimes(MC.doSomething,(MC,'argument2','argument3',))
The output is....
Number of arguments: 3
((<__main__.MyClass object at 0x7f7e6be4e390>, 'argument2', 'argument3'),)
Exception: doSomething() takes exactly 3 arguments (2 given)
Number of arguments: 3
((<__main__.MyClass object at 0x7f7e6be4e390>, 'argument2', 'argument3'),)
Exception: doSomething() takes exactly 3 arguments (2 given)
Number of arguments: 3
((<__main__.MyClass object at 0x7f7e6be4e390>, 'argument2', 'argument3'),)
Exception: doSomething() takes exactly 3 arguments (2 given).............
I am passing three arguments to the function doSomething, however, this exception keeps coming up. I've used functions as arguments to other functions before, but this is my first time doing it within the context of a class. Any help would be appreciated. Thanks.
You've not passed three arguments; you passed two. You need this:
MC.attemptTenTimes(MC.doSomething,*('argument2','argument3'))
or this (equivalent):
MC.attemptTenTimes(MC.doSomething,'argument2','argument3')
The attemptTenTimes function has the parameter *args, which collects positional arguments into a tuple referred to locally as args. You're passing it the whole tuple as the only positional argument, so locally you have a variable named args that looks like ((MC,'argument2','argument3'),). As a result, when you unpack it and pass it to your function, you're just passing the inner tuple.
As an aside, you also shouldn't be unpacking args when you pass it to len, because that'll throw an error. You just want len(args) on line 12 up there.
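A quick illustration of how *args packs positional arguments into a tuple (a minimal sketch, kept Python 2 compatible like the question):
def show(*args):
    print('%d %r' % (len(args), args))

show('argument2', 'argument3')      # 2 ('argument2', 'argument3')
show(('argument2', 'argument3'))    # 1 (('argument2', 'argument3'),)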
Alternately, you could change your attemptTenTimes function signature to this:
def attemptTenTimes(self,fun,args):
You could then pass the whole args tuple to it, as you were originally doing. I believe using *args is more standard, though, and personally I think it's clearer.
