Change built-in function print in Python

I'm trying to change Python's built-in print function.
The reason I'm trying to do this is that my application takes a verbose flag from sys.argv, and I want print itself to decide whether or not to write the message to the console, depending on whether verbose is True or False.
I tried creating a new function, but I get a recursion error:
>>> import builtins
>>> def new_print(*args, **kwargs):
...     print('print:', *args, **kwargs)
...
>>> old_print = builtins.print
>>> old_print(1)
1
>>> builtins.print = new_print
>>> print(1)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 2, in new_print
File "<stdin>", line 2, in new_print
File "<stdin>", line 2, in new_print
[Previous line repeated 996 more times]
RecursionError: maximum recursion depth exceeded
I've tried using sys.stdout():
>>> import builtins
>>> import sys
>>> def new_print(*args, **kwargs):
...     sys.stdout(*args, **kwargs)
...
>>> old_print = builtins.print
>>> old_print(1)
1
>>> builtins.print = new_print
>>> print(1
... )
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 2, in new_print
TypeError: '_io.TextIOWrapper' object is not callable
Neither of those options worked properly.
I need the new print function to be accessible from all of my module files without having to import it every time; that's why I'm trying to change the built-in. I'm just not sure whether changing it in my __init__.py file will make a difference for my other files.
If you have any idea what could help, please leave it below.

You almost had it. Call old_print in your new function:
def new_print(*args, **kwargs):
    old_print('print:', *args, **kwargs)

old_print = print
print = new_print
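Note that assigning to the bare name print like this only rebinds it in the module where the assignment runs. To make the replacement visible to every module without importing anything, assign to builtins.print (as in your first attempt), keeping a reference to the original so you don't recurse. A minimal sketch, assuming a hypothetical --verbose command-line flag:
import builtins
import sys

VERBOSE = '--verbose' in sys.argv  # hypothetical flag name

_original_print = builtins.print

def verbose_print(*args, **kwargs):
    # Only write to the console when the verbose flag is set.
    if VERBOSE:
        _original_print(*args, **kwargs)

# Every module that calls print() now goes through verbose_print,
# because the name print falls back to builtins when it isn't defined locally.
builtins.print = verbose_print
Run this once, e.g. in your package's __init__.py, before the other modules start printing.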

Related

How to use the post_mortem method of pdb?

I am trying to understand how to use the pdb.post_mortem() method.
For this given file:
# expdb.py
import pdb
import trace
def hello():
    a = 6 * 9
    b = 7 ** 2
    c = a * b
    d = 4 / 0
    print(c)
tracer = trace.Trace()
Command prompt
'''
# first Try
λ python -i expdb.py
>>> pdb.post_mortem()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Program Files\Anaconda3\lib\pdb.py", line 1590, in post_mortem
raise ValueError("A valid traceback must be passed if no "
ValueError: A valid traceback must be passed if no exception is being handled
'''
'''
# Second Try
λ python -i expdb.py
>>> pdb.post_mortem(traceback=tracer.run('hello()') )
--- modulename: trace, funcname: _unsettrace
trace.py(77): sys.settrace(None)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Program Files\Anaconda3\lib\trace.py", line 500, in run
self.runctx(cmd, dict, dict)
File "C:\Program Files\Anaconda3\lib\trace.py", line 508, in runctx
exec(cmd, globals, locals)
File "<string>", line 1, in <module>
File "expdb.py", line 8, in hello
d = 4 / 0
ZeroDivisionError: division by zero
>>>
The post_mortem method wants a traceback object, not a Trace object. Traceback objects can be acquired from sys.exc_info()[2] inside of an except block, or you can simply call pdb.post_mortem() with no arguments directly (in the except block).
But either way, you must catch the exception before you can debug it.
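For example, with the hello function from expdb.py, one way in (a quick sketch) is to catch the exception yourself and start the post-mortem debugger from the except block:
import pdb

try:
    hello()                 # raises ZeroDivisionError on d = 4 / 0
except ZeroDivisionError:
    pdb.post_mortem()       # debugs the exception currently being handled
In an interactive session you can also let the exception propagate and then call pdb.pm(), which runs a post-mortem on the last traceback (sys.last_traceback).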

Error in function: 'str' object is not an iterator

I have a problem with the following functions in Python (swap is a function that I created previously and that works fine):
def swap(cards):
    """
    >>> swap('FBFFFBFFBF')
    'BFBBBFBBFB'
    >>> swap('BFFBFBFFFBFBBBFBBBBFF')
    'FBBFBFBBBFBFFFBFFFFBB'
    >>> swap('FFBFBFBFBFBFBFBBFBFBFBFBBFBFBBFBF')
    'BBFBFBFBFBFBFBFFBFBFBFBFFBFBFFBFB'
    """
    invert = ""
    for i in cards:
        if i is "B":
            invert += "F"
        else:
            invert += "B"
    return (invert)

def swap2(cards):
    """
    >>> next('FBFFFBFFBF')
    'FFBBBFBBFF'
    >>> next('BFFBFBFFFBFBBBFBBBBFF')
    'FBBFBFBBBFBFFFBFFFFFF'
    >>> next('FFBFBFBFBFBFBFBBFBFBFBFBBFBFBBFBF')
    'FFFBFBFBFBFBFBFFBFBFBFBFFBFBFFBFF'
    """
    indices = ""
    for pos, i in enumerate(cards):
        if i == "B":
            indices += str(pos)
    first = int(indices[0])
    last = int(indices[-1])
    prefix = cards[:first]
    middle = cards[first:last+1]
    suffix = cards[last+1:]
    middle2 = swap(middle)
    return (prefix + middle2 + suffix)

def turns(cards):
    """
    >>> turns('FBFFFBFFBF')
    3
    >>> turns('BFFBFBFFFBFBBBFBBBBFF')
    6
    >>> turns('FFBFBFBFBFBFBFBBFBFBFBFBBFBFBBFBF')
    14
    """
    turn = 0
    while cards != 'F' * len(cards):
        cards = swap2(cards)
        turn += 1
    return (turn)

if __name__ == '__main__':
    import doctest
    doctest.testmod()
When I run these functions they work fine, but if I use doctest to check for mistakes it tells me:
TypeError: 'str' object is not an iterator
I don't know where this error comes from.
Can anyone help me?
Complete output of the doctest:
File "C:\Users\manuel\Documents\Gent MaStat\programming and algorithms\workspace_python\homeworks\Week 5\looking_up.py", line 25, in __main__.swap2
Failed example:
next('FBFFFBFFBF')
Exception raised:
Traceback (most recent call last):
File "C:\Users\manuel\Anaconda3\lib\doctest.py", line 1321, in __run
compileflags, 1), test.globs)
File "<doctest __main__.swap2[0]>", line 1, in <module>
next('FBFFFBFFBF')
TypeError: 'str' object is not an iterator
**********************************************************************
File "C:\Users\manuel\Documents\Gent MaStat\programming and algorithms\workspace_python\homeworks\Week 5\looking_up.py", line 27, in __main__.swap2
Failed example:
next('BFFBFBFFFBFBBBFBBBBFF')
Exception raised:
Traceback (most recent call last):
File "C:\Users\manuel\Anaconda3\lib\doctest.py", line 1321, in __run
compileflags, 1), test.globs)
File "<doctest __main__.swap2[1]>", line 1, in <module>
next('BFFBFBFFFBFBBBFBBBBFF')
TypeError: 'str' object is not an iterator
**********************************************************************
File "C:\Users\manuel\Documents\Gent MaStat\programming and algorithms\workspace_python\homeworks\Week 5\looking_up.py", line 29, in __main__.swap2
Failed example:
next('FFBFBFBFBFBFBFBBFBFBFBFBBFBFBBFBF')
Exception raised:
Traceback (most recent call last):
File "C:\Users\manuel\Anaconda3\lib\doctest.py", line 1321, in __run
compileflags, 1), test.globs)
File "<doctest __main__.swap2[2]>", line 1, in <module>
next('FFBFBFBFBFBFBFBBFBFBFBFBBFBFBBFBF')
TypeError: 'str' object is not an iterator
def swap2(cards):
    """
    >>> next('FBFFFBFFBF')
    'FFBBBFBBFF'
    >>> next('BFFBFBFFFBFBBBFBBBBFF')
    'FBBFBFBBBFBFFFBFFFFFF'
    >>> next('FFBFBFBFBFBFBFBBFBFBFBFBBFBFBBFBF')
    'FFFBFBFBFBFBFBFFBFBFBFBFFBFBFFBFF'
    """
    # …
The function is called swap2, but within the doctests you are calling next, which happens to be a built-in function that does something completely different. That's why you are seeing that error.
At times like this, it's really important to actually read the error message. It clearly tells you what was called:
File "<doctest __main__.swap2[0]>", line 1, in <module>
next('FBFFFBFFBF')
So if you don't know where that call came from, look at the error message. Doctest tells you what it is executing: swap2[0], swap2[1], etc. name the function whose docstring is being run and which test case it is (0 is the first, 1 the second, and so on). It even gives you the line number (within the doctest) where the error appeared, and of course the line that caused it. Use that information to go to the problematic code and figure out what the problem is.
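In other words, the docstring should call the function it documents. A minimal sketch of the fix (whether the expected outputs then match is a separate question about the function's logic):
def swap2(cards):
    """
    >>> swap2('FBFFFBFFBF')
    'FFBBBFBBFF'
    """
    # ... body unchanged ...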

How can I check call arguments if they will change, with unittest.mock?

One of my classes accumulates values in a list, uses the list as an argument to a method on another object, and then deletes some of the values from the list. Something like:
element = element_source.get()
self.elements.append(element)
element_destination.send(self.elements)
self.remove_outdated_elements()
But when I was trying to test this behavior, I found that mocks don't copy their arguments.
>>> from unittest.mock import Mock
>>> m = Mock()
>>> a = [1]
>>> m(a)
<Mock name='mock()' id='139717658759824'>
>>> m.call_args
call([1])
>>> a.pop()
1
>>> m.assert_called_once_with([1])
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python3.3/unittest/mock.py", line 737, in assert_called_once_with
return self.assert_called_with(*args, **kwargs)
File "/usr/lib/python3.3/unittest/mock.py", line 726, in assert_called_with
raise AssertionError(msg)
AssertionError: Expected call: mock([1])
Actual call: mock([])
Is there a way to make Mock copy its call arguments? If not, what is the best way to test this kind of behavior?
There is a chapter "Coping with mutable arguments" in the documentation, which suggests several solutions to your problem.
I'd go with this one:
>>> from copy import deepcopy
>>> from unittest.mock import MagicMock
>>> class CopyingMock(MagicMock):
...     def __call__(self, *args, **kwargs):
...         args = deepcopy(args)
...         kwargs = deepcopy(kwargs)
...         return super(CopyingMock, self).__call__(*args, **kwargs)
...
...
>>> c = CopyingMock(return_value=None)
>>> arg = set()
>>> c(arg)
>>> arg.add(1)
>>> c.assert_called_with(set())
>>> c.assert_called_with(arg)
Traceback (most recent call last):
...
AssertionError: Expected call: mock(set([1]))
Actual call: mock(set([]))
>>> c.foo
<CopyingMock name='mock.foo' id='...'>
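Applied to your scenario, you could give the destination mock a CopyingMock (the class defined above) for its send method, so the argument list is snapshotted at call time. An illustrative sketch:
from unittest.mock import MagicMock

destination = MagicMock()
destination.send = CopyingMock()   # deep-copies arguments when called

elements = []
elements.append(1)
destination.send(elements)         # what the code under test would do
elements.pop()                     # later mutation no longer matters

destination.send.assert_called_once_with([1])   # passes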

Accessing instance variables using index

I have a class which keeps track of errors encountered during a search operation
class SearchError(object):
    def __init__(self, severity=0, message=''):
        self.severity = severity
        self.message = message
My idea is to make the instance variables indexable.
So if I have
a=SearchError(1,"Fatal Error")
I get
>>> a[0]
1
>>> a[1]
'Fatal Error'
>>> a.severity
1
>>> a.message
'Fatal Error'
To do this I add a __getitem__ method to the class. The class now becomes
class SearchError(object):
    def __init__(self, severity=0, message=''):
        self.severity = severity
        self.message = message

    def __getitem__(self, val):
        if isinstance(val, slice):
            return [self.__getitem__(i) for i in xrange(val.start, val.stop, val.step)]
        elif val == 0:
            return self.severity
        elif val == 1:
            return self.message
        else:
            raise IndexError
This does what I want but fails in cases such as
>>> a[:2]
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 23, in __getitem__
TypeError: an integer is required
Or even
>>> a[-1]
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 29, in __getitem__
IndexError
I understand my implementation of __getitem__ is limited. What I need to know is:
Is this the way to make instance variables indexable (without using a list as a variable container)?
How do I make the object behave 'sanely' as far as indexing goes?
This does everything:
>>> from collections import namedtuple
>>> _SearchError = namedtuple("SearchError", "severity message")
>>> def SearchError(severity=0, message=''):
...     return _SearchError(severity, message)
xrange requires all its arguments to be integers, but slice objects have None for unspecified attributes.
The best way to implement what you're after is to use namedtuple:
from collections import namedtuple

class SearchError(namedtuple('SearchError', 'severity message')):
    def __new__(cls, severity=0, message=''):
        return super(SearchError, cls).__new__(cls, severity, message)
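With the namedtuple-based class, indexing, negative indices, and slicing come for free; a quick illustration of what I'd expect:
>>> a = SearchError(1, "Fatal Error")
>>> a.severity, a.message
(1, 'Fatal Error')
>>> a[0], a[1], a[-1]
(1, 'Fatal Error', 'Fatal Error')
>>> a[:2]
(1, 'Fatal Error')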
The problem here is that slice objects default to having None values as attributes. So, a[:2] passes in slice(None,2,None). When you break this apart and try to pass it to xrange, you'll get a TypeError:
>>> xrange(None,2,None)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: an integer is required
Try a[0:2:1] and your code will work. Ultimately, you could do something like:
start = 0 if val.start is None else val.start
stop = 2 if val.stop is None else val.stop
stop = 2 + stop if stop < 0 else stop
step = 1 if val.step is None else val.step
to unravel your slices into usable indices. Note that slice attributes are read-only, so you have to unpack them into local variables rather than assign back to the slice. (In the general case, it'd be better to use len(self) instead of 2, but I don't know if your object defines __len__.)
Or, even better:
start, stop, step = val.indices(len(self))
Similarly, in the case where you do a[-1], you're not passing in a slice, a 0 or a 1, so you hit the else clause, where you raise an IndexError.
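Putting those pieces together, here is one sketch of a __getitem__ that handles slices and negative indices for the two-attribute class (sticking with plain attributes rather than namedtuple):
class SearchError(object):
    def __init__(self, severity=0, message=''):
        self.severity = severity
        self.message = message

    def __len__(self):
        return 2

    def __getitem__(self, val):
        if isinstance(val, slice):
            # slice.indices() resolves None and negative values against len(self)
            return [self[i] for i in range(*val.indices(len(self)))]
        if val < 0:
            val += len(self)          # map a[-1] to a[1], a[-2] to a[0]
        if val == 0:
            return self.severity
        elif val == 1:
            return self.message
        raise IndexError("SearchError index out of range")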
I mucked around with the code and found the following solution.
It uses a list, but only to store the names of the variables, not the actual values. It also provides an add method to add a new variable with a given name and value; the new variable will also be indexable. (The add function is not needed by my class, but is nice to have around.)
Thanks to @mgilson for nudging me in this direction.
class SearchError(object):
    def __init__(self, severity=0, message=''):
        self.severity = severity
        self.message = message
        self._length = 2
        self._vars_list = ['severity', 'message']

    def __getitem__(self, val):
        if isinstance(val, slice):
            steps = val.indices(self._length)
            return [self.__getitem__(i) for i in xrange(*steps)]
        elif val < 0:
            i = self._length + val
            if i < 0:
                raise IndexError("Index Out of range for SearchError object")
            else:
                return self.__getitem__(i)
        else:
            try:
                return getattr(self, self._vars_list[val])
            except IndexError:
                raise IndexError("Index Out of range for SearchError object")

    def add(self, var_name, value):
        self._vars_list.append(var_name)
        self._length += 1
        setattr(self, var_name, value)
The results
>>> a=SearchError(1,"Fatal Error")
>>> a[-1]
'Fatal Error'
>>> a[-2]
1
>>> a[-3]
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 14, in __getitem__
IndexError: Index Out of range for SearchError object
>>> a[2]
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 21, in __getitem__
IndexError: Index Out of range for SearchError object
>>> a[1]
'Fatal Error'
>>> a[0]
1
>>> a[:]
[1, 'Fatal Error']
>>> a.add('new_severity',8)
>>> a[:]
[1, 'Fatal Error', 8]
>>> a[3]
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<stdin>", line 21, in __getitem__
IndexError: Index Out of range for SearchError object
>>> a[2]
8
>>> a.new_severity
8
>>> a[:3]
[1, 'Fatal Error', 8]
>>> a[:4]
[1, 'Fatal Error', 8]
>>> a[:2]
[1, 'Fatal Error']
As far as I can see, you need a list (to store either the actual variables or their names). If someone has a better alternative, please do post it.

Can I create a function using the 'function' class?

I would like to know, just for fun, whether I can create functions using the function class constructor, i.e. without the def language construct, just like creating a class by instantiating the type object. I know the function constructor takes two arguments, a code object and a globals dict, but I don't know how I should compile the source properly.
>>> def f():
...     pass
>>> Function = type(f)
>>> Function
<class 'function'>
>>> code = compile("x + 10", "<string>", "exec")
>>> f = Function(code, globals())
>>> f()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "<string>", line 1, in <module>
NameError: name 'x' is not defined
>>> f(20)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
TypeError: <module>() takes 0 positional arguments but 1 was given
You need to set many attributes on the code object, such as co_varnames, co_nlocals, etc.
What clearly works is
code = compile("def foo(n):return n+10", "<string>", "exec").co_consts[0]
func = Function(code, globals())
but I guess this would be considered cheating. To really define the code object from scratch, do (for 3.3)
import types

code = types.CodeType(1, 0, 1, 2, 67, b'|\x00\x00d\x01\x00\x17S', (None, 10),
                      (), ('x',), '<string>', 'f', 1, b'\x00\x01')
func = Function(code, globals())
print(func(10))
This, of course, requires you to do the entire compile() yourself.
Well, this works:
>>> x = 0
>>> def f(): pass
...
>>> func = type(f)
>>> code = compile("global x\nx += 10","<string>","exec")
>>> nf = func(code,globals())
>>> nf()
>>> x
10
Don't know how you'd pass arguments to the function though.
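One way to get arguments in, building on the co_consts trick from the other answer: compile a def that takes a parameter and pull its code object out of the module's constants (a sketch):
import types

src = "def add10(x):\n    return x + 10\n"
module_code = compile(src, "<string>", "exec")
func_code = module_code.co_consts[0]        # here, the code object for add10

add10 = types.FunctionType(func_code, globals())
print(add10(20))                            # 30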
