I'd like to write a function similar to open. I'd like to be able to call it with the with statement, but also without it.
When I use contextlib.contextmanager, it makes my function work fine with with:
from contextlib import contextmanager

@contextmanager
def versioned(file_path, mode):
    version = calculate_version(file_path, mode)
    versioned_file = open(file_path, mode)
    yield versioned_file
    versioned_file.close()
So, I use it like this:
with versioned('file.txt', 'r') as versioned_file:
    versioned_file.write(...)
How do I use it without with:
versioned_file = versioned('file.txt', 'r')
versioned_file.write(...)
versioned_file.close()
It complains:
AttributeError: 'GeneratorContextManager' object has no attribute 'write'
The problem is that contextmanager provides exactly that: a context manager to be used in the with statement. Calling the function does not return the file object, but a special context-manager object which provides the __enter__ and __exit__ methods. If you want both the with statement and "normal" assignment to work, then your function has to return an object that is fully usable on its own and also provides the context methods.
You can do this pretty easily by creating your own type, and manually providing the context functions:
class MyOpener:
    def __init__(self, filename):
        print('Opening {}'.format(filename))
    def close(self):
        print('Closing file.')
    def write(self, text):
        print('Writing "{}"'.format(text))
    def __enter__(self):
        return self
    def __exit__(self, exc_type, exc_value, traceback):
        self.close()
>>> f = MyOpener('file')
Opening file
>>> f.write('foo')
Writing "foo"
>>> f.close()
Closing file.
>>> with MyOpener('file') as f:
...     f.write('foo')
...
Opening file
Writing "foo"
Closing file.
You have this:
@contextmanager
def versioned(file_path, mode):
    # some setup code
    yield versioned_file
    # some teardown code
Your basic problem of course is that what you yield from the context manager comes out of the with statement via as, but is not the object returned by your function. You want a function that returns something that behaves like the object open() returns. That is to say, a context manager object that yields itself.
Whether you can do that depends on what you can do with the type of versioned_file. If you can't change it, then you're basically out of luck. If you can change it, then you need to implement the __enter__ and __exit__ methods as specified in PEP 343.
In your example code, though, versioned_file is a file object, which already has those methods, and your teardown code is the same as what the file does itself on context exit anyway. So don't bother with contextlib at all; just return the result of open().
For other examples where you do need __enter__ and __exit__, if you like the contextlib style (and who doesn't?) you can bridge the two things. Write a function context that's decorated with @contextmanager and yields self. Then implement:
def __enter__(self):
    self.context = context()  # if context() is a method, use a different name!
    return self.context.__enter__()

def __exit__(self, *args):
    return self.context.__exit__(*args)
It's basically up to you whether you find this better or worse than separating out the setup code into __enter__ and the teardown code into __exit__. I generally find it better.
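To make the bridging concrete, here is a minimal sketch of the pattern; the class name and the setup/teardown bodies are invented for illustration:

from contextlib import contextmanager

class VersionedFile:
    def __init__(self, file_path, mode):
        self.file_path = file_path
        self.mode = mode

    @contextmanager
    def _context(self):
        # setup code
        self.fp = open(self.file_path, self.mode)
        try:
            yield self  # yield the object itself, not the raw file
        finally:
            # teardown code
            self.fp.close()

    def __enter__(self):
        self._cm = self._context()
        return self._cm.__enter__()

    def __exit__(self, *args):
        return self._cm.__exit__(*args)

Naming the generator _context (rather than context) sidesteps the name clash flagged in the comment above.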
Do you really need to use contextlib.contextmanager?
If you have a custom stream you would want to use Poke's solution.
But since you are just returning a file object, why go through all the hassle:
def versioned(file_path, mode):
    version = calculate_version(file_path, mode)
    return open(file_path, mode)
with versioned('test.conf', 'r') as stream:
    print(stream.read())

f = versioned('test.conf', 'r')
print(f.read())
f.close()
Both will work perfectly fine :)
Related
I would like to write a decorator that closes all open files. For example:
@close_files
def view(request):
    f = open('myfile.txt').read()
    return render('template.html')
Ignoring any threadsafe stuff, how could I write such a decorator to close out any open files after the function returns? I'm not interested in writing a context manager, but something like this:
def close_files(func):
    @wraps(func)
    def wrapper_close_files(*args, **kwargs):
        return_value = func(*args, **kwargs)
        # close open files here?
        return return_value
    return wrapper_close_files
Unfortunately I cannot use something like this:
with open('myfile.txt') as _f: f = _f.read()
I'm asking how to write a decorator that closes files when we do not have direct access to the variable that references the file handle.
According to the Python 3.6 docs:
If you’re not using the with keyword, then you should call f.close() to close the file and immediately free up any system resources used by it. If you don’t explicitly close a file, Python’s garbage collector will eventually destroy the object and close the open file for you, but the file may stay open for a while. Another risk is that different Python implementations will do this clean-up at different times.
So if you want to do this, you need a decorator that captures the open function from the builtins module. However, since we don't know how many files will be opened, ExitStack is a good fit: every file handle opened inside the function is registered with the stack, and all of them are closed when the stack unwinds.
My version for that is:
import builtins
import contextlib
import functools

def close_opened_files(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        _open = builtins.open  # keep a reference to the real open
        try:
            with contextlib.ExitStack() as stack:
                def myopen(*args, **kwargs):
                    # register every file opened inside func with the stack
                    return stack.enter_context(_open(*args, **kwargs))
                builtins.open = myopen
                return func(*args, **kwargs)
        finally:
            builtins.open = _open
    return wrapper
In the finally clause, the original open function is restored in the builtins module. One caveat with this approach: any function called inside func that itself uses open will also have its files closed when the decorator exits, whether or not that is what you want.
Here are some examples:
The following examples show read and write operations, also with files being closed within the function.
@close_opened_files
def test_write(fn1, fn2=None):
    f1 = open(fn1, 'w')
    print('Hello world', file=f1)
    if fn2:
        f2 = open(fn2, 'w')
        print('foo-bar', file=f2)
        f2.close()
@close_opened_files
def test_read(fn1, fn2=None):
    f1 = open(fn1, 'r')
    for line in f1:
        print(line)
    if fn2:
        f2 = open(fn2, 'r')
        for line in f2:
            print(line)
        f2.close()
test_write('file1.txt', 'file2.txt')
test_read('file1.txt', 'file2.txt')
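The test_exception function used in the next example is not defined above; a minimal stand-in (its body is hypothetical, just to make the example runnable) could be:

@close_opened_files
def test_exception(fn1):
    f1 = open(fn1, 'r')
    # the decorator's ExitStack closes f1 before the exception propagates
    raise RuntimeError('something went wrong')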
try: test_read('non_exist_filename.txt')
except FileNotFoundError as ex: print(ex)
try: test_exception('file1.txt')
except RuntimeError as ex: print(ex)
These last examples exercise the decorator under exceptions: a file that does not exist, and an arbitrary exception raised inside the function. In the latter case, the file is closed before the exception propagates out of the decorator.
I believe it would also be possible to use the ast module to traverse the code looking for file openings, and attempt to close any files that were found to have been opened.
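As a rough sketch of that idea (detection only; the function name is invented, and actually closing the detected handles would still need runtime bookkeeping):

import ast
import inspect
import textwrap

def find_open_calls(func):
    """Return the line numbers of every bare open(...) call in func's source."""
    source = textwrap.dedent(inspect.getsource(func))
    tree = ast.parse(source)
    return [node.lineno
            for node in ast.walk(tree)
            if isinstance(node, ast.Call)
            and isinstance(node.func, ast.Name)
            and node.func.id == 'open']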
Summary:
There is a variety of functions for which it would be very useful to be able to pass in two kinds of objects: an object that represents a path (usually a string) and an object that represents a stream of some sort (often something derived from IOBase, but not always). How can such a function differentiate between these two kinds of objects so each can be handled appropriately?
Say I have a function intended to write a file from some kind of object file generator method:
spiff = MySpiffy()

def spiffy_file_makerA(spiffy_obj, file):
    file_str = '\n'.join(spiffy_obj.gen_file())
    file.write(file_str)

with open('spiff.out', 'x') as f:
    spiffy_file_makerA(spiff, f)
    ...do other stuff with f...
This works. Yay. But I'd prefer not to have to worry about opening the file first or passing streams around, at least sometimes... so I refactor it to take a file-path-like object instead of a file-like object, and add a return statement:
def spiffy_file_makerB(spiffy_obj, file, mode):
    file_str = '\n'.join(spiffy_obj.gen_file())
    file = open(file, mode)
    file.write(file_str)
    return file

with spiffy_file_makerB(spiff, 'file.out', 'x') as f:
    ...do other stuff with f...
But now I get the idea that it would be useful to have a third function that combines the other two versions, dispatching on whether file is file-like or file-path-like, and hands the destination file-like object f to a context manager, so that I can write code like this:
with spiffy_file_makerAB(spiffy_obj, file_path_like, mode='x') as f:
    ...do other stuff with f...
...but also like this:
file_like_obj = get_some_socket_or_stream()
with spiffy_file_makerAB(spiffy_obj, file_like_obj, mode='x'):
    ...do other stuff with file_like_obj...
    # file_like_obj stream closes when context manager exits
    # unless `closefd=False`
Note that this will require something a bit different than the simplified versions provided above.
Try as I might, I haven't been able to find an obvious way to do this, and the ways I have found seem pretty contrived and a potential source of problems later. For example:
def spiffy_file_makerAB(spiffy_obj, file, mode, *, closefd=True):
    try:
        # file-like: reuse the underlying file descriptor
        result_f = open(file.fileno(), mode, closefd=closefd)
    except (AttributeError, TypeError):
        # file-path-like
        result_f = open(file, mode)
    finally:
        file_str = '\n'.join(spiffy_obj.gen_file())
        result_f.write(file_str)
    return result_f
Are there any suggestions for a better way? Am I way off base and need to be handling this completely differently?
For my money, and this is an opinionated answer, checking the file-like object for the attributes and operations you will need is a pythonic way to determine an object's type, because that is the nature of pythonic duck tests/duck-typing:
Duck typing is heavily used in Python, with the canonical example being file-like classes (for example, cStringIO allows a Python string to be treated as a file).
Or, from the Python docs' definition of duck-typing:
A programming style which does not look at an object’s type to determine if it has the right interface; instead, the method or attribute is simply called or used (“If it looks like a duck and quacks like a duck, it must be a duck.”) By emphasizing interfaces rather than specific types, well-designed code improves its flexibility by allowing polymorphic substitution. Duck-typing avoids tests using type() or isinstance(). (Note, however, that duck-typing can be complemented with abstract base classes.) Instead, it typically employs hasattr() tests or EAFP programming.
If you feel strongly that there is some very good reason that checking the interface for suitability isn't enough, you can just reverse the test: check for basestring or str to determine whether the provided object is path-like. The test differs depending on your version of Python.
is_file_like = not isinstance(fp, basestring) # python 2
is_file_like = not isinstance(fp, str) # python 3
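As a side note not in the original answer: on Python 3.6+ a path may also be a pathlib.Path, or anything else implementing __fspath__, so a more robust reversed test would be:

import os
is_file_like = not isinstance(fp, (str, bytes, os.PathLike))  # python 3.6+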
In any case, for your context manager, I would go ahead and make a full-blown object like the below in order to wrap the functionality that you were looking for.
class SpiffyContextGuard(object):
    def __init__(self, spiffy_obj, file, mode, closefd=True):
        self.spiffy_obj = spiffy_obj
        is_file_like = all(hasattr(file, attr)
                           for attr in ('seek', 'close', 'read', 'write'))
        self.fp = file if is_file_like else open(file, mode)
        self.closefd = closefd

    def __enter__(self):
        return self.fp

    def __exit__(self, type_, value, traceback):
        generated = '\n'.join(self.spiffy_obj.gen_file())
        self.fp.write(generated)
        if self.closefd:
            self.fp.__exit__(type_, value, traceback)
And then use it like this:
with SpiffyContextGuard(obj, 'hamlet.txt', 'w', True) as f:
    f.write('Oh that this too too sullied flesh\n')

fp = open('hamlet.txt', 'a')
with SpiffyContextGuard(obj, fp, 'a', False) as f:
    f.write('Would melt, thaw, resolve itself into a dew\n')

with SpiffyContextGuard(obj, fp, 'a', True) as f:
    f.write('Or that the everlasting had not fixed his canon\n')
If you wanted to use try/catch semantics to check for type suitability, you could also wrap the file operations you wanted to expose on your context guard:
class SpiffyContextGuard(object):
    def __init__(self, spiffy_obj, file, mode, closefd=True):
        self.spiffy_obj = spiffy_obj
        self.fp = self.file_or_path = file
        self.mode = mode
        self.closefd = closefd

    def seek(self, offset, *args):
        try:
            self.fp.seek(offset, *args)
        except AttributeError:
            # self.fp was a path: open it lazily, then retry
            self.fp = open(self.file_or_path, self.mode)
            self.fp.seek(offset, *args)

    # define wrappers for write, read, etc., as well

    def __enter__(self):
        return self

    def __exit__(self, type_, value, traceback):
        generated = '\n'.join(self.spiffy_obj.gen_file())
        self.write(generated)
        if self.closefd:
            self.fp.__exit__(type_, value, traceback)
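For completeness, the elided write wrapper would follow the same EAFP pattern (a sketch of what one of those methods might look like):

def write(self, data):
    try:
        return self.fp.write(data)
    except AttributeError:
        # self.fp was still a path: open it lazily, then retry
        self.fp = open(self.file_or_path, self.mode)
        return self.fp.write(data)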
My suggestion is to pass pathlib.Path objects around; you can simply call .write_bytes(...) or .write_text(...) on these objects.
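For example (a minimal sketch using the question's spiffy_obj):

from pathlib import Path

out = Path('spiff.out')
# opens, writes, and closes the file in one call
out.write_text('\n'.join(spiffy_obj.gen_file()))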
Other than that, you'd have to check the type of your file variable (this is how polymorphism can be done in Python):
from io import IOBase

def some_function(file):
    if isinstance(file, IOBase):
        file.write(...)
    else:
        with open(file, 'w') as file_handler:
            file_handler.write(...)
(I hope io.IOBase is the most basic class to check against...) And you would have to catch possible exceptions around all that.
Probably not the answer you're looking for, but from a taste point of view I think it's better to have functions that only do one thing. Reasoning about them is easier this way.
I'd just have two functions: spiffy_file_makerA(spiffy_obj, file), which handles your first case, and a convenience function that wraps spiffy_file_makerA and creates a file for you.
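A sketch of that split (function names invented for illustration):

def spiffy_to_stream(spiffy_obj, fstream):
    # core function: always takes an already-open file-like object
    fstream.write('\n'.join(spiffy_obj.gen_file()))

def spiffy_to_path(spiffy_obj, file_path, mode='x'):
    # convenience wrapper: opens and closes the file for you
    with open(file_path, mode) as fstream:
        spiffy_to_stream(spiffy_obj, fstream)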
Another approach to this problem, inspired by this talk from Raymond Hettinger at PyCon 2013, would be to keep the two functions separate as suggested by a couple of the other answers, but to bring the functions together into a class with a number of alternative options for outputting the object.
Continuing with the example I started with, it might look something like this:
class SpiffyFile(object):
    def __init__(self, spiffy_obj, file_path=None, *, mode='w'):
        self.spiffy = spiffy_obj
        self.file_path = file_path
        self.mode = mode

    def to_str(self):
        return '\n'.join(self.spiffy.gen_file())

    def to_stream(self, fstream):
        fstream.write(self.to_str())

    def __enter__(self):
        try:
            # do not override an existing stream
            self.fstream
        except AttributeError:
            # convert self.file_path to str to allow for pathlib.Path objects
            self.fstream = open(str(self.file_path), mode=self.mode)
        return self

    def __exit__(self, exc_t, exc_v, tb):
        self.fstream.close()
        del self.fstream

    def to_file(self, file_path=None, mode=None):
        if mode is None:
            mode = self.mode
        try:
            fstream = self.fstream
        except AttributeError:
            if file_path is None:
                file_path = self.file_path
            # convert file_path to str to allow for pathlib.Path objects
            with open(str(file_path), mode=mode) as fstream:
                self.to_stream(fstream)
        else:
            if mode != fstream.mode:
                raise IOError('Ambiguous stream output mode: '
                              'provided mode and fstream.mode conflict')
            if file_path is not None:
                raise IOError('Ambiguous output destination: a file_path '
                              'was provided with an already active file stream.')
            self.to_stream(fstream)
Now we have lots of different options for exporting a MySpiffy object by using a SpiffyFile object. We can just write it to a file directly:
from pathlib import Path
spiff = MySpiffy()
p = Path('spiffies')/'new_spiff.txt'
SpiffyFile(spiff, p).to_file()
We can override the path, too:
SpiffyFile(spiff).to_file(p.parent/'other_spiff.text')
But we can also use an existing open stream:
SpiffyFile(spiff).to_stream(my_stream)
Or, if we want to edit the string first we could open a new file stream ourselves and write the edited string to it:
my_heading = 'This is a spiffy object\n\n'
with open(str(p), mode='w') as fout:
    spiff_out = SpiffyFile(spiff).to_str()
    fout.write(my_heading + spiff_out)
And finally, we can use the SpiffyFile object directly as a context manager, exporting to as many different locations or streams as we like (note that we can pass the pathlib.Path object directly without worrying about string conversion, which is nifty):
with SpiffyFile(spiff, p) as spiff_file:
    spiff_file.to_file()
    spiff_file.to_file(p.parent/'new_spiff.txt')
    print(spiff_file.to_str())
    spiff_file.to_stream(my_open_stream)
This approach is more consistent with the mantra: explicit is better than implicit.
I'm trying to test the read() method in the following class:
class Channel(sam.Sam):
    def __open(self):
        try:
            self.__channel = open('%s/channel.ini' % os.path.dirname(os.path.realpath(__file__)), 'r+')
        except Exception as e:
            traceback.print_exc(file=sys.stdout)
            raise e

    def read(self):
        try:
            self.__open()
            return JSONEncoder().encode({
                "status": True,
                "channel": self.__channel.read().strip()
            })
        except Exception as e:
            traceback.print_exc(file=sys.stdout)
            return JSONEncoder().encode({
                "status": False
            })
        finally:
            self.__close()
As I understand it, I should be mocking the file.read() method (in self.__channel.read()), or maybe the os.open() method, but none of the examples I've found have the call to os.open() or file.read() deep inside a class.
I already tried __builtin__.read = MagicMock(return_value="something"), and many variations thereof, but not one of them even makes sense. I'm kind of lost as to how to even start this.
Is this even the right way?
Mock the open function; you can use the mock_open() utility function to provide a suitable mock:
from unittest.mock import mock_open, patch

with patch('your_module.open', mock_open(read_data=JSON_TEST_DATA), create=True) as m:
    result = Channel().read()

m.assert_called_once_with(expected_file_name, 'r+')
The patch() call creates a new global open object in the your_module namespace, so when the Channel.__open() method runs, it'll find that object rather than the open() built-in function.
By passing in a read_data argument to mock_open(), you can dictate what is returned by the self.__channel.read() call.
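For example, with read_data='test channel\n' (a hypothetical fixture), and assuming Channel() can be constructed with no arguments, the test could assert on the decoded result:

import json
from unittest.mock import mock_open, patch

with patch('your_module.open', mock_open(read_data='test channel\n'), create=True):
    result = Channel().read()

# read().strip() yields 'test channel', encoded into the JSON payload
assert json.loads(result) == {'status': True, 'channel': 'test channel'}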
A context manager is really useful and also makes our code more readable, but it seems it only works if the given object is meant to be a context manager; otherwise it will fail (no __exit__, or something else). I'm wondering if we can use any function, including those in libraries such as Django, as a context manager. Suppose the given code
self.assertEqual(Transaction.objects.filter(account=a, date=b, year=c).count(), 10)
self.assertEqual(Transaction.objects.filter(account=e, date=f, year=g).count(), 15)
self.assertEqual(Transaction.objects.filter(account=h, date=i, year=j).count(), 20)
Can be transformed into:
with Transaction.objects.filter as f:
    self.assertEqual(f(account=a, date=b, year=c).count(), 10)
    self.assertEqual(f(account=d, date=d, year=e).count(), 15)
    self.assertEqual(f(account=h, date=i, year=j).count(), 20)
The way I look at it, the one below is much more readable, cleaner, and less verbose. Is this possible?
Just do:
f = Transaction.objects.filter
before your statements!
If you really wanted to use a context manager, you could write one that does what you want:
from contextlib import contextmanager

@contextmanager
def alias(func):
    yield func

with alias(Transaction.objects.filter) as f:
    ...
Note that, however you do this, without an explicit del statement, f will still be around after the with.
You're misunderstanding how Python's with works (probably due to experience with Django Template's {% with %} block).
As mentioned in PEP 343, the with statement allows you to easily abstract try/finally blocks. This is really useful for things such as IO, where, if something goes wrong, you want to make sure that you safely close the file no matter what.
In your example, you're really looking to just reduce the length of the variable, which is a nested child of both Transaction and Transaction.objects. In that case you can simply do:
fn = Transaction.objects.filter
Your follow-up question of "how do I invalidate the usage of fn... after we go out of the scope" is also a little misguided, as common usages of with don't necessarily destroy the variable's reference to the original object.
>>> with open("README.md") as f:
... print f
...
<open file 'README.md', mode 'r' at 0x0055F860>
>>> f
<closed file 'README.md', mode 'r' at 0x0055F860>
>>>
To be honest, I'd suggest that you look for an alternative solution, as I think the semantics here are a little off. That said, if you truly want to do this, and mimic the unassignment as well, you'd need something like this:
class AliasContextManager(object):
    """
    Handle temporary function scope within a with block.
    """
    def __init__(self, fn):
        self.fn = fn

    def proxy(self):
        """
        Create a proxy to our function, such that we can remove the reference
        on exit, and replace it with None.
        """
        def _proxy(*args, **kwargs):
            fn = getattr(self, "fn", None)
            return fn(*args, **kwargs)
        return _proxy

    def __enter__(self):
        return self.proxy()

    def __exit__(self, *args):
        del self.fn

alias = AliasContextManager
And here's how it can be used:
>>> with alias(sum) as fn:
... print fn([1,2,3])
... print fn([4,5,6])
6
15
>>> print fn([7,8,9])
Traceback (most recent call last):
File "x.py", line 22, in <module>
print fn([7,8,9])
File "x.py", line 8, in _proxy
return fn(*args, **kwargs)
TypeError: 'NoneType' object is not callable
I want to wrap the default open method with a wrapper that should also catch exceptions. Here's a test example that works:
import sys

truemethod = open

def fn(*args, **kwargs):
    try:
        return truemethod(*args, **kwargs)
    except (IOError, OSError):
        sys.exit('Can\'t open \'{0}\'. Error #{1[0]}: {1[1]}'.format(args[0], sys.exc_info()[1].args))

open = fn
I want to make a generic method of it:
def wrap(method, exceptions=(OSError, IOError)):
    truemethod = method
    def fn(*args, **kwargs):
        try:
            return truemethod(*args, **kwargs)
        except exceptions:
            sys.exit('Can\'t open \'{0}\'. Error #{1[0]}: {1[1]}'.format(args[0], sys.exc_info()[1].args))
    method = fn
But it doesn't work:
>>> wrap(open)
>>> open
<built-in function open>
Apparently, method is a copy of the parameter, not a reference as I expected. Any pythonic workaround?
The problem with your code is that inside wrap, your method = fn statement is simply rebinding the local name method; it isn't changing the larger value of open. You'll have to assign to those names yourself:
def wrap(method, exceptions=(OSError, IOError)):
    def fn(*args, **kwargs):
        try:
            return method(*args, **kwargs)
        except exceptions:
            sys.exit('Can\'t open \'{0}\'. Error #{1[0]}: {1[1]}'.format(args[0], sys.exc_info()[1].args))
    return fn

open = wrap(open)
foo = wrap(foo)
Try adding global open. In the general case, you might want to look at this section of the manual:
This module provides direct access to all ‘built-in’ identifiers of Python; for example, __builtin__.open is the full name for the built-in function open(). See chapter Built-in Objects.
This module is not normally accessed explicitly by most applications, but can be useful in modules that provide objects with the same name as a built-in value, but in which the built-in of that name is also needed. For example, in a module that wants to implement an open() function that wraps the built-in open(), this module can be used directly:
import __builtin__

def open(path):
    f = __builtin__.open(path, 'r')
    return UpperCaser(f)

class UpperCaser:
    '''Wrapper around a file that converts output to upper-case.'''
    def __init__(self, f):
        self._f = f
    def read(self, count=-1):
        return self._f.read(count).upper()
    # ...
CPython implementation detail: Most modules have the name __builtins__ (note the 's') made available as part of their globals. The value of __builtins__ is normally either this module or the value of this modules’s __dict__ attribute. Since this is an implementation detail, it may not be used by alternate implementations of Python.
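That passage is from the Python 2 documentation; in Python 3 the module is named builtins (no underscores), so the equivalent move, reusing the wrap function above, would be:

import builtins
# replaces the built-in open for every module, not just this one
builtins.open = wrap(builtins.open)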
You can just add return fn at the end of your wrap function and then do:
>>> open = wrap(open)
>>> open('bhla')
Traceback (most recent call last):
  File "<pyshell#24>", line 1, in <module>
    open('bhla')
  File "<pyshell#18>", line 7, in fn
    sys.exit('Can\'t open \'{0}\'. Error #{1[0]}: {1[1]}'.format(args[0], sys.exc_info()[1].args))
SystemExit: Can't open 'bhla'. Error #2: No such file or directory