I have a library function which I ran through a wrapper function to disable printing.
However, I need to make that wrapped function picklable. I get: AttributeError: Can't pickle local object 'mute.<locals>.wrapper'
The last frame of the stack trace is: File "C:\Users\[...]\Python310\site-packages\neat\checkpoint.py", line [...], in save_checkpoint pickle.dump(data, f, protocol=pickle.HIGHEST_PROTOCOL)
I am only including my best guess at which code is relevant, because the whole codebase is quite big.
bu.py
def mute(f):
    def wrapper(*args, **kwargs):
        import os, sys
        original_stdout = sys.stdout
        sys.stdout = open(os.devnull, 'w')
        f(*args, **kwargs)
        sys.stdout.close()
        sys.stdout = original_stdout
    return wrapper
main.py
...
p = neat.Population(config)

reporter_to_add = neat.Checkpointer(1)
reporter_to_add.save_checkpoint = bu.mute(reporter_to_add.save_checkpoint)
p.add_reporter(reporter_to_add)

reporter_to_add = custom_reporters.FullPopulationCheckpointer(1)
reporter_to_add.save_checkpoint = bu.mute(reporter_to_add.save_checkpoint)
p.add_reporter(reporter_to_add)

p.run(gym.eval_genomes, 10)
...
neat's checkpoint.py
class Checkpointer(BaseReporter):
    ...
    def save_checkpoint(self, config, population, species_set, generation):
        """ Save the current simulation state. """
        filename = '{0}{1}'.format(self.filename_prefix, generation)
        print("Saving checkpoint to {0}".format(filename))

        with gzip.open(filename, 'w', compresslevel=5) as f:
            data = (generation, config, population, species_set, random.getstate())
            pickle.dump(data, f, protocol=pickle.HIGHEST_PROTOCOL)
I have tried declaring wrapper as global inside mute(), but that didn't work, and I'm not sure globals are the right approach anyway, because, as you can see, there are multiple different save_checkpoint functions being wrapped.
Everything works fine if the bu.mute() wrapper is not used.
Related
In Python, I am trying to store a list to a file. I've tried pickle, json, etc., but none of them seem to support classes being inside those lists. I can't sacrifice the lists or the classes; I must maintain both. How can I do it?
My current attempt:
try:
    with open('file.json', 'r') as file:
        allcards = json.load(file)
except:
    allcards = []
def saveData(list):
    with open('file.json', 'w') as file:
        print(list)
        json.dump(list, file, indent=2)
saveData is called elsewhere, and I've done all the testing I can and have determined the error comes from trying to save the list due to its inclusion of classes. It throws the error
Object of type Card is not JSON serializable
whenever I do the JSON method, and any other method doesn't even give errors but doesn't load the list when I reload the program.
Edit: As for the pickle method, here is what it looks like:
try:
    with open('allcards.dat', 'rb') as file:
        allcards = pickle.load(file)
        print(allcards)
except:
    allcards = []
class Card():
    def __init__(self, owner, name, rarity, img, pack):
        self.owner = str(owner)
        self.name = str(name)
        self.rarity = str(rarity)
        self.img = img
        self.pack = str(pack)
def saveData(list):
    with open('allcards.dat', 'wb') as file:
        pickle.dump(list, file)
When I do this, all that happens is the code runs as normal, but the list is not saved. And the print(allcards) does not trigger either, which makes me believe it's somehow not detecting the file, or hitting some other error that sends it straight to the exception. Also, img is supposed to always be a link, in case that changes anything.
I have no other way I believe I can help solve this issue, but I can post more code if need be.
Please help, and thanks in advance.
Python's built-in pickle module cannot serialize every object (for example, classes defined in a local scope), but there are libraries that extend the pickle module and provide this functionality. Dill and cloudpickle both support serializing such objects and have the exact same interface as the pickle module.
Dill: https://github.com/uqfoundation/dill
Cloudpickle: https://github.com/cloudpipe/cloudpickle
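Note that for the common case in the question, a list of instances of a class defined at module top level, the standard pickle module already works; a quick check:

```python
import pickle

# a plain top-level class, similar in shape to the question's Card
class Card:
    def __init__(self, owner, name):
        self.owner = owner
        self.name = name

cards = [Card("alice", "K"), Card("bob", "1")]

# round-trip the whole list through pickle
restored = pickle.loads(pickle.dumps(cards))
print(restored[0].owner)  # the instances come back intact
```

The extension libraries become necessary when the class itself (not just its instances) must travel in the pickle, e.g. classes defined inside functions or in an interactive session.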
// EDIT
The article linked below is good, but my earlier example was bad.
This time I've created a new snippet from scratch -- sorry for making it more complicated than it should be earlier.
import json

class Card(object):
    @classmethod
    def from_json(cls, data):
        return cls(**data)

    def __init__(self, figure, color):
        self.figure = figure
        self.color = color

    def __repr__(self):
        return f"<Card: [{self.figure} of {self.color}]>"

def save(cards):
    with open('file.json', 'w') as f:
        json.dump(cards, f, indent=4, default=lambda c: c.__dict__)

def load():
    with open('file.json', 'r') as f:
        obj_list = json.load(f)
    return [Card.from_json(obj) for obj in obj_list]

cards = []
cards.append(Card("1", "clubs"))
cards.append(Card("K", "spades"))
save(cards)

cards_from_file = load()
print(cards_from_file)
Source
I don't know if there is an easy way of doing this that doesn't rely on manually writing down what the saved outputs from a script are, so I'm open to any suggestions.
I want a function that runs at the end of my script and that automatically generates a text file with a name like:
"IO_track_scriptname_date_time"
Which has a list of the files I loaded and the files I saved (location links).
And then saves this txt file to the desired destination.
Thank you for your help
Edit:
Or any alternative way of keeping a log of inputs and outputs.
Here is a thin object wrapper around the open function that tracks all of the files that are opened.
class Open:
    _open = open

    def __init__(self):
        self.opened_files = []
        self.fp = None

    def __call__(self,
                 file,
                 mode='r',
                 buffering=-1,
                 encoding=None,
                 errors=None,
                 newline=None,
                 closefd=True,
                 opener=None):
        self.fp = self._open(file, mode, buffering, encoding, errors,
                             newline, closefd, opener)
        self.opened_files.append((mode, file))
        return self.fp

    def __enter__(self, *args, **kwargs):
        return self.__call__(*args, **kwargs)

    def __exit__(self, *exc_details):
        return self.fp.close()

    def __getattr__(self, attr):
        return getattr(self.fp, attr)

    def export(self, filename):
        with open(filename, 'w') as fp:
            for m, fn in self.opened_files:
                fp.write(f'({m}): {fn}\n')
To actually use it, you will need to shadow the built-in open function with an instance of this class. If you have one file that you are calling, you can pop this into the __main__ block, i.e.

...
if __name__ == '__main__':
    # code defining the Open class here
    ...
    open = Open()
    # other code in __main__ here
    open.export("IO_track_scriptname_date_time.txt")
I want to log the script output to a file while still displaying the output to the screen.
It works fine, except in some cases where not all of the content is written to the file (one or two lines can be missing if the output is long).
Below is my code:
import sys

class Tee(object):
    def __init__(self, *files):
        self.files = files

    def write(self, obj):
        for f in self.files:
            f.write(obj)
            f.flush()

write_log = open("log.txt", 'a', 0)
sys.stdout = Tee(sys.stdout, write_log)
sys.stderr = Tee(sys.stderr, write_log)
Tried all the following options at the end of the code, but the result is the same:
os.fsync(write_log.fileno())
write_log.flush()
write_log.close()
Try using the with statement, or a try/finally block that explicitly closes the file, so the file's buffer is flushed when the script exits.
For example, there's the with statement:

with open("ACCELEROMETER", 'w') as ACCELEROMETER, \
     open('GPS', 'w') as GPS, \
     open('LIGHT', 'w') as LIGHT, \
     open('LOCATION', 'w') as LOCATION, \
     open('MIC', 'w') as MIC, \
     open('SCREEN', 'w') as SCREEN, \
     open('TIME', 'w') as TIME:
I want to get the file objects just created, using some Python code:
I'm looking for an equivalent of the dir function for the local scope of the with statement.
Is it possible?
What you're asking for isn't really possible (without making some assumptions) since with doesn't create a new namespace. You could create a file-list object which is implemented as a context manager ...
class FileList(list):
    def __init__(self, files, mode='r'):
        list.__init__(self, (open(arg, mode) for arg in files))

    def __enter__(self):
        return self

    def __exit__(self, *args):
        for fobj in self:
            fobj.close()

with FileList(["ACCELEROMETER", "GPS", ...], mode='w') as fl:
    for fobj in fl:
        ...
I would like to create a function that keeps a record of every print command, storing each command's string into a new line in a file.
def log(line):
    with open('file.txt', "a") as f:
        f.write('\n' + line)
This is what I have, but is there any way to do what I said using Python?
Try replacing stdout with a custom class:

import sys

class LoggedStdout():
    def __init__(self, filename=None):
        self.filename = filename

    def write(self, text):
        sys.__stdout__.write(text)
        if self.filename is not None:
            self.log(text)

    def log(self, line):
        with open(self.filename, "a") as f:
            f.write('\n' + line)

sys.stdout = LoggedStdout('file.txt')
print('Hello world!')
This affects not only print but also any other function that writes to stdout, which is often even better.
For production-mode logging it's much better to use something like the logging module, rather than home-made hooks over standard IO streams.