I'm trying to work out how to "rebuild" a bitmap after extracting the BitmapBits from the object.
I'm rewriting a small application so that I can take advantage of the available cores in my machine. Because I'm using Python, any object I send to another process must first be serialized via pickling. The only part of a PyCBitmap object that is picklable is the bytestring returned by GetBitmapBits. So the only way to leverage the additional processors is to extract the bytestring from the bitmap, pickle it, and then rebuild the bitmap on the other side.
However, the issue I've run into is that I'm not sure how to make a new bitmap object from the extracted bytestring. There is a CreateBitmap method in the win32gui library, but according to the docs its bitmapbits parameter must be set to None, so that's out.
Does anyone know how to create a new bitmap object from a bytestring?
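One possible route, sketched below under the assumption that the bitmap's width and height travel along with the pickled bits: create an empty compatible bitmap on the receiving side and copy the bytes into it with the raw GDI SetBitmapBits call, reached through ctypes because win32gui doesn't wrap it. This is only a sketch, not a confirmed recipe; SetBitmapBits is deprecated and assumes the bytes are in the same device-dependent format that GetBitmapBits produced.

```python
import ctypes
import pickle

import win32gui
import win32ui


def pack_bitmap(bitmap, width, height):
    # Only the raw bytes are picklable, so bundle them with the geometry.
    return pickle.dumps((width, height, bitmap.GetBitmapBits(True)))


def unpack_bitmap(payload):
    width, height, bits = pickle.loads(payload)
    screen_dc = win32gui.GetDC(0)
    # Create an empty bitmap compatible with the screen, then copy the
    # pickled bytes into it with the low-level GDI SetBitmapBits call.
    handle = win32gui.CreateCompatibleBitmap(screen_dc, width, height)
    ctypes.windll.gdi32.SetBitmapBits(handle, len(bits), bits)
    win32gui.ReleaseDC(0, screen_dc)
    return win32ui.CreateBitmapFromHandle(handle)
```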
Related
Python docs mention this word ("picklable") a lot, and I want to know what it means.
It simply means it can be serialized by the pickle module. For a basic explanation of this, see What can be pickled and unpickled?. Pickling Class Instances provides more details, and shows how classes can customize the process.
Things that usually can't be pickled are, for example, sockets, file handles, database connections, and so on. Everything that's built up (recursively) from basic Python types (dicts, lists, primitives, objects, object references, even circular ones) can be pickled by default.
You can implement custom pickling code that will, for example, store the configuration of a database connection and restore it afterwards, but you will need special, custom logic for this.
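A minimal sketch of that idea, using a hypothetical wrapper class around a sqlite3 connection: the live connection is dropped from the pickled state and re-opened when the object is unpickled.

```python
import pickle
import sqlite3


class Database(object):
    """Hypothetical wrapper: holds a live (unpicklable) sqlite3 connection."""

    def __init__(self, path):
        self.path = path
        self.conn = sqlite3.connect(path)

    def __getstate__(self):
        # Pickle only the configuration, never the live connection.
        state = self.__dict__.copy()
        del state['conn']
        return state

    def __setstate__(self, state):
        # Restore the configuration, then re-open the connection.
        self.__dict__.update(state)
        self.conn = sqlite3.connect(self.path)


db = Database(':memory:')
restored = pickle.loads(pickle.dumps(db))   # works despite the connection
```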
All of this makes pickling a lot more powerful than XML, JSON, or YAML (but definitely not as readable).
These are all great answers, but for anyone who's new to programming and still confused, here's the simple answer:
Pickling an object means storing it as it currently is, long term (often to hard disk). A bit like saving in a video game.
So anything that's actively changing (like a live connection to a database) can't be stored directly (though you could probably figure out a way to store the information needed to create a new connection, and that you could pickle).
Bonus definition: serializing is packaging something in a form that can be handed off to another program. Deserializing is unpacking something you were sent so that you can use it.
Pickling is the process by which Python objects are converted into a simple binary representation that can be written to a file and stored. This is done to persist Python objects and is also called serialization. From this you can infer what de-serialization, or unpickling, means.
So when we say an object is picklable, it means that the object can be serialized using Python's pickle module.
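For example (a trivial sketch): a plain dict of basic types round-trips through pickle without any extra work.

```python
import pickle

record = {'name': 'widget', 'sizes': [1, 2, 3], 'in_stock': True}

data = pickle.dumps(record)      # bytes: the serialized form
restored = pickle.loads(data)    # an equal copy of the original object

assert restored == record
```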
Here's the situation: I have a massive object that needs to be loaded into memory. So big that if it is loaded in twice it will go beyond the available memory on my machine (and no, I can't upgrade the memory). I also can't divide it up into any smaller pieces. For simplicity's sake, let's just say the object is 600 MB and I only have 1 GB of RAM.
I need to use this object from a web app, which is running in multiple processes, and I don't control how they're spawned (a third-party load balancer does that), so I can't rely on just creating the object in some master thread/process and then spawning off children. This also eliminates the possibility of using something like POSH, because that relies on its own custom fork call.
I also can't use something like a SQLite memory database, mmap, or the posix_ipc, sysv_ipc, and shm modules, because those act as a file in memory, and this data has to be an object for me to use it. Using one of those, I would have to read it as a file and then turn it into an object in each individual process, and BAM, segmentation fault from going over the machine's memory limit because I just tried to load in a second copy.
There must be some way to store a Python object in memory (and not as a file/string/serialized/pickled) and have it be accessible from any process. I just don't know what it is. I've looked all over Stack Overflow and Google and can't find the answer, so I'm hoping somebody can help me out.
http://docs.python.org/library/multiprocessing.html#sharing-state-between-processes
Look for shared memory, or the Server process approach. After re-reading your post, Server process sounds closer to what you want.
http://en.wikipedia.org/wiki/Shared_memory
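A minimal sketch of the server-process approach from that page: a Manager owns the data in a single process and hands proxies to the others, so no second copy is loaded (the worker function and keys below are illustrative).

```python
from multiprocessing import Manager, Process


def worker(shared, key):
    # Each worker talks to the manager's server process through a proxy;
    # the data itself is never copied into the worker.
    shared[key] = key * key


if __name__ == '__main__':
    manager = Manager()
    shared = manager.dict()   # lives in the manager's server process
    procs = [Process(target=worker, args=(shared, i)) for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(dict(shared))       # {0: 0, 1: 1, 2: 4, 3: 9}
```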
"There must be some way to store a Python object in memory (and not as a file/string/serialized/pickled) and have it be accessible from any process."
That isn't the way it works. Python object reference counting and an object's internal pointers do not make sense across multiple processes.
If the data doesn't have to be an actual Python object, you can try working on the raw data stored in mmap() or in a database or somesuch.
I would implement this as a C module that gets imported into each Python script. Then the interface to this large object would be implemented in C, or some combination of C and Python.
I have a custom class which has some string data. I wish to be able to save this string data to a file, using a file handle's write method. I have implemented __str__(), so I can do str(myobject). What is the equivalent method for making Python consider my object to be a character buffer object?
If you are trying to use your object with library code that expects to be able to write what you give it to a file, then you may have to resort to implementing a "duck file" class that acts like a file but supports your stringable object. Unfortunately, file is not a type that you can subclass easily, at least as of Python 2.6. You will have to implement enough of the file protocol (write, writelines, tell, etc.) to allow the library code to work as expected.
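A rough sketch of such a "duck file" (the class and names below are made up for illustration): it wraps a real file object, stringifies whatever gets written, and forwards just enough of the file protocol.

```python
class DuckFile(object):
    """Hypothetical wrapper: acts like a file, accepts any str()-able object."""

    def __init__(self, real_file):
        self._file = real_file

    def write(self, obj):
        self._file.write(str(obj))

    def writelines(self, seq):
        for item in seq:
            self.write(item)

    def tell(self):
        return self._file.tell()

    def flush(self):
        self._file.flush()

    def close(self):
        self._file.close()


class Message(object):
    """Stand-in for the custom class: only knows how to str() itself."""

    def __init__(self, text):
        self.text = text

    def __str__(self):
        return self.text


# Library code that calls .write() on what it's given now accepts the object.
with open('out.txt', 'w') as raw:
    DuckFile(raw).write(Message('hello'))
```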
There isn't a single function, but a whole range of them - read, seek, etc.
Why don't you subclass StringIO.StringIO, which is already a string buffer?
So I've got a problem - I'm writing a game prototype in Python, using Pygame, and I want to save my games. All of the game-related data is in three instances of certain classes, and I want to save these three instances to a file. However, I've tried pickling these instances, and it doesn't work. Instead, I get "TypeError: can't pickle Surface objects". This is a problem, because I want to store Surface objects.
I'm open to any alternatives to pickling that there may be, using any other kind of data type. The important thing is that these instances get stored, and that their data is then retrievable later on. So what can I do to overcome this issue? Please keep in mind, I'm not a very experienced programmer, having learned Python in my spare time a year ago, and I can't write much of any other language, though I'm slowly learning C++.
The basic point of pickling is that you should be able to serialise the object somehow. An SDL surface is an in-memory object holding a lot of run-time state, so trying to serialise it directly is not really sensible.
What you should do is to decouple the state of your game from the rendering components so that you can serialise just those (pickling or whatever).
It's like trying to save the state of a video by somehow saving the memory buffers holding the decoded frames. This will not work. Instead, you save it by serialising the location of the video file and the time offset; then you can resume playback from that point the next time you restore your application.
Reading http://docs.python.org/library/pickle.html#pickle-protocol, you need to either have Surface objects export a __reduce__ method or use the copy_reg module to tell pickle how to handle that data, as documented in http://docs.python.org/library/copy_reg.html#module-copy_reg.
Either way, what pickle needs is a function that will turn a blob it can't handle into (some_class, [arguments here]). Then, when you unpickle, it will construct a new object of that class with those arguments.
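Here is a rough sketch of the copy_reg route for Surfaces, using pygame.image.tostring/fromstring to turn the pixels into a picklable bytestring and back (on Python 3 the module is called copyreg; treat this as a sketch, not a drop-in solution):

```python
import copy_reg  # 'copyreg' on Python 3
import pickle

import pygame


def _unpickle_surface(data, size, fmt):
    # Reconstruct the Surface from its raw pixel bytes.
    return pygame.image.fromstring(data, size, fmt)


def _pickle_surface(surface):
    # Reduce a Surface to (constructor, args): raw pixels plus geometry.
    data = pygame.image.tostring(surface, 'RGBA')
    return _unpickle_surface, (data, surface.get_size(), 'RGBA')


# Register the handler; from now on pickle knows what to do with Surfaces,
# including Surfaces stored inside your own class instances.
copy_reg.pickle(pygame.Surface, _pickle_surface)

surf = pygame.Surface((4, 4), pygame.SRCALPHA, 32)
restored = pickle.loads(pickle.dumps(surf))
```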