What is the best way to store data in Python? [closed] - python

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 8 years ago.
I'm creating a Skype bot for a group of friends, and I want a simple login system of sorts for text-based games, storing information such as usernames, high scores, notes, friends lists, messages, etc.
I was thinking of storing the data in a text file named after each person's Skype handle, but I was wondering if there is a better way of doing it, such as XML files.
I'd like to avoid SQL servers, and since nobody is storing passwords, encryption isn't a big deal. (I'd prefer local file storage: something easily editable and deletable.)
I want to enable commands such as !note, !friends, !addfriend, !highscore and so on, but I need a method to save that information.

Have you considered pickle? It can store Python objects (any object) to files so you can just load and use them.
import pickle
# Saving (a context manager ensures the file is closed)
data = [1, 2, 3, 4]
with open("d:/temp/test.pkl", "wb") as f:
    pickle.dump(data, f)
# Loading
with open("d:/temp/test.pkl", "rb") as f:
    load = pickle.load(f)
For more info, read the docs
(Another option is the json module, which serializes to JSON. It is used in a similar way, but can only store dictionaries, lists, strings, numbers, booleans and None. Unlike pickle, the files it produces are human-readable, which fits your "easily editable" requirement.)
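For per-user bot data like the question describes, the json route can look like this; the filename and fields here are made-up examples, not anything from the question:

```python
import json

# json handles dicts, lists, strings, numbers, booleans and None --
# enough for usernames, high scores, notes and friends lists.
data = {"user": "alice", "highscore": 120, "friends": ["bob", "carol"]}

# Saving: one file per user, named after the handle (example name)
with open("alice.json", "w") as f:
    json.dump(data, f, indent=2)

# Loading
with open("alice.json") as f:
    loaded = json.load(f)
```

Because the file is plain text, you can also open it in an editor to tweak or delete entries by hand.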

Related

Is it better to have one large file or many smaller files for data storage? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 10 months ago.
I have a C++ game which sends a Python-SocketIO request to a server; the server loads the requested JSON data into memory for reference and then sends portions of it to the client as necessary. Most of the previous answers here assume the server has to repeatedly search the database, whereas in this case all of the data is held in memory after the first read and is released after the client disconnects.
I don't want a large spike in memory usage whenever a new client joins, but most of what I have seen points away from using small files (50-100 kB absolute maximum) and toward large files, which would cause exactly the large memory usage I'm trying to avoid.
My question is this: would it still be beneficial to use one large file, or should I use the smaller files, both from an organization standpoint and from a performance one?
Both can potentially be better: each has its advantages and disadvantages, and which wins depends on the details of the use case. It's quite possible that the best way is something in between, such as a few medium-sized files.
Regarding performance, the most accurate way to verify what is best is to try out each of them and measure.
You should split the data into multiple files if you're only accessing small parts of it at a time, since that keeps memory usage low. For example, if you only need a single player, your folder structure could look like this:
players
- 0.json
- 1.json
other
- 0.json
Then you could write a function that just gets the player with a certain id (0, 1, etc.).
If you're planning on accessing all of the players, other objects, and more at once, then have the same folder structure and just concatenate the parts you need into one object in memory.
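A minimal sketch of both access patterns described above, assuming the per-id folder layout shown (the folder name and file contents are illustrative):

```python
import json
import os

PLAYERS_DIR = "players"  # hypothetical folder holding 0.json, 1.json, ...

def load_player(player_id):
    """Load a single player's file without touching any other data."""
    path = os.path.join(PLAYERS_DIR, f"{player_id}.json")
    with open(path) as f:
        return json.load(f)

def load_all_players():
    """Concatenate every per-player file into one dict keyed by id."""
    players = {}
    for name in os.listdir(PLAYERS_DIR):
        if name.endswith(".json"):
            pid = int(name[:-5])  # strip the ".json" suffix
            players[pid] = load_player(pid)
    return players
```

With this layout, serving one player costs one small file read, while the bulk loader reuses the same files when you really do need everything at once.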

What are in-memory data structures in python? [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 1 year ago.
I was given a coding challenge in which I have to parse a text file and build "a data structure in memory to work with." I then have to perform descriptive statistics on it. So far I've parsed the text and built a dictionary containing all the needed data.
I haven't used SQLite or something similar because they specifically asked for data structures and not databases.
I am not sure if a dictionary is correct here. So my question is: what are in-memory data structures in Python? I've tried the web but couldn't get a definitive answer.
An in-memory data structure is one that is stored in RAM (as opposed to saved to disk or "pickled"). If you're not using external programs that store to disk for you (like databases) and not explicitly storing to disk, you've created an in-memory data structure. Dicts, lists, sets, etc. are all data structures, and if you don't save them to disk, they're in-memory data structures.
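To make that concrete, here is a tiny sketch of the kind of thing the challenge likely wants: parse text, hold everything in a dict, compute a statistic. The sample text and field layout are invented for illustration:

```python
# Sample input; in the real challenge this would come from the text file.
text = "alice 10\nbob 7\nalice 5"

# An in-memory data structure: a dict built from the parsed lines.
# Nothing here ever touches the disk.
scores = {}
for line in text.splitlines():
    name, value = line.split()
    scores.setdefault(name, []).append(int(value))

# A descriptive statistic computed straight from the structure.
totals = {name: sum(vals) for name, vals in scores.items()}
```

The dict (plus the lists inside it) is the whole "database" here: it lives only as long as the process does.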

Save data in Python and then read in C++ [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I have some code in Python which generates a set of data structures (they may be represented by classes, but no methods are needed).
These data structures may be extended, or new structures added, in the future.
I have some code in C++ on Android which knows about some of these data structures and their fields.
The only way to pass the data structures is to serialize them to a file and then deserialize them on the other side.
Binary format support is needed.
Mature implementations in both Python and C++ are needed; BSD, MIT or Apache licenses are preferred.
Speed is not critical.
I have tried a custom format, but it is hard to extend.
SAX-like parsers are too low-level for such a task.
The most important factor here is the file format being passed. Whether you need to create a proxy class on the receiving side or simply read the data there, once the format is known to the receiver, it knows how to handle the data.
It is therefore best to use a data format that is well known and widely used. Thanks to that very popularity, such formats normally also have third-party or built-in libraries to help you create your data files.
For this purpose, I recommend using either the JSON or the XML data format. Python already has serializers for both:
XML: http://code.activestate.com/recipes/577268-python-data-structure-to-xml-serialization/
JSON : https://docs.python.org/3/library/json.html, http://www.diveintopython3.net/serializing.html, https://docs.python.org/2/library/json.html
You can also search for other alternatives, which I believe are available as well.
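For the JSON route, the Python side can be as small as the sketch below. The field names and the "version" convention are assumptions, not from the question; the version field is one common way to keep the format extensible, since the C++ side can check it before parsing the rest:

```python
import json

# Hypothetical structure generated on the Python side. Bump "version"
# whenever the structure is extended so the reader can detect it.
record = {
    "version": 1,
    "items": [{"id": 1, "name": "sword"}, {"id": 2, "name": "shield"}],
}

with open("data.json", "w") as f:
    json.dump(record, f)

# The C++ side (e.g. a JSON library such as nlohmann/json) parses
# exactly the bytes written above; round-trip it here to illustrate.
with open("data.json") as f:
    round_tripped = json.load(f)
```

Note that JSON is a text format; if the binary requirement is strict, a binary JSON variant or a schema-based format would need to be considered instead.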

How to access all of a file's attributes through Python in Windows 7? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
In a Windows 7 system you can right-click the column headers in Explorer to choose which details to display for a file, which gives you a long list of available attributes.
Question: Is there a way to access all of the attributes on that list for a given file using Python?
This is a bit long for a comment.
You are not likely to get a good answer because Microsoft makes this way too complicated, and their documentation on this topic is some of the worst that they have.
Everything is wrapped up in COM interfaces, and you really need the SDK installed to get all of the header files needed to access these interfaces from a C-style API.
To understand how it really works, you should start with the Property System Overview.
You will also want to read the Property System Developer's Guide.
There is one C language answer that I know of for this topic on S/O, though clearly there could be others.
I know it is not a real answer, and it is certainly not Python -- but if you have the real motivation to dig into this, hopefully this is at least a little helpful.
Also note that these extended properties are poorly supported and tend to disappear under many common usage patterns, since they are not really part of the file -- e.g., copy the file using FTP and you lose the extended file attributes.

efficient database file trees [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
So I was making a simple chat app with python. I want to store user specific data in a database, but I'm unfamiliar with efficiency. I want to store usernames, public rsa keys, missed messages, missed group messages, urls to profile pics etc.
There's a couple of things in there that would have to be grabbed pretty often, like missed messages and profile pics and a couple of hashes. So here's the question: what database style would be fastest while staying memory efficient? I want it to be able to handle around 10k users (like that's ever gonna happen).
Here are some options I thought of:
- everything in one file (might be bad on memory, and takes time to load; this matters because I would need to reload it after every change)
- separate files per user (slower, but memory efficient)
- separate files per data value
- a directory for each user, with separate files for each value
Thanks, and try to keep it objective so this isn't instantly closed!
The only answer possible at this point is 'try it and see'.
I would start with MySQL (mostly because it's the 'lowest common denominator', freely available everywhere); it should do everything you need up to several thousand users, and if you get that far you should have a far better idea of what you need and where the bottlenecks are.
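Before committing to a MySQL install, the same relational schema can be prototyped with Python's built-in sqlite3 module. The table and column names below are assumptions based on the data listed in the question, not anything prescribed:

```python
import sqlite3

# In-memory database for the prototype; swap for a file path (or MySQL)
# once the schema settles.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE users (
        id INTEGER PRIMARY KEY,
        username TEXT UNIQUE NOT NULL,
        rsa_public_key TEXT,
        avatar_url TEXT
    )
""")
conn.execute("""
    CREATE TABLE missed_messages (
        id INTEGER PRIMARY KEY,
        user_id INTEGER REFERENCES users(id),
        body TEXT NOT NULL
    )
""")
conn.execute("INSERT INTO users (username) VALUES (?)", ("alice",))
conn.execute(
    "INSERT INTO missed_messages (user_id, body) VALUES (?, ?)",
    (1, "hello"),
)

# The frequent query from the question: fetch a user's missed messages.
rows = conn.execute(
    "SELECT u.username, m.body "
    "FROM missed_messages m JOIN users u ON u.id = m.user_id"
).fetchall()
```

An indexed query like this avoids reloading any file after every change, which was the main cost of the one-big-file option; at ~10k users either SQLite or MySQL handles this comfortably.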
