Hello, I have developed a small game in Python using ZODB as the backend for DB processing. I have never done game programming before. I was hoping someone could tell me how I can save my current game and then reload it using Python. The database filename is data.fs, and there are three more ZODB files in my folder; one is for locks, the rest I'm not aware of.
I don't know your particular case, but maybe ZODB is overkill for this. Check pickle and anydbm first, both in the standard library. If they are not enough for your requirements, you can then go for ZODB.
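For the game-saving use case, the pickle route can be as small as this (a minimal sketch; the game_state dict is just a stand-in for whatever object holds your game's state):

    import pickle

    # stand-in for whatever object or dict holds your game's state
    game_state = {'player': 'Alice', 'level': 3, 'score': 1200}

    # save the game on exit
    with open('savegame.pkl', 'wb') as f:
        pickle.dump(game_state, f)

    # reload it on the next start
    with open('savegame.pkl', 'rb') as f:
        game_state = pickle.load(f)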
In ZODB, once you are connected, you can put anything you want inside as you would with a plain Python dictionary.
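For example, a minimal sketch of saving and reloading game state with ZODB could look like this (the extra files next to data.fs, such as the .lock, .index and .tmp files, are maintained by ZODB itself and can be left alone; the game data below is made up):

    import transaction
    from ZODB import FileStorage, DB

    storage = FileStorage.FileStorage('data.fs')
    db = DB(storage)
    connection = db.open()
    root = connection.root()

    # save: the root behaves like a dictionary
    root['current_game'] = {'player': 'Alice', 'level': 3, 'score': 1200}
    transaction.commit()

    # reload on the next run: just read the key back
    saved = root.get('current_game')
    print(saved)

    connection.close()
    db.close()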
You can also use repoze.zodbconn to facilitate the configuration, the connection at startup, and closing it properly when your program exits.
I have a question and hope someone can point me in the right direction. Basically, every week I have to run a query (SSMS) to get a table containing some information (date, client number, client ID, order ID, etc.), and then I copy all the information from that table and paste it in a folder as a CSV file. It takes me about 15 minutes to do all this, but I keep thinking: can I automate this? If yes, how can I do that, and can I also schedule it so it runs by itself every week? I believe we live in a technological era and this should be done without human input, so I hope I can find someone here willing to show me how to do it using Python.
Many thanks for considering my request.
This should be pretty simple to automate:
Use a database adapter that can work with your database; for MSSQL, the one delivered by pyodbc will be fine,
Within the script, connect to the database, perform the query and parse the output,
Save the parsed output to a .csv file (you can use the csv Python module; see the sketch after this list),
Run the script as a periodic task using cron/schtasks if you work on Linux/Windows respectively.
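Putting the first three steps together, a rough sketch could look like the following (the connection string, table and column names are placeholders to adapt to your own server and query):

    import csv
    import pyodbc

    # placeholder connection string - adjust driver, server and database
    conn = pyodbc.connect(
        'DRIVER={ODBC Driver 17 for SQL Server};'
        'SERVER=your_server;DATABASE=your_db;Trusted_Connection=yes;'
    )
    cursor = conn.cursor()
    # placeholder query - use the query you currently run in SSMS
    cursor.execute('SELECT date, clientnumber, clientID, orderid FROM orders')

    with open('weekly_report.csv', 'w', newline='') as f:
        writer = csv.writer(f)
        writer.writerow([col[0] for col in cursor.description])  # header row
        writer.writerows(cursor.fetchall())

    conn.close()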
Please note that your question is too broad, and shows no research effort.
You will find that Python can do the tasks you desire.
There are many different ways to interact with SQL servers, depending on your implementation. I suggest you learn Python+SQL using the built-in sqlite3 library. You will want to save your query as a string and pass it into an SQL connection manager of your choice; this depends on your server setup, and there are many different SQL packages for Python.
You can use pandas for parsing the data and saving it to a .csv file (the method is literally called to_csv).
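As a small illustration of that route, assuming a local sqlite3 database file and a made-up table name:

    import sqlite3
    import pandas as pd

    # placeholder database file and query
    conn = sqlite3.connect('example.db')
    df = pd.read_sql_query(
        'SELECT date, clientnumber, clientID, orderid FROM orders', conn
    )
    conn.close()

    # pandas writes the CSV in a single call
    df.to_csv('weekly_report.csv', index=False)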
Python does have many libraries for scheduling tasks, but I suggest you hold off for a while. Develop your code in a way that it can be run manually, which will still be much faster/easier than doing it without Python. Once you know your code works, you can easily implement a scheduler. The downside is that your program will always need to be running, and you will need to keep checking to see if it is running. Personally, I would keep it restricted to manually running the script; you could compile it to a .exe and bind it to a hotkey if you need the accessibility.
Okay, so basically I am creating a website. The data I need to display on this website is delivered twice daily; each time, I need to read the delivered data from a file and store the new data in the database (in place of the old data).
I have created the Python functions to do this. However, I would like to know what the best way would be to run this script while my Flask application is running. This may have a very simple answer, but I have seen some answers saying to incorporate the script into the website design (though these answers didn't explain how), and others saying to run it separately. The script needs to run automatically throughout the day with no monitoring or input from me.
TIA
Generally it's a really bad idea to have a web server, which is the Flask application in your case, handle such tasks. There are many reasons for it; just to name a few:
Python's Achilles heel: the GIL.
Sharing the application's system resources between users and other operations.
Crashes: they happen; they may be unlikely, but they do, and if you are not careful, the web application goes down along with them.
So with that in mind, I'd advise you to ditch this idea and use crontabs. Basically, write a script that does whatever transformations or operations it needs to do and create a cron job that runs it at the desired time.
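As a concrete illustration, assuming your existing import functions live in a standalone script called update_data.py (the path, log file and times below are placeholders), a crontab entry for a twice-daily run could look like this:

    # run the import at 06:00 and 18:00 every day
    0 6,18 * * * /usr/bin/python3 /path/to/update_data.py >> /var/log/update_data.log 2>&1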
I have been studying programming for a few years and I am now working on my first desktop application. I am making a simple program that is able to keep track of information pertaining to DND (Dungeons and Dragons) characters. I want to find a way to store information about these characters so that the next time the application is launched, the characters will be saved. How do things like Spotify save information about each user? First, I will give some info about the program itself. I have written it in Python and it is organized as follows:
A file which serves as the brain of the application (app.py)
A file which defines a class representing a character
A file defining a class that is used to find information about the characters
Other files defining classes used to build the UI
So far in my studies, I have only gathered input from txt files, input functions, and APIs via requests. I have worked with JSON before and am thinking it may be an option, but I am not sure how it would work in this case. I also had the idea of storing data in txt files, but I want to learn the way it is done in the real world in order to make the best use of my time.
TLDR: I am making a desktop application using Python and want an effective and common way to store information that I need to access the next time the program is run. I am looking for a local way to save the data that is also SAFE. If you have a recommendation that is server/cloud based, I would still like to hear how it may be done, as I am sure the knowledge will still be beneficial. I am looking for a way to SAFELY store information that will be saved even after the application is terminated. Any advice or anything you have personally used is appreciated.
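To illustrate the JSON idea mentioned above, something like this is what I imagine (the file name and character fields are simplified placeholders, not my actual classes):

    import json

    SAVE_FILE = 'characters.json'

    def save_characters(characters):
        # characters is a list of dicts produced from the character class
        with open(SAVE_FILE, 'w') as f:
            json.dump(characters, f, indent=2)

    def load_characters():
        try:
            with open(SAVE_FILE) as f:
                return json.load(f)
        except FileNotFoundError:
            return []  # first launch: no saved characters yet

    # example usage
    save_characters([{'name': 'Tharivol', 'class': 'Wizard', 'level': 5}])
    print(load_characters())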
I have to set up a program which reads in some parameters from a widget/GUI, calculates some stuff based on database values and the input, and finally sends some ASCII files via FTP to remote servers.
In general, I would suggest a Python program for these tasks: write a Qt widget as a GUI (interactively changing views, putting numbers into tables, setting up check boxes, switching between various layers; I have never done something as complex in Python, but I have some experience in IDL with event handling etc.), and set up data classes that have functions both to create the ASCII files with the given convention and to send the files via FTP to some remote server.
However, since my company is a bunch of Windows users, each sitting at their personal desktop, installing Python and all the necessary libraries on each individual machine would be a pain in the ass.
In addition, in a future version the program is supposed to become smart and do some optimization 24/7. Therefore, it makes sense to put it on a server. As I personally would rather use Linux, the server is already set up using Ubuntu Server.
The idea is now to run my application on the server. But how can the users access and control the program?
The easiest way for everybody to access something like a common control panel would be a browser, I guess. I have to make sure only one person at a time is sending signals to the same units, but that should be doable via flags in the database.
After some googling, next to QtWebKit, Django seems to be the first choice for such a task. But...
Can I run a full-fledged Python program underneath my web application? Is Django the right tool to do so?
As mentioned previously, in the (intermediate) future (~1 year), we might have to implement some computationally expensive tasks. Is it then also possible to utilize C as one would within normal Python?
Another question I have is on the development. In order to become productive, we have to advance in small steps. Can I first create regular Python classes, which can later be imported into my web application? (The same question applies to widgets/Qt.)
Finally: Is there a better way to go? Any standards, any references?
Django is a good candidate for the website, however:
It is not a good idea to run heavy functionality from a website; it should happen in a separate process.
All functions should be asynchronous, i.e. you should never wait for something to complete.
I would personally recommend writing a separate process with a message queue; the website would only ask that process for statuses and always display a result immediately to the user.
You can use ajax so that the browser will always have the latest result.
ZeroMQ or Celery are useful for implementing this functionality (a minimal Celery sketch follows below).
You can implement functionality in C pretty easily. I recommend, however, that you write that functionality as pure C with a SWIG wrapper rather than writing it as an extension module for Python. That way the functionality will be portable and not dependent on the Python website.
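Here is a minimal sketch of the separate-process idea using Celery (the broker URL, task name and polling pattern are assumptions to adapt, not a prescribed setup):

    from celery import Celery

    # placeholder broker/result backend - e.g. a local Redis instance
    app = Celery('tasks',
                 broker='redis://localhost:6379/0',
                 backend='redis://localhost:6379/0')

    @app.task
    def heavy_computation(job_id):
        # long-running work goes here (database reads, optimization,
        # creating the ASCII files, FTP upload, ...)
        return {'job_id': job_id, 'status': 'done'}

    # In a Django view you would only enqueue the work and return immediately:
    #     result = heavy_computation.delay(42)
    # and later poll result.ready() / result.get() from an ajax endpoint.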
This is specifically geared towards managing MP3 files, but it should easily work for any directory structure with a lot of files.
I want to find or write a daemon (preferably in Python) that will watch a folder with many subfolders that should all contain X number of MP3 files. Any time a file is added, updated or deleted, it should reflect that in a database (preferably PostgreSQL). I am willing to accept that, if a file is simply moved, the respective rows are deleted and recreated anew, but updating the existing rows would make me the happiest.
The Stack Overflow question Managing a large collection of music has a little of what I want.
I basically just want a database that I can then do whatever I want to with. My most up-to-date database as of now is my iTunes.xml file, but I don't want to rely on that too much as I don't always want to rely on iTunes for my music management. I see plenty of projects out there that do a little of what I want but in a format that either I can't access or is just more complex than I want. If there is some media player out there that can watch a folder and update a database that is easily accessible then I am all for it.
The reason I'm leaning towards writing my own is because it would be nice to choose my database and schema myself.
Another answer already suggested pyinotify for Linux; let me add watch_directory for Windows (a good discussion of the possibilities in Windows is here; the module is an example) and fsevents on the Mac (unfortunately, I don't think there's a single cross-platform module offering a uniform interface to these various system-specific ways to get directory-change notification events).
Once you manage to get such events, updating an appropriate SQL database is simple!-)
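For the database side, the update per event really is only a few lines; a rough sketch with psycopg2 and a made-up tracks table:

    import psycopg2

    def record_file_added(path):
        # placeholder connection parameters and table/column names
        conn = psycopg2.connect(dbname='music', user='music')
        cur = conn.cursor()
        cur.execute('INSERT INTO tracks (path) VALUES (%s)', (path,))
        conn.commit()
        cur.close()
        conn.close()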
If you use Linux, you can use PyInotify.
inotify can notify you about filesystem events when your program is running.
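A rough sketch of what that looks like with pyinotify (the folder path is a placeholder, and the print calls mark where the database updates would go):

    import pyinotify

    class MusicHandler(pyinotify.ProcessEvent):
        def process_IN_CREATE(self, event):
            print('added:', event.pathname)    # insert a row in the database here

        def process_IN_DELETE(self, event):
            print('removed:', event.pathname)  # delete the row here

        def process_IN_MODIFY(self, event):
            print('updated:', event.pathname)  # update the existing row here

    wm = pyinotify.WatchManager()
    mask = pyinotify.IN_CREATE | pyinotify.IN_DELETE | pyinotify.IN_MODIFY
    wm.add_watch('/path/to/music', mask, rec=True, auto_add=True)

    pyinotify.Notifier(wm, MusicHandler()).loop()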
IMO, the best media player that has these features is Winamp. It rescans the music folders every X minutes, which is enough for music (but of course a little less efficient than letting the operating system watch for changes).
But as you were asking for suggestions on writing your own, you could make use of pyinotify (Linux only). If you're running Windows, you can use the ReadDirectoryChangesW API call.