I have a program that I wrote in Python that collects data. I want to store the data on the internet somewhere and allow another user to access it from any other computer with an internet connection, anywhere in the world. My original idea was to use an e-mail account, such as Gmail, to store the data by sending pickled strings to the address. Anyone could then access the account and simply read the newest e-mail to get the data. It worked perfectly, but the program needs to send a new e-mail every 5-30 seconds, so the method fell through because of Gmail's limit on e-mails, among other reasons (for example, I was unable to completely delete old e-mails).
Now I want to try a different idea, but I do not know very much about network programming with Python. I want to set up a webpage with essentially nothing on it. The "master" program, the one actually collecting the data, will send a pickled string to the webpage, and any of the "remote" programs will then be able to read that string. I will also need the master program to delete old strings as it updates the webpage. Ideally it would store multiple strings, so there is no chance of the master updating while a remote is reading.
I do not know if this is a feasible task in Python, but any and all ideas are welcome. Also, if you have any ideas on how to do this a different way, I am all ears (well, eyes in this case).
I would suggest taking a look at setting up a simple site on Google App Engine. It's free and you can build the site in Python. Then it would just be a matter of creating a simple RESTful service that you could send a POST to with your pickled data and store it in a database. Then just put a simple web front end onto the database.
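As a rough illustration (not part of the original answer), such a service could look something like the Flask sketch below; the route names, the SQLite storage, and the use of JSON instead of pickle are all my own assumptions.

    # Sketch of a tiny REST service the "master" program could POST to and the
    # "remote" programs could GET from. Route names, SQLite storage and the
    # use of JSON instead of pickle are assumptions for illustration only.
    import json
    import sqlite3
    from flask import Flask, request, jsonify

    app = Flask(__name__)
    DB = "readings.db"

    def init_db():
        with sqlite3.connect(DB) as conn:
            conn.execute("CREATE TABLE IF NOT EXISTS readings "
                         "(id INTEGER PRIMARY KEY AUTOINCREMENT, payload TEXT)")

    @app.route("/readings", methods=["POST"])
    def store_reading():
        payload = request.get_json(force=True)   # data sent by the master program
        with sqlite3.connect(DB) as conn:
            conn.execute("INSERT INTO readings (payload) VALUES (?)",
                         (json.dumps(payload),))
            # Keep only the most recent rows so old data does not pile up.
            conn.execute("DELETE FROM readings WHERE id NOT IN "
                         "(SELECT id FROM readings ORDER BY id DESC LIMIT 10)")
        return jsonify(status="ok")

    @app.route("/readings/latest", methods=["GET"])
    def latest_reading():
        with sqlite3.connect(DB) as conn:
            row = conn.execute("SELECT payload FROM readings "
                               "ORDER BY id DESC LIMIT 1").fetchone()
        return jsonify(payload=json.loads(row[0]) if row else None)

    if __name__ == "__main__":
        init_db()
        app.run()

The master would then POST its latest reading with something like requests.post(url, json=data), and each remote would GET /readings/latest on whatever schedule it likes.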
Another option in addition to what Casey already provided:
Set up a remote MySQL database somewhere with user access levels that allow remote connections. Your Python program could then simply connect to the database and INSERT the data you're trying to store centrally (e.g. through the MySQLdb or pyodbc package). Your users could then either read the data with any client that supports MySQL, or you could write a simple front end in Python or PHP that displays the data from the database.
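A rough sketch of what the INSERT side could look like with MySQLdb; the host, credentials and table layout are placeholders, not values from this answer.

    # Sketch: the master program inserting a reading into a remote MySQL
    # database with MySQLdb. Host, credentials and table are placeholders.
    import MySQLdb

    conn = MySQLdb.connect(host="db.example.com", user="collector",
                           passwd="secret", db="telemetry")
    try:
        cur = conn.cursor()
        cur.execute("INSERT INTO readings (recorded_at, payload) "
                    "VALUES (NOW(), %s)", ("42.7",))
        conn.commit()
    finally:
        conn.close()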
Adding this as an answer so that OP will be more likely to see it...
Make sure you consider security! If you just blindly accept pickled data, it can open you up to arbitrary code execution.
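One safer alternative, sketched here as my own suggestion rather than part of the warning above, is to serialize with the json module instead of pickle: json.loads only ever builds plain data structures, so a malicious sender cannot smuggle in executable objects.

    # Sketch: JSON instead of pickle for data exchanged over the network.
    import json

    data = {"sensor": "temp-01", "value": 42.7, "ts": "2013-05-01T12:00:00Z"}

    wire = json.dumps(data)        # what the master sends
    received = json.loads(wire)    # what the remote reads back
    assert received == data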
I suggest you use good middleware such as ZeroC Ice, Pyro4, or Twisted.
Pyro4 can use pickle to serialize data.
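As a minimal Pyro4 sketch (the object and method names are made up): the server exposes a small object holding the latest reading, and clients call it remotely. Note that recent Pyro4 versions default to the serpent serializer; pickle has to be enabled explicitly and carries the usual security caveats.

    # Sketch of a Pyro4 server holding the latest reading.
    import Pyro4

    @Pyro4.expose
    class DataStore(object):
        def __init__(self):
            self.latest = None

        def put(self, payload):
            self.latest = payload

        def get(self):
            return self.latest

    if __name__ == "__main__":
        daemon = Pyro4.Daemon(host="0.0.0.0")   # listen on all interfaces
        uri = daemon.register(DataStore())      # returns a PYRO: uri
        print("DataStore available at", uri)
        daemon.requestLoop()

    # A client elsewhere would then do something like:
    #   store = Pyro4.Proxy("PYRO:obj_...@server-host:port")  # uri printed above
    #   store.put({"value": 42.7})
    #   print(store.get())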
I've been learning Pygame for quite a long time. Now I'm working on letting players create accounts for the game and log in to them when they're playing online.
I've seen some game developers store players' data in text files, but that's a very old-school way and it's insecure! So I thought about databases...
However, I think I'm misunderstanding how databases work in general. In fact, I was thinking of them as servers, but after doing some Google research I realized they're not. Let's take SQLite as an example (I just learned the basics of the sqlite3 Python module on the fly).
I thought it stores data on a server or in the 'cloud' (like Apple iCloud), but then I realized it just stores the data on the computer's disk. That won't let players connect to their accounts if they're using the game from a computer other than the one they signed up on, for example, which is a realistic case. I want to cover as many cases as possible when connecting to the game in order to ensure a good player experience.
So, is there any way to ensure players can connect to the game from any device?
And do you think I have to use the socket module?
NOTE:
Keep in mind that the game itself isn't multiplayer and doesn't need Internet connection. However, I want the players to be able to connect to their accounts from any device just to be able to save their progress in the game.
There are a lot of different approaches to this. As Rahul Hindocha pointed out, you need some kind of backend.
Let me list the ones I can think of. Note that my answer will be slightly Google flavored as I have most experience there.
A Flask/Django/Falcon app running in the cloud. This backend could use either local storage or a SQL implementation to store the data. You need to take care of data persistence when using cloud services though, as e.g. Google App Engine and Heroku dynos both have ephemeral filesystems, which means all data stored on the local instance gets deleted regularly when the instance gets reset. A Google Compute Engine (GCE) instance would work though, as its filesystem is persistent. (A minimal sketch of this option follows after this list.)
Use something like Google CloudSQL or a GCE instance running SQL and connect to that from your app.
Use another Cloud storage option. There's tons of companies that offer online storage and python libraries to connect to them. There's also free tiers. Think Amazon AWS (S3), Google Cloud (GCS/Firestore), MS Azure, Heroku, etc...
Google Drive to store login information. I've done this in the past before I switched to using GCS buckets. You need to encrypt/decrypt the data in your app though, as storing plain text is just plain stupid. I did this by putting the data in a JSON string, encrypting the complete JSON string using PyCrypto, and uploading it; then downloading and decrypting it when I needed it. See this SO answer for a short explanation. Google offers a library for this.
Google Sheets to store login information. Put the relevant information into columns, with a row for each user. You can also use the encrypt/decrypt here to make sure SPI/PII is encrypted in transit and at rest. Google offers a library for this, and there's gspread which makes it a bit easier still.
Raspberry Pi running the app mentioned in number 1 above. Note that you'll need to expose it to the internet and keep it up and running. This is not something I would do lightly (the exposing part).
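Here is the minimal sketch of option 1 mentioned above: a Flask backend storing player accounts and progress in SQLite. The route names, table layout and the bare sha256 password hash are assumptions for illustration (a real app would salt the password and use a slow hash).

    # Sketch of option 1: Flask + SQLite backend for accounts and progress.
    import sqlite3
    from hashlib import sha256
    from flask import Flask, request, jsonify

    app = Flask(__name__)
    DB = "players.db"

    def db():
        conn = sqlite3.connect(DB)
        conn.execute("CREATE TABLE IF NOT EXISTS players "
                     "(name TEXT PRIMARY KEY, pw_hash TEXT, progress TEXT)")
        return conn

    @app.route("/signup", methods=["POST"])
    def signup():
        body = request.get_json(force=True)
        pw_hash = sha256(body["password"].encode()).hexdigest()  # simplified!
        with db() as conn:
            conn.execute("INSERT INTO players VALUES (?, ?, ?)",
                         (body["name"], pw_hash, "{}"))
        return jsonify(status="created")

    @app.route("/progress/<name>", methods=["GET", "PUT"])
    def progress(name):
        with db() as conn:
            if request.method == "PUT":
                conn.execute("UPDATE players SET progress=? WHERE name=?",
                             (request.get_data(as_text=True), name))
                return jsonify(status="saved")
            row = conn.execute("SELECT progress FROM players WHERE name=?",
                               (name,)).fetchone()
        return jsonify(progress=row[0] if row else None)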
As for playing offline, you can make the upload a background thread that just retries every x minutes, and have the main thread run your game logic, storing whatever needs to be uploaded on the local filesystem to cache it for upload:
Main game thread saves data into e.g. a JSON file, to_be_uploaded.json
Background upload process checks for internet connection
Then checks if to_be_uploaded.json has content
If 2 & 3 are both true, remove the contents from to_be_uploaded.json and store them in pending_upload.json
Try upload.
If successful, remove pending_upload.json and go to #2. If not, go to #5
The reason for #4 above is to avoid data contention, where the main game thread tries to add to to_be_uploaded.json while the background thread already has the object loaded and then uploads it. If the background thread then deleted the contents of to_be_uploaded.json, it would lose the write from the main thread. Moving the data into a secondary cache file, pending_upload.json, removes this problem, unless we're talking about very frequent writes to to_be_uploaded.json; in that case you would need to look into file locking methods. A rough sketch of this loop is below.
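This is only a sketch of the steps above: upload_to_backend() and have_connection() are hypothetical placeholders, and the file names follow the list.

    # Sketch of the background upload loop described in the steps above.
    import json
    import os
    import threading
    import time

    TO_UPLOAD = "to_be_uploaded.json"   # written by the main game thread
    PENDING = "pending_upload.json"     # owned by the background thread

    def have_connection():
        return True          # placeholder: e.g. try a cheap HTTP request

    def upload_to_backend(data):
        raise NotImplementedError  # placeholder: POST to the backend you chose

    def uploader_loop(interval=300):
        while True:
            try:
                if have_connection():
                    # Step 4: if nothing is pending yet, move the cache aside
                    # so the main thread can keep writing to_be_uploaded.json.
                    if (not os.path.exists(PENDING)
                            and os.path.exists(TO_UPLOAD)
                            and os.path.getsize(TO_UPLOAD) > 0):
                        os.replace(TO_UPLOAD, PENDING)
                    # Steps 5-6: try the upload; on success remove the pending
                    # file, on failure keep it and retry on the next pass.
                    if os.path.exists(PENDING):
                        with open(PENDING) as f:
                            upload_to_backend(json.load(f))
                        os.remove(PENDING)
            except (OSError, ValueError, NotImplementedError):
                pass
            time.sleep(interval)

    threading.Thread(target=uploader_loop, daemon=True).start()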
There's other ways to go about this, but this is what springs to mind after thinking for a couple of minutes...
You would need to create a backend for your game. You can use the Flask or Django framework for this. You send the player data to the server so that it can store it in the database, and when the user signs in to their account, the server sends the data back to the game so that it can load the user's progress.
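From the game's side, talking to such a backend could look roughly like the sketch below (the URL and payload shape are placeholders; in practice you'd use HTTPS and a proper auth token rather than sending the password on every call).

    # Sketch of the game client calling a hypothetical backend with requests.
    import requests

    BASE = "https://example.com/api"

    def save_progress(name, password, progress):
        r = requests.post(BASE + "/progress",
                          json={"name": name, "password": password,
                                "progress": progress},
                          timeout=10)
        return r.status_code == 200

    def load_progress(name, password):
        r = requests.get(BASE + "/progress",
                         params={"name": name, "password": password},
                         timeout=10)
        return r.json() if r.ok else None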
I have JS running that essentially gets user entries from my HTML session storage and pushes them to a DB. I also need to use an HTTP request to pass a JSON object containing the entries to a Python file hosted somewhere else.
Does anyone know of documentation I could look at, or how to get JSON objects from JS to Python?
My client does not want me to grab the variables directly from the DB.
You have to create some sort of communication channel between the JavaScript and Python code. This could be anything: SOAP, HTTP, RPC, any number and flavor of message queue.
If nothing like that is in place, it's quite the long way around. A complex application might warrant doing this; think microservices communicating across some sort of service bus. It's a sound strategy, and perhaps that's why your client is asking for it.
You already have Firebase, though! Firebase is a real-time database that already has many of the characteristics of a queue. The simplest and most idiomatic thing to do would be to let the Python code be notified of changes by Firebase: Firebase as a service bus is a nice strategy too!
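If you do end up going the plain-HTTP route instead, the Python side can stay very small. As a sketch (the route name and the choice of Flask are my assumptions, not your client's setup):

    # Sketch: a tiny Flask endpoint that receives the JSON object the JS
    # sends with fetch()/XMLHttpRequest.
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    @app.route("/entries", methods=["POST"])
    def receive_entries():
        entries = request.get_json(force=True)   # the JSON posted by the JS
        # ... hand the entries to whatever processing you need in Python ...
        return jsonify(received=len(entries))

    if __name__ == "__main__":
        app.run()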
I've got a question concerning Python and MySQL. I come from a PHP background and I am wondering what the workflow is like in Python. I can't find any good answers on the web and hope somebody can help me understand this. So let me quickly explain what I'm stuck with:
In PHP I did a lot of little things in combination with MySQL, meaning loading data from a database and writing to it. As long as the server on which the PHP files were stored was correctly set up, it was safe to do that. The connection details for the database, including the username, server name, password and database name, were saved in the PHP file. Since PHP files are stored on the server and the source code is never shown to the user, the user couldn't see the authentication data needed to connect to the database.
Now I am wondering how that whole concept can be transferred to Python in a secure way, so that the user can't see the authentication data in the source code.
I plan to write a Python program in which the user has to authenticate. Let's assume I created a MySQL database on a webserver and the user can log in from the Python program. As soon as the user clicks the login button, a connection to the web database is made. That would mean that in my source code I need to write down the necessary data, like the username, password, database name and server name for that specific database. Here is my question: that would mean that everybody could see the authentication data, which would be very insecure, wouldn't it? Even if the user only has a .pyc file, he could decompile it back to the original .py file and see all that very sensitive data.
So I was wondering how to securely hide that authentication data from the user who will later use my Python program.
As a Pythoneer who long ago worked in PHP, I think I know what you are after.
First of all, the HTML code will not contain any database credentials unless you put them into the HTML view. An easy way to structure what goes into the HTML views is to use a framework like Django. It handles the MVC of web applications and manages connections to databases.
If you want your database credentials to be very safe, you can have your web application ask for them at startup, so they are never written down in a file. Keep them secure using KeePass or a similar password storage system.
That way they are also never checked in to any version control system, which is the most common place for database password leaks.
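One way to do the "ask at startup" idea, sketched with environment variables and getpass (the variable names are my own choice):

    # Sketch: read database credentials from the environment if set,
    # otherwise prompt for them, so they never live in source or in VCS.
    import os
    from getpass import getpass

    db_user = os.environ.get("DB_USER") or input("Database user: ")
    db_pass = os.environ.get("DB_PASS") or getpass("Database password: ")

    # ...then hand db_user/db_pass to your MySQL connector or Django settings.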
If you are a newbie to webapp programming in Python, I would suggest following a Django tutorial; it should help you get on track.
What is the easiest way to write a simple Python daemon/server-side program that, in a reasonably secure way, processes incoming messages from an email account? For example, if you have an account 'foo#bar.org' and the program has the username/password, you want the program to read the contents of incoming email and save them to a database (e.g. with sqlite) in Python. What's the best framework/library for doing this? It sounds like it might be overkill to use Django for something so simple -- can it be done purely with the Python standard library?
Python has poplib (http://docs.python.org/2/library/poplib.html) and imaplib (http://docs.python.org/2/library/imaplib.html) for accessing mailboxes.
Then you have Lamson (http://lamsonproject.org/), which is excellent not only for sending and receiving mail; it can also help you with parsing messages and detecting whether they are spam or not. Look into Lamson's code to see exactly what you can do with it.
Then there are many examples of Python daemons, which you can run periodically to pick up mail using poplib/imaplib and then save it somewhere using SQLAlchemy or Django or whatever.
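A rough sketch of that approach using only the standard library (mail host, credentials and table layout are placeholders):

    # Sketch: poll an IMAP mailbox and save each unseen message into SQLite.
    import email
    import imaplib
    import sqlite3
    import time

    def store_new_messages(host, user, password, db_path="mail.db"):
        conn = sqlite3.connect(db_path)
        conn.execute("CREATE TABLE IF NOT EXISTS messages "
                     "(id INTEGER PRIMARY KEY, sender TEXT, subject TEXT, body TEXT)")
        imap = imaplib.IMAP4_SSL(host)
        imap.login(user, password)
        imap.select("INBOX")
        _, data = imap.search(None, "UNSEEN")
        for num in data[0].split():
            _, msg_data = imap.fetch(num, "(RFC822)")
            msg = email.message_from_bytes(msg_data[0][1])
            if msg.is_multipart():
                body = ""                      # simplified: skip multipart bodies
            else:
                body = (msg.get_payload(decode=True) or b"").decode(errors="replace")
            conn.execute("INSERT INTO messages (sender, subject, body) VALUES (?, ?, ?)",
                         (msg["From"], msg["Subject"], body))
        conn.commit()
        imap.logout()
        conn.close()

    while True:                                # run it as a simple polling daemon
        store_new_messages("imap.bar.org", "foo", "password")
        time.sleep(60)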
Or you could skip Python daemons and instead create a small Django project to do all of that. Combined with Celery (https://pypi.python.org/pypi/django-celery), you can create an excellent daemonized backend for accessing a mailbox via POP or IMAP and saving things to your own database.
I have a desktop Python application whose data backend is a MySQL database, but whose previous database was a network-accessed XML file (or files). When it was XML-powered, I had a thread spawned at the launch of the application that would simply check the XML file for changes, and whenever the date modified changed (because a user updated it), the app would refresh itself, so multiple users could see the changes as they went about their business.
Now the program has matured and is venturing toward an online presence so it can be used anywhere. XML is out the window and I'm using MySQL with SQLAlchemy as the database access method. The plot thickens, however, because the information is no longer stored in one XML file; it is split across multiple tables in the SQL database. This complicates the idea of some sort of 'last modified' value or structure. Thus the question: how do you inform the users that the data has changed and the app needs to refresh? Here are some of my thoughts:
Each table needs a last-modified column (this seems like the worst option ever)
A separate table that holds some last modified column?
Some sort of push notification through a server?
It should be mentioned that I am able to run a very small Python script on the same server hosting the SQL DB; perhaps the app could connect to it and (through sockets?) pass information to and from all connected clients.
Some extra information:
The information passed back and forth would be pretty low-bandwidth. Mostly text with the potential of some images (rarely over 50k).
Number of clients at present is very small, in the tens. But the project could be picked up by some bigger companies with client numbers possibly getting into the hundreds. Even still the bandwidth shouldn't be a problem for the foreseeable future.
Anyway, somewhat new territory for me, so what would you do? Thanks in advance!
As I understand it, this is not a client-server application, but rather an application that has a common remote storage.
One idea would be to change to web services (this would solve most of your problems in the long run).
Another idea (if you don't want to switch to the web) is to refresh the data in your interface periodically using a timer.
Another way (and more complicated) would be to have a server that receives all the updates, stores them in the database and then pushes the changes to the other connected clients.
The first 2 ideas you mentioned will have maintenance, scalability and design ugliness issues.
The last 2 are a lot better in my opinion, but I still stick with web services as the best option.
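To make your idea 2 plus the timer approach concrete, here is a minimal sketch (the table, column and connection details are made up; the query uses SQLAlchemy, which you already have):

    # Sketch: a single change_log table holds a last-modified timestamp, and
    # each client polls it on a timer and refreshes when it changes.
    import time
    from sqlalchemy import create_engine, text

    engine = create_engine("mysql+mysqldb://user:password@dbhost/appdb")

    def last_modified():
        with engine.connect() as conn:
            return conn.execute(text("SELECT MAX(updated_at) FROM change_log")).scalar()

    def poll_for_changes(refresh_callback, interval=10):
        seen = last_modified()
        while True:
            time.sleep(interval)
            latest = last_modified()
            if latest != seen:
                seen = latest
                refresh_callback()   # tell the UI to reload its data

Each writer would then also update change_log whenever it commits a change; that keeps the "last modified" bookkeeping in one table instead of adding a column to every table.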