Updater for Python Application

I'm currently developing a small tool that lets players of a particular game customize their game much quicker than browsing through the game files and editing each and every .ini file by hand. However, that is not the point here.
I'd like the application to be able to update itself: a function that downloads a .zip with the most recent version of my application and unzips it into the app's directory.
However, I need some sort of checking mechanism to determine whether there's a new version of my app or not. For that purpose, I've uploaded both a .html file and a .txt file containing just a 0, which would be changed to a 1 and re-uploaded to the web server whenever there's an update.
So my question is how you'd go about this: how would you read the text of the .txt file, determine whether it's 0 or 1, and, if the result is 1, proceed to downloading the newest version?
Any input is welcome,
Best regards,
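A minimal sketch of the check-and-download flow described above, assuming Python 3; the URLs and target directory are placeholders, not real locations:

import io
import urllib.request
import zipfile

# Hypothetical locations -- replace with wherever the flag file and the
# release archive are actually hosted.
FLAG_URL = "https://example.com/myapp/update.txt"  # contains "0" or "1"
ZIP_URL = "https://example.com/myapp/latest.zip"   # newest build of the app
APP_DIR = "."                                      # the app's own directory

def update_available():
    # Read the flag file and check whether its content is "1"
    with urllib.request.urlopen(FLAG_URL) as response:
        return response.read().decode("utf-8").strip() == "1"

def download_and_install():
    # Download the archive into memory and unpack it over the app directory
    with urllib.request.urlopen(ZIP_URL) as response:
        data = response.read()
    with zipfile.ZipFile(io.BytesIO(data)) as archive:
        archive.extractall(APP_DIR)

if __name__ == "__main__":
    if update_available():
        download_and_install()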

Related

Create direct file object in python

I have images in 100 folders and searching them is slow, so I want a faster way to access those images, possibly with Python (if it is faster), similar to selecting all the files and dragging and dropping them in Windows. Then I realized that drag and drop in Windows uses the Component Object Model (COM), according to this source.
So I want to know: is there any way in Python to gather COM references to the image files in those 100 folders in one place (a specific folder)? In other words, can we create COM objects for other files (the equivalent of shortcuts)? I know shortcuts won't work for my purpose.
In general, the question is how to access direct handles or COM objects for files from different folders in a single folder. If it's possible, please tell me how. Put simply, I want something that works like file shortcuts, but not the 'shortcuts' that exist in Windows, because those won't work for my purpose; I think it can be done with COM.
tkinter equivalent question:
Let me ask my question another way. Suppose I want to make a Windows file search application in Python with a library like tkinter. One background part of my code finds the file paths of the desired search results, and another part (the 'GUI part') shows the result files with the ability to open them from the GUI or drag them from the GUI to another folder or application. How should I do the 'GUI part'?
The tutorial suggested by @Thingamabobs is about getting external files into the app's window (GUI), but I want the opposite: having file handles to open, something like Windows Explorer.
My question may be off if I've misunderstood the concept of COM, so please point me to sources more relevant to my use case. Finally, if the title seems unsuitable, feel free to change it.
Based on an interpretation of the question, the following is an initial summary approach to a solution.
"""
This module will enable easy access to files spread across 100 plus
directories. A file should be as easy to open as clicking on a link.
Analysis:
Will any files be duplicated in any other directory? Do not know.
Will any file name be the same as another file in a different directory? Do
not know.
Initial design in pseudocode:
> Capture absolute path to each file in each directory.
> Store the file information in a Python data structure,
>   for instance a list of tuples: (<path>, <filename>).
> Once a data structure is chosen, use tkinter's ttk.Treeview to open a
>   file as easily as clicking a link in the tree.
"""

How to automate SAS enterprise guide reports with Python Script?

I tried with SASPy but it's not working. I am able to open the SAS .egp file, but not able to run the multiple scripts within it in sequence.
import subprocess

def OpenProject(sas_exe, egp_path):
    # Launch Enterprise Guide and open the .egp project
    subprocess.call([sas_exe, egp_path])

sas_exe = "path\\path\\"         # path to the Enterprise Guide executable
egp_path = "path\\path\\path\\"  # path to the .egp project
OpenProject(sas_exe, egp_path)
This depends a bit on exactly what the workflow is. A few side notes, then the full solution.
First: EGP is not really intended to store production processes, in my opinion. EGP should really be used for development, then production is done with .sas (text) files. EGP can directly store the nodes as .sas files; ask a new question about that if you want to know more, but it's pretty easy to figure out. Best practice is to have EGP save the code modules as .sas files, then run those - SASPy will easily do that for you.
Second: If you use SAS's built-in Git connectivity, then you can do this a bit more easily I suspect. Consider doing that if you already use Git for your other processes. Again, then you end up with a .sas file, and can directly run that via SASPy.
So: how can you do this in Python, assuming you do have to use the .egp itself, without too many moving parts? The key here is the .egp format. An EGP is actually a .zip container that holds, among other things, all of the SAS code you want to run, as text. Text in XML format, but still text.
You can write a Python program that opens the .egp as a .zip file using the zipfile library, and then use xml.etree.ElementTree to parse the project.xml file inside it. Exactly what you do from there depends on your particular details and is well out of scope for a Stack Overflow answer, but if you do better visually you can simply rename the .egp to .zip, open it in the unzip program of your choice, then browse project.xml in your text editor and find the nodes and the code related to those nodes.
You can then extract the .sas code as text, and submit it directly via SASPy, or extract it to a .sas file and then submit that however you prefer (SASPy or something else).
I do something similar to this for a project - I don't actually run code from it, I'm just parsing it to verify that the correct programs were synced from the EGP to production - but it would be trivial to actually submit the code from what I've written, which is about 50 lines total. I may write an SGF paper on this topic this year or next, in which case I'll try to remember to link it here - or you can head over to my GitHub page and see if it's there (in the future!).
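As a rough illustration of the zipfile + ElementTree approach described above (the project path and the tag-matching heuristic are assumptions; inspect your own project.xml to find the tags that actually hold your code nodes):

import zipfile
import xml.etree.ElementTree as ET

EGP_PATH = "myproject.egp"  # hypothetical path to the Enterprise Guide project

with zipfile.ZipFile(EGP_PATH) as egp:
    # The container holds project.xml plus the embedded code, logs and results.
    print(egp.namelist())

    # Parse the project metadata. The "code" tag filter below is only a guess --
    # browse project.xml in a text editor to see how your nodes are named.
    root = ET.fromstring(egp.read("project.xml"))
    for elem in root.iter():
        if "code" in elem.tag.lower() and elem.text and elem.text.strip():
            print("--- code node:", elem.tag, "---")
            print(elem.text)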

Data storage for standalone python application

I want to make a Python program (with a PyQt GUI, but I don't know whether that is relevant) that needs to save some information that should persist even after the program closes. Examples of the information I want to store:
The user can search for a file in a file dialog window. I want to start the file dialog window in the previously used directory, even if the program is closed in between file searches.
The user can enter their own categories to sort items, building up on some of my predefined categories. These new categories should be available the next time the program starts.
Now I'm wondering what the proper way to store such information is. Should I use pickle? A proper database (I know a tiny bit of sqlite3, but would have to read up on that)? A simple text file that I parse myself? One thing for data like in example 1., another for data like in example 2.?
Also, whatever way to store it I use, where would I put that file?
I'm asking in the context that I might want to later make my program available to others as a standalone application (using py2app, py2exe or PyInstaller).
Right now I'm just saving a pickle file in the directory that my .py file is in, like this answer recommends, but the answer also specifically mentions:
for a personal project it might be enough.
(emphasis mine)
Is using pickle also the "proper, professional" way, if I want to make the program available to other people as a standalone application?
The choice depends on your approach to the data you store. Which of these is yours?
the user should be able to alter it without using my program
the user should be prevented from altering it with any program other than mine
If the first, you might consider the JSON open-standard file format, for which Python has a ready-made library called json. In effect you get text (which you can save to a file) that is human-readable and can be edited in a text editor. There are also JSON viewers and editors that make viewing/editing JSON files easier.
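For example, a minimal sketch of that approach; the settings.json file name and the keys are made up for illustration:

import json
import os

SETTINGS_FILE = "settings.json"  # hypothetical location next to the application

def load_settings():
    if os.path.exists(SETTINGS_FILE):
        with open(SETTINGS_FILE, "r", encoding="utf-8") as f:
            return json.load(f)
    # Defaults used on first run
    return {"last_directory": os.path.expanduser("~"), "categories": []}

def save_settings(settings):
    with open(SETTINGS_FILE, "w", encoding="utf-8") as f:
        json.dump(settings, f, indent=2)

settings = load_settings()
settings["last_directory"] = "/some/new/path"
settings["categories"].append("my custom category")
save_settings(settings)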
I think SQLite3 is the better solution in this case, as Moldovan commented.
There is a problem with pickle: the pickling format can change across Python versions, and there are greater advantages to using sqlite3.
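If you go the sqlite3 route, a minimal sketch could look like this; the settings.db file name and the key/value schema are only illustrative:

import sqlite3

conn = sqlite3.connect("settings.db")  # hypothetical database file
conn.execute("CREATE TABLE IF NOT EXISTS settings (key TEXT PRIMARY KEY, value TEXT)")

def set_setting(key, value):
    # INSERT OR REPLACE keeps a single row per key
    conn.execute("INSERT OR REPLACE INTO settings (key, value) VALUES (?, ?)", (key, value))
    conn.commit()

def get_setting(key, default=None):
    row = conn.execute("SELECT value FROM settings WHERE key = ?", (key,)).fetchone()
    return row[0] if row is not None else default

set_setting("last_directory", "/home/user/documents")
print(get_setting("last_directory"))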

Is there a django app that provides a file chooser for the files on the server?

I need a component that's a browser-based file browser, and I expect some django app to currently provide this. Is there such a thing?
The full story:
I'm building a django app that is used for testing. I want to use it to serve files (and strings, and etc.) and attach custom headers to it.
Currently, I have a model FileSource which has a single file_path field, which is of type django.db.models.FileField.
When creating a FileSource from the admin, the user gets a nice file upload dialog, and on save the file they chose is saved on the server (in a really odd location, inside the directory where Django is installed or somewhere like that, because I didn't customize the storage, and it doesn't help me in any way).
My problem: I only want to use the file dialog for the user to select a full path on the server. The file the user chose must only be referenced, not copied (as it is currently), and it must reside on the server.
The server must thus be able to list the files it has, so I basically need a little browser-based file browser.
At that point, I expect to be able to save a full path in my DB, and then I'll be able to access that file and serve it (together with whatever custom headers the user chooses in my app).
As you might know, browsers always lie about the full path of the file. Chromium prepends "C:\fakepath" to the file name, so I need backend support to accomplish this.
Also, I checked out django-filebrowser and django-filer, and from what I understood, they weren't built for this. If I'm wrong, a little assistance in configuring them would be awesome.
You can use a FilePathField for that. It won't upload a file, but rather allow you to choose a pre-existing file on the server. A caveat is that you can only use one directory. If you need multiple directories, then you'd need to go with something like django-filer.
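As a rough sketch of that suggestion (the directory below is hypothetical; FileSource and file_path mirror the names from the question):

from django.db import models

class FileSource(models.Model):
    # FilePathField stores only the chosen path; nothing is uploaded or copied.
    # recursive=True lets the form list files in subdirectories of `path` as well.
    file_path = models.FilePathField(path="/srv/test-files", recursive=True)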

web2py site doesn't load all images/videos (especially larger ones)

First of all, I must say I have seen something similar to this in the web2py discussion group, but I couldn't understand it very well.
I've set up a database-driven website using web2py in which the entries are just HTML text. Most of them will contain img and/or video tags that point to relative URLs; these files are stored in folders with the address pattern static/content/article/<article-name> and the document's base href is set via the controller to make these links work. So, the images are stored and referenced directly, without all the upload/download machinery.
I'm testing it locally, using the Rocket server, because I'm not allowed to install Apache on this PC.
The problem:
Everything works fine, except, it seems, when several "large" files are being requested. By "large" I mean 4 MB files were enough, which isn't really a lot (and I think slightly smaller files would produce the same result). I'm pretty sure the links aren't broken since 1) copying/pasting their URLs into the browser shows them normally, 2) the images/videos appear fine/broken at random when I refresh the page, and 3) sometimes a video loads up to a certain point and then stops, and the browser inspector shows a 'fail' signal. When I replaced these files with smaller ones (each a dozen kB), all of them loaded. Another thing to consider is that sometimes it takes a really long time for the page to finish loading (from 2 seconds to several minutes).
The questions:
Is this the simplest/optimal way of getting the job done? I'm aware that web2py has some neat features like upload fields, but I don't know how I could make these files effortlessly referenced in the document, considering there will be some special features on such pages involving the static files. So the solution I've come up with so far is to create a directory whose name equals the entry's and store the files there, as I said before. Is that overkill considering what web2py has to offer?
If the answer to the first question is something like "yes", then (the obvious question) what may be causing the problem and how do I fix it? Does it have something to do with the fact that web2py sends static files in chunks of 1 MB? Might it be the Rocket server? Or the fact that I'm testing locally?
Thanks in advance!
It's hard to give you an answer without knowing some details...
Where is your web2py application hosted?
Do you use Apache? Nginx?
Did you deploy using a one-step deploy script? (http://web2py.googlecode.com/hg/scripts/setup-web2py-ubuntu.sh)
But in any case, you can (and should):
Configure Apache/Nginx to serve your static files directly (files in /YourApp/static/). See the "setup-web2py-" scripts in the "scripts" folder for more information.
Use scripts/zip_static_files.py to create gzipped versions of your static files. You can create a cron job to run "python web2py.py -S myapp -R scripts/zip_static_files.py".
More details about efficiency in the book : http://web2py.com/books/default/chapter/29/13/deployment-recipes?search=static+files#Efficiency-and-scalability
