I am trying to create a Google Drive-like backup program in Python that backs up to a Linux box, which will further back up to an off-site location TBD. I have run into a few interesting coding and computer challenges in doing this.
The one I am working on right now has to do with "locked" files. So what do I mean by this? In Windows 7, if you create a .txt file you can open it in Notepad (or any program) and at the same time open it in a Python program. If you make a change in the .txt file and save it, then BEFORE closing it you can still open the file and see the changes in Python. Now change the file to a .docx Word file and open it with Word 2007. While it is open in Word you cannot access it within Python until the user closes it.
Now if you look at Google Drive (the desktop install, not the web-only variety), you can open a .docx file and change it. Once you save it, but BEFORE closing it, Google Drive has already synced the file.
Google Drive must have some sort of lower-level access to the file than the simple Python open() call.
So here is the question: does anyone know of a way to access files in Python that keeps me from having to wait for the user to close the file?
Edit 1:
Let me further explain. Once I have created a SQLite database that holds all the files and directories, I will use the win32file.ReadDirectoryChangesW() function to monitor for changes. My problem stems from the fact that on first install/run the application must catalog all files, and files that are open in Microsoft Office are locked and cannot be cataloged. Is there a way around this?
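For reference, the monitoring loop I have in mind looks roughly like this (the watched path is a placeholder):

import win32con
import win32file

FILE_LIST_DIRECTORY = 0x0001
path = 'C:\\backup_root'  # placeholder for the directory being cataloged

handle = win32file.CreateFile(
    path,
    FILE_LIST_DIRECTORY,
    win32con.FILE_SHARE_READ | win32con.FILE_SHARE_WRITE | win32con.FILE_SHARE_DELETE,
    None,
    win32con.OPEN_EXISTING,
    win32con.FILE_FLAG_BACKUP_SEMANTICS,
    None)

while True:
    # Blocks until something changes anywhere under the watched tree.
    results = win32file.ReadDirectoryChangesW(
        handle,
        1024,
        True,  # watch the whole subtree
        win32con.FILE_NOTIFY_CHANGE_FILE_NAME | win32con.FILE_NOTIFY_CHANGE_LAST_WRITE,
        None,
        None)
    for action, filename in results:
        print(action, filename)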
Many backup tools use snapshots: they copy the locked file directly from the snapshot rather than copying it from the live filesystem. If you're on Windows you should check Windows VSS (Volume Shadow Copy Service); see the Microsoft documentation for more details. Otherwise, if the filesystem you're on supports snapshots, check its documentation as well.
Third-party tools
You can use Python's subprocess module to run third-party tools that will take snapshots for you.
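For example, a rough sketch that shells out to vshadow.exe, a sample tool shipped with the Windows SDK; the flags shown are assumptions and depend on your SDK version:

import subprocess

# Sketch: ask vshadow.exe to create a persistent shadow copy of C:.
# The tool must be on PATH and the process needs administrator rights.
result = subprocess.run(
    ['vshadow.exe', '-p', '-nw', 'C:'],
    capture_output=True, text=True, check=True)
print(result.stdout)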
Microsoft VSS
In case you want to do it yourself, you might need modules from the Win32 API such as the win32com module.
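A hedged sketch of what that could look like, going through WMI's Win32_ShadowCopy class with pywin32 (requires administrator rights; error handling omitted):

import win32com.client

# Connect to WMI and fetch the Win32_ShadowCopy class.
wmi = win32com.client.GetObject('winmgmts:\\\\.\\root\\cimv2')
shadow_class = wmi.Get('Win32_ShadowCopy')

# Fill in the Create() method's input parameters for the C: volume.
in_params = shadow_class.Methods_('Create').InParameters.SpawnInstance_()
in_params.Properties_('Volume').Value = 'C:\\'
in_params.Properties_('Context').Value = 'ClientAccessible'

out_params = shadow_class.ExecMethod_('Create', in_params)
print('ReturnValue:', out_params.Properties_('ReturnValue').Value)
print('ShadowID:', out_params.Properties_('ShadowID').Value)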
There is also a project on GitHub that seems to do the job: pyshadowcopy
Filesystem Snapshot
Depending on the filesystem's features, you might find Python modules or tools allowing you to take a snapshot.
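For instance, on Btrfs you could shell out to the btrfs CLI; the subvolume paths below are assumptions:

import subprocess

# Create a read-only snapshot of the /data subvolume; the backup can then
# copy locked/busy files from the snapshot instead of the live tree.
subprocess.run(
    ['btrfs', 'subvolume', 'snapshot', '-r', '/data', '/data/.snapshots/backup'],
    check=True)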
Related
Context:
We have some Excel files with embedded VBA used only for refreshing them (the data source for each of these Excel files is a SQL query, so on refresh the data gets pulled). After the refresh, the file gets saved. All of these files are already stored in the OneDrive folder.
When you open an Excel file stored in the OneDrive folder with a specific ceremony (you have to first open the Excel application, then go to Open, then open each file separately from the OneDrive folder), AutoSave gets activated for the file. I hear this is only available in O365.
Problem:
The problem is that on system restarts, planned & unplanned, we have to repeat this.
Granted, this is not an efficient system to begin with and you can question the method itself, but I was wondering if there is a way to open the files automatically on system restart with AutoSave enabled.
So far:
Currently I have tried Python and a batch command; both open the file but do not enable AutoSave, which makes it pointless since the files should get saved to OneDrive. It is worth mentioning that I used gpedit.msc (via Run on Windows) to run the batch file on system restart, and I used the path to a shortcut of the .bat file. The reason is that I wanted to test whether running the .bat in administrator mode would fix this, and the only way to run a .bat file in admin mode is to create a shortcut and set that shortcut's default run mode to Administrator.
For Python I used this script:
import os
os.chdir('C:\\Users\\USER\\FOLDER')
os.system('FILE.xlsx')
For the .bat file I used this:
@echo off
set params=%*
start excel "C:\Users\USER\FOLDER\FILE.xlsx" /e/%params%
I have tried using a .bat file to run Python to open the files, and that also failed.
Please let me know if you need more information.
Many thanks in advance,
I need to locally store some basic data the user inputs into my exe. The program was compiled using PyInstaller and was previously using plain open() calls to save data to .txt files. It is my understanding that macOS (my OS, although cross-compatibility with Windows would be great) locks executables so that they may not make any file changes. The program currently does save the data, but when the executable is run again the old data is no longer accessible.
It seems obvious that apps store things locally all the time; how can data be persisted, specifically with the Python/PyInstaller combination?
Apologies if this is a simple question, it definitely seems simple but I can't find documentation for this anywhere.
You can use
os.path.expanduser('~')
to get the user's home directory in a cross-platform manner; see How to find the real user home directory using python?
Your application should have write permission in the user's home directory, so you can let it create and modify a data file there, following How to store Python application data.
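A minimal sketch that keeps the data as JSON in a per-user folder (the '.myapp' folder name is a made-up example):

import json
import os

# Store app data in a hidden folder under the user's home directory,
# which stays writable and persists across runs of the frozen exe.
data_dir = os.path.join(os.path.expanduser('~'), '.myapp')
os.makedirs(data_dir, exist_ok=True)
data_file = os.path.join(data_dir, 'settings.json')

# Save on exit...
with open(data_file, 'w') as f:
    json.dump({'name': 'value'}, f)

# ...and load on the next run.
with open(data_file) as f:
    print(json.load(f))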
Currently my program uses .txt files (filled with data) that are located on the Desktop. I am going to send this out to users, and the text files will be included in an installer. When installing, I don't want these files to crowd the user's desktop. Any ideas on this?
Kunwar, this is a somewhat subjective question, so this essentially is a subjective answer.
This is really about packaging. You've not provided any info on what your package looks like, but in principle, why not just put the text files in the directory with your program?
your_program/inputs/*.txt
Then they will always be available to your tool. You can find the current location of your script from within the script itself and build the path to the input files, no matter where the users have stored the script on their machines.
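A minimal sketch of that idea, using the inputs folder from the layout above:

import glob
import os

# Resolve the inputs folder relative to this script's own location,
# so it works wherever the program directory ends up.
script_dir = os.path.dirname(os.path.abspath(__file__))
for txt_path in glob.glob(os.path.join(script_dir, 'inputs', '*.txt')):
    print(txt_path)  # open and process each data file here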
I recently came across an interesting way to deploy a Python application to users on a local Windows network:
Install Python properly on user machines (the same minor version on all machines)
Create a shared network folder
Copy Python application files into the folder
Create a .bat script that tweaks the PYTHONPATH and invokes the main .py file
Copy shortcut onto each Windows desktop
User invokes the application by double-clicking the shortcut
This deployment option isn't listed as a shipping option in the Python Guide. Are there any obvious gotchas to having multiple users run the same .py files from a shared location?
(Assume that any other resource sharing is handled correctly by the application(s), e.g. they don't try to write to a shared file.)
(I'm open to suggestions on making this a more valid and answerable question.)
Your creative approach requires your PC to always be on and the network always working. And what about performance? When is it safe to update Python? What about someone who wants to use your app away from your network? ...
Why not package the script into an exe? I've used py2exe: http://www.py2exe.org/
This has the added advantage that it's not so easy for people to play with your script and 'adapt' it in ways that might break it. The exe is a lot bigger than the script, but we're not talking hundreds of megs. It doesn't need an installer or anything like that.
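For reference, the classic py2exe usage is a small setup.py (the script name here is a placeholder):

# setup.py
from distutils.core import setup
import py2exe  # importing py2exe registers the "py2exe" command

setup(console=['your_script.py'])

Then build with python setup.py py2exe; the exe ends up in the dist folder.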
I have a specific folder in which I download certain .zip files. I am writing a Python script to automate the unzipping, upload, and deletion of files from this folder. Is there a way to automatically trigger my Python script each time a zip file is downloaded to this folder?
[EDIT]: I am on OS X Mavericks; sorry for not mentioning this from the start.
Yes, you can use inotify (e.g. via pyinotify) to get a callback whenever a new file is created. It is not available on Windows, though. There might be a similar API available there, but I don't know if there are Python bindings for it.
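A minimal pyinotify sketch (Linux only; the watched path is an assumption). IN_CLOSE_WRITE fires once the file has been fully written, which suits downloads better than IN_CREATE:

import pyinotify

class Handler(pyinotify.ProcessEvent):
    def process_IN_CLOSE_WRITE(self, event):
        # Fires when a file opened for writing is closed, i.e. fully written.
        if event.pathname.endswith('.zip'):
            print('new zip:', event.pathname)

wm = pyinotify.WatchManager()
wm.add_watch('/path/to/downloads', pyinotify.IN_CLOSE_WRITE)
pyinotify.Notifier(wm, Handler()).loop()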
The easiest way I can think of:
Set up a cron job, say every minute, that launches a script to check the directory in question for any new zip files.
If one is found, it will trigger the unzipping, upload, and deletion; see the sketch below.
If you don't want to create a cron job, you can always think about creating a daemon (but why bother).
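A sketch of the script such a cron job could run every minute (the download folder and the upload step are assumptions):

import glob
import os
import zipfile

WATCH_DIR = os.path.expanduser('~/Downloads')  # assumed download folder

for zip_path in glob.glob(os.path.join(WATCH_DIR, '*.zip')):
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(WATCH_DIR)
    # upload the extracted files here, then clean up the archive
    os.remove(zip_path)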
Since I am on Mac OS X Mavericks, my best bet is to use watchdog, the most popular Python module for this kind of task.
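A minimal watchdog sketch (the folder path is a placeholder; the handler is where the unzip/upload/delete steps would go):

import time
from watchdog.observers import Observer
from watchdog.events import PatternMatchingEventHandler

class ZipHandler(PatternMatchingEventHandler):
    def on_created(self, event):
        # unzip, upload, and delete the new archive here
        print('new zip:', event.src_path)

observer = Observer()
observer.schedule(ZipHandler(patterns=['*.zip']), '/path/to/downloads')
observer.start()
try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    observer.stop()
observer.join()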