Block file writing to external drives - Python

I am working on a project for my company and I have reached a dead end.
I am trying to find a way to allow or block file writes to an external drive based on the content being written.
The project is creating a basic DLP system.
I have tried using pydbg to debug Windows Explorer; this worked partially but was very unstable, and it doesn't work globally (e.g. a file write from cmd).
The solution can be in any language (Python preferred).
Any kind of help would be great!
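As a starting point, here is a minimal user-mode sketch. Note the caveat it is built on: Python cannot intercept and veto a write in flight; a real DLP block requires a kernel-level file-system filter driver (a Windows minifilter). What user-mode code can do is scan newly written files on the drive and react (alert, quarantine, delete). The `CONFIDENTIAL` marker is a placeholder policy, not a real rule.

```python
# Sketch: user-mode content scanning of an external drive.
# Blocking a write as it happens needs a kernel filter driver; this
# stdlib-only version can only detect policy hits after the fact.
import os

def violates_policy(path, banned=(b"CONFIDENTIAL",)):
    """Return True if the file content matches a (placeholder) block rule."""
    try:
        with open(path, "rb") as f:
            data = f.read()
    except OSError:
        return False  # still being written or unreadable; check again later
    return any(marker in data for marker in banned)

def scan_drive(root):
    """Walk the drive and collect files that hit the content policy."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if violates_policy(path):
                hits.append(path)
    return hits
```

You would run `scan_drive("E:\\")` (assuming `E:` is the external drive) periodically, or trigger it from a change-notification mechanism such as the watchdog library.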

Related

How do I detect if a process is done with a file to stop watching?

I'm coding a simple cross-platform file version-tracking program that's intended to open a single temp file with a given application, watch for modifications, and stop watching and remove the file when the process is done with it.
So far, I've had to dig into registry keys on Windows and desktop entries on Linux to find these apps, run them via subprocess.Popen, and then watch for modifications using the watchdog library. I then realized that different apps use different strategies to open and edit files: some lock the file and keep it open (such as Microsoft Word), while others just copy it into memory and close the file immediately (such as VSCode or gedit). What I'm struggling with now is finding a general, or at least workable, way to handle different apps on different OSes in Python.
What's the most efficient way to detect the end of watching?
Should I make it app-specific, so that it at least handles the more popular apps reliably?
Can the OS help me? (I checked that lsof won't show files open in gedit, and I presume the PIDs returned by Popen won't always help, since apps that open files in tabs rather than separate windows stay running while other tabs are open.)
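One app-agnostic fallback is the "settled file" heuristic: treat the file as released once neither its size nor its modification time has changed for some grace period. It is only a heuristic (it misses Word-style apps that hold a lock without writing, which need an OS-level open-handle check), but it covers the copy-into-memory editors. A minimal sketch:

```python
# Sketch: wait until a file's size and mtime stop changing.
# This detects "done writing", not "file handle closed" - apps that
# hold the file open without writing still need an OS-level check.
import os
import time

def wait_until_stable(path, settle=2.0, poll=0.5, timeout=60.0):
    """Block until `path` stops changing for `settle` seconds,
    or raise TimeoutError after `timeout` seconds."""
    deadline = time.monotonic() + timeout
    last = None
    stable_since = None
    while time.monotonic() < deadline:
        st = os.stat(path)
        sig = (st.st_size, st.st_mtime)
        if sig == last:
            if stable_since is None:
                stable_since = time.monotonic()
            elif time.monotonic() - stable_since >= settle:
                return
        else:
            last, stable_since = sig, None
        time.sleep(poll)
    raise TimeoutError(f"{path} did not settle within {timeout}s")
```

The `settle`, `poll`, and `timeout` values are arbitrary defaults to tune per app.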

How to get file to open with python app I made

For the past month I've been writing a desktop application for macOS using Python. It requires opening files and saving compressed data to them. This application creates files with my own made-up extension, so the files are not usable by other applications. I have almost everything figured out. However, I want to make it so that I can right-click on a file with the extension and open it with my Python application. I tried using sys.argv to get the path of the file to open as an argument, but that doesn't work. I know there has to be a way. Preferably there's a builtin variable that is easy to use, but I haven't found anything that helps.
Any help would be useful.
Thanks.
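For what it's worth, `sys.argv` is the right place to look, but on macOS a bare `.py` script launched from Finder never receives the document path there. Finder's "Open With" goes through Launch Services, which only delivers the path once the script is wrapped in an `.app` bundle (for example with py2app) whose `Info.plist` registers the custom extension under `CFBundleDocumentTypes`. Once bundled, the startup code is just:

```python
# Sketch: read the document path handed to the app at launch.
# Works from the shell directly, and from Finder once the script is
# wrapped in an .app bundle that registers the file extension.
import sys

def get_opened_file(argv=None):
    """Return the path we were asked to open, or None if launched bare."""
    argv = sys.argv if argv is None else argv
    return argv[1] if len(argv) > 1 else None
```

This is a sketch of the general approach, not the full bundling recipe; the py2app documentation covers the `Info.plist` side.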

Refreshing WinDir whilst using MicroPython

I'm new to both Python and MicroPython, but I'm getting the hang of this interesting approach to embedded programming!
After creating a new file on the PyBoard, I was wondering whether there is a clever way to refresh the Windows directory view of the PyBoard's flash drive, so the change shows up in Windows.
Currently, I have to power off and reload the PyBoard to see the change in the Windows directory.
I am aware that I am able to see the file using os.listdir() in a REPL.

Can you stop PyCharm from automatically closing script files when you click out of the program?

I am having a problem with PyCharm (Community Edition): when I open a .py file in the program, I can happily read and write in the file as usual, but when I click out of PyCharm (to check my emails, for example) and then click back in to carry on with my code, the file automatically closes and the project tree structure collapses (so I have to re-open it every time).
So far I have tried changing the tab limit to a high number, but this doesn't seem to help (and it shouldn't be related, since this happens when I open just one file).
I had the same issue before. I'm assuming you're connected to a network shared folder via UNC path (e.g. \\foo\bar\)?
If so, it is not currently supported. You'll need to map your network folder and give it a Drive letter. Then load up your project using the mapped drive, and it'll work like a charm.
If that's not the scenario though, please give us more information.

Need a way to determine if a file is done being written to

The situation I'm in is this: there's a process that's writing to a file, and sometimes the file is rather large, say 400-500MB. I need to know when it's done writing. How can I determine this? If I look in the directory I'll see the file there, but it might not be done being written. Plus, this needs to be done remotely: on the same internal LAN but not on the same computer. Typically the process that wants to know when the file writing is done runs on a Linux box, while the process writing the file, and the file itself, are on a Windows box. No, Samba isn't an option. XML-RPC communication to a service on that Windows box is an option, as is checking via SNMP if that's viable.
Ideally:
- Works on either Linux or Windows - meaning the solution is OS independent.
- Works for any type of file.
Good enough:
- Works just on Windows, but can be done through some library or whatever that can be accessed with Python.
- Works only for PDF files.
Current best idea is to periodically open the file in question from some process on the Windows box and check the last bytes for the PDF end tag, accounting for EOL differences because the file may have been created on Linux or Windows.
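That tail check can be sketched in a few lines: read a small chunk from the end of the file and look for the `%%EOF` marker, stripping trailing whitespace so `\n` vs `\r\n` doesn't matter. This is a heuristic, not a full PDF validation (a writer could in principle pause right after emitting `%%EOF` mid-stream):

```python
# Sketch: heuristic completeness check for a PDF - does it end in %%EOF?
import os

def pdf_looks_complete(path, tail_bytes=64):
    """True if the file's last bytes contain the PDF end-of-file marker."""
    with open(path, "rb") as f:
        f.seek(0, os.SEEK_END)
        size = f.tell()
        f.seek(max(0, size - tail_bytes))
        tail = f.read()
    # rstrip() absorbs \n, \r\n, or other trailing whitespace from either OS
    return tail.rstrip().endswith(b"%%EOF")
```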
There are probably many approaches you can take. I would try to open the file with write access. If that succeeds then no-one else is writing to that file.
Build a web service around this concept if you don't have direct access to the file between machines.
I ended up resolving it for our situation. As it turns out, the process that was writing the files had them opened exclusively, so all we had to do was try opening them for read access: when access was denied, they were still in use.
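That resolution amounts to a few lines, relying on the writer holding the file exclusively (so a read open fails with a sharing violation on Windows until the writer is done; on Linux, opens generally succeed regardless, so this matches the "works just on Windows" tier). The same function could sit behind the XML-RPC service mentioned above:

```python
# Sketch: probe whether the (exclusively-held) file has been released.
def writer_done(path):
    """True if the file can be opened for read, i.e. no exclusive lock.
    Also returns False if the file doesn't exist yet."""
    try:
        with open(path, "rb"):
            return True
    except OSError:  # sharing violation (PermissionError) or missing file
        return False
```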
