Detecting changes in a txt file [closed] - python

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 2 months ago.
I am a beginner Python programmer and I am wondering if there is any way to detect a change in a .txt file on Windows. Any suggestions are appreciated.

There are many ways to go about it:
You can, for example, check the file's last modification date every few seconds with os.path.getmtime(path); when the date changes, you know the file was edited.
You can also compute some form of checksum (e.g. an MD5 hash of the file) and check every few seconds whether it has changed (this can get slow on big files, since the checksum requires reading the entire file).
You can also listen for signals sent by Windows directly and execute an event handler when you receive one; this is harder to implement but by far the cleanest way to do it. (Edit: this seems to be what #martin kamau suggests in his answer.)
And probably many more ways that I can't think of right now...
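A minimal sketch of the first (polling) suggestion; the filename and two-second interval in the commented loop are just illustrative:

```python
import os
import time

def has_changed(path, last_mtime):
    """Compare the file's current mtime against the one we last saw."""
    mtime = os.path.getmtime(path)
    return mtime != last_mtime, mtime

# A typical polling loop would look like:
# last = os.path.getmtime("watched.txt")
# while True:
#     time.sleep(2)
#     changed, last = has_changed("watched.txt", last)
#     if changed:
#         print("watched.txt was modified")
```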

To watch for changes to a file, you can use the following code (note that the fcntl module is Linux-only, so this approach will not work on Windows):
import fcntl
import os
import signal
import time
filename = "nameofthefile"
def handler(signum, frame):
    print("File %s modified" % filename)
# Ask the kernel to raise SIGIO whenever the file's directory is modified
signal.signal(signal.SIGIO, handler)
fd = os.open(os.path.dirname(os.path.abspath(filename)), os.O_RDONLY)
fcntl.fcntl(fd, fcntl.F_NOTIFY, fcntl.DN_MODIFY | fcntl.DN_MULTISHOT)
while True:
    time.sleep(1)
I found this code here.

Related

Get last execution time of script python [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 2 years ago.
I'm writing a script in Python that takes a timestamp as input. For the first execution, it should be something like the now() function. For further executions, however, the input parameter should be the last execution time of the same script. This is done to avoid getting duplicate results.
Can anyone give me a clue please?
As far as I know, there is no "last executed" attribute for files. Some operating systems have a "last accessed" attribute, but even if that's updated on execution, it would also be updated any time the file was read, which is probably not what you want.
If you need to be able to store information between runs, you'll need to save that data somewhere. You could write it to a file, save it to a database, or write it to a caching service, such as memcached.
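A minimal sketch of the write-it-to-a-file option; the state-file name and the ISO timestamp format are just one possible choice:

```python
import os
from datetime import datetime

STATE_FILE = "last_run.txt"  # illustrative name

def get_last_run(default=None):
    """Return the timestamp saved by the previous run, or default."""
    if not os.path.exists(STATE_FILE):
        return default
    with open(STATE_FILE) as f:
        return datetime.fromisoformat(f.read().strip())

def save_run_time(when):
    """Persist this run's timestamp for the next execution to pick up."""
    with open(STATE_FILE, "w") as f:
        f.write(when.isoformat())

# On the first run get_last_run() returns the default, so you can
# fall back to now():
# start = get_last_run(default=datetime.now())
```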

deleting results after completion of code's execution PYTHON [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
Currently I am working on a project in which I need to save n images (to be used within the program's scope). Since the number of images to be saved is dynamic, it may end up exhausting the whole space I have for my project.
I wanted to know whether something can be added to my code so that, once my code has finished running, the images are automatically deleted, since I do not need them after the code's execution.
How can this be done?
I need to save the images because they are passed as an argument to one of my functions inside my code. If you know how I can pass an image to my function without saving it, then please comment here.
It might be an idea to delete each file immediately after you've run the code that needs it, i.e.:
import os
# Open image
# Manipulate image
os.remove(path_to_image)
Keep track of all the image files you're creating, then delete them in a finally block to ensure they'll be deleted even if an exception is raised.
import os
temp_images = []
try:
    # ...do stuff
    # ...create image at path_to_file
    temp_images.append(path_to_file)  # called multiple times
    # ...other stuff
finally:
    for image in temp_images:
        os.remove(image)
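Neither answer mentions it, but another option is to let the standard library do the cleanup: tempfile.TemporaryDirectory removes the directory and every image saved inside it when the with block exits, even if an exception is raised. The file name below is illustrative:

```python
import os
import tempfile

# The directory and everything saved in it are deleted automatically
# when the with block exits, even on an exception.
with tempfile.TemporaryDirectory() as tmpdir:
    path_to_image = os.path.join(tmpdir, "image_0.png")
    with open(path_to_image, "wb") as f:
        f.write(b"...image bytes...")
    # ...pass path_to_image to your functions here...
```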

Display a file to the user with Python [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
I am trying to find a way to display a txt/csv file to the user of my Python script. Every time I search for how to do it, I keep finding information on how to open/read/write, etc., but I just want to display the file to the user.
Thank you in advance for your help.
If you want the file to open with its associated default program, use os.startfile:
os.startfile("path/to/file")  # os.startfile is only available on Windows
It really depends on what you mean by "display" the file. When we display text, we need to take the file, get all of its text, and put it onto the screen. One possible display would be to read every line and print it; there are certainly others. You're going to have to open the file and read its lines in order to display it, though, unless you shell out to something like vim file.txt.
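A minimal version of the read-every-line-and-print approach (the helper name is mine, not from any answer):

```python
def display_file(path):
    """Print a text/CSV file to the console, line by line."""
    with open(path) as f:
        for line in f:
            print(line, end="")  # each line already keeps its own newline
```

For a CSV file this shows the raw contents; formatting it as a table would need the csv module on top of this.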

Received data Parsing in Python [closed]

Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 9 years ago.
This is an assignment. So, if what I am asking is something I should figure out myself then lemme know! :)
The thing is that I am to send a complete directory, which may contain files and sub-folders, to the server. To differentiate between binary data, file names and folder names, I have assigned specific key letters (!, ^, |) before and after the data (receiving one byte at a time). This seems like a hack, though. Is there a better solution?
Compress it with gzip or similar before sending and unpack it after transfer. This will save you the hassle of dealing with multiple files.
http://docs.python.org/2/library/archiving.html
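As a sketch of the pack-then-send idea using only the standard library (the function names, archive name and paths are illustrative):

```python
import shutil

def pack_directory(directory, archive_name="outgoing"):
    """Pack a whole directory tree (files and sub-folders) into one .tar.gz."""
    return shutil.make_archive(archive_name, "gztar", root_dir=directory)

def unpack_directory(archive, destination="received"):
    """Recreate the directory tree on the receiving side."""
    shutil.unpack_archive(archive, extract_dir=destination)
```

The archive is a single file, so the socket code only has to transfer one byte stream with no custom delimiters.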
If your assignment does not specify a byte stream, you can also try the SFTP protocol. It's pretty neat, with commands like MKDIR, CD, PUT and GET. You can iterate through your file structure, check whether it's a folder or a file, and send the appropriate commands to the server.
I recommend paramiko - http://www.lag.net/paramiko/

Python R/W to text file network [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
What could happen if multiple users run copies of the same Python script, which is designed to read/write data to a single text file stored on a network device, at the same time?
Will the processes stop working?
If so, what could be the solution?
Many bad things can happen. I don't think the processes will stop working, at least not because of concurrent access to a file, but what could happen is inconsistent file content: for example, if one process writes hello and there is a concurrent access to the file, you might get a line like hhelllolo.
One solution I can see is to use a database, as suggested, or to create a mechanism for locking the file against concurrent access (which might be cumbersome, because you're working over a network rather than on the same computer).
Another solution I can think of is to create a simple server-side script that handles the requests and locks the file against concurrent access. This is almost the same as using a database; you'd be creating a storage system from scratch, so why bother :)
Hope this helps!
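A sketch of the lock-file mechanism mentioned above. The lock-file name and timeout are illustrative, and note the caveat: os.open with O_CREAT | O_EXCL is atomic on local filesystems, but on some network filesystems (older NFS in particular) it is not guaranteed, which is part of why the database suggestion is the safer route:

```python
import os
import time

LOCK = "data.txt.lock"  # illustrative lock-file name

def acquire_lock(timeout=10.0):
    """Spin until we can atomically create the lock file, or time out."""
    deadline = time.time() + timeout
    while True:
        try:
            # O_CREAT | O_EXCL fails if the file already exists, so only
            # one process can succeed at a time.
            fd = os.open(LOCK, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
            os.close(fd)
            return
        except FileExistsError:
            if time.time() > deadline:
                raise TimeoutError("could not acquire lock")
            time.sleep(0.1)

def release_lock():
    os.remove(LOCK)

# usage:
# acquire_lock()
# try:
#     with open("data.txt", "a") as f:
#         f.write("hello\n")
# finally:
#     release_lock()
```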
