I have a text file that will be populated depending on a radio button selected by a user in php
I am reading from the same text file, whenever I find that a new line has been added the Python script prints it. For now I am just printing it. Eventually I will send out a message on a GSM modem based on the new records in the .txt file. Right now I'm just printing it to make sure I can identify a new record.
Here's the code I'm using. It works just fine.
def main():
    flag = 0
    count = 0
    while flag == 0:
        # count the lines currently in the file
        with open('test.txt') as f:
            for i, l in enumerate(f):
                pass
        nol = i + 1
        if count < nol:
            while count <= nol:
                # re-read the whole file (readlines takes a byte hint,
                # not a line count, so read everything)
                with open('test.txt') as f:
                    a = f.readlines()
                if count == 0:
                    print(a[0])
                    count = count + 2
                else:
                    print(a[count - 1])
                    count = count + 1

if __name__ == '__main__':
    main()
I was wondering if there is a better way to do this.
Also, PHP will keep writing to the file while Python keeps reading from it. Will this cause a clash, since multiple instances have it open?
According to this answer you can use the watchdog package to watch for changes to the file. Alternatively, you can build a custom solution using fcntl on Unix.
The first solution has the advantage of being cross-platform.
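If you'd rather stay in the standard library, a minimal polling building block could look like the sketch below. It returns whole lines appended since a byte offset; a watchdog-based version would call the same helper from an on_modified handler instead of a sleep loop. This is an illustrative sketch, not the asker's code, and it assumes the PHP side only ever appends complete lines.

```python
import time

def read_new_lines(path, offset):
    """Return (new_lines, new_offset): the whole lines appended to
    `path` since byte offset `offset`. Assumes append-only writes."""
    with open(path, 'r') as f:
        f.seek(offset)
        lines = f.readlines()
        return lines, f.tell()

# polling loop (the eventual GSM send would replace print):
# offset = 0
# while True:
#     new, offset = read_new_lines('test.txt', offset)
#     for line in new:
#         print(line, end='')
#     time.sleep(1)
```

Because the reader only ever seeks and reads, concurrent appends from PHP are unlikely to corrupt anything; at worst a partially written last line shows up on the next poll.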
Related
What would be the best Pythonic way of implementing this awk command in python?
awk 'BEGIN{chunk=0} /^From /{msgs++;if(msgs==500){msgs=0;chunk++}}{print > "chunk_" chunk ".txt"}' mbox
I'm using this now to split up enormous mailbox (mbox format) files.
I'm trying a recursive method right now.
def chunkUp(mbox, chunk=0):
    with open(mbox, 'r') as bigfile:
        msg = 0
        for line in bigfile:
            if msg == 0:
                with open("./TestChunks/chunks/chunk_" + str(chunk) + ".txt", "a+") as cf:
                    if line.startswith("From "):
                        msg += 1
                    cf.write(line)
            if msg > 20:
                chunkUp(mbox, chunk + 1)
I would love to be able to implement this in python and be able to resume progress if it is interrupted. Working on that bit now.
I'm tying my brain into knots! Cheers!
Your recursive approach is doomed to fail: you may end up with too many open files at once, since the with blocks don't exit until the end of the program.
Better to keep one handle open and write to it, then close it and open a new handle whenever "From " is encountered.
Also, open your files in write mode, not append. The code below does the minimal operations and tests needed to write each line to a file, closing the current file and opening another when "From " is found. At the end, the last file is closed.
def chunkUp(mbox):
    with open(mbox, 'r') as bigfile:
        handle = None
        chunk = 0
        for line in bigfile:
            if line.startswith("From "):
                # next (or first) file
                chunk += 1
                if handle is not None:
                    handle.close()
                    handle = None
            # file was closed / first file: create a new one
            if handle is None:
                handle = open("./TestChunks/chunks/chunk_{}.txt".format(chunk), "w")
            # write the line to the current file
            handle.write(line)
        if handle is not None:
            handle.close()
I haven't tested it, but it's simple enough that it should work. If the file doesn't start with a "From " line, all lines before the first one end up in the chunk_0.txt file.
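For the resume-after-interruption part the question mentions, one possible sketch (my assumption, not part of the answer above): scan the output directory for existing chunk files and continue from the highest index, rewriting the last, possibly incomplete, chunk.

```python
import glob
import os
import re

def next_chunk_index(outdir):
    """Return the highest existing chunk number in outdir (that file may
    be incomplete and should be rewritten), or 0 if no chunks exist yet.
    Assumes the chunk_N.txt naming used above."""
    indices = []
    for path in glob.glob(os.path.join(outdir, "chunk_*.txt")):
        m = re.search(r"chunk_(\d+)\.txt$", path)
        if m:
            indices.append(int(m.group(1)))
    return max(indices) if indices else 0
```

A full resume would also need to skip the mbox lines already written, e.g. by counting "From " separators until the starting chunk is reached.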
I'm learning Python and am stuck on the task below, so I'd appreciate some help. (version 3.7)
Task: Record console inputs into a file, assign an ID to each entry so it can be deleted by that ID, and be able to list the entries.
Full code below for reference. I'm struggling with assigning an index to the user input and deleting based on that ID. I was advised that it is possible using sys.argv, yet I have a feeling this really calls for a dictionary (which I am not familiar with yet).
I was trying to add a number before the user input in the file (so the text file would look like 001. *user input*,\n 002. *user input*, etc.), so that all lines get numbered.
Based on that, when the user enters a certain number (= a certain line) to delete, the script should delete the given line. Yet I failed to make Python understand the number reference at the beginning of the line (some sort of search function would work, I assume).
Can I tell the script to delete based on a line reference?
edit: given that the program shuts down after every entry, len(sys.argv) will remain 1. A possible solution could be, if I don't shut it down, to refer to the index number and delete a certain line based on that reference. But how do I feed the variable/index number back in after restarting the program? The index will start again from 1 (as 0 is reserved) and will disregard the number of lines already in the text.
Thanks in advance!
import sys

menu = input("What do you want to do?\n add new idea(1)\n delete an idea(2)\n list all ideas(3)")
if menu == "1":
    myfile = open("ideabank.txt", 'a+', encoding='utf-8')
    newidea = input("What is your new idea?:")
    print('Argument List:', str(sys.argv))
    myfile.write(newidea)
    myfile.write("\n")
    myfile.close()
elif menu == "2":
    print("delete")
else:
    myfile = open("ideabank.txt", 'r', encoding='utf-8')
    for line in myfile:
        print(line, end="")
    myfile.close()

myfile = open("ideabank.txt", 'r+', encoding='utf-8')
newidea = input("Which line you want to delete:")
data = myfile.readlines()
for i in range(0, len(data)):
    if i != int(newidea):
        myfile.write(data[i])
        myfile.write("\n")
myfile.close()
Just to close the question for any future readers:
The solution I used was lists within lists. The index of the outer element gives the ordering, and the task text and its completion status are elements 0 and 1 of each inner list.
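That list-in-lists idea could look something like the sketch below (my illustration, not the poster's actual code; all names are made up):

```python
def add_idea(ideas, text):
    """Append [text, done-flag]; the list index serves as the ID."""
    ideas.append([text, 0])

def delete_idea(ideas, idea_id):
    """Delete by the index shown when listing."""
    del ideas[idea_id]

def list_ideas(ideas):
    """Render each idea with its current index and status."""
    return ["{}. {} ({})".format(i, text, "done" if done else "open")
            for i, (text, done) in enumerate(ideas)]

ideas = []
add_idea(ideas, "learn dictionaries")
add_idea(ideas, "write a todo app")
ideas[0][1] = 1          # mark the first idea complete
delete_idea(ideas, 1)    # remove the second idea by its index
```

To survive restarts (the concern in the question's edit), the list could be dumped to and reloaded from a file with the json module between runs.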
I have a Python script that I want to increment a global variable every time it is run. Is this possible?
Pretty easy to do with an external file. You can create a function to do it for you, so you can use multiple files for multiple vars if needed, although in that case you might want to look into some sort of serialization and store everything in the same file. Here's a simple way to do it:
def get_var_value(filename="varstore.dat"):
    with open(filename, "a+") as f:
        f.seek(0)
        val = int(f.read() or 0) + 1
        f.seek(0)
        f.truncate()
        f.write(str(val))
        return val

your_counter = get_var_value()
print("This script has been run {} times.".format(your_counter))
# This script has been run 1 times
# This script has been run 2 times
# etc.
It will store in varstore.dat by default, but you can use get_var_value("different_store.dat") for a different counter file.
Example:

import os

if not os.path.exists('log.txt'):
    with open('log.txt', 'w') as f:
        f.write('0')

with open('log.txt', 'r') as f:
    st = int(f.read())
st += 1
with open('log.txt', 'w') as f:
    f.write(str(st))

Each time you run your script, the value inside log.txt will increment by one. You can make use of it if you need to.
Yes, you need to store the value into a file and load it back when the program runs again. This is called program state serialization or persistency.
For a code example:
with open("store.txt", 'r') as f:  # open a file in the same folder
    a = f.readlines()  # read from file into variable a

# use the data read
b = int(a[0])  # get the integer at the first position
b = b + 1  # increment

with open("store.txt", 'w') as f:  # open the same file
    f.write(str(b))  # write the changed value back

Note that a will be a list when using readlines.
I am trying to create a bot that tells me if a text file has been modified; if it has, it prints out the new text inside it. This should run in an infinite loop (the bot sleeps until the text file is modified).
I have tried this code but it doesn't work.
while True:
    tfile1 = open("most_recent_follower.txt", "r")
    SMRF1 = tfile1.readline()
    if tfile1.readline() == SMRF1:
        print(tfile1.readline())
But this is totally not working... I am new to Python, can anyone help me?
def read_file():
    with open("most_recent_follower.txt", "r") as f:
        SMRF1 = f.readlines()
    return SMRF1

initial = read_file()
while True:
    current = read_file()
    if initial != current:
        for line in current:
            if line not in initial:
                print(line)
        initial = current
Read the file in once to get its initial state, then repeatedly re-read it. When it changes, print out the lines that weren't there before.
I don't know what bot you are referring to, but this code, like yours, will continuously read the file; it never exits.
I might suggest copying the file to a safe duplicate location and using a diff program to determine whether the current file differs from the original copy, printing the added lines. If you just want appended lines, you might use a utility like tail.
You can also use a library like pyinotify to trigger only when the filesystem reports that the file has been modified.
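The diff suggestion can stay in the standard library; a sketch with difflib, assuming the duplicate copy mentioned above has already been read into a list of lines (the helper name is mine):

```python
import difflib

def added_lines(old_lines, new_lines):
    """Return the lines present in new_lines but not in old_lines,
    as a unified diff reports them (additions only)."""
    diff = difflib.unified_diff(old_lines, new_lines, lineterm="")
    return [line[1:] for line in diff
            if line.startswith("+") and not line.startswith("+++")]
```

This catches insertions anywhere in the file, not just at the end, which is the advantage over the tail approach.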
This is the first result on Google for "check if a file is modified in python" so I'm gonna add an extra solution here.
If you're curious if a file is modified in the sense that its contents have changed, OR it was touched, then you can use os.stat:
import os

get_time = lambda f: os.stat(f).st_ctime

fn = 'file.name'
prev_time = get_time(fn)
while True:
    t = get_time(fn)
    if t != prev_time:
        do_stuff()  # react to the change
        prev_time = t
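One caveat: st_ctime means metadata-change time on Unix but creation time on Windows, so for "has the content changed" checks st_mtime is usually the safer field. A minimal cross-platform variant of the helper above:

```python
import os

def last_modified(path):
    """Return the file's modification time (seconds since the epoch).
    st_mtime has the same meaning on Unix and Windows, unlike st_ctime."""
    return os.stat(path).st_mtime
```

The polling loop stays the same; only the stat field compared between iterations changes.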
I have a Django app that opens a file, continuously reads it, and at the same time writes data to a Postgres database. My issue is that whenever I open a file,
file = open(filename, 'r')
I am unable to also create new things in the database,
Message.objects.create_message(sys, msg)
That should create a database entry with two strings. However, nothing seems to happen and I am presented with no errors :( If I decide to close the file, file.close(), before I write to the database everything is fine. My problem is that I need that file open to create my objects. Does anyone have a solution for this? Thanks.
EDIT
Here's some more of my code. Basically I have the following snippet following the end of a file and then writing to the database as it gets information.
file.seek(0, 2)
while True:
    line = file.readline()
    if not line:
        time.sleep(1)
        continue
    Message.objects.create_message(sys, line)
EDIT 2
Got this to work finally but I'm not sure why. I'd love to understand why this worked:
str1ng = line[0:len(line)-1]
Message.objects.create_message(sys, str1ng)
Some how there is a difference between that string and the string gathered from file.readline().
Any ideas?
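For what it's worth, that slice just drops the trailing newline that readline() keeps on every line; rstrip('\n') expresses the same thing more directly (whether the newline itself was what upset create_message is a guess):

```python
line = "hello world\n"            # readline() keeps the trailing newline
str1ng = line[0:len(line) - 1]    # the question's slice: drop the last char
stripped = line.rstrip("\n")      # clearer equivalent for trailing newlines
```

Note that the slice unconditionally removes the last character, so it would truncate a final line that has no newline; rstrip('\n') is safe in that case too.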
try this:
file = open(filename, 'r')
fileContents = file.read()
file.close()
Have you tried linecache? Something like this might work (not tested).

import linecache

i = 1  # linecache line numbers start at 1; line 0 returns ''
go = True
file = ...
while go == True:
    out = linecache.getline(file, i)
    # ...process out...
    i = i + 1
    if i % 100 == 0:
        # check for a cache update every 100 lines
        linecache.checkcache(file)
    if (some eof condition):
        go = False
linecache.clearcache()