Python for: loop won't pause?

Alright, so I have a script to find, move, and rename files when given a filename to search for. I wrote a wrapper to iterate through all the folders in my Robotics folder to automate the process. Here's the code:
#! /usr/bin/env python
import os
import sys
import time
files = [ ... Big long list of filenames ... ]
for item in files:
    sys.stdout.write("Analyzing & Moving " + item + "... ")
    os.system('python mover-s.py "' + item + '"')
    sys.stdout.write("Done.\n")
print ""
print "All files analyzed, moved, and renamed."
Now, the original script takes ~2 s to execute and finish. What I want is for it to display "Analyzing & Moving whatever... " and then, AFTER the script has finished, display "Done."
My issue is that the first message and the "Done." message appear at the same time.
I've added a small pause of about 0.25 s, and the same thing happens; it just adds 0.25 s to the time it takes to display "Analyzing & Moving whatever... Done."
Basically, why won't it show my first message, pause, then display the second? Right now it displays the entire line at once. This may be because of my poor knowledge of pipes and whatnot.

Add a call to sys.stdout.flush() right after the first write().
The behaviour you're seeing has to do with buffering. See Usage of sys.stdout.flush() method for a discussion.
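For example, the same loop with the flush call added where suggested:
for item in files:
    sys.stdout.write("Analyzing & Moving " + item + "... ")
    sys.stdout.flush()  # push the partial line to the terminal before the child script runs
    os.system('python mover-s.py "' + item + '"')
    sys.stdout.write("Done.\n")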

There are a couple of issues here.
First, to call another Python script, there is no reason to use os.system as you're currently doing. You should simply have a line along the lines of
import movers # can't have hyphen in a module name
movers.main()
or whatever, and let that do it.
Secondly, if you are going to move the files using Python's built-in libraries, see this SO question which explains that you should use shutil.copyfile rather than os.system.
This will also take care of the pause issue.
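A minimal sketch of that approach, assuming mover-s.py were renamed to movers.py and exposed a main(filename) entry point (both hypothetical here), with shutil doing the actual copy or move inside it:
import shutil
import sys

import movers  # hypothetical: mover-s.py renamed so it can be imported

files = [ ... ]  # the big long list of filenames

for item in files:
    sys.stdout.write("Analyzing & Moving " + item + "... ")
    sys.stdout.flush()
    movers.main(item)  # runs in this process, so it blocks until the move is done
    # inside movers.main, shutil.copyfile(src, dst) or shutil.move(src, dst)
    # would replace any os.system(...) calls
    sys.stdout.write("Done.\n")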


Related

print information in a corner when using python/ipython REPL -- and quitting it while thread is running

I am using a Python library I wrote to interact with a custom USB device. The library needs to send and receive data. I also need to interactively call methods. At the moment I use two shells, one receiving only and the other sending only. The latter is in the (i)python REPL. It works, but it is clumsy, so I want to consolidate the two things in a single shell, which will have the advantage of giving access to the data structures of both sides in one context. That works fine. The problem is in the UI.
In fact, the receiving part needs to asynchronously print some information. So I wrote something like the following:
#!/usr/bin/python
import threading
import time
import blessings

TOTAL = 5

def print_time(thread_id, delay):
    count = 0
    t = blessings.Terminal()
    while count < TOTAL:
        time.sleep(delay)
        count += 1
        stuff = "Thread " + str(thread_id) + " " + str(time.ctime(time.time())) + " -- " + str(TOTAL - count) + " to go"
        with t.location(t.width - len(stuff) - 1, thread_id):
            print(stuff, end=None)
        print("", end="")  # just return the cursor

try:
    t1 = threading.Thread(target=print_time, args=(1, 2,))
    t1.start()
    print("Thread started")
except:
    print("Error: unable to start thread")
That is my __init__.py file for the module. It somewhat works, but it has two problems:
While the thread is running, you cannot exit the REPL with either CTRL-D or sys.exit() (that is the reason I am using TOTAL=5 above, so your life is easier if you try this code). This is a problem since my actual thread needs to be an infinite loop. I guess one solution could be to exit via a custom call which would break out of that infinite loop, but is there anything better?
The cursor does not return correctly to its earlier position
if I remove the end="" in the line with the comment # just return the cursor, it sort of works, but obviously prints an unwanted newline in the place the cursor was (which messes up other input and/or output which might be happening there, in addition to adding that unwanted newline)
if I leave the end="" it does not return the cursor, not even if I add something to print, e.g. print(".", end="") -- the dots are printed at the right place, but the blinking cursor and the input are printed at the top
I know these are two unrelated problem and I could have asked two separate questions, but I need an answer to both, or otherwise it's a moot point. Or alternatively, I am open to other solutions. I thought of a separate GTK window, and that might work, but it's a subpar solution, since I really would like this to work in CLI only (to keep it possible in a ssh-without-X-tunneling setup).
Using blessed instead of blessings avoids the problem with the cursor not returning to its previous position, even without anything outside of the with context.
Making the thread a daemon solves the other problem.
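A minimal sketch of those two changes applied to the example above (swap the import to blessed and mark the thread as a daemon; the rest mirrors the original structure):
import threading
import time
import blessed  # drop-in replacement for blessings in this example

TOTAL = 5

def print_time(thread_id, delay):
    count = 0
    t = blessed.Terminal()
    while count < TOTAL:
        time.sleep(delay)
        count += 1
        stuff = "Thread " + str(thread_id) + " " + str(time.ctime(time.time())) + " -- " + str(TOTAL - count) + " to go"
        with t.location(t.width - len(stuff) - 1, thread_id):
            print(stuff, end="")

# daemon=True means this thread no longer keeps the interpreter alive,
# so CTRL-D (or exit()) works even if the loop were infinite.
t1 = threading.Thread(target=print_time, args=(1, 2), daemon=True)
t1.start()
print("Thread started")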

print string to terminal over and over again without flickering in python

I wrote a little Python 3 script that runs another program with Popen and processes its output to create a little dashboard for it. The script generates a long string with information about the other program, clears the terminal, and prints it. Every time the screen refreshes, the whole terminal flickers.
Here are the important parts of my script:
def screen_clear():
    if os.name == 'posix':
        os.system('clear')
    else:
        os.system('cls')

def display(lines):
    # lines as a list of, well, lines I guess
    s = ''
    for line in lines:
        s = s + '\n' + str(line)
    screen_clear()
    print(s)
I bet there's a more elegant way to do this without flickering, right?
Thanks for any help in advance!
The only solution I can think of to try would be using print(s, end='\r') instead of clearing the screen first and printing again. The \r returns the cursor to the start of the line, so the next print overwrites it (note that this only covers a single line).
In the end I'm sorry to say that consoles are simply not made to be used as a dashboard with permanently changing values. If the aforementioned solution doesn't work, maybe try implementing your dashboard in another way; Python offers lots of solutions for that.
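Two rough sketches of that idea, assuming an ANSI-capable terminal: the single-line \r trick, and a multi-line variant that rewrites in place instead of clearing, which avoids the clear-then-print flicker.
import sys
import time

# Single-line "dashboard": \r returns the cursor to the start of the line,
# so the next write overwrites it -- no screen clear, no flicker.
for i in range(10):
    sys.stdout.write("\rprocessed: %3d items" % i)
    sys.stdout.flush()
    time.sleep(0.2)
sys.stdout.write("\n")

# Multi-line variant: move the cursor to the top-left with "\033[H" and
# overwrite; "\033[K" erases whatever is left of the old line.
def display(lines):
    sys.stdout.write("\033[H")
    for line in lines:
        sys.stdout.write(str(line) + "\033[K\n")
    sys.stdout.flush()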

Chaining Python Scripts

I have two user-defined Python scripts. The first takes a file and processes it, while the second takes the output of the first, runs an executable, and supplies the first script's output to that program with additional formatting.
I need to run these scripts via another python script, which is my main executable script.
I searched a bit about this topic, and:
I can use importlib to gather the content of the scripts so that I can call them at the appropriate times. This requires the scripts to be under my directory, or a modification to the path environment variable, so it is a bit ugly at best and does not seem Pythonic.
The built-in eval function. This requires the user to write a server-client-like structure, because the second script might have to run the said program more than once while the first script is still producing output.
I think I'm designing something wrong, but I cannot come up with a better approach.
A more detailed explanation (maybe gibberish)
I need to benchmark some programs. While doing so I have a standard form of data, and this data needs to be supplied to the benchmark programs. The scripts are (due to the nature of the benchmark) specific to each program and need to be bundled with the benchmark definition, yet I need to create this program as a standalone, configurable tester. I think I have designed something wrong, and would love to hear about other design approaches.
PS: I do not want to limit the user, and this is the reason why I chose to run Python scripts.
I created a few test scripts to make sure this works.
The first one (count_01.py) sleeps for 100 seconds, then counts from 0 to 99 and writes the numbers to count_01.output.
The second one (count_02.py) reads the output of the first one (count_01.output), adds 1 to each number, and writes that to count_02.output.
The third script (chaining_programs.py) runs the first one and waits for it to finish before calling the second one.
# count_01.py --------------------
from time import sleep
sleep(100)
filename = "count_01.output"
file_write = open(filename, "w")
for i in range(100):
    #print " i = " + str(i)
    output_string = str(i)
    file_write.write(output_string)
    file_write.write("\n")
file_write.close()
# ---------------------------------
# count_02.py --------------------
file_in = "count_01.output"
file_out = "count_02.output"
file_read = open(file_in, "r")
file_write = open(file_out, "w")
for i in range(100):
    line_in = file_read.next()
    line_out = str(int(line_in) + 1)
    file_write.write(line_out)
    file_write.write("\n")
file_read.close()
file_write.close()
# ---------------------------------
# chaining_programs.py -------------------------------------------------------
import subprocess
import sys
#-----------------------------------------------------------------------------
path_python = 'C:\Python27\python.exe' # 'C:\\Python27\\python.exe'
#
# single slashes did not work
#program_to_run = 'C:\Users\aaaaa\workspace\Rich_Project_044_New_Snippets\source\count.py'
program_to_run_01 = 'C:\\Users\\aaaaa\\workspace\\Rich_Project_044_New_Snippets\\source\\count_01.py'
program_to_run_02 = 'C:\\Users\\aaaaa\\workspace\\Rich_Project_044_New_Snippets\\source\\count_02.py'
#-----------------------------------------------------------------------------
# waits
sys.pid = subprocess.call([path_python, program_to_run_01])
# does not wait
sys.pid = subprocess.Popen([path_python, program_to_run_02])
#-----------------------------------------------------------------------------
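If the second step must also finish before anything that comes after it, one option (a small sketch, not part of the original answer) is to keep the Popen handle and wait on it explicitly:
import subprocess

path_python = 'C:\\Python27\\python.exe'
program_to_run_02 = 'C:\\Users\\aaaaa\\workspace\\Rich_Project_044_New_Snippets\\source\\count_02.py'

proc = subprocess.Popen([path_python, program_to_run_02])
return_code = proc.wait()  # blocks until count_02.py exits
print "count_02.py finished with return code", return_code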

Python: Reread contents of a file

I have a file that an application updates every few seconds, and I want to extract a single number field in that file, and record it into a list for use later. So, I'd like to make an infinite loop where the script reads a source file, and any time it notices a change in a particular figure, it writes that figure to an output file.
I'm not sure why I can't get Python to notice that the source file is changing:
#!/usr/bin/python
import re
from time import gmtime, strftime, sleep

def write_data(new_datapoint):
    output_path = '/media/USBHDD/PythonStudy/torrent_data_collection/data_one.csv'
    outfile = open(output_path, 'a')
    outfile.write(new_datapoint)
    outfile.close()

forever = 0
previous_data = "0"

while forever < 1:
    input_path = '/var/lib/transmission-daemon/info/stats.json'
    infile = open(input_path, "r")
    infile.seek(0)
    contents = infile.read()
    uploaded_bytes = re.search('"uploaded-bytes":\s(\d+)', contents)
    if uploaded_bytes:
        current_time = strftime("%Y-%m-%d %X", gmtime())
        current_data = uploaded_bytes.group(1)
        if current_data != previous_data:
            write_data("," + current_time + "$" + uploaded_bytes.group(1))
        previous_data = uploaded_bytes.group(1)
        infile.close()
        sleep(5)
    else:
        print "couldn't write" + strftime("%Y-%m-%d %X", gmtime())
        infile.close()
        sleep(60)
As it is now, the (messy) script writes once correctly, and then I can see that although my source file (stats.json) is changing, my script never picks up on any changes. It keeps running, but my output file doesn't grow.
I thought that an open() and a close() would do the trick, and then tried throwing in a .seek(0).
What file method am I missing to ensure that python re-opens and re-reads my source file, (stats.json)?
Unless you are implementing some synchronization mechanism or can somehow guarantee atomic reads and writes, I think you are asking for race conditions and subtle bugs here.
Imagine the "reader" accessing the file while the "writer" hasn't completed its write cycle: there is a risk of reading incomplete or inconsistent data. On "modern" systems, you could also hit the cache and not see file modifications "live" as they happen.
I can think of two possible solutions:
You forgot the parentheses on the close in the else of the infinite loop.
infile.close --> infile.close()
The program that is changing the JSON file is not closing the file, and therefore it is not actually changing.
Two problems I see:
Are you sure your file is really updated on the filesystem? I do not know what operating system you are playing with your code on, but caching may kick your a$$ in this case if the file is not flushed by the producer.
Your problem is worth considering a pipe instead of a file; however, I cannot guarantee what transmission will do if it gets stuck writing to a pipe because your consumer is dead.
Answering your problems, consider using one of the following:
pyinotify
watchdog
watcher
These modules are intended to monitor changes on the filesystem and then call the proper actions. The method in your example is primitive, has a big performance penalty, and has a couple of other problems mentioned already in other answers.
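As a rough illustration of the watchdog route (a sketch only; it assumes the third-party watchdog package is installed and that stats.json lives in the transmission-daemon directory used in the question):
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

class StatsChangedHandler(FileSystemEventHandler):
    def on_modified(self, event):
        # called by watchdog whenever something in the watched directory changes
        if event.src_path.endswith('stats.json'):
            print('stats.json changed, re-read it here')

observer = Observer()
observer.schedule(StatsChangedHandler(), '/var/lib/transmission-daemon/info')
observer.start()
try:
    while True:
        time.sleep(1)
finally:
    observer.stop()
    observer.join()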
Ilya, would it help to check (via os.path.getmtime) whether stats.json changed before you process the file?
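A minimal sketch of that idea (polling the modification time and only re-reading when it changes; the 5-second interval is just for illustration):
import os
import time

full_path = '/var/lib/transmission-daemon/info/stats.json'
last_mtime = None

while True:
    mtime = os.path.getmtime(full_path)
    if mtime != last_mtime:
        last_mtime = mtime
        # the file changed on disk -- re-open and re-read it here
        with open(full_path) as fp:
            contents = fp.read()
    time.sleep(5)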
Moreover, I'd suggest taking advantage of the fact that it's a JSON file:
import json
import os

dir_name = '/home/klaus/.config/transmission/'
# stats.json of the daemon might be elsewhere
file_name = 'stats.json'
full_path = os.path.join(dir_name, file_name)

with open(full_path) as fp:
    data = json.load(fp)  # load it once; a second json.load(fp) would fail at end of file
print data['uploaded-bytes']
Thanks for all the answers; unfortunately my error was in the shell, and not in the Python script.
The cause of the problem turned out to be the way I was putting the script in the background. I was doing Ctrl+Z, which I thought would put the task in the background. But it does not: Ctrl+Z only suspends the task and returns you to the shell; a subsequent bg command is necessary for the script to run its infinite loop in the background.

Python: how to modify/edit the string printed to screen and read it back?

I'd like to print a string to the command line / terminal in Windows and then edit / change the string and read it back. Does anyone know how to do it? Thanks
print "Hell"
Hello! <---Edit it on the screen
s = raw_input()
print s
Hello!
You could do some ANSI trickery to make it look like you are editing on screen. Check out this link (also similar to this SO post on colors).
This would only work on certain terminals and configurations. ymmv.
This python script worked in my Cygwin terminal on Win7:
print 'hell'
print '\033[1A\033[4CO!'
Ends up printing hellO! on one line. The 2nd print moves the cursor up one line (Esc[1A), then over 4 characters (Esc[4C), and then prints the 'O!'.
It wouldn't let you read it back though... only a 1/2 answer.
I had this same use-case for a command-line application.
Finally found a hack to do this.
# pip install pyautogui gnureadline
import pyautogui
import readline
from threading import Thread

def editable_input(text):
    Thread(target=pyautogui.write, args=(text,)).start()
    modified_input = input()
    return modified_input

a = editable_input("This is a random text")
print("Received input : ", a)
The trick here is to use pyautogui to send the text from the keyboard, and we want that to happen right as input() is waiting. Since input() is a blocking call, we run the pyautogui command in a different thread and call input() immediately afterwards in the main thread.
gnureadline is for making sure we can press left and right arrow keys to move the cursor in a terminal without printing escape characters.
Tested this on Ubuntu 20, python 3.7
raw_input accepts a parameter for a "prompt message", so use that to output the message, and then prepend it to what you get back. However, this won't allow you to backspace into the prompt, because it's a prompt and not really part of the input.
s = "Hell" + raw_input("Hell")
print s
os.sys.stdout is write-only, but you can erase some characters of the last line with \b, or the whole line with \r, as long as you have not yet written a newline.
(however, see also my question about limitations to the standard python console/terminal)
I once made some output exercises (including a status bar) to write, erase, or animate if you will; perhaps it is helpful:
from __future__ import print_function
import sys, time

# status generator
def range_with_status(total):
    n = 0
    while n < total:
        done = '#' * (n + 1)
        todo = '-' * (total - n - 1)
        s = '<{0}>'.format(done + todo)
        if not todo:
            s += '\n'
        if n > 0:
            s = '\r' + s
        sys.stdout.write(s)
        sys.stdout.flush()
        yield n
        n += 1

print('doing something ...')
for i in range_with_status(10):
    time.sleep(0.1)
print('ready')
time.sleep(0.4)
print('And now for something completely different ...')
time.sleep(0.5)
msg = 'I am going to erase this line from the console window.'
sys.stdout.write(msg); sys.stdout.flush()
time.sleep(1)
sys.stdout.write('\r' + ' ' * len(msg))
sys.stdout.flush()
time.sleep(0.5)
print('\rdid I succeed?')
time.sleep(4)
If it's for your own purposes, then here's a dirty wee hack using the clipboard without losing what was there before:
def edit_text_at_terminal(text_to_edit):
    import pyperclip
    # Save old clipboard contents so user doesn't lose them
    old_clipboard_contents = pyperclip.paste()
    # Place the text you want to edit in the clipboard
    pyperclip.copy(text_to_edit)
    # If you're on Windows, and ctrl+v works, you can do this:
    import win32com.client
    shell = win32com.client.Dispatch("WScript.Shell")
    shell.SendKeys("^v")
    # Otherwise you should tell the user to type ctrl+v
    msg = "Type ctrl+v (your old clipboard contents will be restored):\n"
    # Get the new value; the old value will have been pasted
    new_value = str(raw_input(msg))
    # Restore the old clipboard contents before returning the new value
    pyperclip.copy(old_clipboard_contents)
    return new_value
Note that ctrl+v doesn't work in all terminals, notably the Windows default (there are ways to make it work, though I recommend using ConEmu instead).
Automating the keystrokes for other OSs will involve a different process.
Please remember this is a quick hack and not a "proper" solution. I will not be held responsible for loss of entire PhD dissertations momentarily stored on your clipboard.
For a proper solution there are better approaches, such as curses on Linux; on Windows it's worth looking into AutoHotkey (perhaps throw up an input box, or do some keystroke/clipboard wizardry).
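A rough sketch of the curses idea (Unix terminals only; it uses curses.textpad, and the edited line is accepted with Ctrl-G):
import curses
from curses.textpad import Textbox

def edit_text(stdscr, text):
    stdscr.addstr(0, 0, "Edit the line below, then press Ctrl-G to accept:")
    win = curses.newwin(1, 60, 2, 0)  # a 1-row, 60-column window at row 2
    win.addstr(0, 0, text)            # pre-fill it with the string to edit
    stdscr.refresh()
    box = Textbox(win)
    box.edit()                        # user edits in place; Ctrl-G finishes
    return box.gather().strip()

result = curses.wrapper(edit_text, "Hell")
print(result)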
