send string between python script - python

I want to send 'hello world' to a script in python already running in ubuntu.
The script that's always running is this one (part of it):
print("$ echo 'foobar' > {0}".format(get_ttyname()))
print("$ echo 'foobar' > /proc/{0}/fd/0".format(os.getpid()))
sys.stdin.readline()
It prints the PID of the running process, so I can send stuff to its console with:
echo 'hello script!' > /proc/PID/fd/0
It will print it in the console! But I can't send \x15 or EOF or anything to break sys.stdin.readline() and do some other stuff in my script, for example:
def f(s):
    print 'we already read:', s

while True:
    s = sys.stdin.readline()
    print 'we break the readline'
    f(s)
.....blablabla some other stuff, and then we return to the top of the while to keep reading...
Does anyone know how to do it? The script that send the string will not always be running, but the script that receives the info will be always running.
PROBLEM SOLVED!
Thanks to Rafael, this is the solution:
Reader:
import os
import sys

path = "/tmp/my_program.fifo"
try:
    os.mkfifo(path)
except OSError:
    pass

fifo = open(path, "r")
while True:
    for line in fifo:
        linea = line
        print "Received: " + linea,
    fifo.close()
    if linea == 'quit':
        break
    fifo = open(path, "r")
Sender:
# -*- coding: utf-8 -*-
import os
path = "/tmp/my_program.fifo"
fifo = open(path, "w")
fifo.write("Hello Wordl!!\n")
fifo.close()

Since you obviously don't have a problem with being limited to a Unix system, you can use named pipes to communicate with the program. Very unix-y way to work.
Python provides the os.mkfifo function to ease creating named pipes; otherwise they work just like files.

Write to a text file that is read by the already running program. The two can interact via this file. For example, these two programs simultaneously read and write to an initially empty text file.
already.py
# executed 1st
import time
while True:
    text = open('file.txt').read()
    print 'File contents: ' + text
    time.sleep(5)
program.py
# executed 2nd
import time
while True:
    text = open('file.txt', 'a')
    text.write(raw_input('Enter data: '))
    text.close()
    time.sleep(5)

Related

python readlines not working during incron

I'm trying to call a python script through incron:
/data/alucard-ops/drop IN_CLOSE_WRITE /data/alucard-ops/util/test.py $@/$#
but I can't seem to read from the file passed. Here is the script:
#!/usr/bin/env /usr/bin/python3
import os,sys
logfile = '/data/alucard-ops/log/'
log = open(logfile + 'test.log', 'a')
log.write(sys.argv[1] + "\n")
log.write(str(os.path.exists(sys.argv[1])) + "\n")
datafile = open(sys.argv[1], 'r')
log.write('Open\n')
data = datafile.readlines()
log.write("read\n")
datafile.close()
The output generated by the script:
/data/alucard-ops/drop/nsco-20180219.csv
True
Open
It seems to stop at the readlines() call. I don't see any errors in the syslog.
Update: It seems that I can use a subprocess to cat the file, and that retrieves the contents. But when I decode it with data.decode('utf-8'), I'm back to nothing in the variable.
I ended up using watchdog instead.
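Since watchdog ended up being the workaround, here is a minimal sketch of what that can look like; only the watched directory comes from the question, and the handler class name and the processing inside it are assumptions:
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

class DropHandler(FileSystemEventHandler):
    def on_created(self, event):
        # fires when a new file appears in the watched directory; unlike
        # IN_CLOSE_WRITE the writer may still be writing at this point,
        # so real code may need to wait until the file is complete
        if not event.is_directory:
            with open(event.src_path) as datafile:
                data = datafile.readlines()
            # ... process data ...

observer = Observer()
observer.schedule(DropHandler(), '/data/alucard-ops/drop', recursive=False)
observer.start()
try:
    while True:
        time.sleep(1)
finally:
    observer.stop()
    observer.join()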

Save output of os.system to text file

I'm not great on all the technical terms so I'll do my best to explain my problem.
I've written a small script to open android SDK and check for attached devices (using windows 10 and python 2.7.14). The code I've got is as follows:
import os
import datetime
import time
print 'Current Directory:', os.getcwd()
print 'Opening Android SDK...'
os.chdir('C:\\android-sdk\\platform-tools')
print 'Current Directory:', os.getcwd()
t = time.ctime()
print t
print 'Checking for connected devices:'
os.system('adb devices -l')
That all works fine, but I want to get the last 3 lines to save to a text file. I've tried f = open('logfile.txt', 'w') then converting it all to a string using s = str(t, 'Checking for connected devices:', os.system('adb devices -l')) and writing it to the file and closing it, but it's not working. It's not even creating the file, let alone writing anything to it.
I'm probably missing something key but I'm a newbie at this so please be nice!
Any help would be much appreciated.
Many thanks
Edit: whole code with the write stuff included:
import os
import datetime
import time
print 'Current Directory:', os.getcwd()
print 'Opening Android SDK...'
os.chdir('C:\\android-sdk\\platform-tools')
print 'Current Directory:', os.getcwd()
t = time.ctime()
f = open('logfile.txt', 'w')
s = str(t, 'Checking for connected devices:', os.system('adb devices -l'))
f.write(s)
f.close()
os.system executes the command in a subshell and returns the command's exit code. It does not provide any means to capture the outputs of the command ("outputs" => what the command prints to its stdout/stderr streams).
To capture the command's outputs you'll have to use some of the subprocess module's features, the most obvious here being subprocess.check_output:
# ...
import subprocess
# ...
# NB : you may want to catch subprocess.CalledProcessError here
out = subprocess.check_output(['adb', 'devices', '-l'])
msg = "{t}\nChecking for connected devices:\n{out}".format(t=t, out=out)
with open('logfile.txt', 'w') as f:
    f.write(msg)
Try the following:
import os
import subprocess
import time
print ('Current Directory: {}'.format(os.getcwd()) )
print ('Opening Android SDK...')
os.chdir('C:\\android-sdk\\platform-tools')
print ('Current Directory: {}'.format(os.getcwd()) )
t = str(time.ctime())
try:
    process_output = subprocess.check_output(["adb", "devices", "-l"])
except:  # Add here the type of exception you want to raise and logic
    print("Please check your ADB installation and syntax.")
s = ('{} Checking for connected devices: {}'.format(t, process_output))
with open('logfile.txt', 'w') as f:
    f.write(s)
Thanks everyone for your help. The answer was:
import os
import time
import subprocess
print 'Current Directory:', os.getcwd()
print 'Opening Android SDK...'
os.chdir('C:\\android-sdk\\platform-tools')
print 'Current Directory:', os.getcwd()
print 'Checking for connected devices:'
t = time.ctime()
# save log of time and connected devices
with open('logfile.txt', 'w') as f:
    s = ('{}\n{}'.format(t, subprocess.check_output(["adb", "devices", "-l"])))
    f.write(s)
    print(s)
With Python 3.5+ you can (and probably should) use subprocess.run() which conveniently replaces the legacy subprocess.check_output() with a more versatile API.
import subprocess
with open('logfile.txt', 'w') as f:
    subprocess.run(['adb', 'devices', '-l'], stdout=f,
                   universal_newlines=True)  # this obscurely makes everything Unicode
Directly connecting the stdout of the subprocess to an open file handle is possible via the older subprocess.call() API too, mind you (check_output() itself captures the output rather than accepting a stdout argument).
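For instance, a minimal sketch with the pre-3.5 API, reusing the command and log file from the answers above:
import subprocess

with open('logfile.txt', 'w') as f:
    # call() waits for the command to finish and writes its stdout straight to the file
    subprocess.call(['adb', 'devices', '-l'], stdout=f)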

Exception for Python ftplib in my program?

I wrote this program to pull data from a text file in a website's directory (which is edited by the user on the site), but it seems to crash. A lot.
from sys import argv
import ftplib
import serial
from time import sleep
one = "0"
repeat = True
ser = serial.Serial("COM3", 9600)
while repeat == True:
    path = 'public_html/'
    filename = 'fileone.txt'
    ftp = ftplib.FTP("*omitted*")
    ftp.login("*omitted*", "*omitted*")
    ftp.cwd(path)
    ftp.retrbinary("RETR " + filename, open(filename, 'wb').write)
    ftp.quit()
    txt = open(filename)
    openup = txt.read()
    ser.write(openup)
    print(openup)
Does anyone know a way to stop it from crashing? I was thinking of using an exception, but I'm no Python expert. The program does what it's meant to do, by the way, and the address and login have been omitted for obvious reasons. If possible I'd also like an exception to stop the program from crashing when it disconnects from the serial port.
Thanks in advance!
Two things:
You might want to put all the ftplib-related code in a try-except block like so:
try:
    # code related to ftplib
except Exception, e:  # you can fill this in after you encounter the exception once
    print str(e)
You seem to be opening the file but not closing it when you're done. This might also cause errors later. The best way to do this would be:
with open(filename, 'r') as txt:
    openup = txt.read()
This way the file will be closed automatically once you're outside the 'with' block.
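Putting the two suggestions together, the loop from the question could look roughly like this. It is only a sketch: which exceptions to catch (ftplib.all_errors for the FTP part, serial.SerialException for the serial port) and the added polling delay are assumptions, not the asker's code.
import ftplib
import serial
from time import sleep

ser = serial.Serial("COM3", 9600)
path = 'public_html/'
filename = 'fileone.txt'

while True:
    try:
        ftp = ftplib.FTP("*omitted*")
        ftp.login("*omitted*", "*omitted*")
        ftp.cwd(path)
        with open(filename, 'wb') as local:
            ftp.retrbinary("RETR " + filename, local.write)
        ftp.quit()
    except ftplib.all_errors, e:  # network or FTP failure: report and retry
        print str(e)
        sleep(10)  # assumed retry interval so a broken connection is not hammered
        continue
    with open(filename, 'r') as txt:
        openup = txt.read()
    try:
        ser.write(openup)
    except serial.SerialException, e:  # serial port disconnected
        print str(e)
    print(openup)
    sleep(10)  # assumed polling interval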

fifo - reading in a loop

I want to use os.mkfifo for simple communication between programs. I have a problem with reading from the fifo in a loop.
Consider this toy example, where I have a reader and a writer working with the fifo. I want to be able to run the reader in a loop to read everything that enters the fifo.
# reader.py
import os
import atexit

FIFO = 'json.fifo'

@atexit.register
def cleanup():
    try:
        os.unlink(FIFO)
    except:
        pass

def main():
    os.mkfifo(FIFO)
    with open(FIFO) as fifo:
        # for line in fifo:             # closes after single reading
        # for line in fifo.readlines(): # closes after single reading
        while True:
            line = fifo.read()  # will return empty lines (non-blocking)
            print repr(line)

main()
And the writer:
# writer.py
import sys
FIFO = 'json.fifo'
def main():
    with open(FIFO, 'a') as fifo:
        fifo.write(sys.argv[1])

main()
If I run python reader.py and later python writer.py foo, "foo" will be printed, but the fifo will be closed and the reader will exit (or spin inside the while loop). I want the reader to stay in the loop, so I can execute the writer many times.
Edit
I use this snippet to handle the issue:
def read_fifo(filename):
    while True:
        with open(filename) as fifo:
            yield fifo.read()
but maybe there is some neater way to handle it, instead of repeatedly opening the file...
Related
Getting readline to block on a FIFO
You do not need to reopen the file repeatedly. You can use select to block until data is available.
import select

with open(FIFO_PATH) as fifo:
    while True:
        select.select([fifo], [], [fifo])
        data = fifo.read()
        do_work(data)
In this example you won't read EOF.
A FIFO works (on the reader side) exactly this way: it can be read from, until all writers are gone. Then it signals EOF to the reader.
If you want the reader to continue reading, you'll have to open again and read from there. So your snippet is exactly the way to go.
If you have multiple writers, you'll have to ensure that each data portion written by them is smaller than PIPE_BUF in order not to mix up the messages.
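As a small usage sketch of the generator from the question (the empty-read guard is an assumption; a writer that opens and closes without writing produces an empty string):
def read_fifo(filename):
    # reopen the FIFO after each EOF so the reader survives writers exiting
    while True:
        with open(filename) as fifo:
            yield fifo.read()

for data in read_fifo('json.fifo'):
    if data:  # skip empty reads from writers that wrote nothing
        print repr(data)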
The following methods on the standard library's pathlib.Path class are helpful here:
Path.is_fifo()
Path.read_text/Path.read_bytes
Path.write_text/Path.write_bytes
Here is a demo:
# reader.py
import os
from pathlib import Path
fifo_path = Path("fifo")
os.mkfifo(fifo_path)
while True:
    print(fifo_path.read_text())  # blocks until data becomes available
# writer.py
import sys
from pathlib import Path
fifo_path = Path("fifo")
assert fifo_path.is_fifo()
fifo_path.write_text(sys.argv[1])

Read file and copy to standard output.

I'm trying to write a python program that will read input and copy it to standard output (with no alterations). I've been told that it needs to operate as a Python version of the Unix cat function. If a file cannot be opened, an error message needs to be printed, and then the program needs to continue processing any additional files. I am a complete beginner, and have tried my best to scrape something together with my limited knowledge. Here is what I have so far:
from sys import argv, stdout, stdin, stderr
if len(argv) == 1:
    try:
        stdout.write(raw_input(' ') + '\n')
    except:
        stderr.write('sorry' + '\n')
        quit()
else:
    for filename in argv[1:]:
        try:
            filehandle = open(filename)
        except IOError:
            stderr.write('Sorry, could not open ' + filename + '\n')
            continue
        f = filehandle.read()
        stdout.write(f)
I am not quite sure where to go from here.. does anyone have any advice/am I on the right track even a little bit? Please and thank you!
This function will copy the specified file to the console line by line (in case you later on decide to give it the ability to use the -n command line option of cat)
def catfile(fn):
    with open(fn) as f:
        for line in f:
            print line,
It can be called with the filename once you have established the file exists.
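For instance, wired into the argument loop from the question it could look like this (a sketch; catching IOError around the call is one way to establish that the file can be opened, and the error message wording is an assumption):
from sys import argv, stderr

def catfile(fn):
    with open(fn) as f:
        for line in f:
            print line,

for filename in argv[1:]:
    try:
        catfile(filename)
    except IOError:
        # report the failure and keep processing the remaining files
        stderr.write('Sorry, could not open ' + filename + '\n')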
