I have a question about pexpect in Python.
What I want to do is run my script at a certain time, and then stop it at another time.
Pexpect doesn't work like it should. I don't know what I'm doing wrong, so can you give me some advice on my code below?
#!/usr/bin/python
# -*- coding: utf-8 -*-
date = '2014-09-06'
start = '15:32'
stop = '16:30'
import pexpect, sys
string = 'at '+start+' '+date
child = pexpect.spawn(string)
child.expect('warning: commands will be executed using /bin/sh')
child.expect('at> ')
child.sendline('./run_script.py\n')
child.expect('at> ')
child.sendline('\^D\n')
print child.before
The problem is that after all the commands are sent, pexpect won't create a job.
Any advice would be great.
Here the way Ctrl+D is sent is not valid. Also, even after sending Ctrl+D, the script has to wait a couple of seconds for the at command to register the new job before closing the pexpect spawn object.
import pexpect
import sys
import time
prompt = "at>"
try:
    conn = pexpect.spawn("at 14:30 2019-06-14")
    conn.logfile = sys.stdout        # on Python 3 use sys.stdout.buffer
    conn.expect(prompt)
    conn.sendline("touch /tmp/test.txt")
    conn.expect(prompt)
    conn.sendcontrol("d")            # send Ctrl+D the proper way
    time.sleep(3)                    # give at a moment to register the job
    conn.close()
except Exception as e:
    print(e)
After executing the above code snippet, run the command 'atq' in the Linux terminal to verify that the job has been queued up:
# atq
52 Fri Jun 14 14:30:00 2019 a root
I want to capture output in Python.
My code:
import os
import sys
import subprocess
import time
cmd = './abc'
proc = subprocess.Popen(cmd)
time.sleep(12)
stdoutOrigin=sys.stdout
sys.stdout = open("log.txt", "w")
sys.stdout.close()
sys.stdout=stdoutOrigin
proc.terminate()
The problem is that it never comes out of ./abc and is always stuck there. I need to kill the process.
Normally I have to press Ctrl+C to come out of it.
In this case, how can I capture the output, which comes every 30 seconds, and save it to a file? I need to capture it once.
You can redirect the output of a subprocess when starting it. Consider the following:
with open('proc.out', 'w') as proc_out:
    subprocess.run(cmd, stdout=proc_out)
The call to run is blocking, but all the output is written to the output file. Once your subprocess finishes, so does your Python script. You can still kill it prematurely, however.
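If you also want the early kill from your original snippet, a minimal sketch (assuming the './abc' command and the 12-second wait from the question) could use Popen with the same redirection and terminate the process yourself:
import subprocess
import time

cmd = './abc'
with open('proc.out', 'w') as proc_out:
    proc = subprocess.Popen(cmd, stdout=proc_out)
    time.sleep(12)       # let the program emit some output first
    proc.terminate()     # replaces the manual Ctrl+C
    proc.wait()          # reap the terminated process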
OK, I found the answer. Since using only the timeout would throw an error,
what I did was add it to a try/except block.
import os
import sys
import subprocess
import time
cmd = './abc'
try:
    with open('proc.out', 'w') as proc_out:
        subprocess.run(cmd, stdout=proc_out, timeout=30)
except subprocess.TimeoutExpired:
    pass
I am familiar with expect scripts, so I felt a bit odd when I first used pexpect. Take this simple script as an example:
#!/usr/bin/expect
set timeout 10
spawn npm login
expect "Username:"
send "qiulang2000\r"
expect "Password:"
send "xxxxx\r"
expect "Email:"
send "qiulang#gmail.com\r"
expect "Logged in as"
interact
When I run it, I get the following output. It feels natural because that is what I see when I run those commands interactively:
spawn npm login
Username: qiulang2000
Password:
Email: (this IS public) qiulang@gmail.com
Logged in as qiulang2000 on https://registry.npmjs.com/.
But when I use pexpect, no matter where I add print(child.after) or print(child.before), I just can't get output like expect's, e.g. when I run the following script:
#! /usr/bin/env python3
import pexpect
child = pexpect.spawn('npm login')
child.timeout = 10
child.expect('Username:')
print(child.after.decode("utf-8"))
child.sendline('qiulang2000')
child.expect('Password:')
child.sendline('xxxx')
child.expect('Email:')
child.sendline('qiulang@gmail.com')
child.expect('Logged in as')
print(child.before.decode("utf-8"))
child.interact()
I get the following output. It feels unnatural because that is not what I see when I run those commands interactively:
Username:
(this IS public) qiulang@gmail.com
qiulang2000 on https://registry.npmjs.com/.
So is it possible to achieve the expect script's output?
--- update ---
With the comment I got from @pynexj I finally made it work; check my answer below.
With the comment I got, I finally made it work:
#! /usr/bin/env python3
import pexpect
import sys
child = pexpect.spawn('npm login', timeout=10)
child.logfile_read = sys.stdout.buffer  # use sys.stdout.buffer for python3
child.expect('Username:')
child.sendline('qiulang2000')
child.expect('Password:')
child.sendline('xxxx')
child.expect('Email:')
child.sendline('qiulang@gmail.com')
child.expect('Logged in as')
If I need to call child.interact(), then it is important that I set child.logfile_read = None before it; otherwise sys.stdout will echo everything I type.
The answer here, How to see the output in pexpect?, said I need to pass an encoding for Python 3, but I found that if I use encoding='utf-8' it causes TypeError: a bytes-like object is required, not 'str'. If I don't set the encoding at all, everything works fine.
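For what it's worth, the TypeError seems to come from mixing the two modes: with encoding='utf-8' pexpect hands the log file decoded str, which cannot be written to the binary sys.stdout.buffer. A hedged variant that keeps the encoding would log to the text stream instead:
# assumption: with an encoding set, spawn writes str to the log file,
# so the text stream sys.stdout is used instead of sys.stdout.buffer
child = pexpect.spawn('npm login', encoding='utf-8', timeout=10)
child.logfile_read = sys.stdout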
So a simple ssh login script looks like this:
#!/usr/bin/env python3
import pexpect
import sys
child = pexpect.spawn('ssh qiulang@10.0.0.32')
child.logfile_read = sys.stdout.buffer
child.expect('password:')
child.sendline('xxxx')
#child.expect('Last login') don't do that
child.logfile_read = None # important !!!
child.interact()
One problem remains unresolved: I had added one last expect call to match the ssh login output after sending the password, e.g. child.expect('Last login').
But if I added that, that line would show up twice, as below. I have given up trying; as one comment said, "pexpect's behavior is kind of counter-intuitive".
Welcome to Ubuntu 16.04 LTS (GNU/Linux 4.4.0-141-generic x86_64)
* Documentation: https://help.ubuntu.com/
33 packages can be updated.
0 updates are security updates.
Last login: Fri Sep 11 11:44:19 2020 from 10.0.0.132
: Fri Sep 11 11:44:19 2020 from 10.0.0.132
I have a simple Python script that takes screenshots of a computer that is running Ubuntu. I want it to run automatically on startup, so I put @reboot python3 /bin/program.py in the non-sudo version of crontab.
The program works fine when run from a terminal, but gives the error pyscreenshot.err.FailedBackendError. I wrapped it in a try block, had it write all exceptions to a file, and that's how I found the error message, "All backends failed."
It has something to do with the 'pyscreenshot' module not working correctly.
import pyscreenshot as screen
import os
from numpy import random
from time import sleep
from os.path import expanduser
TMP_SCREEN_PATH = expanduser('~') + '/.UE/tmp.png'
LOG_FILE_PATH = expanduser('~') + '/.UE/log.txt'
GRAB_DELAY_RANGE = (1, 10)
def screenshot(save_path=TMP_SCREEN_PATH):
    img = screen.grab()
    img.save(save_path)

def delay(delay_range):
    sleep_time = random.randint(delay_range[0], delay_range[1])
    print(f"Sleeping for {sleep_time} seconds")
    sleep(sleep_time)

def main():
    try:
        while True:
            screenshot()
            delay(GRAB_DELAY_RANGE)
    except KeyboardInterrupt:
        print("Nope")
        main()
    except Exception as e:
        print(e)
        with open(LOG_FILE_PATH, 'a') as f:
            f.write(str(type(e))+str(e)+'\n')
        sleep(5)
        main()

f = open(LOG_FILE_PATH, 'w+')
f.write('Startup')
f.close()
main()
I need one of the following solutions:
Simply fix the problem
Another way to run a program at startup
A different module to take screenshots with
Any help is appreciated, thanks
If the user that the cron job runs as is also logged in on the console (you mention a reboot, so I'm guessing that you have enabled autologin), then your cron job might work if you also add:
os.environ["DISPLAY"] = ":0"
This worked for me on Ubuntu in a test using cron and a simplified version of your script:
import os
import pyscreenshot as screen
os.environ["DISPLAY"] = ":0"
img = screen.grab()
img.save("/tmp/test.png")
If it doesn't work for you, then you might also have to try setting the value of the XAUTHORITY environment variable to the value found in the environment of the user's interactive processes, which could be extracted using the psutil package, but let's hope this isn't needed.
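If it does turn out to be needed, a rough sketch of that idea (the username is a placeholder, and there is no guarantee the first matching process carries the right values) might look like this:
import os
import psutil

def find_x_env(username):
    # walk the user's processes and copy DISPLAY/XAUTHORITY from the first
    # one that exposes both; which process has them depends on the session
    for proc in psutil.process_iter(["username"]):
        if proc.info["username"] != username:
            continue
        try:
            env = proc.environ()
        except (psutil.AccessDenied, psutil.NoSuchProcess, psutil.ZombieProcess):
            continue
        if "DISPLAY" in env and "XAUTHORITY" in env:
            return env["DISPLAY"], env["XAUTHORITY"]
    return None

found = find_x_env("your-username")  # placeholder username
if found:
    os.environ["DISPLAY"], os.environ["XAUTHORITY"] = found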
I'm trying to write a very simple program which controls a remote machine using pexpect, but the remote system does not react to the commands sent.
Here is the source code:
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import pexpect
import sys
child = pexpect.spawn('telnet 192.168.2.81 24')
res = child.expect('/ # ')
print(res)
res = child.sendline('touch foo')
print(res)
Here is the output:
0
10
So, as far as I understand, the commands are executed successfully, but there is no result on the target system, i.e. the foo file is not created.
Could anybody help me?
Add the following line after pexpect.spawn(), or you will see nothing:
# for Python 2
child.logfile_read = sys.stdout
# for Python 3
child.logfile_read = sys.stdout.buffer
You also need the following statements at the end (otherwise the script would exit immediately after sendline('touch foo'), so touch foo does not get a chance to run):
child.sendline('exit')
child.expect(pexpect.EOF)
According to the manual:
The logfile_read and logfile_send members can be used to separately log the input from the child and output sent to the child. Sometimes you don’t want to see everything you write to the child. You only want to log what the child sends back. For example:
child = pexpect.spawn('some_command')
child.logfile_read = sys.stdout
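Putting the pieces together, a sketch of the question's script with both fixes applied (logging what the child sends back, plus exit and EOF so touch foo actually gets a chance to run) might look like this; the host, port and prompt are taken from the question:
#!/usr/bin/env python3
import pexpect
import sys

child = pexpect.spawn('telnet 192.168.2.81 24')
child.logfile_read = sys.stdout.buffer   # Python 3: log the child's output
child.expect('/ # ')
child.sendline('touch foo')
child.expect('/ # ')                     # wait for the prompt to return
child.sendline('exit')
child.expect(pexpect.EOF)                # let the session finish cleanly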
I am making a script that reads the output from a 433 MHz receiver on an Arduino over I2C to a Raspberry Pi. I have tried starting it from rc.local, but it seems that after a couple of days the script ends/stops/breaks/halts/is killed (?).
I have tried running the following script from cron to determine whether the Python script is running and to start it if it has stopped. But it does not seem to detect a running script; it starts a new one every time it runs, eventually crashing the system.
#!/bin/bash
until <path to Python script>; do
    sleep 1
done
exit 0
I have also tried replacing the always-true while statement with one that lets the script run for a minute and restarting the script with cron each minute, but that also results in the script not ending and a new process being started.
Does anyone have any idea how I can either make the script stable or able to restart? I.e. always running, until infinity! :-)
Python script:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#Run in correct folder
import os
os.system("<path to script-folder>")
import sys
import MySQLdb as mdb
from smbus import SMBus
import RPi.GPIO as GPIO
import datetime
import time
addr = 0x10
intPin = 4
bus = SMBus(1)
def readData():
    val = bus.read_byte(addr)
    raw = val << 24
    val = bus.read_byte(addr)
    raw = raw | (val << 16)
    val = bus.read_byte(addr)
    raw = raw | (val << 8)
    val = bus.read_byte(addr)
    raw = raw | val
    # Timestamp
    ts = time.time()
    date = datetime.datetime.fromtimestamp(ts).strftime('%Y-%m-%d %H:%M:%S')
    try:
        con = mdb.connect('<server>', '<User>', '<password>', '<DB>')
        con.autocommit(True)
        cur = con.cursor()
        cur.execute("INSERT INTO <DB statement>") # Data is stored as integers
    except mdb.Error, e:
        errorLog = open('<path to log>', 'a')
        errorLog.write("Error %d: %s" % (e.args[0], e.args[1]) + "\n")
        errorLog.close()
        sys.exit(1)
    finally:
        if con:
            cur.close()
            con.close()

while True:
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(intPin, GPIO.IN)
    GPIO.wait_for_edge(intPin, GPIO.RISING)
    readData()
Use Linux screen to run the Python script; it will keep running in the background until you stop the script.
http://www.tecmint.com/screen-command-examples-to-manage-linux-terminals/
Consider using daemontools. It's small and will ensure your code runs...forever.