Simulate coding session in Python

I would like to simulate a coding session for a video recording (I am not a touch typist :-)).
For example, I have a shell script like this (test.sh):
hello="Hello"
world="world"
echo $hello", "$world
And I have a Python script like this (Simulate_KeyPresses.py):
import sys
import time
import subprocess

def send_letter(letter):
    # V1: simple print
    sys.stdout.write(letter)
    sys.stdout.flush()
    # V2: test with expect (apt-get install expect)
    # cmd = """echo 'send "{}"' | expect""".format(letter)
    # subprocess.run(cmd, shell=True)

def simulate_keypresses(content):
    lines = content.split("\n")
    for line in lines:
        for c in line:
            send_letter(c)
            time.sleep(0.03)
        send_letter("\n")
        time.sleep(0.5)

if __name__ == "__main__":
    filename = sys.argv[1]
    with open(filename, "r") as f:
        content = f.read()
    simulate_keypresses(content)
Which I can invoke like this:
python Simulate_KeyPresses.py test.sh
And it works beautifully.
However when I pipe it to bash, like this:
python Simulate_KeyPresses.py test.sh | /bin/bash
I get
Hello, world
i.e. I only get stdout; the simulated key presses are not shown.
What I would like to see:
hello="Hello"
world="world"
echo $hello", "$world
Hello, world
I found a related answer (Simulate interactive python session), but it only handles Python coding sessions.
I tried to use Expect, but it does not work as intended (it does not show stdin either).
Help would be appreciated!

You can use the program tee, which copies its input both to the terminal (/dev/tty) and on down the pipe:
python Simulate_KeyPresses.py test.sh | tee /dev/tty | /bin/bash
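If you'd rather not depend on tee, the script itself can duplicate each keystroke to the terminal. This is a sketch, not part of the original script: it writes to /dev/tty when one is available and falls back to stderr otherwise, so the typing stays visible even when stdout is piped into bash.

```python
import sys
import time

def open_tty():
    # Prefer the controlling terminal; fall back to stderr when the
    # script runs without one (e.g. non-interactively).
    try:
        return open("/dev/tty", "w")
    except OSError:
        return sys.stderr

def simulate_keypresses(content):
    tty = open_tty()
    try:
        for line in content.split("\n"):
            for c in line + "\n":
                tty.write(c)          # visible on the terminal
                tty.flush()
                sys.stdout.write(c)   # goes down the pipe to bash
                sys.stdout.flush()
                time.sleep(0.03)
            time.sleep(0.5)
    finally:
        if tty is not sys.stderr:
            tty.close()
```

With this, `python Simulate_KeyPresses.py test.sh | /bin/bash` should show both the "typing" and the commands' output.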

How about adding this at the end of your script (it assumes the file is executable):
subprocess.call("./{}".format(filename), shell=True)
The result will be:
hello="Hello"
world="world"
echo $hello", "$world
Hello, world
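Putting the two pieces together, a self-contained version of Simulate_KeyPresses.py might look like this (a sketch: it runs the file through /bin/bash rather than relying on it being executable):

```python
import subprocess
import sys
import time

def simulate_keypresses(content):
    # Echo each character with a short delay, as if typed live.
    for line in content.split("\n"):
        for c in line + "\n":
            sys.stdout.write(c)
            sys.stdout.flush()
            time.sleep(0.03)
        time.sleep(0.5)

if __name__ == "__main__" and len(sys.argv) > 1:
    filename = sys.argv[1]
    with open(filename, "r") as f:
        simulate_keypresses(f.read())
    # Now actually execute the script so its output follows the echo.
    subprocess.call(["/bin/bash", filename])
```

Run as `python Simulate_KeyPresses.py test.sh`, with no pipe needed.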

Related

How to capture bash command output using python script

How can I capture bash command output using a Python script?
For example, running the below in Linux:
[root@xxxxxx oracle]# echo sumit
sumit
[root@xxxxxx oracle]#
How can I re-print only the above output using a Python script? I.e., running python test.py should give 'sumit' as output. I tried the below:
test.py:
import sys
sys.stdout.flush()
out = sys.stdin.readline()
print(out)
The above prints only the input I type, but not the already-displayed output.
With subprocess, you can run commands and check their return code, stdout and stderr outputs. Would that help?
For example:
import subprocess as proc
byte_output = proc.check_output(["ls", "-1", "."])
str_output = str(byte_output, "utf-8")
print(str_output)
# prints the local folder names, e.g. "dev\ngit"
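On Python 3.5+ the same capture can be done with subprocess.run; this sketch uses echo as a stand-in for whatever command you want to re-print:

```python
import subprocess

# capture_output=True (Python 3.7+) collects stdout and stderr;
# text=True decodes the bytes to str.
result = subprocess.run(["echo", "sumit"], capture_output=True, text=True)
print(result.stdout, end="")
```

result also carries returncode and stderr, so you can check for failures in the same call.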

Running a C executable inside a python program

I have written a C program that converts one file format to another. It takes one command-line argument: filestem.
I executed that code using: ./executable_file filestem > outputfile
where I got my desired output inside outputfile.
Now I want to take that executable and run it from within a Python script.
I am trying:
import subprocess
import sys
filestem = sys.argv[1]
subprocess.run(['/home/dev/executable_file', filestem , 'outputfile'])
But it is unable to create the outputfile. I think something should be added to handle the > redirection, but I am unable to figure out what. Please help.
subprocess.run has an optional stdout argument; you can give it a file handle, so in your case something like this:
import subprocess
import sys

filestem = sys.argv[1]
with open('outputfile', 'wb') as f:
    subprocess.run(['/home/dev/executable_file', filestem], stdout=f)
should work. I do not have the ability to test it, so please run it and write back whether it works as intended.
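If you also want the Python script to fail loudly when the executable fails, check=True makes run() raise CalledProcessError on a non-zero exit status. A sketch, using echo in place of the real executable:

```python
import subprocess

with open("outputfile", "wb") as f:
    # Raises subprocess.CalledProcessError if the command exits non-zero.
    subprocess.run(["echo", "converted"], stdout=f, check=True)
```

Afterwards outputfile contains the command's stdout, just as the shell's > redirection produced.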
You have several options:
NOTE - Tested in CentOS 7, using Python 2.7
1. Try pexpect:
"""Usage: executable_file argument ("ex. stack.py -lh")"""
import pexpect
filestem = sys.argv[1]
# Using ls -lh >> outputfile as an example
cmd = "ls {0} >> outputfile".format(filestem)
command_output, exitstatus = pexpect.run("/usr/bin/bash -c '{0}'".format(cmd), withexitstatus=True)
if exitstatus == 0:
print(command_output)
else:
print("Houston, we've had a problem.")
2. Run subprocess with shell=True (not recommended):
"""Usage: executable_file argument (ex. "stack.py -lh")"""
import subprocess
import sys

filestem = sys.argv[1]
# Using ls -lh >> outputfile as an example
cmd = "ls {0} >> outputfile".format(filestem)
result = subprocess.check_output(cmd, shell=True)  # or subprocess.call(cmd, shell=True)
print(result)
It works, but python.org frowns upon this due to the chance of a shell injection: see "Security Considerations" in the subprocess documentation.
3. If you must use subprocess, run each command separately and pipe the STDOUT of the previous command into the STDIN of the next:
p1 = subprocess.Popen(cmd1, stdout=subprocess.PIPE)
p2 = subprocess.Popen(cmd2, stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()  # lets cmd1 receive SIGPIPE if cmd2 exits first
stdout_data, stderr_data = p2.communicate()
etc...
Good luck with your code!

Python Script that takes in Linux Commands to return Server Health

For my job, I'm developing a small script that users can run to essentially do a check of log files for errors. I'm familiar with both Python and cmd prompts but not running one inside the other.
I read around quite a bit but can't really find the best process for this. Many solutions seem more complex than my intentions require.
Ideally what I want to build is a program that follows this process:
For all dirs in host:
    cd into directory, then grep the log file for a certain string
        - print errors
        - return back to dir
cd into directory:
    df -h
My personal preference would be to do something like this (pseudocode):
def myFirstCheck():
    file_result = cat a/b/c/LogFile.log | awk /ERROR\|/FATAL\
    file0_result = cat a/b/c/LogFile2.log | awk /ERROR\|/FATAL\
    return file_result, file0_result

def mySecondCheck():
    print('Server 2 check:')
    file2_result = cat d/e/f/LogFile3.log | awk /ERROR\|/FATAL\
    file3_result = cat d/e/f/LogFile4.log | awk /ERROR\|/FATAL\
    return file2_result, file3_result
file_result, file0_result = myFirstCheck()
print('Server 1 check:')
print('df -h') #I want this to return the output from cmd 'df -h'
print(file_result)
print(file0_result)
file2_result, file3_result = mySecondCheck()
print('Server 2 check:')
print('df -h') #I want this to return the output from cmd 'df -h'
print(file2_result)
print(file3_result)
#exit
I know this is wildly inefficient and may be a simplistic way of thinking about something more complex. I just want to see if there's any helpful discussion from those with previous experience trying to do a similar thing.
It looks like you are trying to do a lot of things through the shell, but you can do them pythonically.
You can read a file like this:
with open("/path/to/log/file", "r") as f:
    data = f.read()
error_lines = [line for line in data.splitlines() if "ERROR" in line]
Secondly, for df -h, you can simply run it as:
import subprocess
subprocess.check_output(["df", "-h"])
or, if you want to do it pythonically and know the disk's path, you can use:
import psutil
hdd = psutil.disk_usage('/')
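Combining the two suggestions, a minimal health check could look like the sketch below; the log contents and paths here are made up for illustration:

```python
import subprocess
import tempfile

def scan_log(path):
    # Keep only lines mentioning ERROR or FATAL, like `awk '/ERROR|FATAL/'`.
    with open(path, "r") as f:
        return [line.rstrip("\n") for line in f
                if "ERROR" in line or "FATAL" in line]

def disk_usage():
    # The command must be a list of arguments (or a string with shell=True).
    return subprocess.check_output(["df", "-h"], text=True)

# Demo on a throwaway log file:
with tempfile.NamedTemporaryFile("w", suffix=".log", delete=False) as tmp:
    tmp.write("INFO all good\nERROR disk full\nFATAL crashed\n")

print(disk_usage().splitlines()[0])   # the df -h header line
print(scan_log(tmp.name))
```

Each "server check" then becomes one scan_log call per log file plus one disk_usage call, with plain Python print statements for the report.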

return value from python script to shell script

I am new to Python. I am creating a Python script that returns the string "hello world", and a shell script that calls the Python script.
I need to pass arguments from the shell to Python.
I need to print the value returned from Python in the shell script.
This is my code:
shellscript1.sh
#!/bin/bash
# script for testing
clear
echo "............script started............"
sleep 1
python python/pythonScript1.py
exit
pythonScript1.py
#!/usr/bin/python
import sys

print "Starting python script!"
try:
    sys.exit('helloWorld1')
except:
    sys.exit('helloWorld2')
You can't return a message as the exit code, only numbers. In bash it is accessible via $?. Also, you can use sys.argv to access command-line parameters:
import sys
if sys.argv[1] == 'hi':
    print 'Salaam'
sys.exit(0)
in the shell:
#!/bin/bash
# script for testing
clear
echo "............script started............"
sleep 1
result=`python python/pythonScript1.py "hi"`
if [ "$result" == "Salaam" ]; then
    echo "script returned correct response"
fi
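The same split in Python 3 (a sketch): printed text is what the shell's command substitution captures, while the function's return value becomes the numeric exit status:

```python
import sys

def main(argv):
    if len(argv) > 1 and argv[1] == "hi":
        print("Salaam")   # captured by result=`python script.py hi`
        return 0          # numeric status, readable in bash as $?
    return 1

# In a real script: sys.exit(main(sys.argv))
```

The shell then reads the string from $result and the status from $?.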
Pass the shell script's command-line arguments on to Python like this:
python script.py $1 $2 $3
Print the return code like this:
echo $?
You can also use exit() without sys; one less thing to import. Here's an example:
$ python
>>> exit(1)
$ echo $?
1
$ python
>>> exit(0)
$ echo $?
0

Is there a way to make python become interactive in the middle of a script?

I'd like to do something like:
do lots of stuff to prepare a good environment
become_interactive
# wait for Ctrl-D
automatically clean up
Is it possible with Python? If not, do you see another way of doing the same thing?
Use the -i flag when you start Python and set an atexit handler to run when cleaning up.
File script.py:
import atexit

def cleanup():
    print "Goodbye"

atexit.register(cleanup)
print "Hello"
and then you just start Python with the -i flag:
C:\temp>\python26\python -i script.py
Hello
>>> print "interactive"
interactive
>>> ^Z
Goodbye
The code module will allow you to start a Python REPL.
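For example (a sketch): code.interact opens a REPL over whatever namespace you pass in, and returns once the user presses Ctrl-D, after which cleanup code runs normally:

```python
import code

def become_interactive(env):
    # Blocks in a REPL sharing the `env` namespace; returns on Ctrl-D (EOF).
    code.interact(banner="Entering shell, press Ctrl-D to resume",
                  local=env)

# Usage sketch (prepare_environment and cleanup are hypothetical):
#   env = {"data": prepare_environment()}
#   become_interactive(env)
#   cleanup()   # runs after Ctrl-D
```

Anything you define inside the REPL stays in env, so the surrounding script can inspect it afterwards.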
With IPython v1.0, you can simply use
from IPython import embed
embed()
with more options shown in the docs.
To elaborate on IVA's answer: embedding a shell, incorporating code and IPython.
def prompt(vars=None, message="welcome to the shell"):
    # prompt_message = "Welcome! Useful: G is the graph, DB, C"
    prompt_message = message
    try:
        from IPython.Shell import IPShellEmbed
        ipshell = IPShellEmbed(argv=[''], banner=prompt_message, exit_msg="Goodbye")
        return ipshell
    except ImportError:
        if vars is None:
            vars = globals()
        import code
        import rlcompleter
        import readline
        readline.parse_and_bind("tab: complete")
        # calling this with globals ensures we can see the environment
        print prompt_message
        shell = code.InteractiveConsole(vars)
        return shell.interact

p = prompt()
p()
Not exactly the thing you want, but python -i will start an interactive prompt after executing the script.
-i : inspect interactively after running script, (also PYTHONINSPECT=x) and force prompts, even if stdin does not appear to be a terminal
$ python -i your-script.py
Python 2.5.4 (r254:67916, Jan 20 2010, 21:44:03)
...
>>>
You may call Python itself:
import subprocess

print "Hola"
subprocess.call("python", shell=True)
print "Adios"
