Launch Terminal via Python and run commands - python

I am writing an automation script, and it would be nice to be able to launch Terminal on my Mac from my Python script in order to start the Appium servers, instead of doing it manually.
The closest I've come is the following code, but it only launches Terminal and I am unable to send commands to it:
from subprocess import Popen, PIPE, STDOUT
Popen(['open', '-a', 'Terminal', '-n'], stdout=PIPE, stdin=PIPE, stderr=STDOUT)
I need to be able to launch two Terminal instances and run the following commands:
'appium'
'appium -a 0.0.0.0 -p 4724'

You can execute shell commands in Python like this:
import os
os.system('appium &')
This will start the Appium server in the background.
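If you want the server's lifetime under the script's control, subprocess.Popen is a non-blocking alternative to os.system. A minimal sketch; since the Appium CLI may not be installed, a stand-in command is used here:

```python
import subprocess

# Popen returns immediately, so the script keeps running while the
# server does; replace the stand-in argv with ['appium'] on a machine
# where the Appium CLI is installed.
server = subprocess.Popen(['echo', 'appium stand-in'],
                          stdout=subprocess.DEVNULL)
# ... do other work, then wait for (or terminate) the server:
server.wait()
print(server.returncode)  # 0
```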

You have to use communicate to send a command to your process's stdin (note that in Python 3 it expects bytes):
from subprocess import Popen, PIPE, STDOUT
p1 = Popen(['open', '-a', 'Terminal', '-n'], stdout=PIPE, stdin=PIPE, stderr=STDOUT)
p2 = Popen(['open', '-a', 'Terminal', '-n'], stdout=PIPE, stdin=PIPE, stderr=STDOUT)
p1.communicate(b'appium')
p2.communicate(b'appium -a 0.0.0.0 -p 4724')
Be aware, though, that this writes to the stdin of the open command itself, which exits immediately and does not forward anything to the Terminal window it opens, so the commands never reach a shell.
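For completeness: an approach that does reach a new Terminal window on macOS is AppleScript's do script command, driven through osascript. A minimal sketch (macOS only; it assumes appium is on the PATH of the shell Terminal starts, so the actual launch is shown commented out):

```python
import subprocess

def run_in_new_terminal(command):
    """Build the osascript argv that opens a new macOS Terminal
    window and runs `command` in it."""
    script = f'tell application "Terminal" to do script "{command}"'
    return ['osascript', '-e', script]

# On a Mac, launch both Appium servers like this:
# subprocess.run(run_in_new_terminal('appium'))
# subprocess.run(run_in_new_terminal('appium -a 0.0.0.0 -p 4724'))
```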

Related

Send Multiple Terminal Commands in Gnome Terminals With Subprocess

So I am currently trying to run two different gnome-terminal windows in Ubuntu that I can send individual commands to after they are initially open.
def ssh_command(cmd):
    ssh_terminal_1 = subprocess.Popen(['gnome-terminal', '--', 'bash', '-c', cmd], stderr=subprocess.STDOUT, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
    ssh_terminal_2 = subprocess.Popen(['gnome-terminal', '--', 'bash', '-c', cmd], stderr=subprocess.STDOUT, stdout=subprocess.PIPE, stdin=subprocess.PIPE)
    # Activate the conda environment for our multilateration server
    spyder_activate('conda activate flyhound')
    time.sleep(10)
    ssh_terminal_1.stdin.flush()
    ssh_terminal_2.stdin.flush()
    ssh_terminal_1.stdin.write(b'cd srsRAN22.04/build\n')
    ssh_terminal_1.stdin.flush()
    ssh_terminal_2.stdin.write(b'cd srsRAN22.04/build\n')
    ssh_terminal_2.stdin.flush()
    ssh_terminal_1.stdin.write(b'sudo ./srsepc/src/srsepc ../srsepc/epc.conf.example --hss.db_file=../srsepc/user_db_unknown.csv.example\n')
    ssh_terminal_1.stdin.flush()
    ssh_terminal_2.stdin.write(b'bladeRF-cli -l /home/administrator/Downloads/hostedxA5-latest.rbf\n')
    ssh_terminal_2.stdin.flush()
    ssh_terminal_2.stdin.write(b'bladeRF-cli -f /home/administrator/Downloads/bladeRF_fw_v2.4.0.img\n')
    ssh_terminal_2.stdin.flush()
    ssh_terminal_2.stdin.write(b'sudo ./srsenb/src/srsenb ../srsenb/enb.conf.example --enb_files.sib_config=../srsenb/sib.conf.example --enb.n_prb=50 --enb_files.rr_config=../srsenb/rr.conf.example\n')
However, when I start the original subprocess command, the terminals open up fine with the command given during the function call, but all of the following commands don't work and I get a broken pipe error (errno 32). While I try to run these commands I also need to keep the previous terminal open, which looks like this below:
def access_command(cmd):
    while True:
        process = subprocess.Popen(shlex.split(cmd), stdout=subprocess.PIPE)
        while True:
            output = process.stdout.readline()
            if output == b'' and process.poll() is not None:
                break
            if output:
                print(output.strip())
                if b"f0:9e:4a:5f:a4:5b" in output and b"handshake" in output:
                    ssh_command("sshpass -p earth ssh -o StrictHostKeyChecking=no administrator@ipaddress; clear; screen")
I am really not sure how I can send multiple commands to the ssh terminals after they ssh into that IP address. I am very new to subprocess and to sending commands to terminals via Python, so any help would be amazing!
As I explained in the comments, your pipe goes to gnome-terminal, not to ssh or bash. gnome-terminal is not listening to stdin; it only listens to the user at the console. Here is what you do:
1. Make a FIFO (named pipe) -- os.mkfifo -- for each terminal; give it a name that won't collide with any other file (such as putting your process ID in it).
2. Issue the command gnome-terminal -- bash -c 'ssh <options> < <fifo name>' for each terminal. Do not make this a Popen call; use os.system or something like that.
3. Do your spydy magic (anaconda).
4. Open the FIFO as a file.
5. Write your commands to the open file; they will be executed by the bash process in the ssh connection. You will probably have to flush, unless there is a way to open it in line-buffered mode.
6. When you want to close the terminal, close the file.
What this accomplishes is that we move the pipe from gnome-terminal to ssh and hence across the connection to bash. We feed it on one end and it comes out and gets digested by the shell.
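The steps above can be sketched as follows. The gnome-terminal/ssh lines are shown commented out because they need a desktop session, and user@host is a placeholder:

```python
import os

def make_terminal_fifo():
    """Step 1: create a uniquely named FIFO for one terminal,
    using the process ID to avoid name collisions."""
    path = f'/tmp/term_fifo_{os.getpid()}'
    if os.path.exists(path):
        os.remove(path)
    os.mkfifo(path)
    return path

fifo = make_terminal_fifo()

# Step 2 (needs a desktop session, so shown commented out):
# os.system(f"gnome-terminal -- bash -c 'ssh user@host < {fifo}' &")
#
# Steps 4 and 5: open the FIFO as a file and feed commands to the
# bash process inside the ssh connection.
# with open(fifo, 'w') as f:
#     f.write('cd srsRAN22.04/build\n')
#     f.flush()   # the remote shell executes each flushed line
# Closing the file (step 6) ends the remote shell's stdin.
```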

wget missing url with shlex and subprocess

I'm struggling to understand why this fails with a wget: missing URL error:
import shlex
import subprocess
copy_command = "wget -O - 'http://example.com/somepath/somefile.txt?someparam=test' | sshpass -p pass ssh user@localhost -p 2222 \"cat - > /upload/somefile.txt\""
cmd = shlex.split(copy_command, posix=False)
with subprocess.Popen(
    cmd, stdin=subprocess.PIPE, stdout=subprocess.PIPE, shell=True
) as proc:
    output, error = proc.communicate()
What am I missing here? If I just give subprocess the copy_command string directly, then it works without issues.
To set up a pipeline, the parent process has to spawn all the programs involved and then connect (pipe) the stdio of one to another.
The Python documentation for subprocess explains how to do this.
It works with a string argument and shell=True because then Popen just hands the command line off to a subshell, and that shell handles all those details.
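A runnable sketch of that pipeline pattern, with echo and tr standing in for wget and sshpass/ssh (which are not assumed to be installed here); the key is wiring one process's stdout to the next one's stdin:

```python
import subprocess

# Producer: stands in for `wget -O - <url>`.
p1 = subprocess.Popen(['echo', 'some file contents'],
                      stdout=subprocess.PIPE)
# Consumer: stands in for `sshpass ... ssh ... "cat - > file"`;
# its stdin is wired to the producer's stdout.
p2 = subprocess.Popen(['tr', 'a-z', 'A-Z'],
                      stdin=p1.stdout, stdout=subprocess.PIPE)
p1.stdout.close()  # lets p1 receive SIGPIPE if p2 exits first
out, _ = p2.communicate()
print(out)  # b'SOME FILE CONTENTS\n'
```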

How to get stdout and stderr from a tmux session?

I am writing a sample Python program on a Linux system. I am using tmux to create a session and execute another script within the tmux session. I would like to get the stdout and stderr out of the tmux session into the parent script, but that somehow does not work.
Code snippet:
cmd = "tmux new-session -d -s 'test' '/my_dir/fibonacci.py __name:=fibonacci_0'"
proc = Popen(cmd, shell=True, stdout=PIPE, stderr=PIPE)
(stdout, stderr) = proc.communicate()
print(stderr)
I have come across answers suggesting show-buffer and pipe-pane, but those did not help. Maybe I need to modify the tmux command itself.
Thank you for your support. After digging a bit, I came up with a workaround; I am adding it here for anyone with similar needs.
What I did is create a named pipe, redirect the output of the tmux session to the named pipe, and finally read from it.
# this attach if a session exists or creates one, then exits from the session
call("tmux new-session -A -s test \; detach", shell=True)
# to avoid conflict, remove existing named pipe and then create named pipe
call("rm -f /tmp/mypipe && mkfifo /tmp/mypipe && tmux pipe-pane -t test -o 'cat > /tmp/mypipe'", shell=True)
# feed the pipe to the stdout and stderr
# feed the pipe to stdout and stderr
proc = Popen(['cat', '/tmp/mypipe'], stdout=PIPE, stderr=PIPE)
# finally execute the command in the tmux session
Popen(['tmux', 'send-keys', '-t', 'test', '/my_dir/fibonacci.py __name:=fibonacci_0', 'C-m'])
(stdout, stderr) = proc.communicate()
print(stderr)
Hope this is helpful.
TLDR: tmux is like a sandboxed terminal; you can't get into the session and reach its stdout or stderr.
tmux creates a detached process. The stdout and stderr of tmux itself are the text messages that tmux provides, not the output of the commands you run in a tmux session.
So you can't get your commands' output from a detached process.
In order to get the stdout and stderr, you have to change how fibonacci.py dumps text. The Python logging framework can help you in this situation: you can write all output to stdout.txt and read the content of that file.

subprocess popen Python

I am executing a shell script which starts a process with the background option &. The shell script is called from a Python script, which hangs.
Shell script:
test -f filename -d &
Python file:
cmd = ["shellscript" "restart"]
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                        stderr=subprocess.PIPE, stdin=subprocess.PIPE, **kwargs)
pid = proc.pid
out, err = proc.communicate()
returncode = proc.poll()
The Python file hangs and never returns out of the Python process. Also, the Python process is an automated one.
The call to proc.communicate() will block until the pipes used for stderr and stdout are closed. If your shell script spawns a child process which inherits those pipes, then it will exit only after that process also has closed its writing ends of the pipes or exited.
To solve this you can either:
- redirect the output of the started subprocess to /dev/null or a logfile in your shell script, e.g.:
  subprocess_to_start >/dev/null 2>&1 &
- use subprocess.DEVNULL or an open file object for stderr and stdout in your Python script, and drop the communicate() call if you don't need the output of "shellscript" in Python.
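A runnable sketch of the second option, with a tiny sh command standing in for the shell script (it backgrounds a child the same way). Because nothing is piped, the background child holds no pipe open, and wait() returns as soon as the script itself exits:

```python
import subprocess
import time

start = time.time()
# Stand-in for the shell script: it backgrounds a long-lived child and exits.
proc = subprocess.Popen(['sh', '-c', 'sleep 2 & exit 0'],
                        stdout=subprocess.DEVNULL,
                        stderr=subprocess.DEVNULL)
proc.wait()
elapsed = time.time() - start
# With stdout=PIPE and communicate() instead, this would block the full
# 2 seconds, because the backgrounded sleep inherits the pipe's write end.
print(proc.returncode, elapsed < 1)  # 0 True
```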
A comma is missing in your cmd list:
cmd =["shellscript", "restart"]

Snort command run through python script

I am trying to read the snort alerts in the console for one of my projects.
What I did is write the following code to run snort listening on my interface:
import subprocess
command = 'snort -l /var/log/snort -c /etc/snort/snort.conf -A console -i s1-eth1'
process = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE)
process.wait()
print process.returncode
I run the above script through another Python script, as below:
#!/usr/bin/env python
import os
import sys
from subprocess import Popen, PIPE, STDOUT
script_path = os.path.join(os.getcwd() + '/', 'comm.py')
p = Popen([sys.executable, '-u', script_path],
          stdout=PIPE, stderr=STDOUT, bufsize=1)
with p.stdout:
    for line in iter(p.stdout.readline, b''):
        print line,
p.wait()
The output of the script takes me to where snort is listening, but when I run an experiment that triggers a snort rule to alert, the alert is not printed in the terminal where I ran the Python script.
When I run the snort command in a normal terminal, the alert messages print fine.
Does anyone know a workaround for this?
Thanks in advance.
