Running multiple processes within a Python script - python

Within my Python script I need to launch several processes:
1) I need to run another Python script (a Flask app, with the command python app.py)
2) then I need to launch the command ngrok http 5000 and, from this command's output, get the URL on which ngrok is forwarding.
I have tried to use the subprocess module, but when it executes:
subprocess.Popen("python app/app.py", shell=True)
it launches the interactive shell and blocks the execution of my script.
What is the correct way to achieve this?

Instead of the Popen function, you should use the call function:
subprocess.call('python app.py', shell=True)
Also see the documentation: subprocess docs
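If both processes need to keep running in the background while the main script continues, a rough non-blocking sketch with Popen could look like the following; the ngrok local inspection API on port 4040 and the two-second startup delay are assumptions, not something taken from the question:
import json
import time
import subprocess
import urllib2  # on Python 3 use urllib.request instead

# start the Flask app in the background (list form avoids an extra shell)
flask_proc = subprocess.Popen(["python", "app/app.py"])

# start ngrok in the background
ngrok_proc = subprocess.Popen(["ngrok", "http", "5000"])

# give ngrok a moment to establish the tunnel, then ask its local
# inspection API for the public forwarding URL (default port 4040 assumed)
time.sleep(2)
tunnels = json.load(urllib2.urlopen("http://127.0.0.1:4040/api/tunnels"))
print(tunnels["tunnels"][0]["public_url"])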

Related

How to pass commands to a program running in a separate shell using fabric?

I have a program which spawns a shell of its own; I further need to be able to pass commands to that shell.
For example -
local(command_which_spawns_new_shell)
Now, in this new shell, I would like to run commands like 'ls' etc. and finally exit from the program or shell.
When I execute local on this command currently, the program halts after opening the shell. How can I capture the shell handle and pass further commands to it before I exit from the shell?
An example would be
local("ftp hostname")
and then execute commands like
lcd
cd
ls etc...
FTP is only an example here, the command I will be running spawns a new shell.
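One way to keep a handle on the spawned shell and feed it commands is pexpect instead of fabric's local; a minimal sketch, with the ftp prompt pattern and the hostname as placeholders/assumptions:
import pexpect

# spawn the command that opens its own shell (ftp used as in the example above)
child = pexpect.spawn('ftp hostname')
child.expect('ftp> ')        # wait for the new shell's prompt (pattern is an assumption)

child.sendline('ls')         # pass a command into that shell
child.expect('ftp> ')
print(child.before)          # whatever the shell printed between the two prompts

child.sendline('bye')        # leave the shell
child.close()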

Python on Pi 3 to open and control a console application

I have a very simple problem but can't seem to get a simple solution anywhere.
I have an application running on my Pi which I start by typing into terminal and passing some arguments. For example:
sudo $HOME/Projects/myExampleApp MD_DWNC2378
This results in the console application starting and, as expected, it can take keyboard input.
Now, what I want to do is repeat the process described so far from a Python application. My Python application should be able to open myExampleApp in a terminal, get a reference to the console window and then direct any commands from my Python application as keyboard presses to myExampleApp.
On a windows platform, pywinauto library does the job.
What is the best option for doing what I described on linux running on my Pi 3?
Any suggestions would be very helpful.
Have a look at https://docs.python.org/2/library/cmd.html for receiving commands. Your application can be run with subprocess as suggested by another user:
import cmd
import sys
import subprocess

class Controller(cmd.Cmd):
    # long-running child process standing in for your console application;
    # it simply echoes back every line it receives on stdin
    cmd = 'while true; do cat /dev/stdin; sleep 1; done'
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stdin=subprocess.PIPE, shell=True)

    def default(self, cmd):
        # forward every command typed at the CLI to the child's stdin
        self.p.stdin.write(cmd + "\n")
        # read one line of the child's response and show it
        a = self.p.stdout.readline()
        sys.stdout.write("subprocess returned {}".format(a))
        sys.stdout.flush()

if __name__ == '__main__':
    try:
        Controller().cmdloop()
    except:
        print('Exit')
Running this script will present you with a CLI interface. Any command sent in will be forwarded to the stdin of your application (simulated by the shell script shown).
If you don't need to process the returning stdout you can skip the self.p.stdout.readline() call. If you need the returning data, it may be beneficial to change the readline() call to read(); this will read until the application sends an EOF.
The sudo requirement of your application can probably be overcome by running the python script with sudo. I'm no security expert, but be aware of the security risks of running the script as sudo, as well as the shell=True parameter of the Popen call.
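To point the same Controller at the real application instead of the simulated stdin loop, only the Popen line needs to change; a sketch, with the path expanded from the question and sudo assumed to come from running the whole script as root:
import os
import subprocess

# launch the actual console application instead of the echo loop
app_path = os.path.expandvars("$HOME/Projects/myExampleApp")
p = subprocess.Popen([app_path, "MD_DWNC2378"],
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE)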

Start/Stop Apache via Python script

On a RHEL 6.7 server running Python 2.6.6 I am trying to start and stop the Apache webserver as an unprivileged user from a Python script.
The command I am using is "sudo service httpd <parameter>", where <parameter> is "start", "stop" or "status".
p = subprocess.Popen("sudo service httpd start", stdout=subprocess.PIPE, stderr=subprocess.PIPE)
This line alone does not work. Adding a
p.communicate()
lets the commands work as desired. Can somebody tell me why?
UPDATE:
The sudoers file contains a line that allows my user to run these commands without a password.
Neither version of the code works (with or without .communicate()).
You should use instead (assuming passwordless sudo for these commands):
import subprocess
subprocess.check_call("sudo service httpd start".split())
The reasons:
subprocess functions do not run the shell by default and therefore the whole string is interpreted as the name of the command; you should pass a list to supply multiple command-line arguments on POSIX
Popen() starts the command and returns immediately without waiting for it to finish i.e., it may happen before httpd is started. Assuming Popen() call is fixed, .communicate() waits for the child process to terminate and therefore it returns after httpd is started (whether it was successful or not).
There are different reasons for that to happen:
1. The user is in sudoers and configured to not require a password.
2. sudo remembers your password for some time (depending on your system configuration), see: https://unix.stackexchange.com/questions/37299/how-does-sudo-remember-you-already-entered-roots-password
Does it work in a brand new terminal session?
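If the script also needs the text printed by "status", the same waiting behaviour can be made explicit with communicate(), which also works on Python 2.6; a sketch assuming the same passwordless sudo configuration:
import subprocess

# check_output is not available on Python 2.6, so use Popen + communicate()
p = subprocess.Popen("sudo service httpd status".split(), stdout=subprocess.PIPE)
out, _ = p.communicate()   # waits for the command to finish
print(out)                 # the status text
print(p.returncode)        # 0 if httpd reported success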

Launch Application and execute script lines from python

I want to write a Python script which can launch an application. The application being launched can also read Python commands, which I am passing through another script.
The problem I am facing is that I need to use two Python scripts: one to launch an application and a second one to run commands in the launched application.
Can I achieve this using a single script? How do I tell Python to run the next few lines of the script in the launched application?
In general, you use subprocess.Popen to launch a command from Python. Popen does not wait for the command to finish, so it lets you keep running Python statements. You also have access to the running subprocess's stdin and stdout, so you can interact with the running application.
If I understand what you're asking, it'd look something like this:
import subprocess
app = subprocess.Popen(["/path/to/app", "-and", "args"], stdin=subprocess.PIPE)
app.stdin.write("python command\n")
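If the launched application writes responses back, its stdout can be read the same way; a rough sketch extending the snippet above (the flush, the readline and the final wait are additions, not part of the original answer):
import subprocess

app = subprocess.Popen(["/path/to/app", "-and", "args"],
                       stdin=subprocess.PIPE, stdout=subprocess.PIPE)

app.stdin.write("python command\n")   # on Python 3, write bytes or pass universal_newlines=True
app.stdin.flush()                     # make sure the line actually reaches the app
reply = app.stdout.readline()         # read one line of the app's response

app.stdin.close()                     # signal that no more input is coming
app.wait()                            # wait for the application to exit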

How to call a .ksh file as part of a Unix command through SSH in Python

I would like to achieve the following things:
A given file contains a job list which I need to execute one by one on a remote server using SSH APIs, storing the results.
When I try to call the following command directly on the remote server using PuTTY it executes successfully, but when I try to execute it through Python SSH programming it says it can't find autosys.ksh.
autosys.ksh autorep -J JOB_NAME
Any ideas? Please help. Thanks in advance.
Fabric is a good bet. From the home page,
Fabric is a Python (2.5 or higher) library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks.
A quick example,
>>> from fabric.api import run, env, cd, settings, hide, show
>>> env.host_string='xxx.xxx.com'
>>> env.user='user'
>>> env.password='password'
>>> run('ls -lart')
After reading your comment on the first answer: you might want to create a bash script with the bash path as the interpreter line, followed by the autosys commands.
This will create a bash shell and run the commands from the script in that shell.
Again, if you are using autosys commands in the shell, you should set the autosys environment up for the user before running any autosys commands.
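A sketch of what that can look like in the same Fabric session as above; the profile file that sets up the autosys environment is an assumption and will differ per server:
>>> from fabric.api import run
>>> # source the user's environment so autosys.ksh is on the PATH, then run the job query
>>> run('source ~/.profile && autosys.ksh autorep -J JOB_NAME')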
