Python argv and cmd

I'm trying to make a Python program that can correct exams automatically; I have extra time and don't want to wait for my teacher to correct them manually...
Anyway, when I use Python argv like this:
import sys

def hello(a):
    print(a)

a = sys.argv[1:]
hello(a)
And I want to pass in a list, but I can no longer pass just one variable because of the way argv works, and I can't know in advance how long the list will be because not all tasks are the same. I'm using subprocess.check_output to capture the program's output after my checker runs it in a cmd window... Does anyone know a better way to check the programs without making the students replace their input() calls with sys.argv (i.e. a better way to feed input to a separate Python program when you run it), or can you tell me how to fix the argv issue?

You could use Popen.communicate instead of check_output:
echo.py:
print(input())
test.py:
from subprocess import Popen, PIPE
p = Popen(['python3', 'echo.py'], stdout=PIPE, stdin=PIPE, stderr=PIPE)
out, err = p.communicate(input="hello!".encode())
assert out.decode().strip() == "hello!"

I would suggest you look into OptionParser (from the optparse module; argparse is its modern replacement).
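If you go that route, here is a minimal sketch using argparse rather than the older OptionParser (the argument name is just an illustration) showing how a graded script could accept a variable-length list of values:

import argparse

parser = argparse.ArgumentParser()
# nargs='*' collects however many values the caller passes, which covers
# tasks whose input lists vary in length.
parser.add_argument('values', nargs='*', help='input values for the task')
args = parser.parse_args()
print(args.values)  # e.g. ['1', '2', '3'] for: python task.py 1 2 3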

Related

Popen with Context Managers

I have been trying to write a function that executes a command passed to it through a parameter, using Popen together with a context manager. Unfortunately, I am unable to get it to work. Can someone please help?
import os
import sys
import subprocess
import inspect

def run_process(cmd_args):
    with subprocess.Popen(cmd_args, stdout=subprocess.PIPE) as proc:
        log.write(proc.stdout.read())

run_process("print('Hello')")
The expected output is "Hello". Can someone please point out where I am going wrong?
What you have done is right if you are running a shell command through subprocess.
Inside the "with ..." context manager you are reading the output from the process, which comes back as bytes, so it has to be decoded before you can print it as text.
Try returning the value from the context manager and then decoding it in the calling function:
import os
import sys
import subprocess
import inspect

def run_process(cmd_args):
    # Added shell=True to the parameters below.
    # Optionally you can pass encoding='utf-8' to Popen instead
    # and then print(proc.stdout.read()) directly.
    with subprocess.Popen(cmd_args, stdout=subprocess.PIPE, shell=True) as proc:
        return proc.stdout.read()  # returns the output as bytes

print(run_process("echo Hello").decode('utf-8'))
I was having a similar issue while piping a process into another program; I did the decoding in the other program and, surprisingly, it worked. Hope it works for you as well.
def run_process(cmd_args):
    with subprocess.Popen(cmd_args, stdout=subprocess.PIPE) as p:
        output = p.stdout.read()
        return output
It worked for the same kind of problem.
Popen runs the command it receives the way you would run something in your terminal (for example, CMD on Windows or bash on Linux). So it does not execute Python code but shell code (bash on Linux, for instance). The Python binary has a -c option that does exactly what you need: it executes a Python command right away. So you have two options:
either use echo Hello (works on Windows and Linux; echo exists both in batch and in bash),
or use python -c "print('Hello')" instead of just the print call.
Without making too many changes to your existing script, I have edited your script with the below comments indicating what I did to get it to work. I hope this helps.
import os
import sys
import subprocess
import inspect

def run_process(cmd_args):
    # Added shell=True to the parameters below.
    with subprocess.Popen(cmd_args, stdout=subprocess.PIPE, shell=True) as proc:
        output = proc.stdout.read()    # Reads the output from the process as bytes.
        print(output.decode('utf-8'))  # Converts bytes to UTF-8 for readability.
        # Optionally you can pass encoding='utf-8' to Popen instead
        # and just print(proc.stdout.read()).

run_process("echo Hello")  # Use 'echo' in the command string to display the message in the prompt.
Note: Read the Security Considerations section before using shell=True.
https://docs.python.org/3/library/subprocess.html#security-considerations
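If you would rather avoid shell=True altogether, here is a minimal sketch of the same function without a shell; it runs a Python one-liner via sys.executable instead of a shell builtin like echo:

import subprocess
import sys

def run_process(cmd_args):
    # cmd_args is a list, so no shell is involved at all.
    with subprocess.Popen(cmd_args, stdout=subprocess.PIPE) as proc:
        return proc.stdout.read().decode('utf-8')

print(run_process([sys.executable, '-c', "print('Hello')"]))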

grab serial input line and move them to a shell script

I'm trying to grab a UART line and hand that string to a shell script:
#!/usr/bin/env python
import os
import serial

ser = serial.Serial('/dev/ttyAMA0', 4800)
while True:
    try:
        state = ser.readline()
        print(state)
    except:
        pass
So "state" should now be given to a shell script,
like: myscript.sh "This is the serial input..."
but how can I do this?
print(os.system('myscript.sh ').ser.readline())
doesn't work.
Just simple string concatenation passed to the os.system function.
import os
os.system("myscript.sh " + ser.readline())
If myscript.sh can continuously read additional input, you can have a much more efficient single long-lived pipeline:
from subprocess import Popen, PIPE

sink = Popen(['myscript.sh'], stdin=PIPE, stdout=PIPE)
while True:
    # communicate() can only be called once, so write to stdin directly
    # to keep feeding lines to the long-lived process.
    sink.stdin.write(ser.readline())
    sink.stdin.flush()
If you have to start a new myscript.sh for every input line (you'll really want to rethink your design, but) you can, of course:
import subprocess

while True:
    subprocess.check_call(['myscript.sh', ser.readline()])
Notice how in both cases we avoid a pesky shell.
There are different ways to combine the two strings (namely "myscript.sh " and ser.readline()) into the full command to be run with os.system. E.g. strings can be arguments of the str.format method:
os.system('myscript.sh {}'.format(ser.readline()))
You can also just concatenate the two strings:
os.system('myscript.sh '+ser.readline())
I am not sure what you want to achieve with the print statement. A better way to handle the call and the input/output of your code would be to switch from os.system to the subprocess module.
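For example, a rough sketch of that switch (assuming Python 3.5+ for subprocess.run, the device path and baud rate from the question, and that myscript.sh is executable):

import subprocess
import serial  # pyserial, as in the question

ser = serial.Serial('/dev/ttyAMA0', 4800)
while True:
    line = ser.readline().decode(errors='replace').strip()
    # Passing a list of arguments avoids the shell, so the serial data
    # cannot be interpreted as shell syntax.
    subprocess.run(['./myscript.sh', line], check=True)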

Passing arguments/strings into already running process - Python 2.7

I have two scripts in Python.
sub.py code:
import time
import subprocess as sub

while 1:
    value = input("Input some text or number")  # just an example; I don't care whether it's input for a number or raw_input for text, just input something
    proces = sub.Popen(['sudo', 'python', '/home/pi/second.py'], stdin=sub.PIPE)
    proces.stdin.write(value)
second.py code:
import sys

while 1:
    from_sub = sys.stdin()  # or sys.stdout(), I don't remember...
    list_args.append(from_sub)  # I don't know if the syntax is ok, but it doesn't matter
    for i in list_args:
        print i
First I execute sub.py and input something; then second.py executes and prints everything I inputted, again and again...
The thing is, I don't want to open a new process each time. There should be only one process. Is it possible?
Give me your hand :)
This problem can be solved by using Pexpect. Check my answer over here, it solves a similar problem:
https://stackoverflow.com/a/35864170/5134525.
Another way to do it is to use Popen from the subprocess module and set stdin and stdout as pipes. Modifying your code a tad bit can give you the desired results:
from subprocess import Popen, PIPE

# this part should be outside the loop
args = ['sudo', 'python', '/home/pi/second.py']
process = Popen(args, stdin=PIPE, stdout=PIPE)

while True:
    value = input("Input some text or number")
    process.stdin.write(value)
You need to open the process outside the loop for this to work. A similar issue is addressed here, in case you want to check it: Keep a subprocess alive and keep giving it commands? Python
This approach will lead to an error if the child process quits after the first iteration and closes all the pipes. You somehow need the child process to block and accept more input. You can do this either by using threads or by using the first option, i.e. Pexpect.
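For completeness, a rough sketch of the Pexpect route (Python 2.7 as in the question; it assumes second.py prints one line back per line of input, so adjust the read side to match what your script actually does):

import pexpect  # $ pip install pexpect

# Spawn second.py once, outside the loop, and keep it running.
child = pexpect.spawn('sudo python /home/pi/second.py')
while True:
    value = raw_input("Input some text or number")
    child.sendline(value)             # delivers the line to second.py's stdin
    print(child.readline().rstrip())  # read back one line of its output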

How to pass two values to stdin using subprocess in python

I am executing a script which prompts for 2 values, one after the other. I want to pass the values from the script itself, as I want to automate this.
Using the subprocess module, I can easily pass one value:
suppression_output = subprocess.Popen(cmd_suppression, shell=True,
                                      stdin=subprocess.PIPE,
                                      stdout=subprocess.PIPE).communicate('y')[0]
But passing the 2nd value does not seem to work. If I do something like this:
suppression_output = subprocess.Popen(cmd_suppression, shell=True,
                                      stdin=subprocess.PIPE,
                                      stdout=subprocess.PIPE).communicate('y/r/npassword')[0]
You should use \n for the newline instead of /r/n, i.e. 'y\npassword'.
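Applied to the snippet from the question, the corrected call would look roughly like this (Python 2 style to match the question; cmd_suppression is a placeholder for whatever command you are running):

import subprocess

cmd_suppression = "./your_script.sh"  # placeholder command

suppression_output = subprocess.Popen(cmd_suppression, shell=True,
                                      stdin=subprocess.PIPE,
                                      stdout=subprocess.PIPE).communicate('y\npassword')[0]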
As your question is not clear, I assumed you have a program which behaves somewhat like this Python script; let's call it script1.py:
import getpass
import sys

firstanswer = raw_input("Do you wish to continue?")
if firstanswer != "y":
    sys.exit(0)  # leave program
secondanswer = raw_input("Enter your secret password:\n")
#secondanswer = getpass.getpass("Enter your secret password:\n")
print "Password was entered successfully"
#do useful stuff here...
print "I should not print it out, but what the heck: " + secondanswer
It asks for confirmation ("y"), then wants you to enter a password. After that it does "something useful", finally prints the password and then exits
Now, to have the first program run by a second script, script2.py has to look somewhat like this:
import subprocess

cmd_suppression = "python ./script1.py"
process = subprocess.Popen(cmd_suppression, shell=True,
                           stdin=subprocess.PIPE, stdout=subprocess.PIPE)
response = process.communicate("y\npassword")
print response[0]
The output of script2.py:
$ python ./script2.py
Do you wish to continue?Enter your secret password:
Password was entered successfully
I should not print it out, but what the heck: password
A problem will most likely appear if the program uses a special method to get the password in a secure way, i.e. if it uses the line I commented out in script1.py:
secondanswer = getpass.getpass("Enter your secret password:\n")
getpass reads from the controlling terminal rather than stdin, which also tells you that it is probably not a good idea to pass a password via a script anyway.
Also keep in mind that calling subprocess.Popen with the shell=True option is generally a bad idea too. Use shell=False and provide the command as a list of arguments instead:
cmd_suppression = ["python", "./script1.py"]
process = subprocess.Popen(cmd_suppression, shell=False,
                           stdin=subprocess.PIPE, stdout=subprocess.PIPE)
This is mentioned a dozen times in the subprocess documentation.
Try os.linesep:
import os
from subprocess import Popen, PIPE
p = Popen(args, stdin=PIPE, stdout=PIPE)
output = p.communicate(os.linesep.join(['the first input', 'the 2nd']))[0]
rc = p.returncode
In Python 3.4+, you could use check_output():
import os
from subprocess import check_output
input_values = os.linesep.join(['the first input', 'the 2nd']).encode()
output = check_output(args, input=input_values)
Note: the child script might ask for a password directly from the terminal without using subprocess' stdin/stdout. In that case, you might need the pexpect or pty modules. See Q: Why not just use a pipe (popen())?
import os
from pexpect import run  # $ pip install pexpect

nl = os.linesep
output, rc = run(command, events={'nodes.*:': 'y' + nl, 'password:': 'test123' + nl},
                 withexitstatus=1)

How to run executable from python and pass it arguments asked for?

I can't figure out how to run an executable from Python and then pass it the commands it asks for, one by one. All the examples I found here pass arguments directly when calling the executable, but the executable I have needs "user input": it asks for values one by one.
Example:
subprocess.call(grid.exe)
>What grid you want create?: grid.grd
>Is it nice grid?: yes
>Is it really nice grid?: not really
>Grid created
You can use subprocess and the Popen.communicate method:
import subprocess

def create_grid(*commands):
    process = subprocess.Popen(
        ['grid.exe'],
        stdout=subprocess.PIPE,
        stdin=subprocess.PIPE,
        stderr=subprocess.PIPE)
    process.communicate('\n'.join(commands) + '\n')

if __name__ == '__main__':
    create_grid('grid.grd', 'yes', 'not really')
The "communicate" method essentially passes in the input as if you were typing it. Make sure to end each line of input with a newline character.
If you want the output from grid.exe to show up on the console, modify create_grid to look like the following:
def create_grid(*commands):
    process = subprocess.Popen(
        ['grid.exe'],
        stdin=subprocess.PIPE)
    process.communicate('\n'.join(commands) + '\n')
Caveat: I haven't fully tested my solutions, so can't confirm they work in every case.
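One additional note: on Python 3, communicate() expects bytes unless the pipes are opened in text mode, so a variant of the same function (an untested sketch, with the same 'grid.exe' assumption as above) could look like this:

import subprocess

def create_grid(*commands):
    # universal_newlines=True (text=True on 3.7+) lets communicate()
    # accept and return str instead of bytes.
    process = subprocess.Popen(
        ['grid.exe'],
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        universal_newlines=True)
    out, _ = process.communicate('\n'.join(commands) + '\n')
    return out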
