A Python Wrapper or Handler for a Minecraft Server

I am using Windows and am looking for a handler or wrapper using Python for a Minecraft server so that I can automatically enter commands without user input. I have searched through many questions on the website and only found partial answers (in my case at least). I believe I will need to use the subprocess module, but I cannot decide which function to use; at the moment I am experimenting with Popen. I have found an answer which I modified for my case:
server = Popen("java -jar minecraft_server.jar nogui", stdin=PIPE, stdout=PIPE, stderr=STDOUT)
while True:
    print(server.stdout.readline())
    server.stdout.flush()
    command = input("> ")
    if command:
        server.stdin.write(bytes(command + "\r\n", "ascii"))
        server.stdin.flush()
This does work in a way, but it only prints a line each time you enter a command, which is not usable, and all my attempts to change this leave the program stuck reading, unable to execute anything else. This is not a duplicate question because none of the answers to similar questions could help me enough.

As you already know, your server.stdout.readline() and input("> ") are blocking your code execution.
You need to make your code non-blocking: instead of waiting for a line to actually arrive, check whether there is anything to read, skip the read if there isn't, and continue doing other things.
On Linux systems you might be able to use the select module, but on Windows it only works on sockets.
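For example, on a Unix system the loop from the question could be written roughly like this (a minimal, untested sketch of the select approach, reusing the java command from the question):
import select, subprocess, sys

server = subprocess.Popen(
    ["java", "-jar", "minecraft_server.jar", "nogui"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

while server.poll() is None:
    # wait up to 0.1 s for either the server's stdout or the terminal to have data
    readable, _, _ = select.select([server.stdout, sys.stdin], [], [], 0.1)
    if server.stdout in readable:
        sys.stdout.buffer.write(server.stdout.readline())
        sys.stdout.buffer.flush()
    if sys.stdin in readable:
        command = sys.stdin.readline()
        if command:
            server.stdin.write(command.encode("ascii"))
            server.stdin.flush()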
I was able to make it work on Windows by using threads and queues. (note: it's Python 2 code)
import subprocess, sys
from Queue import Queue, Empty
from threading import Thread

def process_line(line):
    if line == "stop\n":  # lines have trailing new line characters
        print "SERVER SHUTDOWN PREVENTED"
        return None
    elif line == "quit\n":
        return "stop\n"
    elif line == "l\n":
        return "list\n"
    return line

s = subprocess.Popen("java -jar minecraft_server.jar nogui", stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

def read_lines(stream, queue):
    while True:
        queue.put(stream.readline())

# terminal reading thread
q = Queue()
t = Thread(target=read_lines, args=(sys.stdin, q))
t.daemon = True
t.start()

# server reading thread
qs = Queue()
ts = Thread(target=read_lines, args=(s.stdout, qs))
ts.daemon = True
ts.start()

while s.poll() is None:  # loop while the server process is running
    # get a user entered line and send it to the server
    try:
        line = q.get_nowait()
    except Empty:
        pass
    else:
        line = process_line(line)  # do something with the user entered line
        if line is not None:
            s.stdin.write(line)
            s.stdin.flush()
    # just pass-through data from the server to the terminal output
    try:
        line = qs.get_nowait()
    except Empty:
        pass
    else:
        sys.stdout.write(line)
        sys.stdout.flush()

Related

Display process output incrementally using Python subprocess

I'm trying to run "docker-compose pull" from inside a Python automation script and to incrementally display the same output that Docker command would print if it was run directly from the shell. This command prints a line for each Docker image found in the system, incrementally updates each line with the Docker image's download progress (a percentage) and replaces this percentage with a "done" when the download has completed. I first tried getting the command output with subprocess.poll() and (blocking) readline() calls:
import shlex
import subprocess

def run(command, shell=False):
    p = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=shell)
    while True:
        # print one output line
        output_line = p.stdout.readline().decode('utf8')
        error_output_line = p.stderr.readline().decode('utf8')
        if output_line:
            print(output_line.strip())
        if error_output_line:
            print(error_output_line.strip())
        # check if process finished
        return_code = p.poll()
        if return_code is not None and output_line == '' and error_output_line == '':
            break
    if return_code > 0:
        print("%s failed, error code %d" % (command, return_code))

run("docker-compose pull")
The code gets stuck in the first (blocking) readline() call. Then I tried to do the same without blocking:
import select
import shlex
import subprocess
import sys
import time

def run(command, shell=False):
    p = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=shell)
    io_poller = select.poll()
    io_poller.register(p.stdout.fileno(), select.POLLIN)
    io_poller.register(p.stderr.fileno(), select.POLLIN)
    while True:
        # poll IO for output
        io_events_list = []
        while not io_events_list:
            time.sleep(1)
            io_events_list = io_poller.poll(0)
        # print new output
        for event in io_events_list:
            # must be tested because non-registered events (eg POLLHUP) can also be returned
            if event[1] & select.POLLIN:
                if event[0] == p.stdout.fileno():
                    output_str = p.stdout.read(1).decode('utf8')
                    print(output_str, end="")
                if event[0] == p.stderr.fileno():
                    error_output_str = p.stderr.read(1).decode('utf8')
                    print(error_output_str, end="")
        # check if process finished
        # when the subprocess finishes, io_poller.poll(0) returns a list with 2 select.POLLHUP events
        # (one for stdout, one for stderr) and does not enter the inner loop
        return_code = p.poll()
        if return_code is not None:
            break
    if return_code > 0:
        print("%s failed, error code %d" % (command, return_code))

run("docker-compose pull")
This works, but only the final lines (with "done" at the end) are printed to the screen, once all Docker image downloads have completed.
Both methods work fine with a command with simpler output, such as "ls". Maybe the problem is related to how this Docker command prints incrementally to the screen, overwriting already written lines? Is there a safe way to incrementally show the exact output of a command in the command line when running it via a Python script?
EDIT: 2nd code block was corrected
Always open STDIN as a pipe, and if you are not using it, close it immediately.
p.stdout.read() will block until the pipe is closed, so your polling code does nothing useful here. It needs modifications.
I suggest not using shell=True.
Instead of *.readline(), try with *.read(1) and wait for "\n"
Of course you can do what you want in Python; the question is how. Trouble starts when a child process has different ideas about what its output should look like. For example, the process might explicitly want a terminal at the other end, not your process, or any number of similar quirks. Buffering may also cause problems; you can try starting Python in unbuffered mode to check (/usr/bin/python -u).
If nothing works, then use the pexpect automation library instead of subprocess.
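If you go the pexpect route, a rough sketch (mine, not from this answer) of streaming a command's output as it arrives could look like the following; pexpect gives the child a pseudo-terminal, which also sidesteps the "process wants a terminal" problem mentioned above (Unix only; on Windows pexpect.popen_spawn.PopenSpawn would be needed instead):
import sys
import pexpect

child = pexpect.spawn('docker-compose pull', encoding='utf-8')
child.logfile_read = sys.stdout      # echo everything the child prints as it arrives
child.expect(pexpect.EOF)            # block until the command finishes
child.close()
print("exit status:", child.exitstatus)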
I have found a solution, based on the first code block of my question:
def run(command, shell=False):
    p = subprocess.Popen(shlex.split(command), stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=shell)
    while True:
        # read one char at a time
        output_line = p.stderr.read(1).decode("utf8")
        if output_line != "":
            print(output_line, end="")
        else:
            # check if process finished
            return_code = p.poll()
            if return_code is not None:
                if return_code > 0:
                    raise Exception("Command %s failed" % command)
                break
    return return_code
Notice that docker-compose uses stderr to print its progress instead of stdout. @Dalen has explained that some applications do this when they want their results to be pipeable somewhere, for instance to a file, but still want to be able to show their progress.

Python reading from stdin hangs when interacting with ruby code

I was trying to get Python and Ruby code to talk to each other, and I found the methods from this link (http://www.decalage.info/python/ruby_bridge).
I tried the last method, using stdin and stdout to pass information. I made some changes to the original code so that it works with Python 3.4, but I am not sure whether my changes messed things up. My Python program always hangs when reading from stdin, and nothing is printed. I am not familiar with stdin and stdout, so I am just wondering why this does not work.
Here is my Ruby code:
$stdin.set_encoding("utf-8:utf-8")
$stdout.set_encoding("utf-8:utf-8")

while cmd = $stdin.gets
  cmd.chop!
  if cmd == "exit"
    break
  else
    puts eval(cmd)
    puts "[end]"
    $stdout.flush
  end
end
I am not sure if it is possible to set the internal and external encoding like this. And here is my Python code:
from subprocess import Popen, PIPE, STDOUT

print("Launch slave process...")
slave = Popen(['ruby', 'slave.rb'], stdin=PIPE, stdout=PIPE, stderr=STDOUT)
while True:
    line = input("Enter expression or exit:")
    slave.stdin.write((line + '\n').encode('UTF-8'))
    result = []
    while True:
        if slave.poll() is not None:
            print("Slave has terminated.")
            exit()
        line = slave.stdout.readline().decode('UTF-8').rstrip()
        if line == "[end]":
            break
        result.append(line)
    print("result:")
    print("\n".join(result))
When I run the Python script, input "3*4", and press enter, nothing shows up until I break the process manually, which ends with exit code 1 and a KeyboardInterrupt exception.
I have been struggling with this problem for quite a long time and I don't know what is going wrong...
Thanks in advance for any potential help!
The difference is that bufsize=-1 by default in Python 3.4, and therefore slave.stdin.write() does not send the line to the ruby subprocess immediately. A quick fix is to add a slave.stdin.flush() call.
#!/usr/bin/env python3
from subprocess import Popen, PIPE

log = print
log("Launch slave process...")
with Popen(['ruby', 'slave.rb'], stdin=PIPE, stdout=PIPE,
           bufsize=1, universal_newlines=True) as ruby:
    while True:
        line = input("Enter expression or exit:")
        # send request
        print(line, file=ruby.stdin, flush=True)
        # read reply
        result = []
        for line in ruby.stdout:
            line = line.rstrip('\n')
            if line == "[end]":
                break
            result.append(line)
        else:  # no break, EOF
            log("Slave has terminated.")
            break
        log("result:" + "\n".join(result))
It uses universal_newlines=True to enable text mode; bytes are decoded using locale.getpreferredencoding(False). If you want to force utf-8 encoding regardless of locale settings, drop universal_newlines and wrap the pipes in io.TextIOWrapper(encoding="utf-8") (the code example also shows proper exception handling for the pipes).
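A minimal sketch of that io.TextIOWrapper variant (my addition, assuming the same slave.rb as above) might look like this:
import io
from subprocess import Popen, PIPE

with Popen(['ruby', 'slave.rb'], stdin=PIPE, stdout=PIPE) as ruby:
    # wrap the binary pipes in text streams that always use utf-8
    stdin = io.TextIOWrapper(ruby.stdin, encoding='utf-8', line_buffering=True)
    stdout = io.TextIOWrapper(ruby.stdout, encoding='utf-8')
    print("3*4", file=stdin)            # flushed on newline thanks to line_buffering
    for line in stdout:
        line = line.rstrip('\n')
        if line == "[end]":
            break
        print("result:", line)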

stdin for Interactive command from python

I'm trying to integrate an interactive Lua shell into my Python GUI with an approach similar to the one described here: Running an interactive command from within python. The target platform for now is Windows. I want to be able to feed the Lua interpreter line by line.
import subprocess
import os
from queue import Queue
from queue import Empty
from threading import Thread
import time

def enqueue_output(out, queue):
    for line in iter(out.readline, b''):
        queue.put(line)
    out.close()

lua = '''\
-- comment
print("A")
test = 0
test2 = 1
os.exit()'''

command = os.path.join('lua', 'bin', 'lua.exe')
process = (subprocess.Popen(command + ' -i', shell=True,
                            stdin=subprocess.PIPE, stderr=subprocess.PIPE,
                            stdout=subprocess.PIPE, cwd=os.getcwd(), bufsize=1,
                            universal_newlines=True))

outQueue = Queue()
errQueue = Queue()

outThread = Thread(target=enqueue_output, args=(process.stdout, outQueue))
errThread = Thread(target=enqueue_output, args=(process.stderr, errQueue))

outThread.daemon = True
errThread.daemon = True

outThread.start()
errThread.start()

script = lua.split('\n')

time.sleep(.2)
for line in script:
    while True:
        try:
            rep = outQueue.get(timeout=.2)
        except Empty:
            break
        else:  # got line
            print(rep)
    process.stdin.write(line)
The only output I receive is the very first line of the lua.exe shell. It seems that the writing to stdin doesn't actually take place. Is there anything I am missing?
Running an external Lua file with the -i switch actually works and yields the expected output, which makes me think the issue is connected to stdin.
I experimented a bit in Python interactive mode, trying something similar to the solution that uses a file for stdout here: Interactive input/output using python. However, this only wrote the output to the file once I stopped the Python shell, which also suggests that stdin gets stalled somewhere and is only actually transmitted once I quit the shell. Any ideas what goes wrong here?

Running an interactive command from within Python

I have a script that I want to run from within Python (2.6.5) that follows the logic below:
Prompts the user for a password. It looks like ("Enter password: ") (note: the input does not echo to the screen)
Outputs irrelevant information
Prompts the user for a response ("Blah Blah filename.txt blah blah (Y/N)?: ")
The last prompt line contains text which I need to parse (filename.txt). The response provided doesn't matter (the program could actually exit here without providing one, as long as I can parse the line).
My requirements are somewhat similar to Wrapping an interactive command line application in a Python script, but the responses there seem a bit confusing, and mine still hangs even when the OP mentions that it doesn't for him.
Through looking around, I've come to the conclusion that subprocess is the best way of doing this, but I'm having a few issues. Here is my Popen line:
p = subprocess.Popen("cmd", shell=True, stdout=subprocess.PIPE,
stderr=subprocess.STDOUT, stdin=subprocess.PIPE)
When I call a read() or readline() on stdout, the prompt is printed to the screen and it hangs.
If I call a write("password\n") on stdin, the prompt is written to the screen and it hangs. The text in write() is not written (I don't see the cursor move to a new line).
If I call p.communicate("password\n"), same behavior as write().
I was looking for a few ideas here on the best way to write to stdin, and possibly how to parse the last line in the output if you're feeling generous, though I could probably figure that out eventually.
If you are communicating with a program that subprocess spawns, you should check out A non-blocking read on a subprocess.PIPE in Python. I had a similar problem with my application and found using queues to be the best way to do ongoing communication with a subprocess.
As for getting values from the user, you can always use the raw_input() builtin to get responses, and for passwords, try using the getpass module to get non-echoing passwords from your user. You can then parse those responses and write them to your subprocess' stdin.
I ended up doing something akin to the following:
import sys
import subprocess
from threading import Thread

try:
    from Queue import Queue, Empty
except ImportError:
    from queue import Queue, Empty  # Python 3.x

def enqueue_output(out, queue):
    for line in iter(out.readline, b''):
        queue.put(line)
    out.close()

def getOutput(outQueue):
    outStr = ''
    try:
        while True:  # Adds output from the Queue until it is empty
            outStr += outQueue.get_nowait()
    except Empty:
        return outStr

p = subprocess.Popen("cmd", stdin=subprocess.PIPE, stdout=subprocess.PIPE, stderr=subprocess.PIPE, shell=False, universal_newlines=True)

outQueue = Queue()
errQueue = Queue()

outThread = Thread(target=enqueue_output, args=(p.stdout, outQueue))
errThread = Thread(target=enqueue_output, args=(p.stderr, errQueue))

outThread.daemon = True
errThread.daemon = True

outThread.start()
errThread.start()

try:
    someInput = raw_input("Input: ")
except NameError:
    someInput = input("Input: ")

p.stdin.write(someInput)
errors = getOutput(errQueue)
output = getOutput(outQueue)
Once you have the queues made and the threads started, you can loop through getting input from the user, getting errors and output from the process, and processing and displaying them to the user.
Using threading might be slightly overkill for simple tasks.
Instead, os.spawnvpe can be used. It will spawn the script's shell as a process, and you will be able to communicate with the script interactively.
In this example I passed the password as an argument; obviously that is not a good idea.
import os
import sys
from getpass import unix_getpass

def cmd(cmd):
    cmd = cmd.split()
    code = os.spawnvpe(os.P_WAIT, cmd[0], cmd, os.environ)
    if code == 127:
        sys.stderr.write('{0}: command not found\n'.format(cmd[0]))
    return code

password = unix_getpass('Password: ')
cmd_run = './run.sh --password {0}'.format(password)
cmd(cmd_run)

pattern = raw_input('Pattern: ')
lines = []
with open('filename.txt', 'r') as fd:
    for line in fd:
        if pattern in line:
            lines.append(line)
# manipulate lines
If you just want a user to enter a password without it being echoed to the screen, just use the standard library's getpass module:
import getpass
print("You entered:", getpass.getpass())
NOTE: The prompt for this function defaults to "Password: ". Also, this will only work on command lines where echoing can be controlled, so if it doesn't work, try running it from a terminal.

How can I run an external command asynchronously from Python?

I need to run a shell command asynchronously from a Python script. By this I mean that I want my Python script to continue running while the external command goes off and does whatever it needs to do.
I read this post:
Calling an external command in Python
I then went off and did some testing, and it looks like os.system() will do the job provided that I use & at the end of the command so that I don't have to wait for it to return. What I am wondering is if this is the proper way to accomplish such a thing? I tried commands.call() but it will not work for me because it blocks on the external command.
Please let me know if using os.system() for this is advisable or if I should try some other route.
subprocess.Popen does exactly what you want.
from subprocess import Popen
p = Popen(['watch', 'ls']) # something long running
# ... do other stuff while subprocess is running
p.terminate()
(Edit to complete the answer from comments)
The Popen instance can do various other things: you can poll() it to see if it is still running, and you can communicate() with it to send it data on stdin and wait for it to terminate.
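For instance (a tiny illustration of mine, not part of the original answer; the Unix cat command is only a stand-in for a long-running process):
from subprocess import Popen, PIPE

p = Popen(['cat'], stdin=PIPE, stdout=PIPE)   # any long-running filter would do
print(p.poll())                               # None -> still running
out, _ = p.communicate(b'hello\n')            # send stdin, read output, wait for exit
print(out, p.returncode)                      # b'hello\n' 0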
If you want to run many processes in parallel and then handle them when they yield results, you can use polling like in the following:
from subprocess import Popen, PIPE
import time

running_procs = [
    Popen(['/usr/bin/my_cmd', '-i %s' % path], stdout=PIPE, stderr=PIPE)
    for path in '/tmp/file0 /tmp/file1 /tmp/file2'.split()]

while running_procs:
    for proc in running_procs:
        retcode = proc.poll()
        if retcode is not None:  # Process finished.
            running_procs.remove(proc)
            break
    else:  # No process is done, wait a bit and check again.
        time.sleep(.1)
        continue

    # Here, `proc` has finished with return code `retcode`
    if retcode != 0:
        """Error handling."""
    handle_results(proc.stdout)
The control flow there is a little bit convoluted because I'm trying to make it small -- you can refactor to your taste. :-)
This has the advantage of servicing the early-finishing requests first. If you call communicate on the first running process and that turns out to run the longest, the other running processes will have been sitting there idle when you could have been handling their results.
This is covered by Python 3 Subprocess Examples under "Wait for command to terminate asynchronously". Run this code using IPython or python -m asyncio:
import asyncio

proc = await asyncio.create_subprocess_exec(
    'ls', '-lha',
    stdout=asyncio.subprocess.PIPE,
    stderr=asyncio.subprocess.PIPE)

# do something else while ls is working
# if proc takes very long to complete, the CPUs are free to use cycles for
# other processes
stdout, stderr = await proc.communicate()
The process will start running as soon as the await asyncio.create_subprocess_exec(...) has completed. If it hasn't finished by the time you call await proc.communicate(), it will wait there in order to give you your output status. If it has finished, proc.communicate() will return immediately.
The gist here is similar to Terrel's answer, but I think that answer overcomplicates things.
See asyncio.create_subprocess_exec for more information.
What I am wondering is if this [os.system()] is the proper way to accomplish such a thing?
No. os.system() is not the proper way. That's why everyone says to use subprocess.
For more information, read http://docs.python.org/library/os.html#os.system
The subprocess module provides more powerful facilities for spawning new processes and retrieving their results; using that module is preferable to using this function. Use the subprocess module. Check especially the Replacing Older Functions with the subprocess Module section.
The accepted answer is very old.
I found a better modern answer here:
https://kevinmccarthy.org/2016/07/25/streaming-subprocess-stdin-and-stdout-with-asyncio-in-python/
and made some changes:
make it work on Windows
make it work with multiple commands
import sys
import asyncio

if sys.platform == "win32":
    asyncio.set_event_loop_policy(asyncio.WindowsProactorEventLoopPolicy())

async def _read_stream(stream, cb):
    while True:
        line = await stream.readline()
        if line:
            cb(line)
        else:
            break

async def _stream_subprocess(cmd, stdout_cb, stderr_cb):
    try:
        process = await asyncio.create_subprocess_exec(
            *cmd, stdout=asyncio.subprocess.PIPE, stderr=asyncio.subprocess.PIPE
        )
        await asyncio.wait(
            [
                _read_stream(process.stdout, stdout_cb),
                _read_stream(process.stderr, stderr_cb),
            ]
        )
        rc = await process.wait()
        return process.pid, rc
    except OSError as e:
        # the program will hang if we let any exception propagate
        return e

def execute(*aws):
    """run the given coroutines in an asyncio loop

    returns a list containing the values returned from each coroutine.
    """
    loop = asyncio.get_event_loop()
    rc = loop.run_until_complete(asyncio.gather(*aws))
    loop.close()
    return rc

def printer(label):
    def pr(*args, **kw):
        print(label, *args, **kw)
    return pr

def name_it(start=0, template="s{}"):
    """a simple generator for task names"""
    while True:
        yield template.format(start)
        start += 1

def runners(cmds):
    """
    cmds is a list of commands to execute as subprocesses
    each item is a list appropriate for use by subprocess.call
    """
    next_name = name_it().__next__
    for cmd in cmds:
        name = next_name()
        out = printer(f"{name}.stdout")
        err = printer(f"{name}.stderr")
        yield _stream_subprocess(cmd, out, err)

if __name__ == "__main__":
    cmds = (
        [
            "sh",
            "-c",
            """echo "$SHELL"-stdout && sleep 1 && echo stderr 1>&2 && sleep 1 && echo done""",
        ],
        [
            "bash",
            "-c",
            "echo 'hello, Dave.' && sleep 1 && echo dave_err 1>&2 && sleep 1 && echo done",
        ],
        [sys.executable, "-c", 'print("hello from python");import sys;sys.exit(2)'],
    )
    print(execute(*runners(cmds)))
It is unlikely that the example commands will work perfectly on your system, and it doesn't handle weird errors, but this code does demonstrate one way to run multiple subprocesses using asyncio and stream the output.
I've had good success with the asyncproc module, which deals nicely with the output from the processes. For example:
import os
from asyncproc import Process

myProc = Process("myprogram.app")

while True:
    # check to see if process has ended
    poll = myProc.wait(os.WNOHANG)
    if poll is not None:
        break
    # print any new output
    out = myProc.read()
    if out != "":
        print out
Using pexpect with non-blocking readlines is another way to do this. Pexpect solves the deadlock problems, allows you to easily run the processes in the background, and gives easy ways to have callbacks when your process spits out predefined strings, and generally makes interacting with the process much easier.
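A hedged sketch of what that can look like (the ftp command and prompts here are illustrative, not taken from the question; pexpect.spawn is Unix-only, with pexpect.popen_spawn.PopenSpawn as the usual Windows substitute):
import pexpect

child = pexpect.spawn('ftp ftp.example.com', encoding='utf-8')
child.expect('Name .*: ')          # wait until the child prints a known prompt
child.sendline('anonymous')
child.expect('Password:')
child.sendline('guest@example.com')
child.expect('ftp> ')
child.sendline('bye')
child.expect(pexpect.EOF)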
Considering "I don't have to wait for it to return", one of the easiest solutions will be this:
subprocess.Popen(
    [path_to_executable, arg1, arg2, ... argN],
    creationflags=subprocess.CREATE_NEW_CONSOLE,
).pid
But... From what I read, this is not "the proper way to accomplish such a thing" because of security risks created by the subprocess.CREATE_NEW_CONSOLE flag.
The key things that happen here are the use of subprocess.CREATE_NEW_CONSOLE to create a new console and .pid (which returns the process ID so that you can check on the program later if you want to), so that you don't have to wait for the program to finish its job.
I had the same problem trying to connect to a 3270 terminal using the s3270 scripting software in Python. Now I'm solving the problem with a subclass of Process that I found here:
http://code.activestate.com/recipes/440554/
And here is the sample taken from the file:
def recv_some(p, t=.1, e=1, tr=5, stderr=0):
    if tr < 1:
        tr = 1
    x = time.time() + t
    y = []
    r = ''
    pr = p.recv
    if stderr:
        pr = p.recv_err
    while time.time() < x or r:
        r = pr()
        if r is None:
            if e:
                raise Exception(message)
            else:
                break
        elif r:
            y.append(r)
        else:
            time.sleep(max((x - time.time()) / tr, 0))
    return ''.join(y)

def send_all(p, data):
    while len(data):
        sent = p.send(data)
        if sent is None:
            raise Exception(message)
        data = buffer(data, sent)

if __name__ == '__main__':
    if sys.platform == 'win32':
        shell, commands, tail = ('cmd', ('dir /w', 'echo HELLO WORLD'), '\r\n')
    else:
        shell, commands, tail = ('sh', ('ls', 'echo HELLO WORLD'), '\n')

    a = Popen(shell, stdin=PIPE, stdout=PIPE)
    print recv_some(a),
    for cmd in commands:
        send_all(a, cmd + tail)
        print recv_some(a),
    send_all(a, 'exit' + tail)
    print recv_some(a, e=0)
    a.wait()
There are several answers here, but none of them satisfied the requirements below:
I don't want to wait for command to finish or pollute my terminal with subprocess outputs.
I want to run bash script with redirects.
I want to support piping within my bash script (for example find ... | tar ...).
The only combination that satisfies the above requirements is:
subprocess.Popen(['./my_script.sh "arg1" > "redirect/path/to"'],
                 stdout=subprocess.PIPE,
                 stderr=subprocess.PIPE,
                 shell=True)
