Docker-compose with Python cmd command handler

I am trying to use the Python cmd library in combination with Docker.
This is my minimal command class setup:
class Commands(cmd.Cmd):
    intro = 'Welcome to the shell.\nType help or ? to list commands.\n'
    prompt = 'Shell: '

    @staticmethod
    def do_stop(arg):
        """
        Stops the servers gracefully
        :param arg:
        :return:
        """
        logger.info("Stopping server...")
        # Do stuff
If I start the app without Docker, the shell works just fine and I can interact with it without issues. However, if I use docker-compose up, I get an endless loop of error messages.
My application main.py looks like the following:
if __name__ == "__main__":
    Commands().cmdloop()
Why is Docker complaining about unknown syntax? Is something writing to stdout or stderr that I am not aware of?

Using:
stdin_open: true
in the docker-compose file will fix the issue. The reason: without an attached stdin, input() inside cmdloop hits end-of-file immediately, cmd dispatches the line 'EOF', and since there is no do_EOF handler it prints *** Unknown syntax: EOF and loops forever.
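For reference, a minimal sketch of the relevant compose entry (the service and image names here are placeholders, not taken from the original setup):

services:
  shell-app:
    image: my-python-app   # placeholder image name
    stdin_open: true       # keep stdin open so cmdloop has something to read
    tty: true              # optional: allocate a TTY for a nicer interactive prompt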

Calling a .ps1 script in Linux Powershell from a Flask route in Python

I'm trying to invoke a .ps1 script in Powershell for Linux by passing a variable and triggering it in reaction to a web call to a Flask/Python API.
All the files are in the same directory in this example - Ideally the .ps1 script and other relevant files to the action it needs to take would be in another directory, but for testing, they're all in the main directory of my venv.
If I run the following code manually, via my_venv >> python script.py
# this is script.py
import subprocess
arg1 = 'xyz'
arg2 = 'abc'
subprocess.run(['pwsh', '.\example.ps1', arg1, arg2])
It will work properly. Powershell will run, and the actions in the script example.ps1 will execute. However, if I add the same Python code to a Flask app route so it can be triggered by an API request like so:
from flask import Flask, request
import subprocess

app = Flask(__name__)
app.debug = True

# example route
@app.route('/example/', methods=['GET'])
def example():
    var1 = request.args.get('var1')
    arg1 = var1
    arg2 = 'abc'
    subprocess.run(['pwsh', '.\example.ps1', arg1, arg2])
    return ('success', 200)
It doesn't do anything. Flask debugging gives me the error:
Exception: builtins.FileNotFoundError: [Errno 2] No such file or directory: 'pwsh'
Which makes me think it's not locating the binary for pwsh, but I'm not clear on how to fix that in this situation. In Windows, you'd put the path to the powershell.exe executable in your command, but that's obviously not how it works here.
One note on the variable above: I've tried it without, just letting a value pass to var1 via GET, then ignoring it and hardcoding arg1 to test; it makes no difference. The real code does need the variable.
Any obvious dumb thing I'm doing wrong here? This question doesn't seem to have been asked for Linux, although there are some similar Windows questions.
In Windows, it's usually a best-practice to fully path your executables and arguments if you're unsure of the environment variables. You can accomplish this in your example by using
subprocess.run(['/usr/bin/pwsh', '.\example.ps1', arg1, arg2])
to fully qualify the pwsh executable path in Linux.
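If you are unsure where pwsh is installed on a given machine, a small sketch using the standard library's shutil.which (assuming pwsh is on the PATH of the user the Flask app runs as) can resolve the full path at runtime:

import shutil
import subprocess

# resolve the full path to the binary; returns None if pwsh is not on PATH
pwsh_path = shutil.which('pwsh')
if pwsh_path is None:
    raise RuntimeError('pwsh not found on PATH')
subprocess.run([pwsh_path, './example.ps1', 'xyz', 'abc'])

This also makes the failure mode explicit instead of surfacing as a bare FileNotFoundError.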

Python salt cloudclient is not logging anything

I am trying to create new nodes using the CloudClient in the SaltStack Python API. Nodes are created successfully, but I don't see any logging happening. Below is the code I am using.
from salt.cloud import CloudClient

cloud_client = CloudClient()
kwargs = {'parallel': True}
cloud_client.map_run(path="mymap.map", **kwargs)
Is there a way to run the same code in debug mode, so I can see the output on the console from this Python script if logging cannot be done?
These are the logging parameters in my cloud config:
log_level: all
log_level_logfile: all
log_file: /var/logs/salt.log
When I run salt-cloud on the CLI, it works with the below command:
salt-cloud -m mymap.map -P
I was able to make it work by adding the code below:
from salt.log.setup import setup_console_logger
setup_console_logger(log_level='debug')
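Putting the two snippets together, a minimal sketch of the working script (same map file and options as above) would be:

from salt.cloud import CloudClient
from salt.log.setup import setup_console_logger

# enable console logging before the client does any work
setup_console_logger(log_level='debug')

cloud_client = CloudClient()
cloud_client.map_run(path="mymap.map", parallel=True)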

python values to bash line on a remote server

So I have a Python script that connects to the client servers and then gets some data that I need.
It works in this way: my bash script on the client side needs input like the one below, and it works when called like this:
client.exec_command('/apps./tempo.sh' 2016 10 01 02 03))
Now I'm trying to get the user input from my Python script and then transfer it to my remotely called bash script, and that's where I get my problem. Below is the method I tried that I had no luck getting to work:
import sys
client.exec_command('/apps./tempo.sh', str(sys.argv))
I believe you are using Paramiko; you should tag it or include that info in your question.
The basic problem I think you're having is that you need to include those arguments inside the string, i.e.
client.exec_command('/apps./tempo.sh %s' % str(sys.argv))
otherwise they get applied to the other arguments of exec_command. I think your original example is not quite accurate in how it works.
Just out of interest, have you looked at "fabric" (http://www.fabfile.org)? It has lots of very handy functions like "run", which will run a command on a remote server (or lots of remote servers!) and return the response.
It also gives you lots of protection by wrapping around popen and paramiko for the SSH login etc., so it can be much more secure than trying to make web services or other things.
You should always be wary of injection attacks. I'm unclear how you are injecting your variables, but if a user calls your script with something like python runscript "; rm -rf /", that could cause very bad problems for you. It would instead be better to have 'options' on the command, which are programmed in, limiting the user's input drastically, or at least to have a lot of protection around the input variables. Of course, if this is only for you (or trained people), then it's a little easier.
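For example, here is a minimal sketch of quoting user-supplied arguments before building the remote command. It uses the standard library's shlex.quote (Python 3) and assumes client is a connected Paramiko SSHClient as in the question:

import shlex
import sys

# quote each argument so shell metacharacters like ';' are passed literally
safe_args = ' '.join(shlex.quote(a) for a in sys.argv[1:])
client.exec_command('/apps./tempo.sh ' + safe_args)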
I recommend using paramiko for the ssh connection.
import paramiko
ssh_client = paramiko.SSHClient()
ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh_client.connect(server, username=user, password=password)
...
ssh_client.close()
And if you want to simulate a terminal, as if a user was typing:
import time

chan = ssh_client.invoke_shell()
chan.send('PS1="python-ssh:"\n')

def exec_command(cmd):
    """Gets ssh command(s), executes them, and returns the output"""
    prompt = 'python-ssh:'  # the command line prompt in the ssh terminal
    buff = ''
    chan.send(str(cmd) + '\n')
    while not chan.recv_ready():
        time.sleep(1)
    while not buff.endswith(prompt):
        buff += chan.recv(1024)  # read from the channel, not ssh_client.chan
    return buff[:-len(prompt)]  # strip the trailing prompt from the output
Example usage: exec_command('pwd')
And the result will even be returned to you via SSH.
Assuming that you are using Paramiko, you need to send the command as a string. It seems that you want to pass the command line arguments given to your Python script as arguments for the remote command, so try this:
import sys
command = '/apps./tempo.sh'
args = ' '.join(sys.argv[1:]) # all args except the script's name!
client.exec_command('{} {}'.format(command, args))
This will collect all the command line arguments passed to the Python script, except the first argument, which is the script's file name, and build a space-separated string. This argument string is then concatenated with the bash script command and executed remotely.
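For example, running python yourscript.py 2016 10 01 02 03 locally would execute /apps./tempo.sh 2016 10 01 02 03 on the remote host, matching the working invocation from the question (yourscript.py is a placeholder name).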

Mininet Python API - CLI Class

I am having trouble understanding how to use the CLI class properly. Calling the class constructor without a script automatically runs the CLI in interactive mode. Therefore you need to manually exit interactive mode to obtain the class instance. Only then can you call the class methods using said instance. This seems very strange.
What I am trying to do is write a program which configures the network and then opens several xterm windows on separate nodes and launches an application inside them. Is this possible?
Edit:
For example something like the following:
#!/usr/bin/python
from mininet.net import Mininet
from mininet.log import setLogLevel
from mininet.cli import CLI
from mininet.topolib import TreeTopo

def test():
    "Create and test a simple network"
    net = Mininet(TreeTopo(depth=2, fanout=2))
    net.start()
    cli = CLI(net)
    CLI.do_xterm(cli, "h1 h2")
    net.stop()

if __name__ == '__main__':
    setLogLevel('info')
    test()
Calling the CLI class constructor in order to obtain the class instance automatically launches Mininet in interactive mode, which needs to be manually exited before the do_xterm method can be invoked on the class instance.
I suppose a CLI is made to be used on stdin, so making use of scripting instead of programmatic manipulation of the CLI makes some sense.
If you want to obtain a reference to the cli object without interactive mode, you could make a workaround by creating an empty text file called "null_script" and then calling
cli = CLI(net, script='null_script')
Your real goal seems to be to programmatically open xterms and have them run applications. Since you don't give a reason why you can't use scripts, I propose a solution that uses a script. Put the following in a text file:
py h1.cmd('screen -dmS mininet.h1')
sh xterm -title Node:h1 -e screen -D -RR -S mininet.h1 &
sh screen -x -S mininet.h1 -X stuff 'ls'`echo '\015'`
Using this text file as a script in the CLI works for me, both via the 'source' command in the CLI and by passing the filename into 'script='.
I took the command arguments from the makeTerm function in term.py, and the screen stuff arguments from an answer on superuser. Just replace 'ls' with the name of the application you want to run.
Each screen you are trying to attach to needs to have a unique name, otherwise you will get a message listing the matching names and you will have to specify a pid for the correct session, which would complicate things.
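If you would rather stay fully programmatic and skip the CLI entirely, here is a rough sketch using makeTerm from mininet.term, the same helper the CLI's xterm command uses. Details may vary between Mininet versions, so treat this as an assumption to verify:

#!/usr/bin/python
from mininet.net import Mininet
from mininet.term import makeTerm
from mininet.topolib import TreeTopo

net = Mininet(TreeTopo(depth=2, fanout=2))
net.start()
# open an xterm on h1 that runs 'ls' and then drops into a shell;
# makeTerm returns the spawned terminal processes
net.terms += makeTerm(net.get('h1'), cmd="bash -c 'ls; bash'")
# ... run your tests / wait for the terminals here ...
net.stop()  # stop() also cleans up the spawned terminals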

How to check if a docker instance is running?

I am using Python to start docker instances.
How can I identify if they are running? I can pretty easily use docker ps from terminal like:
docker ps | grep myimagename
and if this returns anything, the image is running. If it returns an empty string, the image is not running.
However, I cannot understand how to get subprocess.Popen to work with this - it requires a list of arguments so something like:
p = subprocess.Popen(['docker', 'ps', '|', 'grep', 'myimagename'], stdout=subprocess.PIPE)
print p.stdout
does not work, because it runs 'docker' as the executable and passes the pipe and the grep part to it as literal arguments (which docker doesn't support); no shell ever interprets the '|'.
It doesn't seem I can give it the full command, either, as Popen tries to run the entire first argument as the executable, so this fails:
p = subprocess.Popen('docker ps | grep myimagename', stdout=subprocess.PIPE)
print p.stdout
Is there a way to actually run docker ps from Python? I don't know if trying to use subprocess is the best route or not. It is what I am using to run the docker containers, however, so it seemed to be the right path.
How can I determine if a docker instance is running from a Python script?
You can use the Python docker client:
import docker

DOCKER_CLIENT = docker.DockerClient(base_url='unix://var/run/docker.sock')
RUNNING = 'running'

def is_running(container_name):
    """
    verify the status of a sniffer container by its name
    :param container_name: the name of the container
    :return: True if the container's status is 'running'
    """
    container = DOCKER_CLIENT.containers.get(container_name)
    container_state = container.attrs['State']
    container_is_running = container_state['Status'] == RUNNING
    return container_is_running

my_container_name = "asdf"
print(is_running(my_container_name))
One option is to use subprocess.check_output setting shell=True (thanks slezica!):
s = subprocess.check_output('docker ps', shell=True)
print 'Results of docker ps: ' + s
If the docker ps command fails (for example, if your docker machine isn't started), check_output will throw an exception.
A simple find can then verify your container is found / not-found:
if s.find('containername') != -1:
    print 'found!'
else:
    print 'not found.'
I would also recommend matching on the container hash ID rather than the container name here, as the name may also appear in an image name or in other fields of the docker ps output.
Even though it seems like you are on your way, I would recommend you use docker-py, as it talks to the socket created by Docker to issue API requests. I currently use this library and it is a real time saver.
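As a short sketch of that docker-py route (same assumptions as the answer above: the default Unix socket and a known container name), you can also filter server-side instead of fetching a single container by name:

import docker

client = docker.DockerClient(base_url='unix://var/run/docker.sock')

def is_running(name):
    # the 'status' filter restricts results to live containers;
    # note the 'name' filter does substring matching
    return len(client.containers.list(filters={'name': name, 'status': 'running'})) > 0

print(is_running('myimagename'))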
