How do I use Python to launch an interactive Docker container? - python

I am working with a Docker image which I launch in interactive mode like so: docker run -it --rm ubuntu bash
The actual image I work with has many complicated parameters, which is why I wrote a script to construct the full docker run command and launch it for me. As the logic grew more complicated, I want to migrate the script from bash to Python.
Using docker-py, I prepared everything needed to run the image. It seems that docker.containers.run does not support interactive shells, however. Using subprocess instead seemed logical, so I tried the following:
import subprocess
subprocess.Popen(['docker', 'run', '-it', '--rm', 'ubuntu', 'bash'])
But this gives me:
$ python3 docker_run_test.py
$ unable to setup input stream: unable to set IO streams as raw terminal: input/output error
$
Note that the error message appears in a different shell prompt from the python command.
How do I make python3 docker_run_test.py do the equivalent of running docker run -it --rm ubuntu bash?

You can use a pseudo-terminal to read from and write to the container process. Note that pty.openpty() returns (master, slave) file descriptors; naming them pty and tty as in the common version of this snippet shadows the pty module itself, so descriptive names are used here:

import os
import pty
import select
import subprocess
import sys

# The master end stays with this script; the slave end becomes the
# container's controlling terminal
master_fd, slave_fd = pty.openpty()
p = subprocess.Popen(['docker', 'run', '-it', '--rm', 'ubuntu', 'bash'],
                     stdin=slave_fd, stdout=slave_fd, stderr=slave_fd)
while p.poll() is None:
    # Watch two files: STDIN of your Python process and the master end of the pty
    r, _, _ = select.select([sys.stdin, master_fd], [], [])
    if sys.stdin in r:
        input_from_your_terminal = os.read(sys.stdin.fileno(), 10240)
        os.write(master_fd, input_from_your_terminal)
    elif master_fd in r:
        output_from_docker = os.read(master_fd, 10240)
        os.write(sys.stdout.fileno(), output_from_docker)
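One caveat with this loop: while the child runs, your own terminal stays in canonical mode, so input is line-buffered and echoed locally before being forwarded. A small helper can put your terminal into raw mode around the loop and restore it afterwards (raw_terminal is my name for it, not a standard API; it no-ops when stdin isn't a real terminal):

```python
import contextlib
import os
import sys
import termios
import tty

@contextlib.contextmanager
def raw_terminal(fd):
    """Put a tty fd into raw mode for the duration, restoring it afterwards."""
    if not os.isatty(fd):
        yield          # not a real terminal (e.g. a pipe): nothing to do
        return
    old_settings = termios.tcgetattr(fd)
    try:
        tty.setraw(fd)
        yield
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old_settings)
```

Wrapping the select loop in `with raw_terminal(sys.stdin.fileno()):` lets keystrokes and control characters (Ctrl-C, tab completion) reach the container directly instead of being interpreted by your local terminal.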

Can we use this?
import os
os.system('docker run -it --rm ubuntu bash')
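os.system works because the child simply inherits your terminal. The same is true of subprocess.run with no redirections, which also makes it easy to assemble the complicated parameters mentioned in the question as a list; a sketch (build_docker_cmd and its options are illustrative, not a fixed API):

```python
import subprocess

def build_docker_cmd(image, command, interactive=True, rm=True, env=None):
    """Assemble a `docker run` argv list from keyword options."""
    cmd = ['docker', 'run']
    if interactive:
        cmd.append('-it')
    if rm:
        cmd.append('--rm')
    for key, value in (env or {}).items():
        cmd += ['-e', f'{key}={value}']
    cmd += [image, command]
    return cmd

# subprocess.run blocks and lets the child inherit stdin/stdout,
# so the shell is fully interactive:
# subprocess.run(build_docker_cmd('ubuntu', 'bash'))
```

Building an argv list (rather than a shell string) also sidesteps quoting problems as the parameter logic grows.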

Related

python http server hide console

I am trying to write a script that hosts my files on my local network; here's my code:
import os
import getpass
os.system('python -m http.server --directory C:/Users/'+getpass.getuser())
But the problem is that the HTTP server console is showing on my desktop, and that's annoying! So I tried to hide it by renaming the file to .pyw, but it's not working.
Do you have any idea how to hide this console? Thank you :D
On Linux you can use nohup to ignore the HUP signal.
You could add nohup to your code like this:
import os
import getpass
os.system('nohup python -m http.server --directory C:/Users/'+getpass.getuser())
Update
Solution for windows
import os
import subprocess
import getpass
env = os.environ
directory = 'C:/Users/'+getpass.getuser()
proc = subprocess.Popen(['python', '-m', 'http.server', '--directory', directory], env=env)
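A caveat: a plain Popen on Windows can still attach the child to your console. If hiding the window is specifically the goal, Python 3.7+ exposes the CREATE_NO_WINDOW creation flag; a sketch, guarded so the Windows-only constant is only touched on Windows (port 8037 is arbitrary):

```python
import subprocess
import sys

# CREATE_NO_WINDOW exists only on Windows (Python 3.7+); elsewhere pass 0
flags = subprocess.CREATE_NO_WINDOW if sys.platform == 'win32' else 0
proc = subprocess.Popen([sys.executable, '-m', 'http.server', '8037'],
                        creationflags=flags)
# ... the server now runs without opening a console window ...
proc.terminate()
proc.wait(timeout=10)
```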
Assuming you're on Linux (or another Unix-based OS), you can detach the process from the console after starting the server.
Here is one way to do it with the screen command:
sudo apt install -y screen
And then
screen -d -m python3 script.py
Where script.py is the snippet you have shared.
Reference for the flags
...
-d -m
Start screen in detached mode. This creates a new session but doesn’t attach to it. This is useful for system startup scripts.
I found a way to do it, with a VBS script:
Set WshShell = CreateObject("WScript.Shell")
WshShell.Run chr(34) & "main.py" & Chr(34), 0
Set WshShell = Nothing
Just replace main.py with the path of your script.

Python script for Network Packet Capture in Nifi

I am new to the nifi platform.
I am trying to use a Python script to capture network packets, which works when run from VS Code, and I want to implement the same script using NiFi but am unable to do so.
This is the Python code I used:
import os, subprocess
from subprocess import PIPE
from datetime import datetime

n = 10
filename = str(datetime.now()).replace(" ", "")
b = subprocess.run(f'sudo tcpdump udp -e -i wlp6s0 -nn -vvv -c {n} -w {filename}.raw', shell=True)
c = '"X%02x"'
a = subprocess.run(f"sudo hexdump -v -e '1/1 {c}' {filename}.raw | sed -e 's/\s\+//g' -e 's/X/\\x/g'", shell=True, stdout=PIPE, stderr=PIPE)
output_file = open(f'{filename}.txt', 'w')
output_file.write(str(a.stdout))
# print("*************************File Created*************************")
output_file.close()
I am using the ExecuteScript processor to run the Python script, but it doesn't seem to be working. For the sudo commands I have configured passwordless sudo so that no input is needed while executing the script.
Thank you!
Since you're just calling shell commands, you might consider ExecuteStreamCommand instead. You can still run the top-level Python script to call the subprocesses, but since you're not working with flowfile attributes, you might be better served being able to call "real" Python. In ExecuteScript the engine is actually Jython, which doesn't let you import native (CPython) modules such as scikit-learn; you can only import pure Python modules (Python scripts that don't themselves import native modules).
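One more thing worth checking in the script itself: with stdout=PIPE, subprocess.run captures bytes, and str(a.stdout) writes the b'...' repr into the text file rather than the actual output. Decoding first avoids that; a minimal illustration with a harmless command in place of the hexdump pipeline:

```python
import subprocess
from subprocess import PIPE

result = subprocess.run(['echo', 'captured'], stdout=PIPE, stderr=PIPE)
text = result.stdout.decode()        # b'captured\n' -> 'captured\n'
with open('capture.txt', 'w') as output_file:
    output_file.write(text)          # file now holds the real output
```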

Start a background shell script from python

I would like to connect a remote machine and run background script in that machine from python.
I tried:
os.system("ssh root@10.0.0.1 'nohup script.sh &'")
But it doesn't seem to work. And if I put nohup in script.sh and simply run
os.system("ssh root@10.0.0.1 'script.sh'")
the nohup command does not work in either case.
I'm confused as to why, and does anybody know how to run a background job from Python, or is it just impossible this way?
What kind of errors are you getting? What version of Python are you using?
You should take a look at this: Python subprocess - run multiple shell commands over SSH

import subprocess

sshProcess = subprocess.Popen(["ssh", "root@10.0.0.1"],
                              stdin=subprocess.PIPE,
                              stdout=subprocess.PIPE,
                              universal_newlines=True,
                              bufsize=0)
sshProcess.stdin.write("nohup script.sh &")
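One detail to check in this pattern: the command written to stdin above has no trailing newline and the pipe is never closed, so the remote shell may never execute it or exit. A sketch of the corrected pattern, shown driving a local shell so it runs anywhere; substitute the ssh argv for the remote case:

```python
import subprocess

# For the remote case replace ['bash'] with ['ssh', 'root@10.0.0.1']
proc = subprocess.Popen(['bash'], stdin=subprocess.PIPE,
                        stdout=subprocess.PIPE, universal_newlines=True)
proc.stdin.write('echo remote-job-started\n')  # trailing newline runs the line
proc.stdin.close()                             # EOF lets the shell exit cleanly
output = proc.stdout.read()
proc.wait()
```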
For example, suppose you have a local script (Python, bash, etc.; here I demonstrate with a Python script).
First, create a Python file locally, say hello.py:

# hello.py
import os
print os.system('hostname')

Second, a Python script that executes the above hello.py on a remote machine (note that exec is a reserved word and cannot be used as a variable name):

import pathos

copy = pathos.core.copy('hello.py', destination='abc.remote.com:~/hello.py')
job = pathos.core.execute('python hello.py', host='abc.remote.com')
print job.response()

Run Python Script Automatically at Startup - Ubuntu 16.04

I'm on Ubuntu 16.04.1, and I have a Python script to download image files from websites; the code is as follows:
import sys
import os
import time
import json
import shlex, subprocess
import signal
import random
R = 0
r = 0
targets = ['192.0.78.13', '134.109.133.7', '216.58.212.227', '54.246.159.107', '185.60.216.35', '98.136.103.24']
if __name__ == "__main__":
    while True:
        cmd = 'wget -A pdf,jpg,png -m -p -E -k -K -np --delete-after '
        R = random.randint(0, 5)
        cmd += targets[R]
        args = shlex.split(cmd)
        p = subprocess.Popen(args, shell=False)
        time.sleep(2.0)
        # killing all processes in the group
        os.kill(p.pid, signal.SIGTERM)
        if p.poll() is None:  # Force kill if the process is still running
            os.kill(p.pid, signal.SIGKILL)
        r = random.randint(3, 20)
        time.sleep(r - 1)
It runs perfectly with the command python webaccess.py; now I want to run it automatically on startup, in the background.
I've tried two methods but both of them fail (the script does not run):
Use crontab using the guide here: Run Python script at startup in Ubuntu
@reboot python /bin/web1.py &
Edit the rc.local using the guide here: https://askubuntu.com/questions/817011/run-python-script-on-os-boot
python /bin/web1.py &
Is there any way to solve this?
Thank you in advance.
Your rc.local method should work; check that you're using the full Python path, which by default is /usr/bin/python:
/usr/bin/python your_file.py
Also, you said you verified python webaccess.py; verify it from outside the script's folder as well.
Also note that scripts in rc.local are executed by root,
so check path_to_python python_file from a root shell.
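When a script works from your shell but not from rc.local or cron, a quick way to see how it is actually being launched is to log the interpreter and working directory at startup; a small diagnostic sketch (the log path is arbitrary):

```python
import os
import sys

# Append one diagnostic line per start; inspect the log after the next reboot
# to see which interpreter and working directory the boot mechanism used
with open('/tmp/webaccess_boot.log', 'a') as log:
    log.write(f'python={sys.executable} cwd={os.getcwd()}\n')
```

If nothing appears in the log at all, the boot mechanism never ran the script; if a line appears with an unexpected interpreter or cwd, that points to the path problem described above.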

Why is my Python script writing Windows style carriage returns?

I am trying to write a script that creates a fabfile, saves it and then runs it. Here is my code so far:
#!/usr/bin/python
bakery_internalip = "10.10.15.203"
print "[....] Preparing commands to run within fabfile.py"
fabfile = open("sfab.py", "w")
fabfile.write("from fabric.api import run, sudo, task\n\n@task\ndef myinstall():\n\tsudo('yum install httpd')")
fabfile.close
print "Running Fab Commands"
import subprocess
subprocess.call(['fab', '-f', 'sfab.py', '-u', 'ec2-user', '-i', 'id_rsa', '-H', bakery_internalip, 'myinstall'])
The contents of my fabfile are as follows:
[root@ip-10-10-20-82 bakery]# cat sfab.py
from fabric.api import run, sudo, task

@task
def myinstall():
    sudo('yum install httpd')
My script gives the following error when I run it:
Fatal error: Fabfile didn't contain any commands!
However, if I run dos2unix on the file and then run the following, it works fine:
fab -f sfab.py -H localhost myinstall
Simple typo: fabfile.close should be fabfile.close().
Running without closing will give you:
Running Fab Commands
Fatal error: Fabfile didn't contain any commands!
Aborting
with open("sfab.py", "w") as fabfile:
    fabfile.write("from fabric.api import run, sudo, task\n\n@task\ndef myinstall():\n\tsudo('yum install httpd')")

Always use with as above to open your files; it will automatically close them for you and avoid these simple errors.
I assume you are running it on Windows.
When using open(path, "w"), Python uses the OS's native line-break sequence.
To write \n specifically, use open(path, "wb").
For more information see open().
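On Python 3 the binary-mode trick is unnecessary: text-mode open() accepts a newline parameter that disables the translation, so '\n' is written verbatim even on Windows. A small sketch:

```python
# newline='\n' (or newline='') stops Python translating '\n' to os.linesep,
# so the file gets Unix line endings regardless of platform
with open('sfab_unix.py', 'w', newline='\n') as fabfile:
    fabfile.write('from fabric.api import task\n\n@task\ndef myinstall():\n    pass\n')

with open('sfab_unix.py', 'rb') as check:
    assert b'\r' not in check.read()   # no Windows carriage returns
```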
