I want to save the output files (binary, matplotlib, etc.) of a Python program on a different computer over ssh. So far, I have saved files on the same computer that runs the program, and to do that I have a line in my Python code: filename = '/OutputFiles/myOutput.txt'. How do I change this line so that I can save the output on a different computer through ssh? It can be assumed that the ssh login password for the remote computer is in my keyring.
(New answer because OP specified they wanted to write multiple files)
You want to look into the paramiko module. It might look something like this:
import paramiko

connection = paramiko.SSHClient()
connection.set_missing_host_key_policy(paramiko.AutoAddPolicy())
# myuser / mypass are placeholders for your credentials
connection.connect('10.10.10.10', username=myuser, password=mypass)

# Open an SFTP session on top of the SSH connection
ftp = connection.open_sftp()
for i in range(10):
    results = do_stuff()
    f = ftp.open(str(i) + '.txt', 'w+')
    f.write(results)
    f.close()
ftp.close()
connection.close()
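Since the question mentions matplotlib output as well: paramiko's SFTP file handles are file-like, so as a hedged sketch (reusing the ftp client from above, with an example remote path) a figure can be saved straight to the remote machine:
import matplotlib
matplotlib.use("Agg")            # render without needing a display
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot([0, 1, 2], [0, 1, 4])

# ftp is the SFTP client opened above; the remote path is only an example
with ftp.open('/OutputFiles/myPlot.png', 'wb') as remote_file:
    fig.savefig(remote_file, format='png')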
The easiest way to do this would be to write your output to standard out and pipe it to ssh (assuming you're on a Mac, Linux, or other *nix-based machine). For example:
python myProgram | ssh user@host 'cat > outfile.txt'
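On the Python side of that pipe, a minimal sketch of the sending program might look like this (generate_output() is just a hypothetical stand-in for whatever produces your data):
# myProgram.py - a sketch of the sending side
import sys

def generate_output():
    # stand-in for whatever your program actually computes
    return b"some binary or text payload\n"

data = generate_output()
sys.stdout.buffer.write(data)   # write raw bytes so binary output survives the pipe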
Related
I need to establish a connection to a telnet server which continually broadcasts information and save all of that information to a file. For testing, I am using the famous ASCII Star Wars located on towel.blinkenlights.nl
In essence, I need to replicate in Python what you get when inputting the following command into Windows Command Prompt: telnet towel.blinkenlights.nl /f c:/randfolder/output.txt
The code below does establish the connection and shows the data received in the terminal; however, I am having no luck saving what I see in the terminal to a file:
import telnetlib
HOST = "towel.blinkenlights.nl"
PORT = "23"
telnetObj = telnetlib.Telnet(HOST, PORT)
outF = open("output.txt", "a")
outF.write(telnetObj.interact())
telnetObj.close()
While this seems to me to be a rather hacky solution, redirecting stdout does write the telnet stream to a file as required. Running this creates a file containing all of the data received until interrupted:
import sys
import telnetlib

with open('output.txt', 'w') as output:
    sys.stdout = output  # this is where the magic happens
    HOST = "towel.blinkenlights.nl"
    PORT = "23"
    telnetObj = telnetlib.Telnet(HOST, PORT)
    telnetObj.interact()  # received data is written to sys.stdout, i.e. output.txt
    telnetObj.close()
I would not be surprised, however, if mishandling this approach caused a mess in whatever program it ends up part of, since it hijacks sys.stdout for the whole process. Running it as a separate process should be reasonably safe.
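A less intrusive sketch, if you would rather not touch sys.stdout at all, is to read the connection in a loop and write the bytes to the file yourself (output.txt is just the example name from above):
import telnetlib

HOST = "towel.blinkenlights.nl"
PORT = 23

telnetObj = telnetlib.Telnet(HOST, PORT)
with open("output.txt", "ab") as outF:
    try:
        while True:
            data = telnetObj.read_some()   # blocks until at least one byte arrives
            if not data:                   # empty result means the connection closed
                break
            outF.write(data)
    except (EOFError, KeyboardInterrupt):
        pass
telnetObj.close()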
I need to continuously read and process data on one computer that is generated on another computer.
So far, I was able to use mypipe to send data from the second computer to the first, using the following:
cat mypipe | ssh second_com@IP_address 'cat destfile'
This works and the data is now constantly dumped to destfile, but file size increases really fast and this is not the solution I need.
What I would like to do is pipe the data directly into my python script without writing it to a file. Any suggestions on how to do this?
What you've written doesn't dump data to destfile. What it does is:
cat mypipe: Dumps the contents of a file named mypipe to its stdout.
|: Takes the stdout from cat mypipe and sends it as the stdin to ssh.
ssh second_com@IP_address: Creates an ssh connection to another system and runs the specified command there, forwarding its stdin along.
'cat destfile': Runs the command cat destfile on that other system—which ignores the forwarded-in stdin and dumps the contents of a file named destfile to stdout, which goes nowhere useful.
What you probably have is something more like this:
cat mypipe | ssh second_com@IP_address 'cat >destfile'
The difference is:
'cat >destfile': Runs the command cat >destfile on that other system, so cat just copies the forwarded-in stdin to its stdout, and then >destfile causes that stdout to be stored in the file destfile on that system.
So the result is exactly what you described as happening.
The most obvious way to change this is to just put your Python program in place of cat. Of course you need to put your program on the remote machine, somewhere accessible, like the home directory of that second_com user. Then you can execute it like:
cat mypipe | ssh second_com@IP_address 'python myscript.py'
Now, inside myscript.py, it can read from sys.stdin, which will be the stream of data coming from cat mypipe (via | and ssh), and you can do whatever it is you wanted to do with that data, without needing to save it to destfile first.
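A minimal sketch of what myscript.py could look like on the remote side (process() is just a placeholder for whatever you actually do with each line):
# myscript.py - runs on the remote machine, reading the piped data from stdin
import sys

def process(line):
    # placeholder: replace with the real per-line processing
    sys.stderr.write("received {0} bytes\n".format(len(line)))

for line in sys.stdin:
    process(line)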
I'm running a binary that manages a usb device. The binary file, when executed outputs results to a file I specify.
Is there any way in Python to redirect the output of a binary to my script instead of to a file? Otherwise I'm just going to have to open the file and read it back as soon as this line of code runs.
import os

def rn_to_file(comport=3, filename='test.bin', amount=128):
    os.system('capture.exe {0} {1} {2}'.format(comport, filename, amount))
It doesn't work with subprocess either:
>>> from subprocess import check_output as qx
>>> cmd = r'C:\repos\capture.exe 3 text.txt 128'
>>> output = qx(cmd)
Opening serial port \\.\COM3...OK
Closing serial port...OK
>>> output
b'TrueRNG Serial Port Capture Tool v1.2\r\n\r\nCapturing 128 bytes of data...Done'
The actual content of the file is a series of 0s and 1s. This isn't redirecting the file's contents to my script; instead it just captures what would have been printed to the console anyway.
It looks like you're using Windows, which has a special reserved filename CON which means to use the console (the analog on *nix would be /dev/stdout).
So try this:
subprocess.check_output(r'C:\repos\capture.exe 3 CON 128')
You might need to use shell=True in there, but I suspect you don't.
The idea is to make the program write to the virtual file CON which is actually stdout, then have Python capture that.
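Wrapped into the helper from the question, a hedged sketch might look like this (the path to capture.exe, and the assumption that the tool writes its random bytes to whatever filename argument it is given, come from the question rather than anything I can verify):
import subprocess

def rn_from_console(comport=3, amount=128):
    # Point capture.exe at CON so the data goes to the console instead of a
    # real file, then capture everything the process prints.
    cmd = r'C:\repos\capture.exe {0} CON {1}'.format(comport, amount)
    return subprocess.check_output(cmd)

raw = rn_from_console()
# Note: raw will also contain the tool's banner text, so the random data
# may still need to be split out of it.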
An alternative would be CreateNamedPipe(), which will let you create your own filename and read from it, without having an actual file on disk. For more on that, see: createNamedPipe in python
I want to open a new cmd window and stream some text output (logs) to it while my Python script is running (this is basically to check where I am in the script).
I can't find a way to do it that keeps the window open and streams my output.
Here is what I have now:
import os

p = os.popen("start cmd", mode='w')

def log_to_cmd(process, message):
    p.write(message)

for i in range(10):
    log_to_cmd(p, str(i))
And I want the numbers 0 to 9 to appear in the cmd window that is already open.
Thanks a lot for any suggestion.
Use BareTail; this software allows you to stream logs.
If you want to stick to the cmd shell, I'd suggest installing something like GNU Utilities for Win32. It has most of the favourites, including tail, which allows you to follow a file like a log viewer.
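A sketch of how that could fit together, assuming tail.exe is on PATH and using logfile.txt as an example name: the script appends to a log file while a second cmd window follows it.
import subprocess
import time

logfile = "logfile.txt"

# Open a new cmd window that keeps following the log file (start and /k are
# cmd built-ins, so shell=True is needed here).
subprocess.Popen('start cmd /k tail -f {0}'.format(logfile), shell=True)

with open(logfile, "a", buffering=1) as log:   # line-buffered text file
    for i in range(10):
        log.write("step {0}\n".format(i))
        time.sleep(1)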
I am setting up a remote SSH connection to a remote server and running a specific command to dump a DB table. The remote server is a hardened Linux OS with its own shell. I am running a remote sql type of command to dump out a lot of data. My Python script is using an interactive SSH session to do this. As you can see below, I am running a command, letting it sleep for 5 seconds, then dumping the buffer. I've tried many different options for the "remote_conn.recv()" function but I cannot get the full output. The output is very large and it is paged (press space for more). I am not even getting the complete output of the first page. If there are 50 lines on the first page, I'm getting the first 4 only.
Are there better ways of doing this? How can I just grab the complete output? Below is the Paramiko portion of my script.
import time
import paramiko

# Create instance of SSHClient object
remote_conn_pre = paramiko.SSHClient()

# Automatically add untrusted hosts
remote_conn_pre.set_missing_host_key_policy(paramiko.AutoAddPolicy())

# initiate SSH connection
remote_conn_pre.connect(options.ip, username=options.userName, password=options.password)

# Use invoke_shell to establish an 'interactive session'
remote_conn = remote_conn_pre.invoke_shell()

remote_conn.send("<remote sql command here>")
time.sleep(5)
output = remote_conn.recv(65535)
print(output)
remote_conn.close()
I was able to get the full output by grabbing smaller chunks of the buffer and concatenating them together until I hit a certain string in the buffer:
finalOutput = ""
while True:
    data = remote_conn.recv(500)    # grab a small chunk of the buffer
    if "<string>" in data:          # stop once the known end marker shows up
        break
    else:
        finalOutput += data
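Since the output is paged, one hedged extension of the same idea is to answer the pager whenever its prompt appears ("--More--" is only an assumption about what that shell prints, so adjust it to the actual prompt):
import time

finalOutput = ""
while True:
    if remote_conn.recv_ready():        # only read when data is waiting
        data = remote_conn.recv(500)
        if "--More--" in data:          # assumed pager prompt
            remote_conn.send(" ")       # ask the pager for the next page
        finalOutput += data
        if "<string>" in data:          # the end marker from above
            break
    else:
        time.sleep(0.1)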