I want to go to a path on a remote SFTP server and verify if the file is present. If the file is present, then I want to open the file and update its contents.
Is it possible with SFTP in Paramiko?
The Paramiko SFTP client has the SFTPClient.open method, which is the equivalent of the regular Python open function. It returns a file-like object that you can then use as if you were editing a local file:
import paramiko

ssh = paramiko.SSHClient()
# ...
ssh.connect(...)

sftp = ssh.open_sftp()

with sftp.open("/remote/path/file.txt", "r+") as f:
    f.seek(10)
    f.write(b'foo')
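To also cover the "verify if the file is present" part, one option is SFTPClient.stat, which raises an error for a missing path. A minimal sketch, reusing the sftp client from above (the remote path is just an example):

remote_path = "/remote/path/file.txt"  # example path
try:
    # stat raises an IOError (FileNotFoundError) if the path does not exist
    sftp.stat(remote_path)
except IOError:
    print("File not found:", remote_path)
else:
    with sftp.open(remote_path, "r+") as f:
        f.seek(10)
        f.write(b'foo')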
I am trying to upload a file via SFTP to my server. But instead of just uploading it, I have to explicitly tell my script what file to overwrite on the server. I don't know how to change that.
#!/usr/bin/python3
import paramiko
k = paramiko.RSAKey.from_private_key_file("/home/abdulkarim/.ssh/id_rsa")
c = paramiko.SSHClient()
c.set_missing_host_key_policy(paramiko.AutoAddPolicy())
print("connecting")
c.connect( hostname = "do-test", username = "abdulkarim", pkey = k )
print("connected")
sftp = c.open_sftp()
sftp.put('/home/abdulkarim/Skripte/data/test.txt', '/home/abdulkarim/test/test1.txt')
c.close()
In the below call, the second (remotepath) parameter refers to the path where the file will be stored on the server. There is no requirement for the remote file to actually exist; it will be created.
sftp.put('/home/abdulkarim/Skripte/data/test.txt', '/home/abdulkarim/test/test1.txt')
Obligatory warning: Do not use AutoAddPolicy this way. You lose protection against man-in-the-middle (MITM) attacks by doing so. For a correct solution, see Paramiko "Unknown Server".
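For completeness, a minimal sketch of a safer setup, assuming the server's key is already present in a known_hosts file (the load_host_keys path is a placeholder; the hostname, username and k are reused from the question's script):

c = paramiko.SSHClient()
# Load host keys the client already trusts (typically ~/.ssh/known_hosts).
c.load_system_host_keys()
# Optionally also load an application-specific known_hosts file:
# c.load_host_keys("/home/abdulkarim/.ssh/known_hosts")  # placeholder path
# RejectPolicy (the default) refuses servers whose host key is unknown.
c.set_missing_host_key_policy(paramiko.RejectPolicy())
c.connect(hostname="do-test", username="abdulkarim", pkey=k)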
I'm trying to load a .csv file stored on an FTP server (SFTP protocol). I'm using Python in combination with the pysftp library. On the FTP server, the CSV file is inside a .zip file. Is there a way to open the zip and then retrieve only the CSV file inside it?
import pysftp
cnopts = pysftp.CnOpts()
cnopts.hostkeys = None
# Make connection to sFTP
with pysftp.Connection(hostname,
                       username=sftp_username,
                       password=sftp_pw,
                       cnopts=cnopts) as sftp:
    with pysftp.cd(download_directory):
        with sftp.cd('download_directory'):
            print(f'Downloading this file: {filename}')
            sftp.get(filename, preserve_mtime=True)

sftp.close()
If you have SSH access to the remote host and know enough about the remote path of the zip file you want and the zip utilities on that host, you can use your SSH client to run the unzip command remotely and capture its output. Here, my target is a Linux machine and the zip file is in the login user's home directory. I use the Paramiko SSH client to do the work.
It's a good idea to log into the remote server via SSH first and poke around to see what the path structure looks like.
import sys
import paramiko
import shutil


def sshclient_exec_command_binary(sshclient, command, bufsize=-1,
                                  timeout=None, get_pty=False):
    """Paramiko SSHClient helper that implements exec_command with binary
    output.
    """
    chan = sshclient._transport.open_session()
    if get_pty:
        chan.get_pty()
    chan.settimeout(timeout)
    chan.exec_command(command)
    stdin = chan.makefile('wb', bufsize)
    stdout = chan.makefile('rb', bufsize)
    stderr = chan.makefile_stderr('rb', bufsize)
    return stdin, stdout, stderr


# example gets user/pw from command line
if len(sys.argv) != 3:
    print("usage: test.py username password")
    exit(1)
username, password = sys.argv[1:3]

# put your host/file info here
hostname = "localhost"
remote_zipfile = "tmp/mytest.zip"
file_to_extract = "myfile"

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname, username=username, password=password)

unzip_cmd = "unzip -p {} {}".format(remote_zipfile, file_to_extract)
print("running", unzip_cmd)
stdin, out, err = sshclient_exec_command_binary(ssh, unzip_cmd)

# if the command worked, out is a file-like object to read.
print("writing", file_to_extract)
with open(file_to_extract, 'wb') as out_fp:
    shutil.copyfileobj(out, out_fp)
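If running unzip on the remote host is not an option, another possible approach is to open the zip file over SFTP and let Python's standard zipfile module read just the member you need. A rough sketch using Paramiko (host, credentials, remote path and member name are all placeholders; reading a large archive this way can be slow):

import zipfile

import paramiko

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.connect("example.com", username="user", password="password")  # placeholders
sftp = ssh.open_sftp()

# SFTPClient.open returns a seekable file-like object that zipfile can read from.
with sftp.open("/remote/path/archive.zip", "rb") as remote_zip:
    with zipfile.ZipFile(remote_zip) as archive:
        csv_bytes = archive.read("data.csv")  # member name is an example

print(csv_bytes.decode("utf-8", errors="replace"))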
My code first writes lines to a CSV in io.StringIO():
import csv
import io

fileBuffer = io.StringIO()

# write header
header_writer = csv.DictWriter(fileBuffer, fieldnames=columnNames)
header_writer.writeheader()

# write lines
writer = csv.writer(fileBuffer, delimiter=',')
for line in data:
    line_dec = line.decode('ISO-8859-1')
    # print([line_dec])
    writer.writerow([line_dec])
The following code also prints all expected rows:
print(fileBuffer.getvalue())  # -> prints all expected rows
I can also successfully connect to the SFTP server using pysftp, and even inside the with pysftp.Connection block the code still prints all expected rows:
with pysftp.Connection(host, username=user, password=pw, cnopts=cnopts) as sftp:
    print('successfully connected to {} via Port 22'.format(host))
    print(fileBuffer.getvalue())  # -> prints all expected rows
    sftp.putfo(fileBuffer, file2BeSavedAs)  # -> no rows put on FTP Server
Here comes the actual problem: the code only creates the file on the server without writing the data (the body) into it. At the same time, my code does not return any error message.
How can I put a CSV from StringIO to an SFTP server?
You have to seek the buffer's read pointer back to the beginning before you try to upload the buffer:
fileBuffer.seek(0)
sftp.putfo(fileBuffer, file2BeSavedAs)
Though a better approach is to write the CSV directly to the server, without an intermediate buffer. Use Connection.open to obtain a file-like object representing a file on the SFTP server:
with sftp.open(file2BeSavedAs, mode='w', bufsize=32768) as f:
    writer = csv.writer(f, delimiter=',')
    # ...
For the purpose of the bufsize argument, see:
Writing to a file on SFTP server opened using Paramiko/pysftp "open" method is slow
For a similar question, with progress display, see:
How to use Paramiko getfo to download file from SFTP server to memory to process it
Though pysftp is a dead project, so you had better use Paramiko directly. See pysftp vs. Paramiko. With Paramiko, the code would be pretty much the same.
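As a rough sketch of what the Paramiko-only variant might look like (host name and credentials are placeholders; file2BeSavedAs, columnNames and data are the variables from the question):

import csv

import paramiko

ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.connect("example.com", username="user", password="password")  # placeholders
sftp = ssh.open_sftp()

# Write the CSV straight to the remote file; no intermediate StringIO needed.
with sftp.open(file2BeSavedAs, mode="w", bufsize=32768) as f:
    header_writer = csv.DictWriter(f, fieldnames=columnNames)
    header_writer.writeheader()
    writer = csv.writer(f, delimiter=",")
    for line in data:
        writer.writerow([line.decode("ISO-8859-1")])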
Once I connect to a remote server as follows,
transport.connect(username=username, password=password)
sftp = paramiko.SFTPClient.from_transport(transport)
I can do sftp.listdir() and see that there are some gzip files on the remote server, like example.gz.2016. How can I access the text of this file through the sftp connection, without actually downloading the file?
Your question has two parts:
How to view the content of a zip file from the command line
How to execute remote commands and get the output using Python and Paramiko
First things first: How to list the content of a zip file on the console.
less can look into zip files, so in your case executing less example.gz.2016 should give you a list of the files inside the archive.
Second: how to execute commands remotely.
import paramiko

ssh = paramiko.SSHClient()
# next is needed if your keys are not yet known on the client.
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(HOST_NAME, username=USER, password=PASSWORD)

stdin, stdout, stderr = ssh.exec_command('less ' + zipfilename)
for line in stdout.readlines():
    # here you will have your listing
    print(line)
for errline in stderr.readlines():
    # Don't forget to check the error output
    print('***', errline)
Good Luck!
EDIT
If you need an SFTP connection to the same server, you can get it from your SSH connection like this:
sftp = ssh.open_sftp()
I am using Paramiko to open a remote SFTP file in Python. With the file object returned by Paramiko, I am reading the file line by line and processing the information. This seems really slow compared to using Python's built-in open function. The following is the code I am using to get the file object.
Using Paramiko (slower by a factor of two) -
client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(myHost,myPort,myUser,myPassword)
sftp = client.open_sftp()
fileObject = sftp.file(fullFilePath,'rb')
Using os -
import os
fileObject = open(fullFilePath,'rb')
Am I missing anything? Is there a way to make the paramiko fileobject read method as fast as the one using the os fileobject?
Thanks!!
Your problem is likely caused by the file being a remote object. You've opened it on the server and are requesting one line at a time; because it's not local, each request takes much longer than if the file were sitting on your hard drive. The best alternative is probably to copy the file down to a local location first, using Paramiko's SFTP get.
Once you've done that, you can open the file from the local location using the regular built-in open.
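A minimal sketch of that approach, reusing the sftp client and fullFilePath from the question (the local path is a placeholder):

local_path = "/tmp/local_copy.dat"  # placeholder local path

# Download the remote file once, then read it at local-disk speed.
sftp.get(fullFilePath, local_path)

with open(local_path, "rb") as fileObject:
    for line in fileObject:
        pass  # process each line here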
I was having the same issue, and I could not copy the file locally for security reasons. I solved it by using a combination of prefetching and BytesIO:
import io


def fetch_file_as_bytesIO(sftp, path):
    """
    Using the sftp client it retrieves the file on the given path by prefetching.

    :param sftp: the sftp client
    :param path: path of the file to retrieve
    :return: BytesIO with the file content
    """
    with sftp.file(path, mode='rb') as file:
        file_size = file.stat().st_size
        file.prefetch(file_size)
        file.set_pipelined()
        return io.BytesIO(file.read(file_size))
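For example, the returned buffer can then be read like a local binary file (sftp is the client from the question, and the remote path is a placeholder):

buffer = fetch_file_as_bytesIO(sftp, "/remote/path/file.txt")  # placeholder path
for line in buffer:
    # each line is bytes; decode with whatever encoding the file actually uses
    print(line.decode("utf-8", errors="replace"))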
Here is a way that works by scraping the command line (cat) via Paramiko and reading all the lines at once. It works well for me:
import paramiko
client = paramiko.SSHClient()
client.load_system_host_keys()
client.set_missing_host_key_policy(paramiko.WarningPolicy())
client.connect(hostname=host, port=port, username=user, key_filename=ssh_file)
stdin, stdout, stderr = client.exec_command('cat /proc/net/dev')
net_dump = stdout.readlines()
#your entire file is now in net_dump .. do as you wish with it below ...
client.close()
The files I open are quite small so it all depends on your file size. Worth a try :)