I'm trying to load a .csv file stored on an FTP server (SFTP protocol). I'm using Python together with the pysftp library. On the server, the CSV file is inside a .zip file. Is there a way to open the zip and retrieve only the CSV file inside it?
Thank you in advance.
import pysftp

cnopts = pysftp.CnOpts()
cnopts.hostkeys = None  # disables host key checking; convenient but insecure

# Make connection to SFTP
with pysftp.Connection(hostname,
                       username=sftp_username,
                       password=sftp_pw,
                       cnopts=cnopts) as sftp:
    with pysftp.cd(download_directory):    # change the local working directory
        with sftp.cd(download_directory):  # change the remote working directory
            print(f'Downloading this file: {filename}')
            sftp.get(filename, preserve_mtime=True)
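One way to avoid extracting on the client after a full download is to let `zipfile` read the remote archive in place: the file handle returned by paramiko/pysftp's `sftp.open()` is a seekable file-like object, which is all `zipfile.ZipFile` requires. The sketch below demonstrates the mechanism with an in-memory stand-in for the remote handle; the names `sftp`, `remote.zip`, and `data.csv` are illustrative assumptions, not part of the original question.

```python
import io
import zipfile

# Build an in-memory zip as a stand-in for the remote archive. With a real
# connection you would instead do:  fileobj = sftp.open('remote.zip', 'rb')
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w') as zf:
    zf.writestr('data.csv', 'a,b\n1,2\n')
buf.seek(0)

# ZipFile accepts any seekable file-like object, so an SFTP handle can be
# passed here directly in place of `buf`; only the CSV member is read.
with zipfile.ZipFile(buf) as zf:
    csv_bytes = zf.read('data.csv')

print(csv_bytes.decode())
```

Note that reading a single member this way still seeks around inside the remote file, so for small archives a plain `sftp.get()` followed by a local extract may actually be faster.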
If you have SSH access to the remote host, and you know enough about the remote path to the zip file and the zip utilities available there, you can use your SSH client to run the unzip command remotely and capture its output. Here my target is a Linux machine and the zipfile is under the login user's home directory. I can use the paramiko SSH client to do the work.
It's a good idea to log into the remote server via SSH first and look around to see what the path structure is like.
import sys
import shutil

import paramiko


def sshclient_exec_command_binary(sshclient, command, bufsize=-1,
                                  timeout=None, get_pty=False):
    """Paramiko SSHClient helper that implements exec_command with binary
    output.
    """
    chan = sshclient._transport.open_session()
    if get_pty:
        chan.get_pty()
    chan.settimeout(timeout)
    chan.exec_command(command)
    stdin = chan.makefile('wb', bufsize)
    stdout = chan.makefile('rb', bufsize)
    stderr = chan.makefile_stderr('rb', bufsize)
    return stdin, stdout, stderr


# example gets user/pw from command line
if len(sys.argv) != 3:
    print("usage: test.py username password")
    exit(1)
username, password = sys.argv[1:3]

# put your host/file info here
hostname = "localhost"
remote_zipfile = "tmp/mytest.zip"
file_to_extract = "myfile"

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname, username=username, password=password)

unzip_cmd = "unzip -p {} {}".format(remote_zipfile, file_to_extract)
print("running", unzip_cmd)
stdin, out, err = sshclient_exec_command_binary(ssh, unzip_cmd)

# if the command worked, out is a file-like object to read.
print("writing", file_to_extract)
with open(file_to_extract, 'wb') as out_fp:
    shutil.copyfileobj(out, out_fp)
Is there a way I can copy remote files whose names end with "output" using paramiko SCP?
I have the below code, which copies a file only if I provide the full path or the exact file name.
Here is the code:
import os

import paramiko
from paramiko import SSHClient
from scp import SCPClient


def createSSHClient(self, server):
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(server, self.port, self.user, self.password)
    return client


def get_copy(self, hostname, dst):
    ssh = self.createSSHClient(hostname)
    scp = SCPClient(ssh.get_transport())
    scp.get(dst)
    scp.close()
What I am trying is:
get_copy("1.1.1.1", "*output")
and I am getting a file-not-found error.
You may need to use SSH to get the file list first, then SCP the files one by one.
Something like the following, just FYI:
def get_copy(self, hostname, dst):
    ssh = createSSHClient(hostname)
    stdin, stdout, stderr = ssh.exec_command('ls /home/username/*output')
    # decode: exec_command output is bytes under Python 3
    result = stdout.read().decode().split()
    scp = SCPClient(ssh.get_transport())
    for per_result in result:
        scp.get(per_result)
    scp.close()
    ssh.close()
There are two more ways which I found useful in this context.
1) You may also do it without SCPClient, using just Paramiko itself, like so:
import os
import re


def get_copy(self, hostname, dst):
    ssh = createSSHClient(hostname)
    sftp = ssh.open_sftp()
    serverfilelist = sftp.listdir(remote_path)
    for f in serverfilelist:
        if re.search("output$", f):  # file names ending in "output"
            sftp.get(os.path.join(remote_path, f), local_path)
    ssh.close()
2) If you want to use SCPClient to SCP files using wildcards (regex), THIS link will
be helpful, I think.
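A note on the pattern itself: a shell-style wildcard like `*output` is not a valid regular expression (`re.search("*output", f)` raises an error), so the stdlib `fnmatch` module is a convenient alternative for filtering `listdir()` results. A small sketch with made-up file names:

```python
import fnmatch

# candidate file names as sftp.listdir() might return them
files = ["run1_output", "notes.txt", "final_output", "output.bak"]

# fnmatch understands shell-style wildcards such as "*output"
matches = [f for f in files if fnmatch.fnmatch(f, "*output")]
print(matches)  # ['run1_output', 'final_output']
```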
Once I connect to a remote server as follows,
transport.connect(username=username, password=password)
sftp = paramiko.SFTPClient.from_transport(transport)
I can do sftp.listdir() and see that there are some gzip files on the remote server, like example.gz.2016. How can I access the text of this file through the sftp connection, without actually downloading the file?
Your question has two parts:
How to view the content of a compressed file from the command line
How to execute remote commands and get the output using Python and Paramiko
First things first: how to view the content of a compressed file on the console.
On most systems, less can look inside compressed files (via its input preprocessor), so in your case executing less example.gz.2016 should show you the decompressed content; for a .zip archive it would instead show a listing of the files inside.
Second: how to execute commands remotely.
import paramiko

ssh = paramiko.SSHClient()
# next is needed if your keys are not yet known on the client.
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(HOST_NAME, username=USER, password=PASSWORD)
stdin, stdout, stderr = ssh.exec_command('less ' + zipfilename)
for line in stdout.readlines():
    # here you will have your listing
    print(line)
for errline in stderr.readlines():
    # don't forget to check the error output
    print('***', errline)
Good Luck!
EDIT
If you need an SFTP connection to the same server, you need to get it from your SSH connection like this:
sftp = ssh.open_sftp()
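Since the files in the question are gzip archives, the SFTP connection alone is actually enough to read their text: the handle returned by `sftp.open()` is file-like, so `gzip.GzipFile` can wrap it and decompress as it streams over the connection, with no shell command at all. The sketch below uses an in-memory buffer as a stand-in for the remote handle; `sftp` and the filename are assumptions taken from the question.

```python
import gzip
import io

# Stand-in for the remote handle. With a real connection you would use:
#   fileobj = sftp.open('example.gz.2016', 'rb')
buf = io.BytesIO()
with gzip.GzipFile(fileobj=buf, mode='wb') as gz:
    gz.write(b'line one\nline two\n')
buf.seek(0)

# GzipFile decompresses transparently from any file-like object
with gzip.GzipFile(fileobj=buf) as gz:
    text = gz.read().decode()

print(text)
```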
I want to copy a file in Python (3.4) using the paramiko library.
My approach:
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('192.168.1.1', 22, 'root', 'root')
sftp = ssh.open_sftp()
sftp.put(local_file, remote_file)
sftp.close()
The error I get:
EOF during negotiation
The problem is that the connected system doesn't support SFTP.
So is there a way to copy a file without using SFTP?
You can use scp to send files, and sshpass to supply the password.
import os
os.system('sshpass -p "password" scp local_file root@192.168.1.?:/remotepath/remote_file')
Use the built in paramiko.Transport layer and create your own Channel:
with paramiko.SSHClient() as ssh:
    ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    ssh.connect('192.168.1.1', 22, 'root', 'root')
    transport = ssh.get_transport()
    with transport.open_channel(kind='session') as channel:
        file_data = open('local_data', 'rb').read()
        channel.exec_command('cat > remote_file')
        channel.sendall(file_data)
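The trick above works because `cat > remote_file` simply copies its stdin into the file until EOF, and closing the channel delivers that EOF. The same mechanism can be exercised locally with `subprocess` (assuming a POSIX shell and `cat`, and a scratch path invented for the demo):

```python
import os
import subprocess
import tempfile

data = b"some file contents\n"
path = os.path.join(tempfile.mkdtemp(), "remote_file")

# Same idea as exec_command('cat > remote_file') + sendall(data):
# the shell redirection writes whatever arrives on stdin into the file.
subprocess.run("cat > %s" % path, shell=True, input=data, check=True)

with open(path, "rb") as f:
    round_tripped = f.read()
```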
I am trying to make a script that downloads (or uploads) files over SSH, since the FTP port is blocked by the firewall. This is my script:
import os
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('10.170.21.93', username="abhishek", password="#bhishek$")
sftp = ssh.open_sftp()
localpath = 'abc.txt'
remotepath = '/opt/crestelsetup/patchzip'
sftp.put(localpath, remotepath)
sftp.close()
ssh.close()
This is giving me "IOError: Failure". Can anyone help?
You need to explicitly specify the remote path:
import os
import paramiko
ssh = paramiko.SSHClient()
ssh.connect('10.170.21.93', username="abhishek", password="#bhishek$")
sftp = ssh.open_sftp()
localpath = 'abc.txt'
remotepath = '/opt/crestelsetup/patchzip/abc.txt'
sftp.put(localpath, remotepath)
sftp.close()
ssh.close()
As per Martin Prikryl's comment, the following line is highly discouraged, as it opens you up to a man-in-the-middle attack; however, it can be a temporary fix for missing host keys:
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
Just modify the destination path to include the file name as well. Try changing
remotepath = '/opt/crestelsetup/patchzip'
to
remotepath = '/opt/crestelsetup/patchzip/abc.txt'
You need to modify remotepath, since your current remote path /opt/crestelsetup/patchzip is a directory. The file name needs to be joined with the remote path, which can be done as follows:
fname = os.path.basename(localpath)
sftp.put(localpath, os.path.join(remotepath, fname))
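One caveat with `os.path.join` here: it uses the local OS separator, so on a Windows client it would insert a backslash into the remote path. `posixpath.join` always uses forward slashes, which is what SFTP servers expect; a quick illustration with the paths from the question:

```python
import posixpath

remotepath = '/opt/crestelsetup/patchzip'
fname = 'abc.txt'

# posixpath.join always uses '/', regardless of the client OS
full = posixpath.join(remotepath, fname)
print(full)  # /opt/crestelsetup/patchzip/abc.txt
```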
I'd like to delete all the files in a given directory on a remote server that I'm already connected to using Paramiko. I cannot explicitly give the file names, though, because these will vary depending on which version of file I had previously put there.
Here's what I'm trying to do... the line below the # TODO is the call I'm attempting, where remoteArtifactPath is something like /opt/foo/*
ssh = paramiko.SSHClient()
ssh.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh.connect(server, username=username, pkey=mykey)
sftp = ssh.open_sftp()
# TODO: Need to somehow delete all files in remoteArtifactPath remotely
sftp.remove(remoteArtifactPath+"*")
# Close to end
sftp.close()
ssh.close()
Any idea how I can achieve this?
I found a solution: Iterate over all the files in the remote location, then call remove on each of them:
ssh = paramiko.SSHClient()
ssh.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh.connect(server, username=username, pkey=mykey)
sftp = ssh.open_sftp()
# Updated code below:
filesInRemoteArtifacts = sftp.listdir(path=remoteArtifactPath)
for file in filesInRemoteArtifacts:
    sftp.remove(remoteArtifactPath + file)
# Close to end
sftp.close()
ssh.close()
You need a recursive routine since your remote directory may have subdirectories.
def rmtree(sftp, remotepath, level=0):
    for f in sftp.listdir_attr(remotepath):
        rpath = posixpath.join(remotepath, f.filename)
        if stat.S_ISDIR(f.st_mode):
            rmtree(sftp, rpath, level=(level + 1))
        else:
            print('removing %s%s' % (' ' * level, rpath))
            sftp.remove(rpath)
    print('removing %s%s' % (' ' * level, remotepath))
    sftp.rmdir(remotepath)
ssh = paramiko.SSHClient()
ssh.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh.connect(server, username=username, pkey=mykey)
sftp = ssh.open_sftp()
rmtree(sftp, remoteArtifactPath)
# Close to end
sftp.close()
ssh.close()
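The `stat.S_ISDIR` test in the routine above works because `SFTPAttributes.st_mode` carries the same POSIX mode bits as a local `os.stat` result, so the stdlib helpers apply unchanged. A quick local check (using a temporary directory in place of a remote entry):

```python
import os
import stat
import tempfile

d = tempfile.mkdtemp()

# Directories and regular files are distinguished by their mode bits,
# exactly as in the st_mode of paramiko's listdir_attr() entries.
mode = os.stat(d).st_mode
print(stat.S_ISDIR(mode))  # True
```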
A Fabric routine could be as simple as this:
with cd(remoteArtifactPath):
run("rm *")
Fabric is great for executing shell commands on remote servers. Fabric actually uses Paramiko underneath, so you can use both if you need to.
For #markolopa's answer, you need two imports to get it working:
import posixpath
import stat
I found a solution using Python 3.7 and spur 0.3.20. It is very possible that it works with other versions as well.
import spur
shell = spur.SshShell( hostname="ssh_host", username="ssh_usr", password="ssh_pwd")
ssh_session = shell._connect_ssh()
ssh_session.exec_command('rm -rf /dir1/dir2/dir3')
ssh_session.close()
So I know this is an older post, but I would still like to give a short answer that I found more useful than the rest. It uses Paramiko's built-in functions, so it should work on all devices.
import paramiko


class remote_operations:
    def __init__(self):
        pass

    def connect(self, hostname, username, password):
        client = paramiko.SSHClient()
        client.load_system_host_keys()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(hostname, username=username, password=password)
        return client


def delete_files():
    print("--Deleting files--")
    test = remote_operations()
    # these ips and passwords are just examples
    username = "aabfbkbakjdfb123"
    password = "I_am_not_good_at_making_passwords_123"
    host_ip = "111.111.11.11"
    client = test.connect(host_ip, username, password)
    sftp_client = client.open_sftp()

    folderPath = "/my_secret_files/"
    sftp_client.chdir(folderPath)
    for file in sftp_client.listdir():
        sftp_client.remove(file)


if __name__ == '__main__':
    delete_files()