How To Read/Write Files From an SFTP Server in Python? [duplicate]

I'm working on a simple tool that transfers files to a hard-coded location with the password also hard-coded. I'm a python novice, but thanks to ftplib, it was easy:
import ftplib

info = ('someuser', 'password')  # hard-coded

def putfile(file, site, dir, user=(), verbose=True):
    """
    upload a file by ftp to a site/directory
    login hard-coded, binary transfer
    """
    if verbose: print 'Uploading', file
    local = open(file, 'rb')
    remote = ftplib.FTP(site)
    remote.login(*user)
    remote.cwd(dir)
    remote.storbinary('STOR ' + file, local, 1024)
    remote.quit()
    local.close()
    if verbose: print 'Upload done.'

if __name__ == '__main__':
    site = 'somewhere.com'  # hard-coded
    dir = './uploads/'      # hard-coded
    import sys, getpass
    putfile(sys.argv[1], site, dir, user=info)
The problem is that I can't find any library that supports sFTP. What's the normal way to do something like this securely?
Edit: Thanks to the answers here, I've gotten it working with Paramiko, and this is the syntax:
import paramiko
host = "THEHOST.com" #hard-coded
port = 22
transport = paramiko.Transport((host, port))
password = "THEPASSWORD" #hard-coded
username = "THEUSERNAME" #hard-coded
transport.connect(username = username, password = password)
sftp = paramiko.SFTPClient.from_transport(transport)
import sys
path = './THETARGETDIRECTORY/' + sys.argv[1] #hard-coded
localpath = sys.argv[1]
sftp.put(localpath, path)
sftp.close()
transport.close()
print 'Upload done.'
Thanks again!

Paramiko supports SFTP. I've used it, and I've used Twisted. Both have their place, but you might find it easier to start with Paramiko.
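For reference, a minimal Paramiko round trip over SFTP looks roughly like this (host, credentials and paths below are placeholders, not values from the question):
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect("example.com", username="user", password="secret")
sftp = ssh.open_sftp()
sftp.get("/remote/path/report.csv", "report.csv")  # download
sftp.put("report.csv", "/remote/path/report.csv")  # upload
sftp.close()
ssh.close()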

You should check out pysftp (https://pypi.python.org/pypi/pysftp). It depends on paramiko, but wraps the most common use cases in just a few lines of code.
import pysftp
import sys

path = './THETARGETDIRECTORY/' + sys.argv[1]  # hard-coded
localpath = sys.argv[1]
host = "THEHOST.com"          # hard-coded
password = "THEPASSWORD"      # hard-coded
username = "THEUSERNAME"      # hard-coded

with pysftp.Connection(host, username=username, password=password) as sftp:
    sftp.put(localpath, path)

print 'Upload done.'

Here is a sample using pysftp and a private key.
import pysftp

def upload_file(file_path):
    private_key = "~/.ssh/your-key.pem"  # can use password keyword in Connection instead
    srv = pysftp.Connection(host="your-host", username="user-name", private_key=private_key)
    srv.chdir('/var/web/public_files/media/uploads')  # change directory on remote server
    srv.put(file_path)  # to download a file, replace put with get
    srv.close()         # close connection
pysftp is an easy-to-use SFTP module that utilizes paramiko and pycrypto. It provides a simple interface to SFTP. Other things you can do with pysftp that are quite useful:
data = srv.listdir() # Get the directory and file listing in a list
srv.get(file_path) # Download a file from remote server
srv.execute('pwd') # Execute a command on the server
More commands and details about pysftp can be found here.

If you want easy and simple, you might also want to look at Fabric. It's an automated deployment tool like Ruby's Capistrano, but simpler and of course for Python. It's built on top of Paramiko.
You might not want to do 'automated deployment', but Fabric would suit your use case perfectly nonetheless. To show you how simple Fabric is, the fab file and command for your script would look like this (not tested, but 99% sure it will work):
fab_putfile.py:
from fabric.api import *

env.hosts = ['THEHOST.com']
env.user = 'THEUSER'
env.password = 'THEPASSWORD'

def put_file(file):
    put(file, './THETARGETDIRECTORY/')  # it's copied into the target directory
Then run the file with the fab command:
fab -f fab_putfile.py put_file:file=./path/to/my/file
And you're done! :)

fsspec is a great option for this; it offers a filesystem-like implementation of SFTP.
from fsspec.implementations.sftp import SFTPFileSystem

fs = SFTPFileSystem(host=host, username=username, password=password)

# list a directory
fs.ls("/")

# open a file
with fs.open(file_name) as file:
    content = file.read()
Also worth noting that fsspec uses paramiko in the implementation.
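Writing works through the same filesystem object; a rough sketch for uploads (the paths are placeholders):
# write remote content directly
with fs.open("/remote/output.txt", "wb") as remote_file:
    remote_file.write(b"some content")

# or copy an existing local file to the server
fs.put("local_file.txt", "/remote/local_file.txt")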

With an RSA key, refer here.
Snippet:
import pysftp
import paramiko
from base64 import decodebytes

keydata = b"""AAAAB3NzaC1yc2EAAAADAQABAAABAQDl"""
key = paramiko.RSAKey(data=decodebytes(keydata))
cnopts = pysftp.CnOpts()
cnopts.hostkeys.add(host, 'ssh-rsa', key)

with pysftp.Connection(host=host, username=username, password=password, cnopts=cnopts) as sftp:
    with sftp.cd(directory):
        sftp.put(file_to_sent_to_ftp)

Twisted can help you with what you are doing; check out their documentation, there are plenty of examples. It is also a mature product with a big developer/user community behind it.

There are a bunch of answers that mention pysftp, so if you want a context-manager wrapper around pysftp, here is a solution that takes even less code and ends up looking like the following when used:
path = "sftp://user:p#ssw0rd#test.com/path/to/file.txt"
# Read a file
with open_sftp(path) as f:
s = f.read()
print s
# Write to a file
with open_sftp(path, mode='w') as f:
f.write("Some content.")
The (fuller) example: http://www.prschmid.com/2016/09/simple-opensftp-context-manager-for.html
This context manager happens to have auto-retry logic baked in in the event you can't connect the first time around (which surprisingly happens more often than you'd expect in a production environment...)
The context manager gist for open_sftp: https://gist.github.com/prschmid/80a19c22012e42d4d6e791c1e4eb8515
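If you just want the general shape without following the links, a stripped-down sketch of such a context manager (no retry logic, plain host/credentials instead of an sftp:// URL, so not the linked implementation) could look like this:
import contextlib
import pysftp

@contextlib.contextmanager
def open_sftp(host, username, password, remote_path, mode='r'):
    # hypothetical simplified helper: yields a file-like object for a remote path
    with pysftp.Connection(host, username=username, password=password) as sftp:
        remote_file = sftp.open(remote_path, mode)
        try:
            yield remote_file
        finally:
            remote_file.close()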

Paramiko is so slow. Use subprocess and the shell; here is an example:
import subprocess
import sys

remote_file_name = "filename"
remotedir = "/remote/dir"
localpath = "/local/file/dir"

ftp_cmd_p = """
#!/bin/sh
lftp -u username,password sftp://ip:port <<EOF
cd {remotedir}
lcd {localpath}
get {filename}
EOF
"""

subprocess.call(ftp_cmd_p.format(remotedir=remotedir,
                                 localpath=localpath,
                                 filename=remote_file_name),
                shell=True, stdout=sys.stdout, stderr=sys.stderr)

PyFilesystem with its sshfs is one option. It uses Paramiko under the hood and provides a nicer, platform-independent interface on top.
import fs
sf = fs.open_fs("sftp://[user[:password]@]host[:port]/[directory]")
sf.makedir('my_dir')
or
from fs.sshfs import SSHFS
sf = SSHFS(...
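Either way you end up with a normal PyFilesystem FS object, so the usual copy helpers work for uploads too; a rough sketch (URL and paths are placeholders):
import fs
import fs.copy

remote_fs = fs.open_fs("sftp://user:password@example.com/upload_dir")
local_fs = fs.open_fs(".")
fs.copy.copy_file(local_fs, "report.csv", remote_fs, "report.csv")  # upload
remote_fs.close()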

Here's a generic function that will download any given SFTP URL to a specified path:
from urllib.parse import urlparse
import paramiko

url = 'sftp://username:password@hostname/filepath.txt'

def sftp_download(url, dest):
    url = urlparse(url)
    with paramiko.Transport((url.hostname, 22)) as transport:
        transport.connect(None, url.username, url.password)
        with paramiko.SFTPClient.from_transport(transport) as sftp:
            sftp.get(url.path, dest)
Call it with
sftp_download(url, "/tmp/filepath.txt")
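An upload counterpart is nearly identical; a sketch assuming the same URL format:
def sftp_upload(url, src):
    url = urlparse(url)
    with paramiko.Transport((url.hostname, url.port or 22)) as transport:
        transport.connect(None, url.username, url.password)
        with paramiko.SFTPClient.from_transport(transport) as sftp:
            sftp.put(src, url.path)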

Related

Downloading CSV inside a .zip with pysftp

I'm trying to load a .csv file stored on an FTP server (SFTP protocol). I'm using Python in combination with the pysftp library. On the FTP server, the CSV file is inside a .zip file. Is there a way to open the zip and then retrieve only the csv file inside it?
Thank you in advance,
import pysftp

cnopts = pysftp.CnOpts()
cnopts.hostkeys = None

# Make connection to sFTP
with pysftp.Connection(hostname,
                       username=sftp_username,
                       password=sftp_pw,
                       cnopts=cnopts) as sftp:
    with pysftp.cd(download_directory):      # change the local working directory
        with sftp.cd('download_directory'):  # change the remote directory
            print(f'Downloading this file: {filename}')
            sftp.get(filename, preserve_mtime=True)
If you have ssh access to the remote host, know enough about the remote path to the zip file you want, and know the zip utilities on that host, you can use your ssh client to run the unzip command remotely and capture its output. Here, my target is a Linux machine and the zipfile is in the login user's home directory path. I can use the paramiko ssh client to do the work.
It's a good idea to log into the remote server via ssh first and practice, to see what the path structure is like.
import sys
import paramiko
import shutil

def sshclient_exec_command_binary(sshclient, command, bufsize=-1,
                                  timeout=None, get_pty=False):
    """Paramiko SSHClient helper that implements exec_command with binary
    output.
    """
    chan = sshclient._transport.open_session()
    if get_pty:
        chan.get_pty()
    chan.settimeout(timeout)
    chan.exec_command(command)
    stdin = chan.makefile('wb', bufsize)
    stdout = chan.makefile('rb', bufsize)
    stderr = chan.makefile_stderr('rb', bufsize)
    return stdin, stdout, stderr

# example gets user/pw from command line
if len(sys.argv) != 3:
    print("usage: test.py username password")
    exit(1)
username, password = sys.argv[1:3]

# put your host/file info here
hostname = "localhost"
remote_zipfile = "tmp/mytest.zip"
file_to_extract = "myfile"

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname, username=username, password=password)

unzip_cmd = "unzip -p {} {}".format(remote_zipfile, file_to_extract)
print("running", unzip_cmd)
stdin, out, err = sshclient_exec_command_binary(ssh, unzip_cmd)

# if the command worked, out is a file-like object to read.
print("writing", file_to_extract)
with open(file_to_extract, 'wb') as out_fp:
    shutil.copyfileobj(out, out_fp)
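If you don't have shell access to the server, another option is to pull the whole .zip over SFTP into an in-memory buffer and let Python's zipfile module extract just the CSV. A rough sketch, assuming pysftp as in the first answer (host, credentials and file names are placeholders):
import io
import zipfile
import pysftp

with pysftp.Connection('hostname', username='user', password='secret') as sftp:
    buffer = io.BytesIO()
    sftp.getfo('/remote/path/archive.zip', buffer)  # download the zip into memory
    buffer.seek(0)                                  # rewind before reading
    with zipfile.ZipFile(buffer) as archive:
        with archive.open('data.csv') as csv_file:  # hypothetical member name
            csv_bytes = csv_file.read()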

How can I download a file from SFTP using pysftp?

I have an SFTP server that contains a set of folders. I would like to download a file from one of these folders.
Can someone help me with the python code?
Where should I start?
Have you tried the example in pysftp's documentation?
import pysftp

with pysftp.Connection('hostname', username='me', password='secret') as sftp:
    with sftp.cd('public'):             # temporarily chdir to public
        sftp.put('/my/local/filename')  # upload file to public/ on remote
        sftp.get('remote_file')         # get a remote file
An addition to Michael Powers' answer.
If the server is one account and your personal account is another one: if your server account has root permission, follow Michael's method.
If not, you can use an io buffer to transfer the file between the two connections:
import io
import pysftp

server_sftp = pysftp.Connection(host=test_host, username=server_account,
                                password=server_pwd, cnopts=cn_opts)
your_user_sftp = pysftp.Connection(host=host_name, username=user_name,
                                   password=user_pwd, cnopts=cn_opts)
try:
    file_like = io.BytesIO()
    server_sftp.getfo(server_file_location, file_like)
    file_like.seek(0)  # rewind the buffer before uploading
    your_user_sftp.putfo(file_like, target_file_location)
finally:
    file_like.close()
    server_sftp.close()
    your_user_sftp.close()

Writing path for sftp server python

I am using pysftp with Python and am trying to run a loop over a certain directory on the SFTP server.
I don't know how to write directory paths for SFTP servers. I thought that connecting to the server and just writing the directory as below would work, but it doesn't. Please let me know how to write SFTP paths so that Python can read them.
sftp = pysftp.Connection('128.59.164.112', username = '', password = '');
source = sftp.u\'weatherForecast\'/dataRAW/2004/grib/tmax/
Try this:
import pysftp

sftp = pysftp.Connection('hostname', username='me', password='secret')
sftp.chdir('/home/user/development/stackoverflow')
ls = sftp.listdir()
for filename in ls:
    print filename
You should read this: http://pysftp.readthedocs.org/en/latest/index.html
PS: the trailing ; is allowed in Python, but it's not Pythonic.
After enough trial and error I figured it out.
source = 'weatherForecast/dataRAW/2004/grib/tmax/'
destination= 'sftp.u\'weatherForecast\'/csv/2004/tmax'
This works.
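For the loop over that directory the question asks about, pysftp's listdir also accepts a remote path directly; a minimal sketch reusing the path above (credentials are placeholders):
import pysftp

with pysftp.Connection('128.59.164.112', username='...', password='...') as sftp:
    source = 'weatherForecast/dataRAW/2004/grib/tmax/'
    for filename in sftp.listdir(source):
        print(filename)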

Python Script Uploading files via FTP

I would like to make a script to upload a file to FTP.
How would the login system work? I'm looking for something like this:
ftp.login=(mylogin)
ftp.pass=(mypass)
And any other sign in credentials.
Use ftplib; you can write it like this:
import ftplib
session = ftplib.FTP('server.address.com','USERNAME','PASSWORD')
file = open('kitten.jpg','rb') # file to send
session.storbinary('STOR kitten.jpg', file) # send the file
file.close() # close file and FTP
session.quit()
Use ftplib.FTP_TLS instead if your FTP host requires TLS.
To retrieve a file, you can use urllib.urlretrieve:
import urllib
urllib.urlretrieve('ftp://server/path/to/file', 'file')
EDIT:
To find out the current directory, use FTP.pwd():
FTP.pwd(): Return the pathname of the current directory on the server.
To change the directory, use FTP.cwd(pathname):
FTP.cwd(pathname): Set the current directory on the server.
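For example, reusing the session object from the snippet above, a quick check before and after changing directory might look like this (the directory name is a placeholder):
print(session.pwd())     # e.g. '/'
session.cwd('/uploads')  # move into the target directory
print(session.pwd())     # now '/uploads'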
ftplib now supports context managers, so I guess it can be made even easier:
from ftplib import FTP
from pathlib import Path
file_path = Path('kitten.jpg')
with FTP('server.address.com', 'USER', 'PWD') as ftp, open(file_path, 'rb') as file:
    ftp.storbinary(f'STOR {file_path.name}', file)
No need to close the file or the session
You will most likely want to use the ftplib module for python
import ftplib

ftp = ftplib.FTP()
host = "ftp.site.uk"
port = 21
ftp.connect(host, port)
print(ftp.getwelcome())
try:
    print("Logging in...")
    ftp.login("yourusername", "yourpassword")
except:
    print("failed to login")
This logs you into an FTP server. What you do from there is up to you. Your question doesn't indicate any other operations that really need doing.
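From there, an upload over the same connection would look something like this (the file name is a placeholder):
with open("report.csv", "rb") as fh:
    ftp.storbinary("STOR report.csv", fh)
ftp.quit()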
Try this:
#!/usr/bin/env python
import os
import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('hostname', username="username", password="password")
sftp = ssh.open_sftp()
localpath = '/home/e100075/python/ss.txt'
remotepath = '/home/developers/screenshots/ss.txt'
sftp.put(localpath, remotepath)
sftp.close()
ssh.close()
To avoid the encryption error you can also try out the commands below:
ftp = ftplib.FTP_TLS("ftps.dummy.com")
ftp.login("username", "password")
ftp.prot_p()
file = open("filename", "rb")
ftp.storbinary("STOR filename", file)
file.close()
ftp.close()
ftp.prot_p() ensures that your connections are encrypted.
I just answered a similar question here
IMHO, if your FTP server is able to communicate with Fabric, please use Fabric. It is far better than doing raw FTP.
I have an FTP account from dotgeek.com, so I am not sure if this will work for other FTP accounts.
#!/usr/bin/python
from fabric.api import run, env, sudo, put

env.user = 'username'
env.hosts = ['ftp_host_name',]  # such as ftp.google.com

def copy():
    # assuming I have wong_8066.zip in the same directory as this script
    put('wong_8066.zip', '/www/public/wong_8066.zip')
save the file as fabfile.py and run fab copy locally.
yeukhon#yeukhon-P5E-VM-DO:~$ fab copy2
[1.ai] Executing task 'copy2'
[1.ai] Login password:
[1.ai] put: wong_8066.zip -> /www/public/wong_8066.zip
Done.
Disconnecting from 1.ai... done.
Once again, if you don't want to input password all the time, just add
env.password = 'my_password'
You can use the function below. I haven't tested it yet, but it should work fine. Remember that the destination is a directory path, whereas the source is a complete file path.
import ftplib
import os

def uploadFileFTP(sourceFilePath, destinationDirectory, server, username, password):
    myFTP = ftplib.FTP(server, username, password)
    if destinationDirectory not in [name for name, data in list(myFTP.mlsd())]:
        print("Destination directory does not exist. Creating it first")
        myFTP.mkd(destinationDirectory)
    # Change the working directory
    myFTP.cwd(destinationDirectory)
    if os.path.isfile(sourceFilePath):
        fh = open(sourceFilePath, 'rb')
        myFTP.storbinary('STOR %s' % os.path.basename(sourceFilePath), fh)
        fh.close()
    else:
        print("Source file does not exist")

How to delete all files in directory on remote SFTP server in Python?

I'd like to delete all the files in a given directory on a remote server that I'm already connected to using Paramiko. I cannot explicitly give the file names, though, because these will vary depending on which version of the file I had previously put there.
Here's what I'm trying to do... the line below the # TODO is the call I'm attempting, where remoteArtifactPath is something like /opt/foo/*
ssh = paramiko.SSHClient()
ssh.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh.connect(server, username=username, pkey=mykey)
sftp = ssh.open_sftp()
# TODO: Need to somehow delete all files in remoteArtifactPath remotely
sftp.remove(remoteArtifactPath+"*")
# Close to end
sftp.close()
ssh.close()
Any idea how I can achieve this?
I found a solution: Iterate over all the files in the remote location, then call remove on each of them:
ssh = paramiko.SSHClient()
ssh.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh.connect(server, username=username, pkey=mykey)
sftp = ssh.open_sftp()
# Updated code below:
filesInRemoteArtifacts = sftp.listdir(path=remoteArtifactPath)
for file in filesInRemoteArtifacts:
    sftp.remove(remoteArtifactPath + file)
# Close to end
sftp.close()
ssh.close()
You need a recursive routine since your remote directory may have subdirectories.
def rmtree(sftp, remotepath, level=0):
    for f in sftp.listdir_attr(remotepath):
        rpath = posixpath.join(remotepath, f.filename)
        if stat.S_ISDIR(f.st_mode):
            rmtree(sftp, rpath, level=(level + 1))
        else:
            rpath = posixpath.join(remotepath, f.filename)
            print('removing %s%s' % (' ' * level, rpath))
            sftp.remove(rpath)
    print('removing %s%s' % (' ' * level, remotepath))
    sftp.rmdir(remotepath)

ssh = paramiko.SSHClient()
ssh.load_host_keys(os.path.expanduser(os.path.join("~", ".ssh", "known_hosts")))
ssh.connect(server, username=username, pkey=mykey)
sftp = ssh.open_sftp()
rmtree(sftp, remoteArtifactPath)

# Close to end
sftp.close()
ssh.close()
A Fabric routine could be as simple as this:
with cd(remoteArtifactPath):
    run("rm *")
Fabric is great for executing shell commands on remote servers. Fabric actually uses Paramiko underneath, so you can use both if you need to.
For @markolopa's answer, you need two imports to get it working:
import posixpath
from stat import S_ISDIR
I found a solution using Python 3.7 and spur 0.3.20. It is very possible that it works with other versions as well.
import spur
shell = spur.SshShell( hostname="ssh_host", username="ssh_usr", password="ssh_pwd")
ssh_session = shell._connect_ssh()
ssh_session.exec_command('rm -rf /dir1/dir2/dir3')
ssh_session.close()
So I know this is an older post, but I would still like to give a short answer that I found to be more useful than the rest. Also, this uses Paramiko's built-in functions, so it should work on all devices:
import paramiko

class remote_operations:
    def __init__(self):
        pass

    def connect(self, hostname, username, password):
        client = paramiko.SSHClient()
        client.load_system_host_keys()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(hostname, username=username, password=password)
        return client

def delete_files():
    print("--Deleting files--")
    test = remote_operations()
    # these ips and passwords are just examples
    username = "aabfbkbakjdfb123"
    password = "I_am_not_good_at_making_passwords_123"
    host_ip = "111.111.11.11"
    client = test.connect(host_ip, username, password)
    sftp_client = client.open_sftp()
    folderPath = "/my_secret_files/"
    sftp_client.chdir(folderPath)
    for file in sftp_client.listdir():
        sftp_client.remove(file)

if __name__ == '__main__':
    delete_files()
