Paramiko SFTP - Avoid having to specify full local filename? - python

I have some Python code that uses Paramiko to grab build files from a remote server:
def setup_sftp_session(self, host='server.com', port=22, username='puppy'):
    self.transport = paramiko.Transport((host, port))
    privatekeyfile = os.path.expanduser('~/.ssh/id_dsa')
    try:
        ssh_key = paramiko.DSSKey.from_private_key_file(privatekeyfile)
    except IOError as e:
        self.logger.error('Unable to find SSH keyfile: %s' % str(e))
        sys.exit(1)
    try:
        self.transport.connect(username=username, pkey=ssh_key)
    except paramiko.AuthenticationException as e:
        self.logger.error("Unable to logon - are you sure you've added the pubkey to the server?: %s" % str(e))
        sys.exit(1)
    self.sftp = paramiko.SFTPClient.from_transport(self.transport)
    self.sftp.chdir('/some/location/buildfiles')

def get_file(self, remote_filename):
    try:
        self.sftp.get(remote_filename, 'I just want to save it in the local cwd')
    except IOError as e:
        self.logger.error('Unable to copy remote file: %s' % str(e))

def close_sftp_session(self):
    self.sftp.close()
    self.transport.close()
I'd like to retrieve each file, and deposit it in the current local working directory.
However, Paramiko doesn't seem to have an option for this - you need to specify the full local destination. You can't even specify a directory (e.g. "./", or even "/home/victorhooi/files") - you need the full path including filename.
Is there any way around this? It'll be annoying if we have to specify the local filename as well, instead of just copying the remote one.
Also - the way I'm handling exceptions in setup_sftp_session, with exit(1) - is that a good practice, or is there a better way?
Cheers,
Victor

You have to build the destination path yourself, by inserting:
os.path.join(os.getcwd(), remote_filename)
Calling exit() in a function is not a good idea. You may want to reuse the code and take some action in case of an exception; if you keep the exit() call, that is no longer possible.
I suggest modifying the function so that it returns True on success and False otherwise. Then the caller can decide what to do.
Another approach would be not to catch the exceptions at all. The caller then has to handle them, and gets the full information (including the stack trace) about the circumstances of the failure.
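A sketch of how get_file could derive the local path from the remote name (the helper name is mine; posixpath is used because SFTP paths always use forward slashes, regardless of the local OS):

```python
import os
import posixpath

def local_path_for(remote_filename):
    # SFTP remote paths use forward slashes even on Windows, so split
    # with posixpath and join the bare filename onto the local cwd.
    return os.path.join(os.getcwd(), posixpath.basename(remote_filename))

# usage inside get_file:
#   self.sftp.get(remote_filename, local_path_for(remote_filename))
```

Using basename rather than joining the full remote path avoids ending up with remote directory components in the local destination.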


Using Paramiko in Python check if directory can be deleted

I found out a way to delete a remote directory using Paramiko based on:
How to delete all files in directory on remote SFTP server in Python?
However, if a directory does not have valid permissions, it fails. I have moved the deletion into an exception block, but is there a way to check up front whether a directory has the permissions needed to delete it?
Currently, what I notice is that when deleting recursively, if one of the sub-directories is missing write permission, the whole deletion fails; I want the deletion to continue and skip the entry that lacks the necessary permission. But if the root directory itself doesn't have valid permissions, I want to throw an exception and bail out with an appropriate exit code.
How can I do this?
import posixpath
import stat
import sys

def rmtree(sftp, remotepath, level=0):
    try:
        for f in sftp.listdir_attr(remotepath):
            rpath = posixpath.join(remotepath, f.filename)
            if stat.S_ISDIR(f.st_mode):
                rmtree(sftp, rpath, level=(level + 1))
            else:
                try:
                    sftp.remove(rpath)
                except Exception as e:
                    print("Error: Failed to remove: {0}".format(rpath))
                    print(e)
    except IOError as io:
        print("Error: Access denied. Do not have permissions to remove: {0} -- {1}".format(remotepath, level))
        print(io)
        if level == 0:
            sys.exit(1)
    except Exception as e:
        print("Error: Failed to delete: {0} -- {1}".format(remotepath, level))
        print(e)
        if level == 0:
            sys.exit(1)
    if level <= 2:
        print('removing %s%s' % (' ' * level, remotepath))
    try:
        sftp.rmdir(remotepath)
    except IOError as io:
        print("Error: Access denied for deleting. Invalid permission")
        print(io)
    except Exception as e:
        print("Error: failed while deleting: {0}".format(remotepath))
        print(e)
    return
It's not possible to "check" for actual permissions for a specific operation with the SFTP protocol. The SFTP API does not provide such functionality, nor enough information for you to decide on your own.
See also How to check for read permission using JSch with SFTP protocol?
You would have to use another API – e.g. execute a check in a shell using SSHClient.exec_command. For example, you can use the test command:
test -w /parent/directory
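A sketch of such a check over an already-connected paramiko.SSHClient (the helper names and the shlex quoting are my own; `test -w` exits with status 0 when the path is writable by the logged-in user):

```python
import shlex

def writability_command(path):
    # Quote the path so spaces and shell metacharacters survive the
    # remote shell intact.
    return "test -w " + shlex.quote(path)

def remote_is_writable(ssh, path):
    # ssh is assumed to be a connected paramiko.SSHClient.
    stdin, stdout, stderr = ssh.exec_command(writability_command(path))
    # exit status 0 means the remote user can write to the path
    return stdout.channel.recv_exit_status() == 0
```

In rmtree above, such a check could be run on the root directory before recursing, so the level == 0 failure case is caught early.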

Checking FTP connection is valid using NOOP command

I'm having trouble with one of my scripts: it seems to get disconnected from my FTP server during long batches of jobs. To counter this, I've attempted to write the module shown below:
def connect_ftp(ftp):
    print "ftp1"
    starttime = time.time()
    retry = False
    try:
        ftp.voidcmd("NOOP")
        print "ftp2"
    except:
        retry = True
        print "ftp3"
    print "ftp4"
    while (retry):
        try:
            print "ftp5"
            ftp.connect()
            ftp.login('LOGIN', 'CENSORED')
            print "ftp6"
            retry = False
            print "ftp7"
        except IOError as e:
            print "ftp8"
            retry = True
            sys.stdout.write("\rTime disconnected - " + str(time.time() - starttime))
            sys.stdout.flush()
            print "ftp9"
I call the function using only:
ftp = ftplib.FTP('CENSORED')
connect_ftp(ftp)
However, I've traced how the code runs using print lines, and on the first use of the module (before the FTP is even connected to) my script runs ftp.voidcmd("NOOP") and no exception is raised, so no attempt is made to connect to the FTP server initially.
The output is:
ftp1
ftp2
ftp4
ftp success #this is printed after the module is called
I admit my code isn't the best or prettiest, and I haven't yet added anything to stop it reconnecting constantly if it keeps failing to reconnect, but I can't work out why this isn't working, so I don't see a point in expanding the module yet. Is this even the best approach for connecting/reconnecting to an FTP server?
Thank you in advance
This connects to the server:
ftp = ftplib.FTP('CENSORED')
So, naturally the NOOP command succeeds, as it does not need an authenticated connection.
Your connect_ftp is correct, except that you need to specify a hostname in your connect call.
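A minimal sketch of that fix, with the hostname passed on every reconnect attempt and a small back-off so the loop doesn't hammer the server (the function and argument names are illustrative, not from the original script):

```python
import time

def ensure_connected(ftp, host, user, password, delay=5):
    # Probe the control connection; NOOP fails once the server has
    # dropped us, at which point we reconnect, naming the host again.
    try:
        ftp.voidcmd("NOOP")
        return
    except Exception:
        pass
    while True:
        try:
            ftp.connect(host)          # the hostname must be repeated here
            ftp.login(user, password)
            return
        except IOError:
            time.sleep(delay)          # back off instead of retrying instantly
```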

How to print long listing for files in a remote server using pysftp in Python2.7

I have a script which connects to a remote server using pysftp and performs all the basic operations such as put, get, and listing files on the remote server. However, the script doesn't show me the long listing of files in a folder on the remote machine; instead it prints the long listing of files at the current path on the local machine. I found this strange and have been trying all the possible solutions I could think of, like cwd, cd, chdir, etc. Please find the relevant portion of the code below and help me resolve the issue.
if (command == 'LIST'):
    print "Script will start listing files"
    try:
        s = pysftp.Connection('ip', username='user', password='pwd')
    except Exception as e:
        print e
        logfile.write("Unable to connect to FTP Server: Error is -->" + "\n")
        sys.exit()
    try:
        s.cwd('remote_path')
        print(s.pwd)
    except Exception as e:
        print e
        logfile.write("Unable to perform cwd:" + "\n")
        sys.exit()
    try:
        print(s.pwd)
        print(subprocess.check_output(["ls"]))
    except Exception as e:
        print "Unable to perform listing of files in Remote Directory"
        s.close()
        sys.exit()
Thanks and regards,
Shreeram
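The local listing happens because subprocess.check_output(["ls"]) runs ls on the local machine; it knows nothing about the SFTP session. One way to get an ls -l-style listing from the remote side (a sketch, not tested against the poster's server) is listdir_attr, which pysftp's Connection exposes from paramiko: each returned SFTPAttributes object carries the server-supplied long-format line in its longname attribute.

```python
def remote_long_listing(sftp, path='.'):
    # listdir_attr() returns SFTPAttributes objects; .longname holds the
    # server-supplied `ls -l`-style line for each directory entry.
    return [attr.longname for attr in sftp.listdir_attr(path)]

# usage after s.cwd('remote_path'):
#   for line in remote_long_listing(s):
#       print(line)
```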

check if directory exists on remote machine before sftp

This is my function that copies a file from the local machine to a remote machine with Paramiko, but it doesn't check whether the destination directory exists - it just carries on copying, and doesn't throw an error if the remote path doesn't exist:
def copyToServer(hostname, username, password, destPath, localPath):
    transport = paramiko.Transport((hostname, 22))
    transport.connect(username=username, password=password)
    sftp = paramiko.SFTPClient.from_transport(transport)
    sftp.put(localPath, destPath)
    sftp.close()
    transport.close()
I want to check whether the path on the remote machine exists, and throw an error if it doesn't.
Thanks in advance.
In my opinion it's better to avoid exceptions, so unless you have lots of folders this is a good option for you:
if folder_name not in sftp.listdir(path):
    sftp.mkdir(os.path.join(path, folder_name))
This will do (note that transport.connect must be called before creating the SFTP client):
def copyToServer(hostname, username, password, destPath, localPath):
    transport = paramiko.Transport((hostname, 22))
    transport.connect(username=username, password=password)
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        sftp.put(localPath, destPath)
        sftp.close()
        transport.close()
        print(" %s SUCCESS " % hostname)
        return True
    except Exception as e:
        try:
            filestat = sftp.stat(destPath)
            destPathExists = True
        except Exception as e:
            destPathExists = False
        if not destPathExists:
            print(" %s FAILED - copying failed because directory on remote machine doesn't exist" % hostname)
            log.write("%s FAILED - copying failed because directory on remote machine doesn't exist\r\n" % hostname)
        else:
            print(" %s FAILED - copying failed" % hostname)
            log.write("%s FAILED - copying failed\r\n" % hostname)
        return False
You can use the chdir() method of SFTPClient. It checks whether the remote path exists, and raises an IOError if not.
try:
    sftp.chdir(destPath)
except IOError as e:
    raise e
I would use the listdir method of SFTPClient. You will probably have to apply it recursively to ensure the entire path is valid.
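A sketch of a component-by-component check, using sftp.stat rather than listdir (the helper names are mine; stat raises IOError for a missing entry, so each prefix of the path can be probed in turn to pinpoint which component is missing):

```python
import posixpath

def path_prefixes(remote_path):
    # '/a/b/c' -> ['/a', '/a/b', '/a/b/c'], so each prefix can be checked.
    parts = [p for p in remote_path.split('/') if p]
    prefixes = []
    current = '/' if remote_path.startswith('/') else ''
    for p in parts:
        current = posixpath.join(current, p) if current else p
        prefixes.append(current)
    return prefixes

def remote_path_exists(sftp, remote_path):
    # sftp is assumed to be a connected paramiko.SFTPClient.
    for prefix in path_prefixes(remote_path):
        try:
            sftp.stat(prefix)
        except IOError:
            return False
    return True
```

A single stat on the full destination directory is enough for a yes/no answer; walking the prefixes is only useful when you want to know where the path breaks.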

Python: how to setup python-ldap to ignore referrals?

How can I avoid getting an (undocumented) exception in the following code?
import ldap
import ldap.sasl

connection = ldap.initialize('ldaps://server:636', trace_level=0)
connection.set_option(ldap.OPT_REFERRALS, 0)
connection.protocol_version = 3
sasl_auth = ldap.sasl.external()
connection.sasl_interactive_bind_s('', sasl_auth)

baseDN = 'ou=org.com,ou=xx,dc=xxx,dc=com'
filter = 'objectclass=*'
try:
    result = connection.search_s(baseDN, ldap.SCOPE_SUBTREE, filter)
except ldap.REFERRAL as e:
    print "referral"
except ldap.LDAPError as e:
    print "Ldaperror"
It happens that the baseDN given in the example is a referral. When I run this code I get referral as the output.
What I would like is for python-ldap to just skip it, or ignore it without throwing this strange exception (I cannot find any documentation about it).
(This may or may not help:) The problem appeared when I was searching a baseDN higher up the tree. When I searched 'ou=xx,dc=xxx,dc=com' it started to freeze in my production environment, while in the development environment everything works fine. When I looked into it I found that it was freezing on referral branches. How can I tell python-ldap to ignore referrals? The code above does not work the way I want.
This is a working example, see if it helps.
def ldap_initialize(remote, port, user, password, use_ssl=False, timeout=None):
    prefix = 'ldap'
    if use_ssl is True:
        prefix = 'ldaps'
        # ask ldap to ignore certificate errors
        ldap.set_option(ldap.OPT_X_TLS_REQUIRE_CERT, ldap.OPT_X_TLS_NEVER)
    if timeout:
        ldap.set_option(ldap.OPT_NETWORK_TIMEOUT, timeout)
    ldap.set_option(ldap.OPT_REFERRALS, ldap.OPT_OFF)
    server = prefix + '://' + remote + ':' + '%s' % port
    l = ldap.initialize(server)
    l.simple_bind_s(user, password)
    return l
