I'm familiar with ftplib and it works well as a simple interface, but I need file properties and basically a rich FTP client. Does anyone know of a good FTP client library?
Use the MLSD command. You have to parse it yourself but it's fairly easy.
import ftplib

# Note that portions of MLSD data are case insensitive...
def parseinfo(info):
    # Facts look like "type=file;size=7045;modify=20230101120000;"
    for fact in info.split(';'):
        if not fact:
            continue
        name, value = fact.split('=', 1)
        yield name.lower(), value

ftp = ftplib.FTP(host, user, passwd)
dirinfo = {}

def callback(line):
    # Each MLSD line is "fact=value;...; filename"
    info, fname = line.split(' ', 1)
    dirinfo[fname] = dict(parseinfo(info))

ftp.retrlines('MLSD {}'.format(path), callback)
print(dirinfo)
That's about as rich as FTP gets.
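For what it's worth, on Python 3.3 and later ftplib can do this parsing for you: FTP.mlsd() issues the MLSD command itself and yields (name, facts) pairs, with the fact names already lowercased. A short sketch reusing the ftp connection and path from above:
for fname, facts in ftp.mlsd(path):
    # facts is a dict, e.g. {'type': 'file', 'size': '7045', ...}
    print(fname, facts.get('size'), facts.get('modify'))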
ftputil could be what you are looking for:
The FTPHost objects generated with ftputil allow many operations similar to those of os and os.path.
The API supports file information gathering well.
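For instance, here is a minimal sketch of gathering file information with its os-like API (the host name and credentials are placeholders):
import ftputil

with ftputil.FTPHost('ftp.example.com', 'user', 'password') as ftp_host:
    for name in ftp_host.listdir(ftp_host.curdir):
        if ftp_host.path.isfile(name):
            # os.path-style calls run against the remote server
            print(name, ftp_host.path.getsize(name), ftp_host.path.getmtime(name))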
I used the ftputil module for this, but ran into the problem that it doesn't support 'a' (append) mode for files, and if you write via 'w' it overwrites the contents.
Here's what I tried, and where I'm stuck:
with ftputil.FTPHost(host, ftp_user, ftp_pass) as ftp_host:
    with ftp_host.open("my_path_to_file_on_the_server", "a") as fobj:
        cupone_wr = input('Enter coupons with a space: ')
        cupone_wr = cupone_wr.split(' ')
        for x in range(0, len(cupone_wr)):
            cupone_str = '<p>Your coupon %s</p>\n' % cupone_wr[x]
            data = fobj.write(cupone_str)
            print(data)
The goal is to leave the old entries in the file and add fresh entries to the end of the file every time the script is called again.
Indeed, ftputil does not support appending. So either you will have to download the complete file and re-upload it with the appended records, or you will have to use another FTP library.
For example, the built-in Python ftplib supports appending. On the other hand, it does not (at least not easily) support streaming. Instead, it's easier to construct the new records in memory and upload/append them all at once:
from ftplib import FTP
from io import BytesIO

flo = BytesIO()

cupone_wr = input('Enter coupons with a space: ')
cupone_wr = cupone_wr.split(' ')
for cupone in cupone_wr:
    cupone_str = '<p>Your coupon %s</p>\n' % cupone
    # BytesIO holds bytes, so encode the string before writing
    flo.write(cupone_str.encode('utf-8'))

ftp = FTP('ftp.example.com', 'username', 'password')
flo.seek(0)
ftp.storbinary('APPE my_path_to_file_on_the_server', flo)
ftputil author here :-)
Martin is correct in that there's no explicit append mode. That said, you can open file-like objects with a rest argument. In your case, rest would need to be the original length of the file you want to append to.
The documentation warns against using a rest argument that points after the file because I'm quite sure rest isn't expected to be used that way. However, if you use your program only against a specific server and can verify its behavior, it might be worthwhile to experiment with rest. I'd be interested whether it works for you.
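For the record, a sketch of that experiment, assuming your server handles a REST offset at the end of the file correctly (the path and credentials are the ones from the question, the coupon text is just an example, and you should verify the result before trusting it):
import ftputil

with ftputil.FTPHost(host, ftp_user, ftp_pass) as ftp_host:
    path = "my_path_to_file_on_the_server"
    size = ftp_host.path.getsize(path)  # current length of the remote file
    # rest=size asks the server to start the transfer at that offset,
    # so writes effectively append to the existing file
    with ftp_host.open(path, "w", rest=size) as fobj:
        fobj.write('<p>Your coupon XYZ123</p>\n')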
I am trying to fetch my Gmail account's folder (or, in Google's terms, label) names using Python and the IMAP protocol. To achieve this, I have the following code (omitting exception handling and other details for simplicity):
import imaplib
import re

list_response_pattern = re.compile(r'\((?P<flags>.*?)\) "(?P<delimiter>.*)" (?P<name>.*)')

mail = imaplib.IMAP4_SSL('imap.gmail.com')
rv, data = mail.login(EMAIL_ACCOUNT, psw)
rv, folders = mail.list()
if rv == 'OK':
    for folder in folders:
        folder = folder.decode("utf-8")
        flags, delimiter, name = list_response_pattern.match(folder).groups()
        name = name.strip('"')
        print(name)
The output is the list of my mailboxes ("INBOX", "Junk", "Important", etc).
However, the issue with this code is that if a mailbox name is in a language other than English (say, Russian), I get strange strings instead of the real names (I guess this is some sort of encoding). For example, one of my mailboxes is named "Личное". Instead of "Личное", I get something like '&BBsEOARHBD0EPgQ1' in the output.
Some time ago, someone already asked an identical question, but it remains unanswered to this day. I decided to repeat the question because I have spent a whole day trying to google this and found nothing. Help me please; I am completely stuck at this stage.
P.S. It looks like in PHP, there's a function for solving this issue
You can use from imapclient import imap_utf7 to decode the bytes to Cyrillic, then split the name on '|'. Like:
from imapclient import imap_utf7

# Print the list of folders
for folder in mail.list()[1]:
    # b'(\\Marked \\HasNoChildren) "|" "&BB0EEAQRBB4EIA-"'
    decoded_folder = imap_utf7.decode(folder)
    # (\Marked \HasNoChildren) "|" "НАБОР"
    folder_name = decoded_folder.split(' "|" ')[-1].strip('"')
    print(folder_name)  # "НАБОР"
I was wondering if anyone has observed that the time taken to download or upload a file over FTP using Python's ftplib is very large compared to performing FTP get/put at the Windows command prompt or using Perl's Net::FTP module.
I created a simple FTP client similar to http://code.activestate.com/recipes/521925-python-ftp-client/ but I am unable to attain the speed I get when running FTP at the Windows DOS prompt or using Perl. Is there something I am missing, or is it a problem with the Python ftplib module?
I would really appreciate it if you could shed some light on why I am getting low throughput with Python.
The problem was with the block size: I was using a block size of 1024 bytes, which was too small. After increasing the block size to 256 KB (262144 bytes), the speeds are similar across all the different platforms.
def putfile(file, site, user=()):
    upFile = open(file, 'rb')
    handle = ftplib.FTP(site)
    handle.login(*user)
    print("Upload started")
    # 262144-byte (256 KB) blocks instead of the default
    handle.storbinary('STOR ' + file, upFile, 262144)
    print("Upload completed")
    handle.quit()
    upFile.close()
I had a similar issue with the default block size of 8192 using FTP_TLS:
from ftplib import FTP_TLS

site = 'ftp.siteurl.com'
user = 'username-here'
upass = 'supersecretpassword'
ftp = FTP_TLS(host=site, user=user, passwd=upass)

with open(newfilename, 'wb') as f:
    def callback(data):
        f.write(data)
    ftp.retrbinary('RETR filename.txt', callback, blocksize=262144)
Increasing the block size increased the speed 10x. Thanks @Tanmoy Dube.
I'm using python -m SimpleHTTPServer to serve up a directory for local testing in a web browser. Some of the content includes large data files. I would like to be able to gzip them and have SimpleHTTPServer serve them with Content-Encoding: gzip.
Is there an easy way to do this?
This is an old question, but it still ranks #1 in Google for me, so I suppose a proper answer might be of use to someone besides me.
The solution turns out to be very simple. In do_GET(), do_POST(), etc., you only need to add the following:
content = self.gzipencode(strcontent)
...your other headers, etc...
self.send_header("Content-length", str(len(str(content))))
self.send_header("Content-Encoding", "gzip")
self.end_headers()
self.wfile.write(content)
self.wfile.flush()
strcontent being your actual content (as in HTML, JavaScript, or other web resources),
and the gzipencode:
def gzipencode(self, content):
    import gzip
    from io import BytesIO
    # compress in memory; content must be a byte string
    out = BytesIO()
    f = gzip.GzipFile(fileobj=out, mode='wb', compresslevel=5)
    f.write(content)
    f.close()
    return out.getvalue()
Since this was the top Google result, I figured I would post my simple modification to the script that got gzip to work.
https://github.com/ksmith97/GzipSimpleHTTPServer
Like so many others, I've been using python -m SimpleHTTPServer for local testing as well. This is still the top result on Google, and while https://github.com/ksmith97/GzipSimpleHTTPServer is a nice solution, it enforces gzip even if not requested, and there's no flag to enable/disable it.
I decided to write a tiny CLI tool that supports this. It's written in Go, so the regular install procedure is simply:
go get github.com/rhardih/serve
If you already have $GOPATH/bin added to $PATH, that's all you need. Now you have serve as a command.
https://github.com/rhardih/serve
This was a feature request, but it was rejected in order to keep the simple HTTP server simple: https://bugs.python.org/issue30576
The issue author eventually released a standalone version for Python 3: https://github.com/PierreQuentel/httpcompressionserver
Building on @velis' answer above, here is how I do it. Gzipping small data is not worth the time and can increase its size. Tested with a Dalvik client.
def do_GET(self):
    # ... get content ...
    self.send_response(returnCode)  # 200, 401, etc.
    # ...your other headers, etc...
    if len(content) > 100:  # don't bother compressing small data
        if 'accept-encoding' in self.headers:  # case insensitive
            if 'gzip' in self.headers['accept-encoding']:
                content = gzipencode(content)  # gzipencode defined above in @velis' answer
                self.send_header('content-encoding', 'gzip')
    self.send_header('content-length', str(len(content)))
    self.end_headers()  # send a blank line
    self.wfile.write(content)
From looking at SimpleHTTPServer's documentation, there is no way. However, I recommend lighttpd with the mod_compress module.
I am using Python and paramiko to read some files using SFTP. The get is working fine. When I am done processing the file, I would like to put a file summarizing the results. I would rather not have to save the file locally first in order to do this; I have a dict of the results, and I just want to create a file on the SFTP server to put it into. Below is my code, with (I hope) all of the relevant bits in and the unrelated parts removed for readability.
Note that I am successfully reading the file and processing it, and creating the dict of results, without a problem, and I can print it to my terminal when I run csv_import. When I try to add the final step of putting the dict of results into a file on the same sftp server, though, it hangs forever. Any help is appreciated.
def csv_import():
    we_are_live = True
    host = "111.111.111.111"
    port = 22
    password = "cleverpwd"
    username = "cleverun"
    t = paramiko.Transport((host, port))
    t.connect(username=username, password=password)
    if we_are_live and t.is_authenticated():
        sftp = paramiko.SFTPClient.from_transport(t)
        sftp.chdir('.' + settings.REMOTE_SFTP_DIRECTORY)
        files_to_pick_from = sftp.listdir()
        # ...file processing code happens here, get back a dictionary of the results...
        results_file_name = 'results' + client_file_name
        results_file = paramiko.SFTPClient.from_transport(t)
        results_file.file(results_file_name, mode='w', bufsize=-1)
        results_file.write(str(sftp_results_of_import))
        results_file.close()
        t.close()
I did something similar a while ago, but I used disk files; maybe you'll find something useful:
http://code.activestate.com/recipes/576810-copy-files-over-ssh-using-paramiko/
And if you only need to create files in memory, you could try StringIO:
http://docs.python.org/library/stringio.html
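For what it's worth, here is a minimal sketch of writing the results straight to the server without a local temp file, reusing the transport and variables from the question. paramiko's SFTPClient.open() returns a writable file-like object, and putfo() uploads any file-like object in one call:
import io

# one SFTP client per transport is enough; reuse the one from the download
sftp = paramiko.SFTPClient.from_transport(t)

# Option 1: open the remote file and write to it directly
with sftp.open(results_file_name, mode='w') as remote_file:
    remote_file.write(str(sftp_results_of_import))

# Option 2: build the content in memory and upload it in one call
flo = io.BytesIO(str(sftp_results_of_import).encode('utf-8'))
sftp.putfo(flo, results_file_name)

sftp.close()
t.close()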