Getting an EOFError when getting large files with Paramiko - python

I'm trying to write a quick python script to grab some logs with sftp. My first inclination was to use Pysftp, since it seemed like it made it very simple. It worked great, until it got to a larger file. I got an error while getting any file over about 13 MB. I then decided to try writing what I needed directly in Paramiko, rather than relying on the extra layer of Pysftp. After figuring out how to do that, I ended up getting the exact same error. Here's the Paramiko code, as well as the trace from the error I get. Does anyone have any idea why this would have an issue pulling any largish files? Thanks.
import paramiko

# Create transport and connect
transport = paramiko.Transport((host, 22))
transport.connect(username=username, password=password)
sftp = paramiko.SFTPClient.from_transport(transport)

# List of the log files in c:
files = sftp.listdir('c:/logs')

# Now pull them, logging as you go
for f in files:
    if f[0].lower() == 't' or f[:3].lower() == 'std':
        logger.info('Pulling {0}'.format(f))
        sftp.get('c:/logs/{0}'.format(f), output_dir + '/{0}'.format(f))

# Close the connection
sftp.close()
transport.close()
And here's the error:
No handlers could be found for logger "paramiko.transport"
Traceback (most recent call last):
File "pull_logs.py", line 420, in <module> main()
File "pull_logs.py", line 410, in main
pull_logs(username, host, password, location)
File "pull_logs.py", line 142, in pull_logs
sftp.get('c:/logs/{0}'.format(f), output_dir +'/{0}'.format(f))
File "/Users/me/my_site/site_packages/paramiko/sftp_client.py", line 676, in get
size = self.getfo(remotepath, fl, callback)
File "/Users/me/my_site/site_packages/paramiko/sftp_client.py", line 645, in getfo
data = fr.read(32768)
File "/Users/me/my_site/site_packages/paramiko/file.py", line 153, in read
new_data = self._read(read_size)
File "/Users/me/my_site/site_packages/paramiko/sftp_file.py", line 152, in _read
data = self._read_prefetch(size)
File "/Users/me/my_site/site_packages/paramiko/sftp_file.py", line 132, in _read_prefetch
self.sftp._read_response()
File "/Users/me/my_site/site_packages/paramiko/sftp_client.py", line 721, in _read_response
raise SSHException('Server connection dropped: %s' % (str(e),))
paramiko.SSHException: Server connection dropped:

Related

VI_ERROR_TMO (-1073807339) on Anritsu OSA

I'm working on interfacing an old Anritsu MS9710B Optical Spectrum Analyzer over an RS232 connection. A year ago I managed to communicate with it, send SCPI commands using pyvisa, and receive data. Today I run the exact same code with up-to-date libraries and I get a timeout error on any query, even *IDN?. The RS232-USB drivers are fine and I can open the connection, but every query or read fails.
I set the interface to "RS232C" in the OSA's parameters, and my communication parameters are the same on the OSA and in the program. Following advice from the forum, I tried the connection with NI-VISA and get the same error. I tried changing the timeout, write_termination and read_termination parameters, but nothing changes. The manual is very hard to follow on termination values, but when it worked before I didn't set any.
I don't know what else to try.
Any advice and help to fix my problem would be warmly welcome!
import pyvisa
from pyvisa.constants import StopBits, Parity

rm = pyvisa.ResourceManager()
print(rm.list_resources())

my_instrument = rm.open_resource('ASRL5::INSTR')
my_instrument.baud_rate = 9600
my_instrument.data_bits = 8
my_instrument.parity = Parity.even
my_instrument.stop_bits = StopBits.one

my_instrument.write('*IDN?')
print(my_instrument.read())
And here's the output:
('ASRL5::INSTR',)
Traceback (most recent call last):
File "PremiereComm.py", line 26, in <module>
print(my_instrument.read())
File "C:\Program Files\Python38\lib\site-packages\pyvisa\resources\messagebased.py", line 486, in read
message = self._read_raw().decode(enco)
File "C:\Program Files\Python38\lib\site-packages\pyvisa\resources\messagebased.py", line 442, in _read_raw
chunk, status = self.visalib.read(self.session, size)
File "C:\Program Files\Python38\lib\site-packages\pyvisa\ctwrapper\functions.py", line 2337, in read
ret = library.viRead(session, buffer, count, byref(return_count))
File "C:\Program Files\Python38\lib\site-packages\pyvisa\ctwrapper\highlevel.py", line 222, in _return_handler
return self.handle_return_value(session, ret_value) # type: ignore
File "C:\Program Files\Python38\lib\site-packages\pyvisa\highlevel.py", line 251, in handle_return_value
raise errors.VisaIOError(rv)
pyvisa.errors.VisaIOError: VI_ERROR_TMO (-1073807339): Timeout expired before operation completed.
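For reference, the timeout and termination settings mentioned above would typically be set explicitly like this. This is only a sketch: the '\r\n' terminator and the 10 s timeout are guesses that have to be checked against the MS9710B manual.
import pyvisa
from pyvisa.constants import StopBits, Parity

rm = pyvisa.ResourceManager()

# Serial settings matching the code above; the terminators are assumptions.
inst = rm.open_resource('ASRL5::INSTR',
                        baud_rate=9600,
                        data_bits=8,
                        parity=Parity.even,
                        stop_bits=StopBits.one)
inst.timeout = 10000              # milliseconds
inst.read_termination = '\r\n'    # guess; check the instrument manual
inst.write_termination = '\r\n'   # guess; check the instrument manual

# query() combines write('*IDN?') and read() in one call
print(inst.query('*IDN?'))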

EOFError when trying to connect Pyftpsync to remote server on port 22

I am trying to sync two folders via FTP. Yes, I know there are better or different ways, but for now I need to implement it this way. I was trying the example code from pyftpsync, since a sample code should work easily, right? I am just trying to connect between some test folders I made: one is empty (local) and the remote has a single text file that I want to fetch. It tries to connect, but after about two minutes I get the error below.
My FTP login does work outside of Python; I can connect over WinSCP just fine.
Some places mentioned that a proxy could possibly cause this. It seems I am not currently behind a proxy, but maybe I did not set that up properly and it believes there should be a proxy somehow?
Here is my code; running pyftpsync from the command prompt produces the same errors for me, so it is possible some input parameter is off and causing all of this.
import time
import os
import re
import shutil
import string
import sys

from ftpsync.targets import FsTarget
from ftpsync.ftp_target import FtpTarget
from ftpsync.synchronizers import DownloadSynchronizer

# Synchronize a local folder with FTP
local = FsTarget("C:\\testfolder\\")
user = "login"
passwd = "password"
remote = FtpTarget("/my/folder/location/testfold/", "126.0.0.1", port=22,
                   username=user, password=passwd, tls=False, timeout=None,
                   extra_opts=None)
opts = {}
s = DownloadSynchronizer(local, remote, opts)
s.run()
This is the output I am getting; I have edited out the folder names and IP addresses.
INFO:keyring.backend:Loading KWallet
INFO:keyring.backend:Loading SecretService
INFO:keyring.backend:Loading Windows
INFO:keyring.backend:Loading chainer
INFO:keyring.backend:Loading macOS
INFO:pyftpsync:Download to C:\testfolder
from ftp://126.0.0.1/.../testfold
INFO:pyftpsync:Encoding local: utf-8, remote: utf-8
Traceback (most recent call last):
File "c:\..\.py", line 30, in <module>
s.run()
File "C:\\AppData\Local\Programs\Python\Python37-32\lib\site-
packages\ftpsync\synchronizers.py", line 1268, in run
res = super(DownloadSynchronizer, self).run()
File "C:\\AppData\Local\Programs\Python\Python37-
32\lib\site-packages\ftpsync\synchronizers.py", line 827, in run
res = super(BiDirSynchronizer, self).run()
File "C:\\AppData\Local\Programs\Python\Python37-
32\lib\site-packages\ftpsync\synchronizers.py", line 211, in run
self.remote.open()
File "C:\\AppData\Local\Programs\Python\Python37-
32\lib\site-packages\ftpsync\ftp_target.py", line 141, in open
self.ftp.connect(self.host, self.port)
File "C:\\AppData\Local\Programs\Python\Python37-
32\lib\ftplib.py", line 155, in connect
self.welcome = self.getresp()
File "C:\\Local\Programs\Python\Python37-
32\lib\ftplib.py", line 236, in getresp
resp = self.getmultiline()
File "C:\\AppData\Local\Programs\Python\Python37-
32\lib\ftplib.py", line 226, in getmultiline
nextline = self.getline()
File "C:\\AppData\Local\Programs\Python\Python37-
32\lib\ftplib.py", line 210, in getline
raise EOFError
EOFError
Anyway, any troubleshooting ideas would help. Thank you.
Pyftpsync uses the FTP protocol.
You are connecting to port 22, which is used for SSH/SFTP.
So if your server is actually an SFTP server, not an FTP server, you cannot use Pyftpsync with it.
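If the server really is an SFTP server (which port 22 and the working WinSCP login suggest), a plain SFTP client is the way to go instead. Here is a minimal sketch with Paramiko, reusing the host, credentials and folder path from the question; the remote file name test.txt is only a placeholder:
import paramiko

transport = paramiko.Transport(("126.0.0.1", 22))
transport.connect(username="login", password="password")
sftp = paramiko.SFTPClient.from_transport(transport)

# Fetch the single text file from the remote test folder
sftp.get("/my/folder/location/testfold/test.txt", "C:\\testfolder\\test.txt")

sftp.close()
transport.close()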

Python ftplib cannot use STOR in callback function following RETR

Here is what I need to accomplish:
- connect to FTP
- get contents of test.txt
- write new contents into test.txt right after getting the results
In the real scenario I need to get the previous modification time, stored in a txt file, and then upload to FTP only those files which were modified after that time, without checking every file individually (there are thousands of them; that would take too long).
Here is where I'm stuck.
def continueTest(data, ftp):
    print(data, ftp)
    with open('test.txt', 'w+') as file:
        file.write('test')
    with open('test.txt', 'rb') as file:
        ftp.storbinary('STOR htdocs/test.txt', file)

def test():
    host_data = FTP_HOSTS['planz-norwegian']
    ftp = ftplib.FTP(host=host_data['server'],
                     user=host_data['username'],
                     passwd=host_data['password'])
    print('connected to ftp')
    ftp.retrbinary('RETR htdocs/test.txt', lambda data: continueTest(data, ftp))

if __name__ == '__main__':
    test()
This outputs:
connected to ftp
b'test' <ftplib.FTP object at 0x0322FAB0>
Traceback (most recent call last):
File "C:\Python33\Plan Z Editor SL\redistdb.py", line 111, in <module>
test()
File "C:\Python33\Plan Z Editor SL\redistdb.py", line 107, in test
ftp.retrbinary('RETR htdocs/test.txt', lambda data:continueTest(data, ftp))
File "C:\Python33\lib\ftplib.py", line 434, in retrbinary
callback(data)
File "C:\Python33\Plan Z Editor SL\redistdb.py", line 107, in <lambda>
ftp.retrbinary('RETR htdocs/test.txt', lambda data:continueTest(data, ftp))
File "C:\Python33\Plan Z Editor SL\redistdb.py", line 99, in continueTest
ftp.storbinary('STOR htdocs/test.txt', file)
File "C:\Python33\lib\ftplib.py", line 483, in storbinary
with self.transfercmd(cmd, rest) as conn:
File "C:\Python33\lib\ftplib.py", line 391, in transfercmd
return self.ntransfercmd(cmd, rest)[0]
File "C:\Python33\lib\ftplib.py", line 351, in ntransfercmd
host, port = self.makepasv()
File "C:\Python33\lib\ftplib.py", line 329, in makepasv
host, port = parse227(self.sendcmd('PASV'))
File "C:\Python33\lib\ftplib.py", line 873, in parse227
raise error_reply(resp)
ftplib.error_reply: 200 Type set to I.
If I don't use STOR in a callback, everything works fine. But then, how am I supposed to get the data from the RETR command?
I know possible solutions, but I'm sure there must be a more elegant one:
- use urllib.request instead of RETR (what if there's no HTTP on the server?)
- reinitialize the FTP connection in the callback function (may be slower than expected because of waiting for the server to reconnect)
- use ftp.set_pasv(False) (the callback launches, but the script does not end and I cannot use ftp.quit() or ftp.close())
According to the documentation of retrbinary:
The callback function is called for each block of data received, with a single string argument giving the data block.
This suggests that the callback is called while the data connection to retrieve the file is still open and the RETR command is not yet completed. It is not possible with FTP to create a new data connection (in the same FTP session) while another is still active. Additionally, it looks like ftplib gets confused and treats the response to TYPE I as the response to PASV:
File "C:\Python33\lib\ftplib.py", line 873, in parse227
raise error_reply(resp)
ftplib.error_reply: 200 Type set to I.
What you should do instead is call STOR only after the RETR is completed, i.e. let the callback store everything in the file, and then open the file again only after retrbinary has returned.
But then, how am I supposed to get data from RETR command?
In your current callback you store the data inside a file and then you read the file. The callback should still store the data in the file but reading and calling STOR should be done outside the callback, right after retrbinary. You cannot RETR and STOR data in parallel.
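A minimal sketch of that rework, reusing the names from the code above (FTP_HOSTS and the htdocs/test.txt path come from the question); the key point is that storbinary is only issued after retrbinary has returned:
import ftplib

def test():
    host_data = FTP_HOSTS['planz-norwegian']
    ftp = ftplib.FTP(host=host_data['server'],
                     user=host_data['username'],
                     passwd=host_data['password'])
    print('connected to ftp')

    # RETR: the callback only appends blocks to the local file
    with open('test.txt', 'wb') as f:
        ftp.retrbinary('RETR htdocs/test.txt', f.write)

    # retrbinary has returned, so the data connection is closed and the
    # control connection is free again; now work with the downloaded data
    with open('test.txt', 'w') as f:
        f.write('test')

    # STOR: upload the new contents over the same FTP session
    with open('test.txt', 'rb') as f:
        ftp.storbinary('STOR htdocs/test.txt', f)

    ftp.quit()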

Uploading large files to proftpd through paramiko times out

I've set up a SFTP server using proftpd on my local machine. It works fine, except that it times out when uploading files larger than approximately 30000 characters.
Uploading from the command line through proftpd works without any problems, and using paramiko to upload to a different SFTP server also works. This leads me to think there is a bug specifically in the interaction between paramiko and proftpd.
I've made a small script to illustrate the problem:
import paramiko

transport = paramiko.Transport(('localhost', 2220))  # my proftpd SFTP port
transport.connect(username='x', password='x')
client = paramiko.SFTPClient.from_transport(transport)

with open('testimage.jpg') as f:  # 35241 characters
    content = f.read()

with client.open('testimage.jpg', 'w') as f:
    f.write(content)
My SFTP-specific proftpd configuration:
<IfModule mod_sftp.c>
<VirtualHost 0.0.0.0>
Include /etc/proftpd/conf.d
SFTPEngine on
SFTPLog /var/log/proftpd/sftp.log
Port 2220
SFTPHostKey /etc/ssh/ssh_host_rsa_key
SFTPHostKey /etc/ssh/ssh_host_dsa_key
SFTPAuthMethods password
SFTPCompression delayed
MaxLoginAttempts 3
</VirtualHost>
</IfModule>
After 10 minutes, the program exits and spits out this error:
Traceback (most recent call last):
File "ftptest.py", line 9, in <module>
f.write(content)
File "/Library/Python/2.7/site-packages/paramiko/file.py", line 330, in write
self._write_all(data)
File "/Library/Python/2.7/site-packages/paramiko/file.py", line 447, in _write_all
count = self._write(data)
File "/Library/Python/2.7/site-packages/paramiko/sftp_file.py", line 176, in _write
self._reqs.append(self.sftp._async_request(type(None), CMD_WRITE, self.handle, long(self._realpos), data[:chunk]))
File "/Library/Python/2.7/site-packages/paramiko/sftp_client.py", line 668, in _async_request
self._send_packet(t, msg)
File "/Library/Python/2.7/site-packages/paramiko/sftp.py", line 170, in _send_packet
self._write_all(out)
File "/Library/Python/2.7/site-packages/paramiko/sftp.py", line 135, in _write_all
raise EOFError()
EOFError
Using paramiko 1.15 and proftpd 1.3.5
The default window size of 4 GB was too big for paramiko, causing data transfer to stall.
The issue was resolved by adding this to the proftpd SFTP configuration:
SFTPClientMatch ".*" channelWindowSize 3999MB

Why does my pyftpdlib handler sometimes raise NoSectionError?

Hi, I am trying to create an FTP server, and to aid development I'm using pyftpdlib. What I want to do is perform some file operations when a user downloads a specific file, but sometimes it raises an exception and I don't really know why.
I wrote my own handler in pyftpdlib after this tutorial: http://code.google.com/p/pyftpdlib/wiki/Tutorial#3.8_-_Event_callbacks
But something sometimes goes terribly wrong when the user downloads the log file (which I intend to do some file operations on), and I don't really understand why. I have another class which basically reads from a configuration file, and the error message says it couldn't find 'FTP Section'. That's strange, because I clearly have that section in my configuration file, and sometimes it works perfectly.
Could this error appear because I have two "Connection" objects? That's the only guess I have, so I would be very glad if someone could explain what's going wrong. Here is the troubled code (never mind the file.name check; that was added very recently):
class ArchiveHandler(ftpserver.FTPHandler):
    def on_login(self, username):
        # do something when user logs in
        pass

    def on_logout(self, username):
        # do something when user logs out
        pass

    def on_file_sent(self, file):
        "What to do when the file the class is watching over has been retrieved"
        attr = Connection()
        if attr.getarchive() == 'true':
            t = datetime.now()
            if file.name == "log.log":
                try:
                    shutil.copy2(file, attr.getdir() + ".archive/" + str(t.strftime("%Y-%m-%d_%H:%M:%S") + '.log'))
                except OSError:
                    print 'Could not copy file'
                    raise
        if attr.getremain() == 'false':
            try:
                os.remove(file)
            except OSError:
                print 'Could not remove file'
                raise
The full source:
http://pastie.org/3552079
Source of the config-file:
http://pastie.org/3552085
EDIT (and of course the error):
[root]#85.230.122.159:40659 unhandled exception in instance <pyftpdlib.ftpserver.DTPHandler object at 0xb75f49ec>
Traceback (most recent call last):
File "/usr/lib/python2.6/asyncore.py", line 84, in write
obj.handle_write_event()
File "/usr/lib/python2.6/asyncore.py", line 435, in handle_write_event
self.handle_write()
File "/usr/lib/python2.6/asynchat.py", line 174, in handle_write
self.initiate_send()
File "/usr/lib/python2.6/asynchat.py", line 215, in initiate_send
self.handle_close()
File "/usr/local/lib/python2.6/dist-packages/pyftpdlib/ftpserver.py", line 1232, in handle_close
self.close()
File "/usr/local/lib/python2.6/dist-packages/pyftpdlib/ftpserver.py", line 1261, in close
self.cmd_channel.on_file_sent(filename)
File "ftp.py", line 87, in on_file_sent
File "ftp.py", line 12, in __init__
File "/usr/lib/python2.6/ConfigParser.py", line 311, in get
raise NoSectionError(section)
NoSectionError: No section: 'FTP Section'
The problem is in a part of the code you didn't include. The traceback says:
File "ftp.py", line 12, in __init__
File "/usr/lib/python2.6/ConfigParser.py", line 311, in get
raise NoSectionError(section)
NoSectionError: No section: 'FTP Section'
So from the first line, we know that whatever is on line 12 of ftp.py is the problem (since everything below that isn't our code, so we assume that it's correct).
Line 12 is this:
self.ip = config.get('FTP Section', 'hostname')
And the error message says "No section: 'FTP Section'".
From this we can assume there's an error in the config file (that it doesn't have a "FTP Section").
Are you pointing at the correct config file? Is it in the directory that you're running the script from? Being in the same folder as the script will not work; it must be in the folder that you run the script from.
I think this is the problem you're having, since according to the documentation:
If none of the named files exist, the ConfigParser instance will contain an empty dataset.
You can confirm this by trying to open the file.
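A quick way to confirm it is to check what ConfigParser actually managed to read: read() returns the list of files it parsed, so an empty list means the file was never found. A small sketch, assuming the file is named config.cfg as in the fix below:
import os
import ConfigParser

config = ConfigParser.RawConfigParser()
found = config.read('config.cfg')   # list of files successfully parsed

print 'Working directory:', os.getcwd()
print 'Files parsed:', found        # [] means the config file was not found
print 'Sections:', config.sections()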
The problem in this case was that reading the file left it open.
I changed to this and it's working much better:
config = ConfigParser.RawConfigParser()
fp = open('config.cfg')
config.readfp(fp)
And then when I'm finished reading in the constructor I add:
#Close the file
fp.close()
And voilà, you can create as many objects of the class as you want and it won't show any errors. :)
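An equivalent and slightly tidier variant (a sketch, using the same config.cfg name as above) is to let a with block close the file automatically:
import ConfigParser

config = ConfigParser.RawConfigParser()
with open('config.cfg') as fp:
    config.readfp(fp)   # fp is closed automatically when the block ends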
