I'm trying to upload a file using FTP in Python, but I get an error saying:
ftplib.error_perm: 550 Filename invalid
when I run the following code:
from ftplib import FTP

ftp = FTP('xxx.xxx.x.xxx', 'MY_FTP', '')
ftp.cwd("/incoming")
file = open(r'c:\Automation\FTP_Files\MaxErrors1.json', 'rb')
ftp.storbinary(r'STOR c:\Automation\FTP_Files\MaxErrors1.json', file)
ftp.close()
I've checked that the file exists in the location I specified. Does anyone know what might be causing the issue?
The problem is that on the server, the path c:\Automation\FTP_Files\MaxErrors1.json is not valid. The argument to STOR needs to be the destination file name, not the source path, so instead try just:
ftp.storbinary('STOR MaxErrors1.json', file)
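Putting it together, a minimal corrected version of the snippet from the question (host, credentials, and local path kept as placeholders):
from ftplib import FTP

ftp = FTP('xxx.xxx.x.xxx', 'MY_FTP', '')
ftp.cwd('/incoming')
# give STOR only the destination file name, not the local source path
with open(r'c:\Automation\FTP_Files\MaxErrors1.json', 'rb') as file:
    ftp.storbinary('STOR MaxErrors1.json', file)
ftp.close()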
You should upload the file without an absolute path on the FTP server, for example:
import ftplib
session = ftplib.FTP('server.address.com', 'USERNAME', 'PASSWORD')
file = open('kitten.jpg', 'rb')              # file to send
session.storbinary('STOR kitten.jpg', file)  # send the file
file.close()                                 # close the file
session.quit()                               # close the FTP session
Related
I have a problem that I can't solve in Python. On Colab everything runs smoothly for me, but elsewhere it doesn't work. I need the user to upload a .jpg file, which I then send via FTP.
This is my code:
import os
from pathlib import Path
from ftplib import FTP

uploaded = files.upload()
dst = nameEx
os.rename(list(uploaded.keys())[0], dst)

# send via ftp
file_path = Path(dst)
with FTP('ftp.site.com', 'User', 'Password') as ftp, open(file_path, 'rb') as file:
    ftp.storbinary(f'STOR {file_path.name}', file)
The error returned to me is:
uploaded = files.upload()
NameError: name 'files' is not defined
How can I fix this? I expected an upload button, like on Colab.
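A hedged sketch of one workaround: files comes from the google.colab module, which exists only inside Colab, so outside Colab the name is never defined. The input() prompt below is a hypothetical stand-in for however the file path is obtained locally:
try:
    from google.colab import files  # only available inside Colab
    uploaded = files.upload()
    src = list(uploaded.keys())[0]
except ImportError:
    src = input('Path to the jpg file: ')  # hypothetical local fallback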
I'm following a simple tutorial on YouTube about how to automatically upload files to S3 using Python, and I'm getting this error:
FileNotFoundError: [WinError 2] The system cannot find the file specified: 'age.csv'
This does not make sense to me, because the files are there. My code looks like this:
import os
import boto3

client = boto3.client('s3',
                      aws_access_key_id=access_key,
                      aws_secret_access_key=secret_access_key)

path = 'C:/Users/User/Desktop/python/projects/AWS-Data-Processing/example_data'

for file in os.listdir(path):
    upload_file_bucket = 'my-uploaded-data'
    print(file)
    if '.txt' in file:
        upload_file_key_txt = 'txt/' + str(file)
        client.upload_file(file, upload_file_bucket, upload_file_key_txt)
        print("txt")
    elif '.csv' in file:
        upload_file_key_csv = 'csv/' + str(file)
        client.upload_file(file, upload_file_bucket, upload_file_key_csv)
        print("csv")
When I comment out the part where it says:
client.upload_file(file, upload_file_bucket, upload_file_key_txt)
it prints out either "txt" or "csv". And when I cut it down to just reading the files, like this:
for file in os.listdir(path):
    upload_file_bucket = 'my-uploaded-data'
    print(file)
then it successfully prints out the file names. So I don't understand why I get an error saying the file doesn't exist when it clearly does. It seems contradictory, and I need some help understanding this error.
I read a post suggesting I might need to install the AWS CLI, which I did, but it didn't help. I'm guessing the problem lies in the upload_file function, but I just don't understand how the file can be missing.
Any advice will be appreciated!
The upload_file function takes a full file path, not just a name. It cannot guess your directory, so you need to prepend it or use a different way of iterating over the files.
Source: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-uploading-files.html
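For example, a minimal sketch of the fix, reusing the variables from the question and joining the directory with the file name before the upload:
import os

for file in os.listdir(path):
    full_path = os.path.join(path, file)  # full path, not just the bare name
    if '.txt' in file:
        client.upload_file(full_path, upload_file_bucket, 'txt/' + file)
        print("txt")
    elif '.csv' in file:
        client.upload_file(full_path, upload_file_bucket, 'csv/' + file)
        print("csv")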
My problem is that locally my code works fine, but when I pushed it to the server, the file isn't created.
Here's the code:
import os

def write_binary_file(bfile, location):
    """Write binary file in the location"""
    try:
        with open(location, "wb+") as img_file:
            img_file.write(bfile)
    except IOError as err:
        ...  # error handling elided in the original post

file_url = os.path.join(settings.BASE_DIR, "dir", "dir", "dir", "dir", "user_img", filename + '.jpeg')
write_binary_file(bfile, file_url)
Differences between local and server:
- local OS is Windows
- server OS is Linux
I don't know if this matters or not, since I'm using os.path.join and os.path.sep to build the URL, and I'm getting the file URL without the first dir.
It worked on the server before, but one day, somehow, it stopped working and hasn't worked since.
- space left on the server: about 3 GB
- permissions on the directory: 775 (rwxrwxr-x)
Well, I figured out what the problem was: it was an nginx config that hides file paths on the server.
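For completeness, a defensive sketch (an assumption on my part, separate from the nginx fix): having write_binary_file create the target directory first rules out a missing directory as the cause of silently absent files:
import os

def write_binary_file(bfile, location):
    """Write binary file in the location, creating its directory if needed."""
    os.makedirs(os.path.dirname(location), exist_ok=True)  # assumption: a missing directory may be the culprit
    with open(location, "wb+") as img_file:
        img_file.write(bfile)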
I am using Paramiko to connect to an SFTP server from my local machine and download .txt files from a remote path. I am able to make a successful connection and can also print the remote path and the files, but I cannot get the files locally. I can print file_path and file_name, but I'm not able to download all the files. Below is the code I am using:
import paramiko

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname=hostname, username=username, password=password, port=port)

remotepath = '/home/blahblah'
pattern = '"*.txt"'
stdin, stdout, stderr = ssh.exec_command("find {remotepath} -name {pattern}".format(remotepath=remotepath, pattern=pattern))

ftp = ssh.open_sftp()
for file_path in stdout.readlines():
    file_name = file_path.split('/')[-1]
    print(file_path)
    print(file_name)
    ftp.get(file_path, "/home/mylocalpath/{file_name}".format(file_name=file_name))
I can see file_path and file_name from the print statements, as shown below, but I get an error when using ftp.get for multiple files. I can copy a single file by hardcoding the source and destination names.
file_path = '/home/blahblah/abc.txt'
file_name = 'abc.txt'
file_path = '/home/blahblah/def.txt'
file_name = 'def.txt'
I see that one file is downloaded, and then I get the following error:
Error trace:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "...anaconda3/lib/python3.6/site-packages/paramiko/sftp_client.py", line 769, in get
with open(localpath, 'wb') as fl:
FileNotFoundError: [Errno 2] No such file or directory: 'localpath/abc.txt\n'
readlines does not remove the newline from each line. So, as you can see in the traceback, you are trying to create a file named abc.txt\n, which is not possible on many file systems, and mainly, it's not what you want.
Trim the trailing newline from file_path:
for file_path in stdout.readlines():
    file_path = file_path.rstrip()
    file_name = file_path.split('/')[-1]
    # ...
Though you would have saved yourself a lot of trouble had you used a pure SFTP solution instead of hacking it by executing a remote find command (which is a very fragile solution, as hinted in the comments by @CharlesDuffy).
See List files on SFTP server matching wildcard in Python using Paramiko.
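For reference, a minimal sketch of that pure-SFTP approach, reusing the connected ssh client and the placeholder paths from the question, with fnmatch doing the wildcard matching client-side:
import fnmatch

sftp = ssh.open_sftp()
for file_name in sftp.listdir('/home/blahblah'):
    if fnmatch.fnmatch(file_name, '*.txt'):
        sftp.get('/home/blahblah/' + file_name, '/home/mylocalpath/' + file_name)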
Side note: Do not use AutoAddPolicy. You lose security by doing so. See Paramiko "Unknown Server".
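A sketch of the safer alternative, assuming the server's key is already present in your known_hosts file:
ssh = paramiko.SSHClient()
ssh.load_system_host_keys()  # trust only keys already recorded in ~/.ssh/known_hosts
ssh.connect(hostname=hostname, username=username, password=password, port=port)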
I have a ZIP file to upload, and I know how to upload it. I open the file in "rb" mode. When I try to extract the ZIP file after uploading it, I get an error and the files in the archive are gone; I think that's because of the "rb" mode. I don't know how to extract my uploaded file.
Here is the code:
filename="test.zip"
ftp=ftplib.FTP("ftp.test.com")
ftp.login('xxxx','xxxxx')
ftp.cwd("public_html/xxx")
myfile=open("filepath","rb")
ftp.storlines('STOR ' + filename,myfile)
ftp.quit()
ftp.close()
Your code is currently using ftp.storlines(), which is intended for use with ASCII files.
For binary files such as ZIP archives, you need to use ftp.storbinary() instead:
import ftplib

filename = "test.zip"

with open(filename, 'rb') as f_upload:
    ftp = ftplib.FTP("ftp.test.com")
    ftp.login('xxxx', 'xxxxx')
    ftp.cwd("public_html/xxx")
    ftp.storbinary('STOR ' + filename, f_upload)
    ftp.quit()
    ftp.close()
When ASCII mode is used on a ZIP file, the line-ending translation corrupts the binary data and results in an unusable archive, which is what you were getting.