I have a problem I can't solve in Python. Testing on Colab, everything works smoothly, but the same code doesn't work anywhere else. I need to let a user upload a .jpg file and then send it via FTP.
This is my code:
import os
from pathlib import Path
from ftplib import FTP

uploaded = files.upload()  # Colab's upload widget
dst = nameEx  # nameEx is the target file name, defined elsewhere
os.rename(list(uploaded.keys())[0], dst)

# send via ftp
file_path = Path(dst)
with FTP('ftp.site.com', 'User', 'Password') as ftp, open(file_path, 'rb') as file:
    ftp.storbinary(f'STOR {file_path.name}', file)
The error I get is:
uploaded = files.upload()
NameError: name 'files' is not defined
How can I fix this?
I expected an upload button to appear, as it does on Colab.
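Note that files.upload() comes from the google.colab module, which only exists inside Colab notebooks. A minimal sketch of a portable alternative, assuming the script runs locally and the user supplies a file path (the server and credentials below are placeholders):

from ftplib import FTP
from pathlib import Path

# Stand-in for Colab's upload button: ask for a local path instead.
file_path = Path(input("Path to a .jpg file: ").strip())

# Placeholder host and credentials; replace with real values.
with FTP('ftp.site.com', 'User', 'Password') as ftp, open(file_path, 'rb') as file:
    ftp.storbinary(f'STOR {file_path.name}', file)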
I'm trying to upload multiple MP4 files using the pysftp/paramiko Python libraries.
After uploading, the MP4 files are not playable, i.e. they are corrupted.
I compared the hashes of the files, and they are different.
If I upload just one file it works fine, but when uploading multiple files they get corrupted.
import os
import paramiko

mediaStorageLocation = './uploadDir/'

# myHostname, port, myUsername, myPassword are defined elsewhere
t = paramiko.Transport((myHostname, port))
t.connect(username=myUsername, password=myPassword)
sftp = paramiko.SFTPClient.from_transport(t)

myfiles = os.listdir('/mnt/dcim/upload/')
for __file in myfiles:
    if '.mp4' in __file:
        print("uploading - " + __file)
        sftp.put('/mnt/dcim/upload/' + __file, mediaStorageLocation + __file)
        print("finished uploading - " + __file)
sftp.close()
hash before upload = b752716f42a5b5046839ab60c33a3387203dcaa3
hash after upload = 644ae34545d850b80dfd5eb4d316b36b75dd3af4
Multiple files are uploaded, but the hash of every uploaded file is different from the original.
I can play the files just fine before uploading.
How can I fix this?
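It's hard to tell from the snippet alone why the hashes differ; paramiko's put() transfers in binary and, with the default confirm=True, stats the remote file afterwards and raises if its size differs from the bytes sent. A diagnostic sketch, assuming the same host, port, and credentials as above, that prints a local SHA-1 and the server-reported remote size per file to narrow down where the bytes change:

import hashlib
import os
import paramiko

def sha1_of(path):
    h = hashlib.sha1()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(65536), b''):
            h.update(chunk)
    return h.hexdigest()

# myHostname, port, myUsername, myPassword as in the question
t = paramiko.Transport((myHostname, port))
t.connect(username=myUsername, password=myPassword)
sftp = paramiko.SFTPClient.from_transport(t)

src_dir = '/mnt/dcim/upload/'
dst_dir = './uploadDir/'
for name in os.listdir(src_dir):
    if name.endswith('.mp4'):
        local = src_dir + name
        print(name, 'local sha1:', sha1_of(local))
        # confirm=True (the default) stats the remote file after the
        # transfer and raises if its size differs from the bytes sent
        attrs = sftp.put(local, dst_dir + name, confirm=True)
        print(name, 'remote size:', attrs.st_size, 'local size:', os.path.getsize(local))
sftp.close()
t.close()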
I'm using Flask to build a web application, and I want to upload a user-supplied file to Google Cloud Storage. I'm hosting the app on Heroku, and since I don't know how to save files on Heroku's ephemeral storage, I'm trying to use tempfile to store the file in a directory and then upload it from there.
When I try to do that, I get this error: PermissionError: [Errno 13] Permission denied: 'C:\\Users\\[MyName]\\AppData\\Local\\Temp\\tmpbpom7ull'
Here is the code I'm working with. If anyone has another way to upload a FileStorage object to Google Cloud Storage, or a way to access the saved file, that would be much appreciated!
# File is currently a "FileStorage" object from werkzeug, gotten by doing
# file = request.files["filename"]
tempdir = tempfile.mkdtemp()
file.name = filename
file.save(tempdir)
upload_blob(BUCKET_NAME,filename,filename)
Following up on yesterday's Flask: Could not authenticate question: instead of calling the Google Cloud Storage client directly, you can use werkzeug's FileStorage object as described in the Flask-GoogleStorage usage documentation.
Assuming you have a file hellofreddie.txt in the working directory:
hellofreddie.txt:
Hello Freddie!
You can then open it, create a FileStorage object, and use save on the Bucket object (files):
from datetime import timedelta
from flask import Flask
from flask_googlestorage import GoogleStorage, Bucket
from werkzeug.datastructures import FileStorage
import os

files = Bucket("files")
storage = GoogleStorage(files)

app = Flask(__name__)
app.config.update(
    GOOGLE_STORAGE_LOCAL_DEST=app.instance_path,
    GOOGLE_STORAGE_SIGNATURE={"expiration": timedelta(minutes=5)},
    GOOGLE_STORAGE_FILES_BUCKET=os.getenv("BUCKET"),
)
storage.init_app(app)

with app.app_context():
    with open("hellofreddie.txt", "rb") as f:
        file = FileStorage(f)
        filename = files.save(file)
After the code has run, you will see a UUID-named equivalent created in Cloud Storage.
You can use the storage browser or gsutil:
gsutil ls gs://${BUCKET}
gs://${BUCKET}/361ea9ea-5599-4ff2-84d1-3fe1a802ac08.txt
NOTE: I was unable to resolve an issue when trying to print either files.url(filename) or files.signed_url(filename). These methods correctly return the Cloud Storage object, but as PurePosixPath('f3745268-5c95-4c61-a892-09c0de556635.txt'). My Python naivete.
I've realized my error: I was trying to use file.save() with a folder rather than an actual file path. My code has been updated to:
tempdir = tempfile.mkdtemp()
file.name = filename
file.save(tempdir + "/" + filename)
upload_blob(BUCKET_NAME, tempdir + "/" + filename, filename)
Thanks to the question PermissionError: [Errno 13] Permission denied for the help.
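A related sketch, not from the original answer: the standard library's tempfile.TemporaryDirectory context manager cleans up after itself, which avoids leaving directories behind. It assumes the same file, filename, BUCKET_NAME, and upload_blob helper as in the question:

import os
import tempfile

# file is request.files["filename"], as in the question
with tempfile.TemporaryDirectory() as tempdir:
    local_path = os.path.join(tempdir, filename)
    file.save(local_path)  # save to a file path, not to the directory itself
    upload_blob(BUCKET_NAME, local_path, filename)
# tempdir and its contents are deleted when the block exits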
I'm following a simple YouTube tutorial on how to automatically upload files to S3 using Python, and I'm getting this error:
FileNotFoundError: [WinError 2] The system cannot find the file specified: 'age.csv'
This doesn't make sense to me, because the files are there. This is what my code looks like:
import os
import boto3

client = boto3.client('s3',
                      aws_access_key_id=access_key,
                      aws_secret_access_key=secret_access_key)

path = 'C:/Users/User/Desktop/python/projects/AWS-Data-Processing/example_data'

for file in os.listdir(path):
    upload_file_bucket = 'my-uploaded-data'
    print(file)
    if '.txt' in file:
        upload_file_key_txt = 'txt/' + str(file)
        client.upload_file(file, upload_file_bucket, upload_file_key_txt)
        print("txt")
    elif '.csv' in file:
        upload_file_key_csv = 'csv/' + str(file)
        client.upload_file(file, upload_file_bucket, upload_file_key_csv)
        print("csv")
When I comment out the line
client.upload_file(file, upload_file_bucket, upload_file_key_txt)
it prints out either "txt" or "csv". And when I reduce the loop to just listing the files:
for file in os.listdir(path):
    upload_file_bucket = 'my-uploaded-data'
    print(file)
it successfully prints out the file names. So I don't understand why I get an error saying the file doesn't exist when it does. It seems contradictory, and I need some help understanding this error.
I read a post suggesting I might need to install the AWS CLI, which I did, but it didn't help. I'm guessing the problem lies in the upload_file function, but I just don't understand how the file can be missing.
Any advice would be appreciated!
The upload_file function takes a full file path, not just a file name. It cannot guess your directory, so you need to prepend it or iterate over the files a different way.
Source: https://boto3.amazonaws.com/v1/documentation/api/latest/guide/s3-uploading-files.html
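For example, a minimal sketch of that fix, keeping the question's variable names and joining the directory onto each file name before calling upload_file:

import os
import boto3

# access_key and secret_access_key as defined in the question
client = boto3.client('s3',
                      aws_access_key_id=access_key,
                      aws_secret_access_key=secret_access_key)

path = 'C:/Users/User/Desktop/python/projects/AWS-Data-Processing/example_data'
upload_file_bucket = 'my-uploaded-data'

for file in os.listdir(path):
    full_path = os.path.join(path, file)  # full path, not just the bare name
    if file.endswith('.txt'):
        client.upload_file(full_path, upload_file_bucket, 'txt/' + file)
    elif file.endswith('.csv'):
        client.upload_file(full_path, upload_file_bucket, 'csv/' + file)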
I'm trying to upload a whole folder to Dropbox, but only the files get uploaded. Should I create the folders programmatically, or is there a simpler way to handle folder uploads? Thanks.
import os
import dropbox

access_token = '***********************'
dbx = dropbox.Dropbox(access_token)

dropbox_destination = '/live'
local_directory = 'C:/Users/xoxo/Desktop/man'

for root, dirs, files in os.walk(local_directory):
    for filename in files:
        local_path = root + '/' + filename
        print("local_path", local_path)
        relative_path = os.path.relpath(local_path, local_directory)
        dropbox_path = dropbox_destination + '/' + relative_path
        # upload the file
        with open(local_path, 'rb') as f:
            dbx.files_upload(f.read(), dropbox_path)
error:
dropbox.exceptions.ApiError: ApiError('xxf84e5axxf86', UploadError('path', UploadWriteFailed(reason=WriteError('disallowed_name', None), upload_session_id='xxxxxxxxxxx')))
[Cross-linking for reference: https://www.dropboxforum.com/t5/API-support/UploadWriteFailed-reason-WriteError-disallowed-name-None/td-p/245765 ]
There are a few things to note here:
In your sample, you're only iterating over files, so you won't get dirs uploaded/created.
The /2/files/upload endpoint only accepts file uploads, not folders. If you want to create folders, use /2/files/create_folder_v2. You don't need to explicitly create folders for any parent folders in the path for files you upload via /2/files/upload though. Those will be automatically created with the upload.
Per the /2/files/upload documentation, disallowed_name means:
Dropbox will not save the file or folder because of its name.
So, it's likely you're getting this error because you're trying to upload an ignored file, e.g., ".DS_Store". You can find more information on those in this help article under "Ignored files".
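A rough sketch combining those points, assuming the same dbx, local_directory, and dropbox_destination as in the question (the IGNORED set below is an illustrative subset, not Dropbox's full ignored-files list):

import os
import dropbox

# dbx, local_directory, dropbox_destination as defined in the question
IGNORED = {'.DS_Store', 'Thumbs.db', 'desktop.ini'}  # illustrative subset only

for root, dirs, files in os.walk(local_directory):
    # create each subfolder explicitly (optional: /2/files/upload creates parents anyway)
    for d in dirs:
        relative = os.path.relpath(os.path.join(root, d), local_directory)
        folder_path = dropbox_destination + '/' + relative.replace(os.sep, '/')
        try:
            dbx.files_create_folder_v2(folder_path)
        except dropbox.exceptions.ApiError:
            pass  # folder may already exist
    for filename in files:
        if filename in IGNORED:
            continue  # skip names Dropbox disallows
        local_path = os.path.join(root, filename)
        relative = os.path.relpath(local_path, local_directory)
        dropbox_path = dropbox_destination + '/' + relative.replace(os.sep, '/')
        with open(local_path, 'rb') as f:
            dbx.files_upload(f.read(), dropbox_path)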
I'm trying to upload a file using FTP in Python, but I get an error saying:
ftplib.error_perm: 550 Filename invalid
when I run the following code:
from ftplib import FTP

ftp = FTP('xxx.xxx.x.xxx', 'MY_FTP', '')
ftp.cwd("/incoming")
file = open(r'c:\Automation\FTP_Files\MaxErrors1.json', 'rb')
ftp.storbinary(r'STOR c:\Automation\FTP_Files\MaxErrors1.json', file)
ftp.close()
I've checked that the file exists in the location I specified. Does anyone know what might be causing the issue?
The problem is that the path c:\Automation\FTP_Files\MaxErrors1.json is not valid on the server. Instead, try just doing:
ftp.storbinary('STOR MaxErrors1.json', file)
The argument to STOR needs to be the destination file name, not the source path, so ftp.storbinary('STOR MaxErrors1.json', file) is what you want.
You should upload the file without an absolute path on the FTP server. For example:
import ftplib
session = ftplib.FTP('server.address.com','USERNAME','PASSWORD')
file = open('kitten.jpg','rb') # file to send
session.storbinary('STOR kitten.jpg', file) # send the file
file.close() # close file and FTP
session.quit()