I'm using Flask to build a web application and I want to upload a user-submitted file to Google Cloud Storage. I'm hosting the app on Heroku, and since I don't know how to save files on Heroku's ephemeral storage, I'm trying to use tempfile to write the file to a temporary directory and then upload it from there.
When I try to do that, I get this error: PermissionError: [Errno 13] Permission denied: 'C:\\Users\\[MyName]\\AppData\\Local\\Temp\\tmpbpom7ull'
Here is the code I'm working with. If anyone has another way to upload a FileStorage object to Google Cloud Storage, or a way to access the saved file, that would be very appreciated!
# File is currently a "FileStorage" object from werkzeug, gotten by doing
# file = request.files["filename"]
tempdir = tempfile.mkdtemp()
file.name = filename
file.save(tempdir)
upload_blob(BUCKET_NAME,filename,filename)
Following up on yesterday's Flask: Could not authenticate question, rather than using the Google Cloud Storage client directly, you can use werkzeug's FileStorage object as described in the Flask-GoogleStorage usage documentation:
Assuming you have a file hellofreddie.txt in the working directory:
hellofreddie.txt:
Hello Freddie!
You can then open it, create a FileStorage object, and save it to the Bucket object (files):
from datetime import timedelta
from flask import Flask
from flask_googlestorage import GoogleStorage, Bucket
from werkzeug.datastructures import FileStorage
import os
files = Bucket("files")
storage = GoogleStorage(files)
app = Flask(__name__)
app.config.update(
    GOOGLE_STORAGE_LOCAL_DEST = app.instance_path,
    GOOGLE_STORAGE_SIGNATURE = {"expiration": timedelta(minutes=5)},
    GOOGLE_STORAGE_FILES_BUCKET = os.getenv("BUCKET")
)
storage.init_app(app)
with app.app_context():
    with open("hellofreddie.txt", "rb") as f:
        file = FileStorage(f)
        filename = files.save(file)
After the code has run, you will see a UUID-named equivalent created in Cloud Storage.
You can use the storage browser or gsutil:
gsutil ls gs://${BUCKET}
gs://{BUCKET}/361ea9ea-5599-4ff2-84d1-3fe1a802ac08.txt
NOTE: I was unable to resolve an issue when trying to print either files.url(filename) or files.signed_url(filename). These methods correctly return the Cloud Storage object, but as PurePosixPath('f3745268-5c95-4c61-a892-09c0de556635.txt'). My Python naivete.
I've realized my error: I was trying to use file.save() with a folder rather than an actual file. My code has been updated to:
tempdir = tempfile.mkdtemp()
file.name = filename
file.save(tempdir + "/" + filename)
upload_blob(BUCKET_NAME,tempdir + "/" + filename,filename)
Thank you to the linked answer, PermissionError: [Errno 13] Permission denied, for the pointer.
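For reference, a temporary file isn't strictly required: the FileStorage stream can be uploaded directly with the google-cloud-storage client. A minimal sketch, assuming an existing bucket; the helper name upload_filestorage is illustrative, not part of the original code:
from google.cloud import storage

def upload_filestorage(bucket_name, file, blob_name):
    """Stream a werkzeug FileStorage object straight to Cloud Storage."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    blob = bucket.blob(blob_name)
    # FileStorage exposes a file-like .stream, so no tempfile is needed
    blob.upload_from_file(file.stream, content_type=file.content_type)

# e.g. inside the Flask view:
# file = request.files["filename"]
# upload_filestorage(BUCKET_NAME, file, file.filename)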
Here's my code for a very simple program:
import os, shutil
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
source_path = r"c:/users/x/appdata/roaming/medianxl/save"
destination_path = r"c:/users/x/desktop/backup_saves"
print("Contents being backed up:")
print(os.listdir(source_path))
destination = shutil.copytree(source_path, destination_path)
print("Contents successfully backed up to:", destination_path)
print("Now uploading backup saves to Google Drive...")
auth = GoogleAuth()
auth.LocalWebserverAuth()
drive = GoogleDrive(auth)
saves = drive.CreateFile()
saves.SetContentFile(r"c:/users/x/desktop/backup_saves")
saves.Upload()
So far I have no issues taking the folder from the AppData location and copying it to my desktop. Where I get the error in my title is when I try to upload that folder and its contents from my desktop to Google Drive using PyDrive.
Here's the output from the command window after running the program:
Contents being backed up:
['preferences.json', 'TSW', 'uhp_prettycolor.d2s', 'uhp_prettycolor.key', 'uhp_prettycolor.ma0', 'uhp_prettycolor.map']
Contents successfully backed up to: c:/users/x/desktop/backup_saves
Now uploading backup saves to Google Drive...
Your browser has been opened to visit:
url here
Authentication successful.
Traceback (most recent call last):
File "backup.py", line 21, in <module>
saves.SetContentFile(r"c:/users/x/desktop/backup_saves")
File "C:\Users\x\AppData\Local\Programs\Python\Python38-32\lib\site-packages\pydrive\files.py", line 169, in SetContentFile
self.content = open(filename, 'rb')
PermissionError: [Errno 13] Permission denied: 'c:/users/x/desktop/backup_saves'
I've tried running cmd as admin but am still getting the same permissions error. Any ideas?
You need to check whether your backup file is open; if it is, close it.
You can also try moving it to another disk (D: or E:) to test first.
FileNotFoundError when trying to read/access a file or a folder that exists in a bucket on Google Cloud by referencing gs://BUCKET_NAME/FolderName/.
I am using Python 3 as the kernel with a Jupyter notebook. I have a cluster configured in Google Cloud linked to a bucket. Whenever I try to read/upload a file I get the file-not-found error:
def get_files(bucketName):
    files = [f for f in listdir(localFolder) if
             isfile(join(localFolder, f))]
    for file in files:
        print("file path:", file)
get_files("agriculture-bucket-gl")
I should be able to access the folder contents or to reference any file that exists inside any folder in the bucket.
Error Message:
FileNotFoundError: [Errno 2] No such file or directory: 'gs://agriculture-bucket-gl/Data sets/'
You need to access the bucket using the storage library to get the file and then its content.
You may find this code template helpful.
from google.cloud import storage
# Instantiates a client
client = storage.Client()
bucket_name = 'your_bucket_name'
bucket = client.get_bucket(bucket_name)
blob = bucket.get_blob('route/to/file.txt')
downloaded_blob = blob.download_as_string()
print(downloaded_blob)
To add to the previous answers, the path in the error message FileNotFoundError: [Errno 2] No such file or directory: 'gs://agriculture-bucket-gl/Data sets/' also contains some issues. I'd try fixing the following:
The folder name "Data sets" has a space. I'd try a name without the space.
There is a / sign at the end of the path. The path should end without a slash.
If you want to access the file from storage:
from google.cloud import storage
bucket_name = 'your_bucket_name'
blob_path = 'storage/path/fileThatYouWantToAccess'
storage_client = storage.Client()
bucket = storage_client.get_bucket(bucket_name)
blob = bucket.blob(blob_path)
#this is optional if you want to download it to tmp folder
blob.download_to_filename('/tmp/fileThatYouWantToAccess')
I'm using the PyDrive QuickStart script to list my Google Drive files.
Code:
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
gauth = GoogleAuth()
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)
file_list = drive.ListFile({'q': "'root' in parents and trashed=false"}).GetList()
print(file_list)
I'm able to list my files normally, but I need to list and manage files from another, public Drive URL (which is not my personal authenticated Drive) from my already-authenticated GoogleDrive account, as if I were using the requests lib.
Any ideas how to do it?
1. You need to get the folder ID. You can find the ID in the URL of the folder. An example would be:
https://drive.google.com/open?id=0B-schRXnDFZeX0t0RnhQVXXXXXX (the part of the URL after the id=).
2. List contents of a folder based on ID. Given your code, replace file_list = ... with:
file_id = '<Your folder id here.>'
file_list = drive.ListFile({'q': "'%s' in parents and trashed=false" % file_id}).GetList()
If this does not work, you may have to add the remote folder to your Google Drive using the "Add to Drive" button in the top right corner of the shared folder when opened in a browser.
2.1 Creating a file in a folder can be done like so:
file_object = drive.CreateFile({
    "parents": [{"kind": "drive#fileLink",
                 "id": parent_id}],
    'title': file_name,
    # (Only!) If the new 'file' object is going to be a folder:
    'mimeType': "application/vnd.google-apps.folder"
})
file_object.Upload()
If this fails check whether you have write permissions to the folder.
2.2 Deleting/Trashing a file can be done with the updated version available from GitHub: pip install instructions, Delete/Trash/UnTrash documentation
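For illustration, a minimal sketch of those calls, assuming the updated PyDrive build from GitHub and reusing the authenticated drive object from above; the file ID is a placeholder:
# Assumes `drive` is the authenticated GoogleDrive instance created earlier
file_object = drive.CreateFile({'id': '<file id to remove>'})
file_object.Trash()    # move the file to the trash
file_object.UnTrash()  # restore it from the trash
file_object.Delete()   # permanently delete it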
Finally, there is a feature request covering uploading to folders, as described in 2.1, and listing the files of a folder, as described in 2. If you find the above not to work, you can add this as an issue / feature request to the repository.
I have successfully uploaded a single text file to Google Cloud Storage, but when I try to upload a whole folder, it gives a permission denied error.
filename = "d:/foldername" #here test1 is the folder.
Error:
Traceback (most recent call last):
File "test1.py", line 142, in <module>
upload()
File "test1.py", line 106, in upload
media = MediaFileUpload(filename, chunksize=CHUNKSIZE, resumable=True)
File "D:\jatin\Project\GAE_django\GCS_test\oauth2client\util.py", line 132, in positional_wrapper
return wrapped(*args, **kwargs)
File "D:\jatin\Project\GAE_django\GCS_test\apiclient\http.py", line 422, in __init__
fd = open(self._filename, 'rb')
IOError: [Errno 13] Permission denied: 'd:/foldername'
This works for me. Copy all content from a local directory to a specific bucket-name/full-path (recursive) in google cloud storage:
import glob
import os
from google.cloud import storage

def upload_local_directory_to_gcs(local_path, bucket, gcs_path):
    assert os.path.isdir(local_path)
    for local_file in glob.glob(local_path + '/**'):
        if not os.path.isfile(local_file):
            upload_local_directory_to_gcs(local_file, bucket, gcs_path + "/" + os.path.basename(local_file))
        else:
            remote_path = os.path.join(gcs_path, local_file[1 + len(local_path):])
            blob = bucket.blob(remote_path)
            blob.upload_from_filename(local_file)

upload_local_directory_to_gcs(local_path, bucket, BUCKET_FOLDER_DIR)
A version without a recursive function, and it works with 'top level files' (unlike the top answer):
import glob
import os
from google.cloud import storage

GCS_CLIENT = storage.Client()

def upload_from_directory(directory_path: str, dest_bucket_name: str, dest_blob_name: str):
    rel_paths = glob.glob(directory_path + '/**', recursive=True)
    bucket = GCS_CLIENT.get_bucket(dest_bucket_name)
    for local_file in rel_paths:
        remote_path = f'{dest_blob_name}/{"/".join(local_file.split(os.sep)[1:])}'
        if os.path.isfile(local_file):
            blob = bucket.blob(remote_path)
            blob.upload_from_filename(local_file)
A folder is a cataloging structure containing references to files and directories. The library will not accept a folder as an argument.
As far as I understand, your use case is to upload to GCS while preserving the local folder structure. To accomplish that you can use the os Python module and write a recursive function (e.g. process_folder) that takes a path as an argument; a minimal sketch is shown after the lists below. This logic can be used for the function:
Use os.listdir() method to get a list of objects within the source path (will return both files and folders).
Iterate over a list from step 1 to separate files from folders via os.path.isdir() method.
Iterate over files and upload them with adjusted path (e.g. path+ “/“ + file_name).
Iterate over folders making a recursive call (e.g. process_folder(path+folder_name)).
It’ll be necessary to work with two paths:
Real system path (e.g. “/Users/User/…/upload_folder/folder_name”) used with os module.
Virtual path for GCS file uploads (e.g. “upload”+”/“ + folder_name + ”/“ + file_name).
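A minimal sketch of that recursive approach, assuming the google-cloud-storage client; the bucket name, process_folder, and the "upload" prefix are illustrative assumptions:
import os
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("your-bucket-name")  # assumed bucket name

def process_folder(local_path, gcs_prefix):
    for entry in os.listdir(local_path):
        real_path = os.path.join(local_path, entry)   # real system path, used with the os module
        virtual_path = gcs_prefix + "/" + entry       # virtual path of the object in GCS
        if os.path.isdir(real_path):
            process_folder(real_path, virtual_path)   # recursive call for sub-folders
        else:
            bucket.blob(virtual_path).upload_from_filename(real_path)

process_folder("/Users/User/upload_folder", "upload")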
Don’t forget to implement exponential backoff referenced at [1] to deal with 500 errors. You can use a Drive SDK example at [2] as a reference.
[1] - https://developers.google.com/storage/docs/json_api/v1/how-tos/upload#exp-backoff
[2] - https://developers.google.com/drive/web/handle-errors
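A generic, library-agnostic sketch of the exponential backoff idea referenced at [1]; the helper name and retry policy are illustrative:
import random
import time

def call_with_backoff(request_fn, retriable=(Exception,), max_retries=5):
    """Call request_fn, retrying with exponential backoff plus jitter on failure."""
    for attempt in range(max_retries):
        try:
            return request_fn()
        except retriable:
            if attempt == max_retries - 1:
                raise
            # wait 1, 2, 4, 8... seconds plus random jitter before retrying
            time.sleep((2 ** attempt) + random.random())

# e.g. call_with_backoff(lambda: blob.upload_from_filename(local_file))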
I assume the bare filename = "D:\foldername" is not enough information about the source code. I'm also not sure this is even possible; via the web interface you can likewise just upload files, or create folders and then upload the files into them.
You could save the folder's name, then create it (I've never used Google App Engine, but I guess that should be possible) and then upload the contents to the new folder; a small sketch of this idea follows.
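A small sketch of that idea with the Cloud Storage client, keeping the folder name as an object prefix; the bucket name is an assumption for illustration:
import os
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("your-bucket-name")  # assumed bucket

folder = "d:/foldername"
prefix = os.path.basename(folder.rstrip("/"))   # "foldername" acts as the new "folder" in GCS
for name in os.listdir(folder):
    path = os.path.join(folder, name)
    if os.path.isfile(path):
        bucket.blob(prefix + "/" + name).upload_from_filename(path)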
Refer -
https://hackersandslackers.com/manage-files-in-google-cloud-storage-with-python/
from os import listdir
from os.path import isfile, join
...
def upload_files(bucketName):
    """Upload files to GCP bucket."""
    files = [f for f in listdir(localFolder) if isfile(join(localFolder, f))]
    for file in files:
        localFile = localFolder + file
        blob = bucket.blob(bucketFolder + file)
        blob.upload_from_filename(localFile)
    return f'Uploaded {files} to "{bucketName}" bucket.'
This solution can also be used on Windows systems. Simply provide the folder name to upload and the destination bucket name. Additionally, it can handle any level of subdirectories in a folder.
import os
from google.cloud import storage

storage_client = storage.Client()

def upload_files(bucketName, folderName):
    """Upload files to GCP bucket."""
    bucket = storage_client.get_bucket(bucketName)
    for path, subdirs, files in os.walk(folderName):
        for name in files:
            path_local = os.path.join(path, name)
            blob_path = path_local.replace('\\', '/')
            blob = bucket.blob(blob_path)
            blob.upload_from_filename(path_local)
Here is my recursive implementation. We need to create a file named gdrive_utils.py and write the following.
from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request
from apiclient.http import MediaFileUpload, MediaIoBaseDownload
import pickle
import glob
import os
# The following scopes are required for access to google drive.
# If modifying these scopes, delete the file token.pickle.
SCOPES = ['https://www.googleapis.com/auth/drive.metadata.readonly',
          'https://www.googleapis.com/auth/drive.metadata',
          'https://www.googleapis.com/auth/drive',
          'https://www.googleapis.com/auth/drive.file',
          'https://www.googleapis.com/auth/drive.appdata']
def get_gdrive_service():
    """
    Tries to authenticate using a token. If the token has expired or is not present, creates one.
    :return: Returns authenticated service object
    :rtype: object
    """
    creds = None
    # The file token.pickle stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'keys/client-secret.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)
    # return Google Drive API service
    return build('drive', 'v3', credentials=creds)
def createRemoteFolder(drive_service, folderName, parent_id):
    # Create a folder on Drive, returns the newly created folder's ID
    body = {
        'name': folderName,
        'mimeType': "application/vnd.google-apps.folder",
        'parents': [parent_id]
    }
    root_folder = drive_service.files().create(body=body, supportsAllDrives=True, fields='id').execute()
    return root_folder['id']
def upload_file(drive_service, file_location, parent_id):
    # Upload a file to Drive, returns the newly created file's ID
    body = {
        'name': os.path.split(file_location)[1],
        'parents': [parent_id]
    }
    media = MediaFileUpload(file_location,
                            resumable=True)
    file_details = drive_service.files().create(body=body,
                                                media_body=media,
                                                supportsAllDrives=True,
                                                fields='id').execute()
    return file_details['id']
def upload_file_recursively(g_drive_service, root, folder_id):
    files_list = glob.glob(root)
    if files_list:
        for file_contents in files_list:
            if os.path.isdir(file_contents):
                # create a new remote folder and recurse into it
                new_folder_id = createRemoteFolder(g_drive_service, os.path.split(file_contents)[1],
                                                   folder_id)
                upload_file_recursively(g_drive_service, os.path.join(file_contents, '*'), new_folder_id)
            else:
                # upload to the given folder id
                upload_file(g_drive_service, file_contents, folder_id)
After that, use the following:
import os
from gdrive_utils import createRemoteFolder, upload_file_recursively, get_gdrive_service
g_drive_service = get_gdrive_service()
FOLDER_ID_FOR_UPLOAD = "<replace with folder id where you want upload>"
main_folder_id = createRemoteFolder(g_drive_service, '<name_of_main_folder>', FOLDER_ID_FOR_UPLOAD)
And finally, use this:
upload_file_recursively(g_drive_service, os.path.join("<your_path_>", '*'), main_folder_id)
I just came across the gcsfs library, which also seems to aim at a nicer interface.
You could copy an entire directory into a gcs location like this:
import gcsfs

def upload_to_gcs(src_dir: str, gcs_dst: str):
    fs = gcsfs.GCSFileSystem()
    fs.put(src_dir, gcs_dst, recursive=True)
Another option is to use gsutil, the command-line tool for interacting with Google Cloud:
gsutil cp -r ./my/local/directory gs://my_gcp_bucket/foo/bar
The -r flag tells gsutil to copy recursively. See the gsutil documentation for details.
Invoking gsutil from Python can be done like this:
import subprocess
subprocess.check_call('gsutil cp -r ./my/local/directory gs://my_gcp_bucket/foo/bar', shell=True)