I'm using a Django app with firebase-admin, and I have files and sub-folders (nested folders) as shown in the image.
Each folder has its own files and folders.
I want to get a list of all the folders and files inside each root folder.
My code is:
import firebase_admin
from firebase_admin import storage

service_account_key = 'mysak.json'
cred = firebase_admin.credentials.Certificate(service_account_key)
default_app = firebase_admin.initialize_app(cred, {
    'storageBucket': 'myBucketUrl'
})
bucket = storage.bucket()
blobs = list(bucket.list_blobs())  # this returns every object in the bucket, not just the folder I want
For example, I want all the files in first_stage/math so I can get a URL for each file.
I have also read the Firebase Storage docs and there is no such method.
Based on the documentation (List objects | Cloud Storage | Google Cloud), you can do something like:
bucket.list_blobs(prefix="first_stage/math")
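To then get a URL for each file, here is a minimal sketch building on the bucket object from the question (the prefix and the one-hour expiry are just examples):

from datetime import timedelta

# List only the objects under first_stage/math/ (the trailing slash keeps
# other prefixes such as first_stage/mathematics from matching).
blobs = bucket.list_blobs(prefix="first_stage/math/")

for blob in blobs:
    # Skip the zero-byte "folder" placeholder objects some tools create.
    if blob.name.endswith("/"):
        continue
    # Generate a temporary download URL (valid for 1 hour here).
    url = blob.generate_signed_url(expiration=timedelta(hours=1))
    print(blob.name, url)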
I have this code:
from google.colab import auth
auth.authenticate_user()

import gspread
from oauth2client.client import GoogleCredentials as GC
from gspread_dataframe import set_with_dataframe

gc = gspread.authorize(GC.get_application_default())

title = 'Sheet_name'
gc.create(title)
sheet = gc.open(title).sheet1
set_with_dataframe(sheet, aa)  # aa is my dataframe
It works, but I have a problem: I can't save it into a specific folder. It is always saved in the root of My Drive.
For example, I would like to save it to this folder:
/content/drive/My Drive/My sheets
Could you help me with how to do this, please?
I tried adding the folder path to the title to say where I want it saved, but it didn't work.
I thought that the folder_id argument could be used in your script. When this is reflected in your script, it becomes as follows.
In this case, please retrieve the folder ID from the "My sheets" folder in your Google Drive. The folder ID is the ### part of https://drive.google.com/drive/folders/###.
From:
gc.create(title)
To:
folder_id = '###'  # Please set the folder ID of the "My sheets" folder.
gc.create(title, folder_id)
or
folder_id = '###'  # Please set the folder ID of the "My sheets" folder.
gc.create(title, folder_id=folder_id)
Reference:
create(title, folder_id=None)
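Putting it together, the whole flow might look like this minimal sketch (assuming the Colab authentication from the question; the folder ID is a placeholder):

from google.colab import auth
auth.authenticate_user()

import gspread
from oauth2client.client import GoogleCredentials as GC
from gspread_dataframe import set_with_dataframe

gc = gspread.authorize(GC.get_application_default())

folder_id = '###'  # ID of the "My sheets" folder, taken from its Drive URL
title = 'Sheet_name'

spreadsheet = gc.create(title, folder_id=folder_id)  # created directly inside the folder
sheet = spreadsheet.sheet1  # create() returns the new Spreadsheet, so no need to re-open by title
set_with_dataframe(sheet, aa)  # aa is the dataframe from the question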
Here is my code.
Any suggestions, using my code below, on how to upload only new files to Google Drive, leaving out the files that have already been uploaded, using Python?
I want to upload files to Google Drive whenever a new file arrives in a folder.
import os

from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

gauth = GoogleAuth()
gauth.LocalWebserverAuth()  # authenticate via the browser
drive = GoogleDrive(gauth)

folder = '1aoonhCebvI5DddvJVGTzBvWs2BPn_yXN'
directory = "/Volumes/g2track/Shared/Egnyte Audit Reports/2022/Abhi"

for f in os.listdir(directory):
    filename = os.path.join(directory, f)
    gfile = drive.CreateFile({'parents': [{'id': folder}], 'title': f})
    gfile.SetContentFile(filename)
    gfile.Upload()
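One possible way to do this (a sketch, not from the original post): first list the titles that are already in the Drive folder, then skip any local file whose name is already there. This reuses the drive, folder, and directory variables from the code above.

# Titles of the files already present in the Drive folder.
existing = drive.ListFile(
    {'q': "'{}' in parents and trashed=false".format(folder)}).GetList()
existing_titles = {item['title'] for item in existing}

for f in os.listdir(directory):
    if f in existing_titles:
        continue  # already uploaded, skip it
    filename = os.path.join(directory, f)
    gfile = drive.CreateFile({'parents': [{'id': folder}], 'title': f})
    gfile.SetContentFile(filename)
    gfile.Upload()

Note that this matches on file names only; if a file can change after it has been uploaded, you would need to compare timestamps or checksums instead.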
I've got the exact same question as the one asked in this post: List files and folders in a Google Drive folder.
I can't figure out from the Google Drive REST API documentation how to get a list of the files in a Google Drive folder.
You can look here for an example of how to list files in Drive: https://developers.google.com/drive/api/v3/search-files. You need to construct a query that lists the files in a folder; use
q = "'1234' in parents"
where 1234 is the ID of the folder that you want to list. You can modify the query to list all the files of a particular type (such as all jpeg files in the folder), etc.
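For example, with the google-api-python-client library it might look like this sketch (assuming service is an already authorized Drive v3 client):

# service = googleapiclient.discovery.build('drive', 'v3', credentials=creds)
folder_id = '1234'  # replace with the ID of the folder you want to list

response = service.files().list(
    q="'{}' in parents and trashed=false".format(folder_id),
    fields="nextPageToken, files(id, name, mimeType)",
).execute()

for item in response.get('files', []):
    print(item['name'], item['id'])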
Here's a hacky-yet-successful solution. This actually gets all the files from a particular Google Drive folder (in this case, a folder called "thumbnails"). I needed to get (not just list) all the files from a particular folder and perform image adjustments on them, so I used this code:
import io

import cv2
import numpy as np
from googleapiclient.http import MediaIoBaseDownload

# First, get the folder ID by querying by mimeType and name
folderId = drive.files().list(q="mimeType = 'application/vnd.google-apps.folder' and name = 'thumbnails'", pageSize=10, fields="nextPageToken, files(id, name)").execute()
# this gives us a list of all folders with that name
folderIdResult = folderId.get('files', [])
# however, we know there is only 1 folder with that name, so we just get the id of the 1st item in the list
id = folderIdResult[0].get('id')

# Now, using the folder ID obtained above, we get all the files from
# that particular folder
results = drive.files().list(q="'" + id + "' in parents", pageSize=10, fields="nextPageToken, files(id, name)").execute()
items = results.get('files', [])

# Now we can loop through each file in that folder, and do whatever
# (in this case, download them and open them as images in OpenCV)
for f in range(0, len(items)):
    fId = items[f].get('id')
    fileRequest = drive.files().get_media(fileId=fId)
    fh = io.BytesIO()
    downloader = MediaIoBaseDownload(fh, fileRequest)
    done = False
    while done is False:
        status, done = downloader.next_chunk()
    fh.seek(0)
    fhContents = fh.read()
    baseImage = cv2.imdecode(np.frombuffer(fhContents, dtype=np.uint8), cv2.IMREAD_COLOR)
See the API for the available functions...
You can search for files with the Drive API files.list method. You can call files.list without any parameters, which returns all files on the user's Drive. By default, files.list only returns a subset of the properties for a resource. If you want more properties returned, use the fields parameter to specify which properties to return. To make your search more specific, you can combine several operators with each query property in the query string q.
# Import PyDrive and associated libraries.
# This only needs to be done once per notebook.
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials

# Authenticate and create the PyDrive client.
# This only needs to be done once per notebook.
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)

# List files whose title contains 'CV'.
#
# Search query reference:
# https://developers.google.com/drive/v2/web/search-parameters
listed = drive.ListFile({'q': "title contains 'CV'"}).GetList()
for file in listed:
    print('title {}, id {}'.format(file['title'], file['id']))
Easiest solution if you are working with Google Colab.
Connect to your Drive in the Colab notebook:
from google.colab import drive
drive.mount('/content/drive')
Use the '!' shell escape with the ls command to see the list of files in the Drive folder path you specify.
!ls PATH OF YOUR DRIVE FOLDER
Example: !ls drive/MyDrive/Folder1/Folder2/
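If you would rather stay in Python, the same mounted path also works with the standard library (the path below is just the example folder from above):

import os

folder_path = '/content/drive/MyDrive/Folder1/Folder2/'
for name in os.listdir(folder_path):
    print(name)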
I'm making a spreadsheet that I need to reset every single week. I'm trying to use the PyDrive API to make a copy of the file, which works perfectly with the code below.
My problem is that I can't specify which folder I want to save the copy in. I would like to have a folder called "archive" which contains all my backups. Is this possible using PyDrive, and if so, how? Thanks!
## Create a new document in Google Drive by copying an existing file
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

gauth = GoogleAuth()
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)

folder = "########"  # ID of the destination folder
title = "Copy of my other file"
file = "############"  # ID of the file to copy

drive.auth.service.files().copy(
    fileId=file,
    body={"parents": [{"kind": "drive#fileLink", "id": folder}],
          'title': title}).execute()
Thanks to the comment from @Rawing I figured it out: the destination folder is specified in the "folder" variable.