I'm making a spreadsheet that I need to reset every week. I'm trying to use the PyDrive API to make a copy of the file, which works perfectly with the code below.
My problem is that I can't specify which folder I want to save the copy in. I would like to have a folder called "archive" that contains all my backups. Is this possible using PyDrive, and if so, how? Thanks!
## Create a copy of a file in Google Drive
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

gauth = GoogleAuth()
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)

folder = "########"  # ID of the destination folder
title = "Copy of my other file"
file = "############"  # ID of the file to copy

drive.auth.service.files().copy(
    fileId=file,
    body={"parents": [{"kind": "drive#fileLink", "id": folder}],
          "title": title}).execute()
Thanks to the comment from @Rawing I figured it out: the destination folder is specified in the "folder" variable.
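For completeness, here is a sketch (not from the original post) of how the "archive" folder's ID could be looked up by title before being dropped into the folder variable; only the folder name comes from the question, the rest follows the code above and assumes exactly one folder is named "archive":

# Look up the "archive" folder by title and use its ID as the copy destination.
archive_query = ("title = 'archive' and "
                 "mimeType = 'application/vnd.google-apps.folder' and "
                 "trashed = false")
archive_folders = drive.ListFile({'q': archive_query}).GetList()
folder = archive_folders[0]['id']  # assumes exactly one folder named "archive"

drive.auth.service.files().copy(
    fileId=file,
    body={"parents": [{"kind": "drive#fileLink", "id": folder}],
          "title": title}).execute()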
I have this code:
from google.colab import auth
auth.authenticate_user()

import gspread
from oauth2client.client import GoogleCredentials as GC
from gspread_dataframe import set_with_dataframe

creds = GC.get_application_default()  # default credentials for the Colab session
gc = gspread.authorize(creds)

title = 'Sheet_name'
gc.create(title)
sheet = gc.open(title).sheet1
set_with_dataframe(sheet, aa)  # aa is the DataFrame to write
It works, but I have a problem: I can't save the sheet into a specific folder. It is always saved in the root folder.
For example, I would like to save it to the folder:
/content/drive/My Drive/My sheets
Could you tell me how to do this, please?
I tried adding a path prefix to "say" where I want to save this sheet, but it didn't work.
In your script, the folder_id argument of gc.create can be used. When this is reflected in your script, it becomes as follows.
In this case, please retrieve the folder ID of the "My sheets" folder in your Google Drive. The folder ID is the ### part of https://drive.google.com/drive/folders/###.
From:
gc.create(title)
To:
folder_id = '###' # Please set the folder ID of "My sheets" folder.
gc.create(title, folder_id)
or
folder_id = '###' # Please set the folder ID of "My sheets" folder.
gc.create(title, folder_id=folder_id)
Reference:
create(title, folder_id=None)
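Putting it together, a minimal sketch of the Colab snippet with the folder ID applied, assuming a gspread version that supports the folder_id parameter (per the reference above); the '###' folder ID is a placeholder and aa is the DataFrame from the question:

import gspread
from google.colab import auth
from oauth2client.client import GoogleCredentials as GC
from gspread_dataframe import set_with_dataframe

auth.authenticate_user()
gc = gspread.authorize(GC.get_application_default())

folder_id = '###'  # placeholder: ID of the "My sheets" folder
title = 'Sheet_name'

spreadsheet = gc.create(title, folder_id=folder_id)  # created directly inside the folder
set_with_dataframe(spreadsheet.sheet1, aa)  # aa is the DataFrame to write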
Here is my code. Any suggestions on how to modify it so that it uploads only new files to Google Drive, leaving out files that have already been uploaded, using Python? I want to upload files to Google Drive whenever a new file arrives in the folder.
import os
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

gauth = GoogleAuth()
drive = GoogleDrive(gauth)

folder = '1aoonhCebvI5DddvJVGTzBvWs2BPn_yXN'
directory = "/Volumes/g2track/Shared/Egnyte Audit Reports/2022/Abhi"

for f in os.listdir(directory):
    filename = os.path.join(directory, f)
    gfile = drive.CreateFile({'parents': [{'id': folder}], 'title': f})
    gfile.SetContentFile(filename)
    gfile.Upload()
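One possible approach, sketched below and not taken from the original post: list the titles already present in the Drive folder and skip local files whose names are already there. This assumes the file name is a good enough de-duplication key; the folder ID and directory path are reused from the code above.

import os
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

gauth = GoogleAuth()
drive = GoogleDrive(gauth)

folder = '1aoonhCebvI5DddvJVGTzBvWs2BPn_yXN'
directory = "/Volumes/g2track/Shared/Egnyte Audit Reports/2022/Abhi"

# Titles of the files already stored in the Drive folder.
existing = {
    item['title']
    for item in drive.ListFile(
        {'q': "'%s' in parents and trashed=false" % folder}).GetList()
}

for f in os.listdir(directory):
    if f in existing:
        continue  # skip files uploaded on a previous run
    gfile = drive.CreateFile({'parents': [{'id': folder}], 'title': f})
    gfile.SetContentFile(os.path.join(directory, f))
    gfile.Upload()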
Is it possible to perform a "deep" copy of Google Drive files, so that the copied file doesn't point to the same file object as the original? I'd like to be able to copy a file and have the copy be completely independent of the original, such that any modifications that are made to the copy don't also show up in the original. Using the following code I'm able to:
Create a folder in Google Drive
Copy a file into the new folder
But the problem is that any changes that are made to the copy also show up in the original. I'd like for the copied file to be a completely independent file. Is this possible?
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
gauth = GoogleAuth()
#load previously generated credentials file
gauth.LoadCredentialsFile("mycreds3.txt")
drive = GoogleDrive(gauth)
#define ID of file to be copied
template_file_id = "1RQWYeeth-Ph ..."
#create a new folder to store the copied file
folder = drive.CreateFile({"title":"test_folder", 'mimeType': 'application/vnd.google-apps.folder'})
folder.Upload()
folder_id = folder['id']
#copy file into newly created folder
drive.auth.service.files().copy(
    fileId=template_file_id,
    body={'parents': [{"kind": 'drive#file', "id": folder_id}],
          'title': 'new_file_title'}).execute()
EDIT:
I was able to perform a deep copy by copying a shared file. When a file is copied from a shared file (which doesn't have a shortcut in Drive that links to the original), a deep copy is created such that modifications to the copied file don't show up in the original. Copying shared folders this way threw an error, but individual files worked just fine.
destination_folder_id = 'YTRCA18EE ...'
shared_files = drive.ListFile({'q': 'sharedWithMe'}).GetList()
for file in shared_files:
    drive.auth.service.files().copy(
        fileId=file['id'],
        body={'parents': [{"kind": 'drive#file', "id": destination_folder_id}],
              'title': file['title']}).execute()
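Since copying shared folders raised an error, one way to restrict the loop to individual files is to skip anything whose MIME type marks it as a folder. This filter is a sketch added here, not part of the original post; it otherwise follows the loop above.

for file in shared_files:
    if file['mimeType'] == 'application/vnd.google-apps.folder':
        continue  # shared folders can't be copied this way, so skip them
    drive.auth.service.files().copy(
        fileId=file['id'],
        body={'parents': [{"kind": 'drive#file', "id": destination_folder_id}],
              'title': file['title']}).execute()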
Let's take this step by step.
The way this library works is that all calls must go through a service. In this case, a drive service gives your application access to all the methods available in the Google Drive API.
drive_service = GoogleDrive(gauth)
You have named your variable drive; here it is called drive_service for consistency with the code below.
Creating a new file and uploading it to Google Drive is a two-part process. The first part is the file_metadata, that being the name and description of the file. The second is the media, the actual file data itself.
from googleapiclient.http import MediaFileUpload

file_metadata = {'name': 'photo.jpg'}
media = MediaFileUpload('files/photo.jpg', mimetype='image/jpeg')
file = drive_service.files().create(body=file_metadata,
                                    media_body=media,
                                    fields='id').execute()
print('File ID: %s' % file.get('id'))
Note: all fields='id' does is limit the response returned by the API to only the file ID.
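To tie this back to uploading into a particular folder: with this create-style call the destination is set by adding a parents list of folder IDs to the metadata (the Drive API v3 convention, assuming drive_service is a v3 files service as in the snippet above; the folder ID is a placeholder):

folder_id = 'your-folder-id'  # placeholder: ID of the destination folder
file_metadata = {
    'name': 'photo.jpg',
    'parents': [folder_id],  # in v3, parents is a plain list of folder IDs
}
media = MediaFileUpload('files/photo.jpg', mimetype='image/jpeg')
file = drive_service.files().create(body=file_metadata,
                                    media_body=media,
                                    fields='id').execute()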
I can't find a way to share Google Drive files after I upload them; currently I'm using Python and PyDrive. I'm not even sure if this is possible.
Sharing is a Google Drive option that allows other users to access a file (read, comment, or edit).
Here's my code:
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
gauth = GoogleAuth()
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)
ufile = drive.CreateFile({'title': 'Hello.txt'})
ufile.SetContentString('Hello World!')
ufile.Upload()
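One way to share the uploaded file, sketched here rather than taken from the question, is to insert a permission on it, assuming a PyDrive version that exposes InsertPermission; this example makes the file readable by anyone with the link:

# Continue from the upload above: make the file readable by anyone with the link.
ufile.InsertPermission({
    'type': 'anyone',
    'value': 'anyone',
    'role': 'reader',
})
print(ufile['alternateLink'])  # shareable link from the file's metadata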
I'm using the PyDrive QuickStart script to list my Google Drive files.
Code:
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
gauth = GoogleAuth()
gauth.LocalWebserverAuth()
drive = GoogleDrive(gauth)
file_list = drive.ListFile({'q': "'root' in parents and trashed=false"}).GetList()
print(file_list)
I'm able to list my files normally, but I need to list and manage files from another public Drive URL (which is not my personal authenticated drive) from my already-authenticated GoogleDrive account, as if I were using the requests lib.
Any ideas how to do it?
1. You need to get the folder ID. You can find the ID in the URL of the folder. An example would be:
https://drive.google.com/open?id=0B-schRXnDFZeX0t0RnhQVXXXXXX (the part of the URL after the id=).
2. List the contents of a folder based on its ID. Given your code, replace file_list = ... with:
file_id = '<Your folder id here.>'
file_list = drive.ListFile({'q': "'%s' in parents and trashed=false" % file_id}).GetList()
If this does not work, you may have to add the remote folder to your Google Drive using the "Add to Drive" button in the top right corner of the shared folder when opened in a browser.
2.1 Creating a file in a folder can be done like so:
file_object = drive.CreateFile({
    "parents": [{"kind": "drive#fileLink",
                 "id": parent_id}],
    'title': file_name,
    # (Only!) If the new 'file' object is going to be a folder:
    'mimeType': "application/vnd.google-apps.folder"
})
file_object.Upload()
If this fails check whether you have write permissions to the folder.
2.2 Deleting/Trashing a file can be done with the updated version available from GitHub: pip install instructions, Delete/Trash/UnTrash documentation
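For reference, a minimal sketch of those calls on a file object, assuming a PyDrive version that includes them (per the documentation linked above); the file ID is a placeholder:

file_object = drive.CreateFile({'id': '<file id to manage>'})  # placeholder ID
file_object.Trash()    # move the file to the trash
file_object.UnTrash()  # restore it from the trash
file_object.Delete()   # permanently delete it, bypassing the trash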
Finally, there is a feature request for uploading to folders as described in 2.1, and for listing the files of a folder as described in 2. If you find the above does not work, you can add this as an issue / feature request to the repository.