PyDrive Upload and Remove - python

I am new to the Google Drive API and am writing the simplest form of a script that automatically uploads an image from the local drive to Google Drive and then, once that image is uploaded, deletes the local copy. The following is what I have:
#%%
import os
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from googleapiclient.http import MediaFileUpload

g_login = GoogleAuth()
g_login.LocalWebserverAuth()
drive = GoogleDrive(g_login)
#%%
i = 1  # index of the image to upload; see the note about looping below
header = 'images/dice'
path = header + str(i) + '.png'
print(i)
file = drive.CreateFile()
file.SetContentFile(path)
file.Upload()
if file.uploaded:
    print("test")
    os.remove(path)
However, when attempting to delete the local copy, the following error occurs:
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'images/dice1.png'
I searched around, thinking it might be that SetContentFile(path) did not close the file after Upload(), but according to
https://gsuitedevs.github.io/PyDrive/docs/build/html/pydrive.html
it should close automatically after upload.
What am I overlooking here?
Note: in the end, I want to use a loop that goes through all the files within the directory.
This is the output:
1
test
---------------------------------------------------------------------------
PermissionError Traceback (most recent call last)
<ipython-input-21-2aeb578b5851> in <module>
      9 if file.uploaded:
     10     print("test")
---> 11     os.remove(path)
     12
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'images/dice1.png'

Even if PyDrive does not close it for you, from looking at PyDrive's code it looks like you can do something like this:
...
try:
    file.Upload()
finally:
    file.content.close()
if file.uploaded:
    ...
Could you give it a try and see if that helps?
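For reference, here is a minimal sketch of the whole flow with the directory loop from the note folded in. Treat it as untested; the images folder name is taken from the question's paths:

import os
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive

g_login = GoogleAuth()
g_login.LocalWebserverAuth()
drive = GoogleDrive(g_login)

for name in os.listdir('images'):
    local_path = os.path.join('images', name)
    file = drive.CreateFile({'title': name})
    file.SetContentFile(local_path)
    try:
        file.Upload()
    finally:
        file.content.close()  # release the handle PyDrive opened in SetContentFile
    if file.uploaded:
        os.remove(local_path)  # safe now that the handle is closed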

Related

Python: How to upload folder to Google Drive

I want to upload a local folder to Google Drive with Python.
Folder example
Top-level folder:
C:\Users\test\Documents\google drvie\test\
The folder I want to upload is named upload:
C:\Users\test\Documents\google drvie\test\upload
Inside the upload folder there is a further hierarchy of folders and files, and I want to upload all of them:
C:\Users\test\Documents\google drvie\test\upload\upload2
C:\Users\test\Documents\google drvie\test\upload\test.txt
C:\Users\test\Documents\google drvie\test\upload\upload2\uplpad3
C:\Users\test\Documents\google drvie\test\upload\upload2\uplpad3\test.txt
I tried it with the script below, but folders cannot be uploaded; only files can.
When the script reaches a folder, the following permission error is displayed.
If anyone knows what is wrong, I would appreciate your help.
Error contents:
GoogleDriveFile({'parents': [{'id': '1J8TXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX'}], 'title': 'upload'})
Traceback (most recent call last):
  File "c:\Users\test\Documents\google drvie\googledrive_file_up.py", line 43, in <module>
    f.SetContentFile(os.path.join(path,x))
  File "C:\Users\test\AppData\Roaming\Python\Python39\site-packages\pydrive\files.py", line 169, in SetContentFile
    self.content = open(filename, 'rb')
PermissionError: [Errno 13] Permission denied: 'C:\\Users\\test\\Documents\\google drvie\\test\\upload'
Reference page
How to Upload File to Google Drive using Python Script?
code
from pydrive.drive import GoogleDrive
from pydrive.auth import GoogleAuth
import os

# authenticate Google services
gauth = GoogleAuth()
# load credentials, or create empty credentials if none exist
gauth.LoadCredentialsFile("mycreds.txt")
if gauth.credentials is None:
    # no saved credentials: get an authorization code from the user via a local web server
    gauth.LocalWebserverAuth()
elif gauth.access_token_expired:
    # the access token does not exist or has expired: refresh it
    gauth.Refresh()
else:
    # credentials are valid: authorize directly
    gauth.Authorize()
# save credentials to a file in txt format
gauth.SaveCredentialsFile("mycreds.txt")

# authentication object for Google Drive
drive = GoogleDrive(gauth)

# local folder to upload from
path = r'C:\Users\test\Documents\google drvie\test'
# ID of the Drive folder to upload into
folder_id = '1J8TXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX'

# loop over each entry in the local folder
for x in os.listdir(path):
    # create a GoogleDriveFile object with the target parent folder
    f = drive.CreateFile({"parents": [{"id": folder_id}]})
    f['title'] = x
    print(f)
    # set the local file content and upload to Google Drive
    f.SetContentFile(os.path.join(path, x))
    print(f)
    f.Upload()
    print(f)
    f = None
Sorry for the inconvenience, but thank you in advance.
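The traceback shows what is going wrong: PyDrive's SetContentFile just calls open(filename, 'rb'), which fails when it is handed a directory such as upload. Drive cannot ingest a folder as file content; the usual approach is to create a folder entry with the Drive folder MIME type and recurse into it. A rough, untested sketch along those lines, reusing drive, path, and folder_id from the script above:

import os

FOLDER_MIME = 'application/vnd.google-apps.folder'

def upload_tree(local_path, parent_id):
    # mirror the local tree on Drive: folders become folder entries, files are uploaded
    for name in os.listdir(local_path):
        full = os.path.join(local_path, name)
        if os.path.isdir(full):
            folder = drive.CreateFile({'title': name,
                                       'mimeType': FOLDER_MIME,
                                       'parents': [{'id': parent_id}]})
            folder.Upload()                  # creates the folder and assigns its id
            upload_tree(full, folder['id'])  # recurse into the subfolder
        else:
            f = drive.CreateFile({'title': name,
                                  'parents': [{'id': parent_id}]})
            f.SetContentFile(full)
            f.Upload()

upload_tree(path, folder_id)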

How to copy a file to mapped network drive with python

I'm trying to copy files from a folder on my C drive to a mapped network drive (Z:) and am receiving an error using shutil.copy. Can someone please tell me where I'm going wrong? Thanks! Here's the code below, much of which is borrowed from another SO post:
import os, shutil, re
from os import path

start_file = "C:\\data\\"
end_file = "Z:\\test\\"

# keep only regular files from the source directory
file_list = [i for i in os.listdir(start_file) if path.isfile(path.join(start_file, i))]
for f in file_list:
    shutil.copy(path.join(start_file, f), end_file)
The exact error I'm getting is "Exception has occurred: FileNotFoundError
[Errno 2] No such file or directory" on the last line of that code block
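A likely cause of a FileNotFoundError on the shutil.copy line is that the destination folder Z:\test\ does not exist as the process sees it (mapped drives on Windows are per-session, so for example an elevated prompt may not see Z: at all). As a quick check, you could create the destination first; a minimal sketch:

import os

# create Z:\test\ if it is missing; raises if the Z: mapping itself is unavailable
os.makedirs(end_file, exist_ok=True)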

Flask: How to upload a user file into Google Cloud Storage using upload_blob

I'm using Flask to make a web application and I want to upload a user input file to Google Cloud Storage. I'm using Heroku to host my web app, and I don't know how to save files on Heroku's temporary storage, so I'm trying to use tempfile to store the file in a directory and then access the directory to upload the file.
When I try to do that, I get this error: PermissionError: [Errno 13] Permission denied: 'C:\\Users\\[MyName]\\AppData\\Local\\Temp\\tmpbpom7ull'
Here is the code I'm working with; if anyone has another way to upload a FileStorage object to Google Cloud Storage, or a way to access the saved file, that would be very appreciated!
# File is currently a "FileStorage" object from werkzeug, gotten by doing
# file = request.files["filename"]
tempdir = tempfile.mkdtemp()
file.name = filename
file.save(tempdir)
upload_blob(BUCKET_NAME,filename,filename)
Following up on yesterday's Flask: Could not authenticate question: rather than using the Google Cloud Storage client directly, you can use werkzeug's FileStorage object as described in the Flask-GoogleStorage usage documentation.
Assuming you have a file hellofreddie.txt in the working directory:
hellofreddie.txt:
Hello Freddie!
You can then open it, create a FileStorage object, and use save on the Bucket object (files):
from datetime import timedelta
from flask import Flask
from flask_googlestorage import GoogleStorage, Bucket
from werkzeug.datastructures import FileStorage
import os

files = Bucket("files")
storage = GoogleStorage(files)

app = Flask(__name__)
app.config.update(
    GOOGLE_STORAGE_LOCAL_DEST = app.instance_path,
    GOOGLE_STORAGE_SIGNATURE = {"expiration": timedelta(minutes=5)},
    GOOGLE_STORAGE_FILES_BUCKET = os.getenv("BUCKET")
)
storage.init_app(app)

with app.app_context():
    with open("hellofreddie.txt", "rb") as f:
        file = FileStorage(f)
        filename = files.save(file)
After the code has run, you will see a UUID-named equivalent created in Cloud Storage.
You can use the storage browser or gsutil:
gsutil ls gs://${BUCKET}
gs://{BUCKET}/361ea9ea-5599-4ff2-84d1-3fe1a802ac08.txt
NOTE I was unable to resolve an issue trying to print either files.url(filename) or files.signed_url(filename). These methods correctly return the Cloud Storage Object but as PurePosixPath('f3745268-5c95-4c61-a892-09c0de556635.txt'). My Python naivete.
I've realized my error: I was trying to use file.save() with a folder rather than an actual file. My code has been updated to:
tempdir = tempfile.mkdtemp()
file.name = filename
file.save(tempdir + "/" + filename)
upload_blob(BUCKET_NAME,tempdir + "/" + filename,filename)
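The same fix reads a little more portably with os.path.join, plus cleanup of the temporary directory afterwards (upload_blob, BUCKET_NAME, filename, and file are the names from the question):

import os
import shutil
import tempfile

tempdir = tempfile.mkdtemp()
local_path = os.path.join(tempdir, filename)  # a full file path, not just the directory
file.save(local_path)                         # FileStorage.save writes to that path
upload_blob(BUCKET_NAME, local_path, filename)
shutil.rmtree(tempdir)                        # remove the temporary directory when done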
Thanks to the question PermissionError: [Errno 13] Permission denied.

Python: PermissionError: [Errno 13] Permission denied with pydrive?

Here's my code for a very simple program:
import os, shutil
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
source_path = r"c:/users/x/appdata/roaming/medianxl/save"
destination_path = r"c:/users/x/desktop/backup_saves"
print("Contents being backed up:")
print(os.listdir(source_path))
destination = shutil.copytree(source_path, destination_path)
print("Contents successfully backed up to:", destination_path)
print("Now uploading backup saves to Google Drive...")
auth = GoogleAuth()
auth.LocalWebserverAuth()
drive = GoogleDrive(auth)
saves = drive.CreateFile()
saves.SetContentFile(r"c:/users/x/desktop/backup_saves")
saves.Upload()
So far I am having no issues taking the folder from the appdata location and copying it to my desktop. Where I get the error in the title is when I go to upload that folder and its contents from my desktop to Google Drive using PyDrive.
Here's the output from the command window after running the program:
Contents being backed up:
['preferences.json', 'TSW', 'uhp_prettycolor.d2s', 'uhp_prettycolor.key', 'uhp_prettycolor.ma0', 'uhp_prettycolor.map']
Contents successfully backed up to: c:/users/x/desktop/backup_saves
Now uploading backup saves to Google Drive...
Your browser has been opened to visit:
url here
Authentication successful.
Traceback (most recent call last):
  File "backup.py", line 21, in <module>
    saves.SetContentFile(r"c:/users/x/desktop/backup_saves")
  File "C:\Users\x\AppData\Local\Programs\Python\Python38-32\lib\site-packages\pydrive\files.py", line 169, in SetContentFile
    self.content = open(filename, 'rb')
PermissionError: [Errno 13] Permission denied: 'c:/users/x/desktop/backup_saves'
I've tried running cmd as admin but am still getting the same permissions error. Any ideas?
You need to check whether your backup file is open somewhere; if it is, close it.
You can also try moving it to another disk (D: or E:) to test first.
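Beyond that, the traceback points at the underlying cause: SetContentFile is handed a directory (backup_saves), and PyDrive simply does open(filename, 'rb') on it, which raises Errno 13 on Windows. SetContentFile needs a file path, so one workaround is to zip the folder and upload the archive instead; a rough sketch using the paths from the question:

import shutil

# pack the whole folder into backup_saves.zip next to it, then upload the single file
archive = shutil.make_archive(r"c:/users/x/desktop/backup_saves", "zip",
                              r"c:/users/x/desktop/backup_saves")
saves = drive.CreateFile({'title': 'backup_saves.zip'})
saves.SetContentFile(archive)
saves.Upload()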

Can't access directory Tensorflow Google Colab

Sorry, I'm new to TensorFlow 2.1 and Google Colab, and I don't understand why I get this error.
My code:
%tensorflow_version 2.x
import tensorflow as tf
from tensorflow import keras
print(tf.__version__)
import pathlib
import os
path_data_dir = tf.keras.utils.get_file(origin='https://www.kaggle.com/c/dogs-vs-cats/download/0iMGwZllApFLiU35zX78%2Fversions%2Fm5lLqMS0KLfxJUozn3gR%2Ffiles%2Ftrain.zip',fname='train',untar= True)
data_dir = pathlib.Path(path_data_dir)
entries = os.listdir(data_dir)
for entry in entries:
    print(entry)
And I have this error (I tried to mount a Google Drive folder and I have access):
FileNotFoundError                         Traceback (most recent call last)
<ipython-input-1-88f88035f225> in <module>()
     12 data_dir = pathlib.Path(path_data_dir)
     13
---> 14 entries = os.listdir(data_dir)
     15 for entry in entries:
     16     print(entry)
FileNotFoundError: [Errno 2] No such file or directory: '/root/.keras/datasets/train'
Thanks a lot for your help
Lily
I am assuming this is because of the different file system structure between a normal Linux machine and the runtime hosted by Google Colab.
As a workaround, pass the cache_dir='/content' argument to the get_file function, as follows: path_data_dir = tf.keras.utils.get_file(origin='https://www.kaggle.com/c/dogs-vs-cats/download/0iMGwZllApFLiU35zX78%2Fversions%2Fm5lLqMS0KLfxJUozn3gR%2Ffiles%2Ftrain.zip', fname='train', untar=True, cache_dir='/content')
Be aware that the returned value path_data_dir is the full path to the file, so the call os.listdir(data_dir) will fail, since data_dir points to a file and not a directory.
To fix this, change entries = os.listdir(data_dir) to entries = os.listdir(data_dir.parent).
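Putting both changes together, the corrected cell would look roughly like this (same Kaggle URL as in the question):

import os
import pathlib
import tensorflow as tf

path_data_dir = tf.keras.utils.get_file(
    origin='https://www.kaggle.com/c/dogs-vs-cats/download/0iMGwZllApFLiU35zX78%2Fversions%2Fm5lLqMS0KLfxJUozn3gR%2Ffiles%2Ftrain.zip',
    fname='train', untar=True, cache_dir='/content')
data_dir = pathlib.Path(path_data_dir).parent  # get_file returns the file path, so list its parent
for entry in os.listdir(data_dir):
    print(entry)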
I think this was simply a bad download link in the end... On Google Colab I couldn't inspect the downloaded file properly (because I can't browse folders), but I tried later on a computer and it really is just the link.
