Google Drive: Upload and get a link through Python

I have a small startup that is growing a little now, and I'm trying to optimize some processes.
Every day I manually upload more than 100 PDFs to Google Drive, and then I create a shareable link for each one, one by one.
Is this possible to do through Python? I tried to find some information, and I found a lot about uploading but nothing about getting a shareable link.
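For what it's worth, here is a minimal sketch of what this could look like with the Drive API v3 and the google-api-python-client library: upload the file, add an "anyone with the link" permission, then read back the webViewLink. The upload_and_share helper name and the creds object are assumptions, not a ready-made solution:

    # Sketch only: assumes you already have authorized OAuth credentials
    # ("creds") with Drive scope and google-api-python-client installed.
    import os
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaFileUpload

    def upload_and_share(creds, path):
        service = build("drive", "v3", credentials=creds)

        # Upload the PDF and get its file id back.
        media = MediaFileUpload(path, mimetype="application/pdf")
        uploaded = service.files().create(
            body={"name": os.path.basename(path)},
            media_body=media,
            fields="id",
        ).execute()
        file_id = uploaded["id"]

        # Make the file readable by anyone who has the link.
        service.permissions().create(
            fileId=file_id,
            body={"type": "anyone", "role": "reader"},
        ).execute()

        # Fetch the shareable link.
        info = service.files().get(fileId=file_id, fields="webViewLink").execute()
        return info["webViewLink"]

Looping this helper over a directory of PDFs would give one shareable link per file, which could then be written to a CSV or spreadsheet.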

Related

Uploading a very large file to Google Drive using the Drive API with a Python script

I am looking to upload a very large zip file (several hundred GBs) from my remote server to my Google Drive using the Drive API v3. I tried following a tutorial at Towards Data Science, but it defers resumable uploads to the Drive API documentation, which isn't very beginner friendly. Other questions on this matter don't handle file sizes like mine, and they also don't mention the issue of keeping the access token valid for as long as the file is being uploaded. I also found another SO answer during my search, but it is again a "multi-part" upload method.
Any help would be appreciated. I am looking to automate this with a Python script.
Thanks in advance!
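A rough sketch of a resumable, chunked upload with google-api-python-client is below; the file name and chunk size are placeholders, and an authorized creds object is assumed. If the credentials include a refresh token, the client library renews the access token automatically between chunks, which addresses the token-expiry concern:

    # Sketch only: assumes authorized user credentials ("creds") with Drive scope.
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaFileUpload

    service = build("drive", "v3", credentials=creds)

    media = MediaFileUpload(
        "backup.zip",                    # hypothetical file name
        mimetype="application/zip",
        resumable=True,
        chunksize=256 * 1024 * 1024,     # upload in 256 MB chunks
    )
    request = service.files().create(body={"name": "backup.zip"}, media_body=media)

    # Send the file chunk by chunk; next_chunk() retries and resumes as needed.
    response = None
    while response is None:
        status, response = request.next_chunk()
        if status:
            print(f"Uploaded {int(status.progress() * 100)}%")

    print("File id:", response.get("id"))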

Restructure connected google drives

I want to reorganize a series of linked Google Drives. Currently, each drive contains files corresponding to a letter (A-Z), and I want to reorganize them so the files are organized by year instead. There is a massive amount of data in each drive, so it would take a lot of time to share and then copy files from one drive to another, and there are also many different file types. I've looked at some cloud transfer solutions, but if anyone knows whether this is feasible with the Drive API, please let me know. I've looked at the documentation, but I'm not sure how to apply it to a transfer this large.
Use the Google Drive API, specifically the files listing endpoint via the Google Python client library, to list the files page by page and store them in a file or database. Then sort them accordingly with Python or the database.
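A minimal sketch of that paginated listing with google-api-python-client, assuming an authorized Drive v3 service object; the list_all_files helper is just an illustrative name, and the collected rows could then be written to a database or CSV and grouped by year:

    # Sketch only: "service" is an authorized Drive v3 service object.
    def list_all_files(service):
        files, page_token = [], None
        while True:
            resp = service.files().list(
                fields="nextPageToken, files(id, name, mimeType, createdTime)",
                pageSize=1000,               # maximum page size for files.list
                pageToken=page_token,
            ).execute()
            files.extend(resp.get("files", []))
            page_token = resp.get("nextPageToken")
            if not page_token:
                return files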
To copy files between accounts, try rclone. It supports server-side copy (--drive-server-side-across-configs), which means the file doesn't have to be downloaded and re-uploaded locally; it is copied on the Google Drive side instead. This should be significantly faster.

Publish a .html file from Google Drive to WordPress

I'm using Google Colaboratory to create a series of plots with Plotly and saving them as .html files on Google Drive. I would like to publish those files on a WordPress site.
Furthermore, I will update these plots regularly, so every time I update them, they should also be updated on WordPress.
Is there a good way to implement this?
It seems that there isn't a super easy way to get the HTML document from Google Drive into WordPress, so the next best thing is to automatically pull the raw document from Drive and integrate it on a schedule. Based on this SO post, get the file id for your document (follow the steps here) to be able to access the raw data of your file. Note: you may want the download link instead, which can be found in the second link.
https://drive.google.com/uc?id=file_id
This may not work, as I've found there is a redirection, but you could also possibly use the redirected URL to grab the file. I haven't had much success, however (using curl and wget). After that, you should be able to upload it to WordPress, or have WordPress itself download it and embed it.
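A rough sketch of pulling the raw HTML with the requests library, assuming the file is shared "anyone with the link"; FILE_ID and the output file name are placeholders, and very large files may hit Drive's virus-scan confirmation page, which this does not handle:

    # Sketch only: works for small, publicly shared Drive files.
    import requests

    FILE_ID = "your_file_id_here"   # hypothetical placeholder
    url = f"https://drive.google.com/uc?id={FILE_ID}&export=download"

    resp = requests.get(url, allow_redirects=True)   # follow Drive's redirects
    resp.raise_for_status()

    with open("plot.html", "wb") as f:
        f.write(resp.content)
    # The saved file could then be pushed to WordPress, e.g. via its REST API.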

How do I work with large data in a web application?

I'm wrapping up a personal project that involved Flask, Python, and PythonAnywhere. I learned a lot, and now I have some new ideas for personal projects.
My next project involves processing video files and converting them into other file types, for example JPGs. When I drafted how my system could work, I quickly realized that the platform I'm currently using for web application hosting, PythonAnywhere, will be too expensive and perhaps even too slow, since I will be working with large files.
I searched around and found AWS S3 for file storage, but I'm having trouble finding out how I can operate on that data to do my conversions in Python. I definitely don't want to download from S3, operate on the data in PythonAnywhere, and then re-upload the converted files to a bucket. The project will be available for use on the internet, so I'm trying to make it as robust and scalable as possible.
I found it hard even to word this question, as I'm not sure I'm asking the right questions. I guess I'm looking for a way to manipulate large data files, preferably in Python, without having to work with the data locally, if that makes any sense.
I'm open to learning new technologies and am looking for some direction on how I might achieve this personal project.
Have you looked into AWS Elastic Transcoder?
Amazon Elastic Transcoder lets you convert media files that you have stored in Amazon Simple Storage Service (Amazon S3) into media files in the formats required by consumer playback devices. For example, you can convert large, high-quality digital media files into formats that users can play back on mobile devices, tablets, web browsers, and connected televisions.
Like all things AWS, there are SDKs (e.g. the Python SDK) that allow you to access the service programmatically.
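As a minimal sketch of what that could look like with boto3, assuming an Elastic Transcoder pipeline (wired to your S3 input and output buckets) and a preset already exist; the pipeline id, object keys, and region below are placeholders:

    # Sketch only: the pipeline id and keys are hypothetical placeholders.
    import boto3

    transcoder = boto3.client("elastictranscoder", region_name="us-east-1")

    job = transcoder.create_job(
        PipelineId="1111111111111-abcde1",           # hypothetical pipeline id
        Input={"Key": "uploads/source-video.mp4"},   # object key in the input bucket
        Output={
            "Key": "converted/output.mp4",
            "PresetId": "1351620000001-000010",      # a generic system preset
            "ThumbnailPattern": "thumbnails/frame-{count}",  # JPG thumbnails
        },
    )
    print("Job id:", job["Job"]["Id"])

The job runs entirely on the AWS side, so nothing has to be downloaded to PythonAnywhere; your app only submits the job and later reads the results from the output bucket.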

Automatically upload photos to a particular Google Photos album

I'm trying to automatically upload JPG photo files from a particular directory on my computer to a particular album on Google Photos. I'd like the photos to be pushed up to Google Photos periodically (every day or so is frequent enough). Google Photos Backup almost does what I want, but it just uploads the files -- it doesn't put them into a particular [pre-existing] album on Google Photos. It's possible that I can somehow use Google Drive and a simple cron job for this, although I don't know how. I am also considering the Picasa Web Albums API, but that feels like overkill and I'd like to avoid that work unless it's necessary. Are there any straightforward solutions to this?
Since, as you said, Google Photos Backup does the (upload) job, in my opinion the best approach is to use a Google Apps Script stored in your Google Drive, running periodically, to push each newly detected picture into a particular album.
If you need related documentation, take a look at the album class documentation and also https://developers.google.com/apps-script/
If you need to use another language to do the job (Python, JS, etc.), please specify which one and also give us more detail about your platform (Mac / Windows / Linux).
Use IFTTT for this; the Google Photos channel fits this purpose perfectly. https://ifttt.com/applets/DMgPS2uZ-back-up-new-android-photos-you-take-to-google-photos
