EZSheets module won't open window to log into Google account - Python

I'm learning Python using Automate the Boring Stuff and I am currently on Chapter 14 which centers on using EZSheets to work with Google Sheets.
The book says to place the credentials-sheets.json file in the same folder as the Python scripts, so I placed it in the "\Python38\Scripts" folder.
The next step is to run import ezsheets, which should open a new browser window for me to log in to my Google account. Here is where I am stuck: importing EZSheets does nothing. I get no errors, but no window opens for me to log into my Google account.
All the code I used was:
import ezsheets

I just spent about the last two hours tackling this problem. As someone with only a thin layer of SQL and Python experience, I was very frustrated by this same problem. Here's what I did when I got stuck in a couple of places.
It might just be that you need to change the directory you are working in to "\Python38\Scripts" and then run the script, like you mentioned. If not, here's what I did next.
I followed the ezsheets docs, much like you, except that at first I skipped over the Google Python Quickstart guide linked in the docs. No problem; I circled back and worked through it.
"Setting up the Sample" was my next issue. First problem there: the ezsheets documentation says to name your credentials file "credentials-sheets.json", while the Google quickstart.py file looks for "credentials.json". You have to change that in the quickstart.py file (and quickstart.py should be in the same location as the credentials file; in your case, "\Python38\Scripts").
Look for this in quickstart.py:
flow = InstalledAppFlow.from_client_secrets_file(
    'credentials-sheets.json', SCOPES)
The above is what I changed in the quickstart.py file (to match the ezsheets documentation), and it worked for me.
Then I was getting a 400 error when running the quickstart. The browser window would open and give me this authorization error: "Error 400: redirect_uri_mismatch". It specified that I did not have the right redirect URL authorized. That stumped me for a while, but if you open your credentials for editing in the developer console, you can add the localhost address and the port number given in the error to the authorized redirect URIs.
Here comes the final issue with that solution. The quickstart specified port=0, which picks a random port, so the port changed every time I added the most recent number to the authorization. After adding 3-4 ports and having it change on me again, I saw that I needed to pin the port in the quickstart.py file like so.
creds = flow.run_local_server(port=8080)
After changing that in the quickstart.py file and adding the matching localhost URI to my "Authorized redirect URIs" in the developer console, I was able to finish this step and move on.
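Putting the two quickstart.py fixes together, the relevant section looks roughly like this. This is a sketch rather than the full quickstart: it assumes google-auth-oauthlib is installed, and the scope shown is the read-only one from Google's guide.

```python
def get_credentials():
    """Run the installed-app OAuth flow with both fixes applied:
    the ezsheets credentials filename and a pinned port."""
    # Imported inside the function so the sketch can be read even
    # without google-auth-oauthlib installed.
    from google_auth_oauthlib.flow import InstalledAppFlow

    # Read-only Sheets scope, as in Google's quickstart guide.
    SCOPES = ['https://www.googleapis.com/auth/spreadsheets.readonly']

    # Fix 1: use the filename the ezsheets docs specify,
    # not quickstart's default 'credentials.json'.
    flow = InstalledAppFlow.from_client_secrets_file(
        'credentials-sheets.json', SCOPES)

    # Fix 2: pin the port so the redirect URI stays stable;
    # port=0 would pick a random free port on every run.
    return flow.run_local_server(port=8080)
```

Calling get_credentials() opens the browser login window and returns the credentials once you approve access.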
I'm open to criticism if any of this is incorrect. Those two errors were missing from every answer I came across; hopefully this saves some people time while trying to set up ezsheets to sync with Google Drive and Google Sheets.

For the import command to work, you must first type python or python3 at your command prompt while inside the directory containing your credentials file. Once you have the Python prompt, the import ezsheets command will work.


How can I get the token-sheets.pickle and token-drive.pickle files to use ezsheets for Google Sheets?

I am trying to set up ezsheets for the use with Google Sheets. I followed the instructions from here https://ezsheets.readthedocs.io/en/latest/ and here https://automatetheboringstuff.com/2e/chapter14/
The setup process works quite differently on my computer: I could download the credentials-sheets.json file, but I still need to get the token-sheets.pickle and token-drive.pickle files. When I run import ezsheets, no browser window is opened as described in the setup instructions. Nothing happens.
Is there another way to download both files?
I followed the steps you referenced and managed to generate the files, but I ran into the same issue before figuring out the cause. The problem is that there are a few possible causes, and the script fails silently without telling you exactly what happened.
Here are a few suggestions:
First off you need to configure your OAuth Consent Screen. You won't be able to create the credentials without it.
Make sure that you have the right credentials file. To generate it you have to go to the Credentials page in the Cloud Console. The docs say that you need an OAuth Client ID. Make sure that you have chosen the correct app at the top.
Then you will be prompted to choose an application type. According to the docs you shared, the type should be "Other", but this is no longer available, so "Desktop app" is the best equivalent if you're just running a local script.
After that you can just choose a name and create the credentials. You will be prompted to download the file afterwards.
Check that the credentials-sheets.json file has that exact name.
Make sure that the credentials-sheets.json file is located in the same directory where you're running your Python script or console commands.
Check that you've enabled both the Sheets and Drive API in your GCP Project.
Python will try to set up a temporary server on http://localhost:8080/ to retrieve the pickle files. If another application is using port 8080, this step will also fail. In my case, a previously failed Python script was hanging on to that port.
To find and close the processes using port 8080 you can refer to this answer for Linux/Mac or this other answer for Windows. Just make sure that the process is not something you're currently using.
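You can also check whether something is already listening on port 8080 from Python itself; a minimal sketch using only the standard library:

```python
import socket

def port_in_use(port, host='localhost'):
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1)
        # connect_ex returns 0 on success, i.e. a server answered.
        return sock.connect_ex((host, port)) == 0

if port_in_use(8080):
    print('Port 8080 is busy; close that process before importing ezsheets.')
else:
    print('Port 8080 is free.')
```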
I just used the single import ezsheets command to get the files, so after getting token-sheets.pickle I had to run it again to get token-drive.pickle. After that, the library should detect that you already have both files.

Problems running code with ipynb in Google Drive

Sooo, this is more of a theoretical question. I have Python code that runs just fine in PyCharm but has stopped working in my .ipynb Drive file. It downloads and writes over a binary document.
The code fails when trying to open the downloaded file for writing, with the error message '[Errno 2] No such file or directory:'.
I'm not sure if it is involved in the problem, but these lines below used to run without warnings, but now google drive asks for my permission for them to run:
drive.mount('/content/drive')
drive.mount('/content/drive', force_remount=True)
As I said before, since the code runs without error in PyCharm, I was wondering if you guys know whether this is a "Google Drive new update" kind of thing, a "Jupyter notebook problem", or maybe something else?
EDIT: here's the message google drive sends me

Authentication issue when uploading video to YouTube using YouTube API and cron

I am trying to upload a video to YouTube each afternoon using the sample YouTube Python upload script from Google (see the Python code examples on developers.google.com; I don't have enough reputation to post more links...). I would like to run it as a cron job. I have created the client_secrets.json file and tested the script manually. It works fine when I run it by hand; however, when I run the script as a cron job I get the following error:
To make this sample run you will need to populate the
client_secrets.json file found at:
/usr/local/cron/scripts/client_secrets.json
with information from the Developers Console
https://console.developers.google.com/
For more information about the client_secrets.json file format, please
visit:
https://developers.google.com/api-client-library/python/guide/aaa_client_secrets
I've included the information in the JSON file already and the -oauth2.json file is also present in /usr/local/cron/scripts.
Is the issue that the cron job is running the script as root, and somehow the credentials in one of those two files are no longer valid? Any ideas how I can enable the upload with cron?
Cheers
James
Ok, so 7 months later I've come back to this cron issue. It turns out that the upload2youtube.py example file was hardcoded to look in the current directory for the client_secrets.json file. This explains why I could run it manually from the local directory but not from cron. I've put the full path in the example file, and this works fine now.
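An alternative to hardcoding the full path is to resolve it relative to the script itself; a sketch (the variable names here are illustrative, since the actual names in upload2youtube.py may differ):

```python
import os

# Resolve client_secrets.json relative to the script's own location,
# so it is found no matter which working directory cron starts in.
SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))
CLIENT_SECRETS_FILE = os.path.join(SCRIPT_DIR, 'client_secrets.json')
```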

Getting push to deploy to work, configuring "release pipeline"

So, since last week, git push origin master suddenly no longer works for "push to deploy". It does push the sources to the remote repository at Google, and the code is there, but it never deploys. Read about it here: GAE: Trouble with push to deploy
It seems things are changing over at Google and this week there is new stuff in the Google Developer Console, in the "Cloud Development/Releases" section; "Configure Release Pipeline"
There are three settings: the pipeline name, pipeline tasks, and then an optional setting to have deploy notifications sent by email.
I just enter a random name like "mydevpipeline", select "Deploy source only", and check the email box. But I just get this error: "Failed to create the pipeline." I also tried unchecking the email box; same error. I tried it over and over.
Nowhere to go from there...
Anyone been able to create this pipeline and get it all working?
It seems that this pipeline configuration must go through in order for push to deploy to work from now on. I haven't seen any news or notification about this change...
Fwiw, the documentation at https://developers.google.com/appengine/docs/push-to-deploy says nothing about pipelines. It's just outdated, I guess.
Update:
What do you know... I went on to configure this pipeline on the live GAE project (the one described above is the dev GAE project I'm using), and it worked. I could configure a pipeline fine. After that, I could once more push to deploy, though only on the live version so far. I might try creating a new dev project; existing projects seem to "break" from time to time. I have had similar problems before, and creating a new project does solve things now and then.
Google App Engine pipelines do not like the .gitignore file. Try whether it works without that file. It fixed the problem for me.
It took me a long time to get this working for PHP, after a lot of communication with Google it was finally revealed to me that in your app.yaml file you need to have a line that reads:
threadsafe: false
In order for the pipeline to successfully pick up and deploy your git push (I use SourceTree, but command-line git has the same end result), that line must be present. If it's omitted or set to true, the pipeline won't be able to deploy it.
I wanted to throw this answer on here in case anyone stumbles on this thread looking for help. One of my projects has "randomly broken": after 3 months of successfully using my release pipeline for multiple commits per day, it suddenly no longer deploys when I push, ultimately giving the extremely helpful error message "Unable to get deployment status", and now none of my changes can be applied to the live site. Copying the entire source code, changing the app name, and pushing to a new GAE project with a release pipeline works fine, but I need the original site to start working again.

Can't auto-start Python program on Raspberry Pi due to Google Calendar

I have written a python tkinter program which runs on my Raspberry Pi, which does a number of things, including interfacing with my google calendar (read only access). I can navigate to the directory it is in and run it there - it works fine.
I would like the program to start at boot, so I added it to the autostart file in /etc/xdg/lxsession/LXDE, as per advice from the web. However, it does not start at boot. So I tried running the line of code I put in that file manually, and I got this.
(code I run) python /home/blahblah/MyScript.py
WARNING: Please configure OAuth 2.0
To make this sample run you will need to download the client_secrets.json file and save it at:
/home/blahblah/client_secrets.json
The thing is, that file DOES exist. But for some reason the Google code doesn't realize this when I run the script from another directory.
How then can I get my script to run at bootup?
Figured this out now. It's tough not knowing whether it's a Python, Linux, or Google issue, but it was a Google one. I found that other people across the web have had issues with client_secrets.json as well; the solution is to find where its location is stored in the Python code and, instead of just the name of the file, include the full path as well, like this.
CLIENT_SECRETS = '/home/blahblahblah/client_secrets.json'
Then it all works fine - calling it from another folder and it starting on startup. :)
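A variant of the same fix that avoids hardcoding the home directory is to locate the file relative to the script itself (a sketch; it assumes client_secrets.json sits next to the script):

```python
from pathlib import Path

# Locate client_secrets.json next to this script rather than in the
# current working directory, so autostart and manual runs both work.
CLIENT_SECRETS = str(Path(__file__).resolve().parent / 'client_secrets.json')
```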
