Google AdWords API - refresh token - Python

I am in the process of setting up the Google AdWords API. Google has a fantastic guide (https://developers.google.com/adwords/api/docs/guides/start), except that one of the last steps is rather vague.
I have gotten to this step, pictured here (but from the link above).
I am instructed (for Python) to put the client ID and client secret into my own configuration file. All the other languages have specific files, included with their client libraries, that need to be edited (such as the PHP example below).
I have been working at this for the past 3 hours, and have tried googling and youtubing and reading through every piece of documentation I can find. All of them just say "add the ID and secret to your config file." I have no idea what that means, or how to do it. I've gone into my Python directory and found a file named "config.py", but have no idea how to add these credentials. There are a number of scripts on GitHub (that Google links to), one of them for generating a refresh token, like I want. I have no idea how to implement this, though.
https://github.com/googleads/googleads-python-lib/tree/master/examples/adwords/authentication
Thank you in advance for any insight into adding credentials to my python config file or otherwise generating a refresh token.

I found the answer.
In short, the config file is in a directory that was not included in the instructions. It is advisable to download the entire "googleads-python-lib" directory rather than just the "googleads" directory.
https://github.com/googleads/googleads-python-lib
The config file (googleads.yaml) is within this "googleads-python-lib" directory. I unzipped it into my python2.7/site-packages directory. There are variables in this config file ready to take your authentication credentials.
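For reference, the adwords section of googleads.yaml looks roughly like this (field names follow the library's template; the values are placeholders you fill in with your own credentials):

adwords:
  developer_token: INSERT_DEVELOPER_TOKEN_HERE
  client_id: INSERT_OAUTH2_CLIENT_ID_HERE
  client_secret: INSERT_CLIENT_SECRET_HERE
  refresh_token: INSERT_REFRESH_TOKEN_HERE

Once it is filled in, a script can pick it up with something like:

from googleads import adwords

# LoadFromStorage() reads googleads.yaml from your home directory by default;
# pass an explicit path if you keep the file somewhere else.
client = adwords.AdWordsClient.LoadFromStorage()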

Related

Achieving data persistence of a dynamic .json file with a Python web app in a public repository

A little info about the project: https://ahashplace.#PreferablyAFreeHost.something/ is a grid of query strings. When you make a request to the site with x, y, and color in the query string, and the sha256 hash of your query is lower than that of the current query in that x,y position, the site will save your query for that position.
So I was running on Heroku's free tier (a recently discontinued product) with an S3 bucket as backup. With Heroku private repos, I had the AWS credentials in plain text. I need S3 to keep an updated copy of a file (data.json) so that the app can fetch it instead of rolling back to its deployment version when the build re-deploys.
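(For reference, a minimal sketch of that S3 sync, assuming the AWS credentials are supplied as environment variables by the host rather than committed to the repo; the bucket name is a placeholder:)

import os
import boto3

# boto3 reads AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY from the environment,
# so nothing secret has to live in the public repo.
s3 = boto3.client("s3")
BUCKET = os.environ.get("DATA_BUCKET", "my-bucket")  # hypothetical bucket name

def restore_data():
    # On startup, pull the latest data.json instead of the deployed copy.
    s3.download_file(BUCKET, "data.json", "data.json")

def backup_data():
    # After accepting a new query, push the updated file back to S3.
    s3.upload_file("data.json", BUCKET, "data.json")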
I would like the build to be a public repository, for a couple of reasons: first, I want this to be open source, and second, I would prefer not to give out my GitHub credentials, even though Render will deploy from your own private repos on their hardware for free.
The question: How do I get my dynamic file to persist alongside an open source Python web app?
My thoughts so far...
It would be great if I could make whatever credentials I put out there work only from the host, based on something like a known static IP and subnet mask, but I'm not sure that would be secure or viable: the host might change my network address, and it could be spoofed by an attacker.
I've used MongoDB before and it didn't really have this problem, though I don't know why...
I could put the data in a public repo and privately run code that periodically fetches data from the app and pushes updates to the repo. The app could then look to the repo when the build starts up. This is messy, though.

SharePoint API: how to read a file having only a sharing link to it

I'm using the Python office365 library to access SharePoint documents. I don't know how to access, via the API, a file that has been shared with me through a sharing link. I need to get this file's content and, if possible, its metadata (last modified date). Could anyone help?
The user that I'm using has no access to this SharePoint folder other than a sharing link to a single file.
I tried many variations of the normal file-access API, both by hand and via the office365 library. I couldn't find a way to access a file when I have only a sharing link to it.
My sharing link looks like this:
https://[redacted].sharepoint.com/:x:/s/[redacted]/dir1/dir2/ESd0HkNNSbJMhQFavQsr9-4BNHC2rHSWsnbs3zRdjtZsC3g so there is not really a filename here, and I cannot read the contents of any folder via the API, because I get the error "Attempted to perform an unauthorized operation." Authentication goes fine (when I mistype the password, I get a different error).
According to my research and testing, you can use the following REST API to read a file (get its content):
https://xxxx.sharepoint.com/sites/xxx/_api/web/GetFolderByServerRelativeUrl('/sites/xxx/Library_Name/Folder Name')/Files('Document.docx')/$value
If you want the last modified date, you can use the following REST API to get the Modified field:
https://xxxx.sharepoint.com/sites/xxx/_api/web/lists/getbytitle('test_library')/Items?$select=Modified
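For what it's worth, here is a minimal sketch of both calls using the office365 library the question mentions (office365-rest-python-client); the site URL, credentials, and paths are placeholders, and the exact method names may vary between library versions:

from office365.runtime.auth.user_credential import UserCredential
from office365.sharepoint.client_context import ClientContext

site_url = "https://xxxx.sharepoint.com/sites/xxx"  # placeholder
ctx = ClientContext(site_url).with_credentials(
    UserCredential("user@xxxx.com", "password"))

# Download the file content (the $value endpoint above).
with open("Document.docx", "wb") as local_file:
    ctx.web.get_file_by_server_relative_url(
        "/sites/xxx/Library_Name/Folder Name/Document.docx"
    ).download(local_file).execute_query()

# Read the Modified field from the library's list items.
items = (ctx.web.lists.get_by_title("test_library")
         .items.select(["Modified"]).get().execute_query())
for item in items:
    print(item.properties["Modified"])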

Using Custom Libraries in Google Colab without Mounting Drive

I am using Google Colab and I would like to use my custom libraries / scripts that I have stored on my local machine. My current approach is the following:
# (Question 1)
from google.colab import drive
drive.mount("/content/gdrive")
# Annoying chain of granting access to Google Colab
# and entering the OAuth token.
And then I use:
# (Question 2)
!cp /content/gdrive/My\ Drive/awesome-project/*.py .
Question 1:
Is there a way to avoid the mounting of the drive entirely? Whenever the execution context changes (e.g. when I select "Hardware Acceleration = GPU", or when I wait an hour), I have to re-generate and re-enter the OAuth token.
Question 2:
Is there a way to sync files between my local machine and my Google Colab scripts more elegantly?
Partial (not very satisfying) answer regarding Question 1: I saw that one could install and use Dropbox. You can then hardcode the API key into the application, and mounting is done regardless of whether it is a new execution context. I wonder if a similar approach exists based on Google Drive as well.
Question 1.
Great question, and yes there is. I have been using this workaround, which is particularly useful if you are a researcher and want others to be able to re-run your code, or just want to 'colab'orate when working with larger datasets. The method below has worked well for us as a team, since there are challenges when each person keeps their own version of the datasets.
I have used this regularly on 30+ GB of image files downloaded and unzipped to the Colab runtime.
The file ID is in the link provided when you share a file from Google Drive.
You can also select multiple files, share them all, and then generate, for example, a .txt or .json file which you can parse to extract the file IDs.
from google_drive_downloader import GoogleDriveDownloader as gdd

# Some file ID (or one of a list of file IDs) parsed from the share URLs.
google_file_id = '1-4PbytN2awBviPS4Brrb4puhzFb555g2'
destination = 'dir/dir/fid'
# If the file is a zip archive, pass unzip=True.
gdd.download_file_from_google_drive(file_id=google_file_id,
                                    dest_path=destination,
                                    unzip=True)
A URL-parsing function to get file IDs from a list of URLs might look like this:
def parse_urls():
    with open('/dir/dir/files_urls.txt', 'r') as fb:
        txt = fb.readlines()
    return [url.split('/')[-2] for url in txt[0].split(',')]
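Putting the two together, a loop like this (destination path hypothetical) would fetch every file ID found in the list:

for fid in parse_urls():
    gdd.download_file_from_google_drive(file_id=fid,
                                        dest_path=f'data/{fid}.zip',
                                        unzip=True)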
One health warning: you can only repeat this a small number of times in a 24-hour window for the same files.
Here's the gdd git repo:
https://github.com/ndrplz/google-drive-downloader
Here is a working example (my own) of how it works inside a bigger script:
https://github.com/fdsig/image_utils
Question 2.
You can connect to a local runtime, but this also means using your local resources (GPU/CPU, etc.).
Really hope this helps :-).
F~
If your code isn't secret, you can use git to sync your local code to GitHub. Then, git clone it into Colab with no need for any authentication.
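For example, in a Colab cell (the repository URL is a placeholder):

# Clone the public repo into the Colab runtime and make it importable.
!git clone https://github.com/your-username/awesome-project.git
import sys
sys.path.append('/content/awesome-project')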

Python API example can't find the keys/docusign_private_key.txt file

I'm trying to use the Python API example in this page:
https://github.com/docusign/docusign-python-client
But the script can't configure the JWT token because it can't find the keys/docusign_private_key.txt file. Where is this file supposed to be?
You're supposed to have obtained a private key from DocuSign and saved it as "docusign_private_key.txt" on your computer. The example assumes that the file is inside a folder called "keys" located in your working directory. For example, you could make a docusign folder on your desktop your working directory, and inside it have a keys folder that contains the text file. Also make sure your key is saved in PEM format, as a lot of people seem to have issues with that as well. Good luck.
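As a rough illustration of the layout the example expects (paths are hypothetical):

import os

# Working directory, e.g. ~/Desktop/docusign, containing a keys/ subfolder:
#   docusign/
#   └── keys/
#       └── docusign_private_key.txt   (RSA private key in PEM format)
private_key_path = os.path.join(os.getcwd(), "keys", "docusign_private_key.txt")
with open(private_key_path, "rb") as key_file:
    private_key = key_file.read()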

Google cloud translate API - "Daily Limit Exceeded"

I'm writing a bit of Python using the Google Cloud API to translate some text.
I have set up billing on my account and it says it's active (with some credit added for the free trial). I created an application_default_credentials.json file with -
gcloud auth application-default login
Which asked me to log in to my account (I logged into the same account I set billing up on).
I then used -
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/home/theo/.config/gcloud/application_default_credentials.json"
at the start of my Python script. For the code, I followed the samples here - https://github.com/GoogleCloudPlatform/python-docs-samples/tree/master/translate/cloud-client
Yesterday the API wouldn't work, and I would receive "daily limit exceeded" even though I had not used it at all yet. Eventually I gave up and decided to sleep on it.
Tried again today and it was working, without my having to do anything. Great, I thought, it must just have taken a while to update my billing information.
But I've since translated a few things, maybe 10,000 characters, and I'm already receiving the same error message.
I did create a "Project" on the cloud console and have an api key from there. I'm not entirely sure how to use it because the documentation I linked above just uses the json credentials file. From what I've read online, using the json file is recommended over using a key now.
Any ideas about what I need to do?
Thanks.
Solved by creating a service account key at https://console.cloud.google.com/apis/credentials/serviceaccountkey instead of using the credentials created with the gcloud auth command.
After I referenced the generated JSON file from that page, it started working.
More info here - https://cloud.google.com/docs/authentication/getting-started
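A minimal sketch of what ended up working, assuming the service account key was downloaded as a JSON file (the path and filename are placeholders; with current versions of google-cloud-translate, the basic client lives in translate_v2):

import os

# Point the client at the service account key from the Credentials page
# rather than the application-default credentials from gcloud auth.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/home/theo/service-account-key.json"

from google.cloud import translate_v2 as translate

client = translate.Client()
result = client.translate("Bonjour le monde", target_language="en")
print(result["translatedText"])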
