Recently I made a Twitter bot with Tweepy for my own needs. To host it online 24x7, I first tried Heroku (I couldn't add a credit/debit card), then pythonanywhere.com (I don't know why, but the console kept shutting down after a day), then repl.it (the API keys were returning None from the .env file, though everything ran fine on my own machine), and finally WayScript.com.
Here I encountered a new file type called .secrets. I had been using .env to store all my keys, but they ask you to save those credentials in the .secrets file instead. They do also provide the .env file type, but they say it is preferable to put credentials in the .secrets file, since they are then saved as encrypted strings. Here is the screen view -
Now, if I use a .env file, I can easily read the credentials using the code below -
import os
Secret_Key = os.environ['key']
But if I use a .secrets file, how do I read the credentials?
I appreciate any help you can provide.
You should use ws.environment to read values stored inside the .secrets file. For example, to retrieve your API key:
api_key = ws.environment['API_KEY']
I've been using PyDrive for Google Drive automation and it works perfectly locally. I plan to move the code to a remote shared machine, which means I'll need to move the secrets too. I am using LoadCredentialsFile and passing in credentials.json. I don't think my issue is with my own code, though, but rather with the PyDrive code.
In short: I want to change the function in the PyDrive module that normally fetches client_secret and client_id from the credentials.json file so that it fetches them from Vault instead. I could then delete client_id and client_secret from credentials.json (for security reasons on the shared machine) and just fetch them into memory when the Python script executes.
The problem I am having is this: I can delete client_secret and client_id from credentials.json and hard-code them in PyDrive's client.py, in class OAuth2Credentials(Credentials) under the from_json() function, where the secrets seem to be fetched from the JSON file. So instead of this inside that function:
data['access_token'],
data['client_id'],
data['client_secret'],
data['refresh_token'],
I could instead do this (I have tried it and it works):
data['access_token'],
"myhardcodedclient_id.apps.googleusercontent.com",
"myhardcodedclient_secret",
data['refresh_token'],
And then I could (I haven't tried it yet) replace those hard-coded values with functions that fetch the secrets from Vault (HashiCorp Vault). Example:
data['access_token'],
get_client_id(),
get_client_secret(),
data['refresh_token'],
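Those helpers could be sketched roughly like this; environment variables stand in here for the actual Vault lookup (e.g. via the hvac client), and all names are illustrative:

```python
import os

def get_client_id():
    # Stand-in for a HashiCorp Vault lookup (e.g. with the hvac client);
    # here the value is read from an environment variable instead.
    return os.environ["GDRIVE_CLIENT_ID"]

def get_client_secret():
    # Same stand-in approach for the client secret.
    return os.environ["GDRIVE_CLIENT_SECRET"]
```

In the real version, the function bodies would call the Vault API and return the stored values, so the secrets only ever live in memory.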
The problem, however, is that when the script runs, it uses the hard-coded values but also overwrites the existing credentials.json file (which no longer contains the secrets, because I deleted them manually) and writes the hard-coded values back into that JSON file. That defeats the whole point of using Vault (not exposing the client secret/ID to other users on the remote machine).
Am I overcomplicating this? I would post the PyDrive code, but there are 5000+ lines in client.py alone, so I'm sure it would be spam, and this isn't an issue with my own script (which works exactly as expected). If anyone has experience doing something similar, please help! Thank you!
I had connected to Google Sheets in Python using a JSON key file that was available on my computer. Now the key file is available online on Slack/Google Drive. How do I use it so that everyone in my org can make changes to the Python code linked to a particular spreadsheet?
This was the code before:
credentials = ServiceAccountCredentials.from_json_keyfile_name(
r"C:\Users\nilad\OneDrive\Desktop\creds.json", scope)
This worked fine.
The code now:
credentials = ServiceAccountCredentials.from_json_keyfile_name(
"https://layerfive.slack.com/files/UKVDY5RTN/F02CRDN636C/layerfiveintegrationmonitoring-c47eacd6d461.json", scope)
This throws an error about the path to the JSON file.
The credentials file needs to be downloaded to the local disk. You can't link to an online version of the file; that is not supported.
If you want other people to edit the code and test it, they will also need a copy of the credentials file.
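Once each person has downloaded their own copy, the call might look like this sketch (the filename and location are just examples; ServiceAccountCredentials is the oauth2client class from the question):

```python
import os

# Each user downloads creds.json and keeps it next to the script; building
# the path relative to this file avoids depending on the current working
# directory.
creds_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), "creds.json")

# credentials = ServiceAccountCredentials.from_json_keyfile_name(creds_path, scope)
```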
I have a Python app that calls a web API to collect some data, processes it, and displays it on a front end. Unfortunately, the credentials needed to access the API also give access to a lot of sensitive information.
I was therefore wondering whether there is any way to connect to the API that doesn't let people with access to the Python code reconstruct the credentials. I have found some posts about encrypting the data outside the app and decrypting it inside the app, but it seems to me that this would require revealing the decryption method in the Python code, which means anyone could simply reconstruct the credentials.
Use a .env file with, for example, the python-dotenv library.
mycode.py file:
from dotenv import load_dotenv
import os

load_dotenv()
my_var = os.getenv("MYHIDDEN_VAR")
.env file:
MYHIDDEN_VAR='REAL VALUE'
.env.example file:
MYHIDDEN_VAR='OBSCURED VALUE'
I have my data on Google Cloud Platform and I want to be able to download it locally; this is my first time trying that, and eventually I'll use the downloaded data in my Python code.
I have checked the docs, like https://cloud.google.com/genomics/downloading-credentials-for-api-access and https://cloud.google.com/storage/docs/cloud-console. I successfully got the JSON file from the first link; the second one is where I'm struggling. I'm using Python 3.5, and assuming my JSON file's name is data.json, I added the following code:
os.environ["file"] = "data.json"
urllib.request.urlopen('https://storage.googleapis.com/[bucket_name]/[filename]')
First of all, I don't even know what the key next to environ should be called, so I just called it file; I'm not sure how I'm supposed to fill it in. And I got "access denied" on the second line. Obviously that's not how to download my file, as there is no local destination directory or anything in that call. Any guidance would be appreciated.
Edit:
from google.cloud.storage import Blob
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "credentials/client_secret.json"
storage_client = storage.Client.from_service_account_json('service_account.json')
client = storage.Client(project='my-project')
bucket = client.get_bucket('my-bucket')
blob = Blob('path/to/my-object', bucket)
download_to_filename('local/path/to/my-file')
I'm getting "unresolved reference" for storage and download_to_filename, and should I replace service_account.json with credentials/client_secret.json? Also, I tried to print the contents of os.environ["GOOGLE_APPLICATION_CREDENTIALS"]['installed'] as I would with any JSON, but it just said indices should be numbers, meaning it read the input path as plain text only.
You should use the idiomatic Google Cloud client library to run operations on GCS.
Following the example there, and knowing that the client library picks up the application default credentials, we first have to set the application default credentials with
gcloud auth application-default login
===EDIT===
That was the old way. Now you should follow the instructions in this link.
This means downloading a service account key file from the console and setting the environment variable GOOGLE_APPLICATION_CREDENTIALS to the path of the downloaded JSON file.
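For example, on a Unix-like shell (the path here is a placeholder for wherever you saved the key file):

```shell
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/my-service-account.json"
```

On Windows you would use `set` in cmd (or the environment variables dialog) instead.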
Also, make sure that this service account has the proper permissions on the project of the bucket.
Alternatively, you can create the client with explicit credentials. You'll still need to download the key file, but when creating the client, use:
storage_client = storage.Client.from_service_account_json('service_account.json')
==========
And then, following the example code:
from google.cloud import storage
client = storage.Client(project='project-id')
bucket = client.get_bucket('bucket-id')
blob = storage.Blob('bucket/file/path', bucket)
blob.download_to_filename('/path/to/local/save')
Or, if this is a one-off download, just install the SDK and use gsutil to download:
gsutil cp gs://bucket/file .
I am building a simple app that uses the Twitter API. What do I have to do to hide my Twitter app keys? For example, if I put my program on the internet, anybody who looks at the code will know my consumer key, access token, etc. But if I don't include this information in my program, it won't work!
I'm assuming that by putting it on the internet you mean publishing your code on GitHub or similar.
In that case you should always separate code and configuration. Put your API keys in an .ini file, e.g. config.ini, then load that file from your Python program using configparser.
Add the configuration file to your .gitignore so it doesn't get added to source control.
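A minimal sketch, assuming a config.ini with a [twitter] section (the file is written inline here only so the example is self-contained; in the real app it would already exist and be listed in .gitignore):

```python
import configparser

# In the real app config.ini lives next to the code and is never committed;
# it is created here only to make the example runnable.
with open("config.ini", "w") as f:
    f.write("[twitter]\napi_key = yoursecretapikey\n")

config = configparser.ConfigParser()
config.read("config.ini")
api_key = config["twitter"]["api_key"]
print(api_key)  # → yoursecretapikey
```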
Assuming you're running on a Unix-like system, one way to handle this is environment variables.
In your shell you can do this:
export TWITTER_API_KEY=yoursecretapikey
Note that you don't use quotes of any kind for this.
Then in your script:
import os
twitter_key = os.environ.get('TWITTER_API_KEY')