For security purposes, I have disabled public access under the Networking tab in Key Vault and have a private endpoint in place. Both the Key Vault and the private endpoint reside in the same resource group. I have an app registration for my application, which I have granted access under Access policies in the Key Vault.
Using the Python SDK:
from azure.keyvault.secrets import SecretClient
from azure.identity import ClientSecretCredential as cs

keyVaultName = "<NAME>"
kvURI = "https://<NAME>.vault.azure.net"

AZ_TENANT_ID = '<AZ_TENANT_ID>'
AZ_CLIENT_ID = '<AZ_CLIENT_ID>'
AZ_CLIENT_SECRET = '<AZ_CLIENT_SECRET>'

credential = cs(
    tenant_id=AZ_TENANT_ID,
    client_id=AZ_CLIENT_ID,
    client_secret=AZ_CLIENT_SECRET)

def set_secret(secretname, secretvalue):
    print(credential)
    secret_client = SecretClient(vault_url=kvURI, credential=credential)
    secret = secret_client.set_secret(secretname, secretvalue, enabled=True)
    sec_dic = {}
    sec_dic['name'] = secret.name
    sec_dic['value'] = secret.value
    sec_dic['properties'] = secret.properties.version
    return sec_dic

xx = set_secret('g', 'ff')
print(xx)
When running this code, I get the following error:
azure.core.exceptions.HttpResponseError: (Forbidden) Public network access is disabled and request is not from a trusted service nor via an approved private link.
Code: Forbidden
Message: Public network access is disabled and request is not from a trusted service nor via an approved private link.
Inner error: {
"code": "ForbiddenByConnection"
}
What am I doing wrong? How do I connect to a Key Vault that has public access disabled and is only reachable via a private endpoint?
I have reproduced this in my environment and got the results below.
Firstly, I followed the same process you described and got the same error you did.
This error occurs because the Key Vault only accepts traffic that arrives through the private endpoint.
When you create a private endpoint in a particular virtual network and subnet, you need to run your code from inside that network, for example from a virtual machine deployed into the same virtual network/subnet.
When I opened a VM integrated with that virtual network/subnet and ran the same code against the Key Vault, I got the required result.
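As a quick sanity check from inside that VM (a minimal sketch, assuming the standard privatelink DNS zone is linked to the virtual network), you can confirm the vault hostname resolves to the private endpoint's private IP before running the SDK code:

# Minimal check: the vault hostname should resolve to a private IP (e.g. 10.x.x.x),
# not a public address, when queried from inside the virtual network.
import socket

vault_host = "<NAME>.vault.azure.net"  # same placeholder as in the question
print(socket.gethostbyname(vault_host))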
References:
azure - Unable to get storage account container details - Stack Overflow
Azure Key Vault not allow access via private endpoint connection
Azure functions and Azure KeyVault communicating through service endpoint
My SQL Server database password is saved in Azure Key Vault, with a DATAREF ID as the identifier. I need that password to create a Spark dataframe from a table that is present in SQL Server. I am running this .py file on a Google Dataproc cluster. How can I get that password using Python?
Since you are accessing an Azure service from a non-Azure service, you will need a service principal. You can use a certificate or a secret. See THIS link for the different methods. You will need to give the service principal the proper access, and this will depend on whether you are using RBAC or access policies for your key vault.
So the steps you need to follow are:
Create a key vault and create a secret.
Create a service principal or application registration. Store the client ID, client secret and tenant ID.
Give the service principal proper access to the key vault (if you are using access policies) or to the specific secret (if you are using the RBAC model).
The Python link for the code is HERE.
The code that will work for you is below:
from azure.identity import ClientSecretCredential
from azure.keyvault.secrets import SecretClient

tenantid = "<your_tenant_id>"
clientsecret = "<your_client_secret>"
clientid = "<your_client_id>"

my_credentials = ClientSecretCredential(tenant_id=tenantid, client_id=clientid, client_secret=clientsecret)

secret_client = SecretClient(vault_url="https://<your_keyvault_name>.vault.azure.net/", credential=my_credentials)

secret = secret_client.get_secret("<your_secret_name>")

print(secret.name)
print(secret.value)
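Since you mentioned building a Spark dataframe on Dataproc, here is a hedged sketch of plugging the retrieved secret into a Spark JDBC read; the server, database, table and user values are placeholders/assumptions, and the SQL Server JDBC driver jar must be available on the cluster:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("read-sqlserver").getOrCreate()

# Use the Key Vault secret as the SQL Server password for the JDBC read.
df = (spark.read.format("jdbc")
      .option("url", "jdbc:sqlserver://<your_server>:1433;databaseName=<your_db>")
      .option("dbtable", "<your_table>")
      .option("user", "<your_sql_user>")
      .option("password", secret.value)  # secret retrieved above
      .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
      .load())
df.show()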
I'm building a Kubernetes application that Dockerizes code and runs it on the cluster. In order for users to be able to invoke their Dockerized code, I need to modify the Istio configuration to expose the service they've created.
I'm trying to create Istio virtual services using the Python API. I'm able to list existing Istio resources:
group = 'networking.istio.io'
version = 'v1alpha3'
plural = 'destinationrules'
from kubernetes import client, config
config.load_kube_config()
myclient = client.CustomObjectsApi()
api_response = myclient.list_cluster_custom_object(group, version, plural)
but when I use the same parameters to create, I get a 404 not found error.
import yaml

with open('destination-rule.yaml', 'r') as file_reader:
    file_content = file_reader.read()
deployment_template = yaml.safe_load(file_content)

api_response = myclient.create_cluster_custom_object(
    group=group, version=version, plural=plural, body=deployment_template)
The destination-rule.yaml file looks like:
apiVersion: networking.istio.io/v1alpha3
kind: DestinationRule
metadata:
  name: test
spec:
  host: test
  subsets:
  - name: v1
    labels:
      run: test
What am I doing wrong here?
My problem was that I was doing create_cluster_custom_object instead of create_namespaced_custom_object. When I switched over, it started working.
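For reference, a minimal sketch of the working call with the same parameters as above; the target namespace ("default" here) is an assumption:

# Same group/version/plural/body as before, but created in a specific namespace.
api_response = myclient.create_namespaced_custom_object(
    group=group,
    version=version,
    namespace="default",  # assumption: adjust to the namespace you deploy into
    plural=plural,
    body=deployment_template,
)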
I've deployed a Python web app using Azure App Service from a Docker container in Container Registries. In my app I'm using dotenv to load secrets, and locally I run docker run --env-file=.env my-container to pass the .env variables, but I can't really figure out how to do this when the app is deployed to Azure.
I'm using dotenv in the following way:
import os
from dotenv import load_dotenv
load_dotenv()
SERVER = os.getenv("SERVER_NAME")
DATABASE = os.getenv("DB_NAME")
USERNAME = os.getenv("USERNAME")
PASSWORD = os.getenv("PASSWORD")
PORT = os.getenv("PORT", default=1433)
DRIVER = os.getenv("DRIVER")
How can I have my container fetch the .env variables?
I've added the secrets to Azure Key Vault, but I'm not sure how to pass these to the container.
In terms of passing variables to a Dockerfile, check my answer here.
Please add an argument after FROM:
FROM alpine
ARG serverName
RUN echo $serverName
and then run it like this
- task: Docker@2
  inputs:
    containerRegistry: 'devopsmanual-acr'
    command: 'build'
    Dockerfile: 'stackoverflow/85-docker/DOCKERFILE'
    arguments: '--build-arg a_version=$(SERVER_NAME)'
In terms of fetching values from Key Vault, you can use the Azure Key Vault task:
# Azure Key Vault
# Download Azure Key Vault secrets
- task: AzureKeyVault@1
  inputs:
    azureSubscription:
    keyVaultName:
    secretsFilter: '*'
    runAsPreJob: false # Azure DevOps Services only
Be aware that by default, variables created by this task are marked as secret, so they are not mapped to environment variables.
You can still use your approach, but first you need to map the secret to an environment variable:
- powershell: |
    Write-Host "Using an input-macro works: $(mySecret)"
    Write-Host "Using the env var directly does not work: $env:MYSECRET"
    Write-Host "Using a global secret var mapped in the pipeline does not work either: $env:GLOBAL_MYSECRET"
    Write-Host "Using a global non-secret var mapped in the pipeline works: $env:GLOBAL_MY_MAPPED_ENV_VAR"
    Write-Host "Using the mapped env var for this task works and is recommended: $env:MY_MAPPED_ENV_VAR"
  env:
    MY_MAPPED_ENV_VAR: $(mySecret) # the recommended way to map to an env variable
To pick up secrets from Key Vault and use them as env vars in your app, use Key Vault references as described here:
https://learn.microsoft.com/en-us/azure/app-service/app-service-key-vault-references
Then just add the reference to your App Settings. For example:
@Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/mysecret/)
That's it. No need to amend your dotenv code to do anything special, as App Settings are already injected as environment variables by App Service into your application.
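One detail worth noting: python-dotenv's load_dotenv() does not override variables that are already set in the process environment, and it is effectively a no-op if no .env file ships in the image, so the existing code picks up the App Settings as-is. A minimal sketch:

import os
from dotenv import load_dotenv

load_dotenv()  # leaves existing environment variables (App Settings) untouched
print(os.getenv("SERVER_NAME"))  # value comes from App Settings / Key Vault reference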
Don't forget to add your App Service instance (managed identity) to the Key Vault's access policy, otherwise none of this works -
https://learn.microsoft.com/en-us/azure/app-service/app-service-key-vault-references#granting-your-app-access-to-key-vault
I need to store API keys and other sensitive information in app.yaml as environment variables for deployment on GAE. The issue with this is that if I push app.yaml to GitHub, this information becomes public (not good). I don't want to store the info in a datastore as it does not suit the project. Rather, I'd like to swap out the values from a file that is listed in .gitignore on each deployment of the app.
Here is my app.yaml file:
application: myapp
version: 3
runtime: python27
api_version: 1
threadsafe: true
libraries:
- name: webapp2
  version: latest
- name: jinja2
  version: latest

handlers:
- url: /static
  static_dir: static
- url: /.*
  script: main.application
  login: required
  secure: always
  # auth_fail_action: unauthorized

env_variables:
  CLIENT_ID: ${CLIENT_ID}
  CLIENT_SECRET: ${CLIENT_SECRET}
  ORG: ${ORG}
  ACCESS_TOKEN: ${ACCESS_TOKEN}
  SESSION_SECRET: ${SESSION_SECRET}
Any ideas?
This solution is simple but may not suit all different teams.
First, put the environment variables in an env_variables.yaml, e.g.,
env_variables:
  SECRET: 'my_secret'
Then, include this env_variables.yaml in the app.yaml
includes:
- env_variables.yaml
Finally, add the env_variables.yaml to .gitignore, so that the secret variables won't exist in the repository.
In this case, the env_variables.yaml needs to be shared among the deployment managers.
If it's sensitive data, you should not store it in source code as it will be checked into source control. The wrong people (inside or outside your organization) may find it there. Also, your development environment probably uses different config values from your production environment. If these values are stored in code, you will have to run different code in development and production, which is messy and bad practice.
In my projects, I put config data in the datastore using this class:
from google.appengine.ext import ndb

class Settings(ndb.Model):
    name = ndb.StringProperty()
    value = ndb.StringProperty()

    @staticmethod
    def get(name):
        NOT_SET_VALUE = "NOT SET"
        retval = Settings.query(Settings.name == name).get()
        if not retval:
            retval = Settings()
            retval.name = name
            retval.value = NOT_SET_VALUE
            retval.put()
        if retval.value == NOT_SET_VALUE:
            raise Exception(('Setting %s not found in the database. A placeholder ' +
                'record has been created. Go to the Developers Console for your app ' +
                'in App Engine, look up the Settings record with name=%s and enter ' +
                'its value in that record\'s value field.') % (name, name))
        return retval.value
Your application would do this to get a value:
API_KEY = Settings.get('API_KEY')
If there is a value for that key in the datastore, you will get it. If there isn't, a placeholder record will be created and an exception will be thrown. The exception will remind you to go to the Developers Console and update the placeholder record.
I find this takes the guessing out of setting config values. If you are unsure of what config values to set, just run the code and it will tell you!
The code above uses the ndb library which uses memcache and the datastore under the hood, so it's fast.
Update:
jelder asked how to find the Datastore values in the App Engine console and set them. Here is how:
Go to https://console.cloud.google.com/datastore/
Select your project at the top of the page if it's not already selected.
In the Kind dropdown box, select Settings.
If you ran the code above, your keys will show up. They will all have the value NOT SET. Click each one and set its value.
Hope this helps!
This didn't exist when you posted, but for anyone else who stumbles in here, Google now offers a service called Secret Manager.
It's a simple REST service (with SDKs wrapping it, of course) for storing your secrets in a secure location on Google Cloud Platform. This is a better approach than Datastore: extra steps are required to see the stored secrets, and it has a finer-grained permission model, so you can secure individual secrets differently for different aspects of your project if you need to.
It offers versioning, so you can handle password changes with relative ease, as well as a robust query and management layer enabling you to discover and create secrets at runtime, if necessary.
Python SDK
Example usage:
from google.cloud import secretmanager_v1beta1 as secretmanager
secret_id = 'my_secret_key'
project_id = 'my_project'
version = 1 # use the management tools to determine version at runtime
client = secretmanager.SecretManagerServiceClient()
secret_path = client.secret_version_path(project_id, secret_id, version)
response = client.access_secret_version(secret_path)
password_string = response.payload.data.decode('UTF-8')
# use password_string -- set up database connection, call third party service, whatever
My approach is to store client secrets only within the App Engine app itself. The client secrets are neither in source control nor on any local computers. This has the benefit that any App Engine collaborator can deploy code changes without having to worry about the client secrets.
I store client secrets directly in Datastore and use Memcache for improved latency when accessing the secrets. The Datastore entities only need to be created once and will persist across future deploys. Of course, the App Engine console can be used to update these entities at any time.
There are two options to perform the one-time entity creation:
Use the App Engine Remote API interactive shell to create the entities.
Create an admin-only handler that will initialize the entities with dummy values. Manually invoke this admin handler, then use the App Engine console to update the entities with the production client secrets (a minimal sketch follows).
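A minimal sketch of option 2, assuming the webapp2 runtime from the question and reusing a Settings model like the one in the earlier answer; the route and secret names are illustrative, not part of the original answer:

import webapp2
from google.appengine.ext import ndb

SECRET_NAMES = ['CLIENT_ID', 'CLIENT_SECRET', 'ACCESS_TOKEN']  # illustrative names

class Settings(ndb.Model):
    name = ndb.StringProperty()
    value = ndb.StringProperty()

class InitSecretsHandler(webapp2.RequestHandler):
    def get(self):
        # Seed placeholder entities once; real values are set later in the console.
        for name in SECRET_NAMES:
            if not Settings.query(Settings.name == name).get():
                Settings(name=name, value='REPLACE_ME').put()
        self.response.write('Placeholders created; set real values in the console.')

# Map this under an admin-only URL (e.g. login: admin in app.yaml).
app = webapp2.WSGIApplication([('/admin/init_secrets', InitSecretsHandler)])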
The best way to do it is to store the keys in a client_secrets.json file and exclude that from being uploaded to git by listing it in your .gitignore file. If you have different keys for different environments, you can use the app_identity API to determine what the app id is and load the appropriate file.
There is a fairly comprehensive example here -> https://developers.google.com/api-client-library/python/guide/aaa_client_secrets.
Here's some example code:
import json

from google.appengine.api import app_identity
from oauth2client.client import flow_from_clientsecrets

# declare your app ids as globals ...
APPID_LIVE = 'awesomeapp'
APPID_DEV = 'awesomeapp-dev'
APPID_PILOT = 'awesomeapp-pilot'

# create a dictionary mapping the app_ids to the filepaths ...
client_secrets_map = {APPID_LIVE: 'client_secrets_live.json',
                      APPID_DEV: 'client_secrets_dev.json',
                      APPID_PILOT: 'client_secrets_pilot.json'}

# get the filename based on the current app_id ...
client_secrets_filename = client_secrets_map.get(
    app_identity.get_application_id(),
    client_secrets_map[APPID_DEV]  # fall back to the dev file
)

# use the filename to construct the flow ...
flow = flow_from_clientsecrets(filename=client_secrets_filename,
                               scope=scope,
                               redirect_uri=redirect_uri)

# or, you could load up the json file manually if you need more control ...
with open(client_secrets_filename, 'r') as f:
    client_secrets = json.loads(f.read())
This solution relies on the deprecated appcfg.py
You can use the -E command line option of appcfg.py to setup the environment variables when you deploy your app to GAE (appcfg.py update)
$ appcfg.py
...
  -E NAME:VALUE, --env_variable=NAME:VALUE
                        Set an environment variable, potentially overriding an
                        env_variable value from app.yaml file (flag may be
                        repeated to set multiple variables).
...
Most answers are outdated. Using Google Cloud Datastore is actually a bit different now: https://cloud.google.com/python/getting-started/using-cloud-datastore
Here's an example:
from google.cloud import datastore
client = datastore.Client()
datastore_entity = client.get(client.key('settings', 'TWITTER_APP_KEY'))
connection_string_prod = datastore_entity.get('value')
This assumes the entity name is 'TWITTER_APP_KEY', the kind is 'settings', and 'value' is a property of the TWITTER_APP_KEY entity.
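For completeness, a hedged sketch of creating or updating such an entity with the same client (the kind, key name and property follow the example above):

from google.cloud import datastore

client = datastore.Client()
key = client.key('settings', 'TWITTER_APP_KEY')
entity = datastore.Entity(key=key)
entity['value'] = 'the-twitter-app-key'  # placeholder value
client.put(entity)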
With a GitHub Action instead of Google Cloud triggers (Cloud Build triggers aren't able to find their own app.yaml and manage the environment variables by themselves).
Here is how to do it:
My environment:
App Engine,
standard (not flex),
Node.js Express application,
a PostgreSQL Cloud SQL instance
First, the setup:
1. Create a new Google Cloud project (or select an existing project).
2. Initialize your App Engine app with your project.
3. Create a Google Cloud service account or select an existing one.
4. Add the following Cloud IAM roles to your service account:
   App Engine Admin - allows for the creation of new App Engine apps
   Service Account User - required to deploy to App Engine as a service account
   Storage Admin - allows upload of source code
   Cloud Build Editor - allows building of source code
5. Download a JSON service account key for the service account.
6. Add the following secrets to your repository's secrets:
   GCP_PROJECT: the Google Cloud project ID
   GCP_SA_KEY: the downloaded service account key
The app.yaml
runtime: nodejs14
env: standard

env_variables:
  SESSION_SECRET: $SESSION_SECRET

beta_settings:
  cloud_sql_instances: SQL_INSTANCE
Then the github action
name: Build and Deploy to GKE
on: push
env:
  PROJECT_ID: ${{ secrets.GKE_PROJECT }}
  DATABASE_URL: ${{ secrets.DATABASE_URL }}

jobs:
  setup-build-publish-deploy:
    name: Setup, Build, Publish, and Deploy
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-node@v2
        with:
          node-version: '12'
      - run: npm install
      - uses: actions/checkout@v1
      - uses: ikuanyshbekov/app-yaml-env-compiler@v1.0
        env:
          SESSION_SECRET: ${{ secrets.SESSION_SECRET }}
      - shell: bash
        run: |
          sed -i 's/SQL_INSTANCE/'${{ secrets.DATABASE_URL }}'/g' app.yaml
      - uses: actions-hub/gcloud@master
        env:
          PROJECT_ID: ${{ secrets.GKE_PROJECT }}
          APPLICATION_CREDENTIALS: ${{ secrets.GCLOUD_AUTH }}
          CLOUDSDK_CORE_DISABLE_PROMPTS: 1
        with:
          args: app deploy app.yaml
To add secrets to GitHub Actions, go to Settings/Secrets in your repository.
Note that I could have handled all of the substitution with the bash script, so I would not depend on the GitHub project ikuanyshbekov/app-yaml-env-compiler@v1.0.
It's a shame that GAE doesn't offer an easier way to handle environment variables for app.yaml. I didn't want to use KMS since I also needed to update the beta_settings/Cloud SQL instance, so I really needed to substitute everything in the app.yaml.
This way I can make a specific action for the right environment and manage the secrets.
It sounds like there are a few approaches you can take. We have a similar issue and do the following (adapted to your use case):
Create a file that stores any dynamic app.yaml values and place it on a secure server in your build environment. If you are really paranoid, you can asymmetrically encrypt the values. You can even keep this in a private repo if you need version control/dynamic pulling, or just use a shell script to copy it/pull it from the appropriate place.
Pull from git during the deployment script
After the git pull, modify the app.yaml by reading and writing it in pure python using a yaml library
The easiest way to do this is to use a continuous integration server such as Hudson, Bamboo, or Jenkins. Simply add some plug-in, script step, or workflow that does all the above items I mentioned. You can pass in environment variables that are configured in Bamboo itself for example.
In summary, just push in the values during your build process in an environment you only have access to. If you aren't already automating your builds, you should be.
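A minimal sketch of the third step above (rewriting env_variables in app.yaml with a yaml library); the secrets file path and key names are illustrative assumptions:

import yaml

# Load the deploy descriptor and the out-of-repo secrets file.
with open('app.yaml') as f:
    app_config = yaml.safe_load(f)
with open('/secure/location/secrets.yaml') as f:  # kept out of source control
    secrets = yaml.safe_load(f)

# Merge the secret values into env_variables and write app.yaml back out.
app_config.setdefault('env_variables', {}).update(secrets)
with open('app.yaml', 'w') as f:
    yaml.safe_dump(app_config, f, default_flow_style=False)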
Another option is what you said: put it in the database. If your reason for not doing that is that things are too slow, simply push the values into Memcache as a second-layer cache, and pin the values to the instances as a first-layer cache. If the values can change and you need to update the instances without rebooting them, just keep a hash you can check to know when they change, or trigger an update somehow when something you do changes the values. That should be it.
Just wanted to note how I solved this problem in JavaScript/Node.js. For local development I used the dotenv npm package, which loads environment variables from a .env file into process.env. When I started using GAE I learned that environment variables need to be set in an app.yaml file. Well, I didn't want to use dotenv for local development and app.yaml for GAE (and duplicate my environment variables between the two files), so I wrote a little script that loads app.yaml environment variables into process.env for local development. Hope this helps someone:
yaml_env.js:
(function () {
  const yaml = require('js-yaml');
  const fs = require('fs');
  const isObject = require('lodash.isobject');

  var doc = yaml.safeLoad(
    fs.readFileSync('app.yaml', 'utf8'),
    { json: true }
  );

  // The .env file will take precedence over the settings in the app.yaml file,
  // which allows me to override stuff in app.yaml (the database connection string (DATABASE_URL), for example).
  // This is optional of course. If you don't use dotenv then remove this line:
  require('dotenv/config');

  if (isObject(doc) && isObject(doc.env_variables)) {
    Object.keys(doc.env_variables).forEach(function (key) {
      // Don't set the environment variable with the yaml file value if it's already set
      process.env[key] = process.env[key] || doc.env_variables[key];
    });
  }
})();
Now include this file as early as possible in your code, and you're done:
require('../yaml_env')
You should encrypt the variables with Google KMS and embed them in your source code. (https://cloud.google.com/kms/)
echo -n the-twitter-app-key | gcloud kms encrypt \
> --project my-project \
> --location us-central1 \
> --keyring THEKEYRING \
> --key THECRYPTOKEY \
> --plaintext-file - \
> --ciphertext-file - \
> | base64
Put the scrambled (encrypted and base64-encoded) value into your environment variable (in the yaml file).
Some Python-ish code to get you started on decrypting:
import base64
import os

from google.cloud import kms_v1

kms_client = kms_v1.KeyManagementServiceClient()
name = kms_client.crypto_key_path_path("project", "global", "THEKEYRING", "THECRYPTOKEY")
twitter_app_key = kms_client.decrypt(
    name, base64.b64decode(os.environ.get("TWITTER_APP_KEY"))).plaintext
@Jason F's answer based on using Google Datastore is close, but the code is a bit outdated based on the sample usage in the library docs. Here's the snippet that worked for me:
from google.cloud import datastore
client = datastore.Client('<your project id>')
key = client.key('<kind e.g settings>', '<entity name>')  # note: entity name, not property

# get by key for this entity
result = client.get(key)

print(result)  # prints all the properties (a dict); index a specific value like result['MY_SECRET_KEY']
Partly inspired by this Medium post
Extending Martin's answer
from google.appengine.ext import ndb

class Settings(ndb.Model):
    """
    Get sensitive data settings from Datastore.
    key:String -> value:String
    key:String -> Exception

    Thanks to: Martin Omander @ Stackoverflow
    https://stackoverflow.com/a/35261091/1463812
    """
    name = ndb.StringProperty()
    value = ndb.StringProperty()

    @staticmethod
    def get(name):
        retval = Settings.query(Settings.name == name).get()
        if not retval:
            raise Exception(('Setting %s not found in the database. Go to the ' +
                'Developers Console for your app in App Engine, create a Settings ' +
                'record with name=%s and enter its value in that record\'s value ' +
                'field.') % (name, name))
        return retval.value

    @staticmethod
    def set(name, value):
        exists = Settings.query(Settings.name == name).get()
        if not exists:
            s = Settings(name=name, value=value)
            s.put()
        else:
            exists.value = value
            exists.put()
        return True
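Illustrative usage of the extended class (names are placeholders):

Settings.set('API_KEY', 'my-api-key-value')  # create or update the secret
API_KEY = Settings.get('API_KEY')            # read it back; raises if missing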
There is a PyPI package called gae_env that allows you to save App Engine environment variables in Cloud Datastore. Under the hood, it also uses Memcache, so it's fast.
Usage:
import gae_env
API_KEY = gae_env.get('API_KEY')
If there is a value for that key in the datastore, it will be returned.
If there isn't, a placeholder record __NOT_SET__ will be created and a ValueNotSetError will be thrown. The exception will remind you to go to the Developers Console and update the placeholder record.
Similar to Martin's answer, here is how to update the value for the key in Datastore:
Go to Datastore Section in the developers console
Select your project at the top of the page if it's not already selected.
In the Kind dropdown box, select GaeEnvSettings.
Keys for which an exception was raised will have value __NOT_SET__.
Go to the package's GitHub page for more info on usage/configuration
My solution is to replace the secrets in the app.yaml file via github action and github secrets.
app.yaml (App Engine)
env_variables:
  SECRET_ONE: $SECRET_ONE
  ANOTHER_SECRET: $ANOTHER_SECRET
workflow.yaml (Github)
steps:
  - uses: actions/checkout@v2
  - uses: 73h/gae-app-yaml-replace-env-variables@v0.1
    env:
      SECRET_ONE: ${{ secrets.SECRET_ONE }}
      ANOTHER_SECRET: ${{ secrets.ANOTHER_SECRET }}
Here you can find the Github action.
https://github.com/73h/gae-app-yaml-replace-env-variables
When developing locally, I write the secrets to an .env file.