Create new API config for GCP API Gateway using Python

I'm trying to create a new API config for an existing API and Gateway, using the Python client library google-cloud-api-gateway. I can't figure out how to specify the api_config parameter. The example in the docs doesn't include this parameter (which is required).
request = apigateway_v1.CreateApiConfigRequest(
    parent="parent_value",
    api_config_id="api_config_id_value",
    api_config="?",
)
What is the correct syntax for providing my config.yaml file? Even better would be if I could provide the config as a Python dict representation instead of loading a .yaml file.
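A possible sketch, assuming google-cloud-api-gateway is installed: api_config appears to take an apigateway_v1.ApiConfig message whose openapi_documents field carries the spec file as bytes, so the YAML never has to live at a fixed path, and a Python dict could be serialized first (e.g. yaml.safe_dump(config_dict) with PyYAML). The field names below follow the apigateway_v1 types; verify them against your installed version.

```python
def build_create_config_request(parent: str, config_id: str, openapi_yaml: str):
    """Build a CreateApiConfigRequest carrying an in-memory OpenAPI spec."""
    # imported lazily: google-cloud-api-gateway is a third-party dependency
    from google.cloud import apigateway_v1

    return apigateway_v1.CreateApiConfigRequest(
        parent=parent,
        api_config_id=config_id,
        api_config=apigateway_v1.ApiConfig(
            openapi_documents=[
                apigateway_v1.ApiConfig.OpenApiDocument(
                    document=apigateway_v1.ApiConfig.File(
                        path="config.yaml",  # display name of the spec file
                        contents=openapi_yaml.encode("utf-8"),
                    )
                )
            ]
        ),
    )
```

The resulting request would then be passed to ApiGatewayServiceClient.create_api_config(request=request).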

Related

How to store secrets for a Python Flask App on EC2

I have a simple Flask App that uses Stripe running on an EC2 instance.
I followed this guide to get it running: https://medium.com/techfront/step-by-step-visual-guide-on-deploying-a-flask-application-on-aws-ec2-8e3e8b82c4f7
I export the keys as environment variables and then read them in the code:
stripe_keys = {
    "secret_key": os.environ["STRIPE_SECRET_KEY"],
    "publishable_key": os.environ["STRIPE_PUBLISHABLE_KEY"],
    "webhook_secret": os.environ["STRIPE_WEBHOOK_KEY"],
}
However, this requires me to SSH into the EC2 machines to set the variables. Is there a better approach?
I'd recommend AWS Systems Manager Parameter Store:

- maintain your keys in SSM Parameter Store, choosing the SecureString type so your keys are encrypted at rest
- give your EC2 instance's IAM role enough permissions to fetch and decrypt the SecureStrings stored in SSM Parameter Store
- make sure your EC2 instance can reach the Internet, as SSM Parameter Store is an Internet-facing service
- in your code, use the AWS SDK to fetch and decrypt the SecureStrings

Since you're writing in Python, see https://nqbao.medium.com/how-to-use-aws-ssm-parameter-store-easily-in-python-94fda04fea84
PS: if you use CloudFormation or other Infrastructure-as-Code tools to provision your EC2 instances, most IaC tools support injecting SSM Parameter Store values as environment variables during deployment. With this approach your code can stay as-is, and your EC2 instance doesn't need extra permissions.
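The fetch-and-decrypt step can be sketched with boto3 (assumptions: boto3 is installed and the instance role allows ssm:GetParametersByPath plus kms:Decrypt; the path prefix is illustrative):

```python
def ssm_path_to_env_name(path: str) -> str:
    """Map an SSM parameter path such as '/stripe/secret_key' to 'STRIPE_SECRET_KEY'."""
    return path.strip("/").replace("/", "_").upper()


def load_ssm_parameters(path_prefix: str) -> dict:
    """Fetch and decrypt all SecureStrings stored under a path prefix."""
    import boto3  # imported lazily: boto3 is a third-party dependency

    ssm = boto3.client("ssm")
    secrets = {}
    # get_parameters_by_path is paginated, so iterate with the paginator
    for page in ssm.get_paginator("get_parameters_by_path").paginate(
        Path=path_prefix, WithDecryption=True
    ):
        for param in page["Parameters"]:
            secrets[ssm_path_to_env_name(param["Name"])] = param["Value"]
    return secrets
```

Something like os.environ.update(load_ssm_parameters("/stripe")) would then make the keys visible to the existing stripe_keys code unchanged.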
As Chris Chen pointed out, you can use AWS Parameter Store and, on top of it, AWStanding.
Suppose you stored your variables like this in Parameter Store:
"/stripe/secret_key"
"/stripe/publishable_key"
"/stripe/webhook_secret"
Then you can write code like this:
import os
from awstanding.parameter_store import load_path

load_path('/stripe')

# Now you can access your variables like this:
os.environ["STRIPE_SECRET_KEY"]
os.environ["STRIPE_PUBLISHABLE_KEY"]
os.environ["STRIPE_WEBHOOK_SECRET"]

# or store them in settings variables:
STRIPE_SECRET_KEY = os.environ["STRIPE_SECRET_KEY"]

It also handles encrypted keys automatically.

Azure via Python API - Set storage account property "Allow Blob public access" to Disabled

I'm trying to use Python 3 to set the Azure storage account property Allow Blob public access to Disabled.
I didn't find any information on how to implement this via Python.
I did find a solution via PowerShell: https://learn.microsoft.com/en-us/azure/storage/blobs/anonymous-read-access-configure?tabs=powershell
I'm looking for a Python 3 solution. Thanks!
The Allow Blob public access feature was added in the latest Python SDK, azure-mgmt-storage 16.0.0.
When using this feature, you need to add this line to your code:
from azure.mgmt.storage.v2019_06_01.models import StorageAccountUpdateParameters
Here is an example that works on my side:
from azure.identity import ClientSecretCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.v2019_06_01.models import StorageAccountUpdateParameters

subscription_id = "xxxxxxxx"
creds = ClientSecretCredential(
    tenant_id="xxxxxxxx",
    client_id="xxxxxxxx",
    client_secret="xxxxxxx"
)

resource_group_name = "xxxxx"
storage_account_name = "xxxx"
storage_client = StorageManagementClient(creds, subscription_id)

# set the allow_blob_public_access property here
p1 = StorageAccountUpdateParameters(allow_blob_public_access=False)

# then use the update method to apply the change
storage_client.storage_accounts.update(resource_group_name, storage_account_name, p1)
I haven't tried this myself, but looking at the Python Storage Management SDK and the REST API, this should be possible.
Look here for an example of how to create a new storage account using the Python SDK. As you can see, the request body is pretty much exactly what gets passed on to the underlying REST API.
That API supports the optional parameter properties.allowBlobPublicAccess, so you should be able to set it directly from Python as well.
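A sketch of that suggestion, assuming azure-mgmt-storage >= 16.0.0 (the sku, kind, location, and resource names are placeholders): the same flag can be passed at account-creation time.

```python
def create_account_with_public_access_disabled(storage_client, resource_group, account_name):
    """Create a storage account with Allow Blob public access disabled.

    storage_client is an azure.mgmt.storage StorageManagementClient; the
    model import is done lazily because azure-mgmt-storage is a
    third-party dependency.
    """
    from azure.mgmt.storage.models import Sku, StorageAccountCreateParameters

    params = StorageAccountCreateParameters(
        sku=Sku(name="Standard_LRS"),
        kind="StorageV2",
        location="westeurope",
        allow_blob_public_access=False,  # maps to properties.allowBlobPublicAccess
    )
    # begin_create returns a poller; .result() waits for completion
    return storage_client.storage_accounts.begin_create(
        resource_group, account_name, params
    ).result()
```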

Hiding sensitive information in Python

I am creating a Python script that reads a spreadsheet and issues REST requests to an external service. A token must be obtained to issue requests to the REST service, so the script needs a username and password to obtain the OAuth2 token.
What's the best or standard way to hide this information, or keep it out of the Python script entirely?
I recommend using a config file. Let's create a config file and name it config.cfg. The file structure should look more or less like this:
[whatever]
key=qwerertyertywert2345
secret=sadfgwertgrtujdfgh
Then in Python you can load it this way:
from configparser import ConfigParser
config = ConfigParser()
config.read('config.cfg')
my_key = config['whatever']['key']
my_secret = config['whatever']['secret']
In general, the most standard way to handle secrets in Python is by putting them in runtime configuration.
You can do that by reading explicitly from external files or using os.getenv to read from environment variables.
Another way is to use a tool like python-decouple, which lets you read from environment variables, a .env file, and an .ini file in a cascade, so that developers and operations can control the environment in local, dev, and production systems.
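As a minimal stdlib-only sketch of the environment-variable approach (the variable names are illustrative), failing fast when a secret is missing avoids silently running with an empty credential:

```python
import os


def get_secret(name: str) -> str:
    """Read a required secret from the environment; fail fast if it is missing."""
    value = os.getenv(name)
    if value is None:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value
```

Usage would look like token_user = get_secret("SERVICE_USERNAME"), with the actual values set in the shell or deployment environment, never in the script.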

Authenticating a google.storage.Client without saving the service account JSON to disk

For authentication of a Google Cloud Platform storage client, I'd like to NOT write the service account JSON (the credentials file that you create) to disk. I would like to keep it purely in memory after loading it from a HashiCorp Vault keystore that is shared by all cloud instances. Is there a way to pass the JSON credentials directly, rather than passing a path-like/file object?
I understand how to do this using a path-like/file object as follows, but this is what I want to avoid (due to security concerns, I'd prefer never to write the credentials to disk):
from google.cloud import storage

# set an environment variable that points at a JSON file (in the shell):
#   export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service_account.json"

# create the client (assumes the environment variable is set)
client = storage.Client()

# alternatively, create the client without the environment variable,
# but this still relies on a JSON file on disk:
client = storage.Client.from_service_account_json("/path/to/service_account.json")
I have tried to get around this by referencing the JSON DATA (json_data) directly, but this throws the error: TypeError: expected str, bytes or os.PathLike object, not dict
json_data = {....[JSON]....}
client = storage.Client.from_service_account_json(json_data)
I also tried dumping the dict to a JSON string:
json_data = {....[JSON]....}
client = storage.Client.from_service_account_json(json.dumps(json_data))
but I get this error:
with io.open(json_credentials_path, "r", encoding="utf-8") as json_fi:
OSError: [Errno 63] File name too long: '{"type": "service_account", "project_id",......
Per the suggestion from @johnhanley, I have also tried:
from google.cloud import storage
from google.oauth2 import service_account
json_data = {...data loaded from keystore...}
# type(json_data) -> dict
credentials = service_account.Credentials.from_service_account_info(json_data)
# type(credentials) -> google.oauth2.service_account.Credentials
client = storage.Client(credentials=credentials)
This resulted in the DefaultCredentialsError:
raise exceptions.DefaultCredentialsError(_HELP_MESSAGE)
google.auth.exceptions.DefaultCredentialsError: Could not automatically determine credentials. Please set GOOGLE_APPLICATION_CREDENTIALS or explicitly create credentials and re-run the application. For more information, please see https://developers.google.com/accounts/docs/application-default-credentials.
If you have ideas on how to solve this, I'd love to hear it!
Currently there are no other built-in methods in the Cloud Storage client library to achieve this, so there are two possibilities:
As @JohnHanley stated, use the provided built-in methods [1][2] to construct the Cloud Storage client.
You might also consider using another product such as Cloud Functions or App Engine, which would allow you to configure authentication at the service level and avoid providing the service account credentials.
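The approach from the question can be put together as follows, as a sketch assuming google-cloud-storage and google-auth are installed. Passing the project explicitly (taken from the parsed JSON) sidesteps any attempt by the library to infer it from the environment, which may be what raised the DefaultCredentialsError above:

```python
import json


def storage_client_from_json_string(secret_json: str):
    """Build a storage.Client from an in-memory service-account JSON string;
    nothing is ever written to disk."""
    # imported lazily: both packages are third-party dependencies
    from google.cloud import storage
    from google.oauth2 import service_account

    info = json.loads(secret_json)
    credentials = service_account.Credentials.from_service_account_info(info)
    # Pass the project explicitly so nothing falls back to Application
    # Default Credentials.
    return storage.Client(project=info["project_id"], credentials=credentials)
```

Here secret_json would be the raw string fetched from the Vault keystore.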

Swagger Codegen Authorisation for Python-Flask

I have built a Swagger doc and generated the code with Swagger Codegen (python-flask with Python 2 support).
I've built my code up and tested it, and I'm happy with what I've got. Now I want to secure my API endpoints using HTTPS and Basic Auth.
This is v2 of the OpenAPI Specification (OAS), so I'm setting it up as follows (described at https://swagger.io/docs/specification/2-0/authentication/basic-authentication/):
swagger: "2.0"
securityDefinitions:
  basicAuth:
    type: "basic"
Whether I give my endpoints individual security settings or specify this at the root level of the YAML for all endpoints, it makes no difference.
security:
  - basicAuth: []
I take my YAML, export it to JSON, then run the following to rebuild the swagger_server code:
java -jar swagger-codegen-cli-2.3.1.jar generate -l python-flask -DsupportPython2=true -i swagger.json -a "Authorization: Basic Base64encodedstring"
What I'm expecting is for the controller or model code to validate that a Basic Auth header has been passed that matches the authorization specified at generation time, but I see no references to it anywhere. I'm not sure if I've read this wrong, or if there's an issue with how I'm doing it or with some of the options I'm using.
The Python server generated by Swagger Codegen uses Connexion, and Connexion only supports OAuth 2 out of the box. As explained in the linked issue, users can always add custom mechanisms by decorating their handler functions (see https://github.com/zalando/connexion/blob/master/examples/basicauth/app.py).
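A sketch of such a decorator (the expected credentials are placeholders, and the Flask import is deferred so the header-parsing helper stays dependency-free):

```python
import base64
from functools import wraps


def parse_basic_auth(header: str):
    """Return (username, password) from an 'Authorization: Basic ...' value,
    or None if the header is absent or malformed."""
    prefix = "Basic "
    if not header.startswith(prefix):
        return None
    try:
        decoded = base64.b64decode(header[len(prefix):]).decode("utf-8")
    except (ValueError, UnicodeDecodeError):
        return None
    username, sep, password = decoded.partition(":")
    return (username, password) if sep else None


def requires_basic_auth(handler):
    """Wrap a Connexion/Flask handler so it rejects requests without valid
    Basic Auth. EXPECTED is a placeholder; load real credentials from config."""
    EXPECTED = ("admin", "secret")  # hypothetical credentials

    @wraps(handler)
    def wrapper(*args, **kwargs):
        from flask import request  # deferred: Flask ships with the generated server

        creds = parse_basic_auth(request.headers.get("Authorization", ""))
        if creds != EXPECTED:
            return "Unauthorized", 401, {"WWW-Authenticate": "Basic realm='api'"}
        return handler(*args, **kwargs)

    return wrapper
```

Applying @requires_basic_auth to each generated controller function mirrors the pattern in the linked Connexion example.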
