GCP Cloud Functions config file - python

Let's say I want to create a simple cloud function to run a Python script, where the main.py is in a GitHub repository mirrored via Cloud Source Repositories. My question is: if I need to reference information that I don't want to add to the repository, is there another way to access that information? For example, let's say I want to have a config.py which I reference in main.py. Is it possible to save and reference config.py somewhere in GCP instead (e.g. Storage)?
Thanks!

Another answer that came to mind is the use of GCP's Runtime Configurator. This is an API within Google Cloud Platform that lets you store configuration values for use by your GCP resources, e.g. Cloud Functions. Note that, at the time of writing, this feature is still in beta! Here is a small demo:
Create your project config:
gcloud beta runtime-config configs create my-project-config
Set a variable in your project config:
gcloud beta runtime-config configs variables set --config-name my-project-config --is-text my-variable "hello world"
The service account running the cloud function needs the following permissions:
runtimeconfig.configs.get
runtimeconfig.variables.list
Use that variable in a cloud function (Python):
from google.cloud import runtimeconfig

client = runtimeconfig.Client()
config = client.config('my-project-config')  # the config created above

print(config.get_variable('my-variable'))
# <Variable: my-project-config, my-variable>
print(config.get_variable('does-not-exist'))
# None
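If you need the stored content itself rather than the Variable object, the variable exposes it directly. A minimal sketch, assuming the value attribute of the google-cloud-runtimeconfig Variable class (double-check the attribute names against the library version you have installed):
variable = config.get_variable('my-variable')
if variable is not None:
    # Assumption: the stored content is exposed as variable.value
    # (text variables may also expose a .text attribute); check the
    # google-cloud-runtimeconfig docs for your installed version.
    print(variable.value)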

It seems like what you might want is Environment Variables for Cloud Functions or possibly even Secrets in Cloud Functions.
Other than that, Cloud Functions are completely stateless, so you'd need to connect to some external datastore like a database to load private configuration.
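For the environment variable route, values passed at deploy time (e.g. with gcloud functions deploy ... --set-env-vars) simply show up as process environment variables inside the function. A minimal sketch; MY_CONFIG_VALUE is a hypothetical variable name, not something the platform defines:
import os

def my_function(request):
    # MY_CONFIG_VALUE is assumed to have been set at deploy time,
    # e.g. --set-env-vars MY_CONFIG_VALUE=hello
    value = os.environ.get('MY_CONFIG_VALUE', 'fallback-value')
    return 'Config value: {}'.format(value)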

Look into variable substitution in Cloud Build, where a 'build trigger' would contain non-repository values that are then inserted, in the 'build steps', into your Cloud Function as environment variables.
https://cloud.google.com/cloud-build/docs/configuring-builds/substitute-variable-values
https://cloud.google.com/functions/docs/env-var

In addition to the other answers, we use a somewhat different approach. It boils down to having a public repo which contains all the cloud function Python code, and another private repository which only contains configuration, like config.py. Let's walk through an example:
Create 2 repositories, for example:
github.com/organization/cloud-function (public)
github.com/organization/config (private)
Set a Cloud Build trigger on the config repository, and set a Cloud Build trigger on the cloud-function repository that kicks off the build of the config repository. Here is some documentation about creating Cloud Build triggers.
In the last step everything comes together. Remember, your configuration is private, so it is not accessible to anyone else. Every time someone pushes changes to one of the repositories, it triggers the cloudbuild.yaml in your private repo. That cloudbuild.yaml looks something like this:
---
timeout: 1800s
steps:
# Clone public repo
- name: 'gcr.io/cloud-builders/git'
  args:
  - 'clone'
  - 'https://github.com/organization/cloud-function.git'
# Copy config
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    cp config.py cloud-function/
# Deploy cloud-function
- name: 'gcr.io/cloud-builders/gcloud'
  dir: 'cloud-function'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    gcloud functions deploy ...
In addition, you can put references (secret_id) to Google Secret Manager secrets in your config.py and resolve them at runtime (see the sketch after this example). You could also use --env-vars-file, with the actual file stored in the private repository. Another bonus is that you can have directories in your private repo which represent a $BRANCH_NAME or $PROJECT_ID, which makes it easy to create multiple environments (test, development, production, etc.). This way you are sure the correct configuration for the environment is injected into the cloud function. We use this as follows:
my-dev-gcp-project > build trigger on development branch
my-prd-gcp-project > build trigger on production branch
In the cloudbuild.yaml we clone the public repo with ${BRANCH_NAME} and copy the config from a source directory called ${PROJECT_ID}/config.py. With this setup you have clear separation between development and production config and code.
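Regarding the Secret Manager references mentioned above: config.py would then only contain the secret_id, and the cloud function resolves the actual value at runtime. A minimal sketch, assuming the google-cloud-secret-manager client library; project_id and 'db-password' are placeholders:
from google.cloud import secretmanager

# config.py would only contain the reference, e.g.:
# DB_PASSWORD_SECRET_ID = 'db-password'

def get_secret(project_id, secret_id, version='latest'):
    # Resolve a Secret Manager secret at runtime instead of storing the
    # actual value anywhere in a repository.
    client = secretmanager.SecretManagerServiceClient()
    name = 'projects/{}/secrets/{}/versions/{}'.format(project_id, secret_id, version)
    response = client.access_secret_version(name=name)
    return response.payload.data.decode('UTF-8')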

Related

Move python code from Azure Devops repo to windows Azure VM

We have a Python application on a Windows Azure VM that reads data from an API and loads the data into an on-prem DB. The application works just fine and the code is source controlled in an Azure DevOps repo. The current deployment process is for someone to pull the main branch and copy the application from their local machine to c:\someapplication\ on the STAGE/PROD server. We would like to automate this process. There are a bunch of tutorials on how to do a build for API and web applications, which require your Azure subscription and app name (which we don't have). My two questions are:
Is there a way to do a simple copy to the c:\someapplication folder from Azure DevOps for this application?
If there is, would the copy solution be the best approach, or should I consider something else?
Should I simply clone the main repo to each folder location above and then automate the git pull via the Azure pipeline? Any advice or links would be greatly appreciated.
According to your description, you could try the CopyFiles@2 task and set the local folder up as a shared folder so that you can use it as the TargetFolder (the target folder or UNC path that will contain the copied files).
YAML like:
- task: CopyFiles@2
  inputs:
    SourceFolder: $(Build.SourcesDirectory) # string. Source Folder.
    Contents: '**' # string. Required. Contents. Default: **.
    TargetFolder: 'c:\someapplication' # string. Required. Target Folder.
Please check if it meets your requirements.

Passing Github Action Workflow Secret to Local Python Environment

I've seen a few similar questions here but I don't think this specific one has been answered yet. I am on a machine learning team and we do a LOT of discovery/exploratory analysis in a local environment.
I am trying to pass secrets stored in my GitHub Enterprise account to my local environment, the same way that Azure Key Vault does.
Here is my workflow file:
name: qubole_access

on: [pull_request, push]

env:
  ## Sets environment variable
  QUBOLE_API_TOKEN: ${{secrets.QUBOLE_API_TOKEN}}

jobs:
  job1:
    runs-on: self-hosted
    steps:
      - name: step 1
        run: echo "The API key is:${{env.QUBOLE_API_TOKEN}}"
I can tell it's working because the job runs successfully in the workflow.
The workflow file references an API token used to access our Qubole database. This token is stored as a secret in the 'secrets' area of my repo.
What I want to do now is reference that environment variable in a LOCAL python environment. It's important that it be in a local environment because it's less expensive and I don't want to risk anyone on my team accidentally forgetting and pushing secrets in their code, even if it's in a local git ignore file.
I have fetched/pulled/pushed/restarted etc etc and I can't get the variable into my environment.
When I check the environment variables by running env in the terminal, no environment variables show up there either.
Is there a way to treat github secrets like secrets in azure keyvault? Or am I missing something obvious?

Simple Google Cloud deployment: Copy Python files from Google Cloud repository to app engine

I'm implementing continuous integration and continuous delivery for a large enterprise data warehouse project.
All the code resides in a Google Cloud repository and I'm able to set up a Google Cloud Build trigger, so that every time files of a specific type (Python scripts) are pushed to the master branch, a Google Cloud Build run starts.
The Python scripts don't make up an app. They contain an ODBC connection string and a script to extract data from a source and store it as a CSV file. The Python scripts are to be executed on a Google Compute Engine VM instance with Airflow installed.
So the deployment of the Python scripts is as simple as can be: the .py files only have to be copied from the Google Cloud repository folder to a specific folder on the Google VM instance. There is not really a traditional build to run, as all the Python files are separate from each other and not part of an application.
I thought this would be really easy, but I have now spent several days trying to figure this out with no luck.
Google Cloud Platform provides several Cloud Builders, but as far as I can see none of them can do this simple task. Using gcloud also does not work: it can copy files, but only from the local PC to the VM, not from the source repository to the VM.
What I'm looking for is a YAML or JSON build config file to copy those Python files from source repository to Google Compute Engine VM Instance.
Hoping for some help here.
The files/folders in the Google Cloud repository aren't directly accessible (it's like a bare git repository); you need to first clone the repo, then copy the desired files/folders from the cloned repo to their destinations.
It might be possible to use a standard Fetching dependencies build step to clone the repo, but I'm not 100% certain of it in your case, since you're not actually doing a build:
steps:
- name: gcr.io/cloud-builders/git
  args: ['clone', 'https://github.com/GoogleCloudPlatform/cloud-builders']
If not you may need one (or more) custom build steps. From Creating Custom Build Steps:
A custom build step is a container image that the Cloud Build worker VM pulls and runs with your source volume-mounted to /workspace. Your custom build step can execute any script or binary inside the container; as such, it can do anything a container can do.
Custom build steps are useful for:
Downloading source code or packages from external locations
...

Google App Engine check update

Is there a way to check if there are updated files on the server using Google App Engine, and then download only the updated files to your local copy so it matches what's on the server?
It may be possible if you're talking about the standard environment and you always follow a disciplined app deployment procedure:
deploy from a git repository (unsure if other VCS systems work)
don't deploy when you have uncommitted changes in the repository from which you deploy
If you meet these requirements then you can access the source code for a particular deployed version of a service via StackDriver, as described in Google Cloud DataStore automatic indexing.
At least in my case, among the files in the root directory I found an automatically generated file called source-context.json, containing the git URL and revision ID of the repository from which the deployment was made, which you can use to selectively update your local repo.
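As a rough sketch of reading that file: the layout assumed below (a top-level "git" object with "url" and "revisionId" keys) can differ between deployments, so inspect your own source-context.json first:
import json

# Read the deployment's source context; the "git" layout is an assumption,
# some deployments use a "cloudRepo" section instead.
with open('source-context.json') as f:
    context = json.load(f)

git_info = context.get('git', {})
print('Repository URL:', git_info.get('url'))
print('Revision ID:', git_info.get('revisionId'))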
Another approach requires the same deployment discipline mentioned above, plus always deploying from the same repository, which you'll consider the unique master copy of your code or one of its mirrors (needed due to git's distributed nature). Then you only need to compare your local repo against the version being deployed from this master copy repo (or its mirror).
You might want to check out the Google Cloud Source Repositories as a possible such master copy repo. Or mirroring a public repo master copy (see Connecting a Hosted Repository). The advantage would be some convenience and integration with other Google Cloud Platform tools, for example:
gcloud source repos
the Source Browser
deployments from the cloud shell, see for example Google Cloud: How to deploy mirrored Repository
Otherwise direct downloading of the app code from GAE is monolithic - you can't select just specific files, see Downloading Your Source Code.

Python azure module: how to create a new deployment

azure.servicemanagement.servicemanagementservice.py contains:
def create_deployment(self, service_name, deployment_slot, name,
                      package_url, label, configuration,
                      start_deployment=False,
                      treat_warnings_as_error=False,
                      extended_properties=None):
What are package_url and configuration? The method's docstring indicates:
package_url:
    A URL that refers to the location of the service package in the
    Blob service. The service package can be located either in a
    storage account beneath the same subscription or a Shared Access
    Signature (SAS) URI from any storage account.
...
configuration:
    The base-64 encoded service configuration file for the deployment.
All over the internet there are references to Visual Studio and PowerShell for creating those files. What do they look like? Can I create them manually? Can the azure module create them? Why is the Microsoft service so confusing and so poorly documented?
I am using https://pypi.python.org/pypi/azure Python Azure SDK. I am running Mac OS X on my dev box, so I don't have Visual Studio or cspack.exe.
Any help appreciated. Thank you.
According to your description, it looks like you are trying to use Python Azure SDK to create a cloud service deployment. Here is the documentation of how to use the create_deployment function.
Can I manually create them? Can the azure module create them?
If you mean you want to know how to create an Azure deployment package of your Python app, then based on my experience there are several options you can leverage.
If you have Visual Studio, you can create a cloud project from the project templates (in VS: create new project -> Cloud -> ...) and then package the project with one click.
Without VS, you can use the Microsoft Azure PowerShell cmdlets or the cspack command-line tool to create a deployment package. A similar question can be found at: Django project to Azure Cloud Services without Visual Studio
After packaging the project, you will have a .cspkg file.
For your reference, I have uploaded the test project at:
https://onedrive.live.com/redir?resid=7B27A151CFCEAF4F%21143283
As to 'configuration', it means the base-64 encoded service configuration file (.cscfg) for the deployment.
In Python, we can set up 'configuration' via the code below:
configuration = base64.b64encode(open('E:\\TestProjects\\Python\\YourProjectFolder\\ServiceConfiguration.Cloud.cscfg', 'rb').read())
Hope the info above gives you a quick clarification. Now, let's go back to the Python SDK itself and see how we can use the create_deployment function to create a cloud service deployment.
First, I'd suggest you refer to https://azure.microsoft.com/en-us/documentation/articles/cloud-services-python-how-to-use-service-management/ to get a basic idea of what Azure Service Management is and how it works.
In general, we can make the create_deployment function work in 5 steps:
Create your project's deployment package and set up a configuration file (.cscfg). For a quick test, you can use the one I have uploaded.
Store your project's deployment package in a Microsoft Azure Blob Storage account under the same subscription as the hosted service to which the package is being uploaded. Get the blob file's URL (or use a Shared Access Signature (SAS) URI from any storage account). You can use Azure Storage Explorer to upload the package file; it will then be shown in the Azure portal.
Use OpenSSL to create your management certificate. You need to create two certificates, one for the server (a .cer file) and one for the client (a .pem file); the article mentioned just now provides detailed info: https://azure.microsoft.com/en-us/documentation/articles/cloud-services-python-how-to-use-service-management/
Then upload the .cer certificate to the Azure portal: SETTINGS -> management certificates tab -> click the upload button (at the bottom of the page).
Create a cloud service in Azure and keep the name in mind.
Create another project to test the Azure SDK's create_deployment; code snippet for your reference:
import base64
from azure.servicemanagement import ServiceManagementService

subscription_id = 'Your subscription ID (found in the Azure portal)'
certificate_path = 'E:\\YourFolder\\mycert.pem'

sms = ServiceManagementService(subscription_id, certificate_path)

def TestForCreateADeployment():
    service_name = "Your Cloud Service Name"
    deployment_name = "name"
    slot = 'Production'
    package_url = ".cspkg file's URL - from your blob"
    configuration = base64.b64encode(
        open('E:\\TestProjects\\Python\\YourProjectFolder\\ServiceConfiguration.Cloud.cscfg', 'rb').read())
    label = service_name

    result = sms.create_deployment(service_name,
                                   slot,
                                   deployment_name,
                                   package_url,
                                   label,
                                   configuration)

    operation = sms.get_operation_status(result.request_id)
    print('Operation status: ' + operation.status)
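If you prefer to wait for the deployment to finish instead of checking the status once, you can poll get_operation_status in a loop. A small sketch, assuming the operation reports 'InProgress' until it completes:
import time

def wait_for_operation(sms, request_id, poll_seconds=5):
    # Poll the asynchronous service management operation until it leaves
    # the 'InProgress' state, then return its final status.
    while True:
        operation = sms.get_operation_status(request_id)
        if operation.status != 'InProgress':
            return operation.status
        time.sleep(poll_seconds)

# Example: print(wait_for_operation(sms, result.request_id))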
