Flask: How to restart Azure App programmatically - python

I have 4 Scrapy spiders that I launch through Flask on Azure. How can I restart the application at the click of a button on my website? How do I call the REST API from a Flask function?
restart:
Restart

flask:
@app.route('/restart')
def restart():
    # REST API

If you want to restart an Azure web app, please follow the steps below:
1. Install the following Python packages: azure-mgmt-resource and azure-mgmt-web.
2. Create a service principal for authentication. You can use the Azure CLI or the Azure portal to create it. Here is an example using the Azure CLI:
az ad sp create-for-rbac --name xxxx
In the output you will get these items; write them down:
application id (client_id)
directory id (tenant)
client secret (secret)
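For reference, the create-for-rbac output is JSON that looks roughly like this (values redacted, and field names may vary slightly by CLI version; appId is the client id, password is the client secret, tenant is the directory id):
{
  "appId": "xxxx",
  "displayName": "xxxx",
  "password": "xxxx",
  "tenant": "xxxx"
}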
Then use the code below:
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.resource import ResourceManagementClient
from azure.mgmt.web import WebSiteManagementClient

subscription_id = "xxxx"  # you can get it from the azure portal
client_id = "xxx"
secret = "xxx"
tenant = "xxx"

credentials = ServicePrincipalCredentials(
    client_id=client_id,
    secret=secret,
    tenant=tenant
)

# resource_client = ResourceManagementClient(credentials, subscription_id)
web_client = WebSiteManagementClient(credentials, subscription_id)

# restart your azure web app
web_client.web_apps.restart("your_resourceGroup_name", "your_web_app_name")
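To tie this back to the original Flask question, here is a minimal sketch of a route that triggers the restart (a sketch only: the route path and response text are illustrative, and it assumes web_client was created at startup as above):
from flask import Flask

app = Flask(__name__)

@app.route('/restart')
def restart():
    # Reuse the web_client created above to restart the web app
    web_client.web_apps.restart("your_resourceGroup_name", "your_web_app_name")
    return "Restart triggered"
Note that if this route runs inside the very web app being restarted, the worker may be killed before the response reaches the browser.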

Related

How to deploy Streamlit with secrets.toml on Heroku?

Hi, I have an application that I would like to deploy on Heroku. The question is: how would I deploy a Streamlit app that relies on secrets.toml?
Currently the connection works locally via this:
credentials = service_account.Credentials.from_service_account_info(
    st.secrets["gcp_service_account"])
However, when I deploy it to Heroku, this doesn't seem to connect.
Please help.
On Heroku I entered the gcp_service_account credentials as a config var (from the Heroku dashboard, go to 'Settings' --> 'Reveal Config Vars').
Instead of st.secrets["<key>"], use os.environ["<key>"] in your Python code, for example:
gsheet_url = os.environ['private_gsheets_url']
For nested secrets like the GCP service account credentials, I first parse the JSON string:
parsed_credentials = json.loads(os.environ["gcp_service_account"])
credentials = service_account.Credentials.from_service_account_info(parsed_credentials, scopes=scopes)
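Putting the pieces together, a minimal self-contained sketch (the spreadsheets scope is illustrative; substitute whatever scopes your app actually needs):
import json
import os

from google.oauth2 import service_account

# Illustrative scope; use the scopes your app actually needs
scopes = ["https://www.googleapis.com/auth/spreadsheets"]

# The config var holds the service-account JSON as a string
parsed_credentials = json.loads(os.environ["gcp_service_account"])
credentials = service_account.Credentials.from_service_account_info(
    parsed_credentials, scopes=scopes)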
Hope this helps.

Retrieve endpoint URL of deployed app from Google Cloud Run with Python

I want to send requests to an app deployed on Cloud Run with Python, but inside the test file I don't want to hardcode the endpoint. How can I get the URL of the deployed app from a Python script inside the test file, so that I can send requests to that URL?
You can use gcloud to fetch the URL of the service like this:
gcloud run services describe SERVICE_NAME \
  --format="value(status.url)"
In a pure Python way, you can use Google's API Client Library for Cloud Run. To my knowledge, there isn't a Cloud Client Library for it.
The method is namespaces.services.get, and it is documented in the APIs Explorer under namespaces.services.get.
One important fact with Cloud Run is that the API endpoint differs by region. See the service endpoint documentation. You will need to override the client configuration (using ClientOptions) with the correct (region-specific) api_endpoint.
The following is from memory! I've not run this code, but it should be (nearly) correct:
import os

import google.auth
from google.api_core.client_options import ClientOptions
from googleapiclient import discovery

creds, project = google.auth.default()

REGION = os.getenv("REGION")
SERVICE = os.getenv("SERVICE")

# Must override the default run.googleapis.com endpoint
# with the region-specific endpoint
api_endpoint = "https://{region}-run.googleapis.com".format(
    region=REGION
)

options = ClientOptions(
    api_endpoint=api_endpoint
)

service = discovery.build(
    "run", "v1",
    client_options=options,
    credentials=creds
)

name = "namespaces/{namespace}/services/{service}".format(
    namespace=project,
    service=SERVICE
)

rqst = service.namespaces().services().get(name=name)
resp = rqst.execute()
The resp will be a Service resource, and you can grab its status url from it.
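For example, a one-liner based on the same status.url field used in the gcloud command above:
url = resp["status"]["url"]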

Google App Engine - Start / Stop Flex instance from Python API

Does anyone know if it's possible to stop and start a version of a Google App Engine Flex service directly from a Python app?
Not via the gcloud client: obviously it works with the gcloud command (gcloud app versions stop/start <version>), just like in the Google web interface.
I took a look at the Google web console, where the start/stop button works well. The request is sent to this URL:
https://console.cloud.google.com/m/operations?operationType=cloud-console.appengine.stopStartVersions&pid=<PROJECT-NAME>&hl=fr
With this data:
{"pid":"<PROJECT-NAME>","serviceId":"<SERVICE-ID>","versionId":"<VERSION-ID>","serving":true,"descriptionLocalizationKey":"gaeStopStartVersions","descriptionLocalizationArgs":{"serving":"true","versionId":"<VERSION-ID>"}}
... but how via the Python API?
I tried to follow the documentation about patching a version, with this code for example:
from googleapiclient import discovery
from oauth2client.client import GoogleCredentials

credentials = GoogleCredentials.get_application_default()
appengine = discovery.build('appengine', 'v1', credentials=credentials)
apps = appengine.apps()
apps.services().versions().patch(
    appsId='<ID-APP>',
    servicesId='<ID-SERVICE>',
    versionsId='<ID-VERSION>',
    body={'servingStatus': 'STOPPED'},
    updateMask='servingStatus'
).execute()
This works: the version stops and its instances shut down. But when I do the same with SERVING, the version turns on but the instances are not created.
Has anyone already succeeded in doing this correctly?
Thanks in advance for your help!

How to solve the azure-keyvault-secrets (Unauthorized) AKV10032: Invalid issuer error in Python

I am using the azure-keyvault-secrets package to manage my resources' secrets in Python 3.8, developing in PyCharm.
But when I run the following:
import os

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

VAULT_URL = os.environ["VAULT_URL"]

credential = DefaultAzureCredential()
client = SecretClient(
    vault_url=VAULT_URL,
    credential=credential
)
client.set_secret('my-secret-name', 'my-secret-value')
I get the following error:
HttpResponseError: azure keyvault secrets (Unauthorized) AKV10032: Invalid issuer. error
I have set the environment variables correctly, according to the Microsoft docs. I also restarted the runtime environment in PyCharm multiple times.
What should I do?
I also faced the same issue. The following solution worked for me:
1. Log into the Azure portal and check how many subscriptions you have, and under which subscription / resource group the Key Vault lives.
2. Log into the Azure CLI and execute the following command:
az account list --output table
3. Make the subscription that contains the Key Vault the default one:
az account set --subscription "subscription name"
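To double-check that the right subscription (and thus the right tenant, which is what the Invalid issuer error complains about) is now active, you can print the active tenant; az account show is a standard CLI command, and the query shown is just one way to extract the field:
az account show --query tenantId --output tsv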
4. Re-execute the code from the question. It should now work.

Trigger Azure Runbook whenever a new object is put in an Azure bucket

I want to automate Azure resources (e.g., start/stop a VM). Currently I am using an Automation Account runbook and it's working fine, but I need to implement a framework like this:
1) Trigger the runbook whenever a new object (an Excel sheet) is put in an Azure bucket.
2) Read the Excel sheet for input variables.
Below is the runbook code.
Could somebody please tell me the best way to trigger the runbook that suits the above framework.
"""
Azure Automation documentation : https://aka.ms/azure-automation-python-documentation
Azure Python SDK documentation : https://aka.ms/azure-python-sdk
"""
import os
import sys
from azure.mgmt.compute import ComputeManagementClient
import azure.mgmt.resource
import automationassets
def get_automation_runas_credential(runas_connection):
from OpenSSL import crypto
import binascii
from msrestazure import azure_active_directory
import adal
# Get the Azure Automation RunAs service principal certificate
cert = automationassets.get_automation_certificate("AzureRunAsCertificate")
pks12_cert = crypto.load_pkcs12(cert)
pem_pkey = crypto.dump_privatekey(crypto.FILETYPE_PEM,pks12_cert.get_privatekey())
# Get run as connection information for the Azure Automation service principal
application_id = runas_connection["ApplicationId"]
thumbprint = runas_connection["CertificateThumbprint"]
tenant_id = runas_connection["TenantId"]
# Authenticate with service principal certificate
resource ="https://management.core.windows.net/"
authority_url = ("https://login.microsoftonline.com/"+tenant_id)
context = adal.AuthenticationContext(authority_url)
return azure_active_directory.AdalAuthentication(
lambda: context.acquire_token_with_client_certificate(
resource,
application_id,
pem_pkey,
thumbprint)
)
Authenticate to Azure using the Azure Automation RunAs service principal
runas_connection = automationassets.get_automation_connection("AzureRunAsConnection")
azure_credential = get_automation_runas_credential(runas_connection)
Initialize the compute management client with the RunAs credential and specify the subscription to work against.
compute_client = ComputeManagementClient(
azure_credential,
str(runas_connection["SubscriptionId"])
)
print('\nStart VM')
async_vm_start = compute_client.virtual_machines.start(
'resource1', 'vm1')
async_vm_start.wait()
'''
print('\nStop VM')
async_vm_stop=compute_client.virtual_machines.power_off(resource_group_name, vm_name)
async_vm_stop.wait()'''
I believe one way to accomplish your requirement of triggering the runbook whenever a new blob (or, in your words, 'object') is added to an Azure Storage container (or, in your words, 'bucket') is by leveraging an Event Grid event subscription. For related information, refer to this document.
To illustrate it in a better way: go to the Azure portal -> your storage account (one of the StorageV2 kind) -> Events tile -> More options -> Logic Apps, and create a Logic App with two steps: one that validates that a new storage blob was added, and one that runs the required runbook.
You may also add next steps, like sending a mail after the runbook execution is completed, etc.
Hope this helps!
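As for step 2 of your framework (reading the Excel sheet for input variables), a rough sketch: Azure Automation passes runbook parameters to a Python runbook via sys.argv, so the Logic App could pass the new blob's URL as a parameter and the runbook could read it with pandas. This assumes pandas (plus an Excel engine like openpyxl) is importable in the Automation Account and that the blob URL is readable, e.g. via a SAS token; all names below are illustrative:
import sys

import pandas as pd

# Illustrative: the Logic App passes the blob URL as the first runbook parameter
blob_url = sys.argv[1]

# pandas can read an Excel file directly from a URL
inputs = pd.read_excel(blob_url)

# Hypothetical column names for the VM to act on
resource_group_name = inputs.loc[0, "resource_group"]
vm_name = inputs.loc[0, "vm_name"]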
