Trying to deploy an App Engine instance from Python using a service account. The goal is to spin up a lot of instances, do some heavy network work (download and upload files) and shut them down afterwards.
I'm trying to do it with a service account from the Python runtime, but I get the following error:
TypeError: Missing required parameter "servicesId"
What could be wrong, or is there a better solution for such a task? Thanks, the code is below:
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/cloud-platform']
SERVICE_ACCOUNT_FILE = 'service.json'

credentials = service_account.Credentials.from_service_account_file(
    SERVICE_ACCOUNT_FILE, scopes=SCOPES)

gcp = build('appengine', 'v1', credentials=credentials)
res = gcp.apps().create(body={"id": "251499913983"})
app_json = {
    "deployment": {
        "files": {
            "my-resource-file1": {
                "sourceUrl": "https://storage.googleapis.com/downloader_sources/hello-world/main.py"
            }
        }
    },
    "handlers": [
        {
            "script": {
                "scriptPath": "main.app"
            },
            "urlRegex": "/.*"
        }
    ],
    "runtime": "python27",
    "threadsafe": True
}
res2 = gcp.apps().services().versions().create(body=app_json)
You need to specify the application and service you want to deploy to; versions().create() requires both appsId and servicesId. You could use the default service:
gcp.apps().services().versions().create(appsId='251499913983', servicesId='default', body=app_json).execute()
See the doc for more details.
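Also note that versions().create() only kicks off a long-running operation; a script that spins instances up and down will usually want to poll it before moving on. A minimal sketch of that, assuming the same gcp client and app ID as above and that the operation name has the form apps/{appsId}/operations/{operationId}:
import time

# Start the deployment; the response is a long-running Operation resource.
op = gcp.apps().services().versions().create(
    appsId='251499913983', servicesId='default', body=app_json).execute()

# Operation names look like "apps/{appsId}/operations/{operationId}".
operation_id = op['name'].split('/')[-1]

# Poll until the deployment finishes (or surfaces an error).
while not op.get('done'):
    time.sleep(5)
    op = gcp.apps().operations().get(
        appsId='251499913983', operationsId=operation_id).execute()

if 'error' in op:
    raise RuntimeError(op['error'])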
TL;DR: azurerm_function_app_function applies fine with terraform apply, but the function disappears from the Azure Portal afterwards.
I have been trying to deploy an Azure Function via Terraform for months now and have not had any luck with it.
terraform apply runs fine. I then go into the Azure Portal, look at the function app's functions, and this function is there. However, when I refresh the blade the function disappears. I have made the same function and deployed it via VS Code with no issues, but with Terraform there is no luck.
resource "azurerm_linux_function_app" "main" {
name = "tf-linux-app"
location = azurerm_resource_group.main.location
resource_group_name = azurerm_resource_group.main.name
service_plan_id = azurerm_service_plan.main.id
storage_account_name = azurerm_storage_account.main.name
storage_account_access_key = azurerm_storage_account.main.primary_access_key
site_config {
app_scale_limit = 200
elastic_instance_minimum = 0
application_stack {
python_version = "3.9"
}
}
app_settings = {
"${azurerm_storage_account.main.name}_STORAGE" = azurerm_storage_account.main.primary_connection_string
}
client_certificate_mode = "Required"
identity {
type = "SystemAssigned"
}
}
resource "azurerm_function_app_function" "main" {
name = "tf-BlobTrigger"
function_app_id = azurerm_linux_function_app.main.id
language = "Python"
file {
name = "__init__.py"
content = file("__init__.py")
}
test_data = "${azurerm_storage_container.container1.name}/{name}"
config_json = jsonencode({
"scriptFile" : "__init__.py",
"disabled": false,
"bindings" : [
{
"name" : "myblob",
"type" : "blobTrigger",
"direction" : "in",
"path" : "${azurerm_storage_container.container1.name}/{name}",
"connection" : "${azurerm_storage_container.container1.name}_STORAGE"
}
]
})
}
As for the Python script, I'm literally just trying the demo found here that Azure provides.
__init__.py:
import logging

import azure.functions as func


def main(myblob: func.InputStream):
    logging.info('Python Blob trigger function processed %s', myblob.name)
I tried running terraform apply and expected the function to appear and stay there, but it appears and then disappears. I also tried deploying a C# function to a Windows app; that worked as expected, but I need this script in Python.
What is the current "standard" way to create a GCP Compute client in Python? I have seen both:
import uuid

import googleapiclient.discovery

# Discovery-based client ('compute', not 'container', for Compute Engine).
compute = googleapiclient.discovery.build(
    'compute', 'v1', credentials=credentials)

body = {
    "autoCreateSubnetworks": False,
    "description": "",
    "mtu": 1460,
    "name": "test_network",
    "routingConfig": {
        "routingMode": "REGIONAL"
    }
}

network = compute.networks().insert(
    project=project_id, body=body, requestId=str(uuid.uuid4())).execute()
and:
import uuid

from google.cloud import compute_v1

net = compute_v1.Network()
net.auto_create_subnetworks = False
net.description = ""
net.mtu = 1460
net.name = "test_network"
net.routing_config = compute_v1.NetworkRoutingConfig(routing_mode="REGIONAL")

request = compute_v1.InsertNetworkRequest()
request.project = project_id
request.request_id = str(uuid.uuid4())
request.network_resource = net

client = compute_v1.NetworksClient(credentials=credentials)
operation = client.insert(request=request)
Does Google plan on only supporting one somewhere down the road?
According to the google-api-python-client repository, the library is still supported for now, but there is no date yet for it to stop being updated:
"This library is considered complete and is in maintenance mode. This means that we will address critical bugs and security issues but will not add any new features."
It's recommended to use the google-cloud-python repositories instead, which have three development tiers: GA (General Availability), Beta Support and Alpha Support.
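For what it's worth, if you go the google-cloud-compute route, insert() returns an extended operation you can block on. A minimal sketch, assuming application default credentials and an existing project_id (the network name is just an example; real network names must be lowercase letters, digits and hyphens):
import uuid

from google.cloud import compute_v1


def create_network(project_id: str) -> compute_v1.Network:
    # Describe the network to create.
    net = compute_v1.Network()
    net.auto_create_subnetworks = False
    net.mtu = 1460
    net.name = "test-network"  # example name
    net.routing_config = compute_v1.NetworkRoutingConfig(routing_mode="REGIONAL")

    client = compute_v1.NetworksClient()
    request = compute_v1.InsertNetworkRequest(
        project=project_id,
        request_id=str(uuid.uuid4()),  # makes the insert safe to retry
        network_resource=net,
    )

    # insert() returns an ExtendedOperation; result() blocks until done.
    operation = client.insert(request=request)
    operation.result(timeout=300)

    return client.get(project=project_id, network=net.name)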
I've tried looking online but could not find an answer, as the documentation (and the API) for the Azure Python SDK is just horrible.
I have a Container Registry on Azure with a list of allowed IPs for public access. I'd like to modify that list by adding a new IP using Python.
I'm not sure whether the API supports this, or how to achieve it using ContainerRegistryManagementClient.
Can't agree more that the documentation (and the API) for the Azure Python SDK is just horrible :)
If you want to set the list of allowed IPs for public access on your Container Registry, try the code below, which calls the REST API directly:
from azure.identity import ClientSecretCredential
import requests

TENANT_ID = ""
CLIENT_ID = ""
CLIENT_SECRET = ""
SUBSCRIPTION_ID = ""
GROUP_NAME = ""
REGISTRIES = ""

# Your public IP list here.
ALLOWED_IPS = [
    {"value": "167.220.255.1"},
    {"value": "167.220.255.2"},
]

clientCred = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
authResp = clientCred.get_token("https://management.azure.com/.default")

requestURL = (
    'https://management.azure.com/subscriptions/' + SUBSCRIPTION_ID
    + '/resourceGroups/' + GROUP_NAME
    + '/providers/Microsoft.ContainerRegistry/registries/' + REGISTRIES
    + '?api-version=2020-11-01-preview'
)

requestBody = {
    "properties": {
        "publicNetworkAccess": "Enabled",
        "networkRuleSet": {
            "defaultAction": "Deny",
            "virtualNetworkRules": [],
            "ipRules": ALLOWED_IPS
        },
        "networkRuleBypassOptions": "AzureServices"
    }
}

r = requests.patch(
    url=requestURL,
    json=requestBody,
    headers={"Authorization": "Bearer " + authResp.token},
)
print(r.text)
Before you run this, please make sure that your client app has been granted the required permissions (an Azure subscription role such as Contributor).
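Since the question asked about ContainerRegistryManagementClient specifically, the same change can also be expressed through the management SDK. A sketch, assuming the azure-mgmt-containerregistry package is installed and the variables from the snippet above are set (begin_update and the model names below match recent SDK versions and may differ in older ones; the appended IP is just an example):
from azure.identity import ClientSecretCredential
from azure.mgmt.containerregistry import ContainerRegistryManagementClient
from azure.mgmt.containerregistry.models import (
    IPRule,
    NetworkRuleSet,
    RegistryUpdateParameters,
)

credential = ClientSecretCredential(TENANT_ID, CLIENT_ID, CLIENT_SECRET)
client = ContainerRegistryManagementClient(credential, SUBSCRIPTION_ID)

# Read the current rules, append the new IP, and patch the registry back.
registry = client.registries.get(GROUP_NAME, REGISTRIES)
existing = registry.network_rule_set.ip_rules if registry.network_rule_set else []
ip_rules = list(existing or []) + [IPRule(ip_address_or_range="167.220.255.3")]  # example IP

poller = client.registries.begin_update(
    GROUP_NAME,
    REGISTRIES,
    RegistryUpdateParameters(
        network_rule_set=NetworkRuleSet(default_action="Deny", ip_rules=ip_rules)
    ),
)
print(poller.result().network_rule_set)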
Does anybody here have any experience with the YouTube API? I'm currently trying to automate uploading my videos through an OAuth application, but because it's not verified, my videos get locked instantly and set to private. I cannot get my OAuth application verified because I don't have a website.
Is there any other way I can do this without using an OAuth application?
import os

import google_auth_oauthlib.flow
import googleapiclient.discovery

scopes = ["https://www.googleapis.com/auth/youtube.upload"]


def uploadClip(clip):
    # Disable OAuthlib's HTTPS verification when running locally.
    # *DO NOT* leave this option enabled in production.
    os.environ["OAUTHLIB_INSECURE_TRANSPORT"] = "1"

    api_service_name = "youtube"
    api_version = "v3"
    client_secrets_file = "client_secrets.json"

    # Get credentials and create an API client
    flow = google_auth_oauthlib.flow.InstalledAppFlow.from_client_secrets_file(
        client_secrets_file, scopes)
    # run_console() was removed in newer google-auth-oauthlib releases;
    # use run_local_server() there.
    credentials = flow.run_console()
    youtube = googleapiclient.discovery.build(
        api_service_name, api_version, credentials=credentials)

    request = youtube.videos().insert(
        part="snippet,status",
        body={
            "snippet": {
                "categoryId": "22",
                "description": ".",
                "title": "H "
            },
            "status": {
                "privacyStatus": "public"
            }
        },
        # TODO: For this request to work, you must replace "YOUR_FILE"
        # with a pointer to the actual file you are uploading.
        media_body=clip
    )
    response = request.execute()
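There is no way around OAuth for videos().insert(), but you can at least avoid re-running the consent flow on every upload by persisting the refresh token after the first run. A minimal sketch, assuming the same client_secrets.json; the token.json file name is just an example:
import os

import google_auth_oauthlib.flow
from google.auth.transport.requests import Request
from google.oauth2.credentials import Credentials

scopes = ["https://www.googleapis.com/auth/youtube.upload"]


def get_credentials():
    # Reuse saved credentials if present; refresh them when expired.
    if os.path.exists("token.json"):
        creds = Credentials.from_authorized_user_file("token.json", scopes)
        if creds.expired and creds.refresh_token:
            creds.refresh(Request())
        return creds

    # First run: do the interactive consent once and persist the result.
    flow = google_auth_oauthlib.flow.InstalledAppFlow.from_client_secrets_file(
        "client_secrets.json", scopes)
    creds = flow.run_local_server(port=0)
    with open("token.json", "w") as f:
        f.write(creds.to_json())
    return creds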
I would like to set the rules tab of my Firebase Storage to read-only. When I take out the write rule, I get an error in my Python script when trying to add files to my storage.
Below is my code:
import firebase_admin

config = {
    "apiKey": "xxx",
    "authDomain": "xxx",
    "databaseURL": "xxx",
    "projectId": "xxx",
    "storageBucket": "xxx",
    "messagingSenderId": "xxx",
    "appId": "xxx",
    "measurementId": "xxx"
}

# Initialize connection to firebase_admin
firebase = firebase_admin.initialize_app(config)
storage = firebase.storage()

path_on_cloud = "Data quality report/Data quality report.py"
path_local = "Data_quality_report.py"
storage.child(path_on_cloud).put(path_local)
My rules tab is
service firebase.storage {
  match /b/{bucket}/o {
    match /{allPaths=**} {
      allow read;
      allow write: if request.auth != null;
    }
  }
}
Do you guys know how to fix this problem?
Thank you in advance
Don't initialize it that way!
You are using the firebase_admin package, which uses the Firebase Admin SDK. To initialize it, create a service account key file and pass that to initialize_app.
Admin does what it sounds like it would do: it has all privileges, so your security rules don't apply to it and you can keep write locked down in the rules tab.
Refer to the official docs for a proper explanation of the setup (I am really bad at explaining things):
https://firebase.google.com/docs/storage/admin/start#python
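A minimal sketch of the Admin SDK way, assuming a downloaded key file named serviceAccountKey.json and your default bucket name (both placeholders):
import firebase_admin
from firebase_admin import credentials, storage

# Initialize with a service account key, not the client-side web config.
cred = credentials.Certificate("serviceAccountKey.json")
firebase_admin.initialize_app(cred, {
    "storageBucket": "your-project-id.appspot.com"  # placeholder bucket name
})

# The Admin SDK bypasses security rules, so removing the public write
# rule in the rules tab does not block this upload.
bucket = storage.bucket()
blob = bucket.blob("Data quality report/Data quality report.py")
blob.upload_from_filename("Data_quality_report.py")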