Changing AWS Lambda environment variables when running test - python

I've got a few small Python functions running on AWS that post to Twitter. I'm a novice when it comes to Lambda, knowing only enough to get the functions running.
The functions have environment variables set in Lambda with various bits of configuration, such as post frequency and the secret data for the Twitter application. These are read into the Python script directly.
It's all triggered by an EventBridge cron job that runs every hour.
I want to create a test event that will allow me to invoke the function manually, but I would like to be able to change the post frequency variable when it is run like this.
Is there a simple way to change environment variables when running a test event?

That is very much possible and there are multiple ways to do it. One is to use AWS CLI's aws lambda update-function-configuration: https://docs.aws.amazon.com/cli/latest/reference/lambda/update-function-configuration.html
Alternatively, depending on the programming language you prefer, you can use an AWS SDK, which has a similar method; you can find an example with the JS SDK in this doc: https://docs.aws.amazon.com/sdk-for-javascript/v3/developer-guide/javascript_lambda_code_examples.html
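Since your functions are in Python, the equivalent call in boto3 is update_function_configuration. A minimal sketch (the function name and variable name below are made up for illustration; note that this call replaces the whole environment, so merge with the existing variables first):

import boto3

client = boto3.client("lambda")

# Fetch the current environment so the other variables are not wiped out.
current = client.get_function_configuration(FunctionName="my-twitter-bot")
env = current.get("Environment", {}).get("Variables", {})

# Override just the post-frequency variable and push the merged set back.
env["POST_FREQUENCY"] = "5"
client.update_function_configuration(
    FunctionName="my-twitter-bot",
    Environment={"Variables": env},
)

Keep in mind this changes the deployed configuration for subsequent scheduled runs too, so you may want to set the variable back afterwards.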

Related

How to launch GCP Compute Engine VM with Startup Script via Python API?

GCP has a published create_instance() code snippet available here, which I've seen on SO in a couple places e.g. here. However, as you can see in the first link, it's from 2015 ("Copyright 2015 Google Inc"), and Google has since published another code sample for launching a GCE instance dated 2022. It's available on github here, and this newer create_instance function is what's featured in GCP's python API documentation here.
However, I can't figure out how to pass a startup script via metadata to run on VM startup using the modern python function. I tried adding
instance_client.metadata.items = {'key': 'startup-script',
                                  'value': job_script}
to the create.py function (again, available here along with supporting utility functions it calls) but it threw an error that the instance_client doesn't have that attribute.
GCP's documentation page for starting a GCE VM with a startup script is here, where unlike most other similar pages, it contains code snippets only for console, gcloud and (REST)API; not SDK code snippets for e.g. Python and Ruby that might show how to modify the python create_instance function above.
Is the best practice for launching a GCE VM with a startup script from a Python process really to send a POST request, or to wrap the gcloud command
gcloud compute instances create VM_NAME \
    --image-project=debian-cloud \
    --image-family=debian-10 \
    --metadata-from-file=startup-script=FILE_PATH
...in a subprocess.run()? To be honest I wouldn't mind doing things that way since the code is so compact (the gcloud command at least, not the POST request way), but since GCP provides a create_instance python function I had assumed using/modifying-as-necessary that would be the best practice from within python...
Thanks!
So, the simplest (!) way with the Python library to create the equivalent of --metadata-from-file=startup-script=${FILE_PATH} is probably:
from google.cloud import compute_v1

instance = compute_v1.Instance()
metadata = compute_v1.Metadata()
metadata.items = [
    {
        "key": "startup-script",
        "value": '#!/usr/bin/env bash\necho "Hello Freddie"'
    }
]
instance.metadata = metadata
And another way is:
metadata = compute_v1.Metadata()
items = compute_v1.types.Items()
items.key = "startup-script"
items.value = """
#!/usr/bin/env bash
echo "Hello Freddie"
"""
metadata.items = [items]
NOTE In the examples, I'm embedding the content of the FILE_PATH in the script for convenience but you could, of course, use Python's open to achieve a more comparable result.
It is generally always better to use a library|SDK if you have one to invoke functionality rather than use subprocess to invoke the binary. As mentioned in the comments, the primary reason is that language-specific calls give you typing (more in typed languages), controlled execution (e.g. try) and error handling. When you invoke a subprocess, it's string-based streams all the way down.
I agree that the Python library for Compute Engine using classes feels cumbersome but, when you're writing a script, the focus could be on the long-term benefits of more explicit definitions vs. the short-term loss of expressiveness. If you just want to insert a VM, by all means use gcloud compute instances create (I do this all the time in Bash) but, if you want to use a more elegant language like Python, then I encourage you to use Python entirely.
CURIOSITY gcloud is written in Python. If you use Python subprocess to invoke gcloud commands, you're using Python to invoke a shell that runs Python to make a REST call ;-)
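Putting the pieces together, a rough end-to-end sketch that reads the startup script from a file (as the NOTE above suggests) and attaches it to an instance insert; the project, zone, machine type and image below are placeholder values you'd replace:

from google.cloud import compute_v1

def create_instance_with_startup_script(project, zone, name, script_path):
    instance = compute_v1.Instance()
    instance.name = name
    instance.machine_type = f"zones/{zone}/machineTypes/e2-small"

    # Boot disk from a public Debian image.
    disk = compute_v1.AttachedDisk()
    disk.boot = True
    disk.auto_delete = True
    disk.initialize_params = compute_v1.AttachedDiskInitializeParams(
        source_image="projects/debian-cloud/global/images/family/debian-10"
    )
    instance.disks = [disk]

    # Attach to the default VPC network.
    nic = compute_v1.NetworkInterface()
    nic.network = "global/networks/default"
    instance.network_interfaces = [nic]

    # The equivalent of --metadata-from-file=startup-script=FILE_PATH.
    items = compute_v1.types.Items()
    items.key = "startup-script"
    with open(script_path) as f:
        items.value = f.read()
    instance.metadata = compute_v1.Metadata(items=[items])

    client = compute_v1.InstancesClient()
    return client.insert(project=project, zone=zone, instance_resource=instance)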

How to run a Python code in Apache Druid as a UDF?

I am trying to run a Python UDF directly on Druid. Running the Python function directly on the machines has many advantages, not the least of which is avoiding huge data transfers from and to the remote database server.
For simplicity's sake, suppose I have a simple Python function that I would like to run directly inside the Druid system. Here's a sample function:
import numpy

# Calculates the inverse of a matrix
def matrix_inverse(A):
    return numpy.linalg.inv(A)
I would like to run this function remotely and directly in Druid (and not on the client's side). The data used in the parameters (A) would be obtained from the database.
How could that be done?
No. Python UDFs are not available...yet.
There are JavaScript user defined functions:
https://druid.apache.org/docs/latest/development/javascript.html
Also consider creating a new feature request at: https://github.com/apache/druid/issues
and/or comment on this one: https://github.com/apache/druid/issues/10180

What is an event.json in AWS SAM?

When you complete this tutorial https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-getting-started-hello-world.html you download the AWS SAM CLI and run the commands to create a simple AWS hello-world application. When you run the program it triggers what AWS calls a Lambda function, and at the end of the tutorial you can open it in your browser using the URL http://127.0.0.1:3000/hello. If you see a message there that shows curly braces and the words 'hello world', it was successful.
Running the AWS SAM commands generates a lot of boilerplate code, which is a bit confusing. This can all be seen inside a code editor. One of the generated files is called event.json, which of course contains a JSON object, but why is it there? What does it represent in relation to this program? I am trying to understand what this AWS SAM application is ultimately doing and what the generated files mean and represent here.
Can someone simply break down what AWS SAM is doing and the meaning behind the boilerplate code it generates?
Thank you
event.json contains the input your Lambda function will get, in JSON format. Regardless of how a Lambda is triggered, it will always receive two fixed parameters: event and context. context contains additional information about the trigger, such as its source, while event contains any input parameters that your Lambda needs to run.
You can test this out yourself by editing the event.json and giving your own values. If you open the Lambda code file you will see this event object being used in the lambda_handler.
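For reference, the handler in the generated hello-world project looks roughly like this:

import json

def lambda_handler(event, context):
    # 'event' has the same shape as the contents of event.json;
    # in this tutorial it is an API Gateway proxy request.
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "hello world"}),
    }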
Other boilerplate stuff is your template, where you can define the configuration of your Lambdas as well as any other services you might use, like layers, a database, or API Gateway.
You also get a requirements.txt file which contains names of any third party libraries that your function requires. These will be packaged along with the code.
Ninad's answer is spot on. I just want to add a practical application of how these JSON files are used. One way the event.json is used is when you are invoking your Lambdas with the command sam local invoke. When you invoke the Lambda locally, you pass the event.json (or whatever you decide to call the file; you will likely have multiple) as a parameter. As Ninad mentioned, the event file has everything your Lambda needs to run in terms of input. When the Lambdas are hooked up to other services and running live, these inputs would be fed to your Lambda by that service.
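For example, with the names the tutorial's template generates, that looks like:

sam local invoke HelloWorldFunction --event events/event.json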

Is it possible to automate the execution of a Python script using Microsoft Flow?

I want to execute a snippet of python code based on some trigger using Microsoft-Flow. Is there a way to do this?
Basically I am exploring PowerApps and Microsoft Flow. I have data in a PowerApp and I can do basic operations there. But I want to execute a Python script whenever a user presses a button in the PowerApp, and display the result in the PowerApp again.
In theory you can do this with Azure Functions. The steps you need are the following:
Create an Azure function
Create the API definition using Python as the language
Export the definition to PowerApps/Flow
Add the function to your app as a data source OR
Add the function to Flow
It is still a little bit experimental, but you should be able to make it work.
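As a sketch of what the function itself might look like, here is a minimal HTTP-triggered Azure Function in Python (the function.json binding configuration that routes HTTP requests to it is omitted, and the doubling logic is just a stand-in for your real script):

import json
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Read the input sent from the PowerApp / Flow HTTP call.
    payload = req.get_json()

    # Run whatever Python logic you need and return the result.
    result = {"answer": payload.get("x", 0) * 2}
    return func.HttpResponse(json.dumps(result), mimetype="application/json")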

Python program on server - control via browser

I have to setup a program which reads in some parameters from a widget/gui, calculates some stuff based on database values and the input, and finally sends some ascii files via ftp to remote servers.
In general, I would suggest a Python program for these tasks: write a Qt widget as a GUI (interactively changing views, putting numbers into tables, setting up check boxes, switching between various layers - I've never done something that complex in Python, but I have some experience in IDL with event handling etc.), and set up data classes that have functions both to create the ascii files with the given convention and to send the files via FTP to some remote server.
However, since my company is a bunch of Windows users, each sitting at their personal desktop, installing python and all necessary libraries on each individual machine would be a pain in the ass.
In addition, in a future version the program is supposed to become smart and do some optimization 24/7. Therefore, it makes sense to put it to a server. As I personally rather use Linux, the server is already set up using Ubuntu server.
The idea is now to run my application on the server. But how can the users access and control the program?
The easiest way for everybody to access something like a common control panel would be a browser, I guess. I have to make sure only one person at a time is sending signals to the same units, but that should be doable via flags in the database.
After some googling, next to QtWebKit, Django seems to be the first choice for such a task. But...
Can I run a full fledged python program underneath my web application? Is django the right tool to do so?
As mentioned previously, in the (intermediate) future (~1 year), we might have to implement some computationally expensive tasks. Is it then also possible to utilize C, as one can within normal Python?
Another question I have is on the development. In order to become productive, we have to advance in small steps. Can I first create regular Python classes, which later on can be imported into my web application? (The same question applies for widgets/Qt.)
Finally: Is there a better way to go? Any standards, any references?
Django is a good candidate for the website, however:
It is not a good idea to run heavy functionality from a website; it should happen in a separate process.
All functions should be asynchronous, i.e. you should never wait for something to complete.
I would personally recommend writing a separate process with a message queue; the website would only ask that process for status and always display a result immediately to the user.
You can use ajax so that the browser will always have the latest result.
ZeroMQ or Celery are useful for implementing the functionality.
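A minimal sketch of that pattern with Celery (the broker URL and the task body are placeholders for your setup): the website only enqueues work, and a worker process does the heavy lifting.

# tasks.py - run in a separate worker process with: celery -A tasks worker
from celery import Celery

app = Celery("tasks",
             broker="redis://localhost:6379/0",
             backend="redis://localhost:6379/0")

@app.task
def build_and_upload(params):
    # The heavy calculation and the FTP upload happen here,
    # outside the web process.
    return "uploaded"

A Django view would then call build_and_upload.delay(params), hand the returned task id to the browser, and a second view polled via Ajax would report AsyncResult(task_id).state until the result is ready.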
You can implement functionality in C pretty easily. I recommend, however, that you write that functionality as pure C with a SWIG wrapper rather than writing it as an extension module for Python. That way the functionality will be portable and not dependent on the Python website.
