I am writing a serverless script as an AWS Lambda function (runtime Python 3.5) to connect to a SOAP server, get some data, process that data, then update some records in storage.
I wrote the script on my local machine, where I had to install the 'suds' SOAP client. I have it all working correctly; however, AWS doesn't have suds installed and I'm not sure how to get it installed, or whether I can.
Has anyone tried writing a SOAP client in AWS Lambda using Python, and if so, can they give me some suggestions on how to progress further?
Thanks
Kevin
This is really a more general question about packaging dependencies in a Python AWS Lambda deployment. The AWS documentation on Python deployment packages details the process of including your dependencies in your Lambda .zip deployment artifact.
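As a minimal sketch, assuming you vendor the suds-jurko fork (Python 3 compatible) into the zip with pip install suds-jurko -t . before packaging; the WSDL URL and the GetData operation below are placeholders:

# Minimal sketch: Lambda handler with the SOAP client vendored into the zip.
# Assumes the dependency was installed next to this file before zipping,
# e.g. with: pip install suds-jurko -t .
from suds.client import Client

WSDL_URL = "https://example.com/service?wsdl"  # placeholder WSDL

def lambda_handler(event, context):
    client = Client(WSDL_URL)                  # build the SOAP client from the WSDL
    result = client.service.GetData()          # hypothetical SOAP operation
    # ...process the result and update records in storage...
    return {"status": "ok"}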
Is it practically possible to simulate AWS environment locally using Moto and Python?
I want to write an AWS Glue job that will fetch records from my local database and upload them to an S3 bucket for data quality checks, and later trigger a Lambda function for a cron-job run, using the Moto library with the moto.lock_glue decorator. Any suggestion or documentation would be highly appreciated, as I don't see much of a clue on this. Thank you in advance.
AFAIK, moto is meant to patch boto modules for testing.
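For illustration, a minimal sketch of how moto is typically used; the decorator name depends on the moto version (mock_aws in moto 5.x), and the bucket and key names are made up:

import boto3
from moto import mock_aws  # moto 5.x; older versions use per-service decorators

@mock_aws
def test_upload_to_s3():
    # boto3 calls inside the decorated test hit moto's in-memory mock, not real AWS
    s3 = boto3.client("s3", region_name="us-east-1")
    s3.create_bucket(Bucket="my-test-bucket")
    s3.put_object(Bucket="my-test-bucket", Key="rows.csv", Body=b"a,b,c")
    assert "Contents" in s3.list_objects_v2(Bucket="my-test-bucket")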
I have experience working with LocalStack, a Docker image you can run locally that acts as a live service emulator for most AWS services (some are only available to paying users).
https://docs.localstack.cloud/getting-started/
You can see here which services are supported by the free version.
https://docs.localstack.cloud/user-guide/aws/feature-coverage/
In order to use it, you need to change the endpoint URL to point to the local service running in Docker.
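With boto3, for example, that is just the endpoint_url parameter; the port below assumes LocalStack's default edge port 4566:

import boto3

# Point the client at LocalStack instead of the real AWS endpoint.
# Credentials can be dummy values; LocalStack does not validate them.
s3 = boto3.client(
    "s3",
    endpoint_url="http://localhost:4566",   # assumes LocalStack's default edge port
    region_name="us-east-1",
    aws_access_key_id="test",
    aws_secret_access_key="test",
)

s3.create_bucket(Bucket="local-bucket")
print(s3.list_buckets()["Buckets"])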
As it runs in Docker, you can incorporate it into remote tests as well, e.g. if you're using k8s or a similar orchestrator.
I have a question about Python, Maven and AWS Lambda. Basically, I am trying to build dependency trees for my repos using the terminal command
mvn dependency:tree
This command is run via Python using the os library, i.e.
import os

# Shell out to Maven; os.system streams the output and returns the exit status.
os.system('mvn dependency:tree')
Now comes the issue - I need to run this on AWS Lambda.
Being aware that AWS Lambda is serverless and that each Lambda's deployment package plus layers is limited to 250 MB (unzipped): 1) is it possible to run terminal commands via Lambda without spinning up any sort of server? and 2) Maven usually needs to be installed on a system, so is it possible, or even viable, to run Maven on AWS Lambda?
Any input will be appreciated.
Thanks
is it possible to run terminal commands via lambda without spinning up any sort of server?
Yes, you can run terminal commands in a Lambda function.
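A minimal sketch of shelling out from a handler; the binary you invoke (here mvn) must be available in the runtime, a layer, or your container image, and the working directory below is a hypothetical checkout location:

import subprocess

def lambda_handler(event, context):
    # Run the external command and capture its output; /tmp is the only writable path.
    completed = subprocess.run(
        ["mvn", "dependency:tree"],
        capture_output=True,
        text=True,
        cwd="/tmp/project",   # hypothetical location where the repo was checked out
    )
    return {"exit_code": completed.returncode, "stdout": completed.stdout}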
maven usually needs to be installed on a system, thus is it possible, or even viable, to run maven on AWS Lambda?
You can create a custom Lambda container image (up to 10 GB) that includes Maven, a JDK and any other dependencies you need.
Additional AWS Blog Post: https://aws.amazon.com/blogs/aws/new-for-aws-lambda-container-image-support/
I’m building an app that runs on K8s version 1.21, and the container already includes Python 3.9.2. Do I still need to install https://github.com/kubernetes-client/python if I want to interact with Kubernetes using Python, or am I good?
Thanks,
The Kubernetes Python client is the library that helps you interact with the Kubernetes API.
So if you want to do anything with the Kubernetes API from inside your Python program (e.g. query the currently running Pods), then you need to install the Kubernetes Python client.
However, if your application is just deployed on Kubernetes but does not need to interact with the Kubernetes API, then you don't need it.
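For instance, a minimal sketch of querying the running Pods with the client, assuming the code runs inside the cluster under a service account allowed to list pods:

from kubernetes import client, config

# Inside a Pod, load credentials from the mounted service account;
# outside the cluster you would use config.load_kube_config() instead.
config.load_incluster_config()

v1 = client.CoreV1Api()
for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)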
I would like more info on what you are planning to do. If you are just going to run Python programs, then you don't need this library at all; it is for accessing the Kubernetes (K8s) REST APIs. Also, even for the REST API you can make normal REST calls and handle the requests/responses yourself, or let this library do the heavy lifting. Whether Python is running outside or inside a container or pod, you only need the library for accessing the REST APIs to perform specific API operations; it is not required for Python itself to work.
Ref: https://kubernetes.io/docs/reference/using-api/client-libraries/
I have a Python AWS Lambda running on Linux, but due to some dependencies, I need it to be deployed on Windows. I have tried Python Azure Functions and successfully deployed them on Linux as well, but found out that they cannot be deployed on Windows. Is it possible to do this with AWS Lambda?
Basically, my solution has a few .exe files that need to be run by a Python library (Tesseract OCR and pytesseract).
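For context, this is roughly how the OS dependency shows up; the install path below is just a hypothetical Windows location:

import pytesseract
from PIL import Image

# pytesseract shells out to a native tesseract binary; on Windows that is a .exe
# whose path must exist on the host, which is why the runtime OS matters here.
pytesseract.pytesseract.tesseract_cmd = r"C:\Program Files\Tesseract-OCR\tesseract.exe"

text = pytesseract.image_to_string(Image.open("scan.png"))  # OCR a local image
print(text)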
AWS Lambda and Azure Functions are Function as a Service (FaaS) offerings, where the developer worries about the code and the cloud provider worries about availability, scalability and the underlying platform that runs the code.
In that respect, you can't run either of them on a server you control. If you need specific Windows dependencies, you must create a Python project as you normally would, install the dependencies and configure a Windows Server yourself, taking responsibility for the infrastructure and OS configuration and management.
I have been searching for this for a few days and found some approaches like Serverless or LocalStack, but what I would really like is to code everything using AWS API Gateway and Lambdas for a cloud-based version of my software (which is solved) and not have to manage my deployments.
Then...
A customer wants to host a copy of it inside its own private network, so... I want to use the very same Lambda code (which makes no use of other AWS 'magic' services like DynamoDB; only "regular" dependencies), injecting it into a container running "API Gateway"-like software (perhaps Python/Flask parsing the exported API Gateway config?).
I am willing to build this layer unless a better idea shows up. I would put my Lambdas in a folder, let's say "aws_lambda", and my container would know how to transform the HTTP payload into an AWS event payload, import the module, call 'lambda_handler', and hopefully that would be it. With another container running MySQL and another running Nginx (emulating CloudFront for the static website), I would be done. The whole solution in a can.
Any suggestions? Am I crazy?
Does anyone know some existing software solution to solve this?
If you are willing to use AWS SAM, the AWS SAM CLI offers what you're looking for.
The AWS SAM CLI implements its own API Gateway equivalent and runs the AWS Lambda functions in Docker containers. While it's mainly intended for testing, I don't see any reason why you shouldn't be able to use it for your use case as well.
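For example, a handler written for the API Gateway proxy integration works unchanged under sam local start-api; the route wiring is assumed to be defined in your SAM template:

import json

def lambda_handler(event, context):
    # API Gateway proxy integration: the HTTP request arrives as an event dict
    # and the response must carry statusCode/headers/body.
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": "hello " + name}),
    }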
Besides the various Serverless plugins and LocalStack, you can try the AWS SAM CLI to run a local API Gateway. The command is sam local start-api: https://docs.aws.amazon.com/lambda/latest/dg/test-sam-cli.html . It probably would not scale, I've never tried it myself, and it is intended for testing.
Curiously, what you are considering (transforming a Lambda into a normal Flask server) is the opposite of Zappa, a serverless package that converts a normal Flask server into a Lambda function and uploads it to AWS. If you succeed in your original idea of converting a Flask request into a Lambda event and care to package your code, it could be called "unzappa". While Zappa is a mature and large package, it would probably be easier to 'invert' some lightweight thing like awsgi: https://github.com/slank/awsgi/blob/master/awsgi/init.py
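A minimal sketch of that inversion, assuming your handlers live in an aws_lambda package and only need an API-Gateway-shaped event; the module name and event fields below are assumptions, not a full emulation:

from flask import Flask, request
from aws_lambda import my_function  # hypothetical module exposing lambda_handler

app = Flask(__name__)

@app.route("/<path:path>", methods=["GET", "POST", "PUT", "DELETE"])
def invoke(path):
    # Translate the Flask request into a minimal API-Gateway-style event.
    event = {
        "httpMethod": request.method,
        "path": "/" + path,
        "headers": dict(request.headers),
        "queryStringParameters": request.args.to_dict() or None,
        "body": request.get_data(as_text=True) or None,
    }
    result = my_function.lambda_handler(event, None)  # Lambda context not emulated
    return (result.get("body", ""), result.get("statusCode", 200),
            result.get("headers", {}))

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)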
#Lovato, I have been using https://github.com/lambci/docker-lambda, which is a Docker image that mimics the Lambda environment. lambci seems to maintain good versions of the Lambda images for Node.js, Java, Python, .NET and even Go, so you can technically reuse your entire Lambda code in a Docker container running a Lambda-"like" environment. I call it Lambda-like mostly because AWS doesn't fully publish every piece of information about how Lambda works, but this is the nearest approximation I've seen. I use this for local development and testing of Lambdas, and I have tested a trial "offline" Lambda. Let me know if this works for you.
I prefer to use the Dockerfiles and build my own Docker images for my use case.