Serverless AWS Lambda: no module named `secret_manager` - Python

I installed serverless-python-requirements using npm.
I use the secret-manager library in handler.py.
I am able to successfully deploy the Lambda function using Serverless (no errors).
I have secret-manager listed (along with other PyPI packages) in requirements.txt.
To package it, I include the following lines in serverless.yml:
custom:
  pythonRequirements:
    dockerizePip: true
To verify that secret-manager is packaged with the other PyPI binaries, I downloaded the deployed Lambda as a zip file and confirmed that it does contain secretmanager along with the other PyPI binaries.
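For example, one way to double-check the deployed artifact from the command line (a sketch assuming the AWS CLI is configured; the function name is a placeholder):
$ aws lambda get-function --function-name my-function --query 'Code.Location' --output text
$ curl -o deployed.zip "<pre-signed URL printed above>"
$ unzip -l deployed.zip | grep -i secret
get-function returns a pre-signed URL to the deployment package, so its contents can be listed without redeploying.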
But for some reason, it still fails, saying the secret_manager module was not found:
{
  "errorMessage": "Unable to import module 'handler': No module named 'secret_manager'",
  "errorType": "Runtime.ImportModuleError"
}
Note that the PyPI package's folder is named secretmanager, while the module file inside it is secret_manager.py.
I forgot to include a separate, user-written file secret_manager.py that had the logic for getting the secret from Secret Manager. The error had nothing to do with the PyPI package.

Related

Python Azure Functions: arcgis package has egg-info instead of dist-info in Lib\site-packages

I am testing an Azure Durable Function with the following requirements.txt:
azure-functions
azure-functions-durable
datetime
requests==2.23.0
arcgis==1.8.0.post1
openpyxl==3.0.3
aiohttp==3.7.3
numpy
When I debug it using VS Code, the venv is created with all the relevant packages, but the terminal fails with Exception: ModuleNotFoundError: No module named 'arcgis', despite this module being in the requirements file.
When I check the Troubleshooting Guide at https://aka.ms/functions-modulenotfound, one of the possible causes it mentions concerns the package's dist-info metadata. However, when I check my venv, there is no dist-info folder for arcgis, only:
arcgis-1.8.0.post1-py3.9.egg-info
Is there a way I can install the wheel instead?
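A hedged workaround, assuming a pre-built wheel of arcgis 1.8.0.post1 exists for your platform and Python version, is to tell pip to refuse source distributions so only wheels are installed:
$ pip install --only-binary :all: arcgis==1.8.0.post1
Wheel installs record their metadata in a dist-info directory, while legacy setup.py-based installs from a source distribution can leave egg-info instead, which appears to be what the troubleshooting guide flags.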

Cannot import name 'cygrpc' from 'grpc._cython' - Google Ads API

I want to deploy a working Python project from PyCharm to AWS Lambda. The project uses the google-ads library to get some report data from Google Ads.
I tried deploying the Lambda by uploading the complete project as a zip file, zipping all the folders/files inside the project rather than the project folder itself. But I got the following error:
{
  "errorMessage": "Unable to import module 'main': cannot import name 'cygrpc' from 'grpc._cython' (/var/task/grpc/_cython/__init__.py)",
  "errorType": "Runtime.ImportModuleError",
  "stackTrace": []
}
Assuming that the google-ads library works and that something is wrong with grpc (google-ads pulls in grpcio and related packages on its own), I tried to create a layer for grpcio, cython, and cygrpc, but the error remains the same.
I have created projects/layers in AWS Lambda before and they work. I don't know what I am doing wrong here.
Any help would be really appreciated!
Versions: google-ads 14.1.0, Python 3.9, grpcio 1.43.0
Answering my own question after a lot of trial and error. I have made it generic so anyone can use it.
I believe you can fix this kind of ImportModuleError as long as your deployment package's file structure, code, and architecture are correct. In this case grpcio ships compiled binaries, so a package built on Windows will not run on Lambda's Linux runtime; building it on Linux fixes that. To fix your structure and architecture, follow the steps below (a consolidated script follows the list):
1- Install "Ubuntu 18.04 LTS" from the Microsoft Store (Windows 10).
2- Open CMD and run the following commands:
ubuntu1804
Enter your password, or create a user if asked.
cd /mnt/c (you can choose any of your drives; I chose C)
mkdir my-lambda-folder (create the project folder)
cd my-lambda-folder (enter the project folder)
touch lambda_function.py (create a file called lambda_function.py)
Now copy and paste your code into the file you just created, i.e. lambda_function.py.
pip install --target ./package your-module-name
For example, pip install --target ./package google-ads will install the google-ads module inside the folder 'package'. The folder 'package' will be created automatically if it does not exist.
cd package
zip -r ../my-deployment-package.zip . (this creates my-deployment-package.zip at the root of your project folder, i.e. my-lambda-folder, with the installed libraries at the top level of the zip)
cd .. (go back to the root of your project folder)
zip -g my-deployment-package.zip lambda_function.py (add your Lambda function to the deployment package you just created, i.e. my-deployment-package.zip)
(Optional) In my case I was using google-ads, and to run my code I also needed the google-ads.yaml file in my deployment package, so I ran the additional command zip -g my-deployment-package.zip google-ads.yaml (I had already placed this file in my project folder).
3- Upload my-deployment-package.zip to your Lambda function in the AWS console and you are good to go.
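For convenience, here is the same sequence as a minimal shell sketch to run inside the Ubuntu shell (the module, folder, and file names are the placeholders from the steps above):
#!/bin/bash
set -e
cd /mnt/c/my-lambda-folder
# Install dependencies on Linux so compiled wheels (e.g. grpcio) match Lambda
pip install --target ./package google-ads
cd package
zip -r ../my-deployment-package.zip .   # dependencies at the zip root
cd ..
zip -g my-deployment-package.zip lambda_function.py   # add the handler
zip -g my-deployment-package.zip google-ads.yaml      # optional config file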
For me, it worked just by installing the packages with pip on Ubuntu inside Docker, then zipping and uploading them to AWS.
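For example, a sketch of that approach assuming Docker is installed (the AWS SAM build image tracks the Lambda Python 3.9 runtime):
$ docker run --rm -v "$PWD":/out public.ecr.aws/sam/build-python3.9 /bin/sh -c "pip install google-ads -t /out/package"
The resulting package/ folder can then be zipped and uploaded exactly as in the steps above.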

"Unable to import module 'lambda_function': libasound.so.2: cannot open shared object file: No such file or directory",

I have created a Lambda layer with the following Python packages using pip3:
google-cloud-texttospeech and
azure-cognitiveservices-speech
When I use this layer with a Python 3.8 Lambda function, I get this error:
{
  "errorMessage": "Unable to import module 'lambda_function': libasound.so.2: cannot open shared object file: No such file or directory",
  "errorType": "Runtime.ImportModuleError"
}
I removed the azure-cognitiveservices-speech package and the layer works fine with the Lambda function, which means the culprit is the azure-cognitiveservices-speech package. I couldn't find a way to solve the problem.
Any kind of help will be greatly appreciated. Thank you!
You need to install the package manually on your system and bundle the missing shared library with your Lambda zip file, as per the documentation.
Once installed, you can package your Python code and the .so files together and upload them to AWS Lambda. The folder structure, for reference, should look like this:
myawesomefunction.py
libasound.so.2
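A hedged way to obtain a Lambda-compatible libasound.so.2 is to pull it out of an Amazon Linux container (assuming Docker is available; alsa-lib is the package that ships the library there):
$ docker run --rm -v "$PWD":/out amazonlinux:2 bash -c "yum install -y alsa-lib && cp /usr/lib64/libasound.so.2 /out/"
Lambda's LD_LIBRARY_PATH includes /var/task, so a .so file placed at the root of the deployment package should be found at import time.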

How to package Scrapy dependency to lambda?

I am writing a Python application which depends on the Scrapy module. It works fine locally but fails when I run it from the AWS Lambda test console. My Python project has a requirements.txt file with the dependency below:
scrapy==1.6.0
I packaged all dependencies by following this link: https://docs.aws.amazon.com/lambda/latest/dg/lambda-python-how-to-create-deployment-package.html. Also, I put my source code (*.py) at the root level of the zip file. My package script can be found at https://github.com/zhaoyi0113/quote-datalake/blob/master/bin/deploy.sh.
It basically does two things: first, run pip install -r requirements.txt -t dist to download all dependencies into the dist directory; second, copy the app's Python source code into the dist directory.
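In other words, the core of the script is roughly:
$ pip install -r requirements.txt -t dist
$ cp *.py dist/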
The deployment is done via Terraform; below is the configuration file.
provider "aws" {
profile = "default"
region = "ap-southeast-2"
}
variable "runtime" {
default = "python3.6"
}
data "archive_file" "zipit" {
type = "zip"
source_dir = "crawler/dist"
output_path = "crawler/dist/deploy.zip"
}
resource "aws_lambda_function" "test_lambda" {
filename = "crawler/dist/deploy.zip"
function_name = "quote-crawler"
role = "arn:aws:iam::773592622512:role/LambdaRole"
handler = "handler.handler"
source_code_hash = "${data.archive_file.zipit.output_base64sha256}"
runtime = "${var.runtime}"
}
It zips the directory and uploads the file to Lambda.
I found I get the runtime error in Lambda, Unable to import module 'handler': cannot import name 'etree', whenever there is an import scrapy statement. I don't use etree in my code, so I believe it is something used by Scrapy.
My source code can be found at https://github.com/zhaoyi0113/quote-datalake/tree/master/crawler. There are only two simple Python files.
They work fine if I run them locally. The error only appears in Lambda. Is there a different way to package Scrapy for Lambda?
Based on the communication with Tim, the issue is caused by incompatible library versions between the local machine and Lambda.
The easiest way to resolve the issue is to use the lambci/lambda docker image to build the package with the command:
$ docker run -v $(pwd):/outputs -it --rm lambci/lambda:build-python3.6 pip install scrapy -t /outputs/
You need to provide the entire dependency tree; scrapy also has a set of dependencies (and they may have dependencies of their own).
The easiest way to download all the required dependencies is to use pip:
$ pip install -t packages/ scrapy
This will download scrapy and all of its dependencies into the folder packages/.
Scrapy has lxml and pyOpenSSL as dependencies that include compiled components. Unless they are statically linked, they will likely require the C libraries they depend on to be installed on the Lambda VM as well.
From the lxml documentation it requires:
libxml2 version 2.9.2 or later.
libxslt version 1.1.27 or later.
We recommend libxslt 1.1.28 or later.
Maybe try adding the installation of these to your deploy script. You should be able to use (I'm making a guess at the package names) yum -y install libxml2 libxslt.
Another good idea is to test your scripts on an Amazon Linux EC2 instance as this is close to the environment that Lambda executes in.

sphinx-doc documentation on readthedocs.org depends on private sphinxcontrib-package

I host a sphinx-doc website on readthedocs.org which depends on a private sphinx-doc extension/sphinxcontrib package, [extension name]. On my local PC this extension is located in the subdirectory [sphinx-doc project root]/exts/sphinxcontrib-[extension name] and referenced in conf.py with:
import os
import sys

sys.path.append(os.path.abspath('exts/sphinxcontrib-[extension name]/sphinxcontrib'))
extensions = [
    'sphinx.ext.graphviz',
    '[extension name]'
]
The .py file is located in the "source" directory sphinxcontrib. The local sphinx-doc build output is as expected. However the build on readthedocs.org fails due to an
ImportError: No module named 'sphinxcontrib'
...
Extension error:
Could not import extension sphinxcontrib.p3 (exception: No module named 'sphinxcontrib')
As the extension is in a beta phase, I do not want to distribute it online. (If I did, I guess I would be able to install it on readthedocs.org as a pre-build step from the web source.) How can I handle this issue?
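One hedged possibility, assuming the extension directory contains a setup.py (or pyproject.toml), is to let Read the Docs pip-install the extension from the repository itself via the project's requirements file, which keeps it private; a requirements file entry with the relative path makes pip run the equivalent of:
$ pip install ./exts/sphinxcontrib-[extension name]
Once the package is installed into the build environment, the sys.path.append line in conf.py should no longer be needed.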
