I have followed all the steps in the documentation:
https://docs.aws.amazon.com/lambda/latest/dg/lambda-python-how-to-create-deployment-package.html
Create a directory.
Save all of your Python source files (the .py files) at the root level of this directory.
Install any libraries using pip at the root level of the directory.
Zip the content of the project-dir directory.
But after I uploaded the zip file to the Lambda function, I got the following error message when I tested the script.
my code:
import psycopg2
#my code...
the error:
Unable to import module 'myfilemane': No module named 'psycopg2._psycopg'
I don't know where the '_psycopg' suffix comes from...
Any help regarding this?
You are using native libraries with Lambda. We had a similar problem and here is how we solved it.
Spin up a machine with the AWS-supported AMI that your real Lambda runs on.
https://docs.aws.amazon.com/lambda/latest/dg/current-supported-versions.html
As of this writing, it is:
AMI name: amzn-ami-hvm-2017.03.1.20170812-x86_64-gp2
Full documentation on installing native modules for your Python Lambda:
https://docs.aws.amazon.com/lambda/latest/dg/lambda-python-how-to-create-deployment-package.html
Install the modules required for your Lambda:
pip install module-name -t /path/to/project-dir
and prepare your package for upload, building the native modules in the Lambda AMI environment.
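For example, a minimal sketch of the commands you might run on that EC2 instance, assuming your handler and .py files live in project-dir and that the build prerequisites for psycopg2 (such as gcc and postgresql-devel) are available there:
# run on the Amazon Linux EC2 instance
cd project-dir
pip install psycopg2 -t .
zip -r ../lambda-package.zip .
# then upload lambda-package.zip to the Lambda function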
Hope this helps.
I believe this is caused because psycopg2 needs to be built and compiled with statically linked libraries for Linux. Please reference Using psycopg2 with Lambda to Update Redshift (Python) for more details on this issue. Another reference covers the problems of compiling psycopg2 on OS X.
There are a few solutions, but basically it comes down to installing the library on a Linux machine and using that as the psycopg2 library in your upload package.
Related
I am working on an AWS Lambda function; I installed the package but I got this error:
Unable to import module 'lambda_function': cannot import name 'cygrpc' from 'grpc._cython' (/var/task/grpc/_cython/__init__.py)
How do I solve this type of error?
Are you trying to import it in code without having it installed in the Lambda container? If you are going to use non-standard libs, you should install your function and its libs locally, then zip them and upload the zip to Lambda.
Docs:
https://docs.aws.amazon.com/lambda/latest/dg/gettingstarted-package.html
In addition to the answer marked correct, I wanted to mention that for libraries using C extensions it is important that the layer is built on the same type of system it runs on in Lambda. For example, if you're trying to build a layer including grpc on macOS, it most likely won't work, since Lambda runs on Linux. To make the layer work you have to build it on a Linux system (for example, on a virtual machine or in Docker).
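As a hedged sketch of that Docker approach (the package name and Python version here are assumptions; match the image to your Lambda runtime), building a grpc layer on Linux might look like this:
# run pip inside a Linux-based Python container; layer contents go under a python/ folder
docker run --rm -v "$PWD":/out python:3.9 pip install grpcio -t /out/python
# zip the python/ folder and upload it as a Lambda layer
zip -r grpc-layer.zip python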
I need to run some long running job via Azure Web Job in Python.
I am facing the below error when trying to import pandas.
File "D:\local\Temp\jobs\triggered\demo2\eveazbwc.iyd\pandas_init_.py", line 13
missing_dependencies.append(f"{dependency}: {e}")
The Web app (under which I will run the web job) also has python code using pandas, and it does not throw any error.
I have tried uploading pandas and numpy folder inside the zip file (creating venv, installing packages and zipping Lib/site-packages content), (for 32 bit and 64 bit python) as well as tried appending 'D:/home/site/wwwroot/my_app_name/env/Lib/site-packages' to sys.path.
I am not facing such issues in importing standard python modules or additional package modules like requests.
The error is also thrown when trying to import numpy.
So, I am assuming some kind of version mismatch is happening somewhere.
Any pointers to solve this will be really useful.
I have been using Python 3.x, not sure if I should try Python 2.x (virtual env, install package and zip content of Lib/site-packages).
Regards
Kunal
The key to solving the problem is to add Python 3.6.4 x64 Extension on the portal.
Steps:
Add Extensions on portal.
Create a requirements.txt file.
pandas==1.1.4
numpy==1.19.3
Create run.cmd (a hedged example is sketched after these steps).
Create these files and zip them into a single zip.
Upload the zip for the webjob.
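For reference, a hedged sketch of what run.cmd might contain, assuming the Python 3.6.4 x64 site extension installs to D:\home\python364x64 and your entry script is named run.py (both are assumptions):
rem install dependencies with the site-extension Python, then run the script
D:\home\python364x64\python.exe -m pip install -r requirements.txt
D:\home\python364x64\python.exe run.py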
For more details, please read this post.
Webjobs Running Error (3587fd: ERR ) from zipfile
I created a Lambda function which uploads data to Snowflake. I installed all the requirements in a folder and zipped it along with my main Python file. When I run it in AWS it shows an error:
no module found: cryptography.hazmat.bindings._constant_time
But I have this module at the specified path, so I don't know why the error arises.
Here is the code:
import snowflake.connector

def main(event, context):
    # connection parameters below are placeholders
    cnx = snowflake.connector.connect(user='xxx', password='yyyyy', account='zzzz', database="db Name", schema="schema Name")
    try:
        query = "SELECT * FROM Table_Name"
        cnx.cursor().execute(query)
    finally:
        cnx.close()
I recently encountered the same issue. It turned out my Lambda function runtime was Python 3.8 but the 'cffi' library had been compiled for Python 3.6. I created a new Lambda function with the Python 3.6 runtime and uploaded my deployment package to it and it began working right away.
I faced the same issue recently and found it is a problem with the Windows environment. Try creating a Linux environment, install Python and the packages there, zip your code with all the libraries, and then upload it back to AWS Lambda; hopefully it will work.
I needed to set up a virtualenv for my Lambda package to work. I also found that pip install snowflake-connector-python did not install some of the cryptography libraries, although if I navigated to the directory I wanted them put in, adding --target . did cause those libraries to get installed.
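In other words, something like the following, run from inside the deployment directory (a minimal sketch of the command described above):
pip install snowflake-connector-python --target .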
For Python 3.6, when I encountered the error "Unable to import module 'main': No module named '_cffi_backend'" in an AWS Lambda function, I was able to run mv _cffi_backend.cpython-36m-x86_64-linux-gnu.so _cffi_backend.so in my Linux Docker image with virtualenv and the issue was resolved. As mentioned above, some dependencies might be better placed with --target, to get them where you need them.
I am trying to execute a Python function on AWS Lambda. In my function I am trying to import the mysql.connector module, but an error is thrown:
"errorMessage": "No module named 'mysql.connector'"
I had originally written my Python code on my EC2 instance. I installed mysql-connector there, into my Python file's directory, using pip:
pip install mysql-connector -t /path/to/file/dir
I uploaded a zip of only the file itself, not a folder containing the file.
The pattern I have been using to deploy Python libs to Lambda is the following.
First, prior to packaging the Lambda function, install all of the requirements into the $SOURCE_ROOT/lib folder:
pip install -r requirements.txt -t ./lib
Second, add an automatic import of this folder in your Lambda entrypoint (that is, the Lambda handler):
import os
import sys

# if running in lambda
if 'LAMBDA_TASK_ROOT' in os.environ:
    sys.path.append(f"{os.environ['LAMBDA_TASK_ROOT']}/lib")
    # this will render all of your packages placed as subdirs available
    sys.path.append(os.path.dirname(os.path.realpath(__file__)))
Extending sys.path with your own packaged path is crucial for this to work.
Note on native compiled extensions
Note that if you are packaging any natively compiled extensions, their compilation should be done on a Linux-compatible OS, ideally on an EC2 instance created from the Amazon Linux AMI that Lambda runs on (currently amzn-ami-hvm-2017.03.1.20170812-x86_64-gp2; up-to-date information can always be obtained from the official Amazon docs). In my experience, extensions built in the official Python Docker container worked on Lambda without needing to compile on an actual EC2 instance created from that AMI, but there is no guarantee, as the official docs state that:
If you are using any native binaries in your code, make sure they are compiled in this environment. Note that only 64-bit binaries are supported on AWS Lambda.
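As a hedged sketch of that Docker approach (the image tag is an assumption and should match your Lambda runtime, and requirements.txt is assumed to be in the current directory), the ./lib folder from the first step could be built like this:
# build requirements inside a Linux-based Python container so native extensions match Lambda
docker run --rm -v "$PWD":/var/task python:3.9 pip install -r /var/task/requirements.txt -t /var/task/lib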
MySQL Connector
A quick look at the MySQL connector for Python gives the impression that by default the package uses a pure-Python implementation, so no C extension is loaded.
Lambda is just like an EC2 instance with only Python installed. You need to include all the packages required to run the Python code in the deployment package itself; none of the packages will be pre-installed when you run the code.
There are two ways to pack python dependencies with Lambda.
Pack it with the function itself. To do this you just go to the root folder of your function, and install dependencies in that folder itself, not into any other folder.
$ pip install <some-package> -t .
Then zip that folder from root and upload, your zip should look something like this:
.
├── lambda_function.py
├── pymysql
│ └── ...
└── PyMySQL-0.9.3.dist-info
└── ...
The second method is the standard I always follow: I never ship libraries or external packages with my Lambda function, I always create Layers.
A layer is a ZIP archive that contains libraries, a custom runtime, or
other dependencies. With layers, you can use libraries in your
function without needing to include them in your deployment package.
Learn More about Lambda Layers in the docs
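As a hedged sketch of that layer workflow (the layer name, package, and runtime below are placeholders), a layer can be built and published roughly like this:
# layer contents must live under a python/ directory inside the zip
mkdir -p python
pip install pymysql -t python/
zip -r pymysql-layer.zip python
# publish the layer, then attach it to the function in the console or CLI
aws lambda publish-layer-version --layer-name pymysql --zip-file fileb://pymysql-layer.zip --compatible-runtimes python3.9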
How do we create a pymssql package for Lambda? I tried creating it using
pip install pymssql -t .
When I run my Lambda function, it complains:
Unable to import module 'lambda_function': No module named lambda_function
I follow the steps on this link
http://docs.aws.amazon.com/lambda/latest/dg/lambda-python-how-to-create-deployment-package.html
I have a Windows machine.
Glad that it worked for you. Could you please share your working process? I too tried different trial-and-error steps and ended up with the following, which is working fine in AWS Lambda. I am using the pymssql package only.
1) Did pip install pymssql on an Amazon EC2 instance, as under the hood Amazon uses Linux AMIs to run their Lambda functions.
2) Copied the generated .so files and packaged them inside the Lambda deployment package. Hope this helps others who are searching for the solution as well.
Hope this helps you further. Can you please share what you did to connect to the MSSQL server using AWS Lambda?
Below is the folder structure of my lambda deployment package
Windows "pymssql" file not supported in AWS lambda since lambda running on Amazon Linux 2.
So we can get Linux-supported wheel files from the following official "pymssql" website download link. You look for "manylinux_2_24_x86_64" version file with the required python version. At the time of writing python 3.9 is the latest version. So download file will be "pymssql-2.2.2-cp39-cp39-manylinux_2_24_x86_64.whl".
Once download the wheel file execute the below command
pip install {path of downloaded wheel file} -t {target folder to store this package}
Example
pip install pymssql-2.2.2-cp39-cp39-manylinux_2_24_x86_64.whl -t /package
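From there, a hedged sketch of the remaining steps, assuming your handler file is lambda_function.py and the target folder used above was /package (both are placeholders):
# put the handler next to the installed packages and zip everything from inside the folder
cp lambda_function.py /package/
cd /package
zip -r ../deployment.zip .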
Finally I could do it. It didn't work with Windows packages, so I used Ubuntu to package the freetds.so file and it worked.
Along with pymssql, try to import Cython.
Just a quick update for new changes.
There seems to be some issue with Python 3.9 on AWS Lambda; it throws the below error on pymssql==2.2.3:
{
  "errorMessage": "Unable to import module 'index': No module named 'pymssql._pymssql'",
  "errorType": "Runtime.ImportModuleError",
  "stackTrace": []
}
but if I change the python version to 3.8, the error vanishes.