I am attempting to import the google-cloud-firestore module in a Python script on AWS Lambda. I have the module installed in my virtual environment and upload the code and packages as a zip file, but I receive the following error.
I have successfully imported and executed a script with the requests module using the same approach.
I have attempted to uninstall and reinstall the following modules in my virtual environment and still have no luck: google-cloud-firestore, grpcio, google-cloud-core.
Does anyone have a solution to this problem? What could I be missing here?
This thread gives me the impression that the google-cloud-firestore module cannot be used in a zip file. If this is the case, can I install the module at runtime?
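Something like this rough sketch is what I have in mind for installing at runtime, assuming pip is even available inside the Lambda runtime and that /tmp is writable (I have not verified either; the handler and directory names are just illustrative):

import subprocess
import sys

PACKAGE_DIR = "/tmp/packages"  # hypothetical writable location inside Lambda

def install_at_runtime(package):
    # Install the package into /tmp and make it importable for this invocation.
    subprocess.check_call(
        [sys.executable, "-m", "pip", "install", "--target", PACKAGE_DIR, package]
    )
    if PACKAGE_DIR not in sys.path:
        sys.path.insert(0, PACKAGE_DIR)

def lambda_handler(event, context):
    install_at_runtime("google-cloud-firestore")
    from google.cloud import firestore  # import only after the install
    # ... use firestore.Client() from here (credentials not shown) ...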
Thank you for any and all advice!
I want to use grequests on AWS Lambda. I created a venv and a requirements file, ran pip install into a folder named python, zipped that python folder, and uploaded it to a Lambda Layer. After that, I am facing this error:
Gevent is required for grequests.
I already tried this answer on AWS Linux but nothing changed.
Any solution or advice?
Edit: A suggested solution was to do the zipping on the EC2 server instead of pulling the files down from EC2 and zipping them locally. I did so, and the error message changed to:
Unable to import module 'lambda_function': No module named 'grequests'
Edit 2: I followed this guide and am now facing a new error :)
Unable to import module 'lambda_function': No module named 'zope.interface'
even though zope.interface is already installed.
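In case it helps to diagnose, here is a small sketch of how I am checking what the runtime can actually see (it assumes the layer is attached and mounted under /opt as usual; the handler name is just illustrative):

import os
import sys

def lambda_handler(event, context):
    # Layer contents are mounted under /opt; for Python, /opt/python should be on sys.path.
    print(sys.path)
    print(os.listdir("/opt/python"))  # should show the packages installed into the layer
    import grequests  # raises ImportError here if the layer contents are not importable
    return "ok"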
I downloaded a project and am working through installing all of its dependencies. At first I had errors about No module named 'utm' and No module named 'paho'. I solved those by going to C:\Users\me and running pip install paho-mqtt and pip install utm. Easy enough.
I then have the line "from mfa_msgs import Mission, WaypointList, MissionControl, ControlCommand, Status" and get a No module named error here. mfa_msgs is a folder in the project I downloaded that contains the Mission, WaypointList, etc. files. Where do I need to put the mfa_msgs folder so that I can import from it?
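For what it's worth, here is a minimal sketch of what I understand should work if I point Python at wherever mfa_msgs lives (the path below is just a guess on my part):

import sys

# Hypothetical path: the folder of the downloaded project that contains mfa_msgs/.
sys.path.insert(0, r"C:\path\to\downloaded_project")
# (In older Python versions, mfa_msgs also needs an __init__.py file to be a package.)

from mfa_msgs import Mission, WaypointList, MissionControl, ControlCommand, Status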
Appreciate your time!
I created a Lambda function which uploads data to Snowflake. I installed all of the requirements in a folder and zipped it along with my main Python file. When it runs in AWS it shows an error:
No module named 'cryptography.hazmat.bindings._constant_time'
But I do have this module at the specified path, and I don't know why the error arises.
Here is the code:
import snowflake.connector

def main(event, context):
    # Credentials and identifiers below are placeholders.
    cnx = snowflake.connector.connect(
        user='xxx', password='yyyyy', account='zzzz',
        database="db Name", schema="schema Name"
    )
    try:
        query = "SELECT * FROM Table_Name"
        cnx.cursor().execute(query)
    finally:
        cnx.close()
I recently encountered the same issue. It turned out my Lambda function runtime was Python 3.8 but the 'cffi' library had been compiled for Python 3.6. I created a new Lambda function with the Python 3.6 runtime and uploaded my deployment package to it and it began working right away.
I faced the same issue recently and found it is a problem with the Windows environment. Try creating a Linux environment, installing Python and the packages there, and zipping your code together with all of the libraries before uploading it back to AWS Lambda; hopefully it will work.
I needed to set up a virtualenv for my Lambda package to work. I also found that pip install snowflake-connector-python did not install some of the cryptography libraries; however, if I navigated to the directory I wanted them in and added --target . to the command, those libraries did get installed.
For Python 3.6, when I encountered the error "Unable to import module 'main': No module named '_cffi_backend'" in an AWS Lambda function, I was able to run mv _cffi_backend.cpython-36m-x86_64-linux-gnu.so _cffi_backend.so in my Linux Docker image with virtualenv, and the issue was resolved. As mentioned above, some dependencies might be better installed with --target to get them where you need them.
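If it is unclear which ABI the bundled extensions were built for, a small diagnostic sketch like this (paths and handler name are assumptions) prints the runtime version and the packaged .so files so the ABI tags, e.g. cpython-36m vs cpython-38, can be compared:

import glob
import sys

def lambda_handler(event, context):
    print("runtime:", sys.version)
    # The deployment package is unpacked under /var/task; list compiled extensions
    # so their ABI tags can be checked against the runtime version above.
    for so_file in glob.glob("/var/task/**/*.so", recursive=True):
        print(so_file)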
I have followed all the steps in the documentation:
https://docs.aws.amazon.com/lambda/latest/dg/lambda-python-how-to-create-deployment-package.html
Create a directory.
Save all of your Python source files (the .py files) at the root level of this directory.
Install any libraries using pip at the root level of the directory.
Zip the content of the project-dir directory.
But after I uploaded the zip file to the Lambda function, I got the error message below when I tested the script.
my code:
import psycopg2
#my code...
the error:
Unable to import module 'myfilemane': No module named 'psycopg2._psycopg'
I don't know where the '_psycopg' suffix comes from...
Any help regarding this?
You are using native libraries with Lambda. We had a similar problem, and here is how we solved it.
Spin up a machine with the AWS-supported AMI that runs your real Lambda.
https://docs.aws.amazon.com/lambda/latest/dg/current-supported-versions.html
As of this writing, it is:
AMI name: amzn-ami-hvm-2017.03.1.20170812-x86_64-gp2
Full documentation on installing native modules for your Python Lambda:
https://docs.aws.amazon.com/lambda/latest/dg/lambda-python-how-to-create-deployment-package.html
Install the modules required for your Lambda:
pip install module-name -t /path/to/project-dir
and prepare your package for upload, along with the native modules, under the Lambda AMI environment.
Hope this helps.
I believe this is caused because psycopg2 needs to be built and compiled with statically linked libraries for Linux. Please see Using psycopg2 with Lambda to Update Redshift (Python) for more details on this issue. Another [reference][1] covers problems compiling psycopg2 on OS X.
There are a few solutions, but it basically comes down to installing the library on a Linux machine and using that copy of psycopg2 in your upload package.
I'm trying to deploy Django on AWS Elastic Beanstalk.
While following the steps shown here, I got stuck at the "eb init" command.
I'm using Python 2.7 in Ubuntu 12.10 (vmware)
I'm getting the error as below:
eb init
.....
from lib.aws.http_client import HTTP_GET, HTTP_POST
File "/home/g/Documents/Files/AWS/AWS-ElasticBeanstalk-CLI-2.4.0/eb/linux/python2.7/lib/aws/http_client.py", line 17, in <module>
from httplib import HTTPSConnection
ImportError: cannot import name HTTPSConnection
Two possibilities spring to mind...
The Python installation on AWS doesn't include SSL support.
You've created a file called httplib.py which is shadowing the one in the standard Python library.
Try doing import ssl, and if you get ImportError: No module named _ssl, then it's #1, otherwise it's #2.
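For example, a quick way to run that check (just a sketch, from the same interpreter that eb uses):

try:
    import ssl
    print("ssl imports fine - look for a local httplib.py shadowing the stdlib module")
except ImportError:
    print("this Python build has no _ssl support")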
I had installed Python via Homebrew and was getting this error. For some reason, the solution was to uninstall and reinstall it:
brew uninstall python
brew install python
I had the same problem with a virtual environment. I deleted the virtual environment and recreated it and the problem disappeared.