I am new to AWS Lambda and I am trying to move my existing code to AWS Lambda. My existing code looks like this:
import boto3
import slack
import slack.chat
import time
import itertools
from slacker import Slacker
ACCESS_KEY = ""
SECRET_KEY = ""
slack.api_token = ""
slack_channel = "#my_test_channel"
def gather_info_ansible():
.
.
def call_snapshot_creater(data):
.
.
def call_snapshot_destroyer(data):
.
.
if __name__ == '__main__':
print "Calling Ansible Box Gather detail Method first!"
ansible_box_info = gather_info_ansible()
print "Now Calling the Destroyer of SNAPSHOT!! BEHOLD THIS IS HELL!!"
call_snapshot_destroyer(ansible_box_info)
#mapping = {i[0]: [i[1], i[2]] for i in data}
print "Now Calling the Snapshot Creater!"
call_snapshot_creater(ansible_box_info)
Now I try to create a Lambda function from scratch in the AWS console as follows (a hello world):
from __future__ import print_function
import json
print('Loading function')
def lambda_handler(event, context):
#print("Received event: " + json.dumps(event, indent=2))
print("value1 = " + event['key1'])
print("value2 = " + event['key2'])
print("value3 = " + event['key3'])
print("test")
return event['key1'] # Echo back the first key value
#raise Exception('Something went wrong')
and the sample test event in the AWS console is:
{
"key3": "value3",
"key2": "value2",
"key1": "value1"
}
I am really not sure how to put my code in AWS Lambda, because even if I add the modules in the Lambda console and run it, it throws this error:
Unable to import module 'lambda_function': No module named slack
How do I solve this and import my code into Lambda?
You have to make a zipped package consisting of your Python script containing the Lambda function and all the modules that you import in that script. Upload the zipped package to AWS.
Whatever module you want to import, you have to include in the zip package. Only then will the import statements work.
For example, your zip package should consist of:
test_package.zip
|-test.py (script containing the lambda_handler function)
|-boto3(module folder)
|-slack(module folder)
|-slacker(module folder)
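One more thing to note: Lambda does not run the if __name__ == '__main__': block; it imports the file and calls the handler you configure. Below is a minimal sketch of wrapping the flow from the question in a handler (the three functions come from the question's script and must be defined in, or importable from, the zipped package):
# lambda_function.py - sketch of a Lambda entry point wrapping the existing flow
# gather_info_ansible, call_snapshot_destroyer and call_snapshot_creater are the
# functions from the question's script
def lambda_handler(event, context):
    print("Calling Ansible Box Gather detail Method first!")
    ansible_box_info = gather_info_ansible()
    print("Now Calling the Destroyer of SNAPSHOT!!")
    call_snapshot_destroyer(ansible_box_info)
    print("Now Calling the Snapshot Creater!")
    call_snapshot_creater(ansible_box_info)
    return "snapshot rotation complete"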
You receive an error because AWS Lambda does not have any information about a module called slack.
A module is a set of .py files that are stored somewhere on a computer.
In the case of Lambda, you make all your libraries importable by creating a deployment package.
Here is another question that describes a similar case and provides several solutions:
AWS Lambda questions
I am using the Azure function below, which takes an HTTP trigger as input.
import logging
import azure.functions as func
from . import test
def main(req: func.HttpRequest) -> func.HttpResponse:
logging.info('Python HTTP trigger function processed a request.')
name = req.params.get('name')
strng = req.params.get('strng')
if not name:
try:
req_body = req.get_json()
except ValueError:
pass
else:
name = req_body.get('name')
if name:
return func.HttpResponse(f"Hello, {name}. This HTTP triggered function executed successfully.sum = {test.testfunc(strng)}")
else:
return func.HttpResponse(
"This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.",
status_code=200
)
Below is the test.py file which I imported in __init__.py.
import json
import pandas as pd
from pandas import DataFrame
from azure.storage.blob import AppendBlobService
from datetime import datetime
def testfunc(strng):
# return strng
API = json.loads(strng)
    test = pd.json_normalize(API, record_path='Vol', meta=['studyDate'])  # 'parse' was undefined; the parsed JSON is in API
df = pd.DataFrame(test)
df["x"] = df["Vol"] * 2
df["y"] = df["Vol"] * 50
df1 = df[['Date', 'x', 'y']]
df2 = df1.to_json(orient='records')
append_blob_service = AppendBlobService(account_name='actname',
account_key='key')
date = datetime.now()
blobname = f"test_cal{date}.json"
append_blob_service.create_blob('container', blobname, if_none_match="*")
append_blob_service.append_blob_from_text('container', blobname, text=df2)
return df2
The above function works when I run it in PyCharm and Databricks. But when I run the Azure function via Visual Studio Code, I get the below error.
Exception: ImportError: cannot import name 'AppendBlobService' from 'azure.storage.blob'
I have tried below and still get the same error.
pip install azure-storage --upgrade
pip install azure-storage-blob
Kindly provide assistance with the error.
Is there any other way I can save the df2 variable to Azure Storage?
Thank you.
According to the documentation:
If the module (for example, azure-storage-blob) cannot be found, Python throws the ModuleNotFoundError. If it is found, there may be an issue with loading the module or some of its files. Python would throw a ImportError in those cases.
Try setting the FUNCTIONS_WORKER_RUNTIME environment variable to python, or
Try adding azure.core to your requirements.txt file.
References taken from:
ImportError: cannot import name 'BlockBlobService' from 'azure.storage.blob'
Azure Functions fails running when deployed
The current version of the library contains BlobServiceClient instead of BlockBlobService/AppendBlobService. Using version 2.1.0 works for me and can solve it:
pip install azure-storage-blob==2.1.0
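Alternatively, if you want to stay on the current azure-storage-blob (version 12 and later) instead of pinning 2.1.0, append blobs are handled through BlobClient there. A rough sketch, using a placeholder connection string and the same container/blob naming as in the question:
# Sketch for azure-storage-blob >= 12; the connection string is a placeholder
from datetime import datetime
from azure.storage.blob import BlobServiceClient

CONN_STR = "<your storage account connection string>"

def save_json_to_append_blob(df2):
    service = BlobServiceClient.from_connection_string(CONN_STR)
    blob_name = f"test_cal{datetime.now()}.json"
    blob = service.get_blob_client(container="container", blob=blob_name)
    blob.create_append_blob()   # create the append blob
    blob.append_block(df2)      # df2 is the JSON string produced by to_json()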
I came across this Python class ResourcesMoveInfo for moving resources (Azure images) from one subscription to another with the Azure Python SDK.
But it fails when I use it like below:
Pattern 1
reference from https://buildmedia.readthedocs.org/media/pdf/azure-sdk-for-python/v1.0.3/azure-sdk-for-python.pdf
Usage:
metadata = azure.mgmt.resource.resourcemanagement.ResourcesMoveInfo(resources=rid,target_resource_group='/subscriptions/{0}/resourceGroups/{1}'.format(self.prod_subscription_id,self.resource_group))
Error:
AttributeError: module 'azure.mgmt.resource' has no attribute 'resourcemanagement'
Pattern 2
reference from - https://learn.microsoft.com/en-us/python/api/azure-mgmt-resource/azure.mgmt.resource.resources.v2019_07_01.models.resourcesmoveinfo?view=azure-python
Usage:
metadata = azure.mgmt.resource.resources.v2020_06_01.models.ResourcesMoveInfo(resources=rid,target_resource_group='/subscriptions/{0}/resourceGroups/{1}'.format(self.prod_subscription_id,self.resource_group))
Error:
AttributeError: module 'azure.mgmt.resource.resources' has no attribute 'v2020_06_01'
Any help on this requirement/issue would be helpful. Thanks!
Adding code snippet here:
import sys
import os
import time
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.resource import ResourceManagementClient
import azure.mgmt.resource
#from azure.mgmt.resource.resources.v2020_06_01.models import ResourcesMoveInfo
from azure.identity import ClientSecretCredential
from cred_wrapper import CredentialWrapper
class Move():
def __init__(self):
self.nonprod_subscription_id = "abc"
self.prod_subscription_id = "def"
self.credential = ClientSecretCredential(
client_id= os.environ["ARM_CLIENT_ID"],
client_secret= os.environ["ARM_CLIENT_SECRET"],
tenant_id= os.environ["ARM_TENANT_ID"]
)
#resource client for nonprod
self.sp = CredentialWrapper(self.credential)
self.resource_client = ResourceManagementClient(self.sp,self.nonprod_subscription_id)
self.resource_group = "imgs-rg"
def getresourceids(self):
resource_ids = list(resource.id for resource in self.resource_client.resources.list_by_resource_group("{0}".format(self.resource_group)) if resource.id.find("latest")>=0)
return resource_ids
def getresourcenames(self):
resource_names = list(resource.name for resource in self.resource_client.resources.list_by_resource_group("{0}".format(self.resource_group)) if resource.id.find("latest")>=0)
return resource_names
def deleteoldimages(self,name):
#resource client id for prod
rc = ResourceManagementClient(self.sp,self.prod_subscription_id)
for resource in rc.resources.list_by_resource_group("{0}".format(self.resource_group)):
if resource.name == name:
#2019-12-01 is the latest api_version supported for deleting the resource type "image"
rc.resources.begin_delete_by_id(resource.id,"2020-06-01")
print("deleted {0}".format(resource.name))
def moveimages(self):
rnames = self.getresourcenames()
for rname in rnames:
print(rname)
#self.deleteoldimages(rname)
time.sleep(10)
rids = self.getresourceids()
rid = list(rids[0:])
print(rid)
metadata = azure.mgmt.resource.resources.v2020_06_01.models.ResourcesMoveInfo(resources=rid,target_resource_group='/subscriptions/{0}/resourceGroups/{1}'.format(self.prod_subscription_id,self.resource_group))
#moving resources in the rid from nonprod subscription to prod subscription under the resource group avrc-imgs-rg
if rid != []:
print("moving {0}".format(rid))
print(self.resource_client.resources.move_resources(source_resource_group_name="{0}".format(self.resource_group),parameters=metadata))
#self.resource_client.resources.move_resources(source_resource_group_name="{0}".format(self.resource_group),resources=rid,target_resource_group='/subscriptions/{0}/resourceGroups/{1}'.format(self.prod_subscription_id,self.resource_group))
#self.resource_client.resources.begin_move_resources(source_resource_group_name="{0}".format(self.resource_group),parameters=metadata)
if __name__ == "__main__":
Move().moveimages()
From your inputs we can see that the code looks fine. From your error messages, the problem is with importing the modules.
When you install a package, some submodules ship with it and some do not. Which ones are included depends on the version of the package; to understand which modules a specific version contains, check its release notes in the official documentation.
In your case it looks like some resource submodules are missing. If you look at the full error trace, there will be a path pointing into site-packages on your machine. Find that package and its subfolders (modules) and compare them with the Azure SDK for Python under the Resource module, which you can find here.
In such a situation you need to explicitly add those submodules under your package. In your case you can simply download the packaged code from the Git link above and merge it into your local folder.
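To see which API-version submodules your installed azure-mgmt-resource actually ships (and therefore which ones you can reference in code, such as v2020_06_01), you can list them at runtime. A small sketch:
# Sketch: list the versioned submodules bundled with the installed azure-mgmt-resource
import pkgutil
import azure.mgmt.resource.resources as resources

# prints names such as 'v2019_07_01' (the exact list depends on your installed version)
print([module.name for module in pkgutil.iter_modules(resources.__path__)])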
I had a def hello() function in my home/file.py file. I created a home/common/utils.py file and moved the function there.
Now, I want to import it in file.py.
I imported it like this: from utils import hello and also as from common.utils import hello, and the import in my file doesn't throw an error. However, when I run it on AWS Lambda, I get this error:
Runtime.ImportModuleError: Unable to import module 'file': No module named 'utils'
How can I fix this without having to use EC2 or something?
data "archive_file" "file_zip" {
type = "zip"
source_file = "${path.module}/src/file.py"
output_file_mode = "0666"
output_path = "${path.module}/bin/file.zip"
}
The deployment package that you're uploading only contains your main Python script (file.py). Specifically, it does not include any dependencies such as common/utils.py. That's why the import fails when the code runs in Lambda.
Modify the creation of your deployment package (file.zip) so that it includes all needed dependencies.
For example:
data "archive_file" "file_zip" {
type = "zip"
output_file_mode = "0666"
output_path = "${path.module}/bin/file.zip"
source {
content = file("${path.module}/src/file.py")
filename = "file.py"
}
source {
content = file("${path.module}/src/common/utils.py")
filename = "common/utils.py"
}
}
If all of your files happen to be in a single folder then you can use source_dir instead of indicating the individual files.
Note: I don't use Terraform so the file(...) with embedded interpolation may not be 100% correct, but you get the idea.
First of all, follow this standard guide: https://docs.aws.amazon.com/lambda/latest/dg/python-package.html (refer to the section titled "Deployment package with dependencies").
Now, if you notice, at the end of that section there is:
zip -g my-deployment-package.zip lambda_function.py
Run the same command for your utils file:
zip -g my-deployment-package.zip common/
zip -g my-deployment-package.zip common/utils.py
Ensure that, in lambda_function.py, you are using the proper import statement, like:
from common.utils import util_function_name
Now, you can upload this zip and test for yourself. It should run.
Hope this helps.
I am trying to deploy a Python Lambda function with the Serverless Framework. This function needs to run for 15 min (the AWS Lambda timeout). I want to simulate 100 IoT devices using AWS Lambda.
I have the following code in device_status.py:
import os
import time
from uptime import uptime
import requests
from random import randrange
from configparser import ConfigParser, ExtendedInterpolation
class DeviceStatus:
def __init__(self):
self.config_file = 'config.ini'
self.config_dict = None
self.read_device_config()
self.dr_ins = DeviceRegistration(self.config_dict)
....
if __name__ == '__main__':
init_ds = DeviceStatus()
status_interval = init_ds.config_dict['status']['interval']
while True:
init_ds.send_device_status()
time.sleep(int(status_interval))
and serverless.yml
service: lambda-device
plugins:
- serverless-python-requirements
provider:
name: aws
runtime: python3.6
region: us-east-1
functions:
lambda-device:
handler: main.device_status
When I try to invoke it, I get "errorMessage": "Unable to import module 'main'".
How do I refer to the main function in serverless.yml?
The error message that you are receiving is saying that there is no main.py file in your serverless structure.
Referring to your serverless.yml:
functions:
lambda-device:
handler: main.device_status
This section declares a serverless function named lambda-device whose handler is expected to live in a file named main.py, which must define a method:
def device_status(event, context):
# TODO
pass
So make sure you have a main.py file with a method device_status(event, context).
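Alternatively, if you would rather keep the logic in device_status.py, you can expose a handler there and point serverless.yml at it with handler: device_status.handler. A rough sketch, reusing the DeviceStatus class from the question (the return value is just an illustration):
# device_status.py - sketch of an entry point matching handler: device_status.handler
# DeviceStatus and send_device_status come from the question's code
def handler(event, context):
    ds = DeviceStatus()
    ds.send_device_status()
    return {"status": "sent"}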
My directory looks like this :
- HttpExample:
- __init__.py
- DBConnection.py
- getLatlong.py
I want to import DBConnection and getLatlong in __init__.py. There is no error shown in my __init__.py until I run it; then I receive:
System.Private.CoreLib: Exception while executing function: Functions.HttpExample. System.Private.CoreLib: Result: Failure
Exception: ModuleNotFoundError: No module named 'getLatlong'
I'm trying to use a function in getLatlong to work with the information input by the user, passed from __init__.py to getLatlong. Below is the code:
__init__.py :
from getLatlong import result
from DBConnection import blobService, container_name, account_key, file_path
def main(req: func.HttpRequest) -> func.HttpResponse:
logging.info('Python HTTP trigger function processed a request.')
section = req.params.get('section')
bound = req.params.get('bound')
km_location = req.params.get('km_location')
location = req.params.get('location')
if not section:
try:
req_body = req.get_json()
except ValueError:
pass
else:
section = req_body.get('section')
if section and bound and km_location:
result(section, km_location, bound, location).getResult() #HERE
return func.HttpResponse(f"Hello {section}, {bound}!")
#elif section is None or bound is None or km_location is None:
# return func.HttpResponse("All values are mandatory!")
I am also receiving an import error in getLatlong when importing DBConnection into that class. The following values will be passed to getLatlong.py. The code:
from DBConnection import blobService, container_name, account_key, file_path #Another import error here says : Unable to import DBConnection
class result:
def __init__(self, section, bound, km_location, location):
self.section = section
self.bound = bound
self.km_location = km_location
self.location = location
def getResult(self):
print(self.section)
print(self.bound)
print(self.km_location)
print(self.location)
I've tried every way to import these files before I lost my mind..
You get these errors because Python does not know where to look for the files you want to import. Depending on which Python version you are using, I see three ways to solve this:
You could add HttpExample to your PYTHONPATH and then your imports should work as you have them currently.
Another way would be to use the sys module and append the path to HttpExample, e.g.
import sys
sys.path.append('PATH/TO/HttpExample')
But you would have to do this in all files, where you want to import something from the parent folder.
Or you use relative imports, which have been available since Python 2.5 (see PEP 328). Those only work inside packages, but since you have an __init__.py file, that should be fine. For relative imports you use dots . to tell Python where to look for the import. One dot . tells Python to look in the package (folder) containing the current module. You could also use two dots .. to go up to the parent package, but one level should be enough in your case.
So in your case changing your code to this, should solve your problem.
In __init__.py:
from .getLatlong import result
from .DBConnection import blobService, container_name, account_key, file_path
In getLatlong.py:
from .DBConnection import blobService, container_name, account_key, file_path
You could try from __app__.HttpExample import getLatlong.
There is a document about how to import modules from a shared code folder. Check this doc: Folder structure.
It says shared code should be kept in a separate folder in __app__. In my test this worked for me.
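For reference, a sketch of what that layout and the imports could look like (SharedCode is a hypothetical folder name; adjust it to your project):
# Sketch of a function app layout with a shared code folder:
#
#   FunctionApp/
#     HttpExample/
#       __init__.py
#     SharedCode/          (hypothetical folder holding the shared modules)
#       DBConnection.py
#       getLatlong.py
#
# Then, in HttpExample/__init__.py:
from __app__.SharedCode.getLatlong import result
from __app__.SharedCode.DBConnection import blobService, container_name, account_key, file_path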