I am trying to run Python code inline in an AWS Lambda function.
I am not zipping any file, just pasting the code below into the Lambda function.
And I am getting this error:
"errorMessage": "Unable to import module 'UpdateHost_Python'"
import psycopg2

def lambda_handler(event, context):
    conn_string = "dbname='myfirstdb' port='5432' user='db28' password='######' host='#####.ck0zbnniqteb.us-east-2.rds.amazonaws.com'"
    conn = psycopg2.connect(conn_string)
    cursor = conn.cursor()
    cursor.execute("select * from unnmesh")
    conn.commit()
    cursor.close()
    print("working")
For non-standard Python libraries (like psycopg2), you will need to create a Deployment Package.
This involves creating a Zip file with the libraries, then uploading the Zip file to Lambda.
See: AWS Lambda Deployment Package in Python - AWS Lambda
For a worked-through example, see also: Tutorial: Using AWS Lambda with Amazon S3 - AWS Lambda (I know you're not using Amazon S3, but the tutorial gives an example of building a package with dependencies.)
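As a rough sketch, the package for psycopg2 can be built with commands like the following (folder and file names are placeholders based on the module name in your error, not from the question; note that psycopg2 contains C extensions, so it must be built for Amazon Linux, e.g. by installing a precompiled build inside a matching Linux environment):
pip install psycopg2-binary -t package/
cp UpdateHost_Python.py package/
cd package && zip -r ../deployment.zip .
Then upload deployment.zip to the Lambda function, with the handler set to UpdateHost_Python.lambda_handler.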
I am using the below Azure function, which takes an HTTP trigger as input.
import logging
import azure.functions as func
from . import test
def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    name = req.params.get('name')
    strng = req.params.get('strng')
    if not name:
        try:
            req_body = req.get_json()
        except ValueError:
            pass
        else:
            name = req_body.get('name')
    if name:
        return func.HttpResponse(f"Hello, {name}. This HTTP triggered function executed successfully.sum = {test.testfunc(strng)}")
    else:
        return func.HttpResponse(
            "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.",
            status_code=200
        )
Below is the test.py file, which I imported in the __init__.py.
import json
import pandas as pd
from pandas import DataFrame
from azure.storage.blob import AppendBlobService
from datetime import datetime
def testfunc(strng):
    # return strng
    API = json.loads(strng)
    test = pd.json_normalize(API, record_path='Vol', meta=['studyDate'])
    df = pd.DataFrame(test)
    df["x"] = df["Vol"] * 2
    df["y"] = df["Vol"] * 50
    df1 = df[['Date', 'x', 'y']]
    df2 = df1.to_json(orient='records')
    append_blob_service = AppendBlobService(account_name='actname',
                                            account_key='key')
    date = datetime.now()
    blobname = f"test_cal{date}.json"
    append_blob_service.create_blob('container', blobname, if_none_match="*")
    append_blob_service.append_blob_from_text('container', blobname, text=df2)
    return df2
The above function works when I run it in PyCharm and Databricks. But when I run the Azure function via Visual Studio Code, I get the below error.
Exception: ImportError: cannot import name 'AppendBlobService' from 'azure.storage.blob'
I have tried below and still get the same error.
pip install azure-storage --upgrade
pip install azure-storage-blob
Kindly provide assistance with the error.
Is there any other way I can save the df2 variable to Azure storage?
Thank you.
According to the documentation:
If the module (for example, azure-storage-blob) cannot be found, Python throws the ModuleNotFoundError. If it is found, there may be an issue with loading the module or some of its files. Python would throw an ImportError in those cases.
Try setting the FUNCTIONS_WORKER_RUNTIME environment variable to python, or
try adding azure.core to your requirements.txt file.
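For reference, a requirements.txt for this function might look roughly like the following (the package list is inferred from the imports in the question, not taken from the original post):
azure-functions
azure-core
pandas
azure-storage-blob==2.1.0  # older release that still exposes AppendBlobService (see the answer below)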
References taken from:
ImportError: cannot import name 'BlockBlobService' from 'azure.storage.blob'
Azure Functions fails running when deployed
The current version of the library contains BlobServiceClient instead of the old BlockBlobService/AppendBlobService classes. Pinning version 2.1.0 solved it for me:
pip install azure-storage-blob==2.1.0
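Alternatively, if you stay on the current azure-storage-blob (v12+), the equivalent append can be done with BlobServiceClient. A minimal sketch, assuming a placeholder connection string and container name:
from datetime import datetime
from azure.storage.blob import BlobServiceClient

conn_str = "DefaultEndpointsProtocol=https;AccountName=actname;AccountKey=key;EndpointSuffix=core.windows.net"  # placeholder
service = BlobServiceClient.from_connection_string(conn_str)

blobname = f"test_cal{datetime.now()}.json"
blob_client = service.get_blob_client(container="container", blob=blobname)
blob_client.create_append_blob()   # create the append blob first
blob_client.append_block(df2)      # df2 is the JSON string produced in testfunc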
I'm using Google's Oozie-to-Airflow converter (o2a) to convert some Oozie workflows that are running on AWS EMR. I managed to get a first version, but when I try to upload the DAG, Airflow throws an error:
Broken DAG: No module named 'o2a'
I have tried to deploy the PyPI package o2a, both using the command
gcloud composer environments update composer-name --update-pypi-packages-from-file requirements.txt --location location
and from the Google Cloud console. Both failed.
requirements.txt
o2a==1.0.1
Here is the code
from airflow import models
from airflow.operators.subdag_operator import SubDagOperator
from airflow.utils import dates
from o2a.o2a_libs import functions
from airflow.models import Variable
import subdag_validation
import subdag_generate_reports
CONFIG = {}
JOB_PROPS = {
}
dag_config = Variable.get("coordinator", deserialize_json=True)
cdrPeriod = dag_config["cdrPeriod"]
TASK_MAP = {"validation": ["validation"], "generate_reports": ["generate_reports"] }
TEMPLATE_ENV = {**CONFIG, **JOB_PROPS, "functions": functions, "task_map": TASK_MAP}
with models.DAG(
    "workflow_coordinator",
    schedule_interval=None,  # Change to suit your needs
    start_date=dates.days_ago(0),  # Change to suit your needs
    user_defined_macros=TEMPLATE_ENV,
) as dag:
    validation = SubDagOperator(
        task_id="validation",
        trigger_rule="one_success",
        subdag=subdag_validation.sub_dag(dag.dag_id, "validation", dag.start_date, dag.schedule_interval),
    )
    generate_reports = SubDagOperator(
        task_id="generate_reports",
        trigger_rule="one_success",
        subdag=subdag_generate_reports.sub_dag(dag.dag_id, "generate_reports", dag.start_date, dag.schedule_interval,
                                               {
                                                   "cdrPeriod": "{{cdrPeriod}}"
                                               }),
    )
    validation.set_downstream(generate_reports)
There is a section in the o2a docs that covers how to deploy o2a:
https://github.com/GoogleCloudPlatform/oozie-to-airflow#the-o2a-libraries
It initially failed because of another dependency: lark-parser.
Just installing it using the PyPI package manager for Composer did the trick.
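For example, a requirements.txt along these lines should work (the lark-parser entry reflects the extra dependency mentioned above; match whatever version o2a asks for):
o2a==1.0.1
lark-parser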
I am trying to run a simple test code to connect to a MySQL db using SQLAlchemy.
The code is as follows:
from sqlalchemy import (create_engine, Table, Column, Integer, String, MetaData)
import settings
import sys
try:
    db = create_engine('mysql://daniel:dani@localhost/test')
    db.connect()
except:
    print('opps ', sys.exc_info()[1])
I get the following error:
dlopen(//anaconda/lib/python3.5/site-packages/_mysql.cpython-35m-darwin.so, 2): Library not loaded: libssl.1.0.0.dylib
Referenced from: //anaconda/lib/python3.5/site-packages/_mysql.cpython-35m-darwin.so
Reason: image not found
[Finished in 1.4s]
But running on terminal:
locate libssl.1.0.0.dylib
I get:
/Applications/Dtella.app/Contents/Frameworks/libssl.1.0.0.dylib
/Applications/XAMPP/xamppfiles/lib/libssl.1.0.0.dylib
/Users/dpereira14/anaconda/envs/dato-env/lib/libssl.1.0.0.dylib
/Users/dpereira14/anaconda/lib/libssl.1.0.0.dylib
/Users/dpereira14/anaconda/pkgs/openssl-1.0.1k-1/lib/libssl.1.0.0.dylib
/anaconda/lib/libssl.1.0.0.dylib
/anaconda/pkgs/openssl-1.0.2g-0/lib/libssl.1.0.0.dylib
/opt/local/lib/libssl.1.0.0.dylib
/usr/local/Cellar/openssl/1.0.1j/lib/libssl.1.0.0.dylib
I have no clue how to fix this error.
Thanks!
I also had some problems with SQLAlchemy and MySQL. I changed localhost in the create_engine to 127.0.0.1:port, and also had to use pymysql. It ended up working with this:
engine = create_engine('mysql+pymysql://user:password@127.0.0.1:port/db')
pymysql is installed via pip.
You are using Python, so you have to use the mysqldb driver with MySQL. Try the code below.
try:
    db = create_engine('mysql+mysqldb://daniel:dani@localhost/test')
    db.connect()
except:
    print('opps ', sys.exc_info()[1])
I am new to AWS Lambda functions and I am trying to add my existing code to AWS Lambda. My existing code looks like:
import boto3
import slack
import slack.chat
import time
import itertools
from slacker import Slacker
ACCESS_KEY = ""
SECRET_KEY = ""
slack.api_token = ""
slack_channel = "#my_test_channel"
def gather_info_ansible():
    .
    .

def call_snapshot_creater(data):
    .
    .

def call_snapshot_destroyer(data):
    .
    .

if __name__ == '__main__':
    print "Calling Ansible Box Gather detail Method first!"
    ansible_box_info = gather_info_ansible()
    print "Now Calling the Destroyer of SNAPSHOT!! BEHOLD THIS IS HELL!!"
    call_snapshot_destroyer(ansible_box_info)
    #mapping = {i[0]: [i[1], i[2]] for i in data}
    print "Now Calling the Snapshot Creater!"
    call_snapshot_creater(ansible_box_info)
Now I try to create a Lambda function from scratch on the AWS Console as follows (a hello world):
from __future__ import print_function
import json
print('Loading function')
def lambda_handler(event, context):
    #print("Received event: " + json.dumps(event, indent=2))
    print("value1 = " + event['key1'])
    print("value2 = " + event['key2'])
    print("value3 = " + event['key3'])
    print("test")
    return event['key1']  # Echo back the first key value
    #raise Exception('Something went wrong')
and the sample test event on AWS console is :
{
  "key3": "value3",
  "key2": "value2",
  "key1": "value1"
}
I am really not sure how to put my code in AWS Lambda, because even if I add the modules in the Lambda console and run it, it throws me the error:
Unable to import module 'lambda_function': No module named slack
How to solve this and import my code in lambda?
You have to make a zipped package consisting of your Python script containing the Lambda handler and all the modules that you are importing in that script. Upload the zipped package to AWS.
Whatever module you want to import, you have to include it in the zip package. Only then will the import statements work.
For example, your zip package should consist of:
test_package.zip
|-test.py (script containing the lambda_handler function)
|-boto3(module folder)
|-slack(module folder)
|-slacker(module folder)
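A rough sketch of commands that produce that layout (paths are placeholders; the slacker package name comes from the imports in the question, and the exact PyPI name of the package providing the slack module depends on which Slack library you use):
pip install slacker -t package/            # plus whichever package provides the 'slack' module
cp test.py package/
cd package && zip -r ../test_package.zip .
Note that boto3 is already available in the Lambda runtime, so strictly only the Slack-related modules need to be vendored into the zip.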
You receive an error because AWS lambda does not have any information about a module called slack.
A module is a set of .py files that are stored somewhere on a computer.
In case of lambda, you should import all your libraries by creating a deployment package.
Here is another question that describes a similar case and provides several solutions:
AWS Lambda questions
I'm trying to access AWS using Boto, and it's not working. I've installed Boto and have the boto.cfg in /etc. Here's my code:
import requests, json
import datetime
import hashlib
import boto
conn = boto.connect_s3()
Here's the error:
Traceback (most recent call last):
File "boto.py", line 4, in <module>
import boto
File "/home/mydir/public_html/boto.py", line 6, in <module>
conn = boto.connect_s3()
AttributeError: 'module' object has no attribute 'connect_s3'
What the hell? This isn't complicated.
It looks like the file you're working on is called boto.py. I think what's happening here is that your file is importing itself--Python looks for modules in the directory containing the file doing the import before it looks on your PYTHONPATH. Try changing the name to something else.
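A minimal illustration of the shadowing (the file name comes from your traceback; the rename target is just an example):
# boto.py  <- this file shadows the installed boto package
import boto                # Python finds THIS file first, not the library
conn = boto.connect_s3()   # AttributeError: 'module' object has no attribute 'connect_s3'
Rename the script to something like s3_check.py (and remove any stale boto.pyc next to it), and the real library will be imported again.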
Use the Connection classes.
e.g.
from boto.s3.connection import S3Connection
from boto.sns.connection import SNSConnection
from boto.ses.connection import SESConnection
def connect_s3(self):
    return S3Connection(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)

def connect_sns(self):
    return SNSConnection(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)

def connect_ses(self):
    return SESConnection(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
@valdogg21
I am following your instructions and put this into my code:
from boto.s3.connection import S3Connection
conn = S3Connection('<aws access key>', '<aws secret key>')
But despite my good intentions, it results in a small error. I just did
sudo pip install boto --upgrade to ensure I have the latest version installed.
This is the error message. Just wondering if I am a lone wolf or if others encounter this issue...
from boto.s3.connection import S3Connection
ImportError: cannot import name S3Connection
You may need to do something similar to how I had to utilize the EC2Connection class in some of my code, which looks like this:
from boto.ec2.connection import EC2Connection
conn = EC2Connection(...)
Also, from their docs (http://boto.s3.amazonaws.com/s3_tut.html):
>>> from boto.s3.connection import S3Connection
>>> conn = S3Connection('<aws access key>', '<aws secret key>')
EDIT: I know that doc page has the shortcut function you're trying to use, but I saw a similar problem when trying to do the same type of shortcut with EC2.
I have tried all of your solutions, but none of them seem to work. I keep going over StackOverflow, as I cannot see anyone else having this rather small issue. The weird thing is that on the server it works like a charm. The issue is on my Mac.
I had this issue and was facing the same error when using boto3 and moto to mock an S3 bucket.
boto3.connect_s3()
I switched my library back to boto and it worked fine. It looks like boto3 has replaced connect_s3() with resource():
boto.connect_s3()      # works
boto3.resource('s3')   # works
I could resolve a similar issue for AWS Lambda too:
boto.connect_awslambda()  # works
boto3.client('lambda')    # works
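For completeness, a small boto3 sketch of both styles (the bucket listing is just an example call):
import boto3

s3 = boto3.resource('s3')   # resource API
for bucket in s3.buckets.all():
    print(bucket.name)

client = boto3.client('s3')  # client API
print([b['Name'] for b in client.list_buckets()['Buckets']])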