Connect to MySQL RDS using a Python Lambda function

I am very new to Python and am running some small tests to verify functionality.
Currently I am trying to establish a connection between an RDS MySQL instance and a Python Lambda function.
However, it seems to fail in the code itself and I am not sure why.
There are a couple of guides out there, but they all seem to be outdated and do not work for me.
These are the steps I took to get it working (using macOS 12.3.1 and VS Studio 1.62.3):
created a MySQL RDS instance
connected to the MySQL RDS instance and created a database called "igor" with a table named "lumigor", with 2 columns: id and name (populated with random data).
created a folder on the local machine to contain the code and the package.
installed Python 3.8.9
created the Lambda function file app.py with the following code:
```
import pymysql.cursors

# Connect to the database
connection = pymysql.connect(host='rds end point',
                             user='user',
                             password='pswrd',
                             database='igor',
                             cursorclass=pymysql.cursors.DictCursor)

with connection:
    with connection.cursor() as cursor:
        # Read a single record
        sql = "SELECT * FROM `Lumigor`"
        cursor.execute(sql)
        result = cursor.fetchone()
        print(result)
```
I added a requirements.txt file with the following command:
python3 -m pip install PyMySQL && pip3 freeze > requirements.txt --target...
But now I get an error from Visual Studio:
Import "pymysql.cursors" could not be resolved from source (Pylance)
When I zip the file, upload it to Lambda, and run a test, it returns an error:
{
  "errorMessage": "Unable to import module 'app': No module named 'pymysql.cursors'",
  "errorType": "Runtime.ImportModuleError",
  "stackTrace": []
}
It seems like the dependencies are missing even though I installed them and they exist in the directory.

The proper way to add pymysql to your Lambda is to create a dedicated layer, as described in the AWS blog:
How do I create a Lambda layer using a simulated Lambda environment with Docker?
Create an empty folder, e.g. mylayer.
Go to the folder and create a requirements.txt file with the following content:
PyMySQL
Run the following docker command (a newer image is used in place of lambci/docker-lambda for Python 3.9):
docker run --rm --volume "$PWD:/var/task" --workdir /var/task senorcoder/aws-lambda-env:python3.9_build pip install -Ur requirements.txt --target python
Create layer as zip:
zip -r mypymysqllayer.zip python > /dev/null
Create a Lambda layer from mypymysqllayer.zip in the AWS Console. Don't forget to set Compatible runtimes to python3.9.
Add the layer to your function.
Alternatively, package your function as a Lambda container image.
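With the layer attached, a minimal handler along the lines of the question's code might look like this. This is only a sketch: the endpoint, credentials, and table name are the placeholders from the question, and opening the connection outside the handler lets Lambda reuse it across warm invocations:
```
import pymysql.cursors

# Placeholder connection details from the question; in practice read these
# from environment variables or AWS Secrets Manager rather than hard-coding.
connection = pymysql.connect(host='rds end point',
                             user='user',
                             password='pswrd',
                             database='igor',
                             cursorclass=pymysql.cursors.DictCursor)

def lambda_handler(event, context):
    # Read a single record from the table created in the question
    with connection.cursor() as cursor:
        cursor.execute("SELECT * FROM `lumigor`")
        result = cursor.fetchone()
    print(result)
    return result
```
Remember that the Lambda function also needs network access to the RDS instance (same VPC/subnets and a security group that allows MySQL traffic); otherwise the import will succeed but the connection will time out.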

Related

How to properly connect to SQL Server from a python script when python packages are based on github?

Suppose that, due to an HTTP 403 error, it is not possible to download packages from the PyPI repo (so pip install <package> commands fail), which forces me to install pyodbc by cloning the repo from GitHub (https://github.com/mkleehammer/pyodbc) and running the following Windows .cmd file:
cd "root_folder"
git activate
git clone https://github.com/mkleehammer/pyodbc.git --depth 1
Note that this package is downloaded to the same root folder where my Python script is. After this I try to set up a connection to Microsoft SQL Server:
import pyodbc as pyodbc
# set connection settings
server="servername"
database="DB1"
user="user1"
password="123"
# establishing connection to db
conn = pyodbc.connect("DRIVER={SQL Server};SERVER="+server+";DATABASE="+database+";UID="+user+";PWD="+password)
cursor=conn.cursor()
print("Succesful connection to sql server")
However, when I run the above code, the following traceback error arises:
Traceback (most recent call last):
File "/dcleaner.py", line 47, in
conn = pyodbc.connect("DRIVER={SQL Server};SERVER="+server+";DATABASE="+database+";UID="+user+";PWD="+password)
AttributeError: module 'pyodbc' has no attribute 'connect'
Do you know how I can properly connect from a Python script to a SQL Server database?
After you have cloned pyodbc:
cd "root_folder"
git activate
git clone https://github.com/mkleehammer/pyodbc.git --depth 1
On your local machine, go into the cloned directory, open a terminal, and run the command below:
python setup.py build
If it errors, try installing an appropriate C++ compiler; the error message usually reveals this detail (in VS Code it gave a URL to open and download the compiler). Install it from that link.
Reboot the machine and run this again:
python setup.py build    # if it succeeds, continue with the command below
python setup.py install
After that you should be able to import and run the following from your local machine:
import pyodbc as pyodbc
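As a quick sanity check that the locally built pyodbc is the one being picked up, something like the following can be run. The driver name below is an assumption; use whichever SQL Server ODBC driver is actually installed on the machine (pyodbc.drivers() lists them):
```
import pyodbc

print(pyodbc.version)    # version string of the freshly built module
print(pyodbc.drivers())  # ODBC drivers visible to pyodbc on this machine

# Connection string mirrors the question; "ODBC Driver 17 for SQL Server"
# is only an example of a driver name that may be installed.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=servername;DATABASE=DB1;UID=user1;PWD=123"
)
cursor = conn.cursor()
print("Successful connection to sql server")
conn.close()
```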

In a postgres container, where are the imported extensions located?

I would like to create a volume on that directory and bring in the plpython and postgis extensions.
For some reason I am unable to create extensions from within a container.
I have tried running the postgres container using the local version and just connecting to it, since the local one has the extensions... but to no avail: \dx shows nothing.
I know that in /usr/share/postgresql/14/extension I can find plpython3u.control,
which has the following:
# plpython3u extension
comment = 'PL/Python3U untrusted procedural language'
default_version = '1.0'
module_pathname = '$libdir/plpython3'
relocatable = false
schema = pg_catalog
superuser = true
But I can't find what it's referring to...
My error, after I went inside the container and created that file:
CREATE EXTENSION plpython3u;
FATAL: extension "plpython3u" has no installation script nor update path for version "1.0"
server closed the connection unexpectedly
This probably means the server terminated abnormally
before or while processing the request.
The connection to the server was lost. Attempting reset: Succeeded.
Postgres extensions are implemented as shared library modules (.so files), generally located in /usr/lib/postgresql/14/lib. So for example, the control file for the plpgsql extension (/usr/share/postgresql/14/extension/plpgsql.control) looks like this:
# plpgsql extension
comment = 'PL/pgSQL procedural language'
default_version = '1.0'
module_pathname = '$libdir/plpgsql'
relocatable = false
schema = pg_catalog
superuser = true
trusted = true
In the module_pathname value, $libdir refers to /usr/lib/postgresql/14/lib, where we find /usr/lib/postgresql/14/lib/plpgsql.so.
If you want to enable a new extension, like plpython3u, you need both the control and sql files as well as the shared library (which needs to be built for the Postgres version that you're running).
Fortunately, it looks like the plpython extension is already packaged and installable using the stock postgres image. The following Dockerfile will produce an image that has the plpython extension installed:
FROM postgres:14

RUN apt-get update && \
    apt-get -y install postgresql-plpython3-14 && \
    apt-get clean all
If we build an image from that Dockerfile:
docker build -t postgres-plpython .
And then start a container:
docker run -d --name postgres -e POSTGRES_PASSWORD=secret postgres-plpython
We can docker exec into the container and add the extension to a
database:
$ docker exec -it postgres psql -U postgres
psql (14.3 (Debian 14.3-1.pgdg110+1))
Type "help" for help.
postgres=# create extension plpython3u;
CREATE EXTENSION
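Once the extension is created, a small plpython3u function can be defined and called to confirm it actually executes Python. Below is a sketch using psycopg2 from the host, which assumes the container's port 5432 has been published (e.g. docker run -p 5432:5432 ...); the same two statements can just as well be run directly in psql:
```
import psycopg2

# Connection details match the container started above; adjust as needed.
conn = psycopg2.connect(host="localhost", user="postgres",
                        password="secret", dbname="postgres")
conn.autocommit = True

with conn.cursor() as cur:
    cur.execute("CREATE EXTENSION IF NOT EXISTS plpython3u;")
    # A trivial function whose body is Python, executed by the extension.
    cur.execute("""
        CREATE OR REPLACE FUNCTION pymax(a integer, b integer)
        RETURNS integer AS $$
        return max(a, b)
        $$ LANGUAGE plpython3u;
    """)
    cur.execute("SELECT pymax(2, 3);")
    print(cur.fetchone()[0])  # prints 3

conn.close()
```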

Chat history between a chatbot and a user (Microsoft Bot Framework) using Python

I want to save the chat conversation in a Cosmos DB database in the Azure portal.
I have created a Cosmos DB account and created a new database and container, after which I followed the steps in the official Microsoft docs. But whenever I test the bot and run it in the Emulator, nothing from the conversation is added to the items, unlike what is shown in the docs. So, is there anything else I should do to save that conversation?
Note: the project is using Python.
These are the steps that I did in this project to save the conversation:
create a Cosmos DB account
add a database
add the Cosmos DB configuration information to the config file
install the Cosmos package using pip
add the Cosmos DB implementation like the code shown below
run the bot locally
interact with the bot
then the conversation should be displayed in the DB, but nothing is there
This code is in the bot file:
```
def __init__(self, config: DefaultCon):
    cosmos_config = CosmosDbPartitionedConfig(
        DBendpoint=config.COSMOS_DB_URI,
        PrimaryKey=config.COSMOS_DB_PRIMARY_KEY,
        DataBaseID=config.COSMOS_DB_DATABASE_ID,
        ContainerID=config.COSMOS_DB_CONTAINER_ID,
        compatibility_mode=False
    )
    self.storage = CosmosDbPartitionedStorage(cosmos_config)
```
This code is in the config file, but with the actual values filled in:
COSMOS_DB_URI="<your-CosmosDb-URI>"
COSMOS_DB_PRIMARY_KEY="your-primary-key"
COSMOS_DB_DATABASE_ID="<your-database-id>"
COSMOS_DB_CONTAINER_ID="bot-storage"
Kindly check the following things if possible:
Cosmos DB configuration information:
COSMOS_DB_URI="<your-CosmosDb-URI>"
COSMOS_DB_PRIMARY_KEY="your-primary-key"
COSMOS_DB_DATABASE_ID="<your-database-id>"
COSMOS_DB_CONTAINER_ID="bot-storage"
Proper installation of the Cosmos packages (it is suggested to use a virtual environment to install the packages):
pip install botbuilder-azure
from botbuilder.azure import CosmosDbPartitionedStorage, CosmosDbPartitionedConfig
from config import DefaultConfig

CONFIG = DefaultConfig()

def __init__(self):
    cosmos_config = CosmosDbPartitionedConfig(
        cosmos_db_endpoint=CONFIG.COSMOS_DB_URI,
        auth_key=CONFIG.COSMOS_DB_PRIMARY_KEY,
        database_id=CONFIG.COSMOS_DB_DATABASE_ID,
        container_id=CONFIG.COSMOS_DB_CONTAINER_ID,
        compatibility_mode=False
    )
    self.storage = CosmosDbPartitionedStorage(cosmos_config)
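One more thing worth checking: the storage object only receives writes when it is wired into the bot's state objects and the state is saved at the end of each turn. A rough sketch of that wiring (the function names here are illustrative, not from the question's project):
```
from botbuilder.azure import CosmosDbPartitionedStorage, CosmosDbPartitionedConfig
from botbuilder.core import ConversationState, UserState, TurnContext

def build_state(config):
    # Same storage construction as above, reused for both state objects.
    storage = CosmosDbPartitionedStorage(CosmosDbPartitionedConfig(
        cosmos_db_endpoint=config.COSMOS_DB_URI,
        auth_key=config.COSMOS_DB_PRIMARY_KEY,
        database_id=config.COSMOS_DB_DATABASE_ID,
        container_id=config.COSMOS_DB_CONTAINER_ID,
        compatibility_mode=False,
    ))
    return ConversationState(storage), UserState(storage)

async def save_state(conversation_state: ConversationState,
                     user_state: UserState,
                     turn_context: TurnContext):
    # Nothing reaches Cosmos DB until save_changes is called for the turn.
    await conversation_state.save_changes(turn_context)
    await user_state.save_changes(turn_context)
```
If save_changes is never called at the end of the turn, the container in Cosmos DB will stay empty even though the configuration is correct.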
Note: creating a virtual environment in Python:
$ pip install virtualenv
$ virtualenv --version
$ virtualenv my_name                       # give a suitable name for the virtual environment
$ virtualenv -p /usr/bin/python3 virtualenv_name
$ source virtualenv_name/bin/activate      # activation
(virtualenv_name)$ deactivate              # deactivation

pysftp library not working in AWS lambda layer

I want to upload files to an EC2 instance using the pysftp library (Python script). So I have created a small Python script which uses the lines below to connect:
pysftp.Connection(
    host=Constants.MY_HOST_NAME,
    username=Constants.MY_EC2_INSTANCE_USERNAME,
    private_key="./mypemfilelocation.pem",
)
# some code here .....
pysftp.put(file_to_be_upload, ec2_remote_file_path)
This script uploads files from my local Windows machine to the EC2 instance using a .pem file, and it works correctly.
Now I want to do this action using AWS Lambda with API Gateway functionality.
So I have uploaded the Python script to AWS Lambda. I was not sure how to use the pysftp library in AWS Lambda, so I found a solution: add the pysftp library as an AWS Lambda layer. I did it with
pip3 install pysftp -t ./library_folder
Then I made a zip of the above folder and added it as an AWS Lambda layer.
But I still got many errors, one after another:
No module named 'pysftp'
No module named 'paramiko'
Undefined Symbol: PyInt_FromLong
cannot import name '_bcrypt' from partially initialized module 'bcrypt' (most likely due to a circular import)
cffi module not found
I got fed up with the above errors and didn't find a proper solution. How can I use the pysftp library in my AWS Lambda seamlessly?
I built a pysftp layer and tested it on my Lambda with Python 3.8, just to check the import and a basic print:
import json
import pysftp

def lambda_handler(event, context):
    # TODO implement
    print(dir(pysftp))
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
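Once the layer resolves the imports, the handler can be extended toward the question's actual upload. The sketch below uses placeholder host, user, key, and file paths; the private key would typically be shipped in the deployment package or fetched from S3/Secrets Manager, since Lambda can only write under /tmp:
```
import json
import pysftp

def lambda_handler(event, context):
    # Skip host-key verification for brevity; in production load a known_hosts
    # file into cnopts.hostkeys instead of disabling the check.
    cnopts = pysftp.CnOpts()
    cnopts.hostkeys = None

    with pysftp.Connection(host="ec2-hostname",
                           username="ec2-user",
                           private_key="/tmp/mypemfile.pem",
                           cnopts=cnopts) as sftp:
        # Upload a local file to the remote path on the instance.
        sftp.put("/tmp/file_to_upload.txt", "/home/ec2-user/file_to_upload.txt")

    return {
        'statusCode': 200,
        'body': json.dumps('Upload complete')
    }
```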
I used the following docker tool to build the pysftp layer:
https://github.com/lambci/docker-lambda
So what I did for pysftp was:
# create pysftp fresh python 3.8 environment
python -m venv pysftp
# activate it
source pysftp/bin/activate
cd pysftp
# install pysftp in the environment
pip3 install pysftp
# generate requirements.txt
pip freeze > requirements.txt
# use docker to construct the layer
docker run --rm -v `pwd`:/var/task:z lambci/lambda:build-python3.8 python3.8 -m pip --isolated install -t ./mylayer -r requirements.txt
zip -r pysftp-layer.zip .
The rest is uploading the zip to S3, creating a new layer in the AWS console, setting the Compatible runtime to Python 3.8, and using it in my test Lambda function.
You can also check here how to use this docker tool (the docker command I used is based on what is in that link).
Hope this helps

How to connect to Azure MySQL from Azure Functions by Python

I am trying to:
Run Python code triggered by Cosmos DB when Cosmos DB receives data.
The Python code in Azure Functions has code to ingest data from Azure MySQL.
What I have done so far:
Wrote Python in Azure Functions and ran it, triggered by Cosmos DB. This was successful.
Installed mysql.connector, referring to https://prmadi.com/running-python-code-on-azure-functions-app/, and ran the code to connect to Azure MySQL, but it does not work.
Do you know how to install a MySQL module for Python in Azure Functions and connect to the database?
Thanks!
According to your description, I think your issue is about how to install a third-party Python module in the Azure Function App.
Please refer to the steps as below :
Step 1 :
Log in to Kudu: https://Your_APP_NAME.scm.azurewebsites.net/DebugConsole
Run the command below in the d:/home/site/wwwroot/<your function name> folder (it will take some time):
python -m virtualenv myvenv
Step 2 :
Load the env via the command below in the env/Scripts folder:
activate.bat
Step 3 :
Your shell should now be prefixed by (env).
Update pip
python -m pip install -U pip
Install what you need
python -m pip install MySQLdb
Step 4 :
In your code, update the sys.path to add this venv:
import sys, os.path
sys.path.append(os.path.abspath(os.path.join(os.path.dirname( __file__ ), 'env/Lib/site-packages')))
Then connect to the MySQL database via the snippet of code below:
#!/usr/bin/python
import MySQLdb

# Connect
db = MySQLdb.connect(host="localhost",
                     user="appuser",
                     passwd="",
                     db="onco")

cursor = db.cursor()

# Execute SQL select statement
cursor.execute("SELECT * FROM location")

# Commit your changes if writing
# In this case, we are only reading data
# db.commit()

# Get the number of rows in the resultset
numrows = cursor.rowcount

# Get and display one row at a time
for x in range(0, numrows):
    row = cursor.fetchone()
    print(row[0], "-->", row[1])

# Close the connection
db.close()
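Since the question mentions installing mysql.connector, a roughly equivalent sketch with mysql-connector-python is below. The host, credentials, database, and SSL certificate path are placeholders; Azure Database for MySQL typically expects the user in the user@servername form and enforces SSL by default:
```
import mysql.connector

# Placeholder connection details for an Azure Database for MySQL server.
db = mysql.connector.connect(
    host="yourserver.mysql.database.azure.com",
    user="appuser@yourserver",
    password="your-password",
    database="onco",
    ssl_ca="/path/to/ca-certificate.crt.pem"  # CA bundle, needed when SSL is enforced
)

cursor = db.cursor()
cursor.execute("SELECT * FROM location")

for row in cursor.fetchall():
    print(row[0], "-->", row[1])

cursor.close()
db.close()
```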
Hope it helps you.
