How to connect to Azure MySQL from Azure Functions with Python

I am trying to:
Run Python code in Azure Functions, triggered by Cosmos DB when Cosmos DB receives data.
The Python code in Azure Functions also needs to ingest data from Azure MySQL.
What I have done so far:
1. Wrote Python in Azure Functions and ran it triggered by Cosmos DB. This was successful.
2. Installed mysql.connector by following https://prmadi.com/running-python-code-on-azure-functions-app/ and ran the code to connect to Azure MySQL, but it does not work.
Do you know how to install a MySQL module for Python in Azure Functions and connect to the database?
Thanks!

According to your description, I think your issue is about how to install a third-party Python module in the Azure Function app.
Please refer to the steps below:
Step 1:
Log in to Kudu: https://Your_APP_NAME.scm.azurewebsites.net/DebugConsole.
Run the command below in the d:/home/site/wwwroot/<your function name> folder (this will take some time):
python -m virtualenv env
Step 2:
Activate the environment by running the command below in the env/Scripts folder:
activate.bat
Step 3:
Your shell should now be prefixed by (env).
Update pip:
python -m pip install -U pip
Install what you need:
python -m pip install MySQLdb
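Note that the module name and the PyPI package name differ here: assuming the Functions host is running Python 2 (which matches the print syntax used further below), the MySQLdb module is usually provided by the MySQL-python package, so the install command may instead need to be:
python -m pip install MySQL-python
(On Python 3, the equivalent package is mysqlclient.)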
Step 4:
In your code, update sys.path to add this venv:
import sys, os.path
sys.path.append(os.path.abspath(os.path.join(os.path.dirname( __file__ ), 'env/Lib/site-packages')))
Then connect to the MySQL database via the snippet of code below:
#!/usr/bin/python
import MySQLdb

# Connect
db = MySQLdb.connect(host="localhost",
                     user="appuser",
                     passwd="",
                     db="onco")
cursor = db.cursor()

# Execute SQL select statement
cursor.execute("SELECT * FROM location")

# Commit your changes if writing
# In this case, we are only reading data
# db.commit()

# Get the number of rows in the resultset
numrows = cursor.rowcount

# Get and display one row at a time
for x in range(0, numrows):
    row = cursor.fetchone()
    print row[0], "-->", row[1]

# Close the connection
db.close()
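Since the question is about Azure MySQL rather than a local server, note that (assuming the common Azure Database for MySQL single-server setup) the host normally looks like <servername>.mysql.database.azure.com, the user name takes the form <user>@<servername>, and SSL is enforced by default. A minimal sketch with placeholder values, not a verified configuration:
#!/usr/bin/python
import MySQLdb

# Placeholder Azure MySQL values - replace with your own server, credentials and CA file
db = MySQLdb.connect(host="yourserver.mysql.database.azure.com",
                     user="appuser@yourserver",
                     passwd="yourpassword",
                     db="onco",
                     ssl={"ca": "path/to/azure-mysql-ca.pem"})  # CA cert path, only needed if SSL is enforced
cursor = db.cursor()
cursor.execute("SELECT VERSION()")
print cursor.fetchone()[0]
db.close()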
Hope it helps you.

Related

How to properly connect to SQL Server from a Python script when Python packages are installed from GitHub?

Suppose that, due to an HTTP 403 error, it is not possible to download packages from the PyPI repo (nor to run pip install <package> commands), which forces me to install pyodbc by cloning the repo from GitHub (https://github.com/mkleehammer/pyodbc) and running the following .cmd Windows file:
cd "root_folder"
git activate
git clone https://github.com/mkleehammer/pyodbc.git --depth 1
Note that this package is downloaded to the same root folder where my Python script is. After this I try to set up a connection to Microsoft SQL Server:
import pyodbc as pyodbc
# set connection settings
server="servername"
database="DB1"
user="user1"
password="123"
# establishing connection to db
conn = pyodbc.connect("DRIVER={SQL Server};SERVER="+server+";DATABASE="+database+";UID="+user+";PWD="+password)
cursor=conn.cursor()
print("Succesful connection to sql server")
However, when I run the above code, the following traceback error arises:
Traceback (most recent call last):
File "/dcleaner.py", line 47, in
conn = pyodbc.connect("DRIVER={SQL Server};SERVER="+server+";DATABASE="+database+";UID="+user+";PWD="+password)
AttributeError: module 'pyodbc' has no attribute 'connect'
Do you know how I can properly connect from a Python script to a SQL Server database?
After you have cloned pyodbc:
cd "root_folder"
git activate
git clone https://github.com/mkleehammer/pyodbc.git --depth 1
On your local machine, go into the cloned directory, open a terminal, and run the command below:
python setup.py build
If it errors, try installing an appropriate C++ compiler (the error message might reveal this detail; on VS Code it gave the URL to open and download). Install it from that link.
Reboot the machine and run this again:
python setup.py build   # if this succeeds, continue with the next command
python setup.py install
After that you should be able to import and run the following from your local machine:
import pyodbc as pyodbc
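As a further sanity check after the build, a short sketch like the one below (connection values are the same placeholders as in the question) can confirm that the import is picking up the locally built package rather than the cloned source folder, and that the connect attribute is present; pyodbc.drivers() also lists the ODBC drivers available on the machine:
import pyodbc

# Confirm which copy of pyodbc is being imported and that the C extension built
print(pyodbc.__file__)
print(pyodbc.version)
print(pyodbc.drivers())

# Placeholder connection values - replace with your own
conn = pyodbc.connect("DRIVER={SQL Server};SERVER=servername;DATABASE=DB1;UID=user1;PWD=123")
cursor = conn.cursor()
cursor.execute("SELECT 1")
print("Successful connection to sql server")
conn.close()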

Connect to MySQL RDS using a Python Lambda

I am very new to Python and am doing some small tests to verify functionality.
I am currently trying to establish a connection between an RDS MySQL instance and a Python Lambda function.
However, it seems to fail in the code itself and I am not sure why this happens.
There are a couple of guides out there, but they all seem to be outdated and fail to work for me.
These are the steps I took to get it working (using macOS 12.3.1 and VS Code 1.62.3):
created a MySQL RDS instance
connected to the MySQL RDS instance and created a database called "igor" with a table named "lumigor" that has 2 columns, id and name (populated with random data)
created a folder on the local machine to contain the code and the package
installed Python 3.8.9
created the Lambda function file app.py with the following code:
import pymysql.cursors

# Connect to the database
connection = pymysql.connect(host='rds end point',
                             user='user',
                             password='pswrd',
                             database='igor',
                             cursorclass=pymysql.cursors.DictCursor)

with connection:
    with connection.cursor() as cursor:
        # Read a single record
        sql = "SELECT * FROM `Lumigor`"
        cursor.execute(sql)
        result = cursor.fetchone()
        print(result)
I added a requirements.txt file with the following command:
python3 -m pip install PyMySQL && pip3 freeze > requirements.txt --target...
But now I get an error from VS Code:
"Import "pymysql.cursors" could not be resolved from source (Pylance)"
When I zip the file, upload it to Lambda, and run a test, it returns an error:
{
  "errorMessage": "Unable to import module 'app': No module named 'pymysql.cursors'",
  "errorType": "Runtime.ImportModuleError",
  "stackTrace": []
}
It seems like the dependencies are missing even though I installed them and they exist in the directory.
The proper way to add pymysql to your Lambda is by creating a dedicated layer, as described in the AWS blog:
How do I create a Lambda layer using a simulated Lambda environment with Docker?
Create an empty folder, e.g. mylayer.
Go to the folder and create a requirements.txt file with the content:
PyMySQL
Run the following docker command (a newer community build image for Python 3.9, used in place of lambci/docker-lambda):
docker run --rm --volume "$PWD:/var/task" --workdir /var/task senorcoder/aws-lambda-env:python3.9_build pip install -Ur requirements.txt --target python
Create the layer as a zip:
zip -r mypymysqllayer.zip python > /dev/null
Create a Lambda layer based on mypymysqllayer.zip in the AWS Console. Don't forget to set Compatible runtimes to python3.9.
Add the layer to your function.
Alternatively, create your function as a Lambda container image.
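With the layer attached, a minimal handler sketch along these lines (endpoint, credentials, and table name are placeholders taken from the question) should be able to import pymysql and read a row:
import pymysql.cursors

def lambda_handler(event, context):
    # Placeholder RDS endpoint and credentials - replace with your own
    connection = pymysql.connect(host='rds end point',
                                 user='user',
                                 password='pswrd',
                                 database='igor',
                                 cursorclass=pymysql.cursors.DictCursor)
    with connection:
        with connection.cursor() as cursor:
            cursor.execute("SELECT * FROM `lumigor`")
            result = cursor.fetchone()
    return result
Note also that, independently of the layer, the function needs network access to the RDS instance (same VPC/subnets and a security group rule allowing port 3306), otherwise the connection will time out even though the import now succeeds.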

Chat history between a chatbot and a user (Microsoft Bot Framework) using Python

I want to save the chat conversation in a Cosmos DB database in the Azure portal.
I have created a Cosmos DB account and created a new database and container, after which I followed the steps in the official Microsoft docs. But whenever I test the bot and run it in the emulator, nothing from the conversation is added to the items, unlike what is shown in the docs. So, is there anything else I should do to save the conversation?
Note: the project is using Python.
These are the steps that I did in this project to save the conversation:
create a Cosmos DB account
add a database
add the Cosmos DB configuration information to the config file
install the Cosmos package using pip
implement Cosmos DB like the code shown below
run the bot locally
interact with the bot
Then the conversation should be displayed in the DB, but nothing is there.
This code is in the bot file:
def __init__(self, config: DefaultCon):
    cosmos_config = CosmosDbPartitionedConfig(
        DBendpoint=config.COSMOS_DB_URI,
        PrimaryKey=config.COSMOS_DB_PRIMARY_KEY,
        DataBaseID=config.COSMOS_DB_DATABASE_ID,
        ContainerID=config.COSMOS_DB_CONTAINER_ID,
        compatibility_mode=False
    )
    self.storage = CosmosDbPartitionedStorage(cosmos_config)
This code is in the config file (with the actual values in place of the placeholders):
COSMOS_DB_URI="<your-CosmosDb-URI>"
COSMOS_DB_PRIMARY_KEY="your-primary-key"
COSMOS_DB_DATABASE_ID="<your-database-id>"
COSMOS_DB_CONTAINER_ID="bot-storage"
Kindly check the following things if possible:
Cosmos DB configuration information:
COSMOS_DB_URI="<your-CosmosDb-URI>"
COSMOS_DB_PRIMARY_KEY="your-primary-key"
COSMOS_DB_DATABASE_ID="<your-database-id>"
COSMOS_DB_CONTAINER_ID="bot-storage"
Proper installation of the Cosmos packages (it is suggested to use a virtual environment to install the packages):
pip install botbuilder-azure
from botbuilder.azure import CosmosDbPartitionedStorage, CosmosDbPartitionedConfig
from config import DefaultConfig

CONFIG = DefaultConfig()

def __init__(self):
    cosmos_config = CosmosDbPartitionedConfig(
        cosmos_db_endpoint=CONFIG.COSMOS_DB_URI,
        auth_key=CONFIG.COSMOS_DB_PRIMARY_KEY,
        database_id=CONFIG.COSMOS_DB_DATABASE_ID,
        container_id=CONFIG.COSMOS_DB_CONTAINER_ID,
        compatibility_mode=False
    )
    self.storage = CosmosDbPartitionedStorage(cosmos_config)
Note: Creating a virtual environment in python
$ pip install virtualenv
$ virtualenv --version
$ virtualenv my_name - Giving suitable name for virtual environment
$ virtualenv -p /usr/bin/python3 virtualenv_name
$ source virtualenv_name/bin/activate - For activation
(virtualenv_name)$ deactivate - For deactivation
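One more thing worth checking, as an assumption about the parts of the bot not shown in the question: CosmosDbPartitionedStorage only writes items when it is actually used as the backing store for state and the state is saved at the end of each turn. A minimal sketch of that wiring with botbuilder-core:
from botbuilder.core import ConversationState, UserState
from botbuilder.azure import CosmosDbPartitionedStorage, CosmosDbPartitionedConfig
from config import DefaultConfig

CONFIG = DefaultConfig()

cosmos_config = CosmosDbPartitionedConfig(
    cosmos_db_endpoint=CONFIG.COSMOS_DB_URI,
    auth_key=CONFIG.COSMOS_DB_PRIMARY_KEY,
    database_id=CONFIG.COSMOS_DB_DATABASE_ID,
    container_id=CONFIG.COSMOS_DB_CONTAINER_ID,
    compatibility_mode=False
)
storage = CosmosDbPartitionedStorage(cosmos_config)

# Back the bot state with Cosmos DB instead of MemoryStorage
conversation_state = ConversationState(storage)
user_state = UserState(storage)

# Inside the bot's turn handler the state must be saved explicitly,
# otherwise nothing is written to the container:
#     await conversation_state.save_changes(turn_context)
#     await user_state.save_changes(turn_context)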

Adaptive server connection failed (DB-Lib error message 20002, severity 9)

I'm sure this issue has been raised an uncountable number of times before but perhaps, someone could still help me.
I am using pymssql v2.1.3 with Python 2.7.12 and the code that I used several times until yesterday to write data to my Azure SQL DB has somehow decided not to work anymore - for no apparent reason.
The firewall is set, my IP is in the whitelist, I can connect to the database using SQL Server Management Studio and query the data but I still keep getting this error when attempting to connect using pymssql.
The app is a Flask web-app and following is how I connect to the DB:
conn = pymssql.connect(server='myserver.database.windows.net', user='myusername@mydatabase', password='mypassword', database='mydatabase')
This is likely due to the pymssql version. Did you upgrade pymssql? If yes, try reverting back to 2.1.1
sudo pip install pymssql==2.1.1
Not really a solution to the issue I raised, but using pypyodbc instead of pymssql works.
conn = pypyodbc.connect(driver='{SQL Server}',server='tcp:myserver.database.windows.net,1433',database='mydatabase', uid='myusername', pwd='mypassword')
freetds-dev might be missing on Linux:
apt-get update && apt-get install freetds-dev
Unbelievable that the bug is still present...
ENV: WSL2 on Windows 10
Fix: switch to pyodbc:
sudo apt-get install unixodbc-dev && pip3 install pyodbc
Follow the instructions for Ubuntu:
https://learn.microsoft.com/de-de/sql/connect/odbc/linux-mac/installing-the-microsoft-odbc-driver-for-sql-server?view=sql-server-ver15
import pyodbc
server = 'tcp:myserver.database.windows.net'
database = 'mydb'
username = 'myusername'
password = 'mypassword'
cnxn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};SERVER='+server+';DATABASE='+database+';UID='+username+';PWD='+ password)
cursor = cnxn.cursor()
I uninstalled pymssql, installed it fresh, and it works for me now.
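For completeness, a small sketch of the downgrade-and-verify route mentioned above (server and credential values are placeholders; the user@servername form is what Azure SQL expects):
sudo pip install pymssql==2.1.1
Then, in a short verification script:
import pymssql

# Placeholder connection values - replace with your own
conn = pymssql.connect(server='myserver.database.windows.net',
                       user='myusername@myserver',
                       password='mypassword',
                       database='mydatabase')
cursor = conn.cursor()
cursor.execute("SELECT 1")
print(cursor.fetchone())
conn.close()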

Python module PyMySQL not taking arguments

I wrote a Python script that communicates with a MySQL server. It runs fine on my Mac (OSX 10.10), but not once I put it on a VPS running Ubuntu 14.04.
The problem mainly lies in the PyMySQL module: I can't even run the example script from their Git page here.
When running the following code:
import pymysql.cursors

# Connect to the database
connection = pymysql.connect(host='localhost',
                             user='user',
                             password='passwd',
                             db='db',
                             charset='utf8mb4',
                             cursorclass=pymysql.cursors.DictCursor)

try:
    with connection.cursor() as cursor:
        # Create a new record
        sql = "INSERT INTO `users` (`email`, `password`) VALUES (%s, %s)"
        cursor.execute(sql, ('webmaster@python.org', 'very-secret'))

    # connection is not autocommit by default. So you must commit to save
    # your changes.
    connection.commit()

    with connection.cursor() as cursor:
        # Read a single record
        sql = "SELECT `id`, `password` FROM `users` WHERE `email`=%s"
        cursor.execute(sql, ('webmaster@python.org',))
        result = cursor.fetchone()
        print(result)
finally:
    connection.close()
The result is
Traceback (most recent call last):
File "1.py", line 9, in <module>
cursorclass=pymysql.cursors.DictCursor)
File "build/bdist.linux-x86_64/egg/pymysql/__init__.py", line 93, in Connect
TypeError: __init__() got an unexpected keyword argument 'password'
Environment:
Ubuntu 14.04 x86_64
Python 2.7.10
PyMySQL 0.6.7
The Mac runs the above script fine, but Ubuntu does not.
Thanks in advance.
I didn't do much checking on the error message. Instead I uninstalled all the packages on the Ubuntu server and followed the package list from the Mac to install them again, one by one, to the same versions. Amazingly, the problem was solved. Unfortunately I can't tell which module was causing the problem.
You might want to give this a try if you don't want to spend much time updating modules one by one.
It seems that this is an Ubuntu problem, which is solved if you install the latest PyPI version.
I removed the Ubuntu package from dist-packages and installed the module into site-packages:
sudo apt-get purge python3-pymysql
sudo pip3 install --upgrade pymysql
You don't (really) want to do this, especially on a server. Alternatively, use a virtualenv:
virtualenv -p python3.4 venv3
source venv3/bin/activate
pip3 install --upgrade pip
pip3 install --upgrade pymysql
You need to learn a bit more about virtualenvs to use them, but the idea is that you have a complete and separate Python 3.4 environment installed in venv3, which you activate with source venv3/bin/activate. This way it is harder to break your Ubuntu install (which depends heavily on a working Python). Moreover, you can install different versions of packages in different virtualenvs. Also, when debugging, you can add print statements to Python code in the site-packages folder of the virtualenv without risking breaking your system severely. This is great, as many good Python libraries have poor error reporting.
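Whichever route you take, a quick check like the one-liner below (assuming pymysql is importable at all) shows which copy of PyMySQL the interpreter is actually picking up and which version it is:
python3 -c "import pymysql; print(pymysql.__file__, pymysql.VERSION)"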
I faced this error with Python 3.6 and PyMySQL3 0.5.
Solution: change the 'password' parameter to 'passwd'.
Below is the parameter list with default values:
host="localhost", user=None, passwd="", db=None, port=3306, unix_socket=None, charset='', sql_mode=None, read_default_file=None, conv=decoders, use_unicode=None, client_flag=0, cursorclass=Cursor, init_command=None, connect_timeout=None, ssl=None, read_default_group=None, compress=None, named_pipe=None
