I have a Python application that reads Python scripts, runs them, and returns the values:
main.py
import mysql.connector

def Exec(id):
    try:
        connection = mysql.connector.connect(host='localhost',
                                             user='root',
                                             password='',
                                             database='mydb')
        # Fetch the python script from the pythontbl table
        # (parameterized query instead of format() to avoid SQL injection)
        sql_select_Query = "SELECT python FROM mydb.pythontbl WHERE id=%s"
        cursor = connection.cursor()
        cursor.execute(sql_select_Query, (id,))
        # get all records
        script = cursor.fetchall()
        # execute the python script with arguments
        ??
        # return value should be saved in out
        out = ???
        print("output", out) ??
    except mysql.connector.Error as e:
        print("Error reading data from MySQL table", e)
    finally:
        if connection.is_connected():
            cursor.close()
            connection.close()
            print("MySQL connection is closed")
How can I execute the Python script that I fetched in main.py, pass arguments to it, and get the result back?
I cannot use "import script", because I am fetching the script text through main.py.
Create another file named fetchscript.py. In that file, create the connection and fetch the script from your table, writing the fetched code out as pythonscript.py. From main.py, call fetchscript.py, then import pythonscript and call the desired function from pythonscript.py.
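If writing the fetched code to a file is undesirable, it can also be executed directly with exec() into a fresh namespace. A minimal sketch; the stored script is assumed to define a function named run, which is an invented name for illustration, not something the question specifies:

```python
# Stands in for the text fetched from the pythontbl table,
# i.e. cursor.fetchall()[0][0] in the question's code.
script = "def run(x):\n    return x * 2\n"

namespace = {}
exec(script, namespace)        # compile and run the stored code
out = namespace["run"](21)     # call the function it defined, passing arguments
print("output", out)           # -> output 42
```

Note that exec() runs arbitrary code, so this is only safe if the rows in pythontbl are fully trusted.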
Related
I'm attempting to run an executable, "main.exe", that was built from three python modules (with "main.py" being the main script) using the pyinstaller module. The command that was used to build an executable from the scripts is,
pyinstaller --onefile main.py
This script invokes functions from the "tictactoe_office_release.py" script which establishes connection to a MySQL 8.0.31 server database for performing CRUD operations. When running the executable from the command line, I receive the following string of errors:
Error: 'Authentication plugin 'caching_sha2_password' is not supported'
Traceback (most recent call last):
File "main.py", line 24, in <module>
File "tictactoe_office_release.py", line 42, in __init__
File "mysql_python_lib.py", line 124, in __init__
File "mysql_python_lib.py", line 96, in read_query
AttributeError: 'NoneType' object has no attribute 'cursor'
[25743] Failed to execute script 'main' due to unhandled exception!
It is important to note, however, that my main.py script executes without errors when run outside of the executable. I have tried to troubleshoot the errors using numerous comments from "Authentication Plugin 'caching_sha2_password' is not supported", including the following:
1) uninstalling 'mysql-connector' and installing 'mysql-connector-python'
2) setting the 'auth_plugin' parameter to 'mysql_native_password' in the 'mysql.connector.connect()' function calls
3) modifying the MySQL encryption by running
ALTER USER 'root'@'localhost' IDENTIFIED WITH caching_sha2_password BY 'Panther021698';
but am receiving the same error after I re-build and run the executable.
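One further avenue sometimes relevant when the error appears only in the frozen build (an editor's sketch, not verified against this project): PyInstaller can fail to bundle the connector's authentication-plugin module, so the frozen app cannot load caching_sha2_password even though the plain interpreter run can. Declaring it as a hidden import at build time may help; the module path shown is typical of recent mysql-connector-python releases and should be confirmed against the installed package before relying on it.

```shell
# Hypothetical rebuild: bundle the auth plugin explicitly so the frozen app
# can import it. Confirm the module path against your installed connector
# version first (it has moved between releases).
pyinstaller --onefile \
    --hidden-import mysql.connector.plugins.caching_sha2_password \
    main.py
```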
The relevant code in my "tictactoe_office_release.py" module that depicts the function definitions for enabling communication between the Python interpreter and the MySQL server, and database is provided below:
from distutils.util import execute
import mysql.connector
from mysql.connector import Error
from mysql.connector.locales.eng import client_error

class mysql_python_connection:
    ''' Provide class definition for creating connection to MySQL server,
    initializing database, and executing queries '''

    def __init__(self):
        self.host_name = "localhost"
        self.user_name = "root"
        self.passwd = "Panther021698"

    def create_server_connection(self):
        ''' This function establishes a connection between the Python
        interpreter and the MySQL Community Server that we are attempting
        to connect to '''
        self.connection = None  # Close any existing connections
        try:
            self.connection = mysql.connector.connect(
                host=self.host_name,
                user=self.user_name,
                passwd=self.passwd
            )
            print("MySQL connection successful")
        except Error as err:
            print(f"Error: '{err}'")

    def create_database(self, query):
        ''' This function initializes a new database
        on the connected MySQL server '''
        cursor = self.connection.cursor()
        try:
            cursor.execute(query)
            print("Database created successfully")
        except Error as err:
            print(f"Error: '{err}'")

    def create_db_connection(self, db_name):
        ''' This function establishes a connection between the Python
        interpreter, the MySQL Community Server, and a database that we
        are initializing on the server '''
        self.connection = None  # Close any existing connections
        self.db_name = db_name
        try:
            self.connection = mysql.connector.connect(
                host=self.host_name,
                user=self.user_name,
                passwd=self.passwd,
                database=self.db_name
            )
            print("MySQL Database connection successful")
        except Error as err:
            print(f"Error: '{err}'")

    def execute_query(self, query):
        ''' This function takes SQL queries stored
        in Python as strings and passes them
        to the "cursor.execute()" method to
        execute them on the server '''
        cursor = self.connection.cursor()
        try:
            cursor.execute(query)
            self.connection.commit()  # Implements commands detailed in SQL queries
            print(query + " Query successful")
        except Error as err:
            print(f"Error: '{err}'")

    def read_query(self, query):
        ''' This function reads and returns data from
        a MySQL database using the specified query '''
        cursor = self.connection.cursor()
        print("cursor datatype is ")
        print(type(cursor))
        # result = None
        try:
            cursor.execute(query)
            result = cursor.fetchall()
            return result
        except Error as err:
            print(f"Error: '{err}'")
Additionally, my MySQL environment variables are provided in the image below.
I am trying to achieve the same thing as in the earlier question psycopg2: How to execute vacuum postgresql query in python script; however, the recommendation there to open an autocommit connection includes a link which is broken.
The code below runs without error, BUT the table is not vacuumed.
How does this need to be written to call VACUUM FULL correctly?
#!/usr/bin/python
import psycopg2
from config import config

def connect():
    """ Connect to the PostgreSQL database server """
    conn = None
    try:
        # read connection parameters
        params = config()
        # connect to the PostgreSQL server
        conn = psycopg2.connect(**params)
        conn.autocommit = True
        # create a cursor
        cur = conn.cursor()
        # execute VACUUM FULL
        cur.execute('VACUUM FULL netsuite_display')
        # close the communication with the PostgreSQL server
        cur.close()
    except (Exception, psycopg2.DatabaseError) as error:
        print(error)
    finally:
        if conn is not None:
            conn.close()
            print('Database connection closed.')

if __name__ == '__main__':
    connect()
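For reference, the underlying rule is that VACUUM refuses to run inside a transaction block, which is why the connection must be in autocommit mode before the cursor executes it. The same rule can be demonstrated with the standard library's sqlite3 standing in for psycopg2 (the DB-API pattern is the same; the table name just mirrors the question):

```python
import sqlite3

# With isolation_level=None the connection is in autocommit mode: no implicit
# transaction is open, so VACUUM is allowed even right after an INSERT.
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE netsuite_display (id INTEGER)")
conn.execute("INSERT INTO netsuite_display VALUES (1)")
conn.execute("VACUUM")   # succeeds: autocommit, no open transaction
conn.close()

# With the default isolation level the INSERT opens a transaction, and
# VACUUM then fails with "cannot VACUUM from within a transaction".
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE netsuite_display (id INTEGER)")
conn.execute("INSERT INTO netsuite_display VALUES (1)")
try:
    conn.execute("VACUUM")
    vacuum_failed = False
except sqlite3.OperationalError:
    vacuum_failed = True
conn.close()
print(vacuum_failed)  # -> True
```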
I have tried a lot, but I am unable to copy data available as a JSON file in an S3 bucket (I have read-only access to the bucket) to a Redshift table using Python boto3. Below is the Python code I am using for the copy. Using the same code, I was able to create the tables into which I am trying to copy.
import configparser
import psycopg2
from sql_queries import create_table_queries, drop_table_queries

def drop_tables(cur, conn):
    for query in drop_table_queries:
        cur.execute(query)
        conn.commit()

def create_tables(cur, conn):
    for query in create_table_queries:
        cur.execute(query)
        conn.commit()

def main():
    try:
        config = configparser.ConfigParser()
        config.read('dwh.cfg')
        # conn = psycopg2.connect("host={} dbname={} user={} password={} port={}".format(*config['CLUSTER'].values()))
        conn = psycopg2.connect(
            host=config.get('CLUSTER', 'HOST'),
            database=config.get('CLUSTER', 'DB_NAME'),
            user=config.get('CLUSTER', 'DB_USER'),
            password=config.get('CLUSTER', 'DB_PASSWORD'),
            port=config.get('CLUSTER', 'DB_PORT')
        )
        cur = conn.cursor()
        # drop_tables(cur, conn)
        # create_tables(cur, conn)
        qry = """copy DWH_STAGE_SONGS_TBL
                 from 's3://udacity-dend/song-data/A/A/A/TRAAACN128F9355673.json'
                 iam_role 'arn:aws:iam::xxxxxxx:role/MyRedShiftRole'
                 format as json 'auto';"""
        print(qry)
        cur.execute(qry)
        # execute a statement
        # print('PostgreSQL database version:')
        # cur.execute('SELECT version()')
        #
        # # display the PostgreSQL database server version
        # db_version = cur.fetchone()
        # print(db_version)
        print("Executed successfully")
        cur.close()
        conn.close()
        # close the communication with the PostgreSQL server
    except Exception as error:
        print("Error while processing")
        print(error)

if __name__ == "__main__":
    main()
I don't see any error in the PyCharm console, but I see an Aborted status in the Redshift query console, and I can't see any reason why it was aborted (or I don't know where to look for that).
The other thing I have noticed is that when I run the COPY statement in the Redshift query editor, it runs fine and the data gets moved into the table. I tried deleting and recreating the cluster, but no luck. I am not able to figure out what I am doing wrong. Thank you.
Quick read - it looks like you haven't committed the transaction and the COPY is rolled back when the connection closes. You need to either change the connection configuration to be in "autocommit" or add an explicit "commit()".
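The rollback-on-close behaviour described above can be demonstrated with any DB-API driver; here is a sketch using the standard library's sqlite3 standing in for psycopg2/Redshift, since the pattern is the same:

```python
import os
import sqlite3
import tempfile

# Uncommitted work is rolled back when the connection closes.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

conn = sqlite3.connect(path)
conn.execute("CREATE TABLE t (x INTEGER)")
conn.commit()
conn.execute("INSERT INTO t VALUES (1)")      # no commit() here...
conn.close()                                  # ...so the INSERT is discarded

conn = sqlite3.connect(path)
rows_without_commit = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
conn.execute("INSERT INTO t VALUES (1)")
conn.commit()                                 # committed, so it persists
conn.close()

conn = sqlite3.connect(path)
rows_with_commit = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
conn.close()
print(rows_without_commit, rows_with_commit)  # -> 0 1
```

With psycopg2 the equivalent fixes are conn.autocommit = True before executing the COPY, or conn.commit() after it.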
Relatively new to Python scripts, so bear with me.
I have used speedtest-cli before. I have edited the script so it will insert the values into a SQL table, as below; however, I am having an issue with one of the inserts. It inserts ping and download fine, but the upload is always 2.74 or 2.75, for example, and ONLY when run from a crontab. Very weird.
If I run the python script from cli it will insert values fine.
This is my query, and the values ping, download and upload are coming from the speedtest-cli script.
Here is the full script
import os
import re
import subprocess
import time

import mysql.connector
from mysql.connector import Error
from mysql.connector import errorcode

print "----------------------------------"
print 'Started: {} {}'.format(time.strftime('%d/%m/%y %H:%M:%S'), "")

response = subprocess.Popen('speedtest-cli --simple', shell=True, stdout=subprocess.PIPE).stdout.read()

ping = re.findall(r'Ping:\s(.*?)\s', response, re.MULTILINE)
download = re.findall(r'Download:\s(.*?)\s', response, re.MULTILINE)
upload = re.findall(r'Upload:\s(.*?)\s', response, re.MULTILINE)

ping[0] = ping[0].replace(',', '.')
download[0] = download[0].replace(',', '.')
upload[0] = upload[0].replace(',', '.')

try:
    if os.stat('/var/www/html/speed/log.txt').st_size == 0:
        print 'Date,Time,Ping (ms),Download (Mbit/s),Upload (Mbit/s)'
except OSError:
    pass

print 'PING: {}, DOWN: {}, UP: {}'.format(ping[0], download[0], upload[0])

try:
    connection = mysql.connector.connect(host='localhost',
                                         database='dev',
                                         user='dev',
                                         password='dev1')
    sql_insert_query = """INSERT INTO speedtest(ping, download, upload) VALUES (%s,%s,%s)"""
    cursor = connection.cursor()
    cursor.execute(sql_insert_query, (ping[0], download[0], upload[0]))
    connection.commit()
    print "Insert success into speedtest tbl"
except mysql.connector.Error as error:
    connection.rollback()  # rollback if any exception occurred
    print "Failed inserting record into speedtest table {}".format(error)
finally:
    # closing database connection
    if connection.is_connected():
        cursor.close()
        connection.close()
        print "MySQL conn closed"

print 'Finished: {} {}'.format(time.strftime('%d/%m/%y %H:%M:%S'), "")
The script runs fine manually; it is only from crontab that I get unexpected values. Not sure how to solve this.
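When cron and an interactive shell disagree, it often helps to log the raw speedtest-cli output before parsing it, since cron runs with a different PATH and locale than a login shell. As a reference point, here is the parsing step in isolation, run against a made-up sample of the tool's --simple output (Python 3 shown, where the pipe yields bytes that need decoding):

```python
import re

# Made-up sample of `speedtest-cli --simple` output; in the real script this
# is what Popen(...).stdout.read() returns (bytes on Python 3).
response = b"Ping: 23.4 ms\nDownload: 94.21 Mbit/s\nUpload: 12.75 Mbit/s\n"
text = response.decode("utf-8")

ping = re.findall(r"Ping:\s(.*?)\s", text)[0]
download = re.findall(r"Download:\s(.*?)\s", text)[0]
upload = re.findall(r"Upload:\s(.*?)\s", text)[0]
print(ping, download, upload)  # -> 23.4 94.21 12.75
```

Logging `text` to a file from the cron run would show whether the tool itself reports the odd upload figure or the parsing picks up something unexpected.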
I am trying to run MySQL commands through Python source code and have the results display in my web browser.
When I run the code in an editor (Komodo Edit) it compiles and returns the expected set of rows, but when I try to view it in a browser, the source code is displayed instead of the query result I am after...
Here's what I have...
import mysql.connector

conn = None
try:
    conn = mysql.connector.Connect(host="localhost", user="jump", password="secret")
    print('Connected to MySQL!')
except Exception as ex:
    print('cannot connect to MySQL : exception : ' + str(ex))

cursor = conn.cursor()
cursor.close()

print('terminating connection with MySQL')
conn.close()
...I'm pretty sure the source is good, and that I've missed a configuration step or some such somewhere.
I can get my command prompt to run the source file and display the query results, so I have a working connection between Python and the MySQL server, which I also verified through MySQL Workbench.
Thanks for any help y'all can give!
-M
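An editor's note on the symptom above: when a browser shows the raw source, the web server is usually serving the .py file as plain text rather than executing it (e.g. CGI handling is not enabled for that directory or extension). Once the server does execute the script, the script must also emit an HTTP header before any other output. A minimal sketch of the response shape, with the MySQL result replaced by a placeholder list since the exact query is not shown:

```python
#!/usr/bin/python3
# Minimal CGI-style response: a Content-Type header, a blank line, then HTML.
# `rows` is a placeholder for cursor.fetchall(); the real values would come
# from the MySQL query.
rows = [("alice",), ("bob",)]

lines = ["Content-Type: text/html", ""]   # headers end at the blank line
lines.append("<html><body><ul>")
for (name,) in rows:
    lines.append("<li>%s</li>" % name)
lines.append("</ul></body></html>")

response = "\n".join(lines)
print(response)
```

The server-side fix (enabling CGI or a WSGI handler for the directory) depends on which web server is in use and is separate from the Python code itself.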