I am trying to run a simple test to connect to a MySQL database using SQLAlchemy.
The code is as follows:
from sqlalchemy import (create_engine, Table, Column, Integer, String, MetaData)
import settings
import sys
try:
    db = create_engine('mysql://daniel:dani@localhost/test')
    db.connect()
except Exception:
    print('oops:', sys.exc_info()[1])
I get the following error:
dlopen(//anaconda/lib/python3.5/site-packages/_mysql.cpython-35m-darwin.so, 2): Library not loaded: libssl.1.0.0.dylib
Referenced from: //anaconda/lib/python3.5/site-packages/_mysql.cpython-35m-darwin.so
Reason: image not found
[Finished in 1.4s]
But running on terminal:
locate libssl.1.0.0.dylib
I get:
/Applications/Dtella.app/Contents/Frameworks/libssl.1.0.0.dylib
/Applications/XAMPP/xamppfiles/lib/libssl.1.0.0.dylib
/Users/dpereira14/anaconda/envs/dato-env/lib/libssl.1.0.0.dylib
/Users/dpereira14/anaconda/lib/libssl.1.0.0.dylib
/Users/dpereira14/anaconda/pkgs/openssl-1.0.1k-1/lib/libssl.1.0.0.dylib
/anaconda/lib/libssl.1.0.0.dylib
/anaconda/pkgs/openssl-1.0.2g-0/lib/libssl.1.0.0.dylib
/opt/local/lib/libssl.1.0.0.dylib
/usr/local/Cellar/openssl/1.0.1j/lib/libssl.1.0.0.dylib
I have no clue how to fix this error.
Thanks!
I also had some problems with SQLAlchemy and MySQL. I changed localhost in the create_engine call to 127.0.0.1:port, and also had to use pymysql. It ended up working with this:
engine = create_engine('mysql+pymysql://user:password@127.0.0.1:port/db')
pymysql is installed via pip.
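One detail worth noting: if the password itself contains URL-special characters such as `@` or `#`, it has to be percent-encoded before being embedded in the connection URL, or SQLAlchemy will split the URL in the wrong place. A minimal sketch (the user, password, and port below are placeholders):

```python
from urllib.parse import quote_plus

# Percent-encode the password so characters like '#' or '@' survive
# inside the URL (credentials and port here are placeholders).
password = quote_plus("dani#pass")
url = "mysql+pymysql://daniel:%s@127.0.0.1:3306/test" % password
print(url)
```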
You are using Python, so you have to specify the mysqldb driver for MySQL. Try the code below.
try:
    db = create_engine('mysql+mysqldb://daniel:dani@localhost/test')
    db.connect()
except Exception:
    print('oops:', sys.exc_info()[1])
Let's see if you can help me with this:
I am trying to upload a GPKG (geo-referenced) file (table) to a MariaDB server (MySQL).
Not to Postgres; to MariaDB (MySQL).
If I run the import from an OSGeo4W console, it works perfectly as a single string, for example:
string = ogr2ogr -f MySQL MySQL:ServerX,host=10.00.00.00,user=userX,password=passX E:\QGIS\cup.shp -nln datatable_name -update -overwrite -lco engine=MYISAM
but I don't see how to integrate it into Python code.
What I do is call that string with os.system: os.system(string),
I have imported in the module:
import mysql.connector
from osgeo import ogr
etc
but it always returns an error:
ERROR 1: Unable to find driver `MySQL'.
Thank You
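The "Unable to find driver `MySQL'" message usually means the GDAL build your Python bindings link against was compiled without MySQL support, unlike the GDAL that ships with OSGeo4W, so the driver simply is not present in that installation. Independently of that, shelling out is more robust with subprocess than with os.system; a sketch using the placeholders from the command above:

```python
import shutil
import subprocess

# Same ogr2ogr invocation as the console version, as an argument list
# (server address, credentials, and paths are the question's placeholders).
cmd = [
    "ogr2ogr", "-f", "MySQL",
    "MySQL:ServerX,host=10.00.00.00,user=userX,password=passX",
    r"E:\QGIS\cup.shp",
    "-nln", "datatable_name",
    "-update", "-overwrite",
    "-lco", "engine=MYISAM",
]

# Only run it if an ogr2ogr binary (ideally the OSGeo4W one, whose GDAL
# includes the MySQL driver) is actually on the PATH.
if shutil.which("ogr2ogr"):
    subprocess.run(cmd, check=True)
```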
I'm working on upgrading a legacy script to Python 3; however, the script hangs during a database delete command (DELETE FROM). The script shows no error, and the logger contains only the is_connected result, which is true. Here's my test script, based on the main.py file but containing only the call to delete the contents of a table and reset its auto-increment prior to repopulating the table.
Here's my test.py file.
#!/usr/bin/env python3
import json
import hashlib
from pprint import pprint
import mysql.connector
import configparser
import re
import random
import requests
import sys
import logging
# Load config for database variables
config = configparser.ConfigParser()
config.read("config.ini")
logFile = "logger.log"
# Set up logging
logging.basicConfig(format='%(asctime)s %(message)s', filename=logFile,level=logging.DEBUG)
# Connect to the MySQL database
cnx = mysql.connector.connect(
host=config["MySQL"]["host"],
user=config["MySQL"]["user"],
port=config["MySQL"]["port"],
passwd=config["MySQL"]["password"],
database=config["MySQL"]["database"]
)
cur = cnx.cursor()
logging.debug(cnx.is_connected())
# Clear the database ready to re-import
# Clear lookup tables first
cur.execute("DELETE FROM member_additional_skills")
logging.debug("Delete Done")
cur.execute("ALTER TABLE member_additional_skills AUTO_INCREMENT = 1")
cnx.commit()
logging.debug("Finished!")
print("Done")
I've left this running for 20 minutes and still nothing else is logged after it declares True for being connected, and the process is still running. Is there anything I've missed here?
*** EDIT ***
The process is still in htop but is not using any CPU, so it seems to have crashed, right? And as I write this, I have the following output from my python3 test.py command line:
client_loop: send disconnect: Broken pipe, and the process is no longer in htop.
I should point out that this table has no more than 30 rows in it, so I would expect it to complete in milliseconds.
thanks
You can use TRUNCATE TABLE member_additional_skills, or DROP TABLE member_additional_skills followed by CREATE TABLE member_additional_skills (...).
That is much faster, and TRUNCATE also resets the AUTO_INCREMENT counter for you.
You might be having a lock issue:
Sometimes with the MySQL connector, the table locks aren't released properly, making the table unchangeable.
Try restarting the database and trying again!
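One way to check whether an earlier, uncommitted transaction is holding the lock is to query information_schema.innodb_trx from a second connection. A sketch (it assumes an already-open mysql.connector connection passed in as cnx; the helper name is made up):

```python
# SQL that lists currently open InnoDB transactions; a transaction stuck
# in state RUNNING with an old trx_started is a likely lock holder.
OPEN_TRX_SQL = (
    "SELECT trx_id, trx_state, trx_started, trx_query "
    "FROM information_schema.innodb_trx"
)

def list_open_transactions(cnx):
    """Return rows for InnoDB transactions that may be holding locks."""
    cur = cnx.cursor()
    cur.execute(OPEN_TRX_SQL)
    rows = cur.fetchall()
    cur.close()
    return rows
```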
I am trying to run Python code inline in an AWS Lambda function.
I am not zipping any files, just pasting the code below into the Lambda function.
And I am getting this error:
errorMessage: "Unable to import module 'UpdateHost_Python'"
import psycopg2

def lambda_handler(event, context):
    conn_string = "dbname='myfirstdb' port='5432' user='db28' password='######' host='#####.ck0zbnniqteb.us-east-2.rds.amazonaws.com'"
    conn = psycopg2.connect(conn_string)
    cursor = conn.cursor()
    cursor.execute("select * from unnmesh")
    conn.commit()
    cursor.close()
    print("working")
For non-standard Python libraries (like psycopg2), you will need to create a Deployment Package.
This involves creating a Zip file with the libraries, then uploading the Zip file to Lambda.
See: AWS Lambda Deployment Package in Python - AWS Lambda
For a worked-through example, see also: Tutorial: Using AWS Lambda with Amazon S3 - AWS Lambda (I know you're not using Amazon S3, but the tutorial gives an example of building a package with dependencies.)
I have a directory (/home/usuario/Desktop/Example) with a database (MyBBDD.db) and a script (script.py) that runs an UPDATE command.
If I'm in the "Example" directory in the terminal, script.py works fine, but if I'm not in the "Example" directory and I execute script.py like this:
python /home/usuario/Desktop/Example/script.py
it doesn't work; the error is: "no such table: name_table".
Does anybody know what the problem is?
Thanks in advance.
Best regards.
Code from the comments, script.py:
import urllib
import sqlite3
conn = sqlite3.connect('MyBBDD.db')
c = conn.cursor()
c.execute ("UPDATE...")
conn.commit()
c.close()
conn.close()
When you create a connection object with sqlite3 in script.py, use the absolute file path, i.e.
con = sqlite3.connect('/home/usuario/Desktop/Example/MyBBDD.db')
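A variation that avoids hard-coding the directory is to derive the path from the script's own location, so it works no matter where the script is launched from (a sketch; it assumes MyBBDD.db lives next to script.py):

```python
import os
import sqlite3

# Build the database path relative to this file, not to the current
# working directory, so launching from any directory finds the same DB.
base_dir = os.path.dirname(os.path.abspath(__file__))
db_path = os.path.join(base_dir, "MyBBDD.db")
conn = sqlite3.connect(db_path)
conn.close()
```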
I'm trying to collect a group of function definitions in one file so that I can just import them whenever I write a Python script.
I have tried this:
def get_dblink( dbstring):
    """
    Return a database cnx.
    """
    global psycopg2
    try:
        cnx = psycopg2.connect( dbstring)
    except Exception, e:
        print "Unable to connect to DB. Error [%s]" % ( e,)
        exit( )
but i get this error: global name 'psycopg2' is not defined
In my main file, script.py, I have:
import psycopg2, psycopg2.extras
from misc_defs import *
hostname = '192.168.10.36'
database = 'test'
username = 'test'
password = 'test'
dbstring = "host='%s' dbname='%s' user='%s' password='%s'" % ( hostname, database, username, password)
cnx = get_dblink( dbstring)
Can anyone give me a hand?
You just need to import psycopg2 in your first snippet.
If you need to, there's no problem with also importing it in the second snippet (Python makes sure modules are only imported once). Trying to use globals for this is bad practice.
So: at the top of every module, import every module which is used within that particular module.
Also: note that from x import * (with wildcards) is generally frowned upon: it clutters your namespace and makes your code less explicit.
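The fix in miniature, using the stdlib's sqlite3 so the sketch is self-contained (the same pattern applies with psycopg2): the helper module does its own import at the top instead of relying on a global injected by the caller.

```python
# misc_defs.py (sketch): import what this module uses at the top.
import sqlite3

def get_dblink(dbstring):
    """Return a database connection, or exit with a message on failure."""
    try:
        cnx = sqlite3.connect(dbstring)
    except Exception as e:
        print("Unable to connect to DB. Error [%s]" % (e,))
        raise SystemExit(1)
    return cnx

# Caller side: no globals needed, just call the helper.
cnx = get_dblink(":memory:")
```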