Closing connection to HBase database using happybase in Python

import ast
import happybase

def hbasePopulate(self, table="abc", MachineIP="xx.xx.xx.xx"):
    connection = happybase.Connection(MachineIP, autoconnect=True)
    tablename = Reptype.lower() + 'rep'  # Reptype is defined elsewhere in the class
    print "Connecting to table "
    print tablename
    try:
        table = connection.table(tablename)
        for key, data in table.scan():
            print key, data
        print table
    # except IOError as e:
    except:
        print "Table does not exist, creating"
        self.createTable(table=table, MachineIP=MachineIP)
    with table.batch() as b:
        with open('xxx.csv', 'r') as queryFile:
            for lines in queryFile:
                lines = lines.strip("\n")
                splitRecord = lines.split(",")
                key = splitRecord[0]
                key = key.replace("'", "")
                val = ",".join(splitRecord[1:])
                val = ast.literal_eval(val)  # parse the value dict out of the CSV row
                table.put(splitRecord[0], val)
    for key, data in table.scan():
        print key, data

def createTable(self, table="abc", MachineIP=""):
    connection = happybase.Connection(MachineIP, autoconnect=True)
    print "Connection Handle", connection
    tname = table.lower()
    tablename = str(tname)
    print "Creating table : " + table + ", On Hbase machine : " + MachineIP
    families = {"cf": {}}  # using the default column family
    connection.create_table(table, families=families)
    print "Creating table done "
Every time I run this script it populates data into the HBase table, but it leaves a connection open. When I check using netstat -an I see the connection count has increased, and the connections persist even after the script completes.
Am I missing something? Do we need to close the connection explicitly?
Thanks for helping.

Got the solution. It turns out to be this:
try:
    connection.close()
except Exception as e:
    print "Unable to close connection to hbase "
    print e

When the program exits, any network connections are closed automatically. What you're likely seeing is the TIME_WAIT state for connections that have already been closed.
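If you want the connection released deterministically while the process keeps running, wrapping the work in try/finally guarantees the close happens even when the populate step fails. A minimal sketch, reusing the question's names:

def hbasePopulate(self, table="abc", MachineIP="xx.xx.xx.xx"):
    connection = happybase.Connection(MachineIP, autoconnect=True)
    try:
        # ... scan, create and batch-insert as in the question ...
        pass
    finally:
        connection.close()  # always release the Thrift socket, even on errors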

Related

Asyncpg PostgreSQL connection issues

I'm using Python with asyncpg to interact with my PostgreSQL database.
After some time, if I don't interact with it and then try to do so, I get a 'connection is closed' error. Is this a server-side or a client-side configuration issue?
How do I solve it?
The database server automatically closes idle connections.
So I suggest you open the connection to the db just before running queries with asyncpg, and then close the connection right afterwards.
Furthermore, you can handle the errors you get when the connection is closed by properly raising exceptions.
Take a look at this example:
import asyncpg

# Note: this snippet must run inside an async function, and `connection`
# must be initialised (e.g. to None) before the check below.
print("I'm going to run a query with asyncpg")
# if the connection to the db is not open, then open it
if not connection:
    # try to open the connection to the db
    try:
        connection = await asyncpg.connect(
            host=YOUR_DATABASE_HOST,
            user=YOUR_DATABASE_USER,
            password=YOUR_DATABASE_PASS,
            database=YOUR_DATABASE_DB_NAME
        )
    except (Exception, asyncpg.ConnectionFailureError) as error:
        print("Error while connecting to db: {}".format(error))
else:
    # connection already up and running
    pass

QUERY_STRING = """
    INSERT INTO my_table(field_1, field_2)
    VALUES ($1, $2);
"""
try:
    await connection.execute(QUERY_STRING, value_to_assign_to_field_1, value_to_assign_to_field_2)
    return None
# except (Exception, asyncpg.UniqueViolationError) as integrError:
#     print("You are violating unique constraint.")
except (Exception, asyncpg.ConnectionFailureError) as error:
    print("Connection to db has failed. Cannot add data.")
    return "{}".format(error)
finally:
    if connection:
        await Utils.close_connection(connection)  # Utils is the answerer's own helper
        print("data has been added. closing the connection.")

SSH twice with Python [duplicate]

This question already has an answer here: Connecting to a server via another server using Paramiko
I would like to SSH into host1 first, then SSH to host2 to get some files. SSH-ing to host1 with Paramiko succeeds, but when I do the same for host2 it cannot connect, and shows: 'Unable to establish SSH connection: Server 'host2' not found in known_hosts'.
import paramiko
from paramiko.ssh_exception import AuthenticationException, SSHException, BadHostKeyException

try:
    client = paramiko.SSHClient()
    client.load_system_host_keys()
    client.connect('host1', username='user1', password='pass1', timeout=5)
    print("Accessed host1 already")
    try:
        client2 = paramiko.SSHClient()
        client2.load_system_host_keys()
        client2.connect('host2', username='user2', password='pass2', timeout=5)
        print("Accessed host2 already")
    except AuthenticationException as authException:
        print("Authentication failed, please verify your credentials: %s" % authException)
    except SSHException as sshException:
        print("Unable to establish SSH connection: %s" % sshException)
    except BadHostKeyException as badHostKeyException:
        print("Unable to verify server's host key: %s" % badHostKeyException)
    except Exception as e:
        print("Operation error: %s" % e)
except:
    print("SSH to host1 failed!!!")
I also tried running a command to get into host2, but it stays on host1 the whole time. I'm not sure this is the right way to do it. Please recommend how I can do this. Thank you.
stdin1, stdout1, stderr1 = client.exec_command('ssh user2@host2;pass2;cd /;ls')
rawd = stdout1.read().decode('ascii').strip("\n")
print(rawd)
For an initial connection, SSH asks whether you trust the remote computer. When you type yes, the host's key gets stored in ~/.ssh/known_hosts.
On the system where you run the script, try making an SSH connection manually in a console, let it store the server's info in that file, then start your program.
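Since the real goal is to reach host2 through host1, tunnelling the second connection over the first is the usual Paramiko approach. A minimal sketch, reusing the question's hostnames and credentials (AutoAddPolicy is used for brevity; loading known_hosts is safer):

import paramiko

jump = paramiko.SSHClient()
jump.set_missing_host_key_policy(paramiko.AutoAddPolicy())
jump.connect('host1', username='user1', password='pass1')

# Open a direct-tcpip channel from host1 to host2:22 and use it as the
# socket for the second SSH connection.
channel = jump.get_transport().open_channel(
    'direct-tcpip', ('host2', 22), ('127.0.0.1', 0))

target = paramiko.SSHClient()
target.set_missing_host_key_policy(paramiko.AutoAddPolicy())
target.connect('host2', username='user2', password='pass2', sock=channel)

stdin, stdout, stderr = target.exec_command('ls /')
print(stdout.read().decode())

target.close()
jump.close()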

Connecting to Postgres Database via Python

I am trying to connect to my database via Python 2.7 with this code:
import csv
import psycopg2

try:
    conn = psycopg2.connect("dbname='student', user='postgres',password='password', host='localhost'")
    cursor = conn.cursor()
    reader = csv.reader(open('last_file.csv', 'rb'))
    print "connected"
except:
    print "not Connected"
It did work last week and we don't think we've changed anything, but now it won't connect.
We've tried using it with the database open and closed; nothing worked.
The database does exist in Postgres.
import psycopg2

try:
    conn = psycopg2.connect("dbname='database_name' user='postgres_user_name' host='localhost' password='user_passwd'")
except:
    print "I am unable to connect to the database"

cur = conn.cursor()
cur.execute("""SELECT * from table_name""")
rows = cur.fetchall()
print "\nShow me the data:\n"
for row in rows:
    print " ", row[0]
    print " ", row[1]
To see what the error actually is, change the except clause like this:

except Exception as ex:
    print "not Connected"
    print "Error: " + str(ex)
Try this:
import csv
import psycopg2

try:
    conn = psycopg2.connect("dbname='student', user='postgres',password='password', host='localhost'")
except:
    print "I am unable to connect to the database."

cursor = conn.cursor()
try:
    reader = csv.reader(open('last_file.csv', 'rb'))
    print "connected"
except:
    print "not Connected"
It seems like there is something wrong with your Postgres; try looking at the Postgres log.
Default location of the Postgres log:
tail -f /var/log/postgresql/<>/main/postgresql.log
or something like this.
Also don't forget to check the firewall; maybe someone disabled it by accident.
You could also try the PyGreSQL package (pip install PyGreSQL), since some versions of psycopg2 are under the GPL license, which can be tricky for open-source licensing. Just for your information.

Parallel database data saving with Python

I want to save data to two databases in parallel in my system. There are two destinations where the data will be saved: first the local computer (so I'm using localhost instead of the IP address), and second a remote PC (which I access by its IP address). Here is my Python code:
import serial
import MySQLdb

dbConn = MySQLdb.connect(host="localhost", user="root", passwd="rooting", db="database")
dbConn2 = MySQLdb.connect(host="192.168.1.132", user="bobi", passwd="rooting", db="database")
cursor = dbConn.cursor()
cursor2 = dbConn2.cursor()
device = '/dev/ttyACM0'
while 1:
    try:
        arduino = serial.Serial(device, 9600)
    except:
        print "Failed to connect on", device
    try:
        data = arduino.readline()
        try:
            print("Frequency : " + data)
            cursor.execute("INSERT INTO frequency (Value) VALUES (%s)", (data,))
            dbConn.commit()
        except dbConn.IntegrityError:
            print "failed to insert data to localhost"
        try:
            cursor2.execute("INSERT INTO frequency (Value) VALUES (%s)", (data,))
            dbConn2.commit()
        except dbConn2.IntegrityError:
            print "failed to insert data to remote computer"
    except:
        print "Failed to get data from Arduino!"
I want the program to ignore dbConn2 when that connection cannot be made. The remote connection goes over an ethernet LAN, so when I unplug the LAN cable I want my program to keep running (still sending to localhost).
With the code above, when I unplug the ethernet LAN, data stops being sent to localhost as well. I have tried making changes but still get the problem. I'm using Python 2.7. I need your help, thank you :)
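One way to keep the local inserts going when the remote link drops is to catch all MySQLdb errors (not just IntegrityError) around each insert independently, so a failure on one connection never propagates to the other. A minimal sketch along those lines:

import MySQLdb

def try_insert(conn, value):
    # Attempt one insert on one connection; swallow any MySQLdb error
    # (e.g. OperationalError when the LAN is unplugged) and report it.
    try:
        cur = conn.cursor()
        cur.execute("INSERT INTO frequency (Value) VALUES (%s)", (value,))
        conn.commit()
        return True
    except MySQLdb.Error as e:
        print "insert failed:", e
        return False

In the main loop, call try_insert(dbConn, data) and try_insert(dbConn2, data) back to back; a remote failure then just returns False and the local insert is unaffected. Note that a dropped connection stays dead, so a fuller fix would also reconnect in the error path.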

How to insert into MySQL asynchronously using Python

Solved: every greenlet should have its own connection rather than sharing the same connection.
I want to insert a lot of data into a MySQL database. I use gevent to download data from the internet, then insert the data into MySQL. I found umysqldb for inserting into MySQL asynchronously. However I am getting the following error: Mysql Error 0: Concurrent access in query method.
My code is the following:
def insert_into_mysql(conn, cur, pid, user, time, content):
    try:
        value = [pid, user, time, content]
        # print 'value is', value
        print 'hi'
        cur.execute('insert into post(id,user,time,content) values(%s,%s,%s,%s)', value)
        print 'after execute'
        conn.commit()
    # except MySQLdb.Error as e:
    except umysqldb.Error as e:
        print "Mysql Error %d: %s" % (e.args[0], e.args[1])
insert_into_mysql is called from download_content:

while len(ids_set) != 0:
    id = ids_set.pop()
    print 'now id is', id
    pool.spawn(download_content, conn, cur, int(id))
    r.sadd('visited_ids', id)
pool.join()
ultramysql doesn't allow you to make multiple concurrent queries on the same MySQL connection; it just makes the driver async-friendly. So you will either need a new MySQL connection for each greenlet, or use locking primitives to make sure only one greenlet is using the connection at a time.
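For the locking option, gevent's own semaphore works as a mutex around the shared connection. A minimal sketch (the connection-per-greenlet approach scales better, but this is the smallest change to the question's code):

from gevent.lock import Semaphore

db_lock = Semaphore()  # only one greenlet may touch the connection at a time

def insert_into_mysql(conn, cur, pid, user, time, content):
    with db_lock:
        cur.execute(
            'insert into post(id,user,time,content) values(%s,%s,%s,%s)',
            [pid, user, time, content],
        )
        conn.commit()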
