Python multiprocessing with importlib

I want to use multiprocessing for importing modules. The 'conf' variable holds a large list of module names. The function loads the class names from each module and checks whether tables for those classes exist; if they don't, they are created in the database.
import importlib
from mymodule import exceptions
from datetime import datetime
from multiprocessing import Process
failed = []  # modules that could not be imported

def process_import(ormModule):
    print "importing ...", ormModule
    t1 = datetime.now()
    try:
        importlib.import_module(ormModule)
    except ImportError:
        exceptions.log("Unable to import ORM module %s", ormModule)
        failed.append(ormModule)
    except:
        exceptions.log("Exception while importing ORM module %s", ormModule)
        failed.append(ormModule)
    t2 = datetime.now()
    total = t2 - t1
    print "finished importing %s %s" % (ormModule, total.total_seconds())

def loadClasses(conf):
    for ormModule in conf:
        Process(target=process_import, args=(ormModule,)).start()
Some of the modules import fine, but others fail with:
sqlalchemy.exc.OperationalError: (OperationalError) server closed the connection unexpectedly
This probably means the server terminated abnormally
before or while processing the request.
'select relname from pg_class c join pg_namespace n on n.oid=c.relnamespace where n.nspname=current_schema() and relname=%(name)s' {'name': u'document1'}
How can I fix this problem? I'm using PostgreSQL.
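A hedged note, not from the original thread: a frequently reported cause of this error is that worker processes forked by multiprocessing inherit the parent's already-open psycopg2/SQLAlchemy connection, and concurrent use of that shared socket makes PostgreSQL drop it. A minimal sketch of the usual workaround, assuming a module-level SQLAlchemy engine (the myapp.db module and engine name below are hypothetical), is to dispose of the inherited connection pool at the start of each child so every worker opens its own connections:

import importlib
from multiprocessing import Process
from myapp.db import engine  # hypothetical: wherever your project builds its SQLAlchemy engine

def process_import(ormModule):
    # Discard connections inherited across fork(); the worker then lazily opens fresh ones.
    engine.dispose()
    importlib.import_module(ormModule)

def loadClasses(conf):
    for ormModule in conf:
        Process(target=process_import, args=(ormModule,)).start()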

Related

Python 3 not completing mysql delete query

I'm working on upgrading a legacy script to Python 3, but the script hangs during a database delete command (DELETE FROM). The script shows no error, and the logger contains only the is_connected result, which is True. Here's my test script, based on the main.py file but containing only the call that clears a table and resets its auto increment prior to repopulating it.
Here's my test.py file.
#!/usr/bin/env python3
import json
import hashlib
from pprint import pprint
import mysql.connector
import configparser
import re
import random
import requests
import sys
import logging
# Load config for database variables
config = configparser.ConfigParser()
config.read("config.ini")
logFile = "logger.log"
# Set up logging
logging.basicConfig(format='%(asctime)s %(message)s', filename=logFile,level=logging.DEBUG)
# Connect to the MySQL database
cnx = mysql.connector.connect(
    host=config["MySQL"]["host"],
    user=config["MySQL"]["user"],
    port=config["MySQL"]["port"],
    passwd=config["MySQL"]["password"],
    database=config["MySQL"]["database"]
)
cur = cnx.cursor()
logging.debug(cnx.is_connected())
# Clear the database ready to re-import
# Clear lookup tables first
cur.execute("DELETE FROM member_additional_skills")
logging.debug("Delete Done")
cur.execute("ALTER TABLE member_additional_skills AUTO_INCREMENT = 1")
cnx.commit()
logging.debug("Finished!")
print("Done")
I've left this running for 20 minutes and still nothing else is logged after it reports True for is_connected, and the process is still running. Is there anything I've missed here?
*** EDIT ***
The process is still in htop but is not using CPU, so it seems to have crashed, right? And as I write this I have the following output on my python3 test.py command line:
client_loop: send disconnect: Broken pipe, and the process is no longer in htop.
I should point out that this table has no more than 30 rows in it, so I would expect it to complete in milliseconds.
thanks
You can use TRUNCATE member_additional_skills, or DROP TABLE member_additional_skills and CREATE TABLE member_additional_skills(....);
that is much faster.
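For illustration, a minimal sketch of that suggestion with mysql.connector; the connection values are placeholders for whatever is in config.ini, and note that TRUNCATE causes an implicit commit in MySQL and cannot be rolled back:

import mysql.connector

# Placeholder credentials; substitute the values from config.ini.
cnx = mysql.connector.connect(host="localhost", user="root", passwd="secret", database="members")
cur = cnx.cursor()

# Empties the table and resets AUTO_INCREMENT in one statement (implicit commit).
cur.execute("TRUNCATE TABLE member_additional_skills")

cur.close()
cnx.close()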
You might be having a lock issue:
Sometimes with the MySQL connector, table locks aren't released properly, making the table unchangeable.
Try resetting the database and trying again!
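If you suspect a lock, one way to check from a second connection (standard MySQL diagnostic statements, not part of the original answer) is:

import mysql.connector

# Placeholder credentials; use the same database as the hanging script.
diag = mysql.connector.connect(host="localhost", user="root", passwd="secret", database="members")
cur = diag.cursor()

# Tables currently held open/locked by some session.
cur.execute("SHOW OPEN TABLES WHERE In_use > 0")
print(cur.fetchall())

# What every connection is doing; look for a session stuck on the DELETE.
cur.execute("SHOW FULL PROCESSLIST")
for row in cur.fetchall():
    print(row)

diag.close()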

How to use an import to only one function?

How can I import a module in Python, but only to one function?
My imagination:
def func():
    localize from myModule import helperFunc  # localize means "to this scope only"
    helperFunc("do something magical")  # but it doesn't exist

try:
    helperFunc("do something magical")
except NameError:
    print("'helperFunc' doesn't exist at this scope")  # this would get run
The problem here is that localize doesn't exist. Is there something in Python to simulate that?
You can just import modules normally:
def choose5(lst):
    from random import choices
    # choices is imported here
    return choices(lst, k=5)

print(choose5([1, 2, 3]))
# choices is not imported here

def my_now():
    from datetime import datetime
    return datetime.now()

try:
    print("Success", datetime.now())
except NameError:
    print("NameError occurred", my_now())
Running the above code will give you the output
NameError occurred 2020-11-06 21:20:19.930863
datetime was scoped to the my_now function. Trying to call datetime.now() directly failed, so the function my_now was called, and that was able to call datetime.now() successfully.
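One design note, not from the original answer: after the first successful import Python caches the module object in sys.modules, so a function-local import on later calls is essentially a dictionary lookup rather than a re-import. A small sketch:

import sys

def my_now():
    from datetime import datetime  # cheap after the first call: resolved from sys.modules
    return datetime.now()

my_now()
print("datetime" in sys.modules)  # True: the module object is cached process-wide
print("datetime" in dir())        # False: the name is only bound inside my_now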

py.test - How to inherit other tests

So let's say that I have two files (test_file1.py, test_file2.py) for integration testing using py.test.
The test_file1.py is something like this:
import datetime
import pytest

Datetime = datetime.datetime.now()

def test_connect():
    #1st Query to a mysql database
    #2nd Query to a mysql database
    ..
    #N Query to a mysql database
Now I'm writing test_file2.py, which is an extension of test_file1.py, but I don't want to repeat the same MySQL queries that I wrote in the test above.
How can I make py.test inherit the above test and run both when executing py.test test_file2.py?
Something like this (test_file2.py Contents):
import datetime
import pytest
from testDirectory import test_file1

Datetime = datetime.datetime.now()

def test_connect():
    #Here should run all the tests from 'test_file1' somehow...
    #1st new additional Query to a mysql database
    #2nd new additional Query to a mysql database
    ..
    #N new additional Query to a mysql database
Thanks!!
When you import a module, it will execute all of the code inside it. So just write the code you want executed in your original file. For example add the call to the function in your file like this:
test_file1.py:
import datetime
import pytest

Datetime = datetime.datetime.now()

def test_connect():
    #1st Query to a mysql database
    #2nd Query to a mysql database
    ..
    #N Query to a mysql database

test_connect()  # This will run your function when you import
So then in your py.test when you call import test_file1, it will execute the test_connect() and any other code you would like without doing anything else.
In other words, here is a really simple example with 3 files:
File 1: hello_world.py:
def hello_world():
    print('hello world!')

hello_world()
File 2: print_text.py:
def print_text():
    print('foo bar baz')

print_text()
File 3: run_everything.py:
import hello_world
import print_text
Result when you run run_everything.py:
>>>hello world!
>>>foo bar baz
If you want the function to be executed when the file is executed directly, but not imported as a module, you can do this:
test_file1.py:
import datetime
import pytest

Datetime = datetime.datetime.now()

def test_connect():
    #1st Query to a mysql database
    #2nd Query to a mysql database
    ..
    #N Query to a mysql database

def main():
    # This will _not_ run your function when you import. You would
    # have to use test_file1.test_connect() in your py.test.
    test_connect()

if __name__ == '__main__':
    main()
So in this example, your py.test would be:
import test_file1
test_file1.test_connect()
First, create a fixture in conftest.py:
import pytest
import MySQLdb

@pytest.fixture
def db_cursor(request):
    db = MySQLdb.connect(host="localhost", user="root")
    cursor = db.cursor()
    cursor.execute("SELECT USER()")
    data = cursor.fetchone()
    assert 'root@localhost' in data
    yield cursor
    db.close()
Then use it in your test modules:
# test_file1.py
def test_a(db_cursor)
pass
# test_file2.py
def test_b(db_cursor)
res = db_cursor.execute("SELECT VERSION()")
assert '5.5' in res.fetchone()
P.S.
It is possible to use any other modules as well; just inject them into your tests with the pytest_plugins directive:
# conftest.py
pytest_plugins = '_mysql.cursor'

# _mysql/__init__.py

# _mysql/cursor.py
import pytest
import MySQLdb

@pytest.fixture
def db_cursor(request):
    db = MySQLdb.connect(host="localhost", user="root")
    cursor = db.cursor()
    cursor.execute("SELECT USER()")
    data = cursor.fetchone()
    assert 'root@localhost' in data
    yield cursor
    db.close()

Shebang line #!/usr/bin/python3 preventing server run

Here's the code I have. Basically I have the shebang line in there because psycopg2 wasn't working without it.
But now that I have this line in there it doesn't allow me to run the server; it just says "no module named 'flask'".
#!/usr/bin/python3.4
#
# Small script to show PostgreSQL and Pyscopg together
#
from flask import Flask, render_template
from flask import request
from flask import *
from datetime import datetime
from functools import wraps
import time
import csv
import psycopg2
app = Flask(__name__)
app.secret_key ='lukey'
def getConn():
    connStr = ("dbname='test' user='lukey' password='lukey'")
    conn = psycopg2.connect(connStr)
    return conn

@app.route('/')
def home():
    return render_template('index.html')

@app.route('/displayStudent', methods=['GET'])
def displayStudent():
    residence = request.args['residence']
    try:
        conn = None
        conn = getConn()
        cur = conn.cursor()
        cur.execute('SET search_path to public')
        cur.execute('SELECT stu_id,student.name,course.name,home_town FROM student,\
            course WHERE course = course_id AND student.residence = %s', [residence])
        rows = cur.fetchall()
        if rows:
            return render_template('stu.html', rows=rows, residence=residence)
        else:
            return render_template('index.html', msg1='no data found')
    except Exception as e:
        return render_template('index.html', msg1='No data found', error1=e)
    finally:
        if conn:
            conn.close()

# @app.route('/addStudent', methods=['GET','POST'])
# def addStudent():

if __name__ == '__main__':
    app.run(debug=True)
This is an environment problem, not a flask, postgres or shebang problem. A specific version of Python is being called, and it is not being given the correct path to its libraries.
Depending on what platform you are on, changing your shebang to #!/usr/bin/env python3 may fix the problem, but if not (quite possibly not, though using env is considered better, more portable practice these days), then you may need to add your Python 3 libs location manually in your code.
sys.path.append("/path/to/your/python/libs")
If you know where your Python libs are (or maybe flask is installed somewhere peculiar?), then you can add that location to the path, and any import that runs after the line where you extended the path will include it in its search for modules.
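A minimal sketch of that approach, using a placeholder path; replace it with the directory that actually contains your Flask install (for example, the Location shown by pip3 show flask):

#!/usr/bin/env python3
import sys

# Hypothetical location; point this at the site-packages directory that holds flask.
sys.path.append("/usr/local/lib/python3.4/dist-packages")

from flask import Flask  # resolvable now that its directory is on sys.path

app = Flask(__name__)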

python, accessing a psycopg2 form a def?

I'm trying to collect a group of defs in one file so that I can just import them whenever I want to write a script in Python.
I have tried this:
def get_dblink( dbstring):
    """
    Return a database cnx.
    """
    global psycopg2
    try:
        cnx = psycopg2.connect( dbstring)
    except Exception, e:
        print "Unable to connect to DB. Error [%s]" % ( e,)
        exit( )
but I get this error: global name 'psycopg2' is not defined
In my main file script.py I have:
import psycopg2, psycopg2.extras
from misc_defs import *
hostname = '192.168.10.36'
database = 'test'
username = 'test'
password = 'test'
dbstring = "host='%s' dbname='%s' user='%s' password='%s'" % ( hostname, database, username, password)
cnx = get_dblink( dbstring)
can anyone give me a hand?
You just need to import psycopg2 in your first snippet.
If you need to, there's no problem in 'also' importing it in the second snippet (Python makes sure the modules are only imported once). Trying to use globals for this is bad practice.
So: at the top of every module, import every module which is used within that particular module.
Also: note that from x import * (with wildcards) is generally frowned upon: it clutters your namespace and makes your code less explicit.
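To make that concrete, here is a minimal corrected sketch of the helper module under the same assumptions as the question (Python 2 syntax, file named misc_defs.py); the only real changes are importing psycopg2 at the top and returning the connection:

# misc_defs.py
import psycopg2

def get_dblink(dbstring):
    """
    Return a database cnx.
    """
    try:
        return psycopg2.connect(dbstring)
    except Exception, e:
        print "Unable to connect to DB. Error [%s]" % (e,)
        exit()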
