Here's the code I have. I put the shebang line in there because psycopg2 wasn't working without it.
But now, with that line in there, the app won't run at all; it just says "No module named 'flask'".
#!/usr/bin/python3.4
#
# Small script to show PostgreSQL and psycopg2 together
#
from flask import Flask, render_template, request
from datetime import datetime
from functools import wraps
import time
import csv
import psycopg2

app = Flask(__name__)
app.secret_key = 'lukey'

def getConn():
    connStr = ("dbname='test' user='lukey' password='lukey'")
    conn = psycopg2.connect(connStr)
    return conn

@app.route('/')
def home():
    return render_template('index.html')

@app.route('/displayStudent', methods=['GET'])
def displayStudent():
    residence = request.args['residence']
    try:
        conn = None
        conn = getConn()
        cur = conn.cursor()
        cur.execute('SET search_path to public')
        cur.execute('SELECT stu_id,student.name,course.name,home_town FROM student,\
course WHERE course = course_id AND student.residence = %s', [residence])
        rows = cur.fetchall()
        if rows:
            return render_template('stu.html', rows=rows, residence=residence)
        else:
            return render_template('index.html', msg1='no data found')
    except Exception as e:
        return render_template('index.html', msg1='No data found', error1=e)
    finally:
        if conn:
            conn.close()

#@app.route('/addStudent', methods=['GET','POST'])
#def addStudent():

if __name__ == '__main__':
    app.run(debug=True)
This is an environment problem, not a Flask, Postgres, or shebang problem. A specific version of Python is being called, and it is not being given the correct path to its libraries.
Depending on what platform you are on, changing your shebang to #!/usr/bin/env python3 can fix the problem. If not (very likely not, though using env is considered better, more portable practice these days), then you may need to add your Python 3 libs location manually in your code:
import sys
sys.path.append("/path/to/your/python/libs")
If you know where your Python libs are (or maybe flask is installed somewhere peculiar?), then you can add that directory to the path, and imports following the line where you appended it will include it in their search for modules.
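If you are unsure which interpreter a shebang ends up invoking, or where that interpreter looks for modules, a small diagnostic like this can help (only the standard library is used; the appended path is a placeholder, not a real location):

```python
import sys

# Which interpreter is actually running this script?
print(sys.executable)

# The directories searched by 'import flask':
for p in sys.path:
    print(p)

# Appending a directory (placeholder path) makes later
# imports search there as well:
sys.path.append("/path/to/your/python/libs")
```

Running this with the exact shebang from the failing script shows immediately whether it is the interpreter you expect and whether your site-packages directory is on the list.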
Related
I am diving into the world of Python, practicing by writing a simple inventory application for my DVD collection as a way to get acquainted with sqlite3.
As part of my project, I am using an ini file for settings, then reading the values from it in a shared library that is called from another file. I am curious about feedback on my methods, especially my use of the config file, and best coding practices around it.
The config is formatted as follows, named config.ini
[main]
appversion = 0.1.0
datasource = data\database.db
My utils library is then as follows:
import os
import sqlite3
from configparser import ConfigParser

CONFIG_PATH = os.path.join(os.path.dirname(__file__), 'config/config.ini')

def get_settings(config_path=CONFIG_PATH):
    config = ConfigParser()
    config.read(config_path)
    return config

def db_connect():
    config = get_settings()
    con = sqlite3.connect(config.get('main', 'datasource'))
    return con
Finally, my test lookup, which does work, is:
from utils import db_connect

def asset_lookup():
    con = db_connect()
    cur = con.cursor()
    cur.execute("SELECT * FROM dvd")
    results = cur.fetchall()
    for row in results:
        print(row)
My biggest question is about building the data connection from within utils.py. First I'm reading the file, then, from within the same script, building the data connection from a setting in the ini file. This is then used by other files. This was my attempt at being modular, but I was not sure if it's proper.
Thanks in advance.
To directly answer your question, you could do something like this to cache your objects so you don't create/open them over and over again whenever you call one of the functions in utils.py:
import os
import sqlite3
from configparser import ConfigParser

CONFIG_PATH = os.path.join(os.path.dirname(__file__), 'config/config.ini')

config = None
con = None

def get_settings(config_path=CONFIG_PATH):
    global config
    if config is None:
        config = ConfigParser()
        config.read(config_path)
    return config

def db_connect():
    global con
    if con is None:
        config = get_settings()
        con = sqlite3.connect(config.get('main', 'datasource'))
    return con
While this might solve your problem, it relies heavily on global variables, which might cause problems elsewhere. Typically, that's where you switch to classes as containers for your code parts that belong together. For example:
import os
import sqlite3
from configparser import ConfigParser

class DVDApp:
    CONFIG_PATH = os.path.join(os.path.dirname(__file__), 'config/config.ini')

    def __init__(self):
        self.config = ConfigParser()
        self.config.read(self.CONFIG_PATH)
        self.con = sqlite3.connect(self.config.get('main', 'datasource'))

    def asset_lookup(self):
        cur = self.con.cursor()
        cur.execute("SELECT * FROM dvd")
        results = cur.fetchall()
        for row in results:
            print(row)
Initializing the config and connection objects held in self takes just three lines of code, which makes it almost unnecessary to split your code over several files. And even if you do, it would be enough to share the one instance of DVDApp between modules, which then holds all the other shared objects.
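As a self-contained sketch of that sharing (the temporary config file, database path, and dvd table below are stand-ins for your real ones), a single module-level DVDApp instance can be imported by any other module:

```python
import os
import sqlite3
import tempfile
from configparser import ConfigParser

# Stand-in setup so the sketch runs anywhere: a throwaway
# config.ini and database file in a temp directory.
tmp = tempfile.mkdtemp()
db_path = os.path.join(tmp, 'database.db')
cfg_path = os.path.join(tmp, 'config.ini')
with open(cfg_path, 'w') as f:
    f.write('[main]\nappversion = 0.1.0\ndatasource = %s\n' % db_path)

with sqlite3.connect(db_path) as con:
    con.execute('CREATE TABLE dvd (title TEXT)')
    con.execute("INSERT INTO dvd VALUES ('Blade Runner')")

class DVDApp:
    def __init__(self, config_path):
        self.config = ConfigParser()
        self.config.read(config_path)
        self.con = sqlite3.connect(self.config.get('main', 'datasource'))

    def asset_lookup(self):
        cur = self.con.cursor()
        cur.execute('SELECT * FROM dvd')
        return cur.fetchall()

# One shared instance; other modules import this object
# instead of re-reading the config themselves.
app = DVDApp(cfg_path)
print(app.asset_lookup())
```

Returning the rows instead of printing them inside asset_lookup also makes the method easier to test.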
So let's say that I have two files (test_file1.py, test_file2.py) for integration testing using py.test.
The test_file1.py is something like this:
import datetime
import pytest

Datetime = datetime.datetime.now()

def test_connect():
    #1st Query to a mysql database
    #2nd Query to a mysql database
    ..
    #N Query to a mysql database
Now I'm writing test_file2.py, which is an extension of test_file1.py, but I don't want to write the same mysql queries that I wrote in the above test.
How can I make py.test inherit the above tests and run both after executing py.test test_file2.py?
Something like this (test_file2.py Contents):
import datetime
import pytest
from testDirectory import test_file1

Datetime = datetime.datetime.now()

def test_connect():
    #Here should run all the tests from 'test_file1' somehow...
    #1st new additional Query to a mysql database
    #2nd new additional Query to a mysql database
    ..
    #N new additional Query to a mysql database
Thanks!!
When you import a module, it will execute all of the code inside it. So just write the code you want executed at the top level of your original file. For example, add a call to the function like this:
test_file1.py:
import datetime
import pytest

Datetime = datetime.datetime.now()

def test_connect():
    #1st Query to a mysql database
    #2nd Query to a mysql database
    ..
    #N Query to a mysql database

test_connect() # This will run your function when you import
So then in your py.test when you call import test_file1, it will execute the test_connect() and any other code you would like without doing anything else.
In other words, here is a really simple example with 3 files:
File 1: hello_world.py:
def hello_world():
    print('hello world!')

hello_world()
File 2: print_text.py:
def print_text():
    print('foo bar baz')

print_text()
File 3: run_everything.py:
import hello_world
import print_text
Result when you run run_everything.py:
>>>hello world!
>>>foo bar baz
If you want the function to be executed when the file is executed directly, but not imported as a module, you can do this:
test_file1.py:
import datetime
import pytest

Datetime = datetime.datetime.now()

def test_connect():
    #1st Query to a mysql database
    #2nd Query to a mysql database
    ..
    #N Query to a mysql database

def main():
    # This will _not_ run your function when you import. You would
    # have to use test_file1.test_connect() in your py.test.
    test_connect()

if __name__ == '__main__':
    main()
So in this example, your py.test would be:
import test_file1
test_file1.test_connect()
First, create a fixture in conftest.py:
import pytest
import MySQLdb

@pytest.fixture
def db_cursor(request):
    db = MySQLdb.connect(host="localhost", user="root")
    cursor = db.cursor()
    cursor.execute("SELECT USER()")
    data = cursor.fetchone()
    assert 'root@localhost' in data
    yield cursor
    db.close()
Then use it in your test modules:
# test_file1.py
def test_a(db_cursor):
    pass

# test_file2.py
def test_b(db_cursor):
    db_cursor.execute("SELECT VERSION()")
    assert '5.5' in db_cursor.fetchone()[0]
P.S.
It is also possible to keep fixtures in other modules; just inject them into your tests with the pytest_plugins directive:
# conftest.py
pytest_plugins = '_mysql.cursor'

# _mysql/__init__.py
# _mysql/cursor.py
import pytest
import MySQLdb

@pytest.fixture
def db_cursor(request):
    db = MySQLdb.connect(host="localhost", user="root")
    cursor = db.cursor()
    cursor.execute("SELECT USER()")
    data = cursor.fetchone()
    assert 'root@localhost' in data
    yield cursor
    db.close()
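To show what such a yield fixture does without needing a MySQL server, here is the same shape with sqlite3 standing in for MySQLdb: setup runs up to the yield, the test body uses the yielded cursor, and everything after the yield is teardown. In real use the function carries the @pytest.fixture decorator and pytest drives the generator for you; here that is simulated by hand.

```python
import sqlite3

def db_cursor():
    db = sqlite3.connect(':memory:')  # stand-in for MySQLdb.connect(...)
    cursor = db.cursor()
    yield cursor                      # the test receives the cursor here
    db.close()                        # teardown after the test finishes

# Roughly what pytest does behind the scenes:
gen = db_cursor()
cur = next(gen)                       # setup: advance to the yield
cur.execute('SELECT 1')
assert cur.fetchone() == (1,)
try:
    next(gen)                         # teardown: resume past the yield
except StopIteration:
    pass
```

Because each test gets a fresh run of the generator, every test sees its own setup and teardown.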
I installed Flask-FlatPages and am trying to run this simple app (to display .md files):
import sys
from flask import Flask, render_template
from flask_flatpages import FlatPages, pygments_style_defs

DEBUG = True
FLATPAGES_AUTO_RELOAD = DEBUG
FLATPAGES_EXTENSION = '.md'
FLATPAGES_ROOT = 'content'
POST_DIR = 'posts'

app = Flask(__name__)
flatpages = FlatPages(app)
app.config.from_object(__name__)

@app.route("/posts/")
def posts():
    posts = [p for p in flatpages if p.path.startswith(POST_DIR)]
    posts.sort(key=lambda item: item['date'], reverse=False)
    return render_template('posts.html', posts=posts)

@app.route('/posts/<name>/')
def post(name):
    path = '{}/{}'.format(POST_DIR, name)
    post = flatpages.get_or_404(path)
    return render_template('post.html', post=post)

if __name__ == "__main__":
    app.run(host='0.0.0.0', debug=True)
Whenever I run the app, I get this error:
NameError: name 'unicode' is not defined
The traceback (flask-flatpages) is this:
File "/Library/Frameworks/Python.framework/Versions/3.4/lib/python3.4/site-packages/flask_flatpages/__init__.py", line 290, in _pages
_walk(unicode(self.root))
I know unicode is now str in Python 3 -- can I fix the issue from my app (without modifying the package)?
Well, if the package does not support Python 3, then you cannot easily make it work. You can wait for support or find an alternative package. If the only problem is the missing definition of unicode, then it can be monkeypatched like
import builtins
builtins.unicode = str
before importing flask_flatpages. But I doubt the missing unicode is the only problem.
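A sketch of how that patch would sit in your app, guarded so it is harmless on Python 2 (the flask_flatpages import is shown commented out, since the point here is only the ordering):

```python
import sys

if sys.version_info[0] >= 3:
    import builtins
    builtins.unicode = str  # restore the Python 2 name the package expects

# The patch must run before the package that uses `unicode` is imported:
# from flask_flatpages import FlatPages

# Every module imported from now on can resolve the name:
assert unicode('content') == 'content'
```

Assigning an attribute on the builtins module makes the name visible everywhere, which is exactly why this kind of monkeypatching should be used sparingly.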
I'm trying to use the database configuration set in the settings files to make a database dump using fabric.
There's more than one settings file, so I'd like to be able to choose one based on the environment.
Right now, my task looks like this:
def dump_database():
    with cd('~/project_folder'), prefix(WORKON_VIRTUALENV):
        django.settings_module(env.settings)
        from django.conf import settings
        dbname = settings.DATABASES['default']['NAME']
        dbuser = settings.DATABASES['default']['USER']
        dbpassword = settings.DATABASES['default']['PASSWORD']
        fname = '/tmp/{0}-backup-{1}.sql.gz'.format(
            dbname,
            time.strftime('%Y%m%d%H%M%S')
        )
        run('mysqldump -u %s -p=%s %s | gzip -9 > %s' % (
            dbuser,
            dbpassword,
            dbname,
            fname))
But I'm getting an ImportError:
ImportError: Could not import settings 'project.settings.production'
I've tried to use shell_env() to set the DJANGO_SETTINGS_MODULE instead of django.settings_module(env.settings), with the same result.
I use a task to change the environment based on an environment dict:
def environment(name):
    env.update(environments[name])
    env.environment = name
This way, I want to be able to create a dump from multiple hosts:
fab environment:live dump_database
fab environment:otherhost dump_database
without having to reproduce the database settings for every host in the fabfile.
Importing your Django settings file in fabric is explained here.
http://fabric.readthedocs.org/en/1.3.3/api/contrib/django.html
Quoting from the above link:
from fabric.api import run
from fabric.contrib import django

django.settings_module('myproject.settings')
from django.conf import settings

def dump_production_database():
    run('mysqldump -u %s -p=%s %s > /tmp/prod-db.sql' % (
        settings.DATABASE_USER,
        settings.DATABASE_PASSWORD,
        settings.DATABASE_NAME
    ))
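The ImportError in the question usually just means the project root is not on the local Python path when the dotted settings path is imported. The generic shape of the fix is sketched below with importlib and a throwaway stand-in package playing the role of project.settings.production (the real project and Django are not assumed here):

```python
import importlib
import os
import sys
import tempfile

# Stand-in for your project tree: <tmp>/project/settings/production.py
root = tempfile.mkdtemp()
pkg = os.path.join(root, 'project', 'settings')
os.makedirs(pkg)
for d in (os.path.join(root, 'project'), pkg):
    open(os.path.join(d, '__init__.py'), 'w').close()
with open(os.path.join(pkg, 'production.py'), 'w') as f:
    f.write("DATABASES = {'default': {'NAME': 'prod_db'}}\n")

# The crucial step: make the project root importable *before*
# asking for the dotted settings path.
sys.path.insert(0, root)
settings = importlib.import_module('project.settings.production')
print(settings.DATABASES['default']['NAME'])  # prints: prod_db
```

In a fabfile, the equivalent would be inserting your local project directory into sys.path before calling django.settings_module(env.settings).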
NOTE: I don't answer the question directly, but offer a different solution.
I had the same problem, so I made a custom .py script for it.
I created a file named dump_db.py (placed next to fabfile.py, for example; it lives on the remote machine):
import os
import sys
from datetime import datetime

from django.conf import settings

def dump_mysql():
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", SETTINGS_MODULE)
    DB_NAME = settings.DATABASES['default']['NAME']
    DB_USER = settings.DATABASES['default']['USER']
    DB_PASSWORD = settings.DATABASES['default']['PASSWORD']
    dump_file_name = '{time}_{db_name}.sql'.format(
        time=datetime.now().strftime('%Y_%m_%d'),
        db_name=DB_NAME,
    )
    os.system('mysqldump -u {db_user} -p{db_password} {db_name} > {to_file}'.format(
        db_user=DB_USER,
        db_password=DB_PASSWORD,
        db_name=DB_NAME,
        to_file=dump_file_name,
    ))
    return dump_file_name

if __name__ == '__main__':
    try:
        SETTINGS_MODULE = sys.argv[1:].pop()
    except IndexError:
        SETTINGS_MODULE = 'project_name.settings'
    print(dump_mysql())
As you can see, sys.argv[1:].pop() takes an optional argument (the settings module, in this case).
So in my fabfile:
import os
from fabric.api import env, local, run, prefix, cd
.....

def dump():
    current_dir = os.getcwd()
    with prefix('source {}bin/activate'.format(env.venv)), cd('{}'.format(env.home)):
        dumped_file = run('python dump_db.py {}'.format(env.environment))  # the optional argument given
    file_path = os.path.join(env.home, dumped_file)
    copy_to = os.path.join(current_dir, dumped_file)
    scp(file_path, copy_to)

def scp(file_path, copy_to):
    local('scp {}:{} {}'.format(env.host, file_path, copy_to))
where env.environment = 'project_name.settings.env_module'
And this is how I dump my DB and copy it back to my machine.
Hope it comes in handy to someone! :)
I'm trying to make a group of defs in one file so I can just import them whenever I want to write a script in Python.
I have tried this:
def get_dblink( dbstring):
    """
    Return a database cnx.
    """
    global psycopg2
    try:
        cnx = psycopg2.connect( dbstring)
    except Exception, e:
        print "Unable to connect to DB. Error [%s]" % ( e,)
        exit( )
But I get this error: global name 'psycopg2' is not defined.
In my main file script.py I have:
import psycopg2, psycopg2.extras
from misc_defs import *
hostname = '192.168.10.36'
database = 'test'
username = 'test'
password = 'test'
dbstring = "host='%s' dbname='%s' user='%s' password='%s'" % ( hostname, database, username, password)
cnx = get_dblink( dbstring)
Can anyone give me a hand?
You just need to import psycopg2 in your first snippet.
If you need to, there's no problem in also importing it in the second snippet (Python makes sure modules are only imported once). Trying to use global for this is bad practice.
So: at the top of every module, import every module that is used within that particular module.
Also note that from x import * (with wildcards) is generally frowned upon: it clutters your namespace and makes your code less explicit.
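For illustration, the corrected module could look like this. Here sqlite3 stands in for psycopg2 so the sketch runs anywhere, but the structure is identical: the import sits at the top of the module that uses it, and no global statement is needed.

```python
# misc_defs.py (sketch)
import sqlite3  # in the real code: import psycopg2, psycopg2.extras

def get_dblink(dbstring):
    """Return a database connection."""
    try:
        return sqlite3.connect(dbstring)  # real code: psycopg2.connect(dbstring)
    except Exception as e:
        raise SystemExit("Unable to connect to DB. Error [%s]" % e)

# script.py would then do:
# from misc_defs import get_dblink
cnx = get_dblink(':memory:')
print(type(cnx).__name__)  # Connection
```

Each module having its own imports is also what makes it safe to drop the wildcard import from script.py.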