I am using the cloudant python library to connect to my cloudant account.
Here is the code I have so far:
import cloudant

class WorkflowsCloudant(cloudant.Account):
    def __init__(self):
        super(WorkflowsCloudant, self).__init__(settings.COUCH_DB_ACCOUNT_NAME,
                                                auth=(settings.COUCH_PUBLIC_KEY,
                                                      settings.COUCH_PRIVATE_KEY))

@blueprint.route('/<workflow_id>')
def get_single_workflow(account_id, workflow_id):
    account = WorkflowsCloudant()
    db = account.database(settings.COUCH_DB_NAME)
    doc = db.document(workflow_id)
    resp = doc.get().json()
    if resp['account_id'] != account_id:
        return error_helpers.forbidden('Invalid Account')
    return jsonify(resp)
This Flask controller will have CRUD operations inside of it, but with the current implementation I will have to set the account and db variables in each method before performing operations on the document I want to view or manipulate. How can I clean up (or DRY up) my code so that I only have to call my main WorkflowsCloudant class?
I don't know cloudant, so I may be totally off base, but I believe this answers your question:
Delete the account, db, and doc lines from get_single_workflow.
Add the following lines to __init__, passing workflow_id in as a constructor parameter:
    db = self.database(settings.COUCH_DB_NAME)
    self.doc = db.document(workflow_id)
Change the resp line in get_single_workflow to:
    resp = WorkflowsCloudant(workflow_id).doc.get().json()
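Putting those steps together, a minimal sketch of the refactored version might look like this (assuming workflow_id is passed to the constructor, and that settings, blueprint, error_helpers and jsonify are imported as in the question):

import cloudant

class WorkflowsCloudant(cloudant.Account):
    def __init__(self, workflow_id):
        super(WorkflowsCloudant, self).__init__(settings.COUCH_DB_ACCOUNT_NAME,
                                                auth=(settings.COUCH_PUBLIC_KEY,
                                                      settings.COUCH_PRIVATE_KEY))
        db = self.database(settings.COUCH_DB_NAME)
        self.doc = db.document(workflow_id)  # document handle shared by the CRUD methods

@blueprint.route('/<workflow_id>')
def get_single_workflow(account_id, workflow_id):
    resp = WorkflowsCloudant(workflow_id).doc.get().json()
    if resp['account_id'] != account_id:
        return error_helpers.forbidden('Invalid Account')
    return jsonify(resp)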
I'm working on a Python desktop app using wxPython and SQLite. The SQLite db is basically being used as a save file for my program, so I can save, back up, and reload the data being entered. I've created separate classes for parts of my UI to make them easier to manage from the "main" window. The problem I'm having is that each control needs to access the database, but the filename, and therefore the connection name, needs to be dynamic. I originally created a DBManager class that hardcoded a class variable with the connection string, which worked but didn't let me change the filename. For example:
class DBManager:
    conn = sqlite3.Connection('my_file.db')

# This could then be passed to other objects as needed
class Control1:
    file = DBManager()

class Control2:
    file = DBManager()

etc.
However, I'm running into a lot of problems trying to create this object with a dynamic filename while also using the same connection across all controls. Here are some of the things I've tried...
class DBManager:
    conn = None

    def __init__(self):
        pass

    def __init__(self, filename):
        self.conn = sqlite3.Connection(filename)

class Control1:
    file = DBManager()

class Control2:
    file = DBManager()
The above doesn't work because Python doesn't allow overloading constructors, so I always have to pass a filename. I tried adding some code to the constructor to act differently based upon whether the filename passed was blank or not.
class DBManager:
    conn = None

    def __init__(self, filename):
        if filename != '':
            self.conn = sqlite3.Connection(filename)

class Control1:
    file = DBManager('')

class Control2:
    file = DBManager('')
This let me compile, but the controls only had an empty connection. The conn object was None. It seems like I can't change a class variable after it's been created? Or am I just doing something wrong?
I've thought about creating one instance of DBManager that I then pass into each control, but that would be a huge mess if I need to load a new DB after starting the program. Also, it's just not as elegant.
So, I'm looking for ideas on achieving the one-connection path with a dynamic filename. For what it's worth, this is entirely for personal use, so it doesn't really have to follow "good" coding convention.
Explanation of your last example
You get None in the last example because you are instantiating DBManager in Control1 and Control2 with empty strings as input, and the DBManager constructor has an if-statement saying that a connection should not be created if filename is just an empty string. This means the self.conn instance variable is never set, so any referral to conn resolves to the conn class variable, which is indeed set to None.
self.conn would create an instance variable only accessible by the specific object.
DBManager.conn would refer to the class variable and this is what you want to update.
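A tiny illustration of the difference (the values here are made up just to show what gets rebound):

class DBManager:
    conn = None                # class variable, shared by every instance

mgr = DBManager()
mgr.conn = "instance"          # creates an instance variable on mgr only
print(DBManager.conn)          # None -> the class variable is untouched

DBManager.conn = "shared"      # rebinds the class variable itself
print(DBManager().conn)        # "shared" -> new instances see the update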
Example solution
If you only want to keep one connection, you would need to do it with e.g. a class variable, and update the class variable every time you interact with a new db.
import sqlite3
from sqlite3 import Connection

class DBManager:
    conn = None

    def __init__(self, filename):
        if filename != '':
            self.filename = filename

    def load(self) -> Connection:
        DBManager.conn = sqlite3.Connection(self.filename)  # updating class variable with new connection
        print(DBManager.conn, f" used for {self.filename}")
        return DBManager.conn

class Control1:
    db_manager = DBManager('control1.db')
    conn = db_manager.load()

class Control2:
    db_manager = DBManager('control2.db')
    conn = db_manager.load()

if __name__ == "__main__":
    control1 = Control1()
    control2 = Control2()
would output the below. Note that the class variable conn refers to different memory addresses upon instantiating each control, showing that it's updated.
<sqlite3.Connection object at 0x10dc1e1f0> used for control1.db
<sqlite3.Connection object at 0x10dc1e2d0> used for control2.db
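Switching to a different save file later then only takes one call, as long as the controls look the connection up through the class instead of caching their own reference (the filename below is made up):

# e.g. from a "File > Open" handler somewhere in the UI
DBManager('another_save.db').load()          # rebinds DBManager.conn for everyone

# any control that reads the shared class variable sees the new connection
cursor = DBManager.conn.execute("SELECT 1")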
I don't know if this can be done but I'm trying to mock my db.session.save.
I'm using Flask and Flask-SQLAlchemy.
db.py
from flask_sqlalchemy import SQLAlchemy
db = SQLAlchemy()
The unit test
def test_post(self):
    with app.app_context():
        with app.test_client() as client:
            with mock.patch('models.db.session.save') as mock_save:
                with mock.patch('models.db.session.commit') as mock_commit:
                    data = self.gen_legend_data()
                    response = client.post('/legends', data=json.dumps([data]), headers=access_header)
                    assert response.status_code == 200
                    mock_save.assert_called()
                    mock_commit.assert_called_once()
And the method:
def post(cls):
    legends = schemas.Legends(many=True).load(request.get_json())
    for legend in legends:
        db.session.add(legend)
    db.session.commit()
    return {'message': 'legends saved'}, 200
I'm trying to mock db.session.add and db.session.commit. I've tried db.session.save, legends.models.db.session.save, and models.db.session.save. They all came back with the same error:
ModuleNotFoundError: No module named 'models.db.session'; 'models.db' is not a package
I don't understand the error and I'm not sure how to solve it.
Or am I doing something that is totally wrong in wanting to mock a db.session?
Thanks.
Desmond
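One note on the error itself: judging by the message, the dotted target 'models.db.session' is resolving models.db to the db.py module (models/db.py) rather than to the SQLAlchemy db object defined inside it, and that module has no session attribute, so mock falls back to importing models.db.session as a module and fails with that confusing ModuleNotFoundError. (There is also no save on a SQLAlchemy session; the route uses add.) If you still want to patch the session directly, mock.patch.object on the imported db object sidesteps the string lookup entirely. A rough sketch, keeping the structure of the test above and assuming db.py lives inside the models package (adjust the import to wherever your db object actually lives):

from models.db import db  # the SQLAlchemy() instance from db.py

def test_post(self):
    with app.app_context():
        with app.test_client() as client:
            # patch the methods the route really calls: add and commit
            with mock.patch.object(db.session, 'add') as mock_add, \
                 mock.patch.object(db.session, 'commit') as mock_commit:
                data = self.gen_legend_data()
                response = client.post('/legends', data=json.dumps([data]), headers=access_header)
                assert response.status_code == 200
                mock_add.assert_called()
                mock_commit.assert_called_once()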
The problem you're running into here is better served by restructuring your code so that it's more testable, rather than mocking out every component or otherwise writing a (very) slow integration test. If you get into the habit of writing tests that way, then over time you'll end up with a slow build that takes too long to run, and with fragile tests (good talk on the subject of why fast tests are important here).
Let's take a look at this route:
def post(cls):
    legends = schemas.Legends(many=True).load(request.get_json())
    for legend in legends:
        db.session.add(legend)
    db.session.commit()
    return {'message': 'legends saved'}, 200
...and decompose it:
import typing

from flask import jsonify

class LegendsPostService:
    def __init__(self, json_args, _session=None) -> None:
        self.json_args = json_args
        self.session = _session or db.session

    def _get_legends(self) -> Legend:
        return schemas.Legends(many=True).load(self.json_args)

    def post(self) -> typing.List[typing.Dict[str, typing.Any]]:
        legends = self._get_legends()
        for legend in legends:
            self.session.add(legend)
        self.session.commit()
        return schemas.Legends(many=True).dump(legends)


def post(cls):
    service = LegendsPostService(json_args=request.get_json())
    service.post()
    return jsonify({'message': 'legends saved'})
Notice how we've isolated nearly all the points of failure from post into LegendsPostService, and we've removed all the Flask internals from it as well (no global request objects floating around, etc.). We've even given ourselves the ability to swap out session for testing later on.
I would recommend you focus your testing efforts on writing test cases for LegendsPostService. Once you've got excellent tests for LegendsPostService, decide if you believe that even more test coverage will add value. If you do, then consider writing one simple integration test for post() to tie it all together.
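If you do decide to write it, a minimal integration-style test could look roughly like this (assuming app is your Flask app, that the /legends route needs no auth headers in the test config, and a payload shaped to match your schema):

import json

def test_post_legends_integration():
    with app.test_client() as client:
        payload = [{'logo_url': 'http://example.com/logo.png'}]  # field name assumed from the factory below
        response = client.post('/legends', data=json.dumps(payload),
                               content_type='application/json')
        assert response.status_code == 200
        assert response.get_json() == {'message': 'legends saved'}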
The next thing you need to consider is how you want to think about SQLAlchemy objects in tests. I recommend just using FactoryBoy for auto-creating "mock" models for you. Here's a full application example of how to set up flask / sqlalchemy / factory-boy in this way: How do I produce nested JSON from database query with joins? Using Python / SQLAlchemy
Here's how I'd write a test for LegendsPostService (apologies as this is a bit hasty and doesn't perfectly represent what you're trying to do - but you should be able to adjust these tests for your use case):
import factory
from factory.alchemy import SQLAlchemyModelFactory

class ModelFactory(SQLAlchemyModelFactory):
    class Meta:
        abstract = True
        sqlalchemy_session = db.session

# set up your factory for Legends:
class LegendsFactory(ModelFactory):
    logo_url = factory.Faker('image_url')

    class Meta(ModelFactory.Meta):
        model = Legends
from unittest.mock import MagicMock, patch

# neither of these tests even needs a database connection!
# so you should be able to write HUNDREDS of similar tests
# and run hundreds of them in seconds (not minutes)

def test_LegendsPostService_can_init():
    session = MagicMock()
    service = LegendsPostService(json_args={'foo': 'bar'}, _session=session)
    assert service.session is session
    assert service.json_args['foo'] == 'bar'

def test_LegendsPostService_can_post():
    session = MagicMock()
    service = LegendsPostService(json_args={'foo': 'bar'}, _session=session)
    # let's make some fake Legends for our service!
    legends = LegendsFactory.build_batch(2)
    with patch.object(service, '_get_legends') as _get_legends:
        _get_legends.return_value = legends
        legends_post_json = service.post()
    # look, Ma! No database connection!
    assert legends_post_json[0]['logo_url'] == legends[0].logo_url
I hope that helps!
I have a working Flask application, which is laid out in the following way:
I have my myFolder/app.py with:
app = flask.Flask('name')

# and then a lot of routes
@app.route('/api/something', methods=['GET'])
def someName():
    return flask.jsonify(somefile.somefunc())
the file with functions looks like:
# initialize database connection
client = pymongo.MongoClient(host=os.environ['MONGO_IP'])
db = client[os.environ['MONGO_DB']]

# and then a lot of functions that use this global db to CRUD something
def somefunc():
    # do something with db
    return
It works fine and everyone is happy. But I want to write some tests with chai.
I want to set up a testing database, populate it with data, make one of the API calls, assert that the data looks ok.
And I do this in the following way:
import chai
from myFolder import app

class App(chai.Chai):

    @classmethod
    def setUpClass(cls):
        # this method is called before all test cases
        os.environ['MONGO_DB'] = 'test_db'  # changing the db to the test one

    @classmethod
    def tearDownClass(cls):
        # this method is called after all test cases
        os.environ['MONGO_DB'] = 'normal'  # change the db back to normal

    def test_firstTestCase(self):
        client = app.app.test_client()
        print client.get('/api/something').data  # here is the problem
What surprises me is that the last print actually outputs the results from the real database (I expected it to output nothing, because I changed the database name to testing and have not populated it). So how should I properly set up a testing database?
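For what it's worth, the connection lines client = pymongo.MongoClient(...) and db = client[os.environ['MONGO_DB']] run at import time, i.e. the moment from myFolder import app executes, which is before setUpClass ever gets to change MONGO_DB, so the routes keep talking to the original database. One way around that is to resolve the database lazily, at call time; a rough sketch (get_db is a made-up name):

import os
import pymongo

_client = None

def get_db():
    """Create the client on first use and read MONGO_DB on every call."""
    global _client
    if _client is None:
        _client = pymongo.MongoClient(host=os.environ['MONGO_IP'])
    return _client[os.environ['MONGO_DB']]

def somefunc():
    db = get_db()  # resolved per call, so tests can point it at test_db
    # do something with db
    return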
The test still writes to my MySQL database instead of a sqlite tempfile db. Why does this happen? Thanks!
Here's my code:
class UserTests(unittest.TestCase):

    def setUp(self):
        self.app = get_app()
        # declare testing state
        self.app.config["TESTING"] = True
        self.db, self.app.config["DATABASE"] = tempfile.mkstemp()
        # spawn test client
        self.client = self.app.test_client()
        # temp db
        init_db()

    def tearDown(self):
        os.close(self.db)
        os.unlink(self.app.config["DATABASE"])

    def test_save_user(self):
        # create test user with 3 friends
        app_xs_token = get_app_access_token(APP_ID, APP_SECRET)
        test_user = create_test_user(APP_ID, app_xs_token)
        friend_1 = create_test_user(APP_ID, app_xs_token)
        friend_2 = create_test_user(APP_ID, app_xs_token)
        friend_3 = create_test_user(APP_ID, app_xs_token)
        make_friend_connection(test_user["id"], friend_1["id"], test_user["access_token"], friend_1["access_token"])
        make_friend_connection(test_user["id"], friend_2["id"], test_user["access_token"], friend_2["access_token"])
        make_friend_connection(test_user["id"], friend_3["id"], test_user["access_token"], friend_3["access_token"])
        save_user(test_user["access_token"])
This line might be the problem:
self.db, self.app.config["DATABASE"] = tempfile.mkstemp()
Print out the values of self.db and self.app.config["DATABASE"] and make sure they are what you expect them to be.
You probably want to investigate where your config self.app.config["DATABASE"] is referenced in your database code.
The Flask example code usually does a lot of work when the module is first imported. This tends to break things when you try to dynamically change values at run time, because by then it's too late.
You probably will need to use an application factory so your app isn't built before the test code can run. Also, the app factory pattern implies you are using the Blueprint interface instead of the direct app reference, which is acquired using a circular import in the example code.
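A bare-bones application factory, to give an idea of the shape (create_app and the blueprint names here are illustrative, not from your code):

from flask import Flask

def create_app(database_path):
    """Build the app at call time so tests can hand in their own database."""
    app = Flask(__name__)
    app.config["DATABASE"] = database_path
    # register blueprints / initialize extensions here, e.g.:
    # from myapp.views import bp
    # app.register_blueprint(bp)
    return app

# then in setUp, something along these lines:
#   self.db, db_path = tempfile.mkstemp()
#   self.app = create_app(db_path)
#   self.client = self.app.test_client()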
I'm struggling with implementing the Bridge design pattern (or an alternative such as Adapter) in Python.
I want to be able to write code like this to dump database schemas based on a supplied URL:
urls = ['sqlite://c:\\temp\\test.db', 'oracle://user:password@tns_name']

for url in urls:
    db = Database(url)
    schema = db.schema()
I've got classes defined as
class Database():
    def __init__(self, url):
        self.db_type = url.split("://")[0]

class Oracle():
    def schema(self):
        # Code to return Oracle schema
        pass

class SQLite():
    def schema(self):
        # Code to return SQLite schema
        pass
How can I "glue" these 3 classes together so I can get the first code block to execute correctly? I've Googled around, but must be having a thick day as it's just not coming together in my mind...
Thanks in advance
Use a Factory pattern instead:
class Oracle(object):
    ...

class SQLite(object):
    ...

dbkind = dict(sqlite=SQLite, oracle=Oracle)

def Database(url):
    db_type, rest = url.split("://", 1)
    return dbkind[db_type](rest)
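To see the pieces glue together, here's a runnable sketch with placeholder schema() implementations (the real ones would query each database; the constructors just keep the remainder of the URL):

class Oracle(object):
    def __init__(self, connect_string):
        self.connect_string = connect_string   # e.g. "user:password@tns_name"

    def schema(self):
        return "oracle schema for %s" % self.connect_string  # placeholder

class SQLite(object):
    def __init__(self, path):
        self.path = path                       # e.g. "c:\\temp\\test.db"

    def schema(self):
        return "sqlite schema for %s" % self.path  # placeholder

dbkind = dict(sqlite=SQLite, oracle=Oracle)

def Database(url):
    db_type, rest = url.split("://", 1)
    return dbkind[db_type](rest)

urls = ['sqlite://c:\\temp\\test.db', 'oracle://user:password@tns_name']
for url in urls:
    print(Database(url).schema())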