I have entities defined with peewee in Python. Before I started implementing the real database, I ran several tests with in-memory databases. When I started to implement the database functionality, I hit a strange problem: my queries return empty results, and what's more, the behavior depends on whether I run a script or use the Python console.
First of all, let me show that the logic is correct. When I use the Python console, everything is OK:
>>> from Entities import *
>>> print (RouterSettings.select().where(RouterSettings.name=='RUT00').get().name)
RUT00
As you can see, everything is correct. The query is executed and returns a result. Now the same in a script:
from Entities import *
print (RouterSettings.select().where(RouterSettings.name=='RUT00').get().name)
This one raises the exception "instance matching query does not exist":

  print(RouterSettings.select().where(RouterSettings.name=='RUT00').get().name)
  File "C:\Users\Kamil\AppData\Local\Programs\Python\Python37-32\lib\site-packages\peewee.py", line 5975, in get
    (clone.model, sql, params))
Entities.RouterSettingsDoesNotExist: instance matching query does not exist:
SQL: SELECT "t1"."id", "t1"."name", "t1"."ip", "t1"."username", "t1"."password", "t1"."model", "t1"."phone_num", "t1"."provider", "t1"."location" FROM "routersettings" AS "t1" WHERE ("t1"."name" = ?) LIMIT ? OFFSET ?
Params: ['RUT00', 1, 0]
When I tried to debug, I found that the database behaved as if it had not been created:
Please note that in the debugged variables, the database object is null (None).
Do you have any ideas what's going on?
My Entities are defined as follows:
from peewee import *

class EnumField(IntegerField):
    def __init__(self, *argv):
        super().__init__()
        self.enum = []
        for label in argv:
            self.enum.append(label)

    def db_value(self, value):
        try:
            return self.enum.index(value)
        except ValueError:
            raise EnumField.EnumValueDoesnExistError(
                "Value doesn't exist in enum set.\nMaybe you forgot to add "
                "that one: " + value + "?")

    def python_value(self, value):
        try:
            return self.enum[value]
        except IndexError:
            raise EnumField.EnumValueDoesnExistError(
                'No value for given id')

    class EnumValueDoesnExistError(Exception):
        pass

class ModelField(EnumField):
    def __init__(self):
        super().__init__('RUT955_Q', 'RUT955_H', 'GLiNet300M')

class ProviderField(EnumField):
    def __init__(self):
        super().__init__('Orange', 'Play', 'Virgin')

class BaseModel(Model):
    class Meta:
        database = SqliteDatabase('SIMail.db', pragmas={'foreign_keys': 1})

class RouterSettings(BaseModel):
    name = CharField(unique=True)
    ip = CharField(unique=True)
    username = CharField()
    password = CharField()
    model = ModelField()
    phone_num = IntegerField(unique=True)
    provider = ProviderField()
    location = CharField()
You are probably running it with a relative path to the database file, so depending on the current working directory when you run your app vs. the console, it's using a different database file.
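A quick way to make the script and the console agree is to anchor the path to the module file instead of the working directory. A minimal sketch, reusing the SIMail.db filename and pragmas from the question's Entities module:

import os

from peewee import SqliteDatabase

# Build the database path next to this source file, regardless of
# which directory the interpreter was started from.
DB_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'SIMail.db')
database = SqliteDatabase(DB_PATH, pragmas={'foreign_keys': 1})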
I have a silly question.
This is my code:
import argparse
import os
import sys

from peewee import *

db = SqliteDatabase(None)

class Base(Model):
    class Meta:
        database = db

class Table(Base):
    a_date = DateField()
    url = CharField()

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument('--db-dir', action='store')
    args = parser.parse_args()
    db_path = os.path.join(args.db_dir, 'data.db')
    try:
        db.init(db_path)
        db.connect()
        query = Table.select().order_by(Table.a_date.desc()).get()
    except Exception:
        sys.exit(1)
    else:
        print(query.url)
        sys.exit(0)

if __name__ == '__main__':
    main()
This code works fine, but if the db file does not exist, db.connect() always creates it. How can I prevent this?
Another question: how can I query that table in the database for this field without declaring the peewee Model?
Thanks
If I understand the peewee docs correctly (http://docs.peewee-orm.com/en/latest/peewee/database.html), peewee uses the sqlite3 API provided by Python to connect to SQLite.
This means you have to deal with that API (https://docs.python.org/2/library/sqlite3.html#sqlite3.connect), and its connect method always creates the database file beforehand.
I believe, however, that you can pass a custom Connection class to this method (the factory parameter) and define your behaviour in that custom class:
import os
from sqlite3 import Connection

from peewee import *

class CustomConnection(Connection):
    def __init__(self, dbname, *args, **kwargs):
        # Check if the db already exists or not
        if not os.path.exists(dbname):
            raise ValueError('DB {} does not exist'.format(dbname))
        super(CustomConnection, self).__init__(dbname, *args, **kwargs)

db = SqliteDatabase('mydatabase', factory=CustomConnection)
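For the first question, a simpler alternative is to check for the file yourself before calling db.init()/db.connect(). For the second, peewee's execute_sql() runs raw SQL without a declared Model and returns a plain DB-API cursor. A sketch, assuming the data.db file from the question and the table name 'table' that peewee derives from the Table model:

import os
import sys

from peewee import SqliteDatabase

db_path = 'data.db'
if not os.path.exists(db_path):
    sys.exit('database {} does not exist'.format(db_path))

db = SqliteDatabase(db_path)
db.connect()

# Raw query without declaring a peewee Model: fetch rows from the cursor.
cursor = db.execute_sql('SELECT url FROM "table" ORDER BY a_date DESC LIMIT 1')
row = cursor.fetchone()
if row is not None:
    print(row[0])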
Despite the numerous recipes and examples in peewee's documentation, I have not been able to find how to accomplish the following:
For finer-grained control, check out the Using context manager / decorator. This allows you to specify the database to use with a given list of models for the duration of the wrapped block.
I assume it would go something like...
db = MySQLDatabase(None)

class BaseModelThing(Model):
    class Meta:
        database = db

class SubModelThing(BaseModelThing):
    '''imagine all the fields'''
    class Meta:
        db_table = 'table_name'

runtime_db = MySQLDatabase('database_name.db', fields={...}, **extra_stuff)  # imagine field mappings here

@Using(runtime_db, [SubModelThing])
@runtime_db.execution_context()
def some_kind_of_query():
    '''imagine the queries here'''
but I have not found examples, so an example would be the answer to this question.
Yeah, there's not a great example of using the Using or execution_context decorators, so the first thing is: don't use the two together. It doesn't appear to break anything, it just seems to be redundant. Logically that makes sense, as both decorators cause the model calls in the wrapped block to run in a single connection/transaction.
The only (and biggest) difference between the two is that Using allows you to specify the particular database the connection will use - useful for master/slave setups (though the Read slaves extension is probably a cleaner solution).
If you run with two databases and try using execution_context on the 'second' database (in your example, runtime_db), nothing will happen to the data. A connection will be opened at the start of the block and closed at the end, but no queries will be executed on it, because the models are still using their original database.
The code below is an example. Every run should result in only 1 row being added to each database.
from peewee import *

db = SqliteDatabase('other_db')
db.connect()
runtime_db = SqliteDatabase('cmp_v0.db')
runtime_db.connect()

class BaseModelThing(Model):
    class Meta:
        database = db

class SubModelThing(Model):
    first_name = CharField()

    class Meta:
        db_table = 'table_name'

db.create_tables([SubModelThing], safe=True)
SubModelThing.delete().where(True).execute()  # Cleaning out previous runs

with Using(runtime_db, [SubModelThing]):
    runtime_db.create_tables([SubModelThing], safe=True)
    SubModelThing.delete().where(True).execute()

@Using(runtime_db, [SubModelThing], with_transaction=True)
def execute_in_runtime(throw):
    SubModelThing(first_name='asdfasdfasdf').save()
    if throw:  # to demo transaction handling in Using
        raise Exception()

# Create an instance in the 'normal' database
SubModelThing.create(first_name='name')

try:  # Try to create but throw during the transaction
    execute_in_runtime(throw=True)
except:
    pass  # Failure is expected, no row should be added

execute_in_runtime(throw=False)  # Create a row in the runtime_db

print('db row count: {}'.format(len(SubModelThing.select())))
with Using(runtime_db, [SubModelThing]):
    print('Runtime DB count: {}'.format(len(SubModelThing.select())))
I ran into a transaction problem when using the Python ORM peewee recently. I save two book instances with the ORM, and between the two saves I raise an exception, so I expect that neither of them is saved to the database, but that's not what happens. Could anyone explain why? I am new to Python, thanks.
The code is below:
from peewee import *

def get_db():
    return SqliteDatabase("test.db")

class Book(Model):
    id = PrimaryKeyField()
    name = CharField()

    class Meta:
        database = get_db()

def test_transaction():
    book1 = Book(name="book1")
    book2 = Book(name="book2")
    db = get_db()
    db.create_tables([Book], safe=True)
    try:
        with db.transaction() as tran:
            book1.save()
            raise ProgrammingError("test")
            book2.save()
    except:
        pass
    for book in Book.select():
        print(book.name)

if __name__ == '__main__':
    test_transaction()
The problem is that each call to get_db() instantiates a new database object. Databases are stateful, in that they manage the active connection for a given thread. So what you've essentially got is two different databases: one that your models are associated with, and one that holds your connection and transaction. When you call db.transaction(), a transaction is taking place, but not on the connection you think it is.
Change the code to read as follows and it will work as you expect:
book1 = Book(name='book1')
book2 = Book(name='book2')
db = Book._meta.database
# ...
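For completeness, here is a minimal sketch of the whole script with a single module-level database object shared by the model and the transaction (same logic as the question, only the wiring changes):

from peewee import *

db = SqliteDatabase("test.db")  # one shared, stateful database object

class Book(Model):
    id = PrimaryKeyField()
    name = CharField()

    class Meta:
        database = db  # the model uses the same object as the transaction

def test_transaction():
    db.create_tables([Book], safe=True)
    try:
        with db.transaction():
            Book(name="book1").save()
            raise ProgrammingError("test")  # aborts and rolls back the block
    except ProgrammingError:
        pass
    for book in Book.select():
        print(book.name)  # prints nothing: book1 was rolled back

if __name__ == '__main__':
    test_transaction()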
In developing a website for indexing system documentation I've come across a tough nut to crack regarding data "matching"/relations across databases in Django.
A simplified model for my local database:
from django.db import models

class Document(models.Model):
    name = models.CharField(max_length=200)
    system_id = models.IntegerField()
    ...
An imagined model; the system details are stored in a remote database:
from django.db import models

class System(models.Model):
    name = models.CharField(max_length=200)
    system_id = models.IntegerField()
    ...
The idea is that when creating a new Document entry on my website, the ID of the related system is stored in the local database. When presenting the data, I would have to use the stored ID to retrieve the system name, among other details, from the remote database.
I've looked into foreign keys across databases, but this seems very involved, and I'm not sure I want relations. Rather, I visualize a function inside the Document model/class which is able to retrieve the matching data, for example by importing a custom router/function.
How would I go about solving this?
Note that I won't be able to alter anything on the remote database; it's read-only. I'm not sure if I should create a model for System as well. Both databases use PostgreSQL, though my impression is that which database is used isn't really relevant to this scenario.
From the Django documentation on multiple databases (manually-selecting-a-database):
# This will run on the 'default' database.
Author.objects.all()
# So will this.
Author.objects.using('default').all()
# This will run on the 'other' database.
Author.objects.using('other').all()
'default' and 'other' are aliases for your databases.
In your case they could be 'default' and 'remote'.
Of course, you can replace .all() with anything you want.
Example: System.objects.using('remote').get(id=123456)
You are correct that foreign keys across databases are a problem in Django ORM, and to some extent at the db level too.
You already have the answer basically: "I visualize a function inside the Document model/class which is able to retrieve the matching data"
I'd do it like this:
class RemoteObject(object):
    def __init__(self, remote_model, remote_db, field_name):
        # assumes the remote db is defined in the Django settings and
        # has an associated Django model definition:
        self.remote_model = remote_model
        self.remote_db = remote_db
        # name of the id field on the model (a real db field):
        self.field_name = field_name
        # we will cache the retrieved remote model on the instance,
        # the same way that Django does with foreign key fields:
        self.cache_name = '_{}_cache'.format(field_name)

    def __get__(self, instance, cls):
        try:
            rel_obj = getattr(instance, self.cache_name)
        except AttributeError:
            system_id = getattr(instance, self.field_name)
            remote_qs = self.remote_model.objects.using(self.remote_db)
            try:
                rel_obj = remote_qs.get(id=system_id)
            except self.remote_model.DoesNotExist:
                rel_obj = None
            setattr(instance, self.cache_name, rel_obj)
        if rel_obj is None:
            raise self.remote_model.DoesNotExist
        else:
            return rel_obj

    def __set__(self, instance, value):
        setattr(instance, self.field_name, value.id)
        setattr(instance, self.cache_name, value)

class Document(models.Model):
    name = models.CharField(max_length=200)
    system_id = models.IntegerField()
    system = RemoteObject(System, 'system_db_name', 'system_id')
You may recognise that the RemoteObject class above implements Python's descriptor protocol, see here for more info:
https://docs.python.org/2/howto/descriptor.html
Example usage:
>>> doc = Document.objects.get(pk=1)
>>> print(doc.system_id)
3
>>> print(doc.system.id)
3
>>> print(doc.system.name)
'my system'
>>> other_system = System.objects.using('system_db_name').get(pk=5)
>>> doc.system = other_system
>>> print(doc.system_id)
5
Going further you could write a custom db router:
https://docs.djangoproject.com/en/dev/topics/db/multi-db/#using-routers
This would let you eliminate the using('system_db_name') calls in the code by routing all reads for the System model to the appropriate db.
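A minimal router sketch, under the assumption that the remote alias is 'system_db_name' (the myapp module path below is illustrative):

# myapp/routers.py
class SystemRouter(object):
    # Send all reads of the System model to the remote database.
    def db_for_read(self, model, **hints):
        if model.__name__ == 'System':
            return 'system_db_name'
        return None  # everything else falls through to 'default'

# settings.py
DATABASE_ROUTERS = ['myapp.routers.SystemRouter']

With this in place, System.objects.get(id=123456) reads from the remote database without an explicit using() call.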
I'd go for a method get_system(). So:

class Document(models.Model):
    # ... fields as above ...
    def get_system(self):
        return System.objects.using('remote').get(system_id=self.system_id)
This is the simplest solution. A possible alternative is to use PostgreSQL's foreign data wrapper feature. With FDW you can abstract the multi-db handling away from Django and do it inside the database - then you can run queries that need the document -> system relation.
Finally, if your use case allows it, just copying the system data periodically to the local db can be a good solution.
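If you go the periodic-copy route, a Django management command run from cron is one way to do it. A sketch, assuming a hypothetical local mirror model LocalSystem and the 'remote' alias from above:

# myapp/management/commands/sync_systems.py
from django.core.management.base import BaseCommand

from myapp.models import LocalSystem, System  # LocalSystem is hypothetical

class Command(BaseCommand):
    help = 'Mirror System rows from the read-only remote database'

    def handle(self, *args, **options):
        for system in System.objects.using('remote').all():
            # update_or_create keeps the local copy in sync by system_id
            LocalSystem.objects.update_or_create(
                system_id=system.system_id,
                defaults={'name': system.name},
            )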
I'm a complete beginner with Flask and I'm starting to play around with making web apps.
I'm having a hard time figuring out how to enforce unique user names. I'm thinking about how to do this in SQL, maybe with something like user_name text unique on conflict fail, but then how do I catch the error back in Python?
Alternatively, is there a way to manage this that's built into Flask?
That entirely depends on your database layer. Flask deliberately does not bundle a specific ORM, though SQLAlchemy is recommended. The good news is that SQLAlchemy supports unique constraints.
Here's how it might work:
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.exc import IntegrityError
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

engine = create_engine('sqlite://')  # my engine (in-memory for the demo)
Session = sessionmaker(bind=engine)
session = Session()
Base = declarative_base()

class User(Base):
    __tablename__ = 'users'
    id = Column(Integer, primary_key=True)
    name = Column(String, unique=True)

Base.metadata.create_all(engine)

# then later...
user = User()
user.name = 'Frank'
session.add(user)
try:
    session.commit()
    print('welcome to the club Frank')
except IntegrityError:  # raised on a unique constraint violation
    print('You are not Frank. Impostor!!!')
Run the part after "then later..." twice. The first time you'll get a welcome message; the second time you won't.
Addendum: the closest thing Flask has to a default authentication framework simply stores users in a dict keyed by username. The way to enforce uniqueness is by testing manually, e.g.:
if username in digest_db:
    raise Exception('HEY! "{}" already exists! '
                    'You can\'t do that'.format(username))
digest_db.add_user(username, password)
or overriding RealmDigestDB to make sure that it checks before adding:
class FlaskRealmDigestDB(authdigest.RealmDigestDB):
    def add_user(self, user, password):
        if user in self:
            raise AttributeError('HEY! "{}" already exists! '
                                 'You can\'t do that'.format(user))
        super(FlaskRealmDigestDB, self).add_user(user, password)

    def requires_auth(self, f):
        pass  # yada yada
or by overriding RealmDigestDB and making it return something which does not allow duplicate assignment, e.g.:
class ClosedDict(dict):
    def __setitem__(self, name, val):
        if name in self and val != self[name]:
            raise AttributeError('Cannot reassign {} to {}'.format(name, val))
        super(ClosedDict, self).__setitem__(name, val)

class FlaskRealmDigestDB(authdigest.RealmDigestDB):
    def newDB(self):
        return ClosedDict()

    def requires_auth(self, f):
        pass  # yada yada
I put this here as an addendum because that class does not persist data in any way; if you're planning on extending authdigest.RealmDigestDB anyway, you should use something like SQLAlchemy as above.
You can use Flask-SQLAlchemy; it's a Flask plug-in for SQLAlchemy.
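For example, a minimal sketch of the same uniqueness check using the Flask-SQLAlchemy extension (the model and database names here are illustrative, not from the question):

from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from sqlalchemy.exc import IntegrityError

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///users.db'
db = SQLAlchemy(app)

class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80), unique=True)

with app.app_context():
    db.create_all()
    db.session.add(User(name='Frank'))
    try:
        db.session.commit()
    except IntegrityError:
        # a second insert with the same name ends up here
        db.session.rollback()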