Flask-Restless dump_to of primary key field - python

I am running into an issue that may be a bug, but I want to verify it with the community. I am basically trying to use camelCase for transporting data and snake_case for the database.
However, on the person_serializer, Flask-Restless will not allow an outbound "idPerson" as a result of dump_to="idPerson". For some reason, it checks that the primary key exists and raises a KeyError, since the actual key is "id_person", not "idPerson".
Any help would be appreciated.
from marshmallow import Schema, fields, post_load
from sqlalchemy import Column, Integer, String

class Person(Base):
    __tablename__ = "person"
    id_person = Column(Integer, primary_key=True)
    first_name = Column(String(50))
    last_name = Column(String(50))

class PersonSchema(Schema):
    id_person = fields.Integer(load_from="idPerson", dump_to="idPerson")
    first_name = fields.String(load_from="firstName", dump_to="firstName")
    last_name = fields.String(load_from="lastName", dump_to="lastName")

    @post_load
    def make_user(self, data):
        return Person(**data)

person_schema = PersonSchema()

def person_serializer(instance):
    return person_schema.dump(instance).data

def person_deserializer(data):
    return person_schema.load(data).data
The KeyError is raised here (Flask-Restless view code, pytest excerpt):
try:
    # Convert the dictionary representation into an instance of the
    # model.
    instance = self.deserialize(data)
    # Add the created model to the session.
    self.session.add(instance)
    self.session.commit()
    # Get the dictionary representation of the new instance as it
    # appears in the database.
    result = self.serialize(instance)
except self.validation_exceptions as exception:
    return self._handle_validation_exception(exception)
# Determine the value of the primary key for this instance and
# encode URL-encode it (in case it is a Unicode string).
pk_name = self.primary_key or primary_key_name(instance)
>       primary_key = result[pk_name]
E       KeyError: 'idPerson'
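One workaround (a sketch, not a Flask-Restless feature) is to make the custom serializer expose the primary-key value under both the camelCase and the underscore name, so the result[pk_name] lookup above finds it regardless of which name Flask-Restless resolves as the primary key:
def person_serializer(instance):
    result = person_schema.dump(instance).data
    # Mirror the primary key under both names: 'idPerson' comes from the
    # schema's dump_to, 'id_person' is the model attribute name.
    if "idPerson" in result:
        result.setdefault("id_person", result["idPerson"])
    if "id_person" in result:
        result.setdefault("idPerson", result["id_person"])
    return result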

Related

sqlalchemy model creation failing due to missing required attribute - but it's not missing

I have an SQLAlchemy model that references another model as a foreign key.
class BaseMixin(object):
    @classmethod
    def create(cls, **kw):
        obj = cls(**kw)
        db.session.add(obj)
        db.session.commit()

    @classmethod
    def find(cls, id):
        obj = db.session.query(cls).get(id)
        return obj

class Node(db.Model, BaseMixin):
    id = db.Column(db.Integer, primary_key=True)
    label = db.Column(db.String(120), unique=True, nullable=False)
    firsts = db.relationship('First', backref='node', lazy=True)

    def __repr__(self):
        return f'<Node {self.label} {self.id}>'

    def __init__(self, label):
        self.label = label

class First(db.Model, BaseMixin):
    id = db.Column(db.Integer, primary_key=True)
    node_id = db.Column(db.Integer, db.ForeignKey('node.id'), nullable=False)

    def __init__(self, label):
        if label is None:
            raise Exception("A first must have a label")
        node = db.session.query(Node).filter_by(label=label).first()
        if node is None:
            node = Node(label=label)
            node.create()
        self.node_id = node.id
Elsewhere in my code, I try to create a first with a brand new label, which seems like it should create a new node in the process and use it as a foreign key:
first = First(label=context.label)
first.create()
But I get the following error:
File "models.py", line 12, in create
obj = cls(**kw)
TypeError: __init__() missing 1 required positional argument: 'label'
It seems to me that I am passing label to node when trying to create. So why am I getting this error message?
There are a few issues here.
Firstly, the code initialises entities and then calls create on the instance. This would result in two entities being initialised, since create creates a new entity. create is a classmethod, so it can be called on the class directly.
Secondly, the label argument is not being passed to create where it is called, so you get the error when create calls cls(**kw).
Instead of doing this
first = First(label=some_label)
first.create()
do this
first = First.create(label=some_label)
or just
first = First(label=some_label)
db.session.add(first)
(note that create is also called in First.__init__).
Thirdly - and this is more of a preference - I'd avoid calling commit in the create method. Commit once, at the end of the request. This is more efficient and avoids situations where some objects get committed but others don't, due to an exception being raised part way through a request.
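Putting those points together, a revised mixin might look like this (a sketch; it assumes create should return the new instance, and that committing happens once per request):
class BaseMixin(object):
    @classmethod
    def create(cls, **kw):
        # Build and stage the instance; the keyword arguments are forwarded
        # to __init__ via cls(**kw), so callers must pass label= here.
        obj = cls(**kw)
        db.session.add(obj)
        # db.session.flush() would populate generated ids (e.g. node.id)
        # without committing, if they are needed before the request ends.
        return obj

    @classmethod
    def find(cls, id):
        return db.session.query(cls).get(id)

# Call sites:
node = Node.create(label=label)            # instead of node = Node(label=label); node.create()
first = First.create(label=context.label)  # instead of First(...) followed by first.create()
db.session.commit()                        # commit once, at the end of the request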

django - can't assign a foreign key

For unknown reasons, I cannot assign a foreign-key instance from the Item_rarity table to the Detailed_item table. Django throws an error:
Cannot assign "u'Basic'": "Detailed_item.rarity" must be a "Item_rarity" instance.
But the "Basic" record does exist in Item_rarity: I can choose it from the admin panel and create a Detailed_item record manually.
I have defined models:
class Detailed_item(models.Model):
    item_id = models.IntegerField(primary_key=True)
    name = models.CharField(max_length=50)
    level = models.IntegerField()
    icon = models.CharField(max_length=150)
    rarity = models.ForeignKey('Item_rarity')
    general_type = models.ForeignKey('Item_type')
    detailed_type = models.ForeignKey('Item_detailed_type')

class Item_rarity(models.Model):
    name = models.CharField(max_length=15, primary_key=True)

class Item_type(models.Model):
    name = models.CharField(max_length=15, primary_key=True)

class Item_detailed_type(models.Model):
    name = models.CharField(max_length=20, primary_key=True)
In views, I try to populate it in this manner (inserting multiple items):
...
items = get_all_items()  # get dict of items
for element in items:
    tmp_det_type = ''
    for key, val in element.iteritems():
        # get 'detailed type' from inner dict
        if key == "type":
            tmp_det_type = val
    item = Detailed_item(
        item_id=element['id'],
        name=element['name'],
        level=element['level'],
        icon=element['icon'],
        rarity=element['rarity'],  # error
        general_type=element['type'],
        detailed_type=tmp_det_type,
    )
    item.save()
...
I even tried to hard-code the "Basic" string, but it doesn't work either.
* Solved *
The next two foreign keys, Item_type and Item_detailed_type, were also invalid for the same reason.
Correct code:
from app.models import Detailed_item, Item_rarity, Item_type, Item_detailed_type
...
items = get_all_items()  # get dict of items
for element in items:
    tmp_det_type = ''
    for key, val in element.iteritems():
        # get 'detailed type' from inner dict
        if key == "type":
            tmp_det_type = val
    # create objects with string values
    obj_rarity = Item_rarity(name=element['rarity'])
    obj_item_type = Item_type(name=element['type'])
    obj_item_detailed_type = Item_detailed_type(name=tmp_det_type)
    item = Detailed_item(
        item_id=element['id'],
        name=element['name'],
        level=element['level'],
        icon=element['icon'],
        rarity=obj_rarity,
        general_type=obj_item_type,
        detailed_type=obj_item_detailed_type,
    )
    item.save()
...
An Item_rarity instance should be passed when saving a Detailed_item object, since Item_rarity is a foreign-key related object of Detailed_item.
The problem is that you passed the string "Basic" instead of the Item_rarity instance itself.
When creating an object through the Django ORM, any foreign-key field should be given the related instance itself rather than its id (pk), whereas when fetching data from the database you can use either the instance or its id (pk).
class ParentModel(models.Model):
    model_field = models.CharField(max_length=16)

class MyModel(models.Model):
    some_field = models.ForeignKey('ParentModel')

parent_model = ParentModel.objects.create(model_field='some_data')
my_model = MyModel.objects.create(some_field=parent_model)
Note here that the parent_model object itself is passed to some_field, not its id.
While fetching the data back,
parent_model = ParentModel.objects.get(model_field='some_data')
my_model = MyModel.objects.get(some_field=parent_model)
or
my_model = MyModel.objects.get(some_field=parent_model.id)
Both would work in case of data fetch.
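If the related rows may not already exist, get_or_create fetches the instance or inserts it in one call. Applied to the loop from the question (a sketch, reusing element and tmp_det_type from that loop):
# Fetch or insert the related rows, then assign the instances, not strings.
obj_rarity, _ = Item_rarity.objects.get_or_create(name=element['rarity'])
obj_item_type, _ = Item_type.objects.get_or_create(name=element['type'])
obj_item_detailed_type, _ = Item_detailed_type.objects.get_or_create(name=tmp_det_type)

item = Detailed_item(
    item_id=element['id'],
    name=element['name'],
    level=element['level'],
    icon=element['icon'],
    rarity=obj_rarity,
    general_type=obj_item_type,
    detailed_type=obj_item_detailed_type,
)
item.save()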
You do not have to provide the related object on creation if you pass the raw value through the foreign key's attribute name, which Django exposes as rarity_id:
item = Detailed_item(
    item_id=element['id'],
    name=element['name'],
    level=element['level'],
    icon=element['icon'],
    rarity_id=element['rarity'],  # no error
    general_type=element['type'],
    detailed_type=tmp_det_type,
)
I have only tested this with the regular id field (the auto pk) but it
should work with your primary key just fine.
E.g.
class SimpleModel(Model):
    value = TextField(blank=True)

class ComplexModel(Model):
    simple = ForeignKey(SimpleModel)
    title = TextField(unique=True)

ComplexModel.objects.create(title='test', simple_id=1)

SQLAlchemy - How to get object's UUID available before commit/update/query?

I'm a newbie trying to get a model's primary-key UUID populated automatically before the object is stored in the DB; I would not like to commit objects to the database just to get the UUID available.
The short snippets below are from the actual code I have.
I think I need to attach the initialization to some SQLAlchemy hook, but I don't know which one or how.
I have a UUID helper as follows:
class GUID(TypeDecorator):
    impl = types.LargeBinary
    ...
then in the tables I use:
class Row(Model, Base):
    __tablename__ = "Row"
    id = Column(GUID(), primary_key=True, default=uuid.uuid4)
    row_text = Column(Unicode, index=True)
    original_row_index = Column(Integer)
when I do this test:
def test_uuid():
    row_text = "Just a plain row."
    irow = 0
    row = Row(row_text, irow)
    row.save()
    row.commit()
    if row.id == None:
        print("row.id == None")
    else:
        print("row.id set")
    row2 = Row(row_text, irow)
    row2.save()
    if row2.id == None:
        print("row2.id == None")
    else:
        print("row2.id set")
it prints
row.id set
row2.id == None
The Model class I use is as follows:
class Model():
    def __init__(self):
        pass

    def save(self):
        db = Db.instance()
        db.session.add(self)

    def commit(self):
        db = Db.instance()
        db.session.commit()
I suppose that you should use the flush method rather than the commit method: see "Difference between flush and commit".
Flush does not store the information to disk, but it does populate generated values such as primary keys:
Primary key attributes are populated immediately within the flush() process as they are generated and no call to commit() should be required
Link to original
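Applied to the test in the question, a flush after save() should be enough for the default=uuid.uuid4 callable to populate row2.id without committing (a sketch, assuming the Db/session helpers shown above):
row2 = Row(row_text, irow)
row2.save()                    # db.session.add(row2)
Db.instance().session.flush()  # defaults (the UUID) are assigned during flush; nothing is committed
assert row2.id is not None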

Get the relationship metadata from sqlalchemy

I have a very simple User class definition:
class User(Base):
    implements(interfaces.IUser)
    __tablename__ = 'users'
    # Fields description
    id = Column(Integer, primary_key=True)
    client_id = Column(Integer, ForeignKey('w2_client.id'))
    client = relationship("Client", backref=backref('users', order_by=id))
I want to automatically generate a GUI to edit User objects (and other model classes). So I need to get all the metadata of the table; for example, I can do:
for c in User.__table__.columns:
    print c.name, c.type, c.nullable, c.primary_key, c.foreign_keys
But I cannot get any information about the relationship "client": c.foreign_keys just shows me the table related to the foreign keys, but not the "client" attribute I've defined.
Please let me know if my question is not clear
It's true that this is not readily available. I had to come up with my own function after some reverse-engineering.
Here is the metadata helper that I use. It's a little different from what you are looking for, but perhaps you can use it.
import collections

from sqlalchemy.orm import class_mapper, ColumnProperty, RelationshipProperty

# structure returned by get_metadata function.
MetaDataTuple = collections.namedtuple("MetaDataTuple",
    "coltype, colname, default, m2m, nullable, uselist, collection")

def get_metadata_iterator(class_):
    for prop in class_mapper(class_).iterate_properties:
        name = prop.key
        if name.startswith("_") or name == "id" or name.endswith("_id"):
            continue
        md = _get_column_metadata(prop)
        if md is None:
            continue
        yield md

def get_column_metadata(class_, colname):
    prop = class_mapper(class_).get_property(colname)
    md = _get_column_metadata(prop)
    if md is None:
        raise ValueError("Not a column name: %r." % (colname,))
    return md

def _get_column_metadata(prop):
    name = prop.key
    m2m = False
    default = None
    nullable = None
    uselist = False
    collection = None
    proptype = type(prop)
    if proptype is ColumnProperty:
        coltype = type(prop.columns[0].type).__name__
        try:
            default = prop.columns[0].default
        except AttributeError:
            default = None
        else:
            if default is not None:
                default = default.arg(None)
        nullable = prop.columns[0].nullable
    elif proptype is RelationshipProperty:
        coltype = RelationshipProperty.__name__
        m2m = prop.secondary is not None
        nullable = prop.local_side[0].nullable
        uselist = prop.uselist
        if prop.collection_class is not None:
            collection = type(prop.collection_class()).__name__
        else:
            collection = "list"
    else:
        return None
    return MetaDataTuple(coltype, str(name), default, m2m, nullable, uselist, collection)

def get_metadata(class_):
    """Returns a list of MetaDataTuple structures.
    """
    return list(get_metadata_iterator(class_))

def get_metadata_map(class_):
    rv = {}
    for metadata in get_metadata_iterator(class_):
        rv[metadata.colname] = metadata
    return rv
But it doesn't have the primary key. I use a separate function for that.
mapper = class_mapper(ORMClass)
pkname = str(mapper.primary_key[0].name)
Perhaps I should put the primary key name in the metadata.
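For example, against the User model from the question, usage might look like this (a sketch; ORMClass above is simply whatever mapped class you are inspecting):
# One MetaDataTuple per mapped property, relationships included.
for md in get_metadata(User):
    print(md.colname, md.coltype, md.nullable, md.uselist, md.collection)

# Primary-key name, via the separate lookup above.
print(str(class_mapper(User).primary_key[0].name))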

Flask-SQLAlchemy: update a row's information

How can I update a row's information?
For example, I'd like to alter the name column of the row that has id 5.
Retrieve an object using the tutorial shown in the Flask-SQLAlchemy documentation. Once you have the entity that you want to change, change the entity itself. Then, db.session.commit().
For example:
admin = User.query.filter_by(username='admin').first()
admin.email = 'my_new_email@example.com'
db.session.commit()
user = User.query.get(5)
user.name = 'New Name'
db.session.commit()
Flask-SQLAlchemy is based on SQLAlchemy, so be sure to check out the SQLAlchemy Docs as well.
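If you are on SQLAlchemy 1.4+ (or Flask-SQLAlchemy 3.x), note that Query.get() is considered legacy and the primary-key lookup is usually written against the session instead; a sketch of the equivalent:
# SQLAlchemy 1.4+ style primary-key lookup.
user = db.session.get(User, 5)
user.name = 'New Name'
db.session.commit()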
There is an update method on the BaseQuery object (which is what filter_by returns) in SQLAlchemy:
num_rows_updated = User.query.filter_by(username='admin').update(dict(email='my_new_email@example.com'))
db.session.commit()
The advantage of using update over changing the entity comes when there are many objects to be updated.
If you want to give add_user permission to all the admins,
rows_changed = User.query.filter_by(role='admin').update(dict(permission='add_user'))
db.session.commit()
Notice that filter_by takes keyword arguments (use only one =) as opposed to filter which takes an expression.
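For example, these two forms target the same rows (reusing the role/permission columns from the snippet above):
# filter_by: keyword arguments, implicit equality against the model's columns.
User.query.filter_by(role='admin').update(dict(permission='add_user'))

# filter: a full expression, so the comparison operator is written out.
User.query.filter(User.role == 'admin').update(dict(permission='add_user'))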
This does not work if you modify a pickled attribute of the model. Pickled attributes should be replaced in order to trigger updates:
from flask import Flask
from flask_sqlalchemy import SQLAlchemy
from pprint import pprint

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:////tmp/users.db'
db = SQLAlchemy(app)

class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80), unique=True)
    data = db.Column(db.PickleType())

    def __init__(self, name, data):
        self.name = name
        self.data = data

    def __repr__(self):
        return '<User %r>' % self.name

db.create_all()

# Create a user.
bob = User('Bob', {})
db.session.add(bob)
db.session.commit()

# Retrieve the row by its name.
bob = User.query.filter_by(name='Bob').first()
pprint(bob.data)  # {}

# Modifying data is ignored.
bob.data['foo'] = 123
db.session.commit()
bob = User.query.filter_by(name='Bob').first()
pprint(bob.data)  # {}

# Replacing data is respected.
bob.data = {'bar': 321}
db.session.commit()
bob = User.query.filter_by(name='Bob').first()
pprint(bob.data)  # {'bar': 321}

# Modifying data is ignored.
bob.data['moo'] = 789
db.session.commit()
bob = User.query.filter_by(name='Bob').first()
pprint(bob.data)  # {'bar': 321}
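Alternatively, SQLAlchemy's mutation-tracking extension can wrap the column type so that in-place changes like bob.data['foo'] = 123 are detected and persisted on commit; a sketch against the model above (the same wrapper also works for db.JSON columns):
from sqlalchemy.ext.mutable import MutableDict

class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80), unique=True)
    # MutableDict marks the attribute as dirty on item assignment/deletion,
    # so in-place modifications are picked up by the next commit().
    data = db.Column(MutableDict.as_mutable(db.PickleType()))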
Just assigning the value and committing will work for all data types except JSON and pickled attributes. Since the pickled type is explained above, I'll note down a slightly different but easy way to update JSON columns.
class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80), unique=True)
    data = db.Column(db.JSON)

    def __init__(self, name, data):
        self.name = name
        self.data = data
Let's say the model is like above.
user = User("Jon Dove", {"country":"Sri Lanka"})
db.session.add(user)
db.session.flush()
db.session.commit()
This will add the user into the MySQL database with data {"country":"Sri Lanka"}
Modifying data will be ignored. My code that didn't work is as follows.
user = User.query.filter(User.name == 'Jon Dove').first()
data = user.data
data["province"] = "south"
user.data = data
db.session.merge(user)
db.session.flush()
db.session.commit()
Instead of going through the painful work of copying the JSON into a new dict (not just assigning it to a new variable as above), which should have worked, I found a simpler way: there is a way to flag to the system that the JSON has changed.
Following is the working code.
from sqlalchemy.orm.attributes import flag_modified

user = User.query.filter(User.name == 'Jon Dove').first()
data = user.data
data["province"] = "south"
user.data = data
flag_modified(user, "data")
db.session.merge(user)
db.session.flush()
db.session.commit()
This worked like a charm.
There is another method proposed alongside this one here.
Hope I've helped someone.
In models.py, define the serializers:
import json
from datetime import date, datetime

def default(o):
    if isinstance(o, (date, datetime)):
        return o.isoformat()

def get_model_columns(instance, exclude=[]):
    columns = instance.__table__.columns.keys()
    columns = list(set(columns) - set(exclude))
    return columns

class User(db.Model):
    __tablename__ = 'user'
    id = db.Column(db.Integer, primary_key=True, autoincrement=True)
    .......
    ####
    def serializers(self):
        cols = get_model_columns(self)
        dict_val = {}
        for c in cols:
            dict_val[c] = getattr(self, c)
        return json.loads(json.dumps(dict_val, default=default))
In the REST API, we can update the record dynamically by passing the JSON data into the update query:
class UpdateUserDetails(Resource):
    @auth_token_required
    def post(self):
        json_data = request.get_json()
        user_id = current_user.id
        try:
            instance = User.query.filter(User.id == user_id)
            data = instance.update(dict(json_data))
            db.session.commit()
            updateddata = instance.first()
            msg = {"msg": "User details updated successfully", "data": updateddata.serializers()}
            code = 200
        except Exception as e:
            print(e)
            msg = {"msg": "Failed to update the user details! Please contact your administrator."}
            code = 500
        return msg, code
I was looking for something a little less intrusive than @Ramesh's answer (which was good) but still dynamic. Here is a solution attaching an update method to a db.Model object.
You pass in a dictionary and it will update only the columns that you pass in.
class SampleObject(db.Model):
    id = db.Column(db.BigInteger, primary_key=True)
    name = db.Column(db.String(128), nullable=False)
    notes = db.Column(db.Text, nullable=False)

    def update(self, update_dictionary: dict):
        for col_name in self.__table__.columns.keys():
            if col_name in update_dictionary:
                setattr(self, col_name, update_dictionary[col_name])
        db.session.add(self)
        db.session.commit()
Then in a route you can do
object = SampleObject.query.where(SampleObject.id == id).first()
object.update(update_dictionary=request.get_json())
Updating the columns in Flask:
admin = User.query.filter_by(username='admin').first()
admin.email = 'my_new_email@example.com'
admin.save()  # assumes the model provides a save() helper; otherwise use db.session.commit()
To use the update method (which updates the entry outside of the session), you have to build the query in steps like this:
query = db.session.query(UserModel)
query = query.filter(UserModel.id == user_id)
query.update(user_dumped)
db.session.commit()
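One caveat: Query.update() issues a bulk UPDATE, and objects already loaded in the session may not reflect the new values unless synchronization is requested. A sketch of the same call with explicit synchronization:
# 'fetch' re-selects the affected rows so in-session objects see the new values;
# synchronize_session=False skips that step (faster, but in-session state goes stale).
query.update(user_dumped, synchronize_session='fetch')
db.session.commit()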
