Auto-incrementing attribute with custom logic in SQLAlchemy - python

I have a simple "Invoices" class with a "Number" attribute that has to
be assigned by the application when the user saves an invoice. There
are some constraints:
1) the application is a (thin) client-server one, so whatever
assigns the number must look out for collisions
2) Invoices has a "version" attribute too, so I can't use a simple
DBMS-level autoincrementing field
I'm trying to build this using a custom Type that would kick in every
time an invoice gets saved. Whenever process_bind_param is called with
a None value, it will call a singleton of some sort to determine the
number and avoid collisions. Is this a decent solution?
Anyway, I'm having a problem. Here's my custom Type:
class AutoIncrement(types.TypeDecorator):
    impl = types.Unicode

    def copy(self):
        return AutoIncrement()

    def process_bind_param(self, value, dialect):
        if not value:
            # Must find next autoincrement value
            value = "1"  # Test value :)
        return value
My problem right now is that when I save an Invoice and AutoIncrement sets "1" as the value for its number, the Invoice instance doesn't get updated with the new number. Is this expected? Am I missing something?
Many thanks for your time!
(SQLA 0.5.3 on Python 2.6, using PostgreSQL 8.3)
Edit: Michael Bayer told me that this behaviour is expected, since TypeDecorators don't deal with default values.

Is there any particular reason you don't just use a default= parameter in your column definition? (This can be an arbitrary Python callable).
def generate_invoice_number():
    # special logic to generate a unique invoice number
    ...

class Invoice(DeclarativeBase):
    __tablename__ = 'invoice'
    number = Column(Integer, unique=True, default=generate_invoice_number)
    ...
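A minimal sketch of what such a callable could look like, using SQLAlchemy's context-sensitive defaults (a default callable that accepts an argument is passed the execution context, whose connection is the one performing the INSERT). The max+1 policy and the use of Invoice.__table__ are assumptions rather than part of the answer, and the unique=True constraint still guards against the rare concurrent collision:

from sqlalchemy import func, select

def generate_invoice_number(context):
    # The execution context exposes the connection used for the INSERT,
    # so the lookup runs in the same transaction as the insert itself.
    current_max = context.connection.execute(
        select([func.max(Invoice.__table__.c.number)])
    ).scalar()
    return (current_max or 0) + 1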

Related

How to set a random integer as the default value for a Django CharField?

My models.py looks like this:
import random

random_string = str(random.randint(10000, 99999))

class Content(models.Model):
    ......
    unique_url = models.CharField(default=random_string)
When I add a content in admin, an integer in the range is generated and put into the CharField as its default value. From there, I can simply add more words to the CharField. However, the problem with my current set-up is that the integer remains the same every time I add a new article. I want a new random integer to be generated and inserted each time, since I am using the unique_url field to find each of my specific objects, and I am expecting a lot of content, so adding the random number will generally ensure that each piece of content has a one-of-a-kind unique_url.
Therefore, I am looking for a system which generates a random integer every time a new piece of content is added using the admin panel, and puts it as the default of one of the fields. Is such a thing even possible in Django?
With your current code the random number is generated only once, when the module is loaded. You need to define a function such as:
def random_string():
    return str(random.randint(10000, 99999))
And then define your model as you already have, without () in order to pass a reference to the function itself rather than a value returned by the function:
class Content(models.Model):
    ......
    unique_url = models.CharField(default=random_string)
You can generate a random string by passing the function as the default value. The function will run and the default will be set to its return value. Quoting the example given by @Wtower:
def random_string():
    return str(random.randint(10000, 99999))

class Content(models.Model):
    ......
    unique_url = models.CharField(default=random_string)
But this has a caveat: when you add the field and migrate an existing database, the function runs only once, so every existing row is updated with the same 'random' number.
For example, if you already have 500 entries in the model, you will have the same string, say '548945', for every unique_url, which defeats the whole purpose.
You can overcome this by changing the values of the existing entries in the database. This is a one-time job and can be done using the Django shell.
python ./manage.py shell

from appname.models import Content, random_string
# Change appname and file name accordingly

entries = Content.objects.all()
for entry in entries:
    entry.unique_url = random_string()
    entry.save()
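An alternative to doing the backfill in the shell (not part of the original answer) is to ship it as a data migration, so every environment gets its own per-row values automatically. A minimal sketch, assuming Django 1.7+ migrations and an app called appname; the file and dependency names are placeholders:

# appname/migrations/000X_backfill_unique_url.py  (hypothetical file name)
import random

from django.db import migrations


def backfill_unique_url(apps, schema_editor):
    # Use the historical model so the migration stays valid if Content changes later.
    Content = apps.get_model('appname', 'Content')
    for entry in Content.objects.all():
        entry.unique_url = str(random.randint(10000, 99999))
        entry.save()


class Migration(migrations.Migration):

    dependencies = [
        ('appname', '000X_previous_migration'),  # hypothetical previous migration
    ]

    operations = [
        migrations.RunPython(backfill_unique_url, migrations.RunPython.noop),
    ]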

Sqlalchemy - update column based on changes in another column

I'm using SQLAlchemy but find the documentation difficult to search.
I've these two columns:
verified = Column(Boolean, default=False)
verified_at = Column(DateTime, nullable=True)
I'd like to create a function that does something like this:
if self.verified and not oldobj.verified:
    self.verified_at = datetime.datetime.utcnow()
if not self.verified and oldobj.verified:
    self.verified_at = None
I'm not sure where to put code like this. I could put it in the application, but would prefer the model object took care of this logic.
I think what you're looking for is a Hybrid Property.
import datetime

from sqlalchemy.ext.hybrid import hybrid_property

class VerifiedAsset(Base):
    id = Column(Integer, primary_key=True)
    verified_at = Column('verified_at', String(24))

    @hybrid_property
    def verification(self):
        return self.verified_at

    @verification.setter
    def verification(self, value):
        if value and not self.verification:
            self.verified_at = datetime.datetime.utcnow()
        if not value and self.verification:
            self.verified_at = None
        # Presumably you want to handle your other cases here
You want to update your verified_at value in a particular way based on some incoming new value. Use properties to wrap the underlying value, and only update when it is appropriate, and only to what you're actually persisting in the db.
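For illustration (not part of the original answer), the wrapper is then used through the verification attribute rather than touching verified_at directly:

asset = VerifiedAsset(id=1)
asset.verification = True    # setter stamps verified_at with the current UTC time
asset.verification = False   # setter clears verified_at again
print(asset.verification)    # None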
You can use SQLAlchemy's event registration to put code like that: http://docs.sqlalchemy.org/en/latest/core/event.html.
Basically, you can subscribe to certain events that happen in the Core and ORM. I think it's a clean way to manage what you want to achieve.
You would use the listens_for() decorator in order to hook into changes to those columns.
Reading "Changing Attribute Behavior" and "ORM Events" is a good start on trying to solve this type of problem.
One way to go about it would be to set an event listener that updates the timestamp:
import datetime

from sqlalchemy import event

@event.listens_for(MyModel.verified, 'set')
def mymodel_verified_set(target, value, oldvalue, initiator):
    """Set verified_at whenever the verified flag actually changes."""
    if value != oldvalue:
        target.verified_at = datetime.datetime.utcnow() if value else None

Validating for uniqueness in MongoAlchemy?

I am trying to ensure name uniqueness in a MongoAlchemy-backed model, and am uncertain how to go about it.
My first attempt involved writing a wrap validator which checked for existing database entries with the same name and checked against them (to ensure that there were either 0 or 1 entries with the same name), but this failed because the validator only receives the string with the name, not the entire object (so comparing mongo_ids was impossible).
What's the best way to ensure that objects of a single class all have unique names?
You should use a unique index.
http://www.mongoalchemy.org/api/schema/document.html#mongoalchemy.document.Index
>>> class Person(Document):
...     name = StringField()
...     name_index = Index().ascending('name').unique()
The database will enforce the constraint for you. It's just wrapping the code that mongo already has here:
http://docs.mongodb.org/manual/tutorial/create-a-unique-index/

Can primary key use BigInteger as the AutoField in Django 1.2.4?

It seems that the default primary key is int. Is there anyway to use the big integer for the autofield as the primary key?
I would suggest you use a newer Django. Official Django documentation doesn't go farther back than 1.3 now. And 1.3 is insecure and unsupported. I realize the question was asked over 3 years ago, but since there is still no accepted answer I will give it a shot.
In Django 1.6.5 you can just do this in your model:
class MyModel(models.Model):
    id = models.BigIntegerField(unique=True, primary_key=True)
Setting primary_key=True overrides the default id on the model. In use, this field auto-increments with each new model object. It just works!
There are a couple of ways I can see to implement this. Either way, you have to define your pk field.
First of all, just create your own id field and override the save method.
class modelname(models.Model):
    # model definition
    def save(self, *args, **kwargs):
        self.pkfield = nextIntFunction()
        super(modelname, self).save(*args, **kwargs)
nextIntFunction() is easy enough to write: query for the object with the highest id and use id + 1.
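A minimal sketch of such a helper, reusing the placeholder names modelname and pkfield from above (not part of the original answer); note that this get-max-then-add-one approach can race under concurrent saves unless it runs inside a transaction:

from django.db.models import Max

def nextIntFunction():
    # Look up the current highest primary key; fall back to 0 on an empty table.
    highest = modelname.objects.aggregate(Max('pkfield'))['pkfield__max']
    return (highest or 0) + 1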
I also found this link BigIntegerField and BigAutoField which seems to solve the problem, but I have not tested it myself
I ran into the same problem too.
I added some code like:
User._meta.has_auto_field = True
User._meta.auto_field = id
And I defined the id field as BigIntegerField(primary_key=True).
After I call user.save(), user.id has its value, so I don't need to query again.
I think it works, but it is not a beautiful solution, so I am still looking for a better way.
Since Django 1.10 you can use BigAutoField, which, as described in the documentation, works exactly like AutoField but is guaranteed to fit numbers from 1 to 9223372036854775807.
So you can use it like:
class SomeModel(models.Model):
    id = models.BigAutoField(primary_key=True)
    ...
You can hack Django and change the default auto-keys to the right values. Check out:
http://code.djangoproject.com/browser/django/trunk/django/db/backends/mysql/creation.py
from django.conf import settings
from django.db.backends.creation import BaseDatabaseCreation

class DatabaseCreation(BaseDatabaseCreation):
    # This dictionary maps Field objects to their associated MySQL column
    # types, as strings. Column-type strings can contain format strings; they'll
    # be interpolated against the values of Field.__dict__ before being output.
    # If a column type is set to None, it won't be included in the output.
    data_types = {
        'AutoField': 'integer AUTO_INCREMENT',
        'BooleanField': 'bool',
        'CharField': 'varchar(%(max_length)s)',
You can modify this using a patch in your own code:
DatabaseCreation.data_types['AutoField'] = 'bigint AUTO_INCREMENT'
You will also have to patch the AutoField class:
http://code.djangoproject.com/browser/django/trunk/django/db/models/fields/__init__.py
(untested code, good luck)
http://docs.djangoproject.com/en/dev/topics/db/models/
class BigIntegerField([**options])
The available option is:
primary_key
If True, this field is the primary key for the model.
And after all that, you run a South migration:
ALTER TABLE mytable MODIFY COLUMN myid BIGINT(20) NOT NULL AUTO_INCREMENT;
You are right, sorry. The necessary snippet is here:
http://djangosnippets.org/snippets/1244/
It allows creating bigint (MySQL), bigserial (PostgreSQL), or NUMBER(19) (Oracle) fields that have auto-increment set by using Django's AutoField, therefore ensuring that the ID gets updated in the instance when calling its save() method.
If you only subclassed IntegerField into a BigIntegerField and used that as your primary key, the model instance you create would not get its id attribute set when calling save(); instead you would have to query and load the instance from the DB again to get the ID.
These snippets work. Use the BigAutoField class as your primary key on your model and it works seamlessly without any hacking.

How do I define a unique property for a Model in Google App Engine?

I need some properties to be unique. How can I achieve this?
Is there something like unique=True?
I'm using Google App Engine for Python.
Google has provided a function to do that:
http://code.google.com/appengine/docs/python/datastore/modelclass.html#Model_get_or_insert
Model.get_or_insert(key_name, **kwds)
Attempts to get the entity of the model's kind with the given key name. If it exists, get_or_insert() simply returns it. If it doesn't exist, a new entity with the given kind, name, and parameters in kwds is created, stored, and returned.
The get and subsequent (possible) put are wrapped in a transaction to ensure atomicity. This means that get_or_insert() will never overwrite an existing entity, and will insert a new entity if and only if no entity with the given kind and name exists.
In other words, get_or_insert() is equivalent to this Python code:
def txn():
    entity = MyModel.get_by_key_name(key_name, parent=kwds.get('parent'))
    if entity is None:
        entity = MyModel(key_name=key_name, **kwds)
        entity.put()
    return entity
return db.run_in_transaction(txn)
Arguments:
key_name
The name for the key of the entity
**kwds
Keyword arguments to pass to the model class's constructor if an instance with the specified key name doesn't exist. The parent argument is required if the desired entity has a parent.
Note: get_or_insert() does not accept an RPC object.
The method returns an instance of the model class that represents the requested entity, whether it existed or was created by the method. As with all datastore operations, this method can raise a TransactionFailedError if the transaction could not be completed.
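For illustration (not from the original answer), a typical call with a hypothetical Account model keyed by username looks like this; the key name is what enforces uniqueness:

# If no Account with key name 'user:alice' exists yet, it is created
# atomically; otherwise the existing entity is returned unchanged.
account = Account.get_or_insert('user:alice', email='alice@example.com')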
There's no built-in constraint for making sure a value is unique. You can do this however:
query = MyModel.all(keys_only=True).filter('unique_property', value_to_be_used)
entity = query.get()
if entity:
    raise Exception('unique_property must have a unique value!')
I use keys_only=True because it'll improve the performance slightly by not fetching the data for the entity.
A more efficient method would be to use a separate model with no fields whose key name is made up of property name + value. Then you could use get_by_key_name to fetch one or more of these composite key names and if you get one or more not-None values, you know there are duplicate values (and checking which values were not None, you'll know which ones were not unique.)
As onebyone mentioned in the comments, these approaches, by their get-first, put-later nature, run the risk of concurrency issues. Theoretically, an entity could be created just after the check for an existing value, and then the code after the check would still execute, leading to duplicate values. To prevent this, you will have to use transactions: Transactions - Google App Engine
If you're looking to check for uniqueness across all entities with transactions, you'd have to put all of them in the same group using the first method, which would be very inefficient. For transactions, use the second method like this:
class UniqueConstraint(db.Model):

    @classmethod
    def check(cls, model, **values):
        # Create a pseudo-key for use as an entity group.
        parent = db.Key.from_path(model.kind(), 'unique-values')

        # Build a list of key names to test.
        key_names = []
        for key in values:
            key_names.append('%s:%s' % (key, values[key]))

        def txn():
            result = cls.get_by_key_name(key_names, parent)
            for test in result:
                if test:
                    return False
            for key_name in key_names:
                uc = cls(key_name=key_name, parent=parent)
                uc.put()
            return True

        return db.run_in_transaction(txn)
UniqueConstraint.check(...) will assume that every single key/value pair must be unique to return success. The transaction will use a single entity group for every model kind. This way, the transaction is reliable for several different fields at once (for only one field, this would be much simpler.) Also, even if you've got fields with the same name in one or more models, they will not conflict with each other.
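For illustration (not part of the original answer), a caller would reserve the values before saving its own entity; Account and its fields below are hypothetical:

# Only create the account if both values are still unused.
if UniqueConstraint.check(Account, username='alice', email='alice@example.com'):
    Account(username='alice', email='alice@example.com').put()
else:
    raise ValueError('username or email already taken')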
