I've got a Django REST Framework serializer containing the following:
from rest_framework import serializers

class ThingSerializer(serializers.ModelSerializer):
    last_changed = serializers.SerializerMethodField(read_only=True)

    def get_last_changed(self, instance: Thing) -> str:
        log_entry = LogEntry.objects.get_for_object(instance).latest()
        representation: str = serializers.DateTimeField('%Y-%m-%dT%H:%M:%SZ').to_representation(log_entry.timestamp)
        return representation
This is problematic: if the datetime formatting ever changes, this field will be formatted differently from all the other datetimes. I want to reuse the code path which DRF uses to serialize every other datetime field.
What I've tried so far:
The only answer which looked relevant doesn't actually produce the same result as DRF (it includes milliseconds, which DRF does not), presumably because it's using the Django serializer rather than the DRF one.
rest_framework.serializers.DateTimeField().to_representation(log_entry.timestamp), rest_framework.fields.DateTimeField().to_representation(log_entry.timestamp) and rest_framework.fields.DateTimeField(format=api_settings.DATETIME_FORMAT).to_representation(log_entry.timestamp) don't work either; they produce strings with microsecond accuracy. I've verified with a debugger that DRF calls the latter when serializing other fields, so I can't understand why it produces a different result in my case.
LogEntry.timestamp is declared as a django.db.models.DateTimeField, but if I try something like LogEntry.timestamp.to_representation(log_entry.timestamp) it fails badly:
AttributeError: 'DeferredAttribute' object has no attribute 'to_representation'
Taking a look through the DRF source, the interesting stuff is happening in rest_framework/fields.py.
In particular, all of the formatting happens directly in the DateTimeField.to_representation method.
You have a couple of ways of replicating DRF's behaviour.
First, you could just not pass a format at all. DRF should use its default if you don't explicitly supply a format.
representation: str = serializers.DateTimeField().to_representation(log_entry.timestamp)
Alternatively, keep doing what you're doing, but explicitly pass the format string from DRF's api_settings.DATETIME_FORMAT. This might feel less magical, but honestly it's probably more brittle to API changes in the future.
This might look like:
from rest_framework.settings import api_settings
...
representation: str = serializers.DateTimeField(api_settings.DATETIME_FORMAT).to_representation(log_entry.timestamp)
However, given that you attempted the first and it failed, we need to look a bit deeper!
The default datetime format for DRF is ISO_8601, which is handled by the following code:

value = value.isoformat()
if value.endswith('+00:00'):
    value = value[:-6] + 'Z'
return value
That is, it effectively just leans on Python's isoformat method.
isoformat formats the value differently depending on whether or not it has microseconds.
From the Python docs, isoformat will:
Return a string representing the date and time in ISO 8601 format, YYYY-MM-DDTHH:MM:SS.ffffff or, if microsecond is 0, YYYY-MM-DDTHH:MM:SS
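For example:

import datetime

print(datetime.datetime(1942, 11, 27, 0, 0, 5, 123456).isoformat())
# -> '1942-11-27T00:00:05.123456'
print(datetime.datetime(1942, 11, 27, 0, 0, 5).isoformat())
# -> '1942-11-27T00:00:05'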
In this case, the solution is to explicitly set the microseconds of the timestamp to zero. There are a couple of ways to do this. One is to round-trip through an integer Unix timestamp, clipping to whole seconds (note that int() must be applied to the float returned by timestamp(), not to the datetime itself):

from datetime import datetime, timezone

ts = datetime.fromtimestamp(int(log_entry.timestamp.timestamp()), tz=timezone.utc)
representation: str = serializers.DateTimeField().to_representation(ts)
or keep using the DateTime object directly, which will have better timezone handling:
representation: str = serializers.DateTimeField().to_representation(
    log_entry.timestamp.replace(microsecond=0)
)
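Putting it together, the whole method could look like this (a minimal sketch, assuming the Thing and LogEntry models from your question):

def get_last_changed(self, instance: Thing) -> str:
    log_entry = LogEntry.objects.get_for_object(instance).latest()
    return serializers.DateTimeField().to_representation(
        log_entry.timestamp.replace(microsecond=0)
    )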
Related
I have subclassed a built-in model field to reduce repetition across similar columns. This triggers exceptions in tests against Django 3.2 (but, interestingly, does work in the otherwise now irrelevant, unsupported version 2.2):
django.core.exceptions.FieldError: Expression contains mixed types: DecimalField, DecimalFWB. You must set output_field.
from django.db.models import Model, DecimalField, F
from django.core.validators import MinValueValidator
from decimal import Decimal

class DecimalFWB(DecimalField):
    @property
    def validators(self):
        return super().validators + [MinValueValidator(0.1), ]
    ...

class Repro(Model):
    frac = DecimalFWB(max_digits=4, decimal_places=4, default=Decimal("0.2"))
    ...

# same internal type
assert DecimalFWB().get_internal_type() == DecimalField().get_internal_type()

# 3.2: works
# 2.2: works
Repro.objects.annotate(dec_annotation = -F("frac") + Decimal(1)).first()

# 3.2: django.core.exceptions.FieldError
# 2.2: works
Repro.objects.annotate(dec_annotation = Decimal(1) - F("frac")).first()
I found this entry in the Django 3.2 release notes that could explain the change in behaviour from the earlier version:
[..] resolving an output_field for database functions and combined expressions may now crash with mixed types when using Value(). You will need to explicitly set the output_field in such cases.
That suggestion does not solve my problem. If I were to bloat all annotations with ExpressionWrapper/output_field=, I could just as well bloat the model definition and not use the subclass in the first place.
I am trying to emulate the internal type. I want the combined output_field of DecimalField and DecimalFWB to be DecimalField - regardless of order of super/subclass. How do I express that no mixing is happening here?
Automatically selecting the shared field as the output has been fixed as of Bug #33397, released in Django 4.1 (but not backported). The change does, however, come with a warning (emphasis mine):
As a guess, if the output fields of all source fields match then
simply infer the same type here.
This guess is mostly a bad idea, but there is quite a lot of code
(especially 3rd party Func subclasses) that depend on it, we'd need a
deprecation path to fix it.
Meaning this might change again in a future release, but at least then it would be intentional, with a deprecation path that reliably raises a DeprecationWarning.
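Until you can upgrade, the only reliable route seems to be the explicit output_field that the release notes suggest; a minimal sketch, assuming the Repro model above:

from decimal import Decimal
from django.db.models import DecimalField, ExpressionWrapper, F

Repro.objects.annotate(
    dec_annotation=ExpressionWrapper(
        Decimal(1) - F("frac"),
        output_field=DecimalField(max_digits=4, decimal_places=4),
    )
).first()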
I want to do a query based on two fields of a model: a date, offset by an int used as a timedelta.
model.objects.filter(last_date__gte=datetime.now()-timedelta(days=F('interval')))
is a no-go, as an F() expression cannot be passed into a timedelta
A little digging, and I discovered DateModifierNode - though it seems it was removed in this commit: https://github.com/django/django/commit/cbb5cdd155668ba771cad6b975676d3b20fed37b (from this now-outdated SO question Django: Using F arguments in datetime.timedelta inside a query)
the commit mentions:
The .dates() queries were implemented by using custom Query, QuerySet,
and Compiler classes. Instead implement them by using expressions and
database converters APIs.
which sounds sensible, and like there should still be a quick, easy way - but I've been fruitlessly looking for it for a little too long. Does anyone know the answer?
In Django 1.10 there's a simpler method, but you need to change the model a little: use a DurationField. My model is as follows (note that DurationField takes a datetime.timedelta, not a number of seconds):

import datetime

class MyModel(models.Model):
    timeout = models.DurationField(default=datetime.timedelta(days=7))  # default: week
    last = models.DateTimeField(auto_now_add=True)
and the query to find objects where last was before now minus timeout is:
MyModel.objects.filter(last__lt=datetime.datetime.now()-F('timeout'))
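If you also want the computed deadline back, the same comparison can be phrased as an annotation; a quick sketch assuming the MyModel above:

from django.db.models import DateTimeField, ExpressionWrapper, F
from django.utils import timezone

expired = MyModel.objects.annotate(
    deadline=ExpressionWrapper(F('last') + F('timeout'),
                               output_field=DateTimeField())
).filter(deadline__lt=timezone.now())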
Ah, answer from the docs: https://docs.djangoproject.com/en/1.9/ref/models/expressions/#using-f-with-annotations
from django.db.models import DateTimeField, ExpressionWrapper, F

Ticket.objects.annotate(
    expires=ExpressionWrapper(
        F('active_at') + F('duration'), output_field=DateTimeField()))
which should make my original query look like
model.objects.annotate(
    new_date=ExpressionWrapper(
        F('last_date') + F('interval'), output_field=DateTimeField())
).filter(new_date__gte=datetime.now())
I find myself stuck on this problem, and repeated Googling, checking SO, and reading numerous docs have not helped me get the right answer, so I hope this isn't a bad question.
One entity I want to create is an event taking place during a convention. I'm giving it the property start_time = ndb.TimeProperty(). I also have a property date = messages.DateProperty(), and I'd like to keep the two discrete (in other words, not using DateTimeProperty).
When a user enters information to create an event, I want to specify defaults for any fields they do not enter at creation, and I'd like to set the default time as midnight, but I can't seem to format it correctly so the service accepts it (I get a constant 503 Service Unavailable response when I try it using the API explorer).
Right now I've set things up like this (some unnecessary details removed):
event_defaults = {
    ...
    "start_time": 0000,
    ...
}
and then I try looping over my default values to enter them into a dictionary which I'll use to .put() the info on the server.
data = {field.name: getattr(request, field.name) for field in request.all_fields()}
for default in event_defaults:
    if data[default] in (None, []):
        data[default] = event_defaults[default]
        setattr(request, default, event_defaults[default])
In the logs, I see the error Encountered unexpected error from ProtoRPC method implementation: BadValueError (Expected time, got 0). I have also tried using the time and datetime modules, but I must be using them incorrectly, because I still receive errors.
I suppose I could work around this problem by using ndb.StringProperty() instead, and just deal with strings, but then I'd feel like I would be missing out on a chance to learn more about how GAE and NDB work (all of this is for a project on udacity.com, so learning is certainly the point).
So, how can I structure my default time properly for midnight? Sorry for the wall of text.
Link to code on github. The conference.py file contains the code I'm having the trouble with, and models.py contains my definitions for the entities I'm working with.
Update: I'm a dummy. I had my model class using a TimeProperty() and the corresponding message class using a StringField(), but I was never making the proper conversion between the two expected types. That's why I could never seem to give it the right thing: it expected two different things at different points in the code. Issue resolved.
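For anyone hitting the same wall, the missing conversion looks roughly like this (a sketch; the values are made up):

import datetime

time_str = "09:30"  # what the message's StringField carries
as_time = datetime.datetime.strptime(time_str, "%H:%M").time()  # for the model's TimeProperty
back_to_str = as_time.strftime("%H:%M")  # for the outgoing message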
TimeProperty expects a datetime.time value
import datetime
event_defaults = {
    ...
    "start_time": datetime.time(),
    ...
}
More in the docs: https://cloud.google.com/appengine/docs/python/ndb/entity-property-reference#Date_and_Time
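datetime.time() with no arguments is exactly midnight, so no explicit values are needed:

import datetime

assert datetime.time() == datetime.time(hour=0, minute=0, second=0)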
Use the datetime module to convert it into a valid ndb time property value:
import datetime

if data['time']:
    data['time'] = datetime.datetime.strptime(data['time'][:5], "%H:%M").time()
else:
    data['time'] = datetime.datetime.now().time()
P.S. Don't forget to replace data['time'] with your field name.
In the project I work on, we often need to convert text to the value of a trait. Generally, we use the is_trait_type method to pick the appropriate conversion.
However, it doesn't work with Date traits. Here is an MWE:
from traits.has_traits import HasTraits
from traits.trait_types import Int, Date

class A(HasTraits):
    a_date = Date
    an_int = Int

a = A()
class_traits = a.class_traits()
print class_traits["an_int"].is_trait_type(Int)
print class_traits["a_date"].is_trait_type(Date)
The Int behaves as expected, but the Date fails with:
TypeError: isinstance() arg 2 must be a class, type, or tuple of classes and types
We use Enthought traits module (version 4.1.0) under Ubuntu 14.04.
As mentioned in the comments, Date (and Time) trait types are not classes but instances. The is_trait_type(x) method checks whether self.trait_type is an instance of the provided class (i.e. the value of x), and hence it fails if x is not a class. In my opinion, this is a bug in the API.
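You can see the difference directly (a quick check; this assumes traits 4.x, where Date is an instance rather than a class):

import inspect
from traits.trait_types import Int, Date

print(inspect.isclass(Int))   # True: a class, fine as isinstance()'s 2nd argument
print(inspect.isclass(Date))  # False: an instance, so isinstance(x, Date) blows up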
If you need a workaround, you can define a method like this:
from traits.trait_types import BaseInstance

def my_is_trait_type(trait, trait_type):
    if isinstance(trait_type, BaseInstance):
        return trait.is_trait_type(trait_type.__class__)
    else:
        return trait.is_trait_type(trait_type)
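With the MWE above, both checks then behave consistently:

print(my_is_trait_type(class_traits["an_int"], Int))   # True
print(my_is_trait_type(class_traits["a_date"], Date))  # True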
However, I would reconsider using is_trait_type() for the task of finding the appropriate conversion. Maybe a map would do, for instance.
We have SQLite databases in which datetimes are actually stored in Excel format (there is a decent reason for this: it's our system's standard representation of choice, and the SQLite databases may be accessed by multiple languages/systems).
Have been introducing Python into the mix with great success in recent months, and SQLAlchemy is a part of that. The ability of the sqlite3 dbapi layer to swiftly bind custom Python functions where SQLite lacks a given SQL function is particularly appreciated.
I wrote an ExcelDateTime type decorator, and that works fine when retrieving result sets from the sqlite databases; Python gets proper datetimes back.
However, I'm having a real problem binding custom Python functions that expect input params to be Python datetimes; I'd have thought this was what bindparam was for, but I'm obviously missing something, as I cannot get this scenario to work. Unfortunately, modifying the functions to convert from Excel datetimes to Python datetimes is not an option, and neither is changing the representation of the datetimes in the database, as more than one system/language may access it.
The code below is a self-contained example that can be run "as-is", and is representative of the issue. The custom function "get_month" is created, but fails because it receives the raw data, not the type-converted data from the "Born" column. At the end you can see what I've tried so far, and the errors it spits out...
Is what I'm trying to do impossible? Or is there a different way of ensuring the bound function receives the appropriate python type? It's the only problem I've been unable to overcome so far, would be great to find a solution!
import datetime

import sqlalchemy.types as types
from sqlalchemy import create_engine, Table, Column, Integer, String, MetaData
from sqlalchemy.sql.expression import bindparam
from sqlalchemy.sql import select, text
from sqlalchemy.interfaces import PoolListener

# setup type decorator for excel<->python date conversions
class ExcelDateTime(types.TypeDecorator):
    impl = types.FLOAT

    def process_result_value(self, value, dialect):
        lxdays = int(value)
        lxsecs = int(round((value - lxdays) * 86400.0))
        if lxsecs == 86400:
            lxsecs = 0
            lxdays += 1
        return (datetime.datetime.fromordinal(lxdays + 693594)
                + datetime.timedelta(seconds=lxsecs))

    def process_bind_param(self, value, dialect):
        if value < 200000:  # already excel float?
            return value
        # NB: check datetime before date, since datetime is a subclass of date
        elif isinstance(value, datetime.datetime):
            date_part = value.toordinal() - 693594.0
            time_part = ((value.hour * 3600) + (value.minute * 60) + value.second) / 86400.0
            return date_part + time_part  # time part = day fraction
        elif isinstance(value, datetime.date):
            return value.toordinal() - 693594.0

# create sqlite memory db via sqlalchemy
def get_month(dt):
    return dt.month

class ConnectionFactory(PoolListener):
    def connect(self, dbapi_con, con_record):
        dbapi_con.create_function('GET_MONTH', 1, get_month)

eng = create_engine('sqlite:///:memory:', listeners=[ConnectionFactory()])
eng.dialect.dbapi.enable_callback_tracebacks(1)  # show better errors from user functions

meta = MetaData()
birthdays = Table('Birthdays', meta,
                  Column('Name', String, primary_key=True),
                  Column('Born', ExcelDateTime),
                  Column('BirthMonth', Integer))
meta.create_all(eng)

dbconn = eng.connect()
dbconn.execute("INSERT INTO Birthdays VALUES('Jimi Hendrix',15672,NULL)")

# demonstrate the type decorator works and we get proper datetimes out
res = dbconn.execute(select([birthdays]))
tuple(res)
# >>> ((u'Jimi Hendrix', datetime.datetime(1942, 11, 27, 0, 0)),)

# simple attempt (blows up with "AttributeError: 'float' object has no attribute 'month'")
dbconn.execute(text("UPDATE Birthdays SET BirthMonth = GET_MONTH(Born)"))

# more involved attempt (blows up with "InterfaceError: (InterfaceError)
# Error binding parameter 0 - probably unsupported type")
dbconn.execute(text("UPDATE Birthdays SET BirthMonth = GET_MONTH(:Born)",
                    bindparams=[bindparam('Born', ExcelDateTime)],
                    typemap={'Born': ExcelDateTime}),
               Born=birthdays.c.Born)
Many thanks.
Instead of letting Excel/Microsoft dictate how you store date/time, it would be less trouble and work to rely on the standard, "obvious" way of doing things:
Process objects according to the standards of their domain - Python's way (datetime objects) inside Python/SQLAlchemy, SQL's way inside SQLite (a native date/time type instead of a float!).
Use APIs to do the necessary translation between domains. (Python talks to SQLite via SQLAlchemy, Python talks to Excel via xlrd/xlwt, Python talks to other systems; Python is your glue.)
Using standard date/time types in SQLite allows you to write readable SQL, with or without Python involved (WHERE date BETWEEN '2011-11-01' AND '2011-11-02' makes much more sense than WHERE date BETWEEN 48560.9999 AND 48561.00001). It also allows you to easily port to another DBMS (without rewriting all those ad-hoc functions) when your application/database needs to grow.
Using native datetime objects in Python allows you to use a lot of freely available, well tested, and non-EEE (embrace, extend, extinguish) APIs. SQLAlchemy is one of those.
And I hope you are aware of that slight but dangerous difference between Excel datetime floats on Mac and on Windows. Who knows whether one of your clients will someday submit an Excel file from a Mac and crash your application (or, what's worse, quietly earn a million dollars from the error)?
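To make that difference concrete (a sketch using xlrd's xldate helpers; 15672 is the serial from the question's example data):

from xlrd.xldate import xldate_as_tuple

# the same float, interpreted under the two workbook date modes
print(xldate_as_tuple(15672, 0))  # 1900-based (Windows): (1942, 11, 27, 0, 0, 0)
print(xldate_as_tuple(15672, 1))  # 1904-based (Mac): (1946, 11, 28, 0, 0, 0)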
So my suggestion is to use xlrd/xlwt when dealing with Excel from Python (there's another package out there for reading Excel 2007 and up), and to let SQLAlchemy and your database use standard datetime types. However, if you insist on continuing to store datetimes as Excel floats, it could save you a lot of time to reuse code from xlrd/xlwt: it has functions for converting Python objects to Excel data and vice versa.
EDIT: for clarity...
You have no issues reading from the database into Python because you have that class that converts the float into a Python datetime.
You have issues writing to the database through SQLAlchemy, or using other native Python functions/modules/extensions, because you are trying to force a non-standard type where they expect the standard Python datetime. The ExcelDateTime type, from the point of view of Python, is a float, not a datetime.
Although Python uses dynamic/duck typing, it is still strongly typed. It won't allow you to do "nonsense/silliness" like adding integers to strings, or forcing a float to act as a datetime.
At least two ways to address that:
Declare a custom type - Seems to be the path you wanted to take. Unfortunately, this is the hard way. It's quite difficult to create a type that is a float but can also pretend to be a datetime. Possible, yes, but it requires a lot of study of type instrumentation. Sorry, you'll have to grok the documentation for that on your own.
Create utility functions - This should be the easier way, IMHO. You need two functions: a) float_to_datetime() for converting data from the database into a Python datetime, and b) datetime_to_float() for converting a Python datetime to an Excel float.
About solution #2: as I was saying, you could simplify your life by reusing xldate_from_datetime_tuple() from xlrd/xlwt. That function converts "a datetime tuple (year, month, day, hour, minute, second) to an Excel date value". Install xlrd, then go to /path_to_python/lib/site-packages/xlrd; the function is in xldate.py, and the source is well documented.
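A minimal sketch of those two helpers, assuming the 1900-based (Windows) date mode that the question's data uses:

import datetime
from xlrd.xldate import xldate_as_tuple, xldate_from_datetime_tuple

DATEMODE = 0  # 0 = 1900-based (Windows), 1 = 1904-based (Mac)

def float_to_datetime(value):
    # Excel serial float -> Python datetime
    return datetime.datetime(*xldate_as_tuple(value, DATEMODE))

def datetime_to_float(dt):
    # Python datetime -> Excel serial float
    return xldate_from_datetime_tuple(
        (dt.year, dt.month, dt.day, dt.hour, dt.minute, dt.second), DATEMODE)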