I have a Google App Engine datetime property which I populate with x.date = datetime.datetime.now(). I do a lot of comparisons between dates, and after much debugging, it turns out my client device sends dates out with less precision than a Python date, which caused a terrible mess.
Here is what Python generates:
2012-08-28 21:36:13.158497 with datetime.datetime.now(), but what I want is 2012-08-28 21:36:13.158000 (notice the three zeros at the end.)
How can I achieve this? (keep in mind, I'm not trying to format strings or anything. I want to format a date object.)
I guess one way would be to format it into a string with desired precision, like this:
dateString = date.strftime('%Y-%m-%d %H:%M:%S.%f')[:-3]
and then back to a date object. But there's got to be a better way.
dt = dt.replace(microsecond=(dt.microsecond // 1000) * 1000)
This truncates the last three digits (note the integer division //, which works on both Python 2 and 3). Proper rounding is a little more complicated, because rounding up could produce 1000000 microseconds, which is not a valid microsecond value.
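Building on the truncation above, here is a sketch of proper rounding that handles the overflow case by adding the correction as a timedelta (round_to_millis is a helper name I've made up):

```python
from datetime import datetime, timedelta

def round_to_millis(dt):
    """Round a datetime to the nearest millisecond.

    Adding the correction as a timedelta lets any overflow
    (1000000 microseconds) carry into the seconds field
    automatically, instead of raising a ValueError.
    """
    micros = dt.microsecond
    rounded = round(micros / 1000) * 1000  # nearest multiple of 1000
    return dt + timedelta(microseconds=rounded - micros)

dt = datetime(2012, 8, 28, 21, 36, 13, 158497)
print(round_to_millis(dt))  # 2012-08-28 21:36:13.158000
```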
Related
I have a column of float values which are tweet creation dates. This is the code I used to convert them from float to datetime:
t = 1508054212.0
datetime.utcfromtimestamp(t).strftime('%Y-%m-%d %H:%M:%S')
All the values returned belong to October 2017. However, the data is supposed to have been collected over multiple months, so the dates should differ in month, not just in hours, minutes, and seconds.
These are some values which I need to convert:
1508054212.0
1508038548.0
1506890436.0
Could you suggest an alternative approach to determine the dates? Thank you.
I assumed df['tweet_creation'].loc[1] returns a number like the examples you gave, and f below stands for that value as a float.
My answer is inspired by this other answer: Converting unix timestamp string to readable date. You have a UNIX timestamp, so the easiest way is to use it as a number rather than converting it to a string.
from datetime import datetime, timedelta
f = df['tweet_creation'].loc[1]  # e.g. 1508054212.0
dtobj = datetime.utcfromtimestamp(int(f)) + timedelta(seconds=f - int(f))  # the fractional part is fractional seconds
To have the string representation you can use the function strftime.
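If the whole column needs converting, pandas can also do it in one vectorized call. A sketch, assuming tweet_creation holds seconds-since-epoch floats (the frame below is made up from the values in the question):

```python
import pandas as pd

# Hypothetical frame mirroring the sample values from the question.
df = pd.DataFrame({"tweet_creation": [1508054212.0, 1508038548.0, 1506890436.0]})

# unit='s' tells pandas the numbers are seconds since the Unix epoch.
df["tweet_date"] = pd.to_datetime(df["tweet_creation"], unit="s")
print(df["tweet_date"])  # 2017-10-15, 2017-10-15, 2017-10-01 (UTC)
```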
I need to output a timestamp for a .csv file of the current time in milliseconds. Right now I have:
localTime = time.localtime(time.time())
now = time.localtime(time.time())
currTime = time.time()
now = time.strftime("\"%Y-%m-%d %H:%M:%S.%f\"", time.localtime(currTime))
Doing it this way outputs the timestamp in the following format:
"2017-05-09 10:13:33.%f", which is obviously not correct. I've heard that time.time is only precise to a second, but have also heard that it can support microseconds. Can somebody clear this up for me, or show me the proper way to format this code to get a timestamp in the needed format? (2017-05-09 10:13:33.100, for example)
A quick solution would be:
t=time.time()
millis = int((t - int(t))*1000)
As you said, part of the problem is precision: time.strftime formats a struct_time, which carries no sub-second information, so %f is not substituted, and time.time() doesn't necessarily give you the precision you want anyway[1]. datetime would be a better option:
from datetime import datetime
now = datetime.utcnow() # or datetime.now(your_timezone)
formatted = now.strftime("%Y-%m-%d %H:%M:%S.%f")
print(formatted)
[1] Both in Python 2.x and 3.x, according to the docs:
Note that even though the time is always returned as a floating point number, not all systems provide time with a better precision than 1 second. While this function normally returns non-decreasing values, it can return a lower value than a previous call if the system clock has been set back between the two calls.
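To get exactly the millisecond format the question asks for (e.g. 2017-05-09 10:13:33.100), the six-digit %f field can be trimmed to three digits. A minimal sketch:

```python
from datetime import datetime

now = datetime.utcnow()  # or datetime.now(your_timezone)
# %f yields six digits (microseconds); slicing off the last
# three characters leaves milliseconds.
formatted = now.strftime("%Y-%m-%d %H:%M:%S.%f")[:-3]
print(formatted)  # e.g. 2017-05-09 10:13:33.100
```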
This question already has answers here: Convert weird Python date format to readable date (2 answers). Closed 7 years ago.
I'm importing data from an Excel spreadsheet into Python. My dates are coming through in a bizarre format that I don't recognize and cannot parse.
in excel: (7/31/2015)
42216
after I import it:
u'/Date(1438318800000-0500)/'
Two questions:
what format is this and how might I parse it into something more intuitive and easier to read?
is there a robust, swiss-army-knife-esque way to convert dates without specifying input format?
Timezones necessarily make this more complex, so let's ignore them...
As #SteJ remarked, what you get is (close to) the time in seconds since 1st January 1970. Here's a Wikipedia article how that's normally used. Oddly, the string you get seems to have a timezone (-0500, EST in North America) attached. Makes no sense if it's properly UNIX time (which is always in UTC), but we'll pass on that...
Assuming you can get it reduced to a number (sans timezone), the conversion into something sensible in Python is really straightforward (note the reduction in precision: your original number is the number of milliseconds since the epoch, rather than the standard number of seconds since the epoch):
from datetime import datetime
time_stamp = 1438318800
time_stamp_dt = datetime.fromtimestamp(time_stamp)
You can then get time_stamp_dt into any format you think best using strftime, e.g., time_stamp_dt.strftime('%m/%d/%Y'), which pretty much gives you what you started with.
Now, assuming that the format of the string you provided is fairly regular, we can extract the relevant time quite simply like this:
s = '/Date(1438318800000-0500)/'
time_stamp = int(s[6:16])
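A slightly more robust sketch using a regular expression, so the slice indices don't break if the digit count or the offset changes (parse_dotnet_date is a helper name of my own; the offset is simply discarded here):

```python
import re
from datetime import datetime

def parse_dotnet_date(s):
    """Parse strings like '/Date(1438318800000-0500)/' into a datetime.

    The milliseconds are interpreted as UTC; the trailing UTC
    offset, if present, is matched but ignored.
    """
    match = re.search(r"/Date\((\d+)([+-]\d{4})?\)/", s)
    if not match:
        raise ValueError("not a /Date(...)/ string: %r" % s)
    millis = int(match.group(1))
    return datetime.utcfromtimestamp(millis / 1000.0)

print(parse_dotnet_date(u'/Date(1438318800000-0500)/'))  # 2015-07-31 05:00:00
```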
I have imported some timestamps into a Pandas frame (via a MongoDB client). They are precise to the microsecond. I'd like a way to round them to the day. I've seen a previous question about using np.round while converting to ints and back again, but this doesn't work (I tried inlining a divide by 3600 x 24 x 100000, but that didn't work either).
I have this rather plain version, but it seems REALLY inefficient. What am I missing in either to_datetime or the np.round example?
df['doa'] = df['doa'].map(lambda x: x.strftime("%Y-%m-%d"))
pd.to_datetime(df['doa'])
Note, these are not INDEXES so I can't use the frequency trick.
There's this feature request, which suggests there's no good way:
ENH: add rounding method to DatetimeIndex/TimedeltaIndex
However, that thread also shows this approach for minutes, which I might be able to modify:
pd.DatetimeIndex(((dti.asi8/(1e9*60)).round()*1e9*60).astype(np.int64))
Rounding Pandas Timestamp to minutes
I'd like to convert a unix timestamp I have in a string (e.g. 1277722499.82) into a more humanized format (hh:mm:ss or similar). Is there an easy way to do this in Python for a Django app? This is outside of a template, in the model, where I would like to do this. Thanks.
edit
I'm using the python function time.time() to generate the timestamp. According to the doc:
time.time()
Return the time as a floating point number expressed in seconds since the epoch, in UTC. Note that even though the time is always returned as a floating point number, not all systems provide time with a better precision than 1 second. While this function normally returns non-decreasing values, it can return a lower value than a previous call if the system clock has been set back between the two calls.
import datetime
datestring = "1277722499.82"
dt = datetime.datetime.fromtimestamp(float(datestring))
print(dt)
2010-06-28 11:54:59.820000
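To get just the hh:mm:ss form the question asks for, format the resulting datetime with strftime. A small sketch, using utcfromtimestamp so the result doesn't depend on the machine's local timezone (fromtimestamp, as above, would give local time instead):

```python
from datetime import datetime

datestring = "1277722499.82"
dt = datetime.utcfromtimestamp(float(datestring))
# Keep only the time-of-day portion, as asked.
print(dt.strftime("%H:%M:%S"))  # 10:54:59 (this timestamp, in UTC)
```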