Note that the builtin julianday() is probably far more efficient. Internally, SQLite stores a computed datetime as a 64-bit offset in milliseconds from the Julian epoch. julianday() takes the same arguments as the rest of the datetime functions, uses them to initialize that internal offset, and its return value is merely that offset "converted" into a floating-point number of days since the epoch.

strftime('%J', ...) is effectively syntactic sugar for printf('%.16g', julianday(...)). Both take the same arguments, but strftime() returns a character string produced by converting the julianday double to text, whereas julianday() returns the double directly. So if you are going to be doing further calculations, there is no point in converting a double to text and then back to a double again.

Since the internal datetime is stored in milliseconds since the Julian epoch, the result of julianday() should be precise to the millisecond right up until the year-10K problem hits.
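A quick sketch of the relationship described above, using Python's stdlib sqlite3 module (any reasonably recent SQLite should behave the same; the exact text rendering of '%J' may vary slightly between versions):

```python
import sqlite3

con = sqlite3.connect(":memory:")

# JD 2451545.0 is the well-known epoch J2000.0: noon on 2000-01-01.
jd, txt = con.execute(
    "SELECT julianday('2000-01-01 12:00:00'),"
    "       strftime('%J', '2000-01-01 12:00:00')"
).fetchone()

print(type(jd), jd)    # julianday() returns the double directly -> float 2451545.0
print(type(txt), txt)  # strftime('%J') returns the same value rendered as text

# The text form is just the double converted to a string, so it round-trips:
assert jd == 2451545.0
assert float(txt) == jd
```

The assertions illustrate the point in the text: if you need the number for further arithmetic, call julianday() and skip the double-to-text-to-double round trip that strftime('%J') would force on you.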