This is somewhat fascinating, because according to my calculations the number of milliseconds since the Julian epoch (the iJD value) should be 464269060799999 for the greatest round-trippable datetime ('9999-12-31 23:59:59.999'). This value can be exactly represented in a double-precision floating-point value (it requires 49 of the 53 available significand bits), so all lesser values are also exactly representable. Similarly, the divisor 86400000 used to convert from milliseconds since the epoch to days since the epoch can be exactly represented in a double-precision value. Dividing the former by the latter should therefore yield a correctly rounded result (error of at most 0.5 ULP) if the IEEE-754 computation rules are being followed. Since this appears not to be the case, I can only conclude that the arithmetic (the division) is somehow defectively implemented (apparently by the GCC 5.2.0 compiler being used). The JulianDay value at the top of the round-trippable range has a machine precision of about 0.08047 milliseconds, well within the millisecond accuracy of the source iJD.