Thanks, Richard - good point! My code now treats the timestamp as a 64-bit integer, multiplies it by 1,000,000, and adds the microseconds.