> and will give a "table full" error when the 64-bit signed integer overflows

Given that `(2^63)/60/60/24/365/1e6 = 292471.2`, it seems to me one would need to sustain roughly 292K row mutations per **microsecond** for a full year to overflow that signed 64-bit integer. Can we all agree that's not a problem in practice?
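For anyone who wants to sanity-check the back-of-envelope number, here is a small Python sketch of the same arithmetic (using 365-day years and ignoring the `-1` in `2^63 - 1`, which doesn't change the result at this scale):

```python
# How many mutations per microsecond, sustained for a full year,
# would it take to overflow a signed 64-bit counter?
max_i64 = 2**63 - 1                          # 9,223,372,036,854,775,807
seconds_per_year = 60 * 60 * 24 * 365        # 31,536,000
microseconds_per_year = seconds_per_year * 10**6

rate = max_i64 / microseconds_per_year
print(f"{rate:,.1f} mutations per microsecond")  # ~292,471.2
```

So the counter only overflows at roughly 292 billion mutations per second held for an entire year, many orders of magnitude beyond any real workload.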