SQLite Forum

"Office Space"... I'm losing all those pennies
> Why would you expect a database told to use floating-point numbers to behave any differently than a C program using floating-point numbers?

OK, but shouldn't other RDBMSs behave the same way? Yet in MariaDB 10.5.8, I get:

`SELECT SUM(CAST('16.15' AS FLOAT))*100; // 1614.9999618530273`

`SELECT CAST(SUM(CAST('16.15' AS FLOAT))*100 AS INT); // 1615`

`SELECT SUM(CAST(CAST('16.15' AS FLOAT)*100 AS INT)); // 1615`

I clearly understand the pros and cons of floating-point arithmetic. But what puzzles me every time is that different software returns different results for the same expression. My confidence in floating-point arithmetic would be much greater if it were consistent everywhere.