SQLite Forum

Endianness help
I said the bit numbering convention **is** *pretty near universal* because: (1) There is no law of physics making it so; and (2) There was a time, in living memory for some (such as you and me), when that convention had not been settled upon. Fortunately, it has been settled since before systems in (common) use today were devised and documented.

The now-settled convention makes the number with which we label a bit the same as the base-2 logarithm of its weight when treated as part of an integer. This is much easier to remember than "a bit's number is the word size minus log2(its weight) minus 1" (e.g., the Xerox Sigma 7). This convention is less crazy-making too, a feature I believe led to its ultimate prevalence.
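To make the two labelings concrete, here is a minimal C sketch (my own illustration; the sample value is arbitrary). It prints each bit of a byte under the modern convention, where bit n has weight 2^n; the closing comment notes the label the old MSB-first scheme would have assigned to the same bit.

```c
#include <stdio.h>

int main(void) {
    unsigned v = 0x29;                /* sample byte: binary 00101001 */

    /* Modern convention: bit n has weight 2^n, so n == log2(weight). */
    for (int n = 0; n < 8; n++) {
        unsigned weight = 1u << n;    /* the weight implied by label n */
        printf("bit %d: weight %3u, set? %u\n", n, weight, (v >> n) & 1u);
    }

    /* An old MSB-first scheme (word size w, here 8) would instead label
       that same bit as w - n - 1, i.e. w - log2(weight) - 1. */
    return 0;
}
```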

It might be noticed that the OP mentioned "bit address", which could be (and sometimes should be) independent of the bit labeling scheme. The ordering of bits written in text could be yet another independent variable, varying among people. (I understand that the left-to-right <=> earlier-to-later mapping is not universal.)

Fortunately, today, regardless of what people choose to call the bits, or how they elect to depict them, when a byte whose value is treated as N is written to a portable storage medium or communication channel and read back from that into another competent [a] machine as a byte, that byte will again have the value N. For example, I wrote this post on a machine whose least significant bit is called 'Chester' (a purely local and transient convention), yet when displayed on other machines, the glyphs I saw while writing it likely resemble the ones seen by those who read this far. That is true for the same reason this thread is much ado about nothing.
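A minimal C sketch of that round-trip claim, assuming an ordinary file stands in for the portable medium (the file name and byte value are my own placeholders):

```c
#include <assert.h>
#include <stdio.h>

int main(void) {
    unsigned char n = 0xA5;   /* a byte whose value we treat as N */
    unsigned char back = 0;

    FILE *f = fopen("byte.bin", "wb+");   /* stand-in for any portable medium */
    if (!f) return 1;
    if (fwrite(&n, 1, 1, f) != 1) return 1;   /* write the byte out */
    rewind(f);
    if (fread(&back, 1, 1, f) != 1) return 1; /* read it back as a byte */
    fclose(f);
    remove("byte.bin");

    assert(back == n);   /* still N, whatever anyone calls its bits */
    return 0;
}
```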

[a. By "competent", I mean outside of the scorn-worthy set.]