There is one place in computers where bit order is physically real, and that is when data is represented on serial lines, where the bits of a byte actually go out one at a time.
I know I've seen material where a CPU numbered bits from 1 to 32, with 1 being most significant, but I can't remember the manufacturer.
Let us not forget the PDP-11, whose 32-bit format is middle-endian: little-endian within each 16-bit word, but with the more significant word stored first.
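A quick sketch of that layout, assuming the usual description of the PDP-11 32-bit format (high word first, bytes within each word little-endian) and using Python's `struct` module just to make the byte order visible:

```python
import struct

# Hypothetical demonstration of "PDP-endian" (middle-endian) layout
# for the 32-bit value 0x0A0B0C0D.
value = 0x0A0B0C0D
hi, lo = value >> 16, value & 0xFFFF

# High-order 16-bit word first, each word packed little-endian.
pdp = struct.pack("<H", hi) + struct.pack("<H", lo)
print(pdp.hex())  # -> 0b0a0d0c
```

Neither plain big-endian (`0a0b0c0d`) nor plain little-endian (`0d0c0b0a`), hence the jokes about "NUXI".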
Now, why is bit 0 the least significant? I think this follows naturally from viewing a binary number as a polynomial in powers of two: bit 0 has weight 2^0, bit 1 has weight 2^1, and so on. Writing the digits out simply follows the standard order of numerals, most significant first: 100 is one hundred (or four, in binary), not "zero zero one", whether in decimal or binary.
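The polynomial view above can be shown in a couple of lines (a minimal sketch, with the value 13 picked arbitrarily):

```python
# Bit i carries weight 2**i, so numbering from 0 at the least
# significant end matches the polynomial-in-powers-of-two view.
n = 0b1101  # 13 = 2**3 + 2**2 + 2**0
bits = [(n >> i) & 1 for i in range(4)]  # bit 0 listed first
print(bits)  # -> [1, 0, 1, 1]

# Reassembling from the bit numbers recovers the value.
assert sum(b << i for i, b in enumerate(bits)) == 13
```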
The issue of big-endian vs. little-endian really arises when you store a multi-byte word across bytes of memory. Different designers made different choices, and both are equally valid.
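The two choices side by side, using Python's `struct` module to show the byte layout of the same 32-bit word:

```python
import struct

value = 0x0A0B0C0D
print(struct.pack(">I", value).hex())  # big-endian    -> 0a0b0c0d
print(struct.pack("<I", value).hex())  # little-endian -> 0d0c0b0a
```

Both round-trip back to the same number; the disagreement is purely about which byte goes at the lower address.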
Now, over a serial line you likewise get to choose the order in which the bits are transmitted, and interfaces genuinely differ: some send LSB (least significant bit) first, others MSB (most significant bit) first.
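A sketch of the same byte going out in either bit order (for what it's worth, UARTs conventionally ship LSB first, while protocols like SPI and I2C default to MSB first):

```python
def bits_lsb_first(byte):
    # Bit 0 goes on the wire first.
    return [(byte >> i) & 1 for i in range(8)]

def bits_msb_first(byte):
    # Bit 7 goes on the wire first.
    return [(byte >> i) & 1 for i in range(7, -1, -1)]

b = 0b1100_0001
print(bits_lsb_first(b))  # -> [1, 0, 0, 0, 0, 0, 1, 1]
print(bits_msb_first(b))  # -> [1, 1, 0, 0, 0, 0, 0, 1]
```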
Want to compare two numbers? With the most significant bit first, you can short-circuit (hah) as soon as you hit the first differing bit.
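One consequence of that, sketched below with arbitrary example values: for equal-length big-endian byte strings, plain lexicographic comparison (which stops at the first differing byte) agrees with numeric comparison, while little-endian order does not have this property.

```python
a, b = 0x0102FFFF, 0x01030000  # a < b numerically

ab, bb = a.to_bytes(4, "big"), b.to_bytes(4, "big")
print((ab < bb) == (a < b))  # -> True: byte-wise order matches numeric order

al, bl = a.to_bytes(4, "little"), b.to_bytes(4, "little")
print((al < bl) == (a < b))  # -> False: little-endian bytes sort differently
```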