Say you invent a new algorithm and the required qubit count drops from N to N-3 [1]. By the 2^N rule (classical simulation cost scales as 2^N in the number of qubits), that is equivalent to running 2^3 = 8 times faster: O(2^N) becomes O(0.125 * 2^N). This is the kind of result the field treats as a major breakthrough.
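To make the arithmetic concrete, here is a minimal sketch, assuming brute-force statevector simulation (cost ~ 2^N) as the cost model and a made-up problem size N:

    # Assumption: classical cost of simulating an N-qubit state scales as 2**N
    # (brute-force statevector model).
    def sim_cost(n_qubits: int) -> int:
        return 2 ** n_qubits

    N = 30                                   # hypothetical problem size
    speedup = sim_cost(N) / sim_cost(N - 3)  # same problem, 3 fewer qubits
    print(speedup)                           # 8.0 -> saving 3 qubits == 8x faster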
Often I reimplement someone else's code and get a 20x~50x speedup. By the same rule, that speedup is equivalent to lowering the qubit requirement by roughly 5.
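Converting a raw speedup back into "qubits saved" is just a base-2 logarithm; a quick check of the 20x~50x range (illustrative numbers only):

    import math

    for speedup in (20, 32, 50):
        qubits_saved = math.log2(speedup)    # 2**qubits_saved == speedup
        print(f"{speedup}x ~ {qubits_saved:.1f} qubits")
    # 20x ~ 4.3, 32x ~ 5.0, 50x ~ 5.6 qubits' worth of speedup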
But complexity theory tells us this speedup doesn't matter, since constant factors are ignored.
Still, it is just a constant.

[1] "Tapering off qubits to simulate fermionic Hamiltonians", IBM Research.