I recently joined a team that uses a different heuristic than I prefer, and frankly, until now I assumed my approach was pretty universal. Since I'm joining "their" code base I'm adapting to their style, but I wanted an outside perspective on which approach is more common. The way I see it, there are five main approaches:
Const literally never: 'const' is not allowed; all data should be mutable, and the responsibility for mutating a value belongs to the object/class it's declared in, which should have full power to do what it wants within itself. Don't expose set() functions if you're worried.
Const nearly never: Only use const for real constants, in the mathematical sense: pi, a domain name, etc. If a different client or instance could see a different value, it's not a constant.
Const for things that locally never change: Things like feature flags, usernames, etc. Stuff where, if the value changes outside very explicit situations, something has almost certainly gone wrong.
Const whenever possible: Use const at all times unless you explicitly need to reassign the value, such as in an accumulator. If you write a function and never need to reassign a value for your use case, make it const, ignoring all other aspects. If you return to that function a month from now and now need to reassign it, change it to 'let' and make your changes.
Const literally always: 'let' is not allowed. If you feel you need 'let', you either actually need several separate 'const' variables or should make better use of native functions, such as reduce() for accumulators (see the sketch after this list).
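For concreteness, here's what the accumulator case looks like in the last two camps. This is just a minimal sketch with made-up function names, not anyone's actual codebase:

```js
// Camps 4/5: 'let' is fine when you genuinely reassign.
function totalWithLet(prices) {
  let total = 0; // reassigned in the loop, so 'let' is the honest choice
  for (const price of prices) {
    total += price;
  }
  return total;
}

// Camp 6 ("const literally always"): the same accumulator via reduce(),
// so every binding can stay const.
function totalWithReduce(prices) {
  return prices.reduce((sum, price) => sum + price, 0);
}

console.log(totalWithLet([1, 2, 3]));    // 6
console.log(totalWithReduce([1, 2, 3])); // 6
```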
Lots of data is not intended to be mutable, even when it lives within an object, and there are business reasons for that immutability. I find the restrictions above ideologically purist rather than practical, like the devs who insist that private/internal methods should not have unit tests.
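To illustrate what I mean by data inside an object that isn't meant to be mutable from outside (and the "don't expose set() functions" point from camp 1), here's a minimal sketch; the class and field names are hypothetical:

```js
// The class owns its own mutation: no setter is exposed, and the
// snapshot it hands out is frozen so callers can't mutate it either.
class FeatureFlags {
  #flags = { darkMode: false };

  enableDarkMode() {
    this.#flags.darkMode = true; // mutation stays inside the class
  }

  get snapshot() {
    return Object.freeze({ ...this.#flags }); // read-only view for callers
  }
}

const flags = new FeatureFlags();
flags.enableDarkMode();
console.log(flags.snapshot.darkMode); // true
```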
"Const literally always" means reduced readability and sometimes a lot of extra code.
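As an example of that extra code, compare a plain conditional reassignment with the contortions needed to keep every binding const (the greeting logic here is made up):

```js
const hour = new Date().getHours();

// With 'let': plain, readable branching.
let greeting;
if (hour < 12) {
  greeting = "Good morning";
} else if (hour < 18) {
  greeting = "Good afternoon";
} else {
  greeting = "Good evening";
}

// 'const' only, option 1: a nested ternary.
const greeting2 = hour < 12 ? "Good morning"
                : hour < 18 ? "Good afternoon"
                : "Good evening";

// 'const' only, option 2: an immediately-invoked function,
// written purely to get a const binding.
const greeting3 = (() => {
  if (hour < 12) return "Good morning";
  if (hour < 18) return "Good afternoon";
  return "Good evening";
})();

console.log(greeting, greeting2, greeting3);
```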