But that also makes me wonder: how exactly do large binary files slow git down?
If the one commit that adds the binary files is slow, that's self-evident.
If cloning is slow, that's self-evident too: there's more data to download.
But it seems that once the big files are in, even if they never change again, the whole repository slows down significantly on almost every operation. A simple `git status` takes several seconds to complete. Neither `git gc` nor `git repack` could salvage that. In the end, I had to migrate them all to LFS, which removed the binary files from the entire history.
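As an aside, before deciding what to offload to LFS, one way to see which blobs actually dominate the history is a pipeline like the following (a sketch using only stock git plumbing; the `head -10` cutoff is arbitrary):

```shell
# List every object reachable from any ref, ask git for each object's type
# and size, keep only blobs, and print the ten largest with their paths.
git rev-list --objects --all \
  | git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' \
  | awk '$1 == "blob" {print $3, $4}' \
  | sort -rn \
  | head -10
```

Run it from inside the repository; the output is `size path`, largest first, which makes good candidates for an LFS migration obvious.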
So, why do large binary files affect git so much, even when they remain unchanged after being added?
[1] - https://stackoverflow.com/questions/3055506/git-is-very-very...