HACKER Q&A
📣 powersnail

Why do large binary files make Git slower?


I've made the mistake of committing a few hundred MBs of binary files to a Git repository. I should've known better; many sources had repeatedly warned me that it's unwise to do so.

But that also makes me wonder: how exactly do large binary files slow Git down?

If the one commit that adds the binary files is slow, that's self-evident.

If cloning is slow, that's self-evident: there's more data to download.

But it seems that once the big files are in, even if they never change again, the whole repository slows down significantly on almost every operation. A simple `git status` takes several seconds to complete. Neither `gc` nor `repack` could salvage that. In the end, I had to move them all to LFS, which removed the binary files from the entire history.
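For anyone in a similar situation, this is roughly the diagnosis-and-migration sequence I mean. The commands are standard Git and Git LFS; the `*.bin` pattern is just an example placeholder, not the actual file type involved:

```shell
# How much object data is the repository carrying?
git count-objects -vH

# List the largest objects in history, biggest first
# (sort on the size column emitted by cat-file):
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectsize) %(objectname) %(rest)' |
  sort -k2 -n -r | head -10

# Rewrite ALL history so matching files live in LFS instead
# (destructive: changes every commit hash; requires git-lfs):
git lfs migrate import --include="*.bin" --everything
```

Note that `git lfs migrate import --everything` rewrites every branch, so collaborators have to re-clone or hard-reset afterwards.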

So, why do large binary files affect Git so much, even when they remain unchanged after being added?


  👤 LinuxBender Accepted Answer ✓
Here [1] is some discussion of this question on Stack Overflow.

[1] - https://stackoverflow.com/questions/3055506/git-is-very-very...