HACKER Q&A
📣 trealira

How did Git become the standard when Windows is the majority OS?


I'm young enough that Git became the standard before I was even into programming, so I didn't witness this happen in real time.

On Windows, Git requires MSYS2, Bash, and some other tools from Linux to run, although the installer is nice enough that it's not too much of a hassle. However, from what I've read secondhand, it wasn't always nice.

According to Statista, 61% of software developers used Windows in 2022, and the figures were similar in previous years; however, it seems some respondents were dual-booting (Windows + macOS + Linux/other Unix adds up to more than 100%) [1].

Given that, why didn't Windows users establish something like Mercurial as the standard, given their majority/plurality status? Or did something like that happen?

[1]: https://www.statista.com/statistics/869211/worldwide-software-development-operating-system/


  👤 wg0 Accepted Answer ✓
Windows used to have a version control system called Visual SourceSafe.

You'd have to select the files you wanted to modify and check them out. Then they were locked; no one else on the team could touch them.

God forbid you forgot to check them in at the end of the day and then were out of the office for a week: those files stayed locked and didn't even have your changes.

And that's before we even get to branching.

Better than that was SVN, which was hardly less horrible.

So the net result was that, around the 2000s, there was pretty much no concept or culture of version control among Windows developers. It was difficult and cumbersome to do.

Then came Git: offline, distributed, instant branching without creating a whole copy of the project, and what not. It came from Linux, but its ease of use and feature set made it the de facto standard in the Linux world.

But then came GitHub, and that's when, I guess, most Windows developers got to know about it.

True story: I joined a team in the late 2000s and their workflow was to literally FTP the code to production daily. They were all on Windows and didn't know any better.


👤 jval43
Short answer: It was better than everything else, and the competition really sucked.

Long answer: A big factor was that new devs already knew Git because it had taken off spectacularly in the open source world. In only a few years SVN was legacy and everyone had switched to Git (and GitHub) for open source.

Windows shops that had been using SVN had a very natural transition to Git, with lots of tooling and experience available.

But there were indeed a lot of Windows shops that used other, mostly commercial and centralized, version control systems before. While they might have had some advantages over CVS and SVN, or for enterprise use, those advantages disappeared compared to the new distributed version control systems like Mercurial and Git.

Even though Git is not exactly easy to learn, it was still vastly better than the old version control systems. In many ways the old systems just really sucked in comparison.

As for Git Bash: You could always install Git through Cygwin, but I don't remember Windows people using the command line much. There were many Git GUI applications for Windows, and IDEs provided integrations as well. E.g., Eclipse came with a completely command-line-free Git client and integrated UI that worked flawlessly on Windows.

As for Mercurial vs Git: it wasn't completely clear from the start that Git would "win", as they are very similar. I think network effects, Github, and the Linux kernel halo effect are mostly responsible for the difference in use.


👤 seabass-labrax
I would say, as an entirely personal impression, that most Windows application developers didn't use version control systems at all throughout the 2000s.

By being the dominant operating system, Windows would have had a wider distribution of developer skill levels than Linux at the time: both the really unskilled ones and the extremely highly skilled ones, plus plenty in between. The upper end of the competence distribution would be able to install Git even on Windows, and I believe that the majority of those who did use VCS would use a proprietary, Windows-only tool such as Team Foundation Server, especially in a corporate context.

But why didn't Team Foundation Server and its ilk completely take over? I would say that the lack of FOSS licensing was a very big part of it, as well as the fact that Linux and almost all of its distributions were using Git, making it effectively mandatory for participating in the FOSS world.

By the time the Git for Windows installer had progressed enough to be really easy to use in the 2010s, the concept of open source software had become so well understood even in Windows-land that everyone just gave it a go, with GitHub being an especially important factor in that. It's perhaps a bit like how people were installing VirtualBox on Windows just to get a LAMP stack, as that's what all the books and blogs said you should use.

All my own personal perspective, but much as we sometimes lament, I truly believe that the average quality of build tooling in FOSS has always been massively better than that in the proprietary software world through the 2000s and 2010s.


👤 genezeta
> On Windows, Git requires MSYS2, Bash, and some other tools from Linux to run

It is not required. It is, sure, the strongly recommended way, but it's not required. You can use it from the Windows command line without MSYS or Bash, and I've seen teams do so without much of a problem.
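
For example, an everyday workflow runs fine from plain cmd.exe or PowerShell with nothing but git.exe on the PATH (the repository URL and branch name below are made up purely for illustration):

    git clone https://example.com/team/app.git
    cd app
    git checkout -b fix-login-redirect
    git add .
    git commit -m "Fix login redirect"
    git push -u origin fix-login-redirect

No Bash involved at any point; the same commands work from either shell.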

Not only that, but a lot of developers work within their IDEs, which would often have a plug-in that hides away many of the details of whatever VCS they'd be using. Plenty of developers just learn the basic menu options in their IDE, many without even knowing or caring about advanced features, how it worked, or even what tool was behind it: SCCS, CVS, Subversion, Mercurial, Git, Perforce, SourceSafe, Plastic, StarTeam, or whatever.

Having said that, of course, some tools had their particular moments of popularity. CVS was widely popular. Shit, but widely popular because VSS was even worse. SVN replaced CVS to a large extent and for a while it was quite common.


👤 jwells89
I may be totally off base, but my impression is that from around 2000-2003 onward, developers using OS X and Linux became disproportionately influential and trendsetters of sorts.

Actually I think the popularization of Macs in the dev world may have played a big part. Linux devs were using git earlier, but especially from the mid-late 2000s onwards Macs started becoming mainstays in tech hub city startups and small companies, and with git working just as well on OS X as it does on Linux it quickly became the standard for that group of devs, which then spread to the larger dev sphere.

I remember back in the heyday of Rails, some time around 2012, when Windows users would post in community forums about having trouble getting Ruby, git, etc. set up, they'd promptly be told to either go install Linux or buy a Mac. In several dev circles around that time, Windows users were oddballs.


👤 NonEUCitizen
TortoiseSVN is a Subversion client that integrates with Windows Explorer (SVN commands show up in the right-click menu). Version 1.14.5 was released in September 2022, so some Windows users still use Subversion.

https://tortoisesvn.net/


👤 physPop
You don't need MSYS2 and Bash. You can just install it with Chocolatey, and the PowerShell integration is great.
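
For reference, a minimal setup sketch; I'm guessing the "PowerShell integration" meant here is posh-git, so take that part as an assumption:

    choco install git
    Install-Module posh-git -Scope CurrentUser
    Import-Module posh-git

After that, git works from any PowerShell prompt, and posh-git adds branch and status info to the prompt plus tab completion for git commands.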

👤 wilsonnb3
The Windows dev culture is way less command-line-centric than *nix; most people use either their IDE or a GUI tool.

Visual Studio has supported Git since 2012ish, I think. TortoiseGit has been around since 2008ish.

So using Git on windows has never really felt "unixy" for most windows devs, until something goes wrong and you have to copy commands off the Internet into git bash and cross your fingers.

The main reason git won on windows is the same reason it won everywhere else, which is GitHub (and other git hosts).


👤 dragonwriter
> Given that, why didn't Windows users establish something like Mercurial as the standard, given their majority/plurality status

Most software developers use Windows and work on proprietary/internal software (and the second part was even more true when Git became the standard for open collaboration.) They—or rather, Microsoft—established Visual Source Safe (succeeded later by Team Foundation Server) as the standard for that kind of use.

SVN and then Git were standards for public collaboration, which (while it still involves lots of devs on Windows) was much more centered in the Linux/MacOS world (as what remains of the broader Unix/Unix-like world.)

Git was successful enough that places doing proprietary/internal development on Windows use it now, too, but that wasn't how it got established.


👤 mikewarot
The big advantage of Git is that you can fork anything and keep your own forks synced with the world, or not. For small projects, you never have to worry about granting permissions or any of that, and there's no "one true version" problem either.

My first version control system was pkZip and a stack of floppy disks. I went through SVN and Mercurial, but only as a way to interact with existing projects. Git is so much easier to deal with (especially once you grok that it just takes snapshots of everything). The Windows installation process is painless, so why not use it?

Git/Git GUI made it trivial to keep a Lazarus/Free Pascal project on a non-networked computer at work synced with a copy on a thumb drive; syncing that with a copy on my desktop, and with GitHub, was all a piece of cake.
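
For anyone wanting to replicate that sneakernet setup, it's just ordinary Git remotes. A minimal sketch, assuming the thumb drive mounts as E:, the project lives in C:\projects\myapp, and the branch is called main (all made-up names):

    # On the networked desktop: put a bare copy of the repo on the thumb drive
    git clone --bare C:\projects\myapp E:\myapp.git

    # On the offline machine: clone from the drive, work, then push back to it
    git clone E:\myapp.git C:\work\myapp
    cd C:\work\myapp
    git push origin main

    # Back on the desktop: pull the offline changes in from the drive
    cd C:\projects\myapp
    git pull E:\myapp.git main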


👤 Stevvo
Nearly all devs were using Linux on the server regardless of desktop OS. So it's not like we were unfamiliar with Unix; on the contrary, it's the one technology that binds all developers together.

👤 alok-g
Several comments note that there were no good alternatives. While Git was picking up, there was not only Mercurial, mentioned by a few others here, but also Canonical's Bazaar. https://en.wikipedia.org/wiki/GNU_Bazaar

I had personally found Bazaar to be better. It lagged behind Git in performance, though, which they improved shortly afterward. But perhaps it was too late by then.


👤 goodboyjojo
my guess is that many programmers like git because of the free/libre folk. i personally like git because it has easy-to-understand GUI clients that even people who aren't tech-savvy can use. my fav is GitHub Desktop, which i use to talk to my github account and make commits to my personal website. i wish the other git repository hosting sites like gitlab and codeberg had an easy-to-use gui. maybe somebody will make one.

👤 nonameiguess
All the reasons given are good, but I don't know how much I trust this chart. Attempting to click on any of the "source" links does nothing. Statista is a platform for visualizations that anyone can post to. It'd be nice to see the actual survey this claims to be based on.

Note that if you pick a year, let's say 2020, the percentages add up to 153%. That isn't typically the way percentages work as far as I understand them.


👤 MissTake
Before git, there were systems like PVCS (that’s what I cut my eye teeth on).

That ran natively on both Windows and Unix.


👤 dutchbrit
I think because it just worked better than the alternatives. I remember working with subversion when I started at my first job. They switched to git a year later. This was a company mainly specializing in Java and doing development on Windows.

👤 gardenhedge
I also wonder this. SVN is probably enough for most use cases. I think Git's adoption was helped by recruiters listing it under required technologies.

👤 mbfg
i haven't used windows in a really long time, but i believe Git Bash is still a thing and makes things easy to use.

But git just crushes everything else in capabilities. Hg is decent and maybe more appealing to windows folks, but its branching model just isn't as nice (at least as i remember it -- could have changed over the years).


👤 deejatsplit
My answer here is very subjective (may be completely wrong). Still, with that caveat stated...

Wondering about why git became the standard is a bit like wondering why mammals became big and dominant a few tens of millions of years ago. It was a side effect of other, bigger changes and of being in the right place at the right time. Not that the many other factors mentioned here were unimportant, but I don't think they're the high-order bit.

By the early-to-mid 2000s, Microsoft's dominance in the software space was under serious attack. The web was becoming a more serious place to build apps, undermining a lot of the market that MS had dominated for in-house apps (Visual Basic etc.). Things like WHATWG, then Safari, and eventually Chrome helped decouple lots of apps from dependence on proprietary Windows technologies. Additionally, cloud computing was taking off in one form or another (VMware and virtualization, then actual cloud offerings like AWS), which made Linux a much more appealing platform for a lot of server-side development. These two factors, I think, helped move Unix/Linux environments into more of a mainstream place in many developers' lives.

Not that they were not viable before, but endless stories about the titans of the new industry relying on commodity, cheap, open source platforms and technologies certainly influenced many organizations' mindsets. (It probably doesn't hurt that the transition to .NET wasn't the smoothest, not to mention Windows Vista, but that's also probably not the high-order bit.)

Another interesting datapoint (also probably not a high order bit) is that in the mid 00's, the mac became VERY attractive among at least silicon valley developers. The presence of a linux-like development environment (Terminal etc) with a spiffy UI coupled with just being trendy (I think I'm allowed to say that, as a multi-decade  fanboy :-) ) lured many of the thought leaders away from Windows. I remember being shocked when a brand new coworker demanded a mac at work (this would have been around 2010) and actually got one... something that I'd never seen happen before. And I know of people who went to MS developer conferences in silicon valley in the same timeframe where everyone was using macs. I remember one observing: "Oh, wow. MS is in trouble".

Anyway, my own view is that no one forced or made git the standard. It is more that "linux"/linux-like development environments became much, much more standard and accepted, and at that point git was "the best" tool available. If MS hadn't lost its control over the software industry, lots of us might still be using Visual SourceSafe (or its successor).


👤 simonblack
> when Windows is the majority OS

Define 'majority OS'. :)

Windows is not the majority once you factor out desktop users.