HACKER Q&A
📣 Dracophoenix

Whatever happened to dedicated sound cards?


During the '90s and the early '00s, dedicated soundcards were in-demand components in much the same way GPUs are today. From what I know, Creative won, on-board sound became good enough sometime between Windows XP and Windows 7, and the audio enthusiasts moved on to external DACs and $2000 headphones. Today Creative still sells soundcards, but none of them appear to be substantial improvements over previous models.

So what other reasons could have caused the decline in interest? Was there nothing that could be improved upon? Were there improvements on the software side that made hardware redundant and/or useless? Is there any other company besides Creative, however large or small, still holding the torch for innovating in this space?


  👤 speeder Accepted Answer ✓
The main reason for their death, in my opinion, was the DRM-driven changes to Windows driver rules (although MS claims it wasn't because of DRM).

When DVDs and HDMI were becoming popular and Windows Vista was launched, a lot of restrictions were put on drivers. I saw many people defending them, claiming it was for better stability, fewer blue screens, and so on.

But a major thing the restrictions did was restrain several sound card features, most notably their 3D audio calculations, which were then just starting to take off. People were making 3D audio APIs that intentionally mirrored 3D graphics APIs, with the idea that you would have both a GPU and a 3D audio processor, and you would have games where the audio was calculated with reflections, refractions and diffractions...

After that, the only use of sound cards became whatever the drivers still allowed you to do, which was mostly playing sampled audio, so sound cards became kinda pointless.

Gone are the days of 3D audio chips, or of sound cards full of synthesizers that could create new audio on the fly.

Yamaha still manufactures sound card chips, and their current ones have far fewer features than the ones they made during the sound card era.

EDIT: I also forgot to point out that the same restrictions kinda killed analog video too. Before the restrictions, nothing prevented people from sending arbitrary data to analog monitors, so you could have monitors with non-standard resolutions, non-square pixels, unusual bit depths (SGI, for example, made some monitors that happily accepted 48 bits of color), or no pixels at all (think Vectrex), and so on. All this died, and in a sense it also affected video development: some features that video cards were getting at the time were removed, and hardware design moved down a narrower path, more compatible with MS rules.

As for what the restrictions have to do with DRM: the point was to not allow people to intercept audio and video as perfect-quality analog signals, since that would have been an easy way around the DRM built into HDMI.


👤 pjlegato
Soundcard-like devices called "audio interfaces"[1] -- now usually USB breakout boxes -- are alive and well in the professional audio segment, targeted at musicians, recording studios, video editing shops, and similar applications.

They're not necessary for consumer apps. Consumer audio applications got "good enough" with mass-produced built-in motherboard "soundcard on a chip" solutions that basically replicated the function of the old soundcards at a much lower price point.

If you want to, say, connect 16 microphones at once and record to 16 separate tracks, or you plan to apply a bunch of digital effects and therefore want a much higher sample rate than what your consumer audio chip can do, you can buy an audio interface.

[1] https://www.sweetwater.com/shop/studio-recording/audio-inter...


👤 nsxwolf
Pure digital audio and huge hard drives killed them. Back when we were chasing better and better synthesis (FM, wavetable...) because playing back CD-quality digital audio wasn't possible, you were lucky to have 23kHz mono, or nothing at all; your hard drive was tiny and MP3 wasn't a thing yet. Every sound card upgrade was literally music to your ears.

Now every computer has a little chip that plays back at least CD-quality audio from an infinite pool of storage and RAM. Nobody wants to hear MIDI in their games anymore. I'm not even sure what a better sound card could do for me: reduce line noise, drive high-impedance headphones, or something. Boring!
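
The storage argument above is easy to quantify. A quick back-of-the-envelope sketch (Python, using the CD audio format parameters: 44.1 kHz, 16-bit, stereo):

```python
# Data rate of uncompressed CD-quality PCM audio.
sample_rate = 44_100          # samples per second
bytes_per_sample = 16 // 8    # 16-bit samples
channels = 2                  # stereo

bytes_per_second = sample_rate * bytes_per_sample * channels
megabytes_per_minute = bytes_per_second * 60 / 1e6

print(bytes_per_second)       # 176400 bytes/s
print(megabytes_per_minute)   # ~10.6 MB per minute
```

A few minutes of raw audio was tens of megabytes, versus a few kilobytes for the MIDI score of the same song; on a mid-'90s hard drive that difference decided the whole architecture.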


👤 an_aparallel
Hi HN, long-time reader; first post I felt super compelled to respond to.

This is a massive bugaboo in the audio industry in my opinion.

I have always been a PCI soundcard user, and still am to this day, but industry trends are working against this. I think a big part of it is laptops/iPads and the like becoming more popular, as well as usability: companies optimise for successful adoption into a user's system rather than for technical specifications.

I started my DAW journey with a Terratec soundcard with MIDI + stereo audio ins and outs roughly 20 years ago.

Fast-forward 7 years: I bought an early USB interface, the NI Audio Kontrol 1, to use with a laptop. I could run everything on it, take it out and about. Cool!!

Fast-forward another few years: I got more serious about audio and bought a Lynx PCIe AES card (now without MIDI) to use with an Apogee Rosetta 800 (8 in / 8 out). Now we're getting there. But not an all-in-one solution.

In 2022, surprisingly, the only (?) companies doing full PCIe audio solutions are Lynx and RME. In a fresh session in FL Studio or Ableton, with a buffer size of 64 samples (the lowest), I enjoy latency of 0.72 ms. This can't be beaten by USB. However, that's not a deal breaker for most people, sadly.
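
For what it's worth, the 0.72 ms figure is consistent with a 64-sample buffer at an 88.2 kHz sample rate (an assumption on my part; the comment doesn't state the rate). The relationship is just buffer size over sample rate:

```python
# One-way audio buffer latency: buffer size (samples) / sample rate (Hz).
def buffer_latency_ms(buffer_samples: int, sample_rate_hz: int) -> float:
    return buffer_samples / sample_rate_hz * 1000.0

print(buffer_latency_ms(64, 44_100))  # ~1.45 ms
print(buffer_latency_ms(64, 88_200))  # ~0.73 ms, matching the figure above
print(buffer_latency_ms(64, 96_000))  # ~0.67 ms
```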

It greatly saddens me that audio in general is a second-class citizen with regards to tech advancement. It still blows my mind that the Atari STE, with MIDI built onto its circuit board, still beats a brand-new fully specced blazing machine for tightness in the MIDI department. We need more development of real-time OSes in the MIDI world.


👤 h2odragon
The onboard sound chips became good enough, and for those for whom they weren't good enough the noise reduction bonus of an external DAC was worth it anyway. Computers are generally bad for analog signals within a few inches of the case.

I think another factor is MP3 players and phone audio; people stopped using their computer as the (interface to) media source when other things took that function over for them.


👤 modeless
There are lots of complex reasons people are postulating here, but I think it's pretty simple. The CPU can do good enough audio rendering without hardware acceleration. The CPU can't do good enough graphics rendering without hardware acceleration. So video accelerators stayed, and sound accelerators died.

Add-on sound devices still exist, but they are simple because they don't include extensive hardware acceleration anything like what a GPU has. In fact, if you want hardware acceleration for audio processing algorithms today, like really fancy 3D sound propagation or something, GPUs would actually be great at that, and they support digital audio output too.


👤 theevilsharpie
A couple of reasons:

- During the MS-DOS era, there wasn't really a standard API for sound, so using a cheap, off-brand sound chip (including anything that might be integrated) often meant compatibility problems. Even though it might not necessarily have offered the highest quality sound, Creative's Sound Blaster line was the gold standard for compatibility during this time. Standardized sound APIs have largely eliminated this issue.

- Throughout the '90s, music for games (and a number of other applications) was distributed as MIDI (or MIDI-like) instructions to be generated by a synthesizer, and the quality of the music was very much dependent on the synthesizer used. The Roland Sound Canvas series was the gold standard at the time (in part due to its quality, and in part because that's what the composers themselves used), but it was very expensive and out of reach for the mass market. Software synthesizers were either too slow, or the quality sucked. That gave an opportunity for sound card manufacturers like Creative to offer higher-quality hardware synthesizers on their sound cards than what cheap/integrated chips could do. These days, most audio is PCM, and CPUs are perfectly capable of high-quality software sound synthesis, so hardware synthesis has become a non-issue and modern consumer sound hardware doesn't even have hardware synthesis capabilities anymore.
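
To illustrate how trivial software synthesis has become, here is a toy sketch that renders a MIDI-style note to 16-bit PCM entirely on the CPU (a bare sine with a linear decay envelope; real softsynths use wavetables, FM or samples, but the workload is of the same order):

```python
import math

def midi_to_freq(note: int) -> float:
    """Standard MIDI tuning: note 69 = A4 = 440 Hz."""
    return 440.0 * 2 ** ((note - 69) / 12)

def render_note(freq_hz: float, duration_s: float, sample_rate: int = 44_100):
    """Render one decaying sine 'note' as a list of 16-bit PCM samples."""
    n = int(duration_s * sample_rate)
    samples = []
    for i in range(n):
        envelope = 1.0 - i / n  # simple linear decay
        value = envelope * math.sin(2 * math.pi * freq_hz * i / sample_rate)
        samples.append(int(32767 * value))
    return samples

pcm = render_note(midi_to_freq(69), 0.5)  # half a second of A4
```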

- During the '00s, sound cards began to offer accelerated environmental and positional audio (e.g., Aureal A3D, Creative EAX), which games quickly adopted to improve the sense of immersion. However, changes in the Windows audio architecture introduced with Windows Vista broke this functionality without a replacement. Advances in CPU hardware have since allowed this type of processing to be done on the CPU (e.g., X3DAudio, OpenAL Soft) with acceptable performance.
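
The basic building blocks of that positional processing are cheap on a modern CPU. A toy sketch of distance attenuation plus constant-power stereo panning (only the fundamentals; A3D/EAX also modeled reflections and occlusion, which this does not attempt):

```python
import math

def position_sample(sample: float, src_x: float, src_z: float):
    """Pan/attenuate a mono sample for a listener at the origin facing +z."""
    dist = math.hypot(src_x, src_z)
    gain = 1.0 / max(dist, 1.0)          # inverse-distance attenuation
    angle = math.atan2(src_x, src_z)     # bearing: 0 = ahead, +pi/2 = right
    pan = max(-1.0, min(1.0, angle / (math.pi / 2)))
    # Constant-power pan law keeps perceived loudness stable across the arc.
    left = gain * math.cos((pan + 1.0) * math.pi / 4.0) * sample
    right = gain * math.sin((pan + 1.0) * math.pi / 4.0) * sample
    return left, right

# A source straight ahead lands equally in both channels:
print(position_sample(1.0, 0.0, 1.0))
```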

In the current era, we do have dedicated soundcards, although not in the form of PCIe add-in boards. External DACs (either dedicated USB, or integrated into a display or AV receiver) are popular, as are the DACs used by wireless/USB headphones. Also, there has been some work done to utilize the computational capability of GPUs for real-time audio ray tracing.


👤 ksec
On-board audio was good enough and cheap enough. It was as simple as that. A lot of the audio processing moved to the CPU, and dedicated sound-processing effects required support from games.

There was also Aureal. Both Creative and Aureal had their own proprietary APIs, trying to create a moat like 3dfx's Glide, but they failed. And then Realtek took over.

Creative could have competed with onboard audio as well. But they were too worried about losing their Sound Blaster revenue, so they somehow diverged into other things like GPUs (3DLabs), MP3 players, speakers, etc. And every single one of them failed.

If you are looking for modern Audio Engineering, you could look at PS5. But powerful DSP isn't exactly rocket science anymore. A lot of the improvement has to do with software.

Creative used to be the pride of Singapore. It is sad the company was badly managed and never made the leap to the next stage.


👤 tenebrisalietum
AC'97 (1997) was the first blow. It was Intel's improvement on the de facto SB16 interface (and not compatible with it), and it arrived around the time audio started being integrated into motherboards.

This is also around the time it started to be common for pre-built systems to integrate functionality into the motherboard, such as VGA, audio, USB, and in some cases even AGP video all as part of a chipset.

The peak of PC audio probably matches the peak of the "HTPC" wave that happened in the first half of the 2000's - PCs designed to be put under your TV and replace your stereo.

But also, laptops started getting cheaper and more popular as the late 90's turned into the 2000's and beyond - where integration of components was even more valued. Then smartphones started to take over in the 2010's.

The culture is different now. These days, young people don't have stereos anymore; they might at best have a TV soundbar, some really good wireless speakers, or a couple of Bluetooth speakers, and the phone is the centerpiece of the personal audio experience now.

Hi-Fi that's not dedicated to making your car rattle or being blasted at 500W-per-channel volume over a bar/club PA speaker is dead.

Desktop PCs are for businesses which need only good enough audio for business purposes, and gamers who probably want to spend money on a GPU over audio.


👤 rickdeckard
Intel came with AC'97 as a "good enough" onboard solution for audio, with standard drivers and all mainstream capabilities. No MIDI-port, no fancy spatial audio, just good-enough stereo out and mic/line-in.

It forced the dedicated soundcard vendors to justify the add-on price by pushing features like multichannel audio, surround-sound codecs, hardware controls, etc., but none of those features were of mainstream interest.

Total sales volume for dedicated soundcards dropped, economies of scale dropped, prices had to increase, pushing the products even further into a niche...


👤 mmastrac
USB killed it. Keep your signal digital until you hit the speakers (or a short audio cable). No interference, 44.1kHz from end-to-end (or more).

If you don't like the DAC in the headphones, you can also find a high-quality USB DAC and use the audio cable from there.


👤 anigbrowl
USB

If you're serious about audio you just plug a cable into your breakout box and have your interfaces, converters, and preamps there. Your sound hardware can be anything from pure I/O to an elaborate instrument under computer control. You can do audio synthesis and compositing on the CPU, the GPU (not so different from a DSP) or external hardware.

Soundcards are only 'gone' in the sense that PCI cards are less important because many people use laptops and the audio built into motherboards is more than Good Enough for everyday purposes.


👤 Merad
At least in terms of gaming I think multi-core CPUs killed them. A big argument for sound cards used to be that they'd give you higher quality audio and use less CPU. I can remember benchmarks from the mid 2000s showing less CPU usage with a dedicated sound card vs onboard sound. But by the time you get to the early 2010s anyone building a mid to high end gaming PC was using a quad core CPU, and with 4+ cores it's really hard to care if onboard sound is using a few percent extra CPU.

And while 3d audio is/was a cool concept, most people don't really have a sound system that will really take advantage of it. Even most "serious gamers" that I know use headphones or stereo speakers... now that I think about it I'm pretty sure I'm the only person in my friend group with a 5.1 speaker setup on my PC.


👤 schlauerfox
Creative still makes very high-quality discrete sound cards; they just suffered a combination of on-board audio being enough for the average user and wireless headphones not needing a separate DAC. Also, my father's generation was super into audio, and even middle-class people had lots of disposable income to invest in audiophile gear; today that income bracket is very, very gone, so the markets are much smaller for gadgetheads to spend money on stuff like that.

👤 jensgk
USB-based external DACs took over. They're very good nowadays. Head over to audiosciencereview.com for lots of DAC reviews, technical info and discussions.

👤 oneplane
They aren't relevant anymore for the same reason that USB cards, serial cards, network cards and SATA cards aren't relevant anymore. They still have their niche, and in commercial settings you'll still see HBAs (with SATA but mainly SAS), 10G+ network cards, and audio interfaces, but for most personal computers the level of 'better than good enough' was reached a while ago.

There is no point to using an add-in card if the facility is now on the main board and can do the task to the user's wishes.

The same thing can be said about many previously modular components where more and more is now simply a function of the main board itself. Take all the legacy I/O which used to be various chips, often on various add-in boards. They were all condensed into one single Super IO chip that can do all of it, but at a fraction of the size, cost and energy usage.

A lot of peripherals used to be implemented in separate chips and sometimes even discrete logic. If we were to try to do that today we'd either have to cut 90% of the currently available standard features or make the mainboard ten times as big to be able to implement it the old school way.


👤 fortym2
Those who need better audio still look for dedicated sound cards, *but* external ones.

For example if you produce music, you probably are good with an external USB audio interface like a Focusrite Scarlett.


👤 spiffytech
Two big things that I think contribute to this:

1) Most people are happy with good enough. To most people's ears, speaker quality makes a bigger difference than the audio output stage, and people already settle there. Furthermore, when iTunes was a big deal, it turned out people got accustomed to low bit rates and mediocre equipment and thought they sounded better than the good stuff, because that's how they expected their music to sound.

2) With most computing moving to laptops and then to mobile, people generally don't have a choice about the audio processing technology inside their computer.


👤 peteforde
Former MT-32 and CM-64 owner, here!

For what it's worth, if you have a small discretionary budget, I would recommend a "top of the low-end" DAC to anyone who listens to a lot of music. I "did my own research" and concluded that for me, the Topping D10s USB DAC was the correct amount of gadget. It has RCA outputs and supports 384kHz audio, which needs to be enabled in your sound settings.

When I got it set up, it was as if my previously disappointing desk stereo speakers and preamp combo took a great sigh of relief and the sound opened up. Everything sounds more defined, I can hear where the instruments are positioned in the soundstage, and I am now one of those people who appreciates Bandcamp allowing FLAC downloads. For me, this was worth CAD$139.

https://www.amazon.ca/gp/product/B08CVBKHFX/ <- currently "unavailable" but likely easy to locate via search


👤 squarefoot
Onboard DACs are good enough for HiFi, so there's less need for specialized sound cards today, except for making music, when one needs ultra-low latency (built-in audio is getting a lot better at this too) or multi-track recording. I'll shortly be buying a Tascam US-16x08; it doesn't offer much more in terms of sound fidelity than my older Steinberg CI1, but it can record 16 channels at the same time, which is handy when miking a complete set of instruments (drum-kit mics + overheads, keyboards, guitar & bass amps, voice, etc.; you never have enough inputs), so I can easily keep track during rehearsals and have more freedom during recordings.

👤 ssharp
I used to love MIDI files and was always excited to run my whole collection through any new sound card. Of course, the best option was to run MIDI out to my Yamaha keyboard and play everything through that.

It's still fun today, 25-30 years later, to crack open a '90s MIDI in a DAW and route the channel outs through virtual instruments to see how they sound.


👤 RoyBatty89
They're still around. I use one because the sound chip on my motherboard somehow failed. It lets me use a subwoofer and two sets of speakers I got at Goodwill for like 20 bucks. These were $200 speakers back in the '90s, so despite being old they're still good and work fine. Sound Blaster mainly makes the sound cards; granted, mine is old, from 2013, but it works fine on a modern PC. I just checked the site and they made a 2022 version, so there's obviously STILL a demand for them, probably because it's hard to get an inexpensive subwoofer unless you buy second-hand, and it's actually cheaper to buy a separate sound card for 45 bucks and used speakers than to buy a modern Bluetooth or USB setup. I saw other, more expensive options but went with the least expensive because I needed sound, and believe me, it's better than the soundbar I used before for music. If you're an audiophile who wants to use your computer as a stereo with older speaker setups, they're still useful. Most prefer headphones now, but personally I actually like using my setup, even if I was kinda forced into it by a sound chip failure.

👤 iancmceachern
I think all the innovation moved to the creator/professional stuff. You can buy amazing sound gear that interfaces with the computer, controls stuff in Ableton, etc. For me at least, I've chosen to go with studio monitor type speakers from the professional audio world rather than home hi fi stuff. I suspect others are similar, a small 4 channel mixer with dac and other cool interface stuff is common.

👤 analog31
I've tested USB audio adaptors, ranging from cheapies from Amazon, to a PreSonus recording interface. In fact, without resorting to really fancy measurement gear, an audio adaptor is close to the only thing good enough to test an audio adaptor. I've also tested the audio output of phones, built-in PC audio jack, etc.

I use PC audio to test analog audio circuits that I make, so I've made it my business to know the quality of my measurements.

I've also checked out the specs on the chips used in those devices. The delta sigma conversion technique is one of the wonders of the modern world.

The fact is that the audio quality coming out of those devices is stunning, and probably doesn't need to be better. I can see where a recording studio might want to spend more on "overkill" to make the artifacts of their digital interface a non-issue, but for the rest of us, we're living in a golden age of audio.
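
The principle behind delta-sigma conversion really does fit in a few lines. A first-order modulator sketch (illustrative only; real converters are higher-order, heavily oversampled, and followed by filtering):

```python
def delta_sigma(signal):
    """First-order delta-sigma modulator: inputs in [-1, 1] -> 1-bit stream."""
    integrator = 0.0
    feedback = 0.0
    bits = []
    for x in signal:
        integrator += x - feedback          # accumulate quantization error
        bit = 1 if integrator >= 0 else 0   # 1-bit quantizer
        bits.append(bit)
        feedback = 1.0 if bit else -1.0     # 1-bit DAC in the feedback path
    return bits

# The density of 1s tracks the input level: a constant 0.5 input yields a
# bitstream whose average (mapped back to [-1, 1]) converges to 0.5.
bits = delta_sigma([0.5] * 4000)
print(sum(2 * b - 1 for b in bits) / len(bits))
```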


👤 nde9ZYunCA9Y82q
> on-board sound became good enough

You answered it yourself there. Sound hardware started being integrated into the motherboard and/or southbridge/PCH.

(Although a minor quibble... on-board sound *started existing*. Back in the days of sound cards the only on-board sound would've been a PC Speaker which... well, it can do beeps of various frequencies, but that's about it.)

Old sound cards also had various synthesis and MIDI stuff, instead of playing sampled audio, which is great in theory but... then your audio sounds different on every different hardware. Also, these days CPUs are fast enough to do a lot of the synthesis in software (and have extra cores so you're not stealing cycles from something else). That way, even if you really wanted synthesis, not only do you not need extra hardware, it also sounds the same "everywhere".


👤 thatjoeoverthr
I wanna add the death of the synthesizer. There was no way to ship a game with a soundtrack without a hardware synthesizer, due to storage and CPU power. Hardware synthesizers could differ in quality. Listen to what this musician did with under 50 KB using the SNES hardware synthesizer: https://youtu.be/gkCcvoJ09gU

In the 90s, CD audio could handle some soundtrack work but without looping, and it would obviously block disc reads.

However you have now significant storage and hardware compressed audio decoders. So, soundtracks are shipped as compressed waveforms.

All decoders will play them the same way. There is no differentiation.


👤 HeckFeck
As an aside, the hardware/driver support for sound cards is terrible on modern Windows.

There's this trusty Creative Audigy Rx that I nabbed from a closing down sale at an electronics retailer. Poetically, both are facing the same fate.

I built a Ryzen 7 machine and installed Windows 10. Whilst installing the CD drivers for the sound card, Windows 10 BSOD'd and rebooted. Not to be deterred, I tried the latest downloadable drivers (marked as Windows 8 compatible), but they're all WDM, so surely OK? Not so. Another BSOD.

There was no help to be found online so I very reluctantly gave up. Now it lives in the retail packaging somewhere in my house, the bright and elaborate box promising an audible experience that exists only in my mind.


👤 ROTMetro
Creative, that is a name I haven't heard in a while. I'm old enough to remember limos pulling up to the (to be bought by Creative) EMU office in Scotts Valley by the likes of Trent Reznor to play with the latest sampler. So much gear envy back then. It's so cool to live in 'my' future where I can get a used Pigments soft synth for $75 and Ableton lite for $15 on a niche message board of people from around the world and buy from someone anywhere in the world, pay digitally, and in less than an hour have as much music production as $50,000+ would have bought me in the 90s. It's so crazy to live in what to me is the future.

👤 golergka
There are still a lot of dedicated audio interfaces for people who are involved in making music. However, most of those people are using Macs because of how much better CoreAudio is compared to ASIO or anything else, and, subsequently, because most software is developed and tested first and foremost for Macs thanks to network effects. And most Mac users have MacBooks, which means that the majority of audio interfaces in use are external ones.

As for people who just listen to music, IMO, built-in audio interfaces or just digital Bluetooth headphones have been good enough at digital-to-analog conversion for a very long time.


👤 exabrial
They're alive and well: https://www.behringer.com/product.html?modelCode=P0BK1

This is the unit I have, and I'm quite happy with it. It's on the "low end" of "good". It has a surprisingly low noise floor, but lacks sophisticated routing and switching. Latency is really good at 192kHz, good enough for live mixing, as low as you have the CPU oomph to handle it.

They only go up in price from here, up to thousands of dollars.


👤 pacomerh
The included audio interfaces got to a point where they're good enough for the conventional user. But that aside, having an external interface, like we do now, provides better and more modular options: better latency, quality, DAC converters, additional inputs/outputs, etc. Focusrite, M-Audio, PreSonus, MOTU, Behringer, Universal Audio; the list goes on.

👤 simplyaccont
There are a lot of mentions of how onboard audio killed sound cards and how there are USB audio adapters.

This is right until you want something a little bit out of the ordinary: a USB DAC with 5.1 output that supports hardware decoding of DSD, etc. Most of the options have a $x000 price tag. The next best thing is to use an A/V receiver via HDMI.


👤 pdpi
The answer is simple: Who benefits from a soundcard?

For your average consumer, built-in audio is plenty good. Not much reason to get any extra hardware at all.

For audio enthusiast consumers who want better than what onboard offers for some reason, an internal card is only useful for desktop computers, whereas external DACs are usable on mobile devices and laptops - and that's what most people use.

For audio producers, an internal soundcard can't physically fit the I/O you need (1/4" jacks, XLR, that sort of thing), so older professional cards all had breakout boxes. If you're going to have a breakout box anyway, you might as well have the whole device be self-contained and plugged in through Firewire (or, more recently, USB or TB). Depending on your setup, you might even have these things rack-mounted.


👤 nickt
There's still a lot of interest in the retro computing community - keropi and marmes (http://pcmidi.eu) have built a few cards, including what many view as an ultimate retro sound card, the Orpheus, which uses some OG components such as the YMF289B OPL3 and has wavetable daughterboard expandability. While it's new, it's true to the character and spirit of the 1990's cards, with the benefit of modern manufacturing and better quality components than Creative were using at the time. Looking forward to the Orpheus II! https://www.vogons.org/viewtopic.php?f=62&t=88957

👤 pengo
Improvements in architecture, bus speeds and external ports mean there's now no need for audio to be handled by an internal card; Thunderbolt or USB is more than adequate. This has moved the focus to "external soundcards", more commonly referred to as audio interfaces.

👤 MisterTea
tl;dr Higher integration and faster CPUs killed them.

In the early days of PCs, almost every peripheral was provided by an IO expansion card, save for the keyboard. My 386 had a 16-bit multi-IO ISA board that provided the essential ports: ATA, floppy, serial, parallel. You purchased a VGA card and then a sound card. You had at least two or three ISA cards because your motherboard was taken up by the CPU, FPU, RAM, and essential control logic chips. My second 486, a DX2 66MHz, had the ATA, serial, parallel and floppy ports on-board, which amazed me, as it eliminated a whole ISA board. (Now everything fits on a single silicon die...)

Early on-board audio was usually a sound card soldered to the motherboard. Then Intel developed AC'97, which integrated standard audio into the south bridge. Coincidentally, that made Microsoft's life easier: all the PCs Windows ran on would use this standard, meaning all they had to do was provide AC'97 drivers and everyone with an Intel machine had sound. No more competing third-party audio APIs from Creative et al.; it was all Wintel. PC builders could now provide multimedia PCs at cheaper prices with audio by default. Also, USB happened, which allowed people to plug in things without opening cases and fiddling with circuit boards, which is alien to many.

And as CPUs became faster, the need for dedicated DSPs to handle audio processing or synthesis was eliminated. You can now run a whole DAW complete with synths, sample playback, effects and mixing in real time on a cheap general-purpose off-the-shelf computer with no special hardware.


👤 yardie
It went from a $30 PCI card with a $10 BOM to a <$1 BOM chip directly on the motherboard.

👤 franky47
As stated by others, external sound "cards" (boxes would be a better term) are still prevalent in pro-audio applications, and may have evolved from internal cards not only for the convenience of plug-and-play, but to allow accessing cables more easily than diving behind a PC tower.

There are still "cards" being made for pro-audio users, that embed DSPs for computing plugin algorithms [1]. Not quite the same application, but an interesting parallel.

[1] https://www.avid.com/products/pro-tools-hdx


👤 arminiusreturns
The last PCI soundcard I had was the Razer Barracuda AC-1 [1], which I really wish I had kept now. Paired with the headset it was made for, the Barracuda, it was at the time a pretty amazing piece of hardware for the price. I think software is just eating lots of things...

1. https://www.phoronix.com/review/590

2. https://www.modders-inc.com/razer-barracuda-hp-1-8-channel-g...


👤 ABoredBirb
I have an Asus STX II dedicated sound card and it is a very substantial improvement over built-in sound, to the point I'd never consider having a PC without one. Most people have just never tried anything better and thus live under the illusion that built-in sound cards are "good enough", rather than the lowest common denominator.

Lack of marketing, maybe? A myth that you need to fork over thousands for a complicated audiophile setup to get very incremental improvements over the baseline? Probably a combination of these is to blame.


👤 MangoCoffee
Desktop decline. People got used to the sound quality of laptops, smartphones and tablets. There are also AirPods-type headphones for people who want to enjoy their music via laptop, smartphone or tablet.

👤 colechristensen
Like others have said, it's just that people who want that kind of thing don't buy cards any more; they buy external "audio interfaces".

I have a Scarlett attached to the underside of my desk with a pair of Sennheiser HD600 headphones and a Monoprice Stage Right condenser mic on a desk stand.

I'm guessing the inner-PC card space has a bit too much of an EM noise problem for people who want quality higher than integrated sound can provide (which is pretty good these days anyway), and external devices have room for the inputs people actually want.


👤 dvfjsdhgfv
> and the audio enthusiasts moved on to external DACs

Well, the internal ones still exist [0]. However, with higher bus speeds, external interfaces are more practical: you can connect more devices to them, and you can move them - a lot of music today is done on laptops.

[0] E.g.: https://www.esi-audio.com/products/maya44ex/


👤 Nextgrid
With the rise of wireless headphones, the "sound card" is actually within the headphones now, so the sound card in your machine is irrelevant.

👤 numlock86
> and the audio enthusiasts moved on to external DACs and $2000 headphones

While good studio-grade monitoring headphones are available for less than $200 (the Beyerdynamic DT 990 Pro, for example), I don't see what a sound card offers that an external DAC doesn't. If anything, most sound cards I find lack some much-needed features compared to most DACs, which is probably the answer to your question.


👤 dusted
Multiple things:

Desktop PCs have become a niche; more people have laptops than desktops, which already reduces the market for dedicated internal sound cards significantly.

Desktop PC motherboards now come with integrated DACs that are "good enough", and if you care enough that they aren't, you'd have a hard time arguing for an internal solution over an external one anyway.


👤 wombat-man
Because of reasons, I wanted an optical in, so I got a Sound Blaster card. It worked fine, but the software was kinda... bad. For something that was idle most of the time, it would sometimes use a lot of CPU according to Task Manager. I'm not really sure what the issue was, but I ended up disabling the card when I wasn't using it.

👤 aristofun
The bottleneck with sound is human ears + end device (headphones, speakers).

Even 10 years ago, many budget sound cards outperformed the capabilities of an average human ear and an average pair of headphones.

There’s not much room left for growth.

As opposed to much more complex and multidimensional video data.


👤 smm11
Sound cards were a thing because DOS/Windows was a thing.

In the Mac/Amiga/non-DOS world sound was good enough very early. As soon as we said "we need sound," it was there, as long as you weren't on DOS.

The Amiga Video Toaster card was a thing, too. Your Android/iOS/PalmOS device surpassed that long ago.


👤 CrendKing
Human vision is way more information heavy than human hearing. Just how our species is physically.

When Realtek chips became good enough for audio consumers in the '00s, the consumer market for dedicated sound cards ceased to exist. Of course, they still exist for audio producers and professionals.


👤 fortran77
I have some "audiophile DACs" and they all use either USB or S/PDIF interfaces to the computer. So there's no need for a sound "card": it's now a USB or S/PDIF device that plugs into one of the computer's ports.

👤 sfortis
I was lucky enough to live the fascinating journey from PC speaker to AdLib, SoundBlaster, SoundBlaster AWE64, Roland LAPC-1 (what a f#^n piece of hardware!), Yamaha SW60 and Gravis Ultrasound. Glory days!

👤 mmphosis
I can only dream of a day when dedicated video cards become just as redundant.

👤 midislack
They still exist, but they're mostly external. For example I'm using my Parasound preamplifier as a sound "card" right now (it has a super nice Wolfson DAC).

👤 sys_64738
They're commodity items that eventually get folded into the main CPU to reduce costs for manufacturing. This is the majority of the market.

👤 m3kw9
Cost saving? CPUs and new algorithms can do sound with low single-digit CPU usage, which doesn't justify a CPU-offload card.
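The "low single-digit CPU usage" point is easy to see: the core job a sound card's DSP used to offload, mixing several streams with per-stream gain, is just a multiply-add per sample. A minimal illustrative sketch (not any real engine's API):

```python
def mix(channels, gains):
    """Mix several mono float-sample streams into one with per-stream
    gain: the core job a sound card's DSP used to offload, now a cheap
    multiply-add per sample on any CPU. Illustrative sketch only."""
    n = len(channels[0])
    out = [0.0] * n
    for samples, gain in zip(channels, gains):
        for i in range(n):
            out[i] += samples[i] * gain
    # Clamp to the valid [-1.0, 1.0] float-sample range.
    return [max(-1.0, min(1.0, s)) for s in out]

mixed = mix([[0.5, 0.5], [0.5, -0.25]], gains=[1.0, 0.5])
# → [0.75, 0.375]
```

At 48 kHz stereo, even dozens of such streams amount to a few million multiply-adds per second, a rounding error for a modern CPU.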

👤 b20000
i don’t know what you are talking about. audio interfaces never went away. they became external due to USB and firewire. and creative stopped making them. pro audio companies primarily make and sell them now as they did back then. RME and MOTU are some examples.

👤 zxspectrum1982
AC'97, i.e. onboard audio chips, becoming good enough for 99% of people.

👤 rektide
One thing not really mentioned in these threads is... software. Every sound card used to compete by trying to offer better 3D or better effects processing than the others. Game developers had a half dozen sound SDKs they could write their game audio for (EAX, Aureal... I dunno, there were probably at least half a dozen). Cards competed on features. In the pro segments, literal megabytes of MIDI sample space and DSP coprocessors attracted audiences. Each of these tended to have one of any number of software toolkits, so you needed games or apps to all get on board too. Many SDKs had software fallbacks, so many users wouldn't even care that they didn't have Aureal A3D support or whatever.

The actual advantages of doing things in hardware quickly faded as CPUs became ever more powerful. But just as much, the various APIs didn't really ascend, or their technologies (head-related transfer functions) got swallowed and mainstreamed. Namely DirectSound3D (1996), which grew various hardware offloads (EAX) and eventually became DirectX Audio. The pressure to compete deflated under this mainstreamification: few developers wanted to go the extra 100 miles to support bespoke fancy hardware capabilities that 0.01% of gamers might have, when they got 90% of the way there using the common denominator.
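For a sense of what those software fallbacks actually computed, here is a toy sketch of positional audio: inverse-distance attenuation plus a constant-power stereo pan. Real SDKs like DirectSound3D and A3D layered HRTFs, reflections, and occlusion on top of this; everything below is illustrative only.

```python
import math

def pan_3d(sample, listener, source, max_dist=50.0):
    """Toy software 3D audio: attenuate a mono sample by distance and
    split it into left/right gains by azimuth (constant-power pan).
    Illustrative only; real SDKs added HRTFs, reflections, occlusion."""
    dx = source[0] - listener[0]
    dz = source[2] - listener[2]
    dist = math.hypot(dx, dz)
    if dist > max_dist:
        return 0.0, 0.0
    # Inverse-distance rolloff, clamped so a co-located source is finite.
    gain = 1.0 / max(dist, 1.0)
    azimuth = math.atan2(dx, dz)            # 0 = straight ahead
    pan = (azimuth / math.pi + 1.0) / 2.0   # 0 = hard left, 1 = hard right
    left = sample * gain * math.cos(pan * math.pi / 2)
    right = sample * gain * math.sin(pan * math.pi / 2)
    return left, right

# A source 10 units straight ahead reaches both ears equally.
left, right = pan_3d(1.0, listener=(0, 0, 0), source=(0, 0, 10))
```

Once per-sample math like this got cheap on the CPU, the hardware offload lost its reason to exist.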

I don't actually know what folks like gamedevs do now. Both Xbox and PlayStation say they tick a lot of really good 3D audio boxes; they support an array of common 3D audio standards that I don't really know much about. I'd love to know more about how gamedevs' feature sets and capabilities have evolved over time... most coverage is, alas, ultra consumer-facing and abstract; getting some familiarity with what is technically possible now vs. then would be fascinating.

I do think Creative has continued doing some refinement on their dedicated cards, but, like, in general, I think the other answer to the question is just that we are damned near perfection. We have really smart folks telling us we're making things worse by having too high a sample rate (>=192kHz; see Monty's https://people.xiph.org/~xiphmont/demo/neil-young.html and his newer excellent audio-myth videos), there's so little noise left to chase out to drive SNR higher, and THD is tiny. Many laptops and gaming motherboards have really, really good audio outputs, which cost maybe $4 more for the person building the system and deliver nearly flawless output, and even cheap stuff has gotten quite fine (engineers can still cut corners and create trashfire designs, especially on low-end systems, but it's gotten harder!).
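The "damned near perfection" point can be made concrete with the textbook quantization-noise result: an ideal N-bit converter driven by a full-scale sine has an SNR of about 6.02*N + 1.76 dB, so even plain 16-bit playback already brackets the dynamic range of any realistic listening environment.

```python
def quantization_snr_db(bits):
    """Theoretical SNR of an ideal N-bit PCM converter driven by a
    full-scale sine wave: 6.02*N + 1.76 dB (standard quantization-noise
    result)."""
    return 6.02 * bits + 1.76

# 16-bit CD audio is ~98 dB, already beyond any realistic listening room;
# 24-bit is ~146 dB, far past the limits of human hearing.
for bits in (16, 24):
    print(bits, "->", round(quantization_snr_db(bits), 2), "dB")
```

Real converters fall a little short of the ideal figure, but the gap between "cheap onboard codec" and "perfect" is now only a few dB.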

What's kind of exciting in the past half decade is that Intel realized that on-card processing needed effective commodification if it was to survive. Each chip/driver maker making their own bespoke solutions like back in the '90s was going nowhere, and the only effective pushback against forever doing more and more on the CPU was to make the sound hardware more usable and easier to implement consistently and well. To that end, they did the amazing work of making SOF (Sound Open Firmware), an implementable reference firmware anyone can use and ship for sound devices. It's a community effort now: an orchestrator/controller that implements common driver interfaces and figures out how to use a slate of DSPs effectively to do the job, which under the hood is what sound cards have always been. Now everyone can work together to use these DSPs effectively and well, whatever chips with whatever DSPs you happen to have. AMD is one other noted user; I forget who else.

https://thesofproject.github.io/latest/introduction/index.ht...


👤 djmips
People just use USB sound devices now. Things like Motu and Scarlett.

👤 fffobar
What does a sound card actually do, besides ADC and DAC?

👤 imwillofficial
Onboard sound cards got good

👤 keepquestioning
What DSP does the OP-1 use?

👤 martinowen836
If you look at the market for sound cards in 2019, you will notice a rather strange trend. Most people are not buying sound cards, which is why they are in decline. But at the same time, there is a crowd that swears sound cards are still relevant in 2019.

True, you can get some of the best sound cards on the market if you are willing to spend the money, but one thing most people don't realize is that outside a handful of niches, sound cards are not being bought by many people. This has convinced us to explore whether sound cards are still relevant or not. Honestly, there could be a lot of reasons why they are being phased out. However, is the situation bad to the point that sound cards will soon be considered relics of the past?

This is what we are going to explore in today's opinion.

Onboard Audio is Getting Better

Okay, let's be honest. Part of the reason why sound cards were created in the first place is that onboard audio had a lot of distortion, mainly because the components were placed close to each other. For the longest time, this was a huge issue that resulted in sub-par audio, and that is why companies like Creative banked on it and created a long range of sound cards, from cheap options to ones that cost a lot of money.

However, as time progressed, onboard audio only improved. So much so that companies like Asus started shielding the audio components on a separate layer of the board. This technique reduced the distortion by a great deal, and onboard audio started improving a lot, too.

Wireless Headphones are Taking Over

Back in the old days, wireless headphones, or any wireless peripheral, were simply not good enough to match the quality and fidelity of their wired counterparts. However, things have changed drastically, to the point that wireless technology has improved by a huge margin. With most gamers and general users now preferring wireless technology over wired, there is no denying that wireless headphones are taking over. They are convenient, introduce minimal input delay, the battery life is great, and most importantly, they come with built-in audio processing.

External DAC/Amp Combos Are Now Becoming the Choice

Another reason why the sales of sound cards are declining is that people are now going for options like external DAC/amp combos far more than they are going for sound cards. Sure, these combos are certainly expensive, but the good news is that the performance they deliver is actually a lot better than some might expect.

For starters, a Schiit Magni and Modi combo is going to be good enough to beat pretty much every single sound card available on the market. I know that people might want to invest money in a sound card, but when you are getting better overall quality from a DAC/amp combo, you do not really see the reason.

Sound Cards are Not as Versatile

This actually ties into the previous point. Simply put, sound cards are not as versatile; they never were, to begin with. However, back then, the needs were not as widespread. Internal sound cards, which most people go with, can only be used once they are plugged into a PCI Express slot on your motherboard.

However, with the DAC and amp combos that are widely available on the market, you really do not need to do that. They are plug and play, most importantly they are driverless, and they can work on pretty much any device that has the required ports.

Needless to say, sound cards are simply not as versatile, and that gives a lot of people a reason to stop using them and opt for something that serves them better.

Hope this helps.