I want a sound card that does not exist

I want a sound card that does not exist (and doesn’t even need to generate sound!) My ideal sound card:

  • PCIe 1x form factor
  • presents itself to Windows as a 7.1 analog device
  • bitstreams 8 channel output over HDMI
  • generates a dummy video signal (required as HDMI audio does not exist without video)

Why?

  • I run a high refresh 1440p monitor and an audio receiver for sound
  • G-sync doesn’t work over HDMI
  • Cloning DisplayPort to HDMI locks you to the lowest mutually supported resolution and refresh rate (1080p 60Hz)
  • Setting extended desktop results in a phantom display and your mouse cursor getting lost in the Bermuda Triangle
  • My receiver doesn’t speak 1440p or high refresh rate
  • DD/DTS encoding reduces audio quality
  • DD/DTS encoding adds latency
  • Many games default to stereo when they detect an SPDIF connection, ignoring my mobo’s Realtek DTS encoder and Windows’ 5.1 channels
  • Many newer games lack the ability to manually set audio channel configuration (see above)
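To put rough numbers on the encoder latency complaint above: a real-time DD/DTS encoder has to buffer a full frame of samples before it can emit anything. A minimal sketch, assuming the standard Dolby Digital frame of 1536 samples and a typical DTS core frame of 512 samples at 48 kHz:

```python
# Lower bound on the buffering delay a real-time DD/DTS encoder adds:
# it must collect a complete frame before it can encode and emit it.
SAMPLE_RATE = 48_000            # Hz, standard for DD/DTS bitstreams

def frame_latency_ms(frame_samples, rate=SAMPLE_RATE):
    return frame_samples / rate * 1000.0

print(frame_latency_ms(1536))   # Dolby Digital frame -> 32.0 ms
print(frame_latency_ms(512))    # typical DTS core frame, about 10.7 ms
```

Actual encoders add lookahead and transport buffering on top of this one-frame minimum, so real-world delays are higher.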

Currently running a SoundBlaster Z, but it’s not ideal. Audio latency is much worse than Realtek’s DTS encoder. There is a very significant delay between the analog headphone output and the DTS stream.
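For what it’s worth, that analog-vs-bitstream delay can be quantified by recording both outputs and cross-correlating the captures. A minimal NumPy sketch, with a synthetic tone burst standing in for real recordings (the 480-sample offset is a made-up example value, not a measured figure):

```python
import numpy as np

def measure_delay(ref, delayed, rate):
    """Estimate the delay (ms) of `delayed` relative to `ref`
    via full cross-correlation of the two captures."""
    corr = np.correlate(delayed, ref, mode="full")
    lag = np.argmax(corr) - (len(ref) - 1)
    return lag / rate * 1000.0

# Synthetic demo: a windowed 1 kHz burst, delayed by 480 samples
# (10 ms at 48 kHz) in the second "recording".
rate = 48000
t = np.arange(rate // 10) / rate
burst = np.sin(2 * np.pi * 1000 * t) * np.hanning(len(t))
ref = np.concatenate([burst, np.zeros(rate)])
delayed = np.concatenate([np.zeros(480), burst, np.zeros(rate - 480)])
print(measure_delay(ref, delayed, rate))  # -> 10.0 (ms)
```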

The only PCIe sound card with HDMI passthrough I can find is the Auzentech X-Fi HomeTheater HD, which was discontinued around 2008…

Yeah… You may be better off looking for an updated HDMI 2.1 receiver instead. Then you could do a clone maybe. And 8 channel PCM audio without DTS. At least I think that’s part of HDMI 2.1.

I’m rocking the Denon X3700H. I have my 3440x1440@100Hz display as the main, and audio going over HDMI as a second video out at 4K resolution to the receiver (may not be necessary, but I didn’t want any compression happening). It’s also convenient because the receiver is set up to switch two inputs to my one output TV, so I can use the nearby TV as a second screen when I feel like it. Otherwise, yes, it is cumbersome/slightly annoying that there isn’t a way to transfer 5.1 or 7.1 channels at high bitrate without sending a video signal at the same time, since I believe optical can only achieve 5.1 with compressed audio.

Most integrated DACs for mobos suck and standalone sound cards are not worth it anymore.

I used to have an Auzentech X-Fi HomeTheater HD, which did that. It used to be the only (and expensive) way of playing high-def audio from Blu-ray until all the graphics cards got HDCP. If you can find one, you’ll need the unofficial drivers to make it work on Windows 10, as it’s from before DirectSound got killed.

Sound is an afterthought in a lot of games.
So even if you had that legendary sound interface, latency would likely not improve much, since the Windows audio stack is still in there.
I am not aware of a single game that supports ASIO (which would bypass all of the Windows audio fuckery).

BTW: What do you need surround audio for? Got some complex speaker setup?

HDMI really killed all the “experimental” systems with separate SPDIF outs per zone, etc.
HDMI ARC is kinda sorta circling back, but it will take forever until we get graphics cards that support that.

MADI or ADAT could work, if you found software to act as a virtual sound device that mediates from WDM, MME, or WASAPI (I am not sure what video games support audio-wise, just that ASIO is never supported).

I have a 1080Ti… doesn’t do HDMI 2.1. I’d also hate to toss a perfectly good receiver.

Surprised there isn’t a Chinesium card on Ali Express that does this. Guys have been griping about HDMI audio on the HTPC forums for 15+ years.

What about using another pci-e x1 gpu?

hey check this guy out, he’s made several sound cards. TexElec - YouTube

Maybe it’s just me and my weird ideas, but wouldn’t the USB-C output be able to solve all this with some sort of firmware update? I mean, USB-C and DisplayPort are in the same family, and I’d imagine the computer would see it as the same output. Then a fancy “active USB-C to HDMI” cable could solve the rest of the way to the receiver.

I mean, inventing a new cable and adjusting firmware sounds far easier and more manageable than inventing a whole new type of chip/card.

Or am I totally wrong?

Guess it’s hard enough to get AMD and Nvidia on board with that idea, but I absolutely see the issue (I used to work setting up home cinemas for 10 years some time back, so I’ve dealt a lot with these sorts of problems).

Thanks. Left a project idea inquiry on his store page.

USB-C display output would create a second display adapter in Windows and would have the same problems as connecting both DP and HDMI to the GPU: crippled DP output in desktop clone mode and a phantom display in extended desktop mode. Disabling the video adapter shuts off the HDMI audio device because HDMI audio is inserted into the HDMI video stream.

They make HDMI embedder boxes that insert optical and analog audio into HDMI video streams. What I need is something like that with an 8 channel USB audio device on the input side.
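For illustration only: the input side of such a box would just consume standard interleaved 8-channel PCM, the same as any USB Audio Class device. A quick sketch of that frame layout, assuming the usual 7.1 channel order (FL, FR, FC, LFE, BL, BR, SL, SR) and 16-bit samples:

```python
import struct

NUM_CHANNELS = 8  # 7.1: FL, FR, FC, LFE, BL, BR, SL, SR (assumed order)

def interleave(frames):
    """Pack per-frame lists of 16-bit samples (one per channel)
    into the interleaved byte stream a PCM device consumes."""
    out = bytearray()
    for frame in frames:
        assert len(frame) == NUM_CHANNELS
        out += struct.pack("<8h", *frame)  # little-endian signed 16-bit
    return bytes(out)

# Two dummy frames: 2 frames * 8 channels * 2 bytes = 32 bytes
pcm = interleave([[0, 1, 2, 3, 4, 5, 6, 7],
                  [7, 6, 5, 4, 3, 2, 1, 0]])
print(len(pcm))  # -> 32
```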