Second GPU for VGA output

I’m planning a long-overdue build, replacing pretty much every component in my current system, except one… the CRT monitor, of course. I use it for emulation sometimes: perfect latency and response time, the high refresh rates are nice, the colors are nice and saturated, etc.

Ah, who am I kidding, I just keep it around for shits and laughs. Even so, I’m attached to it, and don’t plan to give it up. But here’s the issue: recent graphics cards don’t support analog output over DVI-I anymore (goodbye, RAMDACs), so the only way to connect it is with one of those crappy adapters from Amazon/AliExpress. I don’t think any of them support odd refresh rates, and I would be shocked if they operate without a framebuffer, so that’s no good. I’m also not interested in buying a Framemeister or another external scaler, mainly due to the added complexity and price.

My grand plan is, instead, to buy a GTX 750 or something from eBay and connect the CRT to that, while I drive my new giant LCD monitor from something a little more powerful. These two GPUs wouldn’t be in SLI or anything; my intention is a standard multi-monitor setup, using one GPU per monitor.

So in case anyone here has attempted something similar, I have a couple questions:

  1. Will this work at all?
  2. Is it a bad idea to mix AMD with NVIDIA? (Again, I’m not trying to crossfire the two cards, but I will presumably need both drivers installed.)
  3. For the VGA-output card: how old is too old? (As in drivers don’t exist anymore, etc.)
  4. Can the powerful card render into the weak card’s framebuffer? (If not, there might be some contrived edge case where the ancient card will bottleneck the system. It’s also fun to play recent games on the beast sometimes, but not a dealbreaker if that doesn’t work.)

Any and all knowledge is appreciated; this is my first multi-GPU build. Also, in case it’s relevant, I use both Linux and Windows (dual-booted) and the system is going to be Intel/Nvidia (just gaming; do I look productive to you?).

  1. It works. I have this sort of configuration.
  2. I’ve never used mixed card setups specifically for VGA support, but I’ve mixed cards for other reasons and had no problems.
  3. Generic drivers generally work fine for just basic video output. Nvidia seems to maintain driver support for older cards on newer operating systems longer than AMD does, if you care about features.
  4. It can be done in Windows using display mirroring, though it can take some messing around to get Windows to recognize the more powerful card as the ‘primary’ display. I’m sure it can be done more elegantly in Linux. Any way you do this is going to add latency to the output; whether or not that latency will be perceptible on an otherwise lagless CRT, I can’t say for certain.

For what it’s worth, I have had pretty good results with one ‘Accell’ brand HDMI to VGA adapter. I was able to use it to drive a CRT at 1600x1200 90 Hz with no issues (until the flyback transformer blew up; not the adapter’s fault). This was years ago, though. With those adapters, pay attention to the ‘video clock’ specification. Most of them are rated for something like 162 MHz, which I think is just barely enough for 1920x1200 60 Hz with reduced blanking. The adapter I used could do significantly higher than that, and as far as I could tell it was the only one on the market like that.
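If you want a rough feel for where that 162 MHz ceiling bites, here’s a quick back-of-the-envelope sketch. It’s my own approximation of the reduced-blanking rules (a fixed 160-pixel horizontal blank and roughly 460 µs of vertical blanking per frame), so treat the numbers as ballpark:

```python
# Back-of-the-envelope CVT reduced-blanking pixel clock estimate.
# Assumptions: fixed 160-pixel horizontal blank, ~460 us of vertical
# blanking time per frame (the usual reduced-blanking rules).
def rb_pixel_clock_mhz(hactive, vactive, refresh_hz, vblank_us=460):
    frame_s = 1.0 / refresh_hz               # time per frame
    active_s = frame_s - vblank_us * 1e-6    # time left for the active lines
    line_s = active_s / vactive              # duration of one scanline
    vtotal = round(frame_s / line_s)         # total lines, including blanking
    htotal = hactive + 160                   # RB fixes the horizontal blank
    return htotal * vtotal * refresh_hz / 1e6

print(rb_pixel_clock_mhz(1920, 1200, 60))    # ~154 MHz: just squeaks under 162
print(rb_pixel_clock_mhz(1600, 1200, 90))    # ~198 MHz: well over 162
```

Keep in mind that’s the reduced-blanking (LCD-style) floor; a CRT mode with its normal, longer blanking intervals needs an even higher clock for the same resolution and refresh.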


DisplayPort to VGA adapter?

Thanks, that’s really good to know. I forgot about generic drivers on Windows. And thanks for the tip about the video clock. My one concern is that I won’t be able to create custom resolutions without the mfg. drivers installed (CRU doesn’t work with generic drivers, apparently)… 60 Hz CRTs hurt my brain. But it’s still good to know I have options.
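On the Linux side, at least, custom modes shouldn’t depend on the vendor driver at all, since xrandr will take any modeline you hand it. As a rough illustration of what goes into one (this is not real CVT/GTF math, just ballpark blanking figures, and the `VGA-1` output name is a placeholder for whatever xrandr actually reports on your system):

```python
# Ballpark modeline generator for xrandr --newmode. NOT real CVT/GTF math:
# the ~25% horizontal / ~5% vertical blanking fractions are generic CRT-ish
# guesses, so treat the output as a starting point and adjust until the
# monitor syncs cleanly.
def modeline(hact, vact, hz):
    htotal = round(hact * 1.25 / 8) * 8        # total width, multiple of 8
    vtotal = round(vact * 1.05)                # total height incl. blanking
    hblank = htotal - hact
    hsync_start = hact + hblank // 4           # front porch ~1/4 of the blank
    hsync_end = hsync_start + hblank // 4      # sync pulse ~1/4 of the blank
    vsync_start = vact + 1                     # short vertical front porch
    vsync_end = vsync_start + 3                # 3-line vertical sync pulse
    pclk = htotal * vtotal * hz / 1e6          # pixel clock in MHz
    name = f"{hact}x{vact}_{hz}"
    return (f'"{name}" {pclk:.2f} {hact} {hsync_start} {hsync_end} {htotal} '
            f'{vact} {vsync_start} {vsync_end} {vtotal} -hsync +vsync')

# Feed the printed line to xrandr (output name is whatever `xrandr` lists):
#   xrandr --newmode <printed modeline>
#   xrandr --addmode VGA-1 1024x768_85
#   xrandr --output VGA-1 --mode 1024x768_85
print(modeline(1024, 768, 85))
```

In practice `cvt` or `gtf` will print a modeline for you if they’re installed; the hand-rolled version above is just to show what the numbers in a modeline mean.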

Also, welcome to L1T forums!

These adapters would certainly get a picture on the tube, but they aren’t exactly thoroughly specced. It’s hard to know how they would handle high refresh rates, or really weird resolutions (like 240x480@120Hz, or 480x3840@60Hz). The manufacturers also don’t publish the latency impact—one frame probably? They’re cheap enough that I could get a few to test out, but I would prefer the more-robust implementation found on a VGA card (especially since the adapters all likely use the same chip on the inside). But if none of this works out, I’ll give it a shot; maybe I’ll be pleasantly surprised.

Google “CRT Emudriver” and go buy yourself whichever compatible card you like.

Update! It turns out that graphics cards with native analog output are still being made, since OEMs can add the circuitry back in, even if it’s not part of the chipset proper. I’ve only been able to find two examples of this, both based on the RX 550. Sapphire makes a low-profile card with a DVI-I output, but it’s not available for individual purchase.

https://www.sapphiretech.com/en/consumer/pulse-rx-550-4g-g5-lp-1

The other option is everyone’s favorite GPU partner, Yeston (yes, that Yeston), with a suspiciously similar part:

I’m taking a chance on it; it’s cheap, power-efficient and, embarrassingly, more powerful than my current card. I’ll report back on whether it works or not.

No joke, I had thoughts of doing this exact build when you did, except at the time my 15" CRT was my MAIN monitor, with a secondary 8" CRT running off an HDMI to Composite adapter. I actually had a GTX 750 Ti at the time. Now I have a curved monitor as my main and a GTX 1650 Super. I plan on re-adding my 15" CRT to my setup with my 750 as a secondary card, so I’ll be following this. I hope you’re running a newer generation processor than me though, as I am still using a Core 2 Quad and not an i3-9 lol. I’ll tell you from experience that mixing AMD and Nvidia is bizarre; it made my other computer run wonky when we swapped in an AMD GPU from my buddy’s old PC. We had to uninstall all the drivers and run purely AMD, but it was slow as garbage (not a good AMD card lol), so we switched back to Nvidia. Hope your build is still going well, and sorry for rambling!

Did the Yeston RX 550 LP produce an analogue signal, or was it just an integrated active converter? Because AMD stopped making GPUs with analogue output from Polaris onward.

Hey, welcome to L1T! I lost my 2-factor codes and got locked out of my old account, so don’t mind the username.

Judging by the xrandr output, the latter. The Yeston would always show a DisplayPort output in addition to the VGA output, and any mode I added to the VGA output would get duplicated on the DP output.

I recently switched to a dongle-style DP-to-VGA converter from Icy Box and the single (GPU) life has been great. Several of these converters are basically zero-latency, and the one I have can handle even larger resolutions than the Yeston could. I’ve also had nothing but luck playing with non-standard timings, but YMMV. Playing games at high settings on my main GPU is a big plus too; no messing with Windows display mirroring, etc.

There’s a massive thread on Hardforum where all the FW900 guys talk monitors and share notes on these converters. There is one chipset that’s the absolute best, but it’s only in a few different products. There’s one from Delock, one from Icy Box, and one from Sunix. They can be hard to get depending on where you live, but all should work basically the same.