I’m planning a long-overdue build, replacing pretty much every component in my current system, except one… the CRT monitor, of course. I use it for emulation sometimes: the latency and response time are basically perfect, the high refresh rates are nice, the colors are nice and saturated, etc.
Ah, who am I kidding, I just keep it around for shits and laughs. Even so, I’m attached to it and don’t plan to give it up. But here’s the issue: recent graphics cards don’t support analog output over DVI-I anymore (goodbye, RAMDACs), so the only way to connect it is through one of those crappy HDMI/DP-to-VGA adapters from Amazon/AliExpress. I don’t think any of them support odd refresh rates, and I would be shocked if they operate without a framebuffer, so that’s no good. I’m also not interested in buying a Framemeister or another external scaler, mainly due to the added complexity and price.
My grand plan is instead to buy a GTX 750 or something similar off eBay and connect the CRT to that, while I drive my new giant LCD monitor from something a little more powerful. The two GPUs wouldn’t be in SLI or anything; the intention is a standard multi-monitor setup with one GPU per monitor.
So, in case anyone here has attempted something similar, I have a couple of questions:
- Will this work at all?
- Is it a bad idea to mix AMD with NVIDIA? (Again, I’m not trying to crossfire the two cards, but I will presumably need both drivers installed.)
- For the VGA-output card: how old is too old? (As in drivers don’t exist anymore, etc.)
- Can the powerful card render into the weak card’s framebuffer? (If not, there might be some contrived edge case where the ancient card bottlenecks the system. It’s also fun to play recent games on the beast sometimes, but it’s not a dealbreaker if that doesn’t work; see the sketch after this list for what I’m imagining on the Linux side.)
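To clarify that last question, here’s the kind of sanity check I’m planning to run once the thing is built, for the Linux side only. It’s a rough sketch assuming an X11 session where both cards show up as separate providers and PRIME render offload is available; `DRI_PRIME` is the Mesa convention, while NVIDIA’s proprietary driver uses `__NV_PRIME_RENDER_OFFLOAD` / `__GLX_VENDOR_LIBRARY_NAME` instead, so treat the exact variables as placeholders rather than gospel.

```python
import os
import subprocess

def run(cmd, extra_env=None):
    """Run a command and return its stdout (empty string if the tool is missing)."""
    env = dict(os.environ)
    if extra_env:
        env.update(extra_env)
    try:
        return subprocess.run(cmd, env=env, capture_output=True, text=True).stdout
    except FileNotFoundError:
        return ""

def renderer(extra_env=None):
    """Pull the 'OpenGL renderer string' line out of `glxinfo -B`, if present."""
    for line in run(["glxinfo", "-B"], extra_env).splitlines():
        if "OpenGL renderer" in line:
            return line.strip()
    return "unknown"

# Both GPUs should show up here as separate providers.
print(run(["xrandr", "--listproviders"]))

# Whichever GPU owns the screen this terminal is sitting on.
print("default:  ", renderer())

# Ask for PRIME render offload onto the other GPU (Mesa's DRI_PRIME mechanism).
# With the NVIDIA proprietary driver the equivalent would be
# __NV_PRIME_RENDER_OFFLOAD=1 plus __GLX_VENDOR_LIBRARY_NAME=nvidia.
print("offloaded:", renderer({"DRI_PRIME": "1"}))
```

If the “offloaded” renderer string comes back as the big card while the window still lands on the CRT’s output, that’s more or less the behavior I’m hoping for; no idea yet what the Windows equivalent looks like.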
Any and all knowledge is appreciated; this is my first multi-GPU build. Also, in case it’s relevant, I use both Linux and Windows (dual-booted), and the new system is going to be Intel/NVIDIA (just gaming; do I look productive to you?).