Hey!
So, something I’ve been kind of nerdy about lately is system power consumption. I’d really like to live off of renewable energy as much as I can. But I also grew up as a PC gamer and can appreciate that sweet 120Hz+ gaming at high resolution, and I occasionally need to do heavier workloads like video editing on my system. (And I’m trying to do more of both of those.)
So there are a couple of ways to set this up, but basically, whatever gives the lowest power consumption for a given use case is ideal.
The Looking Glass tech that a lot of folks here are working on is super interesting to me. When I heard about it, I started to fantasize about using some kind of workstation card with four or more Mini DisplayPort outputs. Ideally, a cheaper AMD card that could drive cheaper FreeSync monitors at a high resolution and refresh rate.
The thought I had was, I could use this low-end workstation card for my daily desktop usage. But if I wanted to game, I could do the framebuffer copying from a high-end gaming card passed through to a VM and use FreeSync on the host’s workstation card. Hopefully I could “turn off” the high-end card when I’m not gaming and have it use little-to-no power.
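Here’s roughly how I was planning to check whether the gaming card actually reaches a low-power state once nothing is using it. Just a minimal sketch, assuming a Linux host and the standard PCI sysfs layout; the PCI address below is a placeholder, not my real slot:

```python
#!/usr/bin/env python3
"""Peek at a GPU's runtime power-management state via sysfs (Linux only)."""
from pathlib import Path

# Placeholder PCI address for the passthrough gaming GPU; get yours from `lspci`.
GPU_PCI_ADDR = "0000:0c:00.0"

def read_attr(dev: Path, name: str) -> str:
    """Return a sysfs attribute's contents, or 'n/a' if it doesn't exist."""
    try:
        return (dev / name).read_text().strip()
    except OSError:
        return "n/a"

def main() -> None:
    dev = Path("/sys/bus/pci/devices") / GPU_PCI_ADDR
    if not dev.exists():
        raise SystemExit(f"No PCI device at {GPU_PCI_ADDR}")

    # Which driver owns the card right now (e.g. amdgpu, nvidia, vfio-pci).
    driver = (dev / "driver").resolve().name if (dev / "driver").exists() else "none"

    print(f"device        : {GPU_PCI_ADDR}")
    print(f"bound driver  : {driver}")
    # 'control' needs to read 'auto' for runtime PM to engage at all.
    print(f"power/control : {read_attr(dev, 'power/control')}")
    # 'suspended' means the card has runtime-suspended into a low-power state.
    print(f"runtime_status: {read_attr(dev, 'power/runtime_status')}")
    # D0 = fully on; D3hot/D3cold = low-power states.
    print(f"power_state   : {read_attr(dev, 'power_state')}")

if __name__ == "__main__":
    main()
```

If runtime_status never leaves “active” (or power_state sits at D0) while the VM is shut down, I figure the card is still burning idle watts regardless.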
Do you suppose I would use more or less power on my desktop with the workstation card handling my daily desktop usage, or would I always be adding the high-end card’s “idle” power draw to the equation and be better off just using it as both my “host card” and “gaming card”?