Power usage of a "passed-through" GPU

I am beginning to investigate possible builds for a “Pass-through” system and I was wondering the following:

Say we have the following:

  1. Host with integrated GPU
  2. Host with integrated GPU + a non-initialized dedicated GPU
  3. Host with integrated GPU + initialized GPU

Supposedly, once you launch the VM, the drivers/utilities should handle power management on the dedicated GPU, which makes sense.
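
If you want to verify that claim rather than take it on faith, here is a minimal Linux-only sketch that reads what the kernel reports for the dedicated card: which driver owns it (vfio-pci when it's staged for passthrough) and its runtime power state. The PCI address `0000:01:00.0` is an assumption; substitute your card's address from `lspci`.

```python
# Sketch: check driver binding and runtime power state of the dedicated GPU.
# Assumes Linux sysfs; the PCI address below is hypothetical -- find yours
# with `lspci | grep -i vga`.
import os

GPU_ADDR = "0000:01:00.0"  # assumed slot, replace with your card's address
DEV = f"/sys/bus/pci/devices/{GPU_ADDR}"

# The "driver" symlink shows whether the card is bound to vfio-pci
# (ready for passthrough) or still held by nvidia/nouveau/amdgpu.
driver_link = os.path.join(DEV, "driver")
driver = os.path.basename(os.readlink(driver_link)) if os.path.islink(driver_link) else "none"
print(f"bound driver          : {driver}")

# "power/runtime_status" reports active/suspended; "power_state" (on newer
# kernels) reports the PCI power state, e.g. D0 or D3hot/D3cold.
for attr in ("power/runtime_status", "power_state"):
    path = os.path.join(DEV, attr)
    if os.path.exists(path):
        with open(path) as f:
            print(f"{attr:22s}: {f.read().strip()}")
```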

  1. Gives the lowest power usage (duh, it's all integrated).
  2. "Should" result in power consumption a couple of watts above option 1.
  3. "Should" result in the power consumption of option 1 plus the power consumption of the GPU (one way to sample that is sketched after this list).
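
To put actual numbers on options 2 and 3, you can average the board power over a window using nvidia-smi's query interface. Run it wherever the OS can actually see the card: on the host for option 2, inside the guest for option 3 (a passed-through GPU is invisible to host tools). This is a rough sketch and assumes the NVIDIA driver and nvidia-smi are installed where it runs.

```python
# Sketch: average GPU board power (watts) over a sampling window.
import subprocess
import time

def sample_power_w(samples: int = 30, interval_s: float = 1.0) -> float:
    """Average board power in watts over `samples` readings."""
    readings = []
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        # One line per visible GPU; sum them in multi-GPU setups.
        readings.append(sum(float(line) for line in out.splitlines() if line.strip()))
        time.sleep(interval_s)
    return sum(readings) / len(readings)

if __name__ == "__main__":
    print(f"average draw: {sample_power_w():.1f} W")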

I am asking since I have a Maxwell-based chip with a power unlock, which is great, and I want to keep it until there is a proper mainstream GPU that can handle 4K (from AMD, that is).

P.S. - Of course, running a VM will eat some more CPU and RAM power, but that should be negligible compared to a 400 W GPU.

I'm planning to test this once I get my STRIX 1080 next week or the week after; it will work in tandem with a GTX 1060, so I can see the power draw from two discrete GPUs in a passthrough setup and what happens when both run games.
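
For the two-card test, it can help to log per-GPU readings side by side while the games run. A small sketch that appends timestamped per-GPU power rows to a CSV (it assumes both cards are visible to the OS where it runs; for passthrough, that means logging inside the guest for the passed-through card and on the host for the other, then merging by timestamp):

```python
# Sketch: log timestamped per-GPU power draw to CSV; stop with Ctrl+C.
import csv
import subprocess
import time

with open("gpu_power_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "gpu_index", "gpu_name", "power_w"])
    while True:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=index,name,power.draw",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        now = time.strftime("%Y-%m-%dT%H:%M:%S")
        # nvidia-smi emits one comma-separated row per visible GPU.
        for line in out.splitlines():
            idx, name, watts = [field.strip() for field in line.split(",")]
            writer.writerow([now, idx, name, watts])
        f.flush()
        time.sleep(2)
```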

Not gonna use a Kill-A-Watt; I'm going to use something new Reed came out with.