Linus sort of exposed the real-world power draw of GPUs versus what PSU specs claim, and it's kind of shocking.
My GTX 780 Ti "needs" 600 W but it's probably only pulling 250 W, and Linus's quad Titans or whatever were only pulling 600-800 watts. What's up with that?
So why do manufacturers have such high specifications??
Because people buy shitty PSUs.
They aren't saying that your card alone needs 600, they're saying that your entire system should have a 600 Watt PSU to support it. Including your fans, CPU, GPU, Motherboard, HDDs, everything.
Also, they need to put it at the high end, just in case. You know? If they said you only need a 300 watt power supply, and some idiot gets a 300 watt PSU but has, like, 4 HDDs and an 8350, it isn't going to work. And then the company could take a hit for it.
Ahhhh ok I see that makes sense.
Are there any peak or max wattage ratings for the GPU itself listed in the specs? Or do you usually just have to search the interwebs n stuff?
No matter what PSU you have, they all have the same types of connections. GPUs use a 4-pin (rare), 6-pin, and/or 8-pin connection. These connectors are standardized across all PSUs, so you don't really have to worry about voltages. It's a standard that everyone follows.
Does that answer your question?
My rig with a GTX 780 and a 2011 i7 pulls over 600 watts in games if it's graphics-intensive and I have K-Boost on. So their requirements are correct, and for me they're actually under what I need in certain situations: with intense GPU and CPU overclocking I have hit over 650.
Yeah I thought that it was the requirement for just the GPU though.
You'll just have to check around. See what PSUs others are using and such. The requirements they give are pretty trash.
No, manufacturers just want to make sure that people have enough headroom on their PSUs to properly power their GPUs. If you've ever used pcpartpicker, they give you a rough estimate of how many watts your system will actually be pulling from the wall once you select all of the parts that will be in your rig. You'll draw more power if you're overclocking though so it's important to be mindful of that.
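A pcpartpicker-style estimate is basically just summing nominal component draws and adding headroom. Here's a minimal sketch of that idea; all the part wattages, the 20% overclocking margin, and the 80% PSU loading target are illustrative assumptions, not real measurements:

```python
# Rough system power estimate: sum nominal component draws, add headroom.
# Every figure below is an illustrative assumption, not a measured value.
component_watts = {
    "cpu": 95,         # typical quad-core TDP
    "gpu": 250,        # high-end card under load
    "motherboard": 40,
    "ram": 10,
    "hdds": 3 * 8,     # ~8 W per spinning drive
    "fans": 4 * 3,
}

load_watts = sum(component_watts.values())
overclock_margin = 1.2                                  # assume ~20% extra when overclocking
recommended_psu = load_watts * overclock_margin / 0.8   # keep the PSU under ~80% load

print(f"estimated load: {load_watts} W")
print(f"suggested PSU:  {recommended_psu:.0f} W")
```

The division by 0.8 is why a ~430 W system still ends up with a 600+ W recommendation: the estimate bakes in the same headroom the manufacturers do.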
KSJAL is right. Basically, some people buy cheap, low-quality power supplies which don't provide their advertised power output on a continuous basis, or sometimes at all; in fact, some power supplies used to carry fake 80+ certs. So manufacturers stay conservative and cite a high number so they don't have to deal with all the problems.
Look at all the graphics cards that have a 6-pin and an 8-pin PCIe plug: those are rated for 75 W and 150 W respectively according to the ATX power specifications, and the PCIe slot itself is also rated for 75 W, though some cards violate that standard. That's a total of 300 W. Looking at reviews for even power-hungry cards like the 290(X) or the 980 (OC'd), you can see them sitting around the 250 W average mark.
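That connector budget is just addition; here's a minimal sketch using the slot and plug ratings quoted above (the function name is mine, not from any spec):

```python
# Spec-limited power a card can draw, per the PCIe/ATX ratings quoted above.
PCIE_SLOT_W = 75    # through the motherboard slot itself
SIX_PIN_W = 75      # per 6-pin PCIe plug
EIGHT_PIN_W = 150   # per 8-pin PCIe plug

def card_power_budget(six_pins: int, eight_pins: int) -> int:
    """Maximum in-spec draw for a card with the given aux connectors."""
    return PCIE_SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# A card with one 6-pin and one 8-pin plug, as described above:
print(card_power_budget(1, 1))  # 75 + 75 + 150 = 300
```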
Yet you see manufacturers like Gigabyte cite requirements as high as 600 W for a single card, which is absurd. Even if you fill your case with fans and hard drives and run a dinosaur CPU that chews through power, you're not going to get through 300-350 W. Furthermore, a lot of PSUs come in x50 W sizes, i.e. not 600 W but 650 W, so it's really more like 350-400 W of excess capacity for the rest of the system.
Another thing to consider is that PSUs degrade over time, and the amount of power they can provide continuously diminishes. This shouldn't be much of a problem with the high-quality branded PSUs made by reputable companies that offer a reasonable warranty period, but with cheap low-quality PSUs it's another story entirely. Not to mention that overclocking increases the draw.
- Low-quality trash PSUs
- Power degradation over time
Beheeemoth, somewhere up there you asked about the power requirements of the GPU itself. I haven't found that info in manufacturer specs, which is frustrating. I go looking for it in reviews of the GPU, and I'm usually able to find something.
Aren't all the components' logic controlled by switches? Where you have switches going on and off a gazillion times per second (CPU, GPU), the electrical current is going to fluctuate, so you have to take RMS into account. Basically, the reported power consumption is the peak value, not the effective RMS value, which is what we get when we measure consumption at the socket.
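To illustrate the peak-vs-RMS distinction being argued here, a toy sketch with made-up instantaneous readings (the sample values are purely illustrative):

```python
import math

# Toy comparison of peak vs mean vs RMS for a fluctuating draw.
# The samples are made-up instantaneous power readings in watts.
samples = [180, 250, 310, 220, 190, 300, 270, 210]

peak = max(samples)
mean = sum(samples) / len(samples)
rms = math.sqrt(sum(s * s for s in samples) / len(samples))

print(f"peak: {peak} W, mean: {mean:.0f} W, rms: {rms:.0f} W")
```

The peak (310 W here) is noticeably higher than either average, which is the gap the post above is pointing at; note, though, that RMS is conventionally applied to current or voltage rather than to power itself.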
Baz, what do you mean by "reported", reported by whom? It does fluctuate wildly, and the TDP numbers usually given (by, say, AMD) are an average at peak or near-peak sustained draw.
TDP is thermal: if a GPU has a reported TDP of 125 watts, that means the cooling unit must be able to dissipate 125 watts of heat to prevent the chip from overheating. It doesn't mean the power consumption is going to be 125 watts.
"Reported" as in whatever random numbers the manufacturers print on the boxes.
Yeah i'll have to look online I guess.
Really? Where do you think the energy goes if not heat?
Bad choice of words in my last post, my point was that TDP is a nominal, not an actual value.
Work. It takes energy to switch a transistor on and off; that's where the energy gets used. And since everything has resistance, that's where the heat is generated.
fun fact: your car's engine actually burns fuel containing 3x to 4x more energy than the horsepower it delivers. That extra energy is lost as heat. This is also why Teslas are more efficient overall, even though they're indirectly using fossil fuels: the power they run on was generated at a large facility, which doesn't suffer the huge heat losses that plague small internal combustion engines.
Look up the Second Law of Thermodynamics. It's awesome.
Hmm. To my understanding, "work" means manipulating the motion of an object, and I don't think there are any moving parts inside electronic components, so the reason power is consumed in electronics is that electric energy is converted to radiant energy.