Will the Power Consumption Myth Ever Die?

Do you guys think the power consumption myth will ever die?  I happened to be looking through the YouTube comments on the latest 960 videos and couldn't help but notice one thing.  Whenever someone recommended an AMD card that actually outperforms the 960, someone would always reply that the 960 would save you money in the long run due to power efficiency.  That is a bunch of crap; you would have to have a GPU running at FULL LOAD FOR 20 YEARS for that to be true.  It just makes me mad that people fall for marketing and fanboyism when there are better cards you can buy.

It's not really a myth. It's based on math and is real. But to what extent? That depends on how much you'll use the card at full load over its life and how much money you're saving by buying red team.

https://www.youtube.com/watch?v=4et7kDGSRfc#t=9m

Watch from the 9-minute mark.

So long as marketing departments exist, no. People (the masses) are gullible idiots and will readily eat up whatever crap you send out as long as the source is semi-official.

The same thing happened with the 8350/9590 vs the i7; it turned out to be 6 cents a year more, or something equally small.

Nvidia's thinking with the 900 series is right, but they're completely missing the point of massively powerful GPUs. They're supposed to compete with the competition, not barely beat (or fail to beat) your own previous generation. They tried to be forward-thinking and ended up not making cards good enough to stop AMD's 290. Graphics performance first, then power saving, not the other way round.

It's certainly not a myth; there is a difference in what you pay for electricity between a GTX 960 and an R9 280 or 290.

That said, the price difference is so tiny that you'd have to run it 24/7/365 to see a real difference.

Furthermore, I think I'll repeat what others have said: If 50w is really such a big deal, replace some of your lightbulbs with LEDs.

I will say that the power consumption numbers on the 900 series and 750/750 Ti are all quite impressive - but just not big enough to justify buying them over a similarly priced but better performing AMD card.

Look at the 960 vs 290 for example.

Assuming $200 960 and $250 290, $50 more gives you:

  • double the VRAM
  • four times the bus width (not an entirely fair comparison, since Maxwell has color compression, but I doubt compression makes up for having a quarter of the raw bandwidth)
  • 37%-47% better performance, depending on the game

Downsides:

  • The 290 consumes considerably more power

and even then, aftermarket coolers easily handle the increased heat output, and it won't make a huge difference to your electric bill.

Despite this, the 960 will sell like hotcakes, because it's branded Nvidia and has low power consumption.

>the 960 will sell like hotcakes.

I'm a cashier @microcenter and some idiot bought a $260 GTX960 today. what the hell

I want to kill this before it comes up but

TDP ≠ power consumption.

TDP is the thermal energy released by the processor when it's running at full strength.

So an R9 290X appears to cost ~$310 for the cheapest "good" card, and the GTX 970 is ~$340 for the same brand/style, so a difference of $30. The power consumption difference is apparently about 52 watts while gaming. Using the average cost of electricity in the US, $0.12/kWh, that comes out to a difference of $0.00624 per hour of gameplay. $30 divided by $0.00624 comes out to about 4,808 hours of gameplay to break even. If you kept the card for three years, that would mean about 1,603 hours of use per year, or roughly 4.4 hours per day (about 3.3 hours per day if kept for four years).
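For anyone who wants to plug in their own numbers, here's a quick Python sketch of that break-even math. The $30 price gap, 52 W gaming delta, and $0.12/kWh rate are just the rough figures assumed above, not measurements of mine; swap in whatever your cards and tariff actually are.

```python
# Break-even math: cheaper-but-hungrier card vs. pricier-but-efficient card.
# All inputs are the rough figures assumed in the post above; adjust to taste.

def breakeven_hours(price_gap_usd, watt_delta, usd_per_kwh):
    """Hours of gaming needed before the power savings repay the price gap."""
    cost_per_hour = (watt_delta / 1000.0) * usd_per_kwh  # kW * $/kWh = $/hour
    return price_gap_usd / cost_per_hour

hours = breakeven_hours(price_gap_usd=30, watt_delta=52, usd_per_kwh=0.12)
print(f"Break-even: {hours:.0f} hours of gaming")            # ~4808 hours
for years in (3, 4):
    print(f"Over {years} years: {hours / (years * 365):.2f} hours/day")
```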

I know people who would definitely go over that, and I know people who would go under it. So if somebody games pretty regularly and doesn't upgrade cards more than once every few years, I think the power consumption argument is valid, albeit only a difference of a few dollars at most.

But TDP is in watts, so that's how much power it draws, obviously.

But seriously, marketing needs to start advertising this correctly. They don't realise the masses don't know that watts are being used for two different things here (electrical power draw and heat output), and they need to start labeling things more clearly and correctly.

Also, it should be noted that your aftermarket GPU with its higher clock speed is using more power and putting out more heat than what's advertised, because these cards usually ship with a higher clock rate than the stock configuration the advertised TDP is based on.

Another thing on TDP: it doesn't matter what your cooling is (stock, large air, or water), the amount of thermal energy that gets removed from the system and put into your room is exactly the same; the difference is how efficiently the cooler can move it away from the heat-producing component.

But, to play devil's advocate for a second here, we generally replace our equipment when it becomes apparent that it just can't keep up any more. I know some people replace their stuff every X years, but most people replace their stuff when there's a game they want to play that their current equipment can't handle very well.

So, if the lower-performing card performs well for 4 years but the better-performing card lasts you longer than that, then getting better performance trumps power efficiency, because you're replacing several-hundred-dollar equipment less often.

Really, what nVidia needed to do was cut down the power consumption of their cards to be more efficient and then use that extra thermal headroom to make more powerful cards.

The 960, or a subsequent 950, might cut it in SFF builds, because lower power consumption means less heat, which makes it easier to keep your entire system cool.

GTX 980 TDP: 165W    (@stock)

R9 290X TDP: 290W    (@stock, no uber)

That's a difference of 125 thermal watts

While it's actually a difference of 61 electrical watts

 

People are gullible, so this marketing bullcrap will never stop.

The review sites are really helping to spread this. Look at techpowerup's conclusion of the Asus Strix:

"In terms of pricing, NVIDIA has set a $200 MRSP, which is very reasonable, but not good enough to take over the price/performance crown in this segment, which AMD has covered with such competitively priced cards as the R9 280, R9 285, and the R9 280X. [..] However, the integral difference to me and a ton of users is that NVIDIA's new GTX 960 is so very power efficient, which makes it run much cooler and quieter than AMD's cards."

I lol'ed.

The fanboys r reel.

I also wish I could work at a microcenter.  Too bad the nearest one is like 300 miles away :(.

I think it is VERY important that graphics card manufacturers take the TDP/performance ratio into account, and when one manufacturer does it better than the other, I would surely take that into consideration. Most of the time that is NVIDIA or Intel.

This power consumption "myth" isn't only about the few cents you can save by going Green over Red on your electricity bill. Framing it that way is how Team Red likes to think about it, since it serves their position. As a side note: who gives a shit about a company that can't keep up with changing consumer standards? They chose to enter this 'competitive game'.

Lower power consumption serves more purposes than saving a couple of cents on your bill. I can save a couple of cents AND help the environment? I think I have made my choice. 

With electricity prices here in Oz, every bit of power saving is crucial. Going from SLI'd 760s to a single 980 has cut my system's power consumption in gaming to less than half. That 150W+ saving (measured at the wall) adds up over a billing period.
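To put a very rough number on "adds up": here's a small sketch, assuming something like 4 hours of gaming a day and the ~30 US cents/kWh AU-level rate quoted elsewhere in this thread. Both of those are assumptions, not figures from the post, so plug in your own tariff and habits.

```python
# Rough quarterly saving from a 150 W wall-measured reduction while gaming.
# Hours per day and the tariff are assumptions; use your own bill's numbers.
watt_saving = 150          # reduction measured at the wall while gaming
hours_per_day = 4          # assumed gaming time per day
rate_per_kwh = 0.30        # assumed AU-level tariff, in US dollars
days_per_quarter = 91

kwh_saved = watt_saving / 1000 * hours_per_day * days_per_quarter
print(f"~{kwh_saved:.0f} kWh saved per quarter, roughly ${kwh_saved * rate_per_kwh:.2f}")
```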

It's not a myth. It depends on where you live. There are countries (like where I live) where electricity is very expensive, the economy is terrible, and graphics cards cost anywhere from half of to two or three times the average monthly salary. People in those countries will take every aspect of the GPU into consideration, including power consumption, because they'd like to keep the power consumption of the entire system as low as possible. Every dollar/euro or whatever counts when you're constantly on a budget.

From those charts it's actually less than that in real use-case scenarios. The idle power difference is 15 watts. The total SYSTEM draw difference at full tilt is 61 watts, and that includes the CPU and other components as well. So if your card is pushing a sh*^& ton of frames, physics, and other GPU tasks, it will talk to the CPU and other components more, increasing their overall power draw too. Notice how the 680, that inefficient two-GPU behemoth, is actually the card with the least power draw? This can be mitigated with vsync or frame capping to ensure the GPU and other components aren't wasting resources drawing frames that will never appear on screen.

Once they move to a GPU-only task, Furmark, the difference is only 31 watts. That's at full tilt all of the time. Even then, it would take 100 hours for that to add up to about 3 kWh, which even with high electricity rates like AU's ~30 US cents per kWh would cost you less than a dollar US.
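The arithmetic there is just watts times hours. A quick sketch using the 31 W Furmark delta from above and an assumed ~$0.30/kWh rate:

```python
# Energy cost of a 31 W draw difference running flat out for 100 hours.
watt_delta = 31      # GPU-only (Furmark) difference quoted above
hours = 100
rate_per_kwh = 0.30  # assumed AU-level pricing, in US dollars

kwh = watt_delta / 1000 * hours                       # 3.1 kWh
print(f"{kwh:.1f} kWh -> ${kwh * rate_per_kwh:.2f}")  # about $0.93
```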

But we don't spend all day with our gaming cards at full load, even when gaming, so the real number is likely to be much closer to idle than to full load.

So 290X prices currently range from $300 to $530 in US markets.

980s go for $550 - $700 in US markets.

So... around $250 between the two on average. That's thousands of hours of gaming before you even dent that price difference. And that's using AU power rates, which are about 3x higher than US average rates; ours run somewhere between $0.07 and $0.12 per kWh depending on locale. (Sorry Aussies, I'd recommend moving here, but shit's not great here either... mass migration to Sweden, anyone?) It becomes a silly amount of time you'd have to spend to actually save money.
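"Thousands of hours" isn't an exaggeration. A minimal sketch, reusing the 61 W whole-system gaming delta and ~$0.30/kWh AU-style rate quoted earlier in this thread (both carried-over assumptions, not new measurements):

```python
# Hours of gaming needed for a 61 W difference to pay back a $250 price gap.
price_gap = 250      # rough average price gap, 290X vs 980
watt_delta = 61      # whole-system gaming draw difference from earlier comments
rate_per_kwh = 0.30  # high AU-style rate; US rates stretch this out even further

hours = price_gap / (watt_delta / 1000 * rate_per_kwh)
print(f"~{hours:,.0f} hours of gaming to break even")  # ~13,661 hours
```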

Yup.

you say 290vs980? :P

I don't like Nvidia cards, but what this thread clearly points out is that, as of right now, the two manufacturers' flagship lines are a bit more specialized than they have been before, allowing people to choose a bit more easily than in the past.

If GPUs are so expensive, then why are you not choosing based on performance? Don't you want the best gaming experience for your money?

Those myths never really die. I mean, hell, AMD GPUs still have "no drivers" or "tons of driver problems" (when was this last relevant and widespread? In '07?). The R9 290(X) is still "hot and loud" even though aftermarket coolers handle them just fine and they're comparable to GTX 780 (Ti)s and Titans. People will keep digging stuff up and the internet will never forget; just ignore fanboys and try to educate people who simply don't know better.

Power consumption on a desktop PC is basically irrelevant. As long as the heat is under control and you're not running your PC at 100% load 24/7, it really doesn't matter. It's especially amusing when the people who toot the power efficiency horn mention "overclocking to the max" in the same breath. When you're running a high-end GPU it's irrelevant. Performance per watt might be extremely important for datacenters and servers, but it's really not a huge deal for your average PC gamer.

I'll admit that reduced temps and lower noise (some benefits of lower power consumption) are nice, but that's about it.