750 W for 2x 7970?

Hi everyone,

In my current system I have a 750 W Bronze-certified power supply (it performs more like Silver because of the 110 V vs. 230 V efficiency difference). Anyway, I have a 3570K at 4.2 GHz with 1.16 V max, an SSD and two HDDs. (I really don't know how I should take all of this into account, as I don't think they draw a lot of power.)

My current GPU is a HIS 7970 IceQ GHz Edition, and I want to eventually add an R9 280X as my second card (HIS has one with the same cooler as my 7970).

Do you think my PSU is sufficient for that, or should I upgrade? I'd really rather not upgrade, as my PSU was quite expensive and getting a bigger one would cost even more.

I checked how much power my system draws under full load the other day, and with two monitors (19" 4:3 and 24" 16:9) it peaked at around 550 W at the wall while running FurMark and Prime95.
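For what it's worth, here's my rough attempt at a per-component budget. I'm only guessing at the numbers (something like 100-120 W for the overclocked 3570K, ~250 W board power per card, ~50 W for drives, board and fans), so correct me if they're way off:

```
# Rough per-component budget (these are guesses, not measurements)
cpu_w = 120   # i5-3570K at 4.2 GHz, overclocked estimate
gpu_w = 250   # typical board power for a 7970 / R9 280X class card
gpus = 2      # current 7970 plus the planned 280X
rest_w = 50   # SSD, two HDDs, motherboard, fans

worst_case_dc = cpu_w + gpus * gpu_w + rest_w
print(f"Worst-case DC load: ~{worst_case_dc} W")  # ~670 W; real gaming loads sit well below this
```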

I'd be glad to get some tips on this.

Greets

EntroP

More than enough.

https://teksyndicate.com/comment/reply/144195/1445704

Look for yourself.

That would be fine, and there would be room for upgrades as well. I would recommend the RM750, since you can pick and choose which cables you connect: if you don't need Molex connectors, you can just leave the Molex chain out. Good for cable management. Best PSU I've ever had!

 http://hexus.net/tech/reviews/psu/60449-corsair-rm750/


Uhm, yeah. I would still go for an 850 W 80+ Gold. Seasonic, of course: tightest voltage regulation on the market, robust components that can handle overload, a silent fan, and overall build quality that is top-notch. Not something you'd throw into a tight budget build.

Uhm, yeah.

That is from the wall, not what is going to the system. Accounting for the efficiency of the unit, he was pulling 701 W, IIRC. A 750 W unit is fine.
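Rough math, if you assume the unit is somewhere around 88% efficient at that load (just an estimate; check the efficiency curve for your exact model) and roughly 250 W board power for the second card:

```
wall_w = 550         # measured at the wall, FurMark + Prime95, one 7970
efficiency = 0.88    # assumed; roughly Bronze/Silver territory at this load
second_gpu_w = 250   # assumed board power for an R9 280X

dc_now = wall_w * efficiency      # ~484 W actually delivered to the components
headroom = 750 - dc_now           # ~266 W left against the 750 W rating
print(f"DC load now ~{dc_now:.0f} W, headroom ~{headroom:.0f} W vs ~{second_gpu_w} W for the second card")
```

And that headroom is measured against a torture load; games never hammer the CPU and GPUs the way Prime95 plus FurMark do.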

On a side note, how is 220 V more efficient?

I personally went with OVERKILL on my PSU but got a SCREAMIN' deal on it so game on? :P

I got a 1000 W 80+ Gold for an 8350 OC'd to 4.7 GHz with two 7970s.

Think of it this way: volts are how many punches you are throwing, and amps are how hard you are punching. More voltage (220/230 V) compared to 120 V means more potential, so you have a 'higher chance' of being able to do something. Like throwing a million punches at 220 V versus half a million punches at 120 V; you just have more to work with, so less gets wasted.
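If you want actual numbers instead of punches: for the same power draw the input current scales as I = P / V, so higher mains voltage means less current on the input side. Whether that shows up as a measurable efficiency difference on a given unit is the part people argue about; here's a quick sketch:

```
# Same wall power, different mains voltages: I = P / V
power_w = 550
for volts in (120, 230):
    amps = power_w / volts
    print(f"{volts} V mains -> ~{amps:.1f} A of input current")
# 120 V -> ~4.6 A, 230 V -> ~2.4 A: about half the current for the same power
```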

Seeing as an 850 W isn't that much more than a 750 W, I would be inclined to go bigger, just to keep my mind at ease. And if money is the problem, you can get an OEM version.


http://www.newegg.com/Product/Product.aspx?Item=N82E16817151100

Either way, both are being converted to DC. It shouldn't matter how much voltage is coming out of the outlet; you can get the same results with either. Assuming whoever did the electrical work did it to current code, it should not matter.

+1. It really is a common misconception (I made the same mistake too, which started a lot of confusion).

Also keep in mind that 701 W was the complete torture limit, while using overclocks. You should totally be fine with 750 W.

PC build
http://s1010.photobucket.com/user/tgunsg35/slideshow/PC%20build

I personally just spent the money. My fans run stronger too. As you can see in the first picture, I had an RM750, which did the job, but the AX1200 gives me more headroom.

So 750 W, yes, but you're always better off with a little overkill.