I have an MSI GTX 1080 Sea Hawk. The monitor I use is the ROG Swift, 144Hz 1440p G-Sync.
Is it worth getting another 1080 for maximum performance? Do enough games make use of SLI to justify it?
if you want
i just did some 1080 vs 1080 sli tests like a month ago
lemme link it
i took my second card back
but now that im pretty hardcore into Folding@home that extra 700k-900k ppd is pretty enticing
Well it's really hard to say if it's really worth it or not.
Because that is really a matter of personal preference.
I personally don't really think it would be worth it for anything less than 4K gaming.
If you want more performance, your best option would be to grab a second 1080. If you don't need more performance, then don't buy another. Nobody can tell you if it's worth it to you, that's personal.
what about boinc?
ive not used boinc on this card yet
all i can say is on my watercooled asus O8g strix gtx 1080 (2100mhz)
i sit at 41°C at 99% usage and get UP TO 900,000 ppd in F@H
That's pretty fricking sweet.
team 231300 is barnacles nerdgasm
we are really rising up in the global ranking
just with this gtx 1080 and about a week ive blazed past almost the top 800 uploaders on the group leaderboard
i think we are second to linus tech tips still tho....
to get it to run the correct workload
you gotta enter your username and passkey from F@H into your F@H client
just entering your proper passkey will enable the quick return workloads
if you see FAHCore 0x21 then you're golden for the high ppd workloads
fyi every second you have it paused the ppd estimate drops
in case you saw the ppd est in those shots and were wondering why it was so low
you might as well just REMOVE the CPU slot
the gpu will suck up less electricity and also waaay outshine the cpu workloads
unless you're using a crazy server chip there's no mathematical reason to eat all that extra power
you can see here on my screenshot all it has under "slots" is the gpu core
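if you wanna sanity check it without opening the gui, heres a rough python sketch (not official F@H tooling, and the config.xml path and element names are assumptions based on a typical windows install) that peeks at the config to confirm a passkey is set and only a gpu slot is configured:

```python
# rough sketch: inspect the F@H client's config.xml
# path and element names are assumptions, adjust for your install
import xml.etree.ElementTree as ET

CONFIG = r"C:\Users\you\AppData\Roaming\FAHClient\config.xml"  # assumed location

root = ET.parse(CONFIG).getroot()

# username + passkey are what unlock the quick-return bonus workloads
user = root.find("user")
passkey = root.find("passkey")
print("user:   ", user.get("value") if user is not None else "NOT SET")
print("passkey:", "set" if passkey is not None and passkey.get("value") else "NOT SET")

# list the configured folding slots; ideally only a GPU slot shows up here
for slot in root.findall("slot"):
    print("slot", slot.get("id"), "->", slot.get("type"))
```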
Yeah it blows my mind how much faster a GPU is at crunching things. I think when they had some of these BOINC projects in mind they figured most lab rats don't have access to a GPU and most computers are Dells with 2-4 cores at best in a university office.
for real tho
ive got a 12 thread i7-6800K
and its ppd was jack shit
i think it was like 10k or something (honestly dont remember)
back before i realized my gpu was only doing roughly 100k ppd
and figured out how to enable optimized workloads (up to 900K ppd)
even back then (lol like a week ago)
i decided just to kill the cpu workloads and only use gpu
my power usage is just shy of 300 watts with gpu and idle everything else
and closer to 500 with the cpu + gpu (for me that'd be like $55 usd a month in power)
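quick back of the envelope on that, in python (the $/kWh rate is an assumption, plug in whatever your utility actually charges):

```python
# rough 24/7 folding power cost; the electricity rate below is an assumption
RATE_USD_PER_KWH = 0.15   # swap in your local rate

def monthly_cost(watts, hours=24 * 30):
    kwh = watts / 1000 * hours          # energy used in a month
    return kwh * RATE_USD_PER_KWH

print(monthly_cost(300))   # gpu only, ~300 W  -> ~$32/month
print(monthly_cost(500))   # cpu + gpu, ~500 W -> ~$54/month (the ~$55 figure above)
```

dropping the cpu's extra ~200 W is where most of that difference comes from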
LOL if i actually use my radiator fans
the gpu temp drops to 36°C
lmao
Actually killing CPU workloads seems like a smart move. Cheaper on the bill, less heat, I can run my shit in Quiet mode.
Literally the only reason I bought an i7 was to BOINC (and do lightroom rendering and handbrake).
Yeah killing cpu would save me like 200+ watts of juice
And almost 25 bucks a month
There's no such thing as too much radiator
Uploading to Mega, playing Pandora and background junk, folding on the gpu.
Power consumption is 310-350 watts
Let me start by saying: any CPU currently on the market will bottleneck two 1080s in a lot of games, especially open-world ones, since roughly 95% of games are optimized for 4 cores at most. The diminishing returns would be ungodly. The only games where you would see decent returns are things like Tomb Raider. So in my opinion, no, it is not worth it.