My PC, as it is, runs fine IF I let my motherboard control the two 120mm fans cooling my GPU radiator. But once I connect a PWM fan adapter from the GPU to those two fans, the system loses video output under full load and/or while gaming.
Is this a power issue? I'm guessing that they do consume a little more power, but do 120mm fans consume THAT much more power than the stock GPU fans, or even the two 60mm/92mm fans that are currently being controlled by my GPU?
GPU is an R9 280X, cooled by an Antec Kühler H2O 620 + NZXT Kraken G10.
Uhh, I thought those fans are supposed to run off of the mobo? The fans you are using probably need a lot more power to run than the stock GPU fans, so it poops out.
I just wanted to try and see if they could be controlled by the GPU. Seems like a better idea to have them ramp up and down with the GPU loads. The fans in question are two Corsair SP fans that originally came with my H100i.
GPU fans are generally 5V, while 120mm case fans are 12V. You may be just barely over the starting voltage of those fans by connecting them to the GPU. Once the fans call for more power, the GPU fails because it cannot deliver it.
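If that's what's happening, a quick back-of-the-envelope check makes it concrete. This is just a sketch: the header budget and fan draws below are made-up example numbers, not specs for any particular card.

```python
# Rough check: does a fan header's power budget cover the attached fans?
# All wattage numbers here are illustrative assumptions, not real specs.

def header_can_supply(header_watts, fan_watts):
    """Return True if the header's rated budget covers the combined fan draw."""
    return sum(fan_watts) <= header_watts

# A small GPU fan header vs. two 120mm fans whose draw spikes under load:
print(header_can_supply(3.0, [1.0, 1.0]))  # idle draw: True, it fits
print(header_can_supply(3.0, [2.5, 2.5]))  # full-load draw: False, it doesn't
```

That matches the symptom in the thread: fine at idle, dies once the fans ask for more than the header can push.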
I see. Could increasing the power limit in AMD Catalyst resolve this issue? I might try that.
And I do think you're right, because when I set it to 100% fan speed, they both actually do spin at 100%. BUT that's only if the GPU is not doing anything.
But then again, what the stock fans require and what the GPU is able to deliver could be different figures. After all, the PWM headers on motherboards can power up to three fans. Or is it even more than that?
Honestly I didn't even consider that you could connect radiator fans directly to a card.
But yeah, the motherboard has tiny little power traces that can burn out a lot easier than the power wires that SHOULD be connected to the fans. Just plug them into the power supply.
Alternatively, you could use something like SpeedFan (I am not sure if it has this exact functionality, but there is a similar program that does if this one doesn't) to ramp the motherboard fans up with the GPU temps, and just move the motherboard fans to the GPU.
That would solve your problems without risking anything with the card.
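For what it's worth, the "ramp fans with GPU temp" behavior those programs provide boils down to a temperature-to-duty-cycle curve. Here's a minimal sketch of that idea, assuming a simple linear curve; the breakpoint values are made-up examples, and real software like SpeedFan reads the GPU sensor and writes the PWM duty for you.

```python
# Sketch of a linear fan curve: map GPU temperature to a PWM duty cycle.
# The breakpoints (40-80 C, 30-100% duty) are illustrative assumptions.

def temp_to_duty(temp_c, min_temp=40.0, max_temp=80.0,
                 min_duty=30.0, max_duty=100.0):
    """Map a GPU temperature (deg C) to a fan duty cycle (%),
    ramping linearly between the breakpoints and clamping at both ends."""
    if temp_c <= min_temp:
        return min_duty
    if temp_c >= max_temp:
        return max_duty
    frac = (temp_c - min_temp) / (max_temp - min_temp)
    return min_duty + frac * (max_duty - min_duty)

print(temp_to_duty(40.0))  # 30.0  (idle: quiet)
print(temp_to_duty(60.0))  # 65.0  (mid load)
print(temp_to_duty(90.0))  # 100.0 (full load: flat out)
```

A control loop would just poll the GPU temperature every second or so and apply the returned duty to the motherboard header.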
They may be rated at 12V, 0.25 amps, but the current draw when cold booting them could be much higher due to differences in the design of the fan motor...
If you think a minor little plug like this is going to be worth the potential cost of a new GPU, then go ahead. Not understanding a couple hundred dollars of equipment and then messing with it seems rather dumb to me.
You're right. I am completely ignorant when it comes to stuff like this. I do appreciate all of your advice!
Perhaps some Corsair SP120 Quiet Editions then? They're rated at 12V, 0.08 amps. The online calculator says that's less than 1 watt: 0.96 W, to be exact.
I've never used that SpeedFan software, and from what I'm reading online, most of those programs won't let you tie fan speed to GPU load. They usually go off of CPU load. But I don't know...
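The wattage figure quoted above is just P = V × I, which is easy to sanity-check yourself:

```python
# P = V * I: electrical power drawn by a fan at its rated voltage and current.

def fan_power_watts(volts, amps):
    """Power in watts from rated voltage (V) and current (A)."""
    return volts * amps

# Corsair SP120 Quiet Edition, per the 12V / 0.08A rating quoted above:
one_fan = fan_power_watts(12.0, 0.08)   # ~0.96 W
two_fans = 2 * one_fan                  # ~1.92 W for the pair on one header
```

Even two of these together draw under 2 W, which is why they're a plausible fit for a small header.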
It's an R9 280X; you just set the fan speed with Catalyst. SpeedFan is just a program that shows your fans' RPM and displays temps.
And, once again, just plug the fans into the power supply! Why get overly technical by having the fans match the GPU temp and adjust speeds when you can plug them into the 12V line and just have them ALWAYS keeping the GPU at its coldest at any given point?
Fans are designed to run at their top rated speed; they can't burn out from using the amount of power they are designed for. It's not like a car engine that will break from running at its max RPM for too long. Set them to the max and problem solved.
Because noisy PCs are annoying, even with earphones or noise-cancelling earbuds in your ears. The whole point of fan speed adjustment is to keep fan noise to a minimum.
Take, for instance, 25,000 RPM server fans: they are loud as shit. But since a server is usually locked in a closet or stuffed in a sound-deadened office room, who cares how noisy it is?