Passive GPU options?

I’m having problems upgrading my workstations to AM5 - the Radeon RX 6400 that I equipped with Accelero S3 are barely fitting on the new motherboards.

I started looking at options and it turns out nVidia has had a passively cooled GeForce 3050 for a couple of years now! How did this slip past me? Probably because the proprietary Linux drivers are nowhere near as polished as amdgpu and they’ve been stuck on deprecated xrandr for over a decade now, which has conditioned me to automatically ignore nVidia news.

Are there seriously no options from AMD?

I own an ATI X1300 card that’s passively cooled.

Do you know how old that is? Even the passive 460 I keep just in case is many times faster than that.

I saw a video yesterday where they undervolted the 4080 and it got basically the same performance as stock without even running the fans. Basically modded into a passive card.

What AIB partner had the 3050? Most AIBs don’t touch passive cooling in this day and age unless the card is specifically made for home theater.

Palit: aHR0cHM6Ly93d3cucGFsaXQuY29tL3BhbGl0L3ZnYXByby5waHA/aWQ9NTE0NyZsYW5nPWVu - unbase64 that for the link, since I’m not allowed to post links for some reason.
Their KalmX series seems to really enjoy looking the same with each release - I have the 750 and the 1650, and they look exactly like the 1050 and the 3050.
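For anyone who doesn’t want to hunt down an online decoder, the obfuscated link above is plain standard base64 and a couple of lines of Python will recover it:

```python
import base64

# The base64-obfuscated Palit link from the post above.
encoded = "aHR0cHM6Ly93d3cucGFsaXQuY29tL3BhbGl0L3ZnYXByby5waHA/aWQ9NTE0NyZsYW5nPWVu"
print(base64.b64decode(encoded).decode("utf-8"))
# → https://www.palit.com/palit/vgapro.php?id=5147&lang=en
```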

Here’s the thing…
I have a 6700 XT. It has not spun its fans in a few months.
It’s the triple-slot white 6700 XT Hellhound by PowerColor.
I have set the fan curve to start the fans at 70C, and my GPU has yet to hit 70C.
However, that requires good case airflow, so it’s a bit of a stretch whether you’ll be able to do the same.
You may have the same config as mine, but a worse case could push your GPU thermals much higher.

Passive GPUs are pretty much non-existent at this point…

However that requires good case airflow

That’s a tall order. I’m servicing my computer, which is in a 4U case, and my wife’s desktop, which sits in a medium ATX tower. All fans are removed, so there are no moving parts except the Blu-ray drives. We’ve been able to get by very comfortably on AM4 hardware with those setups. The 5700X’s declared 65W TDP feels genuinely honest, while the 7900’s claim of 65W feels really understated.

I’ve managed to get the 6400 back in without the reinforcements - turns out the card is so baked that the thermal pads have cemented the backplate to it, so there’s no more need for the reinforcement brackets that kept colliding with everything on the motherboard - and now it works. It’s just as hot as before. But still - hardware that does the same or more for less TDP is the hardware that I want. I don’t have free electricity, for goodness’ sake…

It’s been years since I’ve seen a passively cooled AIB-built GPU.

The Palit one OP replied with for the 3050 is the first I’ve seen in at least 5 years. It’s unfortunate how Capitalism and technological advances shun the niche market. There IS a market for it, but nVidia and AMD don’t want to make products for it. Intel could swoop in on the low-power market, but they won’t. Again, Capitalism vs. niche.

Edit: I meant to start this off sympathizing about electricity bills. It’s fixed where I am, but I know it’s not for most of the world. May I suggest GamersNexus on YouTube for power consumption in their reviews?

Is this based on temperature reporting or power consumption? Because TDP (“Thermal Design Power”) has little to do with actual power consumption or thermals, from AMD or Intel alike. It’s an odd advertising metric that will always be disingenuous, at least these days; whether it meant something in the past or will in the future, who knows? Video cards have a whole different metric, TBP (“Total Board Power”), and (thankfully, because they don’t have to) AIB partners tend to state what wattage power supply is recommended for their variant of a given graphics card. Why is TBP not used for CPUs? Because a CPU is not a daughter board.

It would be fantastic if marketing could work with actual engineers to come up with a proper metric, but I don’t see that happening.

So far I’ve based my buying and design decisions on TDPs, and this used to be a reliable metric.
2017: got a Ryzen 2700 and a CR-80EH to match, with a margin of 15W - works great
2021: wanted to upgrade from the Radeon 460 (the passive one), so I got a 6400 with an Accelero S3 to match, with a margin of 76W
2022: got my wife a Ryzen 5700X and another CR-80EH to match, again with the same margin, and I also built her a passive 6400 with another S3 - worked superbly
2024: got myself and my wife a pair of Ryzen 7900s, and here we are, already regretting going AM5

AMD is undoubtedly better on power consumption, but this is exactly why there should be an industry standard and not arbitrary “marketable” numbers. It’s a massive pain in the ass - and don’t be mad only at AMD, because Intel does it too, though Intel is struggling to compete at this point because they willfully stagnated for a decade.

I know that it’s not easy for you, possibly being from somewhere where power consumption concerns are a literal day-to-day subject. The only thing that I can say is that we ALL live in a day and age where knowledge is more powerful than ever. :face_with_diagonal_mouth:

Intel has a very interesting interpretation of TDP, which is why I’ve been using AMD for the last 20 years.

Can’t sell 4 cores forever.

Thank you very much for your understanding. You know what, power consumption isn’t that big of a deal - of course it’s a good reason, but ultimately my entire passive quest is the result of a grudge. 12 years ago I had a Bitcoin miner powered by 2 XFX Radeon 7950s, the ones with 2 fans each. After half a year of uninterrupted chugging, 2 of the fans broke, reducing performance. I tried to RMA the cards because at the time I had no idea how to fix the fans myself, and basically XFX and ebuyer_dot_co_dot_uk tried to send me on a wild goose chase between them. On that day I vowed to never buy a fan-cooled computer again. I would’ve gone with water cooling, but I had doubts about it and went with passive instead. And so my first passive rig materialized in 2017. At long last, with a sigh of relief, I proclaimed that XFX and ebuyer can shove their fans where they’d best not fan any air out of, and I lived happily ever after. And I sold the XFX cards to someone more capable of maintaining them.

Did XFX promise any mining potential back then? I legitimately don’t know, but that is on them and them alone if they did.

I’ve been on AMD CPU-wise for the last 20 years myself, even though I was young and dumb enough to fall for the FX series of processors.

Everyone makes mistakes, but pushing ridiculous wattage is on purpose; looking at you, Intel.

Either way, independent reviewers are the best way to keep up with tech, L1 being an awesome, mostly server-centric outlet. There’s a reason GN and even the (ridiculously) overrated LTT all call on Wendell to be their janitor. In GN’s case, it’s out of necessity. In Linus’ case… foot-in-mouth disease? Didn’t properly vet his own crew?

It depends which crypto you were mining.
Bitcoin = sha256(sha256(x))
Litecoin = scrypt(x)
Ethereum = random gibberish interpreted as instructions
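To make the Bitcoin line concrete, here’s a minimal Python sketch of the sha256(sha256(x)) those GPUs were grinding through - illustrative only, not a real miner: an actual miner hashes an 80-byte block header and compares the result against a difficulty target.

```python
import hashlib

def btc_hash(data: bytes) -> bytes:
    """Bitcoin's proof-of-work function: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

# A miner increments a nonce in the block header until the resulting
# hash, read as a little-endian integer, falls below the network target.
digest = btc_hash(b"example block header")
print(digest.hex())  # 32 bytes, i.e. 64 hex characters
```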

For Bitcoin specifically, each XFX Radeon 7950 offered 400MH/s. AMD was also the go-to choice for Bitcoin mining, since OpenCL scaled better than CUDA for computing hashes. I had a friend who wanted to get in on the Bitcoin game and bought some overpriced nVidia Titan or whatever it was called, and he was deeply disappointed with the results. Before ASICs were a thing, people were very enthusiastic about using Bitcoin as a benchmark, and this is how we know that the PS3 could do 25MH/s and the Motorola Atrix could do somewhere around 670KH/s.

No no - was it advertised to do any crypto mining? It doesn’t matter if the community says it can do said thing; what matters is whether it was sold to you by the manufacturer to do said thing.

I don’t personally mine. It’s not something I take into consideration.

Remember that mining was never the original purpose of any of these products; it just so happened that they COULD do it, but it was never promised… except for those nVidia mining cards that they tried to sell a year too late, when no one could get a 30xx card.

Nothing was advertised like that at the time. Sometime around mid-2013, people started coming up with FPGAs and ASICs, which were not just advertised but designed to do specifically that.

Then you were using that hardware outside of what it was meant to be used for - in the case of a GPU, graphics processing. You can’t blame a brand for a use case the product was never intended for.

Yes, FPGAs and ASICs purpose built for mining exist and THOSE promise mining performance because they are made for that. Consumer GPUs were never made for that use case even though they can TECHNICALLY do it, but neither XFX nor AMD promised that.

The reason any purpose-built FPGA or ASIC can mine well is that it is built to handle one specific algorithm; which algorithm that is varies.

Maybe look around this website: AMD Ryzen Fanless Tower PC

and perhaps a look here: https://www.monsterlabo.com/

and finally, here: https://www.fullysilentpcs.com/?v=7516fd43adaa
