Average energy wasted as heat by PCs in 2023?

I heard someone say that nearly 100% of the wattage that goes into home electronics, including PCs, ends up being wasted as heat, which doesn’t sound quite right. Does anyone know a reliable rough figure?

Nearly 100% is an accurate figure.

And most of the delta from 100% is mechanical energy used in fans, hard drives, and speakers, plus the light emitted by the display. A water pump also falls in that category.

The power used for the actual compute and electronics is pretty much all waste heat.
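
To put rough numbers on that “delta from 100%”, here’s a back-of-envelope sketch in Python; every figure in it is an assumption for illustration, not a measurement:

```python
# Back-of-envelope: how much of a PC's input power leaves in any form
# other than heat. All numbers are assumed for illustration.

input_power = 400.0      # W drawn at the wall (assumed)

# Power that temporarily takes a non-heat form:
display_light = 5.0      # W of visible light actually emitted by the monitor
speaker_sound = 0.5      # W of acoustic power at normal listening volume
fan_airflow = 2.0        # W of kinetic energy given to the moving air

non_heat = display_light + speaker_sound + fan_airflow
heat = input_power - non_heat
print(f"Dissipated directly as heat: {heat:.1f} W ({100 * heat / input_power:.1f}%)")
# Even the light, sound, and airflow get absorbed by the room and end up
# as heat too, so the long-run answer really is ~100%.
```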

9 Likes

Where else would all the watts of electricity go? Law of equivalent exchange. This is middle school stuff.

1 Like

That’s why I love physics. You never lose anything. Everything comes back in a different form.

Radiation in the form of heat certainly isn’t a desirable outcome for most things we use electricity for. But if you can make use of it, you can also benefit from it. Now that it’s October, if my PC is running I don’t really have to heat the room by other means, as long as my GPU and CPU are under some load.

But if I have to run an AC to remove that heat in the summer, that’s waste disposal, with the AC’s own waste heat added on top and all of it going right out of the window.

But if you can somehow make a heating plant for your town using the local datacenters’ waste heat, that’s a smart thing to do. Mankind has certainly managed to use “waste” for productive purposes in the past.
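
To put a rough number on the summer case: an air conditioner moves heat with a coefficient of performance (COP) typically around 3, so the extra electricity it burns is smaller than the heat it removes, but all of it still has to go out the window. A quick sketch with assumed values:

```python
# Sketch of the summer scenario: removing the PC's heat with an AC.
# Both numbers are assumptions for illustration.

pc_heat = 400.0  # W of heat the PC dumps into the room
cop = 3.0        # assumed AC coefficient of performance (heat moved per watt used)

ac_power = pc_heat / cop             # extra electricity the AC draws
total_rejected = pc_heat + ac_power  # everything ends up outside the window

print(f"The AC draws an extra {ac_power:.0f} W to move {pc_heat:.0f} W of PC heat,")
print(f"so about {total_rejected:.0f} W in total goes out the window.")
```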

4 Likes

My middle school must have opted out of the alchemy courses.

I kid, I kid. Conservation of energy and all that.

1 Like

Not sure what you mean.
The more efficient the conduction, the less resistance/friction, the more goes into computation and not heat waste.

Computation is basically just pushing electrons around. They experience friction and, bing, there goes your heat.

The less friction there is, the less heat is generated. See superconductors.

It’s not a bad thing. All the energy the computer takes in, it uses, and in doing so, generates heat. The harder it works, the more it heats up. Hence heatsinks.

Phones also heat up, and also have heatsinks/plates/glue and such, but use less energy, and so don’t feel as hot, but they do still heat up while working away.

A benefit of lower-power chips is that they also generate less heat, a bit of positive reinforcement that lets them run cooler with quieter fans.

True. Problem is that we don’t currently have room-temperature superconducting materials. Although I’m no materials scientist, I don’t think we would have computers with zero energy wasted to heat even if we got to the point of superconductivity at room temperature.

You are right: room-temperature superconductors wouldn’t appreciably change processor efficiency even if they could be scaled down in size (which they cannot) and were epitaxially compatible. The majority of the heat within a processor is created by switching transients and dumping gate charge, two things a superconductor would have no bearing on.

edit: Also, we’ve pretty much achieved room-temperature superconductors already, within 18°F. We’ve got hydride systems with a transition temperature of 518 Rankine; obviously not a practical material, like every superconductor discovered since the early 1960s (sorry, REBCOs).
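
To put the switching-loss point in context: CMOS dynamic power scales roughly as P ≈ α·C·V²·f (activity factor, switched capacitance, voltage squared, clock frequency), which is exactly the term a superconducting interconnect wouldn’t touch. A quick sketch with assumed, illustrative values:

```python
# Rough CMOS dynamic-power estimate: P ~ alpha * C * V^2 * f.
# Every value here is an assumption chosen only to give plausible magnitudes.

alpha = 0.1      # activity factor: fraction of the chip switching each cycle
c_total = 2e-7   # F, effective switched capacitance of the whole die
v = 1.1          # V, core voltage
f = 4.0e9        # Hz, clock frequency

p_dynamic = alpha * c_total * v**2 * f
print(f"Dynamic (switching) power: {p_dynamic:.0f} W")
# Lowering V helps quadratically and lowering f linearly; zero-resistance
# wires wouldn't change this, since the energy goes into charging and
# discharging gate capacitance every cycle and is then dumped as heat.
```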

All you had to do was watch a few episodes of Full Metal Alchemist to remediate that. :grin:

2 Likes

It may be wasted heat in summer, but in winter it can replace the central heating.

1 Like

I don’t like the term “waste heat” because it makes me think of the inefficiencies in something like a PSU as opposed to a mechanical fan or processor using that energy to do work and then producing heat as a byproduct of the work being done.

Electron energy gets converted to heat or is used to reverse entropy. It takes work to align magnetic domains on a hard drive platter or store charge in flash.

But the energy used for neg-entropy is so small. The rest is all heat.

I agree with what others here have stated, but to expand on it, there’s efficiency to think about - for X amount of Watts, how much computation can be done in what amount of time?

Modern CPUs may be able to compute something faster, but at the expense of more watts, which get dumped as heat as the work for that computation is done.

A more efficient computer may be able to do the same calculation with less energy. Or maybe it uses the same energy as another, but can do it in less time. An ultra-efficient computer may only be able to do a limited calculation, but that specificity in design allows it to be way more efficient than using general purpose circuitry.

Temperature also has a role to play here - specifically, computers can be more efficient (faster) at lower temperatures (so, super high clock rates with liquid nitrogen cooling) – but that comes at a massive cost elsewhere (you have to use energy to create the liquid nitrogen) as well as the practical considerations (you have to somehow manage a computer running at sub-ambient temperature and deal with condensation, managing the liquid nitrogen, etc. etc.).

Fun stuff to dig into - it all ends up boiling down to thermodynamics and physics.

It’s a bit beyond me, but some interesting things to read about are: Entropy in thermodynamics and information theory - Wikipedia and Landauer's principle - Wikipedia
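
For a sense of scale on Landauer’s principle versus what real hardware spends: the Landauer bound is k·T·ln 2 per bit erased, and here’s a quick comparison where the CPU figures are assumed round numbers for illustration only:

```python
import math

# Landauer limit: minimum energy to erase one bit at temperature T.
k_B = 1.380649e-23   # J/K, Boltzmann constant
T = 300.0            # K, roughly room temperature
e_landauer = k_B * T * math.log(2)

# Very rough figure for a real CPU (assumed values, for scale only):
cpu_power = 65.0       # W
ops_per_second = 1e11  # simple operations per second, order of magnitude
e_per_op = cpu_power / ops_per_second

print(f"Landauer limit:   {e_landauer:.2e} J per bit erased")
print(f"Assumed real CPU: {e_per_op:.2e} J per operation")
print(f"Roughly {e_per_op / e_landauer:.0e} times above the theoretical floor")
```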

2 Likes

The best summary is that practically speaking computers take in energy and neither store a reasonable amount of energy nor perform any work, so all the energy they use goes out as waste heat.

When you dive into the weeds it starts going into fundamental questions about reality to figure out the real numbers. Just like @shadowimmage mentioned, there are papers debating and testing what the true cost of flipping a bit is.

1 Like

right, but if there was no friction, there would be no vdroop, and therefore no loss of power, and therefore no power consumed…

The power consumed by computation becomes heat, because if it didn’t, it wouldn’t be consumed. It’s not that the energy is split between heat and computation; rather, computation requires an amount of power loss that must go somewhere, so it distributes kinetic energy into the surroundings, where it has nowhere else to go, becoming high-frequency vibration (or low-frequency, if you compare it to light), or in other words heat.

All that energy is, in essence, motion, and must be conserved in motion. Even spinning fans produce heat by friction in the air, which is why they slow down when not receiving power. Any power dumped into running the fan eventually must become some other kind of energy, with heat being the result of energy without any kind of focus.
So, in other words, out of anything that consumes power, you get heat, sound, EMI, or some other energy emission, all of which will either travel infinitely into the void or eventually break down into heat.

5 Likes

Down in my basement nothing is wasted. My over-clocked PCs help keep an otherwise chilly environment nice and toasty. As a result my furnace kicks in less often. Then again, I live in a land where winters are cold and the weather keeps you real.

2 Likes

Are we now heating up the void…do I need to plant more trees?

1 Like