Well, I can’t measure my stuff at the wall, but at idle (well, as I type this - so at the desktop doing whatever background shit Windows 11 + Steam + GOG + Origin + EPIC + etc. does) the power draw in Radeon settings (shows both CPU and GPU) for my system (5900X + 6900XT) is about 14-20 watts for the CPU and 10-15 watts for the GPU with out-of-the-box settings.
Let’s call it, say… 50 watts when mostly idle (pessimistically)?
1 kWh / 50 W = 20 hours to draw 1 kWh.
So you’re looking at roughly 1-1.2 kWh per day idle for the machine - assuming your numbers are similar to mine… mine should be a little worse, most likely.
…or at 31.4 cents per kWh, about 31.4 cents a day = $9.42 every 30 days in idle power for the machine you already have. Assuming you don’t let it go to sleep.
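Back-of-envelope in Python if you want to plug your own numbers in (treating that 31.4c as a per-kWh rate, which is an assumption on my part):

```python
# Idle cost estimate using the numbers above; 31.4 c/kWh assumed as the electricity rate
idle_watts = 50          # pessimistic near-idle draw (CPU + GPU)
rate_per_kwh = 0.314     # $ per kWh

hours_per_kwh = 1000 / idle_watts      # 20 hours to draw 1 kWh
kwh_per_day = 24 / hours_per_kwh       # 1.2 kWh/day if it never sleeps
monthly = kwh_per_day * rate_per_kwh * 30
print(f"{hours_per_kwh:.0f} h/kWh, {kwh_per_day:.1f} kWh/day, ${monthly:.2f} per 30 days")
# -> 20 h/kWh, 1.2 kWh/day, $11.30 per 30 days
# (rounding down to ~1 kWh/day gives the $9.42 figure above)
```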
Let’s assume you’re working the CPU flat out at 140 watts.
That’s ~7 hours per kilowatt hour.
So you can run your lab CPU flat out for 7 hours for 31.4c, without turning off PBO, power-limiting it, etc. And dropping its power limit or max clock would save more power than it costs you in processing throughput, given the way power consumption scales with clock speed.
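Same sum for the loaded case (again assuming the 31.4 c/kWh rate):

```python
# Hours of sustained full CPU load per kWh at 140 W package power
load_watts = 140
rate_per_kwh = 0.314                   # assumed rate, $ per kWh

hours_per_kwh = 1000 / load_watts      # ~7.1 hours of flat-out compiling per kWh
print(f"{hours_per_kwh:.1f} h of full load per {rate_per_kwh * 100:.1f} cents")
# -> 7.1 h of full load per 31.4 cents
```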
Ryzen 5k is one of the most efficient platforms you can get for compute per watt. You can bet that a cloud provider will charge you cost plus margin on power for the compute you consume. Yeah, an M1 is good, but you won’t pay off the machine with electricity cost savings in a hurry.
Unless you need the machine to be up 24/7 you could just have it sleep when idle?
Now those numbers above are CPU load only… but that’s going to be the vast majority of your compilation power consumption.
I just don’t think you’ll save money in terms of electricity by running stuff in the cloud. Just turn the machine off or sleep it when idle.
Cloud will however help if you need to run something permanently and have it accessible 24/7.
edit:
I think Hardware Unboxed and GN do efficiency ratings for CPUs.