How were the three voltages provided by a PSU -- 3.3, 5, and 12 volts -- decided upon, and why these three in particular? It seems that 12 volts is used by almost everything, while 3.3v is used by almost nothing.
It's a hilarious and interesting story, and it has evolved over time.
And actually there are some more voltages provided too: -5V and -12V -- these are, of course, relative to ground.
It goes back to the IBM PC/XT -- 5v and 12v were needed because TTL logic of the era ran from 0v to 5v (0v for logic low, 5v for logic high -- though a lot of TTL logic was implemented as active low, which means things happen when the input hits 0v rather than 5v). This dates from the '70s, when Motorola and Intel, and later finally IBM, were doing stuff with these fancy-pants new integrated circuits, and 5v was decided to be a reasonable number.
However, 5v was not enough to run motors and such. In the original PC/XT design, one fan in the power supply provided ample cooling, but the floppy drives and later hard drives had motors for mechanically moving things, and the electronics industry of the day had 12v motors.
Okay, so why -5v and -12v? Well, it was mostly down to future expansion and communication. -12v was largely there to support the RS-232 serial ports, which used positive and negative voltages to encode data, instead of the TTL standard of 0 to 5v.
I believe -5v was made available on the expansion bus for peripherals -- I think some expansion cards could use the negative voltage for signaling, but -12v was too much in those cases.
Fast forward to relatively recent vintage: Pentium, Pentium II, etc. You sometimes see these with the old 20-pin ATX power supplies. Laptops and portable electronics drove the logic-level standard down from 5v TTL to 3.3v, which made less heat and used less power. And CPU power ran off that 5v rail.
Well, lots of watts/amps at 5v is hard to push through small-gauge wire, so they added a lot of wires. And here's the trap you fall into sometimes -- you would see someone with a 400-500 watt power supply, but most of the amperage was on the 5v rail (because it was an older power supply) and not the 12v rail. Oops! Doesn't work!
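To make the trap concrete, here's a quick sketch of the rail math. The rail amperage figures are hypothetical, just for illustration -- the point is that the wattage printed on the label is the sum across rails, not what any one rail can deliver.

```python
def rail_watts(volts, amps):
    """Power available on one rail: P = V * I."""
    return volts * amps

# Hypothetical older "supposedly 400 W" supply, heavy on the 5v rail:
w_5v = rail_watts(5, 40)    # 200 W on the 5v rail
w_12v = rail_watts(12, 15)  # only 180 W on the 12v rail

# A modern build draws nearly everything from 12v, so that 180 W
# is the real ceiling, regardless of the number on the sticker.
print(w_5v, w_12v)  # 200 180
```

So two supplies with the same label wattage can behave very differently depending on how the amperage is split across rails.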
Modern CPUs use 12v. But wait, you say, don't the CPUs themselves run at only a couple of volts, with X99 platforms being about 1 volt or less except when extremely stressed? Yes, I would say, this is good thinking and exactly right. But delivering 1 volt at the several hundred (potentially) amps required would need wire of a gauge that would make your computer look absolutely comical. Instead, there is multi-phase conversion circuitry to take 12v at 20 amps or whatever and make it 200 amps at 1.2v instead. This is done on the PCB of the motherboard by solid-state components, and they do it very efficiently.
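The conversion arithmetic is just conservation of power: what comes out of the VRM (minus losses) equals what goes in, so dropping the voltage raises the current. A minimal sketch, with an illustrative efficiency figure:

```python
def output_amps(v_in, i_in, v_out, efficiency=1.0):
    """Buck conversion trades voltage for current:
    V_in * I_in * efficiency = V_out * I_out."""
    return v_in * i_in * efficiency / v_out

# 12v at 20 A ideally becomes 200 A at 1.2v:
print(output_amps(12, 20, 1.2))  # 200.0

# Real VRMs lose a little; at, say, 92% efficiency:
print(output_amps(12, 20, 1.2, efficiency=0.92))  # 184.0
```

Spreading that current across many phases is also why the wiring problem disappears: each phase and each inch of copper trace only carries a fraction of the total.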
In fact, we've also gotten good at using 'voltage pumps' and the like to synthesize negative voltages, lower voltages, etc. from a common input. So a lot of OEM computers now (Dell, HP, etc.) -- especially at the lower end -- completely eschew the ATX standard and have 4-pin, 8-pin, or dual 8-pin 12v connectors ONLY, synthesizing whatever other voltages they need from that one 12v input. Laptops have driven a lot of this type of miniaturization.
yum yum good read
Who currently maintains the ATX power supply standard? Would there be any benefit to the enthusiast PC market if the ATX standard were updated to eschew 3.3 and 5 volts and become a pure 12-volt standard? It would at least shrink the size of the main motherboard connector. Would it reduce the size and complexity of the PSU itself?
Also thanks for a great answer to my initial question!
Thank you Wendell for filling my daily nerd intake. You Rock!
Good question, Great answer.
Thanks for that.