Why does overclocking increase temperatures?

Since increasing the frequency in a simple circuit with only resistors does not increase the power consumption, I have been confused about why it does in CPUs. My latest theory is that when the frequency increases, the capacitors present less impedance, which lets more current through, and since P = UI, the power increases.

Am I right?


Also, how many amps run through a processor? From what I can see, the Ivy Bridge i5 and i7 can draw 77 W, which at 1.5 V means over 50 A! Is this really possible at stock configuration, or is the 77 W the limit for overclocking?

Every transistor in the CPU has some small capacitance, so switching between 0 and 1 takes energy. Higher frequency = more transitions per second = more power.
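To put that in code form, here is a rough sketch assuming the usual CMOS dynamic-power model P ≈ α·C·V²·f. The α and C values are made up just so the stock case lands near the 77 W figure from the question; they are not real Ivy Bridge parameters.

```python
def dynamic_power(alpha, c_eff, volts, freq_hz):
    """CMOS dynamic (switching) power: P = alpha * C * V^2 * f, in watts."""
    return alpha * c_eff * volts ** 2 * freq_hz

# Illustrative constants only, chosen so the 3.4 GHz case comes out near 77 W.
ALPHA = 0.1      # fraction of the chip's capacitance that switches each cycle
C_EFF = 2.05e-7  # effective switched capacitance, in farads

for freq_ghz in (3.4, 3.8, 4.2):
    p = dynamic_power(ALPHA, C_EFF, 1.05, freq_ghz * 1e9)
    print(f"{freq_ghz:.1f} GHz at 1.05 V -> {p:5.1f} W")
```

At a fixed voltage the power (and therefore the heat) grows roughly linearly with the clock, which is the "more transitions = more power" point above.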

The maximum current for a 77 W TDP CPU is specified at 112 A.

Makes sense. Thanks.

And, realistically, your processor starts at around 1.1-1.2 V, so the current draw is higher...
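For the current side it is just I = P / V; a quick check with the 77 W figure and a few plausible core voltages (the voltages below are examples, not measured values):

```python
def current_draw(power_w, volts):
    """Current from power and core voltage: I = P / V, in amps."""
    return power_w / volts

for v in (1.5, 1.2, 1.1):
    print(f"77 W at {v:.2f} V -> {current_draw(77, v):5.1f} A")
```

The lower the core voltage, the more amps it takes to deliver the same power, which is why the specified 112 A ceiling sits well above the 51 A you get from 77 W / 1.5 V.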

What nowakpl said. If you don't increase the voltage and keep it at stock, you might not even notice the temperature increase from an OC. It's when you start tweaking the voltage to get more out of the CPU that the temperatures really climb.
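Putting rough numbers on that with the same illustrative model as above (only the ratios matter here): raising the clock alone scales power linearly, but a voltage bump multiplies it by V², and that is the part you notice in the temperatures.

```python
# Same illustrative model as above: P = alpha * C * V^2 * f.
ALPHA, C_EFF = 0.1, 2.05e-7

def power_w(volts, freq_ghz):
    return ALPHA * C_EFF * volts ** 2 * freq_ghz * 1e9

stock   = power_w(1.05, 3.4)  # stock voltage and clock
freq_oc = power_w(1.05, 4.2)  # ~24% higher clock, stock voltage
volt_oc = power_w(1.25, 4.2)  # same clock bump plus a voltage increase

print(f"stock:          {stock:5.1f} W")
print(f"freq-only OC:   {freq_oc:5.1f} W  ({freq_oc / stock:.2f}x)")
print(f"freq + voltage: {volt_oc:5.1f} W  ({volt_oc / stock:.2f}x)")
```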

hmm >.< 

I feel like the next big thing in system cooling will probably use liquid ice in a practical closed-loop device.

Then we can all up the voltage to like 10 V.


Nope, thermoelectric. ztrain already started on it.

Basically, once you hit a certain voltage, it will not work no matter how cool it is (the aforementioned thermoelectric effect).

... then how do people do liquid nitrogen cooling?