Since increasing the frequency in a simple circuit with only resistors does not increase the power consumption, I have been confused about why it does so in CPUs. My latest theory is that as the frequency increases, the capacitors present less impedance, which lets more current through, and since P = UI the power increases.
Also, how many amps run through a processor? From what I can see, the Ivy Bridge i5 and i7 can draw 77 W, which at 1.5 V means about 51 A! Is this really possible at stock configuration, or is the 77 W the limit for overclocking?
Every transistor in the CPU has some small capacitance, so switching between 0 and 1 takes energy to charge and discharge it. Higher frequency = more transitions per second = more power. Roughly, dynamic power is P ≈ C·V²·f, where C is the switched capacitance, V the core voltage, and f the clock frequency.
The maximum current for a 77 W TDP CPU is specified at 112 A.
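To see how the numbers in the question hang together, here is a minimal sketch of the P ≈ C·V²·f relation. The effective switched capacitance below is a made-up illustrative value (chosen so the result lands near the 77 W figure), not an actual Ivy Bridge spec:

```python
def dynamic_power(c_switched_farads, voltage, frequency_hz):
    """Dynamic (switching) power of CMOS logic: P = C * V^2 * f."""
    return c_switched_farads * voltage**2 * frequency_hz

# Hypothetical numbers for illustration only:
c = 10e-9   # ~10 nF effective switched capacitance (assumed)
v = 1.5     # core voltage in volts
f = 3.4e9   # 3.4 GHz clock

p = dynamic_power(c, v, f)  # watts
i = p / v                   # amps, from P = U * I
print(f"P = {p:.1f} W, I = {i:.1f} A")  # P = 76.5 W, I = 51.0 A
```

Note that power scales linearly with f but with the square of V, which is why voltage tweaks (as mentioned below) heat things up much faster than frequency bumps alone.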
What nowakpl said. If you don't increase the voltage and keep it at stock, you might not even notice the temperature increase on an overclock. It's when you start tweaking the voltage to get more out of the CPU that the power really climbs.