Each time the clock ticks, you're charging or discharging a bunch of capacitors. The energy needed to charge a capacitor is:
E = 1/2*C*V^2
Where C is the capacitance and V is the voltage to which it was charged.
If your frequency is f[Hz], then you have f cycles per second, and your power is:
P = f*E = 1/2*C*V^2*f
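To make the formula concrete, here is a quick numeric sketch. The values of C, V and f are assumed round numbers for illustration, not figures from any real chip:

```python
# Worked example of P = 1/2 * C * V^2 * f with assumed round numbers.
C = 1e-9   # total switched capacitance per cycle, 1 nF (assumption)
V = 1.0    # operating voltage, 1 V (assumption)
f = 1e9    # clock frequency, 1 GHz (assumption)

P = 0.5 * C * V**2 * f
print(P)   # 0.5 -> about half a watt of dynamic power
```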
That is why the power goes up linearly with frequency.
You can also see that it goes up quadratically with voltage, which is why you always want to run at the lowest voltage possible. However, if you want to raise the frequency you also have to raise the voltage (more on why below), and to a first approximation the voltage rises linearly with the frequency.
For this reason, the power rises roughly like f^3 (or, equivalently, like V^3).
Now, when you increase the number of cores, you're basically increasing the total switched capacitance C. This increase is independent of the voltage and the frequency, so the power rises only linearly with C. That is why it is more power efficient to increase the number of cores than it is to increase the frequency.
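A rough side-by-side comparison makes the trade-off clear. Again, the baseline numbers are assumed values; only the ratios matter:

```python
# Compare two ways to get ~2x the throughput, using P = 1/2 * C * V^2 * f.
# Baseline values are assumed round numbers, not real chip data.
C = 1e-9; V = 1.0; f = 1e9
base = 0.5 * C * V**2 * f

# Doubling the cores doubles the switched capacitance C: 2x the power.
two_cores = 0.5 * (2 * C) * V**2 * f

# Doubling the frequency, with the voltage rising in proportion: 8x the power.
two_freq = 0.5 * C * (2 * V)**2 * (2 * f)

print(two_cores / base, two_freq / base)  # 2.0 8.0
```

So for roughly the same throughput gain, the extra-core route costs 2x the power while the extra-frequency route costs 8x.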
Why do you need to increase the voltage to increase the frequency? Well, the voltage of a capacitor changes according to:
dV/dt = I/C
where I is the current. So, the higher the current, the faster you can charge the transistor's gate capacitance up to its "on" (threshold) voltage, which is fixed by the process and doesn't depend on the operating voltage, and the faster you can switch the transistor on. The drive current rises roughly linearly with the operating voltage. That's why you need to increase the voltage to increase the frequency.
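The argument above can be sketched numerically. This is a crude constant-current model; the gate capacitance, threshold voltage, and the proportionality constant g between operating voltage and drive current are all assumptions for illustration:

```python
# Crude model: charge the gate through a current proportional to V_op.
# All values are illustrative assumptions, not device data.
C_gate = 1e-15   # gate capacitance, ~1 fF (assumption)
V_th = 0.4       # threshold ("on") voltage, fixed by the process (assumption)
g = 1e-4         # assumed proportionality: I = g * V_op (amps per volt)

def switch_time(V_op):
    """Time to charge the gate to V_th: from dV/dt = I/C, t = C * V_th / I."""
    I = g * V_op                 # drive current rises linearly with V_op
    return C_gate * V_th / I

# Doubling the operating voltage halves the switching time,
# so the achievable clock frequency roughly doubles.
ratio = switch_time(1.0) / switch_time(2.0)
print(ratio)  # 2.0
```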