Power consumption of any computer is the sum of the power draw of all its individual components, from system memory through the various storage media to the central processing unit, which, after all, still eats the lion's share of the total power. Likewise, it also produces the most heat. Power consumption and heat dissipation are related, but the relation is not necessarily linear. That is, overall, power consumption is always higher than thermal dissipation; after all, contrary to common belief, CPUs were not primarily designed as space heater substitutes.
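As a rough illustration of the "sum of the parts" idea, here is a minimal Python sketch; every wattage figure in it is a made-up placeholder, not a measurement, and the component names are just labels for the sake of the example:

```python
# Minimal sketch: total system power as the sum of per-component draws.
# All wattage figures below are illustrative assumptions, not measurements.
component_power_w = {
    "cpu": 65.0,          # typically the largest single consumer
    "memory": 6.0,
    "ssd": 3.0,
    "hdd": 8.0,
    "motherboard": 25.0,
    "gpu_idle": 15.0,
}

total_w = sum(component_power_w.values())
cpu_share = component_power_w["cpu"] / total_w

print(f"Total draw: {total_w:.0f} W, CPU share: {cpu_share:.0%}")
```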
Processor power consumption varies with load. That is, the more work a processor has to do, the more power it will consume. By extension, it will produce more heat. More heat, in turn, causes the efficiency of the processor to decrease according to the thermal derating of the unit. By extension, this also means that under constant load, as the temperature rises, the power consumption of the processor will increase, and that will, in turn, increase the thermal dissipation as well. Luckily for anyone involved, this is not an endless death-by-heat spiral: an equilibrium is reached where the slopes of temperature derating and heat dissipation cross. As a rule of thumb, we are looking at roughly a 30% increase in power consumption over a 125 °C increase in die temperature. For the desktop environment, this amounts to some 10% increase in static power if the temperature rises by about 40 degrees, for example from 30 °C at the onset of a given load to roughly 70 °C after leveling out.

A Celeron processor starts at about 60 watts; then you add the wattage for however many other components are attached, add the compensation for heat dissipation, and use the WAG formula to find the total wattage at any given time. My guess is an average of 200 watts, unless you game all the time or overclock.
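To make the back-of-the-envelope numbers above concrete, here is a hedged Python sketch. The linear derating slope (30% over 125 °C), the 60 W Celeron-class baseline, and the 130 W lumped together for everything else are assumptions taken from the rule-of-thumb figures in the text, not measured values:

```python
# Hedged sketch of the rule-of-thumb figures above. The linear slope and
# the per-component wattages are illustrative assumptions, not measurements.

DERATING_SLOPE = 0.30 / 125.0   # ~30% extra static power over a 125 °C die-temp rise

def static_power_w(base_w: float, temp_rise_c: float) -> float:
    """Static power after the die has warmed by temp_rise_c degrees Celsius."""
    return base_w * (1.0 + DERATING_SLOPE * temp_rise_c)

cpu_base_w = 60.0                               # Celeron-class CPU under load (assumed)
cpu_hot_w = static_power_w(cpu_base_w, 40.0)    # ~10% higher once the die has warmed ~40 °C

other_components_w = 130.0                      # memory, storage, board, fans, ... (wild guess)
total_w = cpu_hot_w + other_components_w

print(f"CPU after warm-up: {cpu_hot_w:.1f} W")      # ~65.8 W
print(f"Ballpark system total: {total_w:.0f} W")    # ~196 W, i.e. the ~200 W guess above
```

Running it reproduces the ballpark figures: the 60 W baseline grows to roughly 66 W after the die warms up, and adding the assumed 130 W for everything else lands near the 200 W average guessed above.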
Spike's advice: Back up your data routinely.