

How Much Power Does A Computer Use?

  • Please log in to reply
4 replies to this topic

#1 cowsgonemadd3


    Feed me some spyware!

  • Banned
  • 4,557 posts
  • Local time:12:36 AM

Posted 24 October 2005 - 09:25 PM

With the cost of everything going up, I just wondered how much power my PC uses, so maybe I can tell how much it costs per month to run.

I leave it on all the time, along with my laptop.

My computer has a 400-watt power supply. Does that mean it's drawing 400 watts 24/7, or does it vary with how much I use it and what I have running in it?

I'm not maxing out the 400 watts.

Just wondering how much power it uses.
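The arithmetic behind the question is simple: energy in kWh is average watts × hours ÷ 1000, and monthly cost is kWh × your utility's rate. A quick Python sketch (the $0.10/kWh rate is only an example; substitute the rate from your own bill):

```python
def monthly_cost(avg_watts, hours_per_day=24.0, rate_per_kwh=0.10, days=30):
    """Estimate monthly electricity cost for one device.

    avg_watts    -- average draw in watts (NOT the PSU's rated maximum)
    rate_per_kwh -- utility rate in $/kWh (0.10 here is just an example)
    """
    kwh = avg_watts * hours_per_day * days / 1000.0
    return kwh * rate_per_kwh

# A 400 W supply rarely runs at its rated maximum; suppose the PC
# actually averages 200 W around the clock:
print(round(monthly_cost(200), 2))  # 200 W x 720 h = 144 kWh -> 14.4
```

The key point the sketch illustrates: the PSU rating is a ceiling, so what matters for the bill is the average draw, not the 400 W label.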



#2 gunner


  • Members
  • 337 posts
  • Location:Pensacola, Florida
  • Local time:12:36 AM

Posted 24 October 2005 - 10:59 PM

Power consumption of any computer is the sum of the power draw of all individual components, from system memory through the various storage media to the central processing unit, which, after all, still eats the lion's share of the total power. Likewise, it also produces the most heat. Power consumption and heat dissipation are related, but the relation is not necessarily linear: overall, power consumption is always at least as high as thermal dissipation; after all, contrary to popular belief, CPUs were not primarily designed as space heater substitutes.

Processor power consumption varies with load: the more work a processor has to do, the more power it will consume, and by extension the more heat it will produce. More heat, in turn, causes the efficiency of the processor to decrease according to the thermal derating of the part. This also means that under constant load, as the temperature rises, the power consumption of the processor will increase, which in turn increases the thermal dissipation as well. Luckily for everyone involved, this is not an endless death-by-heat spiral; an equilibrium is reached where the slopes of temperature derating and heat dissipation cross.

As a rule of thumb, we are looking at roughly a 30% increase in power consumption over some 125 °C increase in die temperature. For the desktop environment, this amounts to about a 10% increase in static power if the temperature rises by some 40 degrees, for example from 30 °C at the onset of a given load to 70 °C after leveling out.

A Celeron processor starts at about 60 watts; add the wattage of whatever else is attached, add the compensation for heat dissipation, and use the WAG formula to find the total wattage at any given time. My guess is an average of 200 watts, unless you game all the time or overclock.
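That rule of thumb can be written out as a linear scaling. The numbers below are the post's rough estimates, not datasheet values, and the 60 W Celeron figure is just the example used above:

```python
def power_at_temp(base_watts, base_temp_c, temp_c, slope=0.30 / 125.0):
    """Scale static power with die temperature using the rough
    '30% more power per 125 C of die-temperature rise' rule of thumb.
    The default slope is an estimate from the discussion, not a spec."""
    return base_watts * (1.0 + slope * (temp_c - base_temp_c))

# A CPU drawing 60 W at 30 C, leveling out at 70 C under load:
p = power_at_temp(60.0, 30.0, 70.0)
print(round(p, 1))  # 65.8 -- roughly the 10% rise mentioned above
```

As the post notes, this is WAG territory: the linear slope is a convenient fiction that only roughly tracks real leakage behavior.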

Spike's advice: Backup your data routinely.

#3 jgweed


  • Staff Emeritus
  • 28,473 posts
  • Gender:Male
  • Location:Chicago, Il.
  • Local time:11:36 PM

Posted 25 October 2005 - 01:10 AM

How much wattage you use depends on what is happening in your computer and monitor at any particular time. Leaving it on when you are not using it is certainly a waste of energy, however little it draws just sitting there on your desk.


Edited by jgweed, 25 October 2005 - 02:55 AM.

Whereof one cannot speak, thereof one should be silent.

#4 cowsgonemadd3


    Feed me some spyware!

  • Topic Starter

  • Banned
  • 4,557 posts
  • Local time:12:36 AM

Posted 25 October 2005 - 12:23 PM

Well, I leave both computers on all the time. I wonder if I should just shut them off when not in use to save power.

I have an AMD Athlon 64 3200+ CPU and a 21-inch monitor that is set to turn off after 15 minutes, and the PC shuts down the components after an hour or so.

The laptop is set to the same time limits...

I'd really like to know how much money it costs to run these things per month. I'd love to run them off of solar energy, lol...
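For a ballpark on both machines, the 200 W desktop figure from gunner's guess above can be combined with an assumed laptop draw (the 40 W and the $0.10/kWh rate below are placeholders; a watt meter or your actual bill would give real numbers):

```python
# Rough monthly cost for both machines left on 24/7. The 200 W desktop
# figure is gunner's guess above; 40 W for the laptop and $0.10/kWh
# are assumptions -- plug in your own numbers.
RATE = 0.10          # $/kWh, hypothetical
HOURS = 24 * 30      # one month, always on

for name, watts in [("desktop", 200), ("laptop", 40)]:
    kwh = watts * HOURS / 1000
    print(f"{name}: {kwh:.0f} kWh, ${kwh * RATE:.2f}/month")
```

Note that the monitor sleeping after 15 minutes and the components powering down after an hour would cut the real average draw well below these always-on figures.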

#5 boopme


    To Insanity and Beyond

  • Global Moderator
  • 72,934 posts
  • Gender:Male
  • Location:NJ USA
  • Local time:12:36 AM

Posted 25 October 2005 - 12:52 PM

hello CGM3: Perhaps you can figure it from here.....

How do I get help? Who is helping me?For the time will come when men will not put up with sound doctrine. Instead, to suit their own desires, they will gather around them a great number of teachers to say what their itching ears want to hear....Become a BleepingComputer fan: Facebook
