
How much electricity does a PC burn per month?


hierogrammate

Technical User
Oct 12, 2003
I don't know if this is way off topic, so my apologies if it is...

Do any of you know how much electricity a PC consumes per month?

Edwin
 
Modern computers use a lot of juice, and the monitor is one of the biggest consumers. TFTs are the worst for power usage, but CRTs aren't great either, so if you want to conserve electricity, turn off your monitor when you are not using it - it should make a difference.

As an example, if we leave the TFT monitor on in our server room, the UPS lasts about 10 minutes rather than an hour... it's that bad.
 
Sheesh, and I thought TFTs sucked up less juice than the CRTs... mine's a CRT, btw, and one of the reasons I was considering getting a flat panel display, apart from being lightweight and the coolness factor, was that I thought they consumed less energy.

Thanks for the info :)
 
Oops, time to admit a mistake: I was wrong about TFTs using more energy, very sorry.

Taken from Microsoft's web site:

"You'll make up some of the difference on your power bill. Flat panels only use 30-40 watts of electricity. CRTs run about 110 watts. Over a year's time, those savings will make a small dent in your utility payments. That also means flat panels run cooler, making them more comfortable to use."

Sorry :)
 
Depends on what you have and how hard you work it. Peak power delivered by the power supply, increased by 10% for conversion inefficiency, plus the monitor, gives the total power consumed in the worst case. Multiply that by the number of hours it runs for the month, divide by 1000 to get the kWh consumed for the month, then multiply by the cost per kWh from the bill, including any extra charges for peak usage, and you will have the PC's direct cost.
The indirect costs may be more. The power consumed generates heat, and if you have air-conditioning, that heat has to be pumped back outside, which takes additional power on top of what the PC itself draws.
Are you sure that the power cost is the real issue?
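
A rough sketch of that arithmetic in Python (the wattage figures and the rate in the example call are illustrative assumptions, not measurements):

# Minimal sketch of the formula above; every number in the example call is
# an illustrative assumption, not a measured value.
def monthly_cost(psu_peak_watts, monitor_watts, hours_per_month, price_per_kwh):
    """Worst-case direct cost of running the PC for one month."""
    total_watts = psu_peak_watts * 1.10 + monitor_watts   # +10% for conversion inefficiency
    kwh = total_watts * hours_per_month / 1000.0          # watt-hours -> kilowatt-hours
    return kwh * price_per_kwh

# Example: 300 W peak PSU load, 110 W CRT, 8 hours a day, $0.10/kWh
print(round(monthly_cost(300, 110, 8 * 30, 0.10), 2))     # about $10.56 for the month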

Ed Fair
Give the wrong symptoms, get the wrong solutions.
 
2 cents...
Ed covered the costs well; I might add the peak-usage demand factor penalty (usually on commercial rates), which runs about $25.00 per kW of peak demand per month in NY. The demand factor is calculated in New York as the highest peak load in any 30-minute period of a billing period. The demand factor cost can equal or exceed the actual wattage cost of the devices, so businesses should worry about the demand factor as much as the wattage cost. Demand penalty charges vary considerably.

An air conditioning example...
I was a building engineer for 8 years. The A/C wattage cost was $1,300.00/month and the demand factor penalty was $2,000.00, for a total bill of $3,300.00/month. If the A/C was turned on for only 1 day in a billing period, say May 31st, the demand penalty charge was still $2,000.00, even though the wattage cost would have been only $43.00 for that 1 day.
Billing periods do not go by the number of days in the month; they overlap other months slightly.
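
A small sketch of how a demand charge like that might be computed (the $25/kW rate and the load profile below are illustrative assumptions; real tariffs differ):

# Demand charge = highest average load over any 30-minute window in the
# billing period, times a per-kW rate. All figures here are assumptions.
def demand_charge(half_hour_kw_readings, rate_per_kw=25.0):
    """half_hour_kw_readings: average kW drawn in each 30-minute interval."""
    peak_kw = max(half_hour_kw_readings)
    return peak_kw * rate_per_kw

# A/C off almost all month, but run hard for a single half-hour window:
readings = [2.0] * 1000 + [80.0]        # kW per 30-minute interval
print(demand_charge(readings))          # 80 kW x $25/kW = 2000.0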
 
You have to look for ghost loads like a power strip, a transformer on a printer, networking gear, cable modem, router, scanner, etc.

If you do not like my post feel free to point out your opinion or my errors.
 
Typical example of a modern PC:

400W PSU and a 120W 17"/19" monitor:

400W + 120W = 520W = 0.52 kW. This means you use roughly one "unit" of energy (one kWh) every 2 hours.

A typical rate of electricity in London is ~7p per kWh.

So if this model PC is left on for 6 hours a day for a month, that's 180 hours, which is roughly 90 kWh, which in turn is about £6.30 per month (£75.60 per year).
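
The same figures worked through in a quick Python sketch (the numbers are the ones from this post; as later replies point out, the 400W PSU figure is a nameplate maximum rather than the actual draw):

# Worked version of the London example above, using the post's nominal numbers.
load_kw = (400 + 120) / 1000.0       # PSU rating + monitor, in kW
hours = 6 * 30                       # 6 hours/day for a 30-day month
pence_per_kwh = 7

kwh = load_kw * hours                # 93.6 kWh (the post rounds this to ~90)
print(round(kwh, 1), "kWh ->", round(kwh * pence_per_kwh / 100.0, 2), "pounds/month")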
 
If this is the case, then why is a monitor considered to use more juice than the PC? Most PCs have at least a 250-300W power supply; the 120W from the monitor is only half of that. What gives? Also, if you have a 300W power supply, is that power supply always drawing 300 watts? Devices can use more power when they are strained. A VCR uses more power when it is playing a tape than when it is off, just displaying the clock. Does anyone have any insight into this?
 
The monitor's rating is closer to a true draw. The PC power supply rating is its top capability, not necessarily the amount used. The only way to really know what they are pulling is to measure it with an ammeter.

Ed Fair
Give the wrong symptoms, get the wrong solutions.
 
I also forgot speakers - they can be anything from 5W to 120W, so they can add quite a bit to the cost as well...
 
Here is my recommendation:
Make an itemized list of everything your computer has in it, including processor, sound card, graphics card, hard drive(s), and all your peripherals: external drives, speakers, mouse, monitor, keyboard, printer. Then do some serious internet research on each item to see how much power it uses at peak, i.e. when it is working the hardest. Next multiply that total by the amount of time you use the computer in a month. That number will be a little high if you turn your computer off when you aren't using it, and probably a bit low if you leave it on all the time.

Present this number of kilowatt-hours to your father, and he can then tell you to use your computer less. It is likely that something else is running in the house that sucks up more power, and your dad just wants to put the blame elsewhere. Just make sure you turn off your computer when you're not using it, and leave your printer off unless you are printing.
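
A minimal sketch of that tallying exercise (the component names and wattages below are made-up placeholders; substitute the peak figures you find for your own hardware):

# Sum up researched peak wattages per component, then convert to kWh/month.
# Every wattage below is a placeholder assumption, not a looked-up spec.
peak_watts = {
    "CPU": 70,
    "graphics card": 40,
    "motherboard + RAM": 30,
    "hard drive": 10,
    "CRT monitor": 110,
    "speakers": 15,
    "printer (idle)": 5,
}

hours_per_month = 6 * 30                       # adjust for your own usage
total_kw = sum(peak_watts.values()) / 1000.0
print(round(total_kw * hours_per_month, 1), "kWh per month (worst case)")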
 
And if you don't want to turn the PC off because you can't be bothered to boot up again, just turn off the speakers/monitor/printer/scanner etc., because this will likely reduce the power used by 40-60%.
 
I have a new "typical" computer sitting here on my bench.
P4 2.2GHz, 512MB RAM, 160GB Samsung HD, 17in CTX flat panel monitor with an APC UPS attached.

The UPS has a monitoring program that shows the current power load. With the PC sitting here idle and the monitor on, the load is 105 watts. With the monitor off, the load drops to about 80 watts.

So it is about the same as a 100-watt light bulb. Electric rates vary, but if we assume a relatively high rate of 10 cents per kilowatt-hour we get:

0.1 kilowatts x 24 hours x $0.10 per kilowatt-hour = $0.24 per day.

365 days x $0.24 per day = about $88 per year.

Of course in the winter the heat given off by the PC "helps" heat the room. In the summer it works against the air conditioning. Both these will have some effect on your total energy bill.

To summarize:

PC = 100 watt light bulb!
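
A quick sketch comparing those measured loads under different habits (the 105 W and 80 W figures come from the measurements above; the usage patterns and the $0.10/kWh rate are assumptions):

# Annual cost from a measured load, at an assumed $0.10/kWh.
def yearly_cost(watts, hours_per_day, dollars_per_kwh=0.10):
    return watts / 1000.0 * hours_per_day * 365 * dollars_per_kwh

print(round(yearly_cost(105, 24), 2))   # on 24/7, monitor on:  ~$91.98/yr
print(round(yearly_cost(80, 24), 2))    # on 24/7, monitor off: ~$70.08/yr
print(round(yearly_cost(105, 8), 2))    # on only 8 h/day:      ~$30.66/yr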
 
Holy crap..... let's have a little clarification here.

First of all, you can't compare speaker wattage and CRT wattage.... that's totally apples and oranges.

Second, if you have a 450 watt power supply, that figure is its rated output to the components, not what it draws from the 110 volt outlet. The easiest way is to look at your power supply and see how many AMPS it is rated to draw. That will be a MAXIMUM before it pops the fuse.

So.... assuming that your power supply says it takes 2 amps.... take amps times the voltage, and you have 240 watts. This is the MAXIMUM... you'll never hit that. According to some sites I've read, a computer and CRT together run at about 150 watts.

Your monitor runs at 120 volts; the wattage on the tag on the back should be about right.

Your speakers, however, are a TOTALLY different thing. If you have 200 watt speakers, you're talking about SOUND output, not POWER draw. Plus, if you've got one of those 200 watt computer speaker systems and it's running on some sort of wall wart, then you're being slightly misled: they're talking about a theoretical PEAK power, not RMS. RMS means sustained output power; peak is a one-time burst (like a kick drum). Computer speakers also pretty much only use power when they're being used... otherwise the light trickle of power for them to sit there with no signal is negligible.

The bottom line is, it costs about as much to leave your computer on as to leave a 150 watt light bulb on.

And this is reduced if your monitor is "Green", meaning it goes into standby mode.



Just my $0.02

"In order to start solving a problem, one must first identify its owner." --Me
--Greg
 
There are also issues of efficiency.

For example, the PSU may be capable of supplying 400W to the mo-bo, drives and other peripherals, but it is not 100% efficient. So the power input will be in excess of what is delivered.

While some top class transformers can be up to 98% efficient, 90% is a more typical figure - just feel the waste heat from the vent.

So add on a nominal 10% to cover inefficiencies - equivalent to an energy tax!

A wattmeter can be used to give real time power consumption.
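
A one-line version of that adjustment (the 90% efficiency is an assumed figure; dividing by the efficiency gives the wall-side draw, slightly more than a flat 10% markup):

# Wall-side draw for a given delivered (DC) load, at an assumed efficiency.
def input_watts(delivered_watts, efficiency=0.90):
    return delivered_watts / efficiency

print(round(input_watts(400), 1))   # 400 W delivered -> ~444.4 W drawn from the wall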


Iechyd da! John
Glannau Mersi, Lloegr.
 
Well, I think the monitor is the largest contributor to the electric bill as far as the computer is concerned. Over time, using an LCD monitor may save enough to justify the extra cost. Shutting the computer down when not in use may save you some money; the same goes for the printer, which only needs to be on when in use, the power to networking gear, etc.

Keep in mind that a computer with a 300W power supply does not use as much as a water heater, a refrigerator, an electric stove, an electric heater, or a furnace/air conditioner. A TV uses electricity about like a monitor, so how many TVs do you have?

If you want to talk intelligently, do a survey of all the devices hooked up to the electricity, from the garage door opener to the electric alarm clocks. You may want to visit this site I sometimes look at for solar power:


They may have some links for estimating the power consumption of normal household devices.

If you do not like my post feel free to point out your opinion or my errors.
 

You can also buy a handy dandy device (I cannot remember right now what it is called) that plugs into the wall outlet, with the electronic device you wish to monitor plugged into it. The device will monitor the wattage and voltage, so from there you can measure the amount of electricity you are using. You can use it on anything, and you can get it from Lowes or Home Depot.

Don't forget, you can turn on your PC's power management, which will put your PC to "sleep" when it is not in use.
 