As an avid computer user I have multiple computers, and I usually leave at least one on around the clock. Today I heard somebody claim that leaving a computer on 24/7 costs $600 for an entire year, or $50 each month, just for keeping it running constantly. I tried to argue against this claim, but they stuck to their guns and insisted they knew it to be true. So I decided to calculate the power usage and cost myself.
Of course all computers use different amounts of power, and the price of electricity varies depending on where you live and your local electric company, so we will use averages to reach an estimated conclusion. Most desktops use at most about 400-500 watts for the computer and monitor combined. Other equipment such as printers, routers, and modems uses extra power too, but no more than a few watts each, so it isn't a big deal.

Laptops are designed to use much less power than desktops so they can get maximum battery life. Many components in a laptop are smaller and use less power, such as the fans and hard drives. The processor, video processor, RAM, bus, and other components also draw much less, and modern laptops employ advanced power management techniques, such as lowering the screen brightness and slowing down or turning off components when they are not in use (the hard drive, processor, screen, networking chips, and so on). Because of these power saving features, most laptops use only about 75-100 watts for the entire system.

To find out how much this costs, we need to know the price of power. Again this varies from state to state and city to city. The state with the lowest electricity cost is Washington at 8.66 cents per kilowatt-hour, and the most expensive electricity is in Hawaii, where it costs 37.11 cents per kilowatt-hour. The average price for electricity in the US is 12.40 cents/kWh.

What exactly is a kilowatt-hour? It is the amount of energy consumed by something that draws 1,000 watts for one hour. So a computer using 400 watts consumes 0.4 kWh each hour. At the average rate, running that computer at maximum power would cost $1.19 a day, about $35.71 over a 30-day month, and about $434 for the entire year. Of course, most computers are not running at maximum capacity all the time.
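The arithmetic above can be wrapped in a small function. This is just a sketch of the kilowatt-hour math from the paragraph; the 12.40 cents/kWh rate is the US average cited above, and 400 watts is the hypothetical desktop running at maximum power.

```python
def yearly_cost(watts, cents_per_kwh=12.40, hours_per_day=24, days=365):
    """Dollar cost of a device drawing `watts` continuously for `days` days."""
    kwh = watts / 1000 * hours_per_day * days  # energy used, in kilowatt-hours
    return kwh * cents_per_kwh / 100           # convert cents to dollars

print(f"400 W desktop: ${yearly_cost(400, days=1):.2f}/day, "
      f"${yearly_cost(400):.2f}/year")   # about $1.19/day, $434/year
print(f"75 W laptop:   ${yearly_cost(75):.2f}/year")   # about $81/year
```

Plugging in your own machine's wattage and your local rate gives a quick personal estimate.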
If the screen and other components such as the fans and hard drive are set to sleep after a period of inactivity, the machine would probably average about half of its maximum power, so the price drops to roughly $17.86 a month, or about $217 a year. Many people these days have a laptop as their only computer, and as previously mentioned, laptops use a lot less power than regular desktops. At an average of 75 watts with power saving settings enabled, a laptop would cost about $6.70 a month and $81 a year.
So a typical full sized computer uses about $217 worth of electricity a year, and a typical laptop about $81. Where, then, did the $600 claim come from? If a computer drew a full 500 watts around the clock at 12.4 cents per kWh, it would cost about $543 per year, which is in the neighborhood of the claimed figure. I would imagine that whoever came up with that number assumed the 500 watts that some high end computers can draw is being used constantly, the entire time the computer is on. But that is not true at all: many high end gaming PCs have power supplies rated for 500 watts or more, but that is the maximum amount of power that can be drawn at one time.

These high end machines do have components that draw a whole lot more electricity than a standard consumer PC. Many gaming computers have several large, powerful fans, multiple graphics cards that can use 75 watts or more each, one or more hard drives spinning at up to 10,000 RPM, a central processor that uses 50-100 watts by itself, and extras like lights and liquid cooling systems. But that maximum power is only drawn when the computer is under full load, such as playing a very graphics intensive game, and even then the draw is always going to be less than the maximum the power supply puts out; otherwise there wouldn't be enough power and the computer would crash.

Even high end computers use less than their maximum power most of the time. If you aren't playing a game, the graphics cards use far less than their peak. If the processor isn't at 100% capacity, it uses less power as well. If the hard drive hasn't been accessed in a while, it spins down to save power. And when the CPU and other components are not at full capacity, they produce much less heat, which means the fans don't have to work as hard and can run at a lower speed or even be turned off if the temperature is low enough.
So even a high end PC, such as one used for gaming, doesn't draw its full rated power around the clock. Even a top of the line machine with a 600-700 watt power supply and all the best components would use roughly half that amount on average. It would take a computer with components actually drawing 500 watts at full load, 24 hours a day, 7 days a week, to run up a bill in the $540-600 range. That backs up my original point: the figure I was quoted is an exaggeration, and it usually costs much less for an average computer to be left on all the time, even one actively used for many hours every day. If you don't use your computer that much, say only a dozen or so hours a week, then it won't cost an arm and a leg to run, even if you leave it on constantly.

So in conclusion, it does cost money to leave your computer on all the time, but how much depends on how heavily you use it. If you run programs in the background, like constantly downloading files, streaming, or sharing files with other systems or devices, it will use considerably more power, but most of us don't. And even if you do leave programs running that hit the hard drive and network, the CPU probably won't be at peak, and other components like the fans, graphics chips, and monitor will be drawing less than their maximum. Of course, turning your computer off uses no power and therefore costs you nothing, but leaving it on won't likely break the bank either.
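To put a rough number on the "it depends how much you use it" point, here is a simple duty-cycle sketch. The wattages and hours are illustrative assumptions, not measurements: a gaming PC that peaks at 500 W under load but idles near 100 W, played 3 hours a day.

```python
RATE = 0.124  # dollars per kWh, the US average cited in the article

def yearly_cost_mixed(peak_w, idle_w, active_hours_per_day):
    """Yearly dollar cost for a machine split between full load and idle."""
    idle_hours = 24 - active_hours_per_day
    kwh_per_day = (peak_w * active_hours_per_day + idle_w * idle_hours) / 1000
    return kwh_per_day * 365 * RATE

# 3 hours/day gaming at 500 W, idling at 100 W the rest of the time:
print(f"${yearly_cost_mixed(500, 100, 3):.2f}")  # about $163/year
```

That lands far below the $543 an always-at-peak 500 W machine would cost, which is the gap between the claim and reality.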