# Should I always leave my computer on?

Scott L. Barber

Scott L. Barber first posted this to Quadlist. It is reprinted with his permission.

By request, I've searched my personal archives and stumbled on my thesis. It's rather short compared to the ones I write now, but should get the point across. Now, there are two complete reposts here, so don't get confused when the subject changes.

Hope someone gets a kick out of this, and I'll be happy to answer questions concerning it, now that most of the typing is out of the way.

The person I relied on often and heavily for computer repair once told me that it is less wearing on the equipment to leave a computer running than to turn it on and off every day. I never quite bought into it, considering the amount of electricity that number of machines would use during the 16 hours between the end of one workday and the beginning of the next. But as our kids got older, and our home personal computer started getting used from about 6:30 a.m. (gotta check the email before leaving for school) to around 10:30 or 11 at night for various homework, chat room, listserv, checkbook, etc., I started doing that. When I mentioned this once to a friend, they were horrified and said I should definitely at least turn off the monitor at night.
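For a sense of scale, here is a back-of-the-envelope sketch of what those idle overnight hours cost. The wattage and electricity rate are my assumptions, not figures from the post - a desktop of that era might draw roughly 100 W idle, and $0.10 per kWh is just a placeholder rate:

```python
# Rough estimate of the electricity used by leaving one machine on
# overnight. IDLE_WATTS and RATE_PER_KWH are assumed placeholder
# values, not measurements from the post.

IDLE_WATTS = 100        # assumed average draw while sitting idle
HOURS_IDLE = 16         # end of one workday to the start of the next
RATE_PER_KWH = 0.10     # assumed electricity price, $/kWh

kwh_per_night = IDLE_WATTS * HOURS_IDLE / 1000    # watt-hours -> kWh
cost_per_night = kwh_per_night * RATE_PER_KWH
cost_per_year = cost_per_night * 365

print(f"{kwh_per_night:.1f} kWh/night, "
      f"${cost_per_night:.2f}/night, ${cost_per_year:.2f}/year")
```

Under those assumptions it works out to about 1.6 kWh a night - real money across a whole office of machines, which is why the electricity argument isn't crazy on its face.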

OK, this divides a computer into two separate halves. Both people are right, and, yes, there is an electrical issue.

Here's the theory of leaving everything on: At the instant an electrical device is connected to a constant, unchanging power source, there is a burst of energy comparable to a mechanical "jerk." The rate of change is instantaneous and effectively infinite. This surge, for an instantaneous moment, is higher than 5v or 12v, or whatever the supply is rated for - therefore, it's a surge. Note as well that motors, such as the electric motors in the hard drives, floppy drives, and fans, require the greatest torque to make that instantaneous transition from completely still to moving. Because of this, motors usually have two coils, a drive coil and a starter coil. The starter coil provides the high-torque "push" that gets the motor moving.

All of this means that at the moment the machine is turned on, a large amount of amperage is necessary to get things moving, and the mechanical torque and the electrical draw are at their highest - usually, for an instant, higher than the rated tolerances of the equipment. This is the point where equipment usually fails, because it is the greatest instantaneous load that can be put on the equipment.

The common analogy is, "A light bulb never blows when it's already on, it only blows when it's turned on." For the most part this is true, and the reason above is why.
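The light-bulb analogy can be put in rough numbers. A tungsten filament's resistance rises steeply with temperature, and a cold filament has on the order of one tenth the resistance of a hot one - the exact ratio varies by bulb, so the 10:1 figure below is a textbook ballpark, not a measurement:

```python
# Illustration of why bulbs blow at turn-on: Ohm's law with a cold
# versus hot filament. The 10:1 hot/cold resistance ratio is an
# assumed ballpark figure.

VOLTS = 120.0
RATED_WATTS = 100.0

hot_resistance = VOLTS ** 2 / RATED_WATTS   # R = V^2 / P, 144 ohms hot
cold_resistance = hot_resistance / 10       # assumed 10:1 ratio when cold

steady_current = VOLTS / hot_resistance     # current while already running
inrush_current = VOLTS / cold_resistance    # current at the instant of turn-on

print(f"steady: {steady_current:.2f} A, inrush: {inrush_current:.2f} A")
```

So for that brief instant the filament carries roughly ten times its running current - the same kind of turn-on stress the theory attributes to drive motors and power supplies.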

Here's the theory of the other camp - turning everything off: This theory comes in two parts: the conspiracy and the electrical theory. The conspiracy is that electrical power companies have conspired to convince people to leave their equipment on in order to increase profitability. Most often this is the first and only reason this camp has offered me. (There's a considerable amount of conspiracy theory out there . . . )

The electrical theory is that during operation a piece of equipment is subjected to more power surges than the initial surge is worth, which means that over time more damage is done with the power on. Also, electrical parts wear out, and over time the resistance in the equipment will increase, making it wear out sooner. It's more of a high-mileage theory, and its common analogy is, "But all light bulbs are rated for so many hours of use, and once those hours are used up, the bulb will blow."

I'm personally inclined to believe the first way, up to a point. I have serious problems with the conspiracy theory, because it does not disprove the first theory; it simply ignores the empirical data (which, as an Electrical Engineer, I believe is correct). I also feel that the "light bulb will blow anyway" theory, or high-mileage theory, is incorrect for certain electrical reasons. I do, however, feel that the surge argument is correct, and because of this I suggest that all of my clients purchase a specific brand and type of surge suppressor - an IsoTel (made by Tripp-Lite, and very expensive compared to the crappy $7 variety). With this surge suppressor, which is constructed differently than any other suppressor I've ever seen, I feel that the last part of the second theory is taken care of.

There is a side note. Monitors are simple, stupid display devices. They normally pull more power than the computer does while it's running. They are cheap, can take a myriad of surges, and are designed for rough use. When there are power problems, the screen simply wavers, resizes, or shows waves - the CRT acts as the ballast that compensates for surges and power problems. A monitor can absorb the instantaneous power-on through the tube (which is why the tube usually flashes when the power is turned on). Turn the monitor off when not in use - even the energy-saving standby modes don't really work right; they still leave a trace wattage of 5 to 10 watts (or sometimes more). Save the electricity, but leave the computer on if you can.
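That trace wattage adds up. As a quick sketch, here is what 5 to 10 W of standby draw comes to over a year, taking the worst case of a monitor left in standby around the clock (the wattage range is from the post; everything else is simple arithmetic):

```python
# What the "trace wattage" of monitor standby adds up to per year,
# assuming the monitor sits in standby 24 hours a day (worst case).

for standby_watts in (5, 10):
    kwh_per_year = standby_watts * 24 * 365 / 1000   # watt-hours -> kWh
    print(f"{standby_watts} W standby -> {kwh_per_year:.1f} kWh/year")
```

Even at the low end that's tens of kilowatt-hours a year doing nothing, which is the argument for flipping the monitor's switch rather than trusting its standby mode.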

Someone from the other camp, please jump in . . . I would really like to hear a convincing argument for turning the equipment off. If there is a factor I'm missing, or I've been somehow misinformed (or missed a few graduate classes on triple-integral electrical calculations), please feel free to let me know.

But wouldn't turning the monitor on and off defeat any benefit of leaving the computer on?

That's where the separate surge suppressor comes in . . . that objection comes close to being the argument against, but there are devices that protect against this problem. Just make sure you don't plug the monitor into the computer's pass-through power outlet, as some Macs let you do.

Scott L. Barber <serker@earthling.net>
Pres/CEO, SERKER Worldwide, Inc.
Providing Hardware/Networking/Telecomm for 13 years