This past week I've been in California visiting my father. He's
got several computers, and he always seems to have problems with
all of them. He had just moved into a new apartment, and they gave
him a new wireless cable modem. Since his ThinkPad has built-in
wireless networking, he figured it would detect the modem and connect
automatically - and he was frustrated when it simply wasn't
working.
I tried to use my PowerBook to connect, and a few seconds after
I booted the machine I was automatically connected to the wireless
network. It turns out that you actually have to configure Windows
to connect to a new wireless network. It's not something the
operating system does automatically.
Would an average consumer know that? Probably not. They expect
it to just work.
I wasn't expecting the Mac to have any trouble - I would've been
surprised if it hadn't connected.
Not connecting would make something seem "broken" to an average
computer user. Really, how many people would have figured out that
you had to both search for and configure a wireless network
manually?
Then, of course, there's his Dell, which was running so slowly
that it was practically unusable. Why? A combination of preinstalled
Dell garbage, spyware, adware, and registry corruption that made
this 1.1 GHz machine feel closer to 100 MHz.
Running several different utilities managed to make the problem
less noticeable, but it's still nowhere near as fast as it was when
it was new.
But his blue G3 remains pretty stable, isn't too slow,
and is generally reliable - not that he uses it much anymore with
the more modern ThinkPad around. I updated it to OS X 10.3 (it
was running 10.2), and I'm surprised at how quick it is for a
five-year-old computer.
In fact, I was just watching a QuickTime video on it that played
without the slightest lag. I expected playback to stutter a bit if I
dragged the window across the screen while the video was playing, but
it didn't affect it one bit. I know that my 350 MHz G3 at home is
fast and reliable running OS 9, but (as a slight digression) I had
no idea what a great OS X machine a blue G3 makes as
well (see Why Apple's blue &
white G3 is a best buy).
Consumers really want a machine that will remain as reliable as
it was when it was purchased. On PCs, manufacturers love to install
utilities that promise to make your computing experience "trouble
free," but those utilities often complicate things for users, get in
the way, and slow the computer down.
Since Windows is vulnerable to spyware - which can be just as
bad as having a virus - manufacturers should be bundling software
that helps eliminate some of it instead of software that makes your
computer even less usable.
But when it comes to the Macintosh, the 40 GB hard drive that is
now in my blue G3 was formatted last April, and the OS was copied
over from my beige G3's original 6 GB drive, where it had already
been installed for at least four months. There hasn't been a single
problem with it, and I don't plan on doing a reinstall anytime
soon.
Apple simply doesn't tack a bunch of trash onto the OS that no
one will ever use.
Consumers expect things to work. They don't think about having
to scan for spyware, viruses, or anything else. To most computer
users the computer either works or doesn't work.
Why didn't Cindy hand in her essay on time? The computer was
broken. As far as she was concerned, it didn't have registry
corruption, a bad boot sector, or even a virus. It was just
broken.
The thing that's always been nice about the Mac is that it tends
to work fairly well all of the time - unless there's a hardware
failure or you do something to it (installing too many system hacks
can kill it, which I've learned from experience).
I don't mean to bash Windows PCs - after all, I use them myself,
and many other Mac users use PCs as well. But their disadvantage is
that one small issue can render the whole computer essentially
unusable.
And that's enough for Cindy to say, "My computer's broken. I'll
hand in the paper on Friday."