Good Surveys and Bad Surveys

I get some interesting letters about the Best of the Mac Web survey each time I run it. I suspect I’ll get more about the Rest of the Mac Web survey we’re launching today. Let’s look at some of the common threads:

It’s biased. It’s weighted to sites that link. Someone can spike the results.

In an effort to avoid bias, I try to make sure every site knows about the survey via my daily mailings – and individual mailings to the publishers of sites that may not be on those lists. Since the survey is always mentioned on MacSurfer, I feel that even the publishers of sites without email links have the opportunity to know about and link to the survey.

By giving as many sites as possible the opportunity to link and send their users over to take the survey, the ability of any single site – let alone any single user – to skew the results is minimized.

Also, the survey manager used by Master.com is instructed not to let the same person vote twice. I don’t know how it tracks that, but even if a few users manage to vote two or three times, out of 2,000 votes that won’t carry a lot of weight.
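
I don’t know the internals of Master.com’s survey manager, but a typical way to screen out repeat votes is to key each ballot on a browser cookie plus the visitor’s IP address. Here’s a rough Python sketch of that idea – everything in it is illustrative, not anything Master.com has documented:

    # A rough sketch of duplicate-vote screening, assuming ballots are keyed
    # on a browser cookie plus the visitor's IP address. The names here are
    # illustrative only; this is not necessarily how Master.com works.
    seen_voters = set()

    def record_vote(cookie_id, ip_address, ballot, results):
        """Count a ballot only if this cookie/IP pair hasn't voted before."""
        voter_key = (cookie_id, ip_address)
        if voter_key in seen_voters:
            return False                     # looks like a repeat vote; ignore it
        seen_voters.add(voter_key)
        for site, rating in ballot.items():  # ballot maps site name -> rating
            results.setdefault(site, []).append(rating)
        return True

Even a scheme like that can be fooled by clearing cookies or switching machines, which is why a handful of repeat votes among 2,000 barely moves the averages.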

Aren’t these just your favorite sites? What about XYZ?

All surveys are flawed, but we try to list the sites we anticipate will be the best known and best respected on the Mac Web. We deliberately skip Apple, since this is a survey of independent sites. We also skip the online retailers and link pages, since they don’t provide original content. That was the reason ramseeker was not listed in the most recent survey.

We missed a fair number of magazine-related sites; they are listed in the Rest of the Mac Web survey. By looking at results on that survey, we’ll have an even better list of sites for the Spring 2002 Best of the Mac Web survey.

Some of these are my favorite sites – I have a real fondness for Macs Only, for instance. But many are not. Some I don’t care for at all, but let’s not name names.

I’m not familiar with all the Mac-related sites out there. I suspect there are somewhere around 200, many deserving of their obscurity and many gems waiting to be uncovered. Reader feedback is the reason a lot of sites in the Rest of the Mac Web survey got listed.

Isn’t this just to feed your ego?

Maybe. The human heart is deceitful above all things. But it’s more to find out what you, the reader, think of the Mac Web. Nobody else seems to be running this kind of survey, so I took it on myself.

So, none of the other lists are good enough? And why is your list so different from SiteLink and other top Mac site lists?

Everyone has their own methodology. I think asking people what they think about sites they are familiar with is the best way to find out what they think. Other “top site” lists have different agendas.

For instance, SiteLink rates the Top 20 sites and the Top 3 in each category, but not by asking anyone’s opinion. All they do is count how many times people have clicked through to a site listed on their home page. Nothing more. Nothing less. Once a SiteLink user bookmarks another site and visits from that bookmark, the site no longer gets clicks on the SiteLink home page. I think that’s a very flawed way to determine the best sites on the Web – or even most popular ones.

Another scheme used frequently is links from the sites on the list. By clicking on a link to the “Best Mac Sites” on some site, not only do I see a list of sites, but I also cast a vote for the site that sent me over. Again, it says nothing about my opinion of the site, only that they published the link. This is another way to create a popularity contest without producing meaningful results.

What would be a good reflection of site popularity, and thus a reflection of which sites we consider worth visiting, would be a listing of sites by daily, weekly, or monthly traffic levels or number of unique visitors. But there are some problems with that:

  • many publishers are very protective of their site data
  • different people mean different things by “hits”
  • it’s not fair to compare one site’s raw hit counts against those of sites that filter out spiders when analyzing their data (see the sketch below)
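
To make that last point concrete: a site that strips spiders and other robots from its logs will report noticeably lower numbers than one that counts every request. Here’s a rough Python sketch of that kind of filtering – the log format and user-agent strings are assumptions on my part, not how any particular Mac site actually does it:

    # A rough sketch of spider filtering, assuming a combined-format access
    # log where the user agent is the last quoted field. The markers below
    # are illustrative examples, not a complete list of crawlers.
    SPIDER_MARKERS = ("googlebot", "slurp", "crawler", "spider", "bot")

    def count_page_views(log_lines):
        """Return (raw_hits, filtered_hits) so the two figures can be compared."""
        raw = filtered = 0
        for line in log_lines:
            raw += 1
            parts = line.rstrip().rsplit('"', 2)
            user_agent = parts[-2].lower() if len(parts) == 3 else ""
            if not any(marker in user_agent for marker in SPIDER_MARKERS):
                filtered += 1
        return raw, filtered

Run over the same log, the raw count and the filtered count can differ by a wide margin, which is exactly why comparing one site’s raw hits with another site’s filtered figure tells you very little.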

Still, it would be interesting to find out how popular the leading Mac sites are. I know it stroked my ego to be one of the better rated sites in the Best of the Mac Web survey; I wonder how I’d feel on learning how the 800,000 pages per month (after spiders) served by Low End Mac compare with other sites.

Your methodology is all backwards. Besides, someone objective should do the research.

The only reason I run the Best of the Mac Web survey is that nobody else seems to be doing it. If some graduate students want to perform such a study, I don’t think the end results would be much different.

Then there’s the idea of doing a poll like Gallup and Harris do – going to the public, not having them come to you. With Mac users representing maybe 5% of the population and perhaps half of all computer users online, only a few of every hundred calls would reach an online Mac user, so it would take 30-35,000 phone calls to survey 1,000 of them. By having Mac sites send over their viewers, we make sure the entire sample consists of people who surf the Mac Web.

Still, doesn’t it skew the results when not all of the listed sites link to the survey?

That’s certainly what Gene Steinberg of Mac Night Owl would like you to think. (See his email posted on Bite.org and his letter to Charles W. Moore of Applelinks.) While we agree that a site might have received more votes if it had promoted the survey, we don’t think the scores themselves would have changed a lot. As evidence of this, many sites that posted no links to the survey received a whole lot of votes and high scores.

We tried to contact the sites listed after we launched the survey. We began the survey on Thursday, November 15 and ran it for 8 days. Steinberg knew about the survey by the weekend, as evidenced by an email we received from him about the survey. Based on his knowledge of the survey, we believed it was not necessary to send him a release.

It did take several days to send out all the news releases, and because some sites don’t make it easy to find a point of contact, some were never notified. Still, the point of the survey was first to find out what Mac users think of the sites and second to discover how well known they are.

I think Steinberg is mostly upset because he didn’t get a chance to spike the results with his 10,000 member mailing list – but I could be wrong. (For the record, I didn’t spike the results with the 8,000+ members of the Low End Mac email lists, either. Maybe I should have….)

Sour grapes?

C’mon, aren’t you just doing this to build site traffic?

Increased traffic is a wonderful byproduct of the survey. We had over 35,000 hits on Tuesday, nearly 10,000 hits on the article explaining the survey, over 2,000 survey participants, and over 6,000 people have already read the results.

Sure, it’s boosted traffic, but we’re already averaging about 30,000 hits every weekday. This might spike us past 800,000 pages in a month for the first time, but hits aren’t all they’re cracked up to be.

If we have any problem at Low End Mac, it’s a surplus of pages vs. what advertisers are prepared to buy. To get the top ad dollar, we need to provide a top rated site (which the survey says we do) and the optimal number of page views. The law of supply and demand dictates that our very popularity may contribute to the lower than expected income we’ve had this year. Go figure!

But truth be told, I am doing this to build site traffic. I see this as a public service, exposing Mac users to other Mac-related resources. It may create some new regular visitors to Low End Mac – that’s always a good thing. Better yet, it may bring new users to other worthy sites, which is good for the whole family of Mac-related sites. If I help other good sites grow, I feel as good about that as I do about LEM making the Top 10.

I’m a firm believer in building community. You’ll find it on the LEM email lists. You’ll also find it among Mac webmasters, many of whom have more sense of community than rivalry. And you’ll find it on LEM, where we link promiscuously to other sites (Mac and otherwise) with good content.

And when part of the community benefits, we are all better off.