
RETURN ON TECHNOLOGY

'Survey says...' Not much. Don't depend upon them

April 16, 2007

I've noticed a trend in restaurants lately. Most of them seem to have dropped those customer comment cards they used to keep clamped at each table.

I can see why. In my whole life, I've probably filled out fewer than half a dozen of those things, and always when I was particularly irked about the service. I've never turned them in, consoling myself with filling them out to blow off steam and then heaving a sigh as I tore them up because I just knew my voice would mean nothing to the massive forces at work in chain restaurant management.

Studies have shown that you have to be a) livid with anger, b) ecstatic with satisfaction, or c) massively bored to fill out one. This is an example of what survey designers call "self-selection," where the survey participant volunteers for the task, rather than being selected at random or being part of a huge, designed study.

Those same designers are rightly suspicious of results from the cards. Written comments might have some value, but the one-through-five little square boxes do not have much, despite their numerical window dressing.

Such grasping at decimals assumes that numbers, any numbers, are valuable. It's wrong, but compelling. The idea that you can get your customers to rate you anonymously and en masse is too alluring to ignore.

Imagine going before the executive committee with PowerPoint slides that prove you've gone from an average satisfaction of 4.1 to an average of 4.2, and listen to the appreciative murmurs all around. You're gaining traction. The ad campaign is working. The new training program is taking hold. Never mind that the numbers are all but meaningless, because the samples were small, the respondents were atypical, the calculation of a "mean" for a scale of five numbers is worthless, and the differences were almost certainly due to chance, anyway.
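The "due to chance" point is easy to demonstrate. The short simulation below is a hypothetical illustration, not data from any real survey: it draws two waves of 50 ratings from the same fixed one-to-five distribution, so by construction nothing has actually changed, and counts how often the average still "improves" by 0.1 or more.

```python
import random

random.seed(1)

# Assumed, invented distribution of 1-5 ratings; it never changes,
# so any apparent movement between waves is pure sampling noise.
weights = [5, 10, 20, 35, 30]

def survey_wave(n=50):
    """Simulate n self-selected respondents each ticking a 1-5 box."""
    return random.choices([1, 2, 3, 4, 5], weights=weights, k=n)

trials = 10_000
lucky = 0
for _ in range(trials):
    before = survey_wave()
    after = survey_wave()
    # Did the mean "improve" by at least 0.1 with no real change?
    if sum(after) / len(after) - sum(before) / len(before) >= 0.1:
        lucky += 1

rate = lucky / trials
print(f"Apparent gain of 0.1 or more in {rate:.0%} of trials")
```

With samples this small, that flattering 4.1-to-4.2 jump shows up in a large share of the trials even though satisfaction never moved at all.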

It would appear that restaurant management learned the little cards were more trouble than they're worth. But as restaurants have given up on them, Web owners are embracing them, in the form of popup surveys. It still amazes me that such surveys are avidly scrutinized, when their flaws are numerous and deadly.

Let's start with the surface absurdity. The common wisdom is that Web pages must load quickly to satisfy users, and anything over about 10 seconds is considered pokey. So Web owners do everything they can to get the page down and into a browser as quickly as possible.

And what then? The user gets to see a survey form pasted on top of the very page he was so desperate to get. He must now transact some survey business, at minimum clicking a "No" link that makes the survey form go away. If he is in danger of clicking away after only 10 seconds, what will keep him nailed to a site that insists on shoving a survey in his face? It takes an extraordinarily bored, cooperative or pleased user to tolerate it.

Then there is the sampling problem. It's well-known to statisticians that the best numbers come from truly random samples, because then you can use the sample's numbers to make good guesses about the rest of the population. The worst case is self-selection, which can ruin any idea of extending those numbers anywhere outside the sample itself. In effect, self-selection is good only for those who self-select, and they're hardly representative of all the site visitors.

Survey companies also will trumpet the idea of "statistical significance," to reassure the customer the numbers mean something. That assurance falls to earth like a wounded duck when you realize that statistical significance is a mathematical artifact, and might not reflect the real world at all. Garbage in, garbage out. Measure the wrong things, and no statistical analysis will save you. Significance isn't always significant.

I'm not saying you shouldn't try to measure Web site satisfaction. I'm saying there are better ways that don't rely on users' whims. For instance, try counting returning visitors and watch where they go. Satisfied users will generally turn up again and head right to things they know they want. See if they end up going all the way through the "conversion funnel," becoming customers instead of visitors. Don't listen to what they say so much as watch what they do. Make good use of Web analytics. Is your conversion rate rising? If so, somebody's happy.
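The "watch what they do" advice boils down to simple counting. The sketch below is a toy illustration only; the funnel stages and the visitor log are invented stand-ins, not a real analytics product or API. It tallies how many visitors reach each stage of a hypothetical conversion funnel and computes the resulting conversion rate.

```python
# Hypothetical funnel stages, in order from arrival to purchase.
funnel = ["landing", "product", "cart", "checkout"]

# Invented toy data: visitor id -> set of funnel pages that visitor hit.
# In practice this would come from your Web analytics logs.
visits = {
    "v1": {"landing", "product", "cart", "checkout"},
    "v2": {"landing", "product"},
    "v3": {"landing"},
    "v4": {"landing", "product", "cart"},
    "v5": {"landing", "product", "cart", "checkout"},
}

# Count how many visitors reached each stage of the funnel.
for stage in funnel:
    reached = sum(1 for pages in visits.values() if stage in pages)
    print(f"{stage:9s} {reached}/{len(visits)}")

# Conversion rate: the fraction of visitors who made it to checkout.
converted = sum(1 for pages in visits.values() if "checkout" in pages)
conversion_rate = converted / len(visits)
print(f"conversion rate: {conversion_rate:.0%}")
```

Nobody in this toy log was asked how satisfied he felt; the behavior, two buyers out of five visitors, is the whole measurement.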

By all means, talk to users whenever you can, but don't redesign based on a few remarks. Use them as input for the next version. Users commonly say one thing and do another. Buying is a behavior, not an intention. You make money on what they do; you don't make money on what they say.



Altom is an independent local technology consultant. His column appears every other week. He can be reached at timaltom@sbcglobal.net.