Site Analytics: 'Why' Is as Vital as 'What'
Behavioral data reveals where a site falls short of goals. Attitudinal data tells why.
Macs or PCs, Fords or Chevys, VHS or Betamax... all debates that elicit strong opinions. Thanks to you, good readers, I've found another: behavioral data or attitudinal data?
When I wrote in this space about using behavioral data to calculate exit ratios and pinpoint site problems, I got a bushel of e-mail from attitudinal data devotees asking: What about surveys? Customer feedback? Questionnaires? Don't they count?
Well, I was getting to that.
Just as Macs and PCs have their particular marketplace sweet spots, behavioral and attitudinal data are both vitally useful. Used together, the whole is greater than the sum of its parts. We maximize return on investment (ROI) for site optimization by using each in the most effective way.
We use behavioral (Web analytics) data to zero in on problems within a site's architecture. Behavioral data can reveal where a site falls short of business goals. Attitudinal data can tell us why.
Pure analytics shows the impact a problem has on all visitors, not just a core sample. It's an unvarnished, unedited, macro view of site traffic patterns. On a site built to generate leads, behavioral data might show most visitors find the crucial contact page, yet quit before completing it. You'll know the exact drop-off rate and the defectors' entry and exit routes.
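For readers who like to see the arithmetic, that drop-off figure is simple to compute from raw page-view and submission counts. Here's a minimal sketch; the page names and numbers are hypothetical, not from any real site or analytics package:

```python
# Hypothetical funnel counts exported from a web analytics tool.
# Names and figures are illustrative only.
funnel = {
    "contact_page_views": 4200,  # visitors who reached the contact form
    "form_submissions": 630,     # visitors who completed and submitted it
}

def drop_off_rate(views, completions):
    """Share of visitors who reached the form but quit before submitting."""
    return (views - completions) / views

rate = drop_off_rate(funnel["contact_page_views"], funnel["form_submissions"])
print(f"Drop-off rate: {rate:.0%}")  # -> Drop-off rate: 85%
```

A real analytics report would break that 85 percent down further by entry and exit route, but the headline number is just this ratio.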
That's more useful than vague survey data calling the sign-up process "poor." But it's only half the battle. Unfortunately, analytics data won't tell us why so many people aren't filling out the form or why they exit the site in a huff.
Behavioral data can tell you what's happening and where. Attitudinal data can tell you why.
So after using straight data to isolate a problem, we move to attitudinal analysis. Surveys, follow-up e-mail, customer sessions, and feedback-form submissions can tell why a problem is a problem. Maybe the form is too long or perceived as too invasive. Perhaps the form is fine, but the nomenclature of the link to it is seen as misleading. Maybe the site didn't offer enough orientation information before asking for data. Or it could be these particular target leads prefer phone contact and abandon when they can't find one. Behind every what may lurk a dozen whys.
Had we stuck to attitudinal data alone and talked only with a sample of customers, connecting their complaints to degraded site performance might have taken a long time. (Bless them, but customers give you all kinds of issues to wade through. Though important to them, these issues can distract you from what's really important to site performance.)
On the flip side, had we ignored attitudinal data, we might never have isolated the root causes of performance problems. That would have left us trying one fix after another, hoping something would move the needle.
Once the whys are corralled and the corresponding improvements are implemented, we're back to behavioral data again. That's the surest measure of effectiveness: weigh the new analytics figures against the old baseline to see whether the changes actually improved performance.
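That before-and-after comparison can be sketched in a few lines. Everything here is hypothetical (the visit and completion counts are made up to illustrate the idea, not drawn from any real report):

```python
# Illustrative baseline vs. post-fix figures; all numbers are made up.
baseline = {"visits": 4200, "completions": 630}   # before the changes
current = {"visits": 4050, "completions": 1013}   # after the changes

def completion_rate(figures):
    """Fraction of visitors who completed the form."""
    return figures["completions"] / figures["visits"]

before = completion_rate(baseline)
after = completion_rate(current)
lift = (after - before) / before  # relative improvement over the baseline

print(f"Completion rate: {before:.1%} -> {after:.1%} ({lift:+.0%} lift)")
```

The point isn't the arithmetic; it's that the baseline must be captured before the fixes go live, or there's nothing to weigh the new figures against.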
So, attitudinal data lovers: relax. Used in complementary ways, behavioral and attitudinal data lead to maximum site change effectiveness. The market wasn't big enough for both Betamax and VHS. But in the growing field of Web data analysis, there's no need to choose. We don't just have room for both attitudinal and behavioral data; we need both.