Brandon started a discussion on an internal mailing list, asking, “The usability lab is now the ________ ?”, and explaining:
10 to 15 years ago the usability laboratory was the must-have for vetting and testing your design ideas. But more nimble development processes and new tools seem to have superseded the usability lab. Some of these are:
- remote screen sharing and screen recording tools and services
- voice of the customer feedback systems, like Get Satisfaction and other SaaS tools
- A/B testing and multivariate testing
- remote co-design tools like online card sorting
- survey tools for straight-up surveys and concept evaluations
- public betas, previews, and opt-ins
By picking a smart set of these tools, I think an organization can make smarter decisions throughout the design and development process in a live dialog with more users than a traditional usability lab ever could.
The gist of the internal response was, “Analytics!” As Jesse put it, “Studying actual user behavior will beat the lab every time.” Analytics alone isn’t sufficient, so it’s best augmented with live intercepts of actual users engaging in real tasks.
As I see it, though, this is not news. We’ve known about analytics, multivariate and A/B testing, and the like for over a decade. Companies like Amazon have demonstrated just how powerful it is to instrument your site and use that data to drive key decisions. Heck, 5 years ago, Adaptive Path began work on what became Measure Map, an analytics tool designed specifically for bloggers, our reaction to a marketplace full of bloated tools.
However, even today, when we ask our clients for analytics data, it is common that they do not have any. How is it that in 2010, there are still many organizations not taking advantage of this rich vein of information?
The first reason (and, really, it’s my standard answer for almost anything to do with product development) is mindset. Most organizations, particularly product development teams, simply don’t have the mindset to measure, analyze, and iterate to improve. This is one place where marketing seems to have product development beat.
For decades, marketing has tracked the impact of its efforts, and that practice has carried over to the Web. Product development often does not — before ubiquitous internet, there was no value in instrumenting software interfaces, because there was no way to get that data. So product development learned to make decisions using other methods, such as usability engineering. And once something ships, product groups tend to move on to the next thing, and aren’t particularly interested in how well what they shipped is performing — that’s now in the past. Product dev teams need to incorporate analytics feedback into their processes.
Another issue that came up in our company dialogue was that many companies that do use analytics tools don’t know what to do with the data. It’s not clear how to turn that data into information, and that information into valuable insights about what action to take. They don’t know what questions to ask of the data; or they ask too many questions and get so many answers that they don’t know how to prioritize the feedback; or they have a focused set of answers, but they don’t know what to change in order to “move the needle” in the direction they want.
So what to do? Well, if you’re not embracing analytics, do so, now. Google Analytics is free. Use it. But don’t go crazy. Identify a few key measures. Allow a little while for the data to be dependable. Develop some hypotheses as to what’s happening, and make some small changes to test those hypotheses. Lather, rinse, and repeat. You’ll grow more confident, and as you do, you’ll get more sophisticated. Always remember that analytics are not “the answer” — but they are an input into a continually improving and iterating development process.
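To make the “develop hypotheses, make small changes, test” loop concrete, here is a minimal sketch of how you might check whether a change actually moved a key measure. It uses a standard two-proportion z-test on conversion counts; the function name and the traffic numbers are hypothetical, not from any particular analytics tool.

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a / n_a: conversions and visitors for the control (A).
    conv_b / n_b: conversions and visitors for the variant (B).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical numbers: 2,400 visitors per arm, testing a new signup button
z, p = two_proportion_ztest(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

The point isn’t the statistics so much as the habit: state the hypothesis before the change, pick the one measure it should affect, and only claim victory when the data clears a bar you set in advance.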