When I was deciding on a research topic for my master’s thesis, my advisor asked me, “Is this going to be a qual or quant thesis?” Qualitative and quantitative methods are usually considered two separate approaches to research, as if the answer to my advisor’s question couldn’t be “both.”
We’ve all read trend pieces on “Big Data,” and lately it seems like most companies prioritize quantitative methods and huge datasets. With all that data, so carefully tracked, surely we have everything we need to understand where and how a website can be improved. And analytics can give you valuable information that users may not be able to report. For instance, users are generally pretty bad at knowing how much time they spend on tasks. If you asked me how long I spend on the Internet each week, I guarantee I would underestimate it by quite a lot. Software, on the other hand, is much better at tracking and measuring time accurately.
There’s just one problem: analytics doesn’t give us the whole picture. It doesn’t tell us why users do what they do or how they feel about it. These are key concerns that designers and researchers must address to deliver the best user experience possible.
At Electronic Ink, we prioritize qualitative research and the valuable information it gives us. A few weeks ago, I conducted in-person usability tests at our lab in Philadelphia on a new version of a client’s website. We uncovered some very useful insights, including that some users had trouble navigating back to the homepage from certain parts of the site. One of the client observers hadn’t considered that users might not know they could click the company logo to return to the homepage: “I thought everyone knew that.” It’s unlikely we could have found that insight using big data, because until we saw users interacting with the site, we didn’t even know to ask the question, “Do users know that logo = homepage?”
Qualitative research can illuminate questions that need asking and real problems that need solving. At a recent PhillyCHI event, Jared Spool argued that researchers and designers can benefit from analytics, as long as they ground it in user research and use metrics that relate more clearly to the problem they need to solve. These kinds of qualitatively considered metrics are better for product assessment than the out-of-the-box metrics in typical analytics software.
For instance, when testing your email marketing product, ask users to rate the difficulty of a task (e.g., sending an email newsletter) at the beginning of the test, and again after they’ve used your product. Including those metrics alongside all the qualitative feedback from your testing session will gauge the product’s success better than a survey question asking “Would you use this feature?” or a printout of session times from the analytics report.
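The before-and-after comparison above amounts to simple arithmetic on paired ratings. Here is a minimal sketch in Python; the ratings, the 7-point scale, and the `mean_shift` helper are all illustrative assumptions, not data from the study described in this article.

```python
from statistics import mean

# Hypothetical ease-of-use ratings (1 = very difficult, 7 = very easy),
# collected from the same five participants before and after using the product.
before = [3, 2, 4, 3, 2]  # one rating per participant, pre-use
after = [5, 6, 5, 6, 4]   # same participants, post-use

def mean_shift(pre, post):
    """Average change in perceived ease across paired participant ratings."""
    return mean(b - a for a, b in zip(pre, post))

shift = mean_shift(before, after)
print(f"Perceived ease improved by {shift:.1f} points on a 7-point scale")
```

A positive shift suggests the product made the task feel easier than participants expected; pairing each participant with their own baseline controls for individual differences in how people use rating scales.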
Quantitative and qualitative research are not at odds, but work best as a team.