Is usability testing necessary?

For over a decade, I've consistently told design teams that the most effective way to ensure they build usable products is to incorporate usability tests into their process. I'd say, "Usability test now. Test as early as possible. Test as many times as you can before launch."

I have a really good reason for this recommendation. All too often, I'm brought into organizations that haven't had the opportunity to learn first-hand about their users. In many cases, the design team has never interacted with an actual user of the product. No up-front user research, no usability testing. Not even a focus group or survey. The team is often forced to base the design on their (sometimes faulty) assumptions about what users want in a product.

When teams haven't had a process for incorporating user needs into their designs, I recommend they conduct a usability test right away. I've found that usability tests are the most effective method to sell the importance of a user-focused design to teams. By conducting usability tests, teams get an opportunity to see how real people interact with their products. In many cases, this unfortunately means they see users struggling. The good news is that when this happens, I have very little trouble convincing the design team to start conducting more rigorous research with users. These activities typically include field studies, ethnographic user interviews, and user profile (or persona) development.

But what about the teams that do a good job of incorporating user research into their design process? Is it always necessary for them to conduct usability tests? I've come to the conclusion that usability testing isn't always necessary.

For example, the good folks at 37signals are the designers of many highly successful applications, including the Basecamp project management app. To create their products, 37signals doesn't conduct usability tests. Instead, they build for themselves. They create products that work well for them and satisfy their specific needs. By doing so, they also create products that satisfy the needs of a much larger audience. One reason 37signals's products work so well is that the team consciously focuses on their users. In their case, however, they didn't need to conduct up-front user research, because they are the target users for products such as Basecamp.

In addition, while 37signals doesn't conduct formal usability tests, they are actively learning from their customers all the time. Through their blog, email, and Twitter responses, they listen and respond to customers, and they've made design changes based on that customer feedback. I've encountered other design teams that have successfully launched many products without conducting a single usability test. While these teams don't test, they do conduct rigorous, up-front user research.

At the beginning of projects, they conduct user interviews or field studies, develop user profiles and design requirements based on the research, and use those user requirements to drive the design. While I find it's always ideal to conduct a usability test to evaluate and validate a design, it's not a required activity for these teams. To assess whether usability testing is required on a project, you'll want to ask yourself the following questions. Did you:

  • Interview subject matter experts within your organization who have knowledge about the business or users?
  • Conduct rigorous up-front research with users?
  • Identify user goals and needs based on solid research?
  • Build a design tied to the user research?
  • Review the design with your subject matter experts?
  • Iterate based on feedback from users, designers, or usability professionals?

If you "answered "yes" to all of these questions, usability testing may not be necessary. If you answered "no", "Usability test now. Test as early as possible. Test as many times as you can before launch."