Dr. Carol Barnum

In too many organizations, content and usability are woefully separated. How many times have you witnessed or experienced usability testing with a placeholder for content rather than the actual content? Dr. Carol Barnum points out that usability testing that ignores content misses the whole point.
In many organizations, content and usability often operate in different silos. Why do they need to work more closely together?
Content is an integral part of user experience. Even if content is coming from different places within an organization, and even if you have a core development team of interaction designers responsible for usability, it behooves a company to involve everyone with a hand in user experience – including content strategists and developers – to plan together and learn from users through usability testing.
Unfortunately, not only do many organizations fail to bring content and usability together, but they also often don’t have content developed in time for usability testing. For example, I’ve seen usability testing where the Help content had not yet been created. In those cases, I have to tell participants that Help isn’t there, but ask them to click on Help whenever they feel they need it, so we know where they will want help. I’ve even done usability testing where absolutely no content had been developed.
For me, these situations speak to the underlying problem of not understanding the important role that content plays in any user experience. Sure, the usability group is looking at important parts of the user experience such as interaction design, terminology, and information architecture. But they’re missing what users are actually there for, which is not just to find something but to be able to use something.
How do you keep content usability testing scientific and unbiased?
In my view, usability is not about science. Many companies will ask me, “How in the world can you validate findings when you only have five or six participants in a single day?” So, we first disabuse people of the notion that usability is a science. We don’t validate the user experience. What we do is explore, understand, diagnose, observe, and learn from users. And we acquire highly valid results from this exploratory diagnostic process.
We narrow the user population so that those five or six people represent a specific, similar demographic. They may (or may not) have knowledge of the domain, they may (or may not) have prior experience with competing products, and we make sure they clearly have a need for the content so that they are motivated to perform the task that we ask them to perform. We quantify certain aspects of usability testing, but the goal is not to scientifically prove that something works or does not work.
Our focus is on very small studies and on engaging the client in as much usability observation as possible, either by coming into our executive viewing room to observe the users in action or by watching summarized video highlights from the sessions. I’ve found that it only takes two or three people in a row saying the exact same thing about something being confusing or unclear, or that they would not buy a product because they’re not convinced by the content, to make an impact on a client. Even a small study helps companies quickly understand the content usability problems that need to be fixed.
What end results do you see in organizations that use formal content usability testing?
I have personally witnessed some great success stories. To share one example, I worked with a company called Ipswitch that makes IT-related business software for managing networks, handling security, and setting up email platforms. They faced three problems. First, they had traditionally served a technical audience but were trying to roll out a product for consumers, and they knew very little about home consumers. Second, every time Ipswitch upgraded its product, the call center would see a sharp spike in calls from technical users. Third, Ipswitch had a 30-day “try-and-buy” cycle and unfortunately saw a high drop-off rate after those 30 days.
We performed several usability studies with Ipswitch as the company went through product development. When they launched the product, several of the key measures significantly improved. The company saw only a small increase in calls to the support center, a major reduction in the spike that normally followed a release, an increased conversion rate after the 30-day trial, and healthy purchasing from the home consumer audience. This example illustrates the kinds of positive metrics that I see across a wide variety of companies that invest in content usability testing.
What advances in content usability testing most excite you right now?
Usability as it applies to different mobile platforms is an exciting area. Mobile provides an entirely new focus on content. For example, how do you create a content strategy and a good user experience when you’re looking at the screen size of a typical mobile phone? Mobile opens up opportunities to really focus on only the most critical content and to streamline that content so that it gives users exactly what they need. And mobile content and usability are going to require significant involvement from content strategists.
For those working on mobile platforms, understand that you need a mobile-first design strategy. You cannot simply import what you have on a website onto the mobile device. Instead, you must rethink your entire content strategy and user experience for the small mobile screen.
The good news is that the principles and strategies for user experience don’t change with mobile. The underlying principles hold up: learning from users, observing users, creating test protocols that allow users to step into roles that make sense to them, and having users perform tasks that match real user goals.
For more tips and success stories about content usability testing, check out Carol’s book, Usability Testing Essentials: Ready, Set...Test!