It keeps getting easier for anyone to launch an online business. With the right tools, and without writing a single line of code, you can run a website, market it with a newsletter and a social media presence, and even track and gather a few analytics.
The hard part starts there. The aim is to survive and grow in the face of the competition. Beyond the service itself, it means creating a website that is user friendly: a website with a solid infrastructure and effective interfaces that trigger positive feedback from as many users as possible. In other words: generating a great user experience. In the UX field, this is typically achieved by conducting User Testing, or Usability Testing. Yet many companies aren't conducting this kind of research. Why? Because of a lack of knowledge, a lack of resources, and often apprehension about the costs involved. That's understandable, as some User Research methods have limits, or are simply done wrong. Let's review them!
1. Consultancy
You can choose to hire a research specialist (and not Kevin, your uncle's friend's cousin, please…) to do the job for you. They will certainly help you with the strategy, but they will never have the level of knowledge you have about your product, users, and market. That's why you must spend enough time thinking about what you are about to test and defining your task scenarios: What do you want to know? Are users experiencing difficulties while visiting and using your website? What kind? An issue with a tool or a button, maybe? Errors while using a specific feature, or during a critical step like payment? Do you want the research to be performed on 10 users or 100? If the User Testing is not well prepared and organised, there is a huge risk that the information gathered will be useless and end up in the garbage: a waste of time and money. On the other hand, you could be tempted to "test everything". The result would be pretty much the same, with even more time wasted and maybe worse: you could be flooded with contradictory information, unable to decide what needs to be improved or to set priorities. It's not as easy as it sounds!
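To make "defining your task scenarios" a bit more concrete, here is a minimal sketch of what a focused test plan could look like when written down as a simple data structure. Every field name, task, and number below is a hypothetical example, not a prescribed format or tool.

```python
# Illustrative only: one possible way to write down a focused usability-test plan
# before handing anything to a specialist. All names and values are invented.

test_plan = {
    "goal": "Find out where users struggle during checkout",
    "participants": 8,  # a small, focused sample rather than "test everything"
    "scenarios": [
        {
            "task": "Add a product to the cart and complete payment",
            "success_criteria": "Payment confirmed without assistance",
            "questions": [
                "Does any button or form field cause hesitation?",
                "Do errors appear during the payment step?",
            ],
        },
        {
            "task": "Find the shipping cost before creating an account",
            "success_criteria": "Shipping cost located in under two minutes",
            "questions": ["Is the pricing information easy to discover?"],
        },
    ],
}

# A quick summary of what will actually be put in front of users
for scenario in test_plan["scenarios"]:
    print(f"Task: {scenario['task']} -> success = {scenario['success_criteria']}")
```

Even a rough outline like this forces you to decide what you want to learn, how many users you need, and what "success" means for each task, which is exactly the preparation that keeps the results out of the garbage.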
2. Internal Bug Hunt
This option is easily confused with "user testing", but it isn't the same thing. Many are tempted to link user behavior to bugs on a website, and so they regularly conduct "inside" research run by employees (often designers, developers, or the support team) because they "obviously" know the product better and are immediately available. This thinking can get you and your team in trouble! It seems like good practice at first, yet larger companies have dedicated Quality Assurance resources for that instead, and for a reason. Asking people to judge their own work will rarely bring valuable feedback. First, they won't want to tear down the work they are paid for, so they will struggle to approach the task with much objectivity. Then, even with the best intentions, the designer will see the graphic layer, the developer the interface, the support agent the issues they already raised, and so on, but none of them will reproduce the user experience as it actually is. They are simply not the product's primary target. Having employees test their own work can also create a competitive and unfriendly environment, which breeds misunderstanding and runs counter to a company culture meant to keep the office working in harmony and mutual respect.
3. Survey and Polls
Another common but flawed way to conduct user testing is to ask your community, whether via polls on Facebook or Twitter, via email surveys, or opportunistically when you get into a discussion with a customer. Such methods can help you understand trends, but they are again far from the spirit of User Testing. Since you remove context from the equation, you end up judging the "mood" of your customers rather than their behavior. There is also a risk that you phrase the question in a way that yields a black-or-white answer, without any reasoning you can interpret. In addition, odds are that detractors will hurt your feelings, and the lost customers who could provide the insights you need the most are no longer around to answer.
What you are missing!
With these three examples, we just listed what can totally ruin your attempts to improve your product with the help of your users. What do those examples have in common?
- a lack of the original context;
- testing on the wrong users;
- a deductive approach instead of an inductive one – you're working from hypotheses;
- observations and conclusions that are not cross-validated;
- tests that are not planned and organised well enough;
- tests that are either too focused or not focused enough.
How to succeed?
Conducting User Testing, as hard and costly as it sounds, is essential to your business. To get the most out of it, and before jumping to any conclusion or assumption, start by planning and understanding where you're heading. First, make sure you know how your service is used, and by whom. Measure behaviors and identify the critical KPIs you want to follow and/or improve. There are tools that can track every user who visits your website and give you usable information about them. From there, you'll see whether users respond well to the new feature you just released, or whether adoption isn't quite there. You'll see whether the new signup page converts more leads, and so on.
Then it's time to understand the why and how behind user behavior by leveraging Usability Testing. Our advice here is to start with a limited scope around your most important funnels and features. You won't go in blind, as you'll already know where your conversions drop and where usage falls below your expectations. Even better, you'll have the context, a clearer idea of the target audience, and a legitimate list of tests to perform… almost everything you need to conduct your user testing in good conditions! The UX field is moving incredibly fast, and companies like UserTesting (don't miss Peek, their free demo) can help you get rid of physical interviews and save some extra costs. Let's bet that we'll see more fully automated solutions popping up in the near future. In the meantime, you're now ready to measure, test, implement, measure, test, implement, … 🙂
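To show what "measure behaviors and identify the critical KPIs" can boil down to in practice, here is a minimal sketch of a funnel measurement. It assumes you can log or export simple (user, event) pairs yourself; the event names and data below are invented for illustration and are not tied to any particular analytics tool.

```python
# A minimal sketch of measuring a signup funnel from raw events.
# The (user, event) pairs and step names are hypothetical examples.

events = [
    ("user_1", "visit_landing"), ("user_1", "open_signup"), ("user_1", "complete_signup"),
    ("user_2", "visit_landing"), ("user_2", "open_signup"),
    ("user_3", "visit_landing"),
]

funnel = ["visit_landing", "open_signup", "complete_signup"]

# Distinct users reaching each step of the funnel
users_per_step = {step: {user for user, event in events if event == step} for step in funnel}

previous_count = None
for step in funnel:
    count = len(users_per_step[step])
    if previous_count:
        rate = 100 * count / previous_count
        print(f"{step}: {count} users ({rate:.0f}% of previous step)")
    else:
        print(f"{step}: {count} users")
    previous_count = count
```

The step with the sharpest drop-off is the natural place to focus your first usability tests: it gives you the context, the target audience, and the task scenario almost for free.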