I’m helping run usability tests with a startup, though I’ve never formally run one before. It’s been a great learning experience.
The company is launching a new education site; after a few months in development, the product is ready for user testing. We’re soliciting feedback on the user experience, shaping our product roadmap, and testing marketing messaging.
I’ve uploaded a Usability Testing Guidelines template I created by adapting others I’ve found: Usability Testing Guidelines. I’ve removed some specific portions but left others in for illustrative purposes. Adjust it as you see fit.
(Note: I use “site” here, but substitute whatever product/service you’re testing.)
Here are the key components of the Guidelines:
- Testing – Observer Guidelines. General instructions for the observer moderating the tests.
- Agenda and Set-Up. Establish the logistics for testing: ensure the location is ready, devices are set up, and back-up plans exist in case things don’t go as planned.
- Testing Script. Set expectations between the tester and test-taker, including language that encourages honest, spontaneous feedback.
- Usability Tasks. Ensure specific elements of the site get tested; tasks also keep the session moving on schedule.
- Closing Questions. While you have target users in front of you, ask specific and open-ended questions as time allows. Their answers help shape the product roadmap, hone marketing messaging, and surface trouble areas you hadn’t identified before.
What lessons have you taken from usability tests, as a tester or test-taker? When have you seen usability tests run poorly? Run effectively?