1. Practical Usability Rating by Experts (PURE)
- Authors
- Sara Cole, Frederick Boyle, Christian Rohrer, Jeff Sauro, and James Wendt
- Subjects
- Usability, Usability engineering, Usability inspection, Usability lab, Usability goals, Heuristic evaluation, Cognitive walkthrough, Pluralistic walkthrough, Think aloud protocol, Web usability, Component-based usability testing, System usability scale, Tree testing, Software engineering, Computer science
- Abstract
Usability testing has long been considered a gold standard for evaluating the ease of use of software and websites, producing metrics to benchmark the experience and identifying areas for improvement. However, logistical complexities and costs can make frequent usability testing infeasible. Alternatives include various forms of expert review, which identify usability problems but fail to provide task performance metrics. This case study describes a method by which multiple teams of trained evaluators generated task usability ratings, which were then compared to metrics collected from an independently run usability test on three software products. Although inter-rater reliability ranged from modest to strong, and the correlation between actual and predicted metrics established fair concurrent validity, opportunities for improved reliability and validity were identified. With clear guidelines in place, this method can provide a useful usability rating for a range of products across multiple platforms without significant cost in time or money.
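The concurrent-validity check the abstract describes, correlating expert-predicted ratings with metrics observed in an independent usability test, can be sketched as a simple correlation computation. The numbers below are invented for illustration (the abstract does not publish raw scores), and `pearson_r` is a helper defined here, not a method from the paper:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: expert per-task ease ratings (1 = easy ... 3 = hard)
# vs. task-completion times (seconds) from an independent usability test.
expert_ratings = [1, 2, 3, 1, 2, 3, 2, 1]
task_times = [30, 55, 90, 35, 60, 85, 50, 28]

r = pearson_r(expert_ratings, task_times)
print(f"r = {r:.2f}")  # a strong positive r supports concurrent validity
```

A high positive correlation between expert predictions and observed performance is what the study interprets as evidence of concurrent validity; a weak correlation would suggest the rating guidelines need tightening.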
- Published
- 2016