Usability Inspections by Groups of Specialists: Perceived Agreement in Spite of Disparate Observations

Morten Hertzum, Niels E. Jacobsen, Rolf Molich

Publication: Contribution to book/anthology/report › Conference contribution in proceedings › Research


Evaluators who examine the same system using the same usability evaluation method tend to report substantially different sets of problems. This so-called evaluator effect means that different evaluations point to considerably different revisions of the evaluated system. The first step in coping with the evaluator effect is to acknowledge its existence. In this study, 11 usability specialists individually inspected a website and then met in four groups to combine their findings into group outputs. Although the overlap in reported problems between any two evaluators averaged only 9%, the 11 evaluators felt that they were largely in agreement. The evaluators perceived their disparate observations as multiple sources of evidence in support of the same issues, not as disagreements. Thus, the group work increased the evaluators' confidence in their individual inspections, rather than alerting them to the evaluator effect.
Title: CHI 2002: Extended Abstracts
Publisher: Association for Computing Machinery
ISBN (Print): 1-58113-454-1
Status: Published - 2002
Published externally: Yes
Event: CHI 2002 - Minneapolis, MN, USA
Duration: 20 Apr 2002 - 25 Apr 2002
City: Minneapolis, MN

Bibliographic note

This publication should NOT be included in the annual report; it is registered for internal use in extracts at the computer science department.