Inter-observer agreement and reliability assessment for observational studies of clinical work

Scott R. Walter*, William T. M. Dunsmuir, Johanna I. Westbrook

*Corresponding author for this work

Research output: Contribution to journal › Comment/opinion › peer-review


Abstract

Inter-observer agreement (IOA) is a key aspect of data quality in time-and-motion studies of clinical work. To date, such studies have used simple, ad hoc approaches to IOA assessment, often with minimal reporting of methodological details. The main methodological issues are how to align time-stamped task intervals that rarely have identical start and end times, and how to assess IOA across multiple nominal variables. We present a combination of methods that addresses both issues simultaneously and provides a more appropriate measure for assessing IOA in time-and-motion studies. Alignment is achieved by converting task-level data into small time windows and then aligning the data from different observers by time. A method applicable to multivariate nominal data, the iota score, is then applied to the time-aligned data. We illustrate our approach by comparing iota scores with the mean of univariate Cohen's kappa scores, applying both measures to existing data from an observational study of emergency department physicians. While the two scores produced very similar results under certain conditions, iota was more resilient to sparse data. Our results suggest that iota applied to time windows considerably improves on previous methods of IOA assessment in time-and-motion studies, and that Cohen's kappa and other univariate measures should not be regarded as the gold standard. Rather, there is an urgent need for explicit, ongoing discussion of methodological issues and solutions to improve how data quality is assessed in time-and-motion studies, so that the conclusions drawn from them are robust.
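The abstract outlines a two-step procedure: discretise each observer's time-stamped task intervals into fixed-width windows, then score agreement on the time-aligned multivariate nominal data. The sketch below is a minimal illustration of that idea, not the authors' implementation; the 1-second window width, the variable names, the toy data, and the simplified two-observer iota-style score (patterned on Janson and Olsson's multivariate agreement coefficient) are all assumptions made for the example.

```python
# Minimal sketch (not the authors' code): window-based alignment of two
# observers' task intervals, then per-variable Cohen's kappa and a
# simplified two-observer iota-style score for multivariate nominal data.
from collections import Counter

def to_windows(intervals, start, end, width=1.0):
    """Map (t_start, t_end, labels) intervals onto fixed-width time
    windows; windows with no recorded task are left as None."""
    n = int((end - start) / width)
    windows = [None] * n
    for t0, t1, labels in intervals:
        for i in range(max(0, int((t0 - start) / width)),
                       min(n, int((t1 - start) / width))):
            windows[i] = labels
    return windows

def cohens_kappa(a, b):
    """Univariate Cohen's kappa for two equal-length label sequences."""
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    p_exp = sum(ca[k] * cb.get(k, 0) for k in ca) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

def iota(a, b):
    """1 - observed/expected mean disagreement, averaging 0/1 mismatches
    over the nominal variables (a two-observer simplification of iota)."""
    n, m = len(a), len(a[0])
    dist = lambda u, v: sum(x != y for x, y in zip(u, v)) / m
    d_obs = sum(dist(u, v) for u, v in zip(a, b)) / n
    d_exp = sum(dist(u, v) for u in a for v in b) / n ** 2
    return 1 - d_obs / d_exp

# Hypothetical example: two observers code a 10-second session on two
# nominal variables (task type, interrupted yes/no); times in seconds.
obs1 = [(0, 4, ("documentation", "no")), (4, 10, ("medication", "yes"))]
obs2 = [(0, 5, ("documentation", "no")), (5, 10, ("medication", "yes"))]
pairs = [(u, v) for u, v in zip(to_windows(obs1, 0, 10),
                                to_windows(obs2, 0, 10)) if u and v]
a, b = zip(*pairs)
kappas = [cohens_kappa([u[j] for u in a], [v[j] for v in b])
          for j in range(len(a[0]))]
print("mean kappa:", sum(kappas) / len(kappas), "iota:", iota(a, b))
```

With these toy data the two measures coincide (both 0.8), consistent with the abstract's observation that they can give very similar results; the abstract's further point is that iota degrades more gracefully when some categories are sparsely observed.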

Original language: English
Article number: 103317
Pages (from-to): 1-8
Number of pages: 8
Journal: Journal of Biomedical Informatics
Volume: 100
DOIs
Publication status: Published - 1 Dec 2019

Keywords

  • Emergency department
  • Inter-observer agreement
  • Inter-rater reliability
  • Time-and-motion studies
  • Work observations

