TY - JOUR
T1 - Reliability of the Direct Observation of Procedural Skills assessment tool for ultrasound-guided regional anaesthesia
AU - Chuan, A.
AU - Thillainathan, S.
AU - Graham, P. L.
AU - Jolly, B.
AU - Wong, D. M.
AU - Smith, N.
AU - Barrington, M. J.
PY - 2016/3/1
Y1 - 2016/3/1
AB - The Direct Observation of Procedural Skills (DOPS) form is used as a workplace-based assessment tool in the current Australian and New Zealand College of Anaesthetists curriculum. The objective of this study was to evaluate the reliability of DOPS when used to score trainees performing ultrasound-guided regional anaesthesia. Reliability of an assessment tool is defined as the reproducibility of scores given by different assessors viewing the same trainee. Forty-nine anaesthetists were recruited to score two scripted videos of trainees performing a popliteal sciatic nerve block and an axillary brachial plexus block. Reliability, as measured by intraclass correlation coefficients, was -0.01 to 0.43 for the individual items in DOPS, and 0.15 for the 'Overall Performance for this Procedure' item. Assessors demonstrated consistency of scoring within DOPS, with significant correlation of the sum of individual item scores with the 'Overall Performance for this Procedure' item (r=0.78 to 0.80, P<0.001), and with 'yes' versus 'no' responses to the 'Was the procedure completed satisfactorily?' item (W=24, P=0.0004, Video 1, and W=65, P=0.003, Video 2). While DOPS demonstrated a good degree of internal consistency in this setting, inter-rater reliability did not reach levels generally recommended for formative assessment tools. Feasibility of the form could be improved by removing the 'Was the procedure completed satisfactorily?' item without loss of information.
UR - http://www.scopus.com/inward/record.url?scp=84962106587&partnerID=8YFLogxK
U2 - 10.1177/0310057X1604400206
DO - 10.1177/0310057X1604400206
M3 - Article
C2 - 27029652
AN - SCOPUS:84962106587
VL - 44
SP - 201
EP - 209
JO - Anaesthesia and Intensive Care
JF - Anaesthesia and Intensive Care
SN - 0310-057X
IS - 2
ER -