Using Consensus Building Procedures With Expert Raters to Establish Comparison Scores of Behavior for Direct Behavior Rating
- Authors
Sayward E. Harrison, Rose Jaffery, Sandra M. Chafouleas, T. Chris Riley-Tillman, Mark C. Bowler, and Austin H. Johnson
- Subjects
Alternative methods, Psychometrics, Best practice, Applied psychology, Direct observation, Expert consensus, Education, Behavioral data, Direct Behavior Rating, General Health Professions, Developmental and Educational Psychology, Psychology
- Abstract
To date, rater accuracy when using Direct Behavior Rating (DBR) has been evaluated by comparing DBR-derived data with scores obtained through systematic direct observation. The purpose of this study was to evaluate an alternative method for establishing comparison scores: expert-completed DBR combined with best practices in consensus building. Standard procedures for obtaining expert data were established and implemented across two sites, and agreement indices and comparison scores were derived. Findings indicate that the expert consensus-building sessions resulted in high agreement among expert raters, lending support to this alternative method for identifying comparison scores for behavioral data.
- Published
- 2015