The Reliability of Expert Opinion in Specifying Course Content.
Technical Bulletin (Final)
Naval Personnel and Training Research Lab, San Diego, Calif.
In the development and revision of curricula, subject-matter experts are frequently called upon to specify skill or knowledge requirements and to rate the importance of those requirements for establishing a curriculum. If these expert judgments are not reliable, however, their validity is automatically questionable. To evaluate this problem, data from a previous study were supplemented and analyzed for the rate-rerate reliability of a group of Navy subject-matter experts, including an analysis of individual differences. The rate-rerate reliability for all 16 judges over a six-month period was .59. Absolute changes in rating averaged slightly over half a scale unit on a 6-point scale. When four raters were selected on the basis of their individual reliability, their combined rate-rerate reliability was .68. Other measures indicate that selecting raters on an index of individual stability may increase reliability, although the effect on validity is not known. These findings indicate that a large number of raters does not ensure high reliability, and that selecting a subset may improve overall reliability. (Author)
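The two statistics reported above, rate-rerate reliability (the correlation between a set of importance ratings given on two occasions) and the mean absolute change in ratings, can be sketched as follows. This is a minimal illustration, not the study's analysis; the rating data below are invented for the example.

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length rating series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def mean_abs_change(x, y):
    """Average absolute shift in ratings between the two occasions."""
    return sum(abs(a - b) for a, b in zip(x, y)) / len(x)

# Invented 6-point importance ratings for ten items, rated twice
time1 = [6, 5, 3, 4, 2, 6, 1, 5, 3, 4]
time2 = [5, 5, 4, 4, 2, 6, 2, 4, 3, 5]

reliability = pearson(time1, time2)   # rate-rerate reliability
shift = mean_abs_change(time1, time2) # mean absolute change, in scale units
```

With real data, the rate-rerate reliability would be computed per judge (or pooled across judges), and a subset of judges could then be selected on their individual coefficients, as described above.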
Subject Categories: Humanities and History