Publication Date

6-1-2006

Abstract

Previous studies indicate that as many as 25-50% of applicants in organizational and educational settings are retested with measures of cognitive ability. Researchers have shown that practice effects occur across measurement occasions, such that scores improve when these applicants retest. This study uses meta-analysis to summarize the results of 50 studies of practice effects for tests of cognitive ability. Results from 107 samples and 134,436 participants revealed an adjusted overall effect size of .26. Moderator analyses indicated that effects were larger when practice was accompanied by test coaching and when identical test forms were used. Additional research is needed to understand the impact of retesting on the validity inferences drawn from test scores.
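
For readers unfamiliar with the aggregation step behind an overall figure such as the .26 reported above, the following is a minimal sketch, not the authors' code, of a sample-size-weighted mean effect size, the basic calculation in a bare-bones meta-analysis. The per-study values are hypothetical and are not data from the paper.

# Minimal sketch: sample-size-weighted mean effect size.
# Hypothetical per-study values; not data from the paper.

def weighted_mean_effect(effects, sample_sizes):
    """Sample-size-weighted mean of per-study standardized mean differences (d)."""
    total_n = sum(sample_sizes)
    return sum(d * n for d, n in zip(effects, sample_sizes)) / total_n

# Hypothetical pre/post retest effect sizes (d) and sample sizes.
study_d = [0.30, 0.20, 0.25]
study_n = [100, 300, 200]

print(round(weighted_mean_effect(study_d, study_n), 2))  # 0.23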

Suggested Citation
Hausknecht, J. P., Halpert, J. A., Di Paolo, N. T., & Moriarty Gerrard, M. O. (2007). Retesting in selection: A meta-analysis of coaching and practice effects for tests of cognitive ability. Retrieved [insert date], from Cornell University, School of Industrial and Labor Relations site:
http://digitalcommons.ilr.cornell.edu/articles/13/

Required Publisher Statement
This article may not exactly replicate the final version published in the APA journal. It is not the copy of record. Final paper published as Hausknecht, J. P., Halpert, J. A., Di Paolo, N. T., & Moriarty Gerrard, M. O. (2007). Retesting in selection: A meta-analysis of coaching and practice effects for tests of cognitive ability. Journal of Applied Psychology, 92, 373-385.