Automated essay scoring (AES) is the use of specialized computer programs to assign grades to essays written in an educational setting. It is a method of educational assessment.
Rationale: Health professions admission committees across the world face the difficult task of selecting, from among many eligible applicants, the few who will be admitted to their training programs. The determinants of this admissions process are typically a combination of cognitive measures, such as Grade Point Average (GPA) or standardized tests such as the Medical College Admission Test (MCAT), and non-cognitive measures, including interviews and essays. However, there has been limited success in developing evaluation tools that provide reliable and valid measures of an applicant's non-cognitive qualities. The exception is the Multiple Mini-Interview (MMI), in essence an admissions OSCE (Objective Structured Clinical Examination). The MMI has been shown to predict intramural and licensing examination performance. However, like any OSCE, the MMI has practical limitations; the sheer volume of candidates at many institutions makes it necessary to develop a reliable and valid strategy for screening candidates' non-cognitive attributes more efficiently. To this end, a new measure using video scenarios and written or audio responses was developed, and a pilot study was completed. In 2006, 110 applicants to McMaster's medical school completed this Computer-based Multiple Sample Evaluation of Non-cognitive Skills (CMSENS). Of those applicants, 78 completed the CMSENS by verbally recording their responses in an audio file, while 32 typed their responses. The overall test generalizability was .86 for the audio CMSENS and .72 for the written. The written CMSENS also demonstrated predictive validity, correlating with the MMI at .51. However, conclusions from this study are limited by the small sample and the one-time nature of the findings.
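The predictive-validity figure above (a .51 correlation between written CMSENS and MMI scores) is an ordinary Pearson product-moment correlation. As a minimal sketch of how such a coefficient is computed, the function below correlates two equal-length lists of scores; the applicant scores shown are hypothetical illustration values, not data from the study.

```python
from statistics import mean, stdev

def pearson_r(x, y):
    # Pearson product-moment correlation between two equal-length score lists:
    # sample covariance divided by the product of sample standard deviations.
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

# Hypothetical applicant scores (fabricated for illustration only)
cmsens_scores = [6.1, 5.4, 7.2, 4.8, 6.9, 5.7]
mmi_scores    = [5.8, 5.1, 7.0, 5.2, 6.5, 5.5]
print(round(pearson_r(cmsens_scores, mmi_scores), 2))
```

A correlation near .51, as reported for the written CMSENS, would indicate a moderate positive relationship between the two measures.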