Authors

Joshua T. Hanson, J.T. Hanson is associate professor of medicine and associate dean for student affairs, University of Texas Health San Antonio, Joe R. and Teresa Lozano Long School of Medicine, San Antonio, Texas;
Kevin Busche, K. Busche is associate professor of neurology, Department of Clinical Neurosciences, and assistant dean, clerkship for undergraduate medical education, University of Calgary Cumming School of Medicine, Calgary, Alberta, Canada;
Martha L. Elks, M.L. Elks is professor of medical education and senior associate dean of educational affairs, Morehouse School of Medicine, Atlanta, Georgia;
Loretta E. Jackson-Williams, L.E. Jackson-Williams is professor of emergency medicine and vice dean of medical education, University of Mississippi Medical Center School of Medicine, Jackson, Mississippi;
Robert A. Liotta, R.A. Liotta is associate dean of recruitment and admissions, F. Edward Hébert School of Medicine, Uniformed Services University, Bethesda, Maryland;
Chad Miller, C. Miller is professor of internal medicine and senior associate dean for undergraduate medical education, Saint Louis University School of Medicine, St. Louis, Missouri;
Cindy A. Morris, C.A. Morris is professor of microbiology and immunology and associate dean for admissions, Tulane University School of Medicine, New Orleans, Louisiana;
Barton Thiessen, B. Thiessen is associate professor of anesthesia and assistant dean for admissions, Faculty of Medicine, Memorial University of Newfoundland, St. John's, Newfoundland, Canada;
Kun Yuan, K. Yuan was director of MCAT research, Association of American Medical Colleges, Washington, DC, at the time this was written, and is now director of research and data science, Graduate Management Admission Council, Reston, Virginia.

Document Type

Article

Publication Date

9-1-2022

Abstract

PURPOSE: This is the first multisite investigation of the validity of scores from the current version of the Medical College Admission Test (MCAT) in clerkship and licensure contexts. It examined the predictive validity of MCAT scores and undergraduate grade point averages (UGPAs) for performance in preclerkship and clerkship courses and on the United States Medical Licensing Examination Step 1 and Step 2 Clinical Knowledge examinations. It also studied students' progress in medical school.

METHOD: Researchers examined data from 17 U.S. and Canadian MD-granting medical schools for 2016 and 2017 entrants who volunteered for the research and applied with scores from the current MCAT exam. They also examined data for all U.S. medical schools for 2016 and 2017 entrants to regular MD programs who applied with scores from the current exam. Researchers conducted linear and logistic regression analyses to determine whether MCAT total scores added value beyond UGPAs in predicting medical students' performance and progress. They also examined the comparability of prediction by sex, race and ethnicity, and socioeconomic status.

RESULTS: Researchers reported medium to large correlations between MCAT total scores and medical student outcomes. Correlations between total UGPAs and medical student outcomes were similar but slightly lower. When MCAT scores and UGPAs were used together, they predicted student performance and progress better than either alone. Despite differences in average MCAT scores and UGPAs between students who self-identified as White or Asian and those from underrepresented racial and ethnic groups, predictive validity results were comparable. The same was true for students from different socioeconomic backgrounds, and for males and females.

CONCLUSIONS: These data demonstrate that MCAT scores add value to the prediction of medical student performance and progress and that applicants from different backgrounds who enter medical school with similar ranges of MCAT scores and UGPAs perform similarly in the curriculum.
