Misheva, T., Nesse, R. M., Grunspan, D. Z., & Brownell, S. E. (2023). The EvMed Assessment. Evolution, Medicine, and Public Health, 11(1), 353–362. https://doi.org/10.1093/emph/eoad028 (open access)

Background and objectives: Universities throughout the USA increasingly offer undergraduate courses in
evolutionary medicine (EvMed), which creates a need for pedagogical resources. Several resources offer
course content (e.g. textbooks), and a previous study identified EvMed core principles to help instructors
set learning goals. However, assessment tools are not yet available. In this study, we address this need
by developing an assessment that measures students’ ability to apply EvMed core principles to various
health-related scenarios.
Methodology: The EvMed Assessment (EMA) consists of questions containing a short description of a
health-related scenario followed by several likely/unlikely items. We evaluated the assessment’s validity and reliability using a variety of qualitative (expert reviews and student interviews) and quantitative
(Cronbach’s α and classical test theory) methods. We iteratively revised the assessment through several
rounds of validation. We then administered the assessment to undergraduates in EvMed and Evolution
courses at multiple institutions.
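As a point of reference for the quantitative methods mentioned above, the internal-consistency statistic Cronbach's α can be computed directly from an items-by-respondents score matrix. The sketch below is illustrative only (the function name and sample data are our own, not from the study) and assumes item scores are coded numerically, e.g. likely/unlikely responses scored 1/0:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a 2D array: rows = respondents, columns = items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                       # number of items
    item_vars = scores.var(axis=0, ddof=1)    # per-item sample variances
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: three respondents answering two binary items
example = [[1, 1],
           [0, 1],
           [0, 0]]
alpha = cronbach_alpha(example)
```

Values of α near 1 indicate that items vary together (high internal consistency), while values near 0 indicate largely independent items; perfectly duplicated items yield α = 1 exactly.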
Results: We used results from the pilot to create the EMA final draft. After conducting quantitative validation, we deleted items that failed to meet performance criteria and revised items that exhibited borderline
performance. The final version of the EMA consists of six core questions containing 25 items, and five
supplemental questions containing 20 items.
Conclusions and implications: The EMA is a pedagogical tool supported by a wide range of validation
evidence. Instructors can use it as a pre/post measure of student learning in an EvMed course to inform
curriculum revision, or as a test bank to draw upon when developing in-class assessments, quizzes or exams.