Atsusi Hirumi, University of Central Florida
Hyunchang Moon, Augusta University
Okan Arslan, University of West Georgia
Dina Kurzweil, Uniformed Services University
Meredith Ratliff, University of Central Florida
Purpose
Evidence-based medical education (EBME) relies on the effective acquisition and appraisal of educational research. However, traditional methods are resource-intensive, limiting widespread EBME implementation. This study aims to advance EBME by exploring the feasibility and desirability of using generative artificial intelligence (GenAI) to facilitate each step of the systematic [instructional] design process, starting with the acquisition and appraisal of educational research evidence.
Methods
Three teams of instructional designers and health professions educators used mixed research methods to answer three primary research questions. Team 1 served as a control group, employing traditional methods (databases, search engines, etc.); Team 2 utilized "out-of-the-box" GenAI chatbots guided by prompts; and Team 3 leveraged "customizable" GenAI chatbots grounded with Retrieval-Augmented Generation (RAG). Each team acquired and appraised research evidence, and their processes and outcomes were compared to assess GenAI's feasibility and desirability.
Results
RQ1: Feasibility: GenAI chatbots (Teams 2 and 3) accelerated the process, and all teams identified similarly relevant evidence. Team 3 experienced longer process times due to RAG setup, though its results showed higher-quality evidence. RQ2: Desirability: GenAI offered desirable time savings. While Team 2 encountered prompt engineering challenges, Team 3 showed stronger alignment between the evidence retrieved and instructional design objectives. RQ3: Other Constraints: Team 3 encountered more financial and technical constraints, while Team 2 faced prompt engineering challenges. Both teams underscored cost, data security, and ethical considerations as major barriers.
Conclusions
The results will be used to guide future research and evidence-informed design practices by providing insights into how GenAI: (a) affects the use of tools, techniques, and decisions made during the systematic design process; and (b) impacts the effectiveness and efficiency of the systematic design process and resulting curriculum resources. Researchers must also recognize the necessity for continual evidence collection and appraisal.