Professional Development Workshops (PDWS)
This year, the Professional Development Workshops are offered across four different themes. Participants will receive a Certificate of Workshop Attendance for each workshop completed. Participants who complete two workshops within the same theme will receive a Certificate of Series Attendance listing both workshops and the overarching theme.
In addition to these four sets of themed workshops, we also have one full-day workshop sponsored by Aquifer.
This foundational workshop equips participants with essential knowledge and practical skills for evaluating and using AI tools in health professions education. Through hands-on practice and collaborative learning, participants will develop competencies in systematic tool evaluation, advanced prompting techniques, critical validation frameworks, and ethical AI implementation.
While building on the successful 2025 pilot program, this workshop features significantly updated content reflecting the rapid evolution of AI tools and capabilities. New elements include enhanced frameworks for responsible AI use (incorporating AAMC principles and IACAI recommendations), expanded coverage of AI agents and autonomous capabilities, updated tool comparisons, and refined prompting techniques. Previous attendees will gain valuable new knowledge and skills aligned with current developments in the field.
Content Areas:
- Understanding AI capabilities and limitations
- Systematic Generative AI tool evaluation and comparison
- Advanced prompting techniques and AI agents
- Critical validation and evaluation frameworks
- Ethical and responsible AI use (incorporating AAMC and IACAI frameworks)
- Hands-on skills building with participant materials
Instructional Methods: Brief conceptual presentations, structured hands-on exercises, small group comparative analysis activities, real-world scenario discussions, and artifact creation with personalized feedback.
Target Audience: Health professions educators, curriculum designers, and academic leaders with basic to intermediate AI familiarity seeking to enhance their capabilities in evaluating and implementing AI tools in educational settings.
Learning Objectives
- Analyze the capabilities, limitations, and appropriate applications of large language models (LLMs) in health professions education contexts, including recognition of common biases and training data constraints.
- Evaluate and compare generative AI tools using established rubrics and systematic frameworks to select appropriate tools for specific educational needs.
- Apply advanced prompting techniques and iterative strategies to optimize AI output quality, relevance, and educational utility for diverse medical science education applications.
- Assess AI-generated content for reliability, accuracy, and appropriateness using critical evaluation frameworks before implementation in educational settings.
- Integrate ethical principles and responsible AI practices into educational workflows, addressing academic integrity, data privacy, HIPAA/FERPA compliance, and distinctions between open and closed AI systems.
This advanced workshop focuses on implementing AI across core educational functions in health professions education, with emphasis on creating pedagogically sound, authentic artifacts. Through guided hands-on activities, participants will transform teaching materials, redesign assessments, and leverage AI for data-informed decision-making while maintaining educational integrity and academic rigor.
This workshop incorporates the latest developments in AI applications for health professions education, including expanded coverage of agentic AI workflows, enhanced learning analytics capabilities, and low-code AI solutions for educators. Previous 2025 attendees will benefit from new case scenarios, updated assessment frameworks that address evolving academic integrity challenges, and emerging best practices in AI-enhanced curriculum design.
Content Areas:
- Optimizing teaching with AI (content transformation, interactive learning design)
- Enhancing assessment with AI (authentic assessment design, rubric development, feedback generation)
- Data-informed decision-making and learning analytics
- Emerging applications: Agentic AI, machine learning, and low-code solutions
- Hands-on artifact creation with implementation planning
Deliverables: Participants leave with tangible artifacts (transformed learning activity, assessment items, data analysis report) created during the session.
Instructional Methods: Guided hands-on activities using participants' own educational materials, collaborative problem-solving sessions, case scenario analysis, implementation planning exercises, and peer feedback on created artifacts.
Learning Objectives
- Transform traditional educational materials into interactive, AI-enhanced learning experiences using structured workflows that maintain pedagogical soundness and academic rigor.
- Design authentic assessments that leverage AI capabilities while addressing academic integrity concerns, including development of rubrics and AI-driven feedback mechanisms.
- Apply AI tools to analyze qualitative and quantitative educational data, generating actionable insights for curriculum enhancement and program improvement.
- Create implementation-ready artifacts (transformed learning activities, assessment items, or data analysis reports) tailored to their specific institutional contexts.
- Evaluate emerging AI applications (including agentic AI and machine learning) for their potential impact on teaching, assessment, and educational decision-making in health professions education.
Target Audience: Health professions educators, curriculum designers, and academic leaders with basic AI experience (or Workshop 1 completion) seeking to implement advanced AI applications in teaching, assessment, and educational analytics.
The workshop will begin with an introduction to principles of student-centered perspectives on the learning process and the importance of being cognizant of students' potential, skills, and interests, as well as their demographic backgrounds, intersectionalities, and insecurities. Participants in the workshop will be able to define and recognize concepts of dual-process theory and how the so-called "curse of knowledge" can shape learning and coaching environments.
Using a learning framework of didactic instruction followed by role play of case scenarios, participants will become familiar with coaching models and techniques that are effective in different situations when teaching students to become critical thinkers during the formative early stages of their professional education.
Participants will learn basic tenets of the following coaching models: Goals, Reality, Options, and Will (GROW); Outcome, Scale, Know-how, Affirmation, Action and Review (OSKAR); Contract, Listen, Explore, Action and Review (CLEAR); Frame the Conversation, Understand the Current State, Explore the Desired Goal, and Lay Out the Plan (FUEL); and Peer-Coaching.
Participants will learn the primary principles and appropriate usage of the following coaching techniques: directive; non-directive/facilitative; autocratic; democratic/collaborative; laissez-faire; and situational.
The workshop will conclude with a group discussion to address individualized participant needs for specific situations at their home institutions. By leveraging the power of collective coaching, participants can expect to leave the workshop equipped with new skills to improve their teaching and coaching, and a network of collaborators who can provide continued coaching and support.
Learning Objectives
- Participants will recognize students’ explicit and implicit needs in successfully making the transition from rote learning to critical thinking.
- Participants will compare and contrast the GROW, OSKAR, CLEAR, FUEL, and peer-coaching models and determine which model, or combination of models, best fits their curriculum/courses.
- Participants will describe and apply different coaching techniques - directive; non-directive/facilitative; autocratic; democratic/collaborative; laissez-faire; situational - to common case-based situations through analysis and role play.
This foundational workshop will help preclinical and clinical instructors understand the importance of outreach/service-learning for preclinical education and for learners' professional and personal development. Participants will learn how to design service-learning experiences that cultivate professional identity and meaningful community impact. The workshop will introduce practical, straightforward toolkits for aligning goals, establishing reciprocal partnerships, and tracking outcomes, which participants can use to assess the impact of their institution’s program(s) on preclinical learners and community partners. Participants will then work in small groups with facilitators to address core establishment plans, sustainability elements, and/or evaluation for their program needs.
Content areas include:
- Applying a cross-cultural and structural competency lens to frame service learning as professional identity formation
- Exposure to ethnographically informed care to align curricular goals with community priorities while considering ethics, consent, and positionality
- Working with community clients and other stakeholders to establish partnerships
- Developing curricula and programming that address community partnership needs
- Funding mechanisms through sponsorship, foundations, and federal organizations
- Engaging learners through volunteerism and/or course requirements
- The scholarship of outreach/service-learning
Learning Objectives
- Participants will recognize the importance and role of outreach/service-learning in a preclinical medical sciences curriculum.
- Participants will describe the logic model, curriculum development, and funding steps for establishing and sustaining community partnerships that foster ethnographically informed outreach/service-learning experiences in preclinical education.
- Participants will develop a plan for the scholarship of outreach/service-learning that benefits all stakeholders in the community partnership.
- Participants will apply a practical toolkit to outline how community partnerships can be structured and maintained to support reciprocal, sustainable outreach/service-learning with preclinical learners.
- Participants will design a preliminary outreach/service-learning experience for preclinical learners that aligns curricular goals with community-identified needs.
This workshop will focus on a variety of ways to assess learning beyond traditional multiple-choice questions (MCQs). In the dynamic world of health professions education, our challenge is to evaluate not just what learners know, but how they think, analyze, and problem-solve under complexity. This workshop moves beyond the limitations of MCQs to equip you with the tools to effectively measure the crucial skills of critical thinking, clinical judgment, and applied knowledge. We will focus on practical, non-MCQ strategies that demand a synthesis of information and independent judgment. Leave this session with immediate, actionable blueprints for assessment items that authentically measure your learners' ability to use knowledge, not just possess it. Elevate your assessment strategy to truly foster the critical thinkers our professions demand.
Learning Objectives
- Design novel, high-fidelity assessment items using non-multiple-choice formats that are explicitly engineered to measure advanced cognitive skills, specifically demonstrating the ability to prioritize, synthesize, and resolve ambiguity within complex health professions scenarios.
- Analyze the cognitive demands of various non-multiple-choice assessment formats to determine their suitability for measuring complex skills like clinical judgment and critical thinking in their specific content area.
- Create "blueprint-ready" assessment tasks that authentically measure learners' ability to make independent clinical decisions and that can be immediately implemented or piloted in their next teaching cycle.
The traditional student feedback survey offers only one piece of the evaluation puzzle. This workshop will provide a practical framework for developing multi-source evaluation systems that integrate diverse, high-quality metrics to drive genuine improvement. We will move beyond common pitfalls and demonstrate how to synthesize data from multiple sources including assessments of learners. You will learn to align evaluation data with educational outcomes and create a powerful narrative of quality assurance and continuous improvement. Leave with the strategies to build an evaluation system that is defensible, insightful, and focused on programmatic excellence in the health professions.
The goal of this workshop is to equip participants with the strategic framework and practical tools necessary to design a robust, multi-source program evaluation system and develop an evidence-based narrative that demonstrates quality assurance and drives continuous programmatic improvement in the health professions.
Learning Objectives
- Design a multi-source evaluation for a program of their choice by identifying diverse, high-quality data sources, including student satisfaction, learner performance, and other relevant metrics.
- Develop a structured strategy for data synthesis from multiple evaluation sources to effectively identify and communicate key areas for evidence-based improvement.
- Formulate a clear and defensible quality assurance narrative that aligns with the evaluation findings and supports continuous programmatic excellence.
This workshop guides participants through both the principles and practices of quantitative research methods in medical education. Participants will learn how to develop questionnaires, rating scales, and observational studies. You will explore the concepts of reliability and validity, both of which are essential for publishing findings. The workshop will provide you with the tools to understand your data, describe it adequately, and present it in a meaningful way, supporting informed decision-making consistent with what the data shows. Participants are also encouraged to attend Workshop 2, which focuses on mixed methods, to gain a comprehensive understanding of current research approaches in medical education.
Learning Objectives
- Give a brief overview of the various quantitative methods and consider the advantages and disadvantages of each method.
- Explain the principles of questionnaire design.
- Discuss the concepts of validity and reliability, particularly in relation to questionnaires.
- Use appropriate measures to describe data.
- Conduct basic inferential statistical tests and interpret the results.
This foundational workshop aims to provide a fresh perspective on educational issues, problems, and research. The presenters will introduce you to a range of research methods for collecting qualitative data and show how these methods can be applied in an educational environment. The presenters will also guide you through the process of mixed-methods research and evaluation, including research designs and sampling in qualitative research. Participants are invited to take part in Workshop 1 on quantitative research in the morning to gain a deeper understanding of quantitative methodologies, which form an essential component of mixed-methods research.
Learning Objectives
- Differentiate between different types of qualitative methods.
- Identify the steps necessary to organize and manage data collection, including different interview methods.
- Describe the qualitative data coding process.
- Discuss the various factors that may influence the credibility of the data analysis process.
- Outline what must be considered before qualitative data analysis can be accurately represented and appropriate conclusions reached.
- Describe different mixed-methods approaches and consider the advantages and disadvantages of each.