Catherine Gabelica, IESEG School of Management
Vitaliy Popov, University of Michigan
Effective collaboration in high-stakes environments depends on teams’ ability to reflect on performance, interpret feedback collectively, and translate experience into improved future action. Simulation-based training is increasingly used to develop these capabilities in domains such as emergency response, healthcare, and other time-critical team settings. However, evidence remains limited regarding which instructional design features most effectively support team-level learning during post-simulation debriefings. The present study examines how multisource feedback and guided facilitation with video review influence the depth and level of feedback processing in team training contexts.
The study draws on team research that conceptualizes feedback as a socially embedded process in which team members must jointly interpret performance information and construct shared understanding. Prior research suggests that feedback is most effective when it stimulates collective reflection rather than individual reactions alone, yet many debriefing practices fail to elicit such team-level processing. We therefore focus on two design features frequently used in simulation-based training but rarely examined together: (1) multisource feedback that combines evaluations from multiple perspectives, and (2) structured facilitation supported by video-assisted review of team performance.
Participants were assigned to teams engaged in high-fidelity emergency simulation exercises designed to require coordination, information sharing, and rapid decision-making. Teams completed simulation scenarios followed by post-simulation debriefings under four conditions in a 2×2 factorial design: multisource feedback present vs. absent, and guided facilitation with video review present vs. absent. Feedback sources included self-assessment, peer observation, instructor evaluation, and team-level ratings. Debriefings were audio-recorded and coded to capture the level (individual vs. team) and depth (low vs. high) of feedback processing, as well as reflective behaviors such as evaluation, exploration of alternatives, and decision-oriented planning.
To analyze team interaction patterns, we applied Ordered Network Analysis (ONA), a methodological approach that models the temporal structure of communication and allows comparison of team sense-making processes across conditions. This approach enables a fine-grained examination of how feedback is discussed, integrated, and transformed into collective understanding during debriefings.
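As a rough illustration of the first analytic step underlying such an approach, the sketch below counts ordered transitions between coded discourse moves in a debriefing transcript. The code labels are illustrative assumptions rather than the study's codebook, and full ONA additionally weights ordered co-occurrences within a moving window and projects the resulting networks into a comparable space; only the ordered-pair counting is shown here.

```python
from collections import Counter
from itertools import product

# Hypothetical coded debriefing transcript: each utterance is tagged with a
# reflective-behavior code (labels are illustrative, not the study's codebook).
utterances = ["evaluate", "explore", "evaluate", "plan", "explore", "plan"]

codes = sorted(set(utterances))

# Directed adjacency counts: how often code A is immediately followed by code B.
transitions = Counter(zip(utterances, utterances[1:]))

# Full ordered-transition matrix over all code pairs (zeros included).
matrix = {(a, b): transitions.get((a, b), 0) for a, b in product(codes, codes)}

print(matrix[("evaluate", "explore")])  # count of evaluate -> explore moves
```

Comparing such transition structures across the four debriefing conditions is what allows differences in team sense-making patterns to be characterized quantitatively.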
Results indicate that teams receiving both multisource feedback and guided facilitation demonstrated the highest levels of deep, team-level processing and reflective discussion. Multisource feedback alone increased attention to performance discrepancies but did not consistently produce collective reflection without facilitation. Guided facilitation supported more structured discussion and helped teams move from individual interpretations toward shared sense-making. The combination of multisource feedback and facilitation produced the most balanced pattern of evaluative, exploratory, and decision-oriented behaviors, suggesting a synergistic effect of these design features.
These findings contribute to the science of team science by identifying specific instructional mechanisms that enhance team learning in training environments. The results highlight the importance of designing feedback processes that promote collective reflection rather than isolated evaluation and demonstrate how analytic approaches such as ONA can be used to study team cognition in action. Implications are discussed for the design of team training programs in high-stakes, interdisciplinary, and transdisciplinary settings.