Using an objective structured video exam to identify differential understanding of aspects of communication skills.
- Resource Type
- Article
- Authors
- Baribeau, Danielle A.; Mukovozov, Ilya; Sabljic, Thomas; Eva, Kevin W.; Delottinville, Carl B.
- Source
- Medical Teacher; Apr 2012, Vol. 34, Issue 4, p. e242-e250, 6 Charts, 1 Graph
- Subject
- Communication education
Assessment of education
Audiovisual materials
Communicative competence
Statistical correlation
Curriculum planning
Curriculum
Experiential learning
Research methodology
Case studies
Medical students
Rating of students
Data analysis
Pre-tests & post-tests
Education theory
Inter-observer reliability
Evaluation
Analysis of variance
Conceptual structures
Study & teaching of medicine
Multivariate analysis
Research funding
Simulated patients
Statistics
T-test (Statistics)
Client relations
Repeated measures design
Patient-centered care
- Language
- English
- ISSN
- 0142-159X
- Abstract
Background: Effective communication in health care is associated with patient satisfaction and improved clinical outcomes, and professional schools increasingly incorporate communication training into their curricula. The objective structured video exam (OSVE) is a video-based examination that provides an economical way of assessing students' knowledge of communication skills. This study presents a scoring strategy that enables an OSVE to be blueprinted to consensus guidelines, in order to determine which aspects of communication skills students find most difficult to understand and to what degree that understanding improves through experiential communication skills training.

Methods: Five interactions between a healthcare professional and a client were scripted and filmed using standardized patients. The dialogues were mapped onto the Kalamazoo consensus statement by having five communication experts view each video and identify effective and ineffective uses of communication skills. Undergraduate students enrolled in a communications course completed an OSVE on three occasions.

Results: A total of 79 students completed at least one testing session. The scores assigned supported the validity of the scoring strategy as an indication of knowledge growth. Considerable variability was observed across Kalamazoo sub-domains.

Conclusion: With further refinement, this scoring approach may help educators tailor their education and assessment practices to specific consensus guidelines. [ABSTRACT FROM AUTHOR]