BMJ Open

Validity evidence for communication skills assessment in health professions education: a scoping review

By: Dorrestein L., Ritter C., De Mol Z., Wichtel M., Cary J., Vengrin C., Artemiou E., Adams C. L., Ganshorn H., Coe J. B., Barkema H., Hecker K. G. — September 5, 2025, 13:49
Objective

Communication skills assessment (CSA) is essential for ensuring competency, guiding educational practices and safeguarding regulatory compliance in health professions education (HPE). However, there appears to be heterogeneity in the reporting of validity evidence from CSA methods across the health professions, which complicates interpretation of the quality of assessment methods. Our objective was to map reliability and validity evidence from scores of CSA methods that have been reported in HPE.

Design

Scoping review.

Data sources

MEDLINE, Embase, PsycINFO, CINAHL, ERIC, CAB Abstracts and Scopus databases were searched up to March 2024.

Eligibility criteria

We included studies, available in English, that reported validity evidence (content-related, internal structure, relationship with other variables, response processes and consequences) for CSA methods in HPE. There were no restrictions related to date of publication.

Data extraction and synthesis

Two independent reviewers completed data extraction and assessed study quality using the Medical Education Research Study Quality Instrument. Data were reported using descriptive analysis (mean, median, range).

Results

A total of 146 eligible studies were identified, including 98 394 participants. Most studies were conducted in human medicine (124 studies) and participants were mostly undergraduate students (85 studies). Performance-based, simulated, in-person CSA was most prevalent, comprising 115 studies, of which 68 were objective structured clinical examination-based. Other types of methods reported were workplace-based assessment; asynchronous, video-based assessment; knowledge-based assessment; and performance-based, simulated, virtual assessment. Included studies used a diverse range of communication skills frameworks, rating scales and raters. Internal structure was the most reported source of validity evidence (130 studies; 90%), followed by content-related evidence (108 studies; 74%), relationships with other variables (86 studies; 59%), response processes (15 studies; 10%) and consequences (16 studies; 11%).

Conclusions

This scoping review identified gaps in the sources of validity evidence, related to assessment method, that have been used to support the use of CSA methods. These gaps could be addressed by studies explicitly defining the communication skill construct(s) assessed, clarifying the validity source(s) reported and defining the intended purpose and use of the scores (ie, for learning and feedback, or for decision-making purposes). Our review provides a map of where targeted CSA development and support are needed. Limitations of the evidence arise from score interpretation being constrained by the heterogeneity of the definition of communication skills across the health professions and by the reporting quality of the studies.