Abstract
Purpose
Coronavirus disease 2019 (COVID-19) restrictions resulted in an increased emphasis on virtual communication in medical education. This study assessed the acceptability of virtual teaching in an online objective structured clinical examination (OSCE) series and its role in future education.
Methods
Six surgical OSCE stations were designed, covering common surgical topics, with specific tasks testing data interpretation, clinical knowledge, and communication skills. These were delivered via Zoom to students who participated in student/patient/examiner role-play. Feedback was collected by asking students to compare online teaching with previous experiences of in-person teaching. Descriptive statistics were used for Likert response data, and thematic analysis for free-text items.
Results
Sixty-two students provided feedback, with 81% of respondents finding online instructions preferable to paper equivalents. Furthermore, 65% and 68% found online teaching more efficient and accessible, respectively, than in-person teaching. Only 34% found communication with each other easier online, and 40% preferred online OSCE teaching to in-person teaching. Students also provided positive and negative feedback in free-text comments.
Conclusion
The data suggested that students were generally unwilling for online teaching to replace in-person teaching completely. The success of online teaching depended on the clinical skill being addressed, as some skills were less amenable to a virtual setting. However, online OSCE teaching could play a role alongside in-person teaching.
Restrictions due to the coronavirus disease 2019 (COVID-19) pandemic in 2020 had a significant impact on medical student teaching across universities [1]. Increased adoption of technology was needed to mitigate the risk of infecting students, vulnerable patients, and university staff, while ensuring that core competencies of medical school training were achieved [2-4].
Live and pre-recorded video tutorials were well-suited to delivery over an online platform, with evidence showing them to be effective during the pandemic [5]. However, teaching and assessment in the style of objective structured clinical examinations (OSCEs) remain a central pillar of medical education. They present a greater logistical challenge, as they necessitate a high degree of interpersonal communication and practical skill demonstration to be able to target the “shows how” level of Miller’s pyramid [6]. They continue to be essential as a tool to mirror clinical scenarios, both in formative settings to recognize poor performance and in summative settings to assess the application of knowledge [7]. Although a wealth of information exists on traditional in-person OSCEs, the authors found little published work analyzing online OSCE teaching in medical education.
With the restrictions of the COVID-19 pandemic and the need to find an alternative to the classical OSCE setting, the authors developed a Surgical OSCE-Focused Teaching (“SOFT”) Series to trial the use of an online platform for teaching medical students in their first year of clinical studies at a UK medical school. We aimed to assess whether online teaching in the style of mock OSCE stations could ever replace in-person teaching and be accepted in the future training of undergraduate medical students.
Ethical approval was not required for this study, as per the National Health Service Research Ethics Committee tool. This study did not include a clinical trial and did not collect any personal data. Completion of the feedback survey was optional for participants, and consent was received to use anonymous information to analyze the teaching series.
The authors describe a descriptive study based on feedback via a post-teaching survey distributed to the students who had attended our teaching series. The findings are reported in the structure described by the STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) checklist [8].
This study was conducted at University College London (UCL) Surgical Society with students from UCL Medical School, between February and May 2021. Students were asked to complete an optional feedback survey (Supplement 1) via Google Forms (Google LLC, Mountain View, CA, USA) at the end of the teaching to assess the perceived characteristics of the session compared to their previous experiences of in-person teaching before COVID-19.
During the development of this OSCE series, the authors identified 6 common surgical topics that were likely to be seen in clinical practice (Table 1). Scenarios that would provide an adequate foundation upon which to test data interpretation, clinical knowledge, and communication skills were selected. The stations were independently verified by the Deputy Head of Year 4 MBBS (Bachelor of Medicine, Bachelor of Surgery) at the medical school for face validity and adjusted accordingly to be in keeping with the curriculum. Each clinical context was broken down into specific tasks, including history-taking, structured communication with healthcare professionals, imaging interpretation, safe prescription, low-fidelity assessment of an unwell patient, and developing management plans. The stations were piloted with a focus group, allowing the resolution of logistical issues and ambiguity in instructions. The final OSCE stations are available in Supplements 2–7.
The teaching series was delivered over Zoom (Zoom Video Communications Inc., San Jose, CA, USA), which included the use of breakout rooms to closely emulate the partitions present in an in-person OSCE. Each breakout room consisted of 3 to 4 students and 1 near-peer facilitator, who was an academic year above the students. The facilitator’s role was to keep time and guide the discussion and feedback. Read-only Google Docs (Google LLC) files containing instructions for each station were distributed to the participants.
Participants rotated through the role of student, patient, and examiner in the 6 stations and were given the opportunity to reflect on their experiences of each role. To simulate in-person OSCEs, participants were encouraged to keep their webcams on to maximize both verbal and visual interaction and facilitate a more natural conversation.
Medical students in the first clinical year of their primary medical qualification at UCL Medical School were offered this teaching session via the student-led UCL Surgical Society and were invited to complete the post-teaching feedback survey. There were no exclusion criteria.
The feedback survey requested participants’ opinions on the efficiency and accessibility of the format, the ease of understanding online instructions, the interaction with tutors and communication with peers through video calls, and whether the students would prefer this format to in-person teaching in the future. Participants were also asked in free-text fields for what they liked most about the series and for any suggestions for improvement.
A 5-point Likert scale, from strongly disagree to strongly agree, was used for the 6 questions described above. These questions were assessed for face validity by the 3 authors. The internal reliability of the survey was calculated as Cronbach α=0.86. Responses were grouped to give a combined percentage of “agree” and “strongly agree” responses. Comments from the 2 free-text fields were edited only for grammar by 2 authors (V.M. and A.B.), who used open coding to generate initial codes for the free-text comments and then, through further discussion, reviewed the codes to identify and define themes. The themes defined for positive comments were “efficiency or convenience,” “interactivity,” and “general,” and those for negative comments were “difficulties with individual stations” and “general.” Examples of each theme highlighted by the participants were selected for inclusion in the results. Comments were excluded if they were incomplete, concerned specific tutors, or covered content already addressed by a selected comment. The raw data can be found in Dataset 1.
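The internal-reliability statistic reported above can be reproduced from raw Likert responses with a short script. The sketch below (Python, standard library only) implements the standard Cronbach's alpha formula; the `demo` matrix is hypothetical illustrative data, not the study's dataset.

```python
from statistics import pvariance

def cronbach_alpha(responses):
    """Cronbach's alpha for internal reliability.

    responses: one row per respondent, each row a list of
    k Likert item scores (e.g., 1-5).
    """
    k = len(responses[0])
    items = list(zip(*responses))  # transpose: one tuple per item
    item_var_sum = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(row) for row in responses])
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical 5-respondent, 6-item example (not the study's raw data)
demo = [
    [4, 5, 4, 3, 3, 4],
    [2, 2, 3, 1, 2, 2],
    [5, 5, 5, 4, 4, 5],
    [3, 4, 4, 2, 3, 3],
    [4, 4, 5, 3, 4, 4],
]
print(round(cronbach_alpha(demo), 2))
```

In practice, the raw data in Dataset 1 could be fed into such a function, or into a validated statistical package, to verify the α=0.86 figure.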
The teaching series was advertised via the social media channels of UCL Surgical Society and participants meeting the inclusion criteria were permitted to join. Completion of the feedback survey was optional for anyone who participated in the session as a student.
The feedback survey was sent to all 66 students who attended the online OSCE teaching series. A total of 62 responses were recorded over the course of the series, giving a response rate of 93.9%.
Descriptive data of the survey respondents were not collected; the responses were anonymous. All participants were in the first clinical year of their primary medical qualification at UCL Medical School.
Forty respondents (64.5%) agreed or strongly agreed that the online OSCE format was more time-efficient than in-person seminar room teaching. Forty-two (67.7%) agreed or strongly agreed that this format was more accessible. Twenty-seven (43.5%) agreed or strongly agreed that interaction with tutors was easier through video calls, and 21 (33.9%) agreed or strongly agreed that communication with peers was easier through video calls. Twenty-five (40.3%) agreed or strongly agreed that they would prefer this format of teaching over in-person seminar room teaching. Regarding online instructions, 50 respondents (80.6%) found them easier to understand and retain than paper handouts (Table 2).
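For transparency, the combined "agree"/"strongly agree" percentages can be recomputed directly from the response counts in Table 2, as in the short Python sketch below. Note that dividing the raw counts by n=62 can differ by 0.1 percentage points from summing the two rounded column percentages shown in the table; the abbreviated item labels are ours.

```python
# Response counts per item from Table 2 (Likert 1-5, 62 respondents each)
counts = {
    "time efficient": [1, 11, 10, 21, 19],
    "accessible": [4, 6, 10, 18, 24],
    "online instructions": [1, 4, 7, 22, 28],
    "tutor interaction": [8, 10, 17, 12, 15],
    "peer communication": [9, 13, 19, 10, 11],
    "prefer online format": [5, 15, 17, 13, 12],
}

for item, row in counts.items():
    n = sum(row)                 # total respondents for the item
    agree = row[3] + row[4]      # "agree" + "strongly agree"
    print(f"{item}: {agree}/{n} = {100 * agree / n:.1f}%")
```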
Positive free-text comments highlighted that the online format was efficient and convenient (Table 3). Some students found engagement through this medium to be positive. A few students valued the clarity of online resources. There were no comments suggesting that the online medium was better than in-person OSCE teaching, although some students did mention that the 2 methods could be used together.
Negative free-text comments expressed the unsuitability of some stations, such as prescribing and assessment of an unwell patient, in the online format (Table 4). Some respondents also highlighted that they would not want the online format in the future over in-person teaching.
The format of online OSCEs used in this teaching series was found to be time-efficient and accessible by the majority of survey respondents. The issues highlighted by the feedback concerned interaction with tutors and communication with colleagues. Twenty-five respondents (40.3%) agreed that they would prefer the online format over in-person teaching. Free-text comments highlighted that the communication issues were more pertinent for certain stations that may be less suited to the online format.
The clear benefit of running an OSCE teaching series online during the pandemic is infection control. Participants could access the session from their own homes and the development of the series was conducted remotely. The risk of COVID-19 to participants was minimized, as in other examples of online teaching [2-4]. No patients or professional actors were utilized in the teaching series, further minimizing exposure and protecting vulnerable individuals.
There was a predominant theme of efficiency and accessibility across the series. This may be due to the reduced cost and time of travel to the sessions. Similarly, an online OSCE using Zoom at Harvard School of Dental Medicine highlighted that students also valued the convenience of the online setup [9].
The authors found the main issue in the development of the stations to be the logistical limitations in teaching certain types of clinical scenarios. Despite the prior use of a focus group to uncover potential issues, our feedback revealed that students still found the assessment of an acutely unwell patient difficult over an online platform. The focus group also highlighted issues with a discarded joint examination station, in which they expressed that having to describe examination techniques felt cumbersome. In such low-fidelity simulation settings, suspension of disbelief is more difficult to attain online because of the lack of tactile and non-verbal feedback from other participants [1]. Observation and examination findings were provided in the patient briefs for this station; however, without a mannequin to examine, students found it difficult to elicit these findings. In this format, they were only able to show attainment of the “knows how” level of Miller’s pyramid, rather than the “shows how” level [6].
The feedback the students received may have been limited as it omitted discussion of non-verbal cues when interacting online. Felthun et al. [10] reported that in this format, examiners would be unable to comment on positive and negative body language, which would otherwise play an important role in a real consultation. This inability to replicate some clinical scenarios could be detrimental to students’ learning as it does not address competencies that are essential for students. However, comments such as “the personalized feedback and small group setting was really useful” suggest that some students found online teaching less intimidating and were still able to learn from and reflect on the process.
The absence of critical comments for some stations suggests that certain clinical scenarios are well replicated on an online platform. History-taking is an example of a clinical competency that can be well assessed and reviewed in this format. Despite limited non-verbal communication, students still valued feedback on their questions and wording. Similarly, communication of management plans and interpretation of radiology results can be well replicated online. The authors suggest that viewing radiology results on a screen is more akin to current clinical practice than reviewing print-outs. Kakadia et al. [9] reported that half of their respondents believed future OSCEs would be conducted online. As many aspects of medicine and medical education continue to transition in this way, teaching formats such as ours provide one more opportunity for students to develop their online professional conduct.
Extrapolating from the student feedback, the authors suggest that there are other potential topics that could be explored in this online format. Data interpretation is one such topic. This could involve interpretation and explanation of blood test results, microbiology samples, arterial blood gas values or radiographs. Similarly, this modality could be extended to the assessment and description of fundoscopy, otoscopy, or skin lesions via the use of photographs. Other stations that focus on communication skills could also be incorporated. Conversations regarding the interface between ethics, law, and medicine relevant to the country in which the students are training are potential sources of stations. This could include contraception counseling, abortion, and capacity or end-of-life discussions.
Our results show that online platforms pose some obstacles to communication, which will be important to be aware of when developing stations for this format. These obstacles likely arise from a combination of verbal and non-verbal issues, including connection difficulties, audio-visual lag, overlapping voices, and difficulty replicating eye contact. Students in a previous study agreed that telehealth encounters, which could be comparable to the consultations in our OSCE stations, came with issues in maintaining eye contact and tone of voice [11]. However, our results also show that online handouts may be efficient in disseminating information, suggesting that despite barriers to online communication, it is possible to provide clear instructions through parallel means such as Google Docs links.
The authors acknowledge that the interpretation of the results from this paper is limited because the survey used to measure acceptance of the online format was assessed for face validity but not content validity. Since the students were not asked to directly compare an in-person teaching session with this online session, the responses of the students may be affected by recall bias. The use of an extensively validated survey could have given more objective data. This paper largely draws on the subjective comments of the participants.
This paper only collected data from a single medical school in London, England and as such, extrapolating to other medical settings across the globe may be limited due to factors such as access and exposure to technology and the internet. Further research involving other similar implementations of online OSCEs at other institutions will provide broader insights into student perceptions. This could be further expanded to studies where the same clinical stations are trialed in both online and in-person formats, which would then allow a direct comparison of the strengths and weaknesses of each. Additional research could be conducted incorporating the views of tutors as other stakeholders, which was not covered by this study.
As technology use in medical education further accelerates, virtual and mixed reality products could overcome some current limitations of online teaching. For example, a 3-dimensional augmented environment in which to visualize clinical signs, interpret body language, and simulate the use of clinical equipment would allow a wider range of stations than can be provided by an online session such as ours [12].
This study tested a novel online OSCE approach to teaching medical students in response to restrictions imposed by the COVID-19 pandemic. The teaching series was generally well received, with students commending the accessibility and efficiency of the sessions. Some clinical scenarios were well accepted in an online format; others, such as examination and assessment of an unwell patient, were not. Overall, the data revealed that students were largely unwilling to replace in-person teaching completely with an online method. There is potential, however, for this format to be accepted alongside in-person teaching. It would be interesting to investigate further how these views may change as online medical teaching becomes more prevalent and to expand the sample size with a multi-center analysis.
Notes
Authors’ contributions
Conceptualization: VM, AB, NY. Data curation: VM, AB. Formal analysis: VM, AB. Methodology: VM, AB, NY. Project administration: VM, AB. Visualization: VM, AB. Writing–original draft: VM, AB, NY. Writing–review & editing: VM, AB, NY.
Data availability
Data files are available from Harvard Dataverse: https://doi.org/10.7910/DVN/N3BLTZ
Dataset 1. Raw response data from participants.
Supplementary materials
Supplementary files are available from Harvard Dataverse: https://doi.org/10.7910/DVN/N3BLTZ
References
1. Savage A, Minshew LM, Anksorus HN, McLaughlin JE. Remote OSCE experience: what first year pharmacy students liked, learned, and suggested for future implementations. Pharmacy (Basel). 2021; 9:62. https://doi.org/10.3390/pharmacy9010062.
2. Zuo L, Dillman D, Miller Juve A. Learning at home during COVID-19: a multi-institutional virtual learning collaboration. Med Educ. 2020; 54:664–665. https://doi.org/10.1111/medu.14194.
3. Ahmed S, Shehata M, Hassanien M. Emerging faculty needs for enhancing student engagement on a virtual platform: version 1. MedEdPublish. 2020; 9:75. https://doi.org/10.15694/mep.2020.000075.1.
4. Hannan TA, Umar SY, Rob Z, Choudhury RR. Designing and running an online objective structured clinical examination (OSCE) on Zoom: a peer-led example. Med Teach. 2021; 43:651–655. https://doi.org/10.1080/0142159X.2021.1887836.
5. Dost S, Hossain A, Shehab M, Abdelwahed A, Al-Nusair L. Perceptions of medical students towards online teaching during the COVID-19 pandemic: a national cross-sectional survey of 2721 UK medical students. BMJ Open. 2020; 10:e042378. https://doi.org/10.1136/bmjopen-2020-042378.
6. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990; 65(9 Suppl):S63–S67. https://doi.org/10.1097/00001888-199009000-00045.
7. Terry R, Hing W, Orr R, Milne N. Do coursework summative assessments predict clinical performance?: a systematic review. BMC Med Educ. 2017; 17:40. https://doi.org/10.1186/s12909-017-0878-3.
8. Vandenbroucke JP, von Elm E, Altman DG, Gotzsche PC, Mulrow CD, Pocock SJ, Poole C, Schlesselman JJ, Egger M; STROBE Initiative. Strengthening the Reporting of Observational Studies in Epidemiology (STROBE): explanation and elaboration. PLoS Med. 2007; 4:e297. https://doi.org/10.1371/journal.pmed.0040297.
9. Kakadia R, Chen E, Ohyama H. Implementing an online OSCE during the COVID-19 pandemic. J Dent Educ. 2020; 85(Suppl 1):1006–1008. https://doi.org/10.1002/jdd.12323.
10. Felthun JZ, Taylor S, Shulruf B, Allen DW. Assessment methods and the validity and reliability of measurement tools in online objective structured clinical examinations: a systematic scoping review. J Educ Eval Health Prof. 2021; 18:11. https://doi.org/10.3352/jeehp.2021.18.11.
11. Skoy ET, Eukel HN, Frenzel JE, Schmitz TM. Performance and perceptions: evaluation of pharmacy students’ consultation via telepharmacy. J Pharm Technol. 2015; 31:155–160. https://doi.org/10.1177/8755122514568123.
12. Kye B, Han N, Kim E, Park Y, Jo S. Educational applications of metaverse: possibilities and limitations. J Educ Eval Health Prof. 2021; 18:32. https://doi.org/10.3352/jeehp.2021.18.32.
Table 1.
Table 2.
| Item | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| I found this format of teaching more time efficient than in-person seminar room teaching | 1 (1.6) | 11 (17.7) | 10 (16.1) | 21 (33.9) | 19 (30.6) |
| I found this format of teaching more accessible than previous in-person seminar room teaching | 4 (6.5) | 6 (9.7) | 10 (16.1) | 18 (29.0) | 24 (38.7) |
| The online instructions and learning objectives were easier to understand and retain than paper handouts | 1 (1.6) | 4 (6.5) | 7 (11.3) | 22 (35.5) | 28 (45.2) |
| I found it easier to interact with the tutors through video call than in-person | 8 (12.9) | 10 (16.1) | 17 (27.4) | 12 (19.4) | 15 (24.2) |
| I found it easier to communicate within a group of peers through video call than in-person | 9 (14.5) | 13 (21.0) | 19 (30.6) | 10 (16.1) | 11 (17.7) |
| I would prefer this format of teaching in the future rather than in-person seminar room teaching | 5 (8.1) | 15 (24.2) | 17 (27.4) | 13 (21.0) | 12 (19.4) |

a)Likert scale responses, presented as number (%); 1 = strongly disagree to 5 = strongly agree (n = 62).