Abstract
Background
Although competency-based education (CBE) is becoming a popular form of medical education, it has not been widely used to train residents. Recently, the Korean Society of Anesthesiologists completed a pilot implementation and evaluation of a CBE program. This study aims to outline that experience.
Methods
The chief training faculty from each hospital took a one-hour online course about CBE. Emails describing the seven core competencies and their evaluation were sent to residents and faculty ahead of a pilot core competency evaluation (CCE). The pilot CCE took place in late 2021 and was followed by a survey.
Results
A total of 68 out of 84 hospitals participated in the pilot CCE. The survey response rate was 55.9% (38/68) for chief training faculty, 10.2% (91/888) for training faculty, and 30.2% (206/683) for residents. More than half of the training faculty thought that CCE was necessary for the education of residents. Residents’ and training faculty’s responses about CCE were generally positive, although their understanding of CCE criteria was low. More than 80% of the hospitals had a defibrillator and a cardiopulmonary resuscitation manikin, while the rarest piece of equipment was an ultrasound vessel model. Only defibrillators were used in more than half of the hospitals. Thoughts about CCE were related to various factors, such as length of employment, hospital location, and the number of residents per grade.
Conclusions
This study’s results may be helpful in improving resident education quality to meet the expectations of both teaching faculty and residents while establishing CBE.
Keywords: Competency-based education, Internship and residency, Medical education, Medical faculty, Professional competence, Surveys and questionnaires
Introduction
Competency-based education (CBE) has been the leading form of medical education for several decades [1–4]. In some countries, it is used not only in medical schools but also in resident education. Recently, it has become increasingly popular in medical schools in South Korea [5]. However, it is not widely used to educate residents in Korea.
In Korea, the first resident curriculum systemization project was implemented over eight months, from May 2020 to January 2021, and the second project was implemented over six months, from July to December 2021. These projects were undertaken in response to the reduction of residents’ training time following the passage of a law on residency programs and to residents’ increasing demands for systematic education and reorganization of the training curriculum, with which satisfaction was low. Accordingly, the Korean Society of Anesthesiologists (KSA) applied to participate in and conducted the first and second projects, which were financially sponsored and encouraged by the government. The resident curriculum systemization projects involved developing a competency-based residency program, developing training guides for training faculty members, establishing evaluation guidelines for core competencies, establishing feedback channels for evaluation results, developing an operations plan, and developing an e-portfolio. The KSA conducted a pilot core competency evaluation (CCE) from the end of November to the beginning of December 2021. In January 2022, it surveyed residents, training faculty members, and chief training faculty members about their knowledge of and feelings about the core competencies, their experience with the pilot CCE, and the equipment used in the CCE. This study was conducted to analyze the survey results, identify problems with incorporating CBE into the residency program, and find ways that it can be improved.
Materials and Methods
The KSA’s Training and Education Committee set seven core competencies and related learning objectives and milestones for evaluation (Supplementary Materials 1 and 2 [Korean]). The seven core competencies are preoperative assessment, difficult airway management, central venous catheter insertion using ultrasound, spinal and epidural anesthesia, treatment of myofascial pain syndrome, advanced cardiovascular life support, and mechanical ventilator management. The committee provided this information to training faculty members during the first and second resident curriculum systemization projects. Before the pilot CCE, the chief training faculty members completed a one-hour online education course related to CBE. Brief evaluation instructions were sent to residents and faculty members (Supplementary Material 3 [Korean]). Residents were evaluated on their mastery of the core competencies according to the KSA’s resident training curriculum (Supplementary Material 4). The chief training faculty members and the KSA Training and Education Committee held an online meeting prior to the pilot evaluation. The survey was conducted over the course of one week, from January 14 to January 21, 2022. The survey respondents were divided into three groups (chief training faculty members, training faculty members, and residents). Chief training faculty members were asked about the importance and necessity of CCE and were asked to respond, on a five-point Likert scale, whether the equipment necessary for CCE was provided and whether they thought it was necessary. In addition, they were asked how important they thought each core competency was. They were also given a multiple-choice questionnaire about the difficulties with CCE and how to improve it (Supplementary Material 5 [Korean]). Training faculty members were asked the same questions except for those about equipment. In addition to the importance and necessity of competency evaluation, residents were asked about several factors that were shown to be important by a previous study [6]. Other information that may have been related to the survey results was also collected, such as the resident’s grade, the training faculty’s years of experience, hospital location, and the number of residents in each grade. All continuous variables were analyzed by Student’s t-test or the Mann-Whitney U test according to the results of a normality test. Categorical variables were compared using the chi-square test or Fisher’s exact test. A P value of less than 0.05 was considered statistically significant. This study was approved by the Institutional Review Board of Inje University Ilsan Paik Hospital (IRB no. 2022-05-014). The requirement that participants provide informed consent was waived.
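As a rough illustration of this analysis workflow, the sketch below shows how the test selection described above could be carried out in Python with SciPy. This is a minimal sketch, not the authors’ actual analysis code; the function names, the normality-test choice (Shapiro-Wilk), the sparse-table rule for Fisher’s exact test, and the example data are all hypothetical.

    import numpy as np
    from scipy import stats

    def compare_continuous(group_a, group_b, alpha=0.05):
        # Use Student's t-test if both groups pass a normality test
        # (Shapiro-Wilk assumed here); otherwise use the Mann-Whitney U test.
        normal = all(stats.shapiro(g).pvalue > alpha for g in (group_a, group_b))
        if normal:
            return stats.ttest_ind(group_a, group_b)
        return stats.mannwhitneyu(group_a, group_b)

    def compare_categorical(table):
        # Chi-square test of independence; switch to Fisher's exact test when a
        # 2x2 table has small expected counts (a common convention, assumed here).
        table = np.asarray(table)
        chi2, p, dof, expected = stats.chi2_contingency(table)
        if table.shape == (2, 2) and (expected < 5).any():
            _, p = stats.fisher_exact(table)
        return p

    # Hypothetical example: length of employment (years) in two response groups,
    # and a 2x2 table of agreement with a Likert item by hospital location.
    print(compare_continuous([8, 12, 15, 9, 11, 7, 14], [5, 6, 9, 4, 8, 7, 6]))
    print(compare_categorical([[12, 8], [5, 15]]))  # P < 0.05 deemed significant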
Results
The KSA has 84 training hospitals. Among those, 68 participated in the pilot evaluation. The number of chief training faculty members, training faculty members, and residents that participated in the pilot evaluation was 68, 888, and 683, respectively. Of those participants, 38 (55.9%), 91 (10.2%), and 206 (30.2%) responded to the surveys at the end of the program, respectively.
Table 1 shows the demographic data of respondents by position. The responses to Likert scale items are shown in Figs. 1–3.
More than half of the chief training faculty and training faculty thought that CCE was necessary for educating residents. However, unlike the training faculty, less than half of the chief training faculty thought that it was important for educating residents. Less than half of the residents considered CCE to be both necessary and important for their education. Each core competency was considered important by all positions, though residents considered them to be more important than the faculty did. The chief training faculty and training faculty thought the treatment of myofascial pain syndrome and central venous line insertion using ultrasound were less important than residents did. Residents’ responses to competency-related items about the CCE and about the training faculty were generally positive. However, their understanding of the CCE criteria was low.
Fig. 4 shows whether hospitals were equipped with the various models, whether the models were used, and whether respondents thought they were useful. More than 80% of hospitals had a defibrillator and a cardiopulmonary resuscitation manikin, but the rarest piece of equipment was an ultrasound vessel model. Defibrillators were used in more than half of the hospitals during the pilot CCE programs. The chief training faculty generally indicated that they thought the models were useful for assessing competency.
Factors significantly related to each other were as follows. Among the chief training faculty, hospital location was related to whether the hospital had a defibrillator (P = 0.013) and whether a cardiopulmonary resuscitation manikin was used (P = 0.022). The chief training faculty member’s gender was related to the perceived need for a spinal anesthesia training model (P = 0.031). Thoughts about the importance of spinal anesthesia as a core competency differed by hospital location (P = 0.001).
Among the training faculty, length of employment was related to thoughts about the importance of cardiopulmonary resuscitation as a core competency (P = 0.026). The number of residents per grade was related to their thoughts about the importance of mechanical ventilation as a core competency (P = 0.025).
Training faculty as a whole (including chief training faculty) had different thoughts about the importance of preoperative assessment (P < 0.001), spinal anesthesia (P = 0.017), and mechanical ventilation (P = 0.005) according to hospital location. Length of employment was related to thoughts about the importance of advanced cardiovascular life support as a core competency (P = 0.015).
Among residents, the number of residents per grade was related to their thoughts about the need (P = 0.025) and importance (P = 0.041) of core competencies generally and their thoughts about each competency individually (spinal and epidural anesthesia: P = 0.013, mechanical ventilation: P = 0.011). All of the P values for these relationships are presented in Supplementary Material 6. Frequency plots are provided for the significantly related factors (Supplementary Material 7 [Korean]).
Discussion
According to the analysis of the survey results after this pilot competency evaluation, both training faculty and residents thought that the transition to CBE was necessary and important, but they thought so to different degrees. Given the responses about understanding core competencies and evaluation methods, it seems necessary to provide education about individual core competencies and competency evaluation methods in the future. Also, compensation or workload should be adjusted to avoid overloading training faculty. Interestingly, the degree of understanding of CBE differed by region among residents. Moreover, the number of residents per grade was negatively correlated with how well residents knew the competency evaluation and how important they thought it was. Also, the number of residents per grade was positively correlated with residents’ rating of the training faculty (Supplementary Material 7 [Korean]).
Medical education has been transitioning to CBE for decades in the United States, Canada, and the United Kingdom [4]. The shift from knowledge- and time-based education to task-based CBE is taking place in medical student education [7]. In order to nurture well-trained anesthesiologists who have not only the knowledge but also the skills, behaviors, and attitudes necessary to succeed in their profession, anesthesiology residents must be educated and evaluated accordingly, so the resident curriculum must provide CBE.
The KSA quickly introduced CBE, as described above. There were relatively few personnel involved in the process, including the Training and Education Committee members and members of related task force teams under the committee. The time provided for the validation, distribution, and education about each core competency was also insufficient.
As a result, as shown by this study’s results, although most training faculty and residents agreed on the importance and necessity of CCE, only a small percentage of them knew its content well. Thus, education about CBE should continue to be provided in the future.
One of the problems with CBE is that related terms are used interchangeably in the literature. Competence refers to the array of abilities across multiple domains or aspects of physician performance in a specific context. On the other hand, competency means an observable ability of a health professional that integrates multiple components, such as knowledge, skills, values, and attitudes [8]. Entrustable professional activities (EPAs) are tasks that learners can execute unsupervised once they have attained a sufficient level of competency [9]. Milestones are achievements or behaviors demonstrated by a physician that reflect their competency to execute EPAs [10]. However, in practice, these terms’ definitions can vary significantly [8]. The Korean Society of Otorhinolaryngology-Head and Neck Surgery’s competency-based residency program teaches eight clinical competencies and four conceptual EPAs. The Korean Association of Internal Medicine defines 18 EPAs and 80 competencies (Supplementary Material 8 [Korean]). The KSA residency program defines seven core competencies, and its evaluation guidelines define EPAs and milestones (Supplementary Materials 1 and 2 [Korean]). The KSA deliberately minimized the number of core competencies to avoid overloading teaching faculty. It is unclear whether reducing the number of core competencies being taught undermines the quality of education. Thus, this small number of competencies should be reevaluated when establishing CBE or modifying the current core competencies.
The KSA’s core competencies were intentionally designed to avoid textbook knowledge transfer and to promote the learning of clinical techniques. As a result, they can be criticized for addressing only part of resident education. However, the KSA’s CBE is only in its early stage. Furthermore, as shown in the survey results, most teaching faculty complained about the workload of teaching even this small number of core competencies. Increasing the number of competencies and the content in the curriculum, even though it is essential, should be done gradually.
This study’s results showed that thoughts about the core competencies varied with several factors, such as the number of residents per grade, hospital location, length of employment, and gender. Moreover, the number of residents per grade was negatively correlated with favorable perceptions of CCE. These differences between faculty and residents should be considered when designing CBE programs and distributing resources.
The first limitation of this study was that well over 30% of the residents responded that they did not know about CBE or its evaluation standards, which is not a small proportion. This result was likely a product of the fact that residents have received insufficient education, so it would be expected to change as residents become better acquainted with CBE. The second limitation was that the overall response rate was low, particularly for training faculty, so there may have been selection bias in the results. The third limitation was that the questionnaire asked what respondents thought about CCE, not CBE. Most of the faculty and residents likely did not know about CBE, so asking about the pilot CCE was the only feasible option. Thus, the survey results may not reflect their thoughts about CBE. The fourth limitation was that the training faculty’s thoughts about each competency may have differed by subspecialty, but the survey did not collect respondents’ subspecialties, so this relationship was not analyzed.
This article is the only one that contains the results of an extensive survey of educators and trainees conducted after the pilot implementation of a CCE. It may provide useful information on what needs to be implemented and corrected for the successful establishment of CBE in residency programs in the future.
To conclude, the KSA’s establishment of CBE is in its beginning stage. This study’s results may be used to improve resident education quality to meet the expectations of both teaching faculty and residents.