
Kim, Ro, Shin, Wi, Jeong, Park, Sun, and Bae: Assessment of Competence in Emergency Medicine among Healthcare Professionals in Cameroon

Abstract

Development of a competence-based curriculum is important. This study aimed to develop competence assessment tools in emergency medicine and to use them to assess the competence of Cameroonian healthcare professionals. This was a cross-sectional, descriptive study. Through literature review, an expert survey, and discrimination tests, we developed a self-survey questionnaire and a scenario-based competence assessment tool for assessing clinical knowledge and self-confidence to perform clinical practices or procedures. The self-survey consisted of 23 domains with 94 questions rated on a 5-point Likert scale. The objective scenario-based competence assessment tool was used to validate the self-survey results for five life-threatening diseases that frequently present in emergency rooms in Cameroon. The response rate of the self-survey was 82.6%. In this first competence assessment, knowledge of infectious disease had the highest score (4.6 ± 0.4), followed by obstetrics and gynecology (4.2 ± 0.6) and hematology and oncology (4.2 ± 0.5); in contrast, respondents scored lowest in the domains of disaster, abuse and assault, and psychiatric and behavior disorder (each with a mean of 2.8). In the scenario-based test, knowledge of multiple trauma had the highest score (4.3 ± 1.2), followed by anaphylaxis (3.4 ± 1.4), diabetic ketoacidosis (3.3 ± 1.0), ST-elevation myocardial infarction (2.5 ± 1.4), and septic shock (2.2 ± 1.1). The mean difference between the self-survey and the scenario-based test was statistically insignificant (mean, −0.02; 95% confidence interval, −0.41 to 0.36), and the agreement rate was 58.3%. The two evaluation tools showed a moderate correlation, and the study population had relatively low competence in specific aspects of emergency medicine and in clinical procedures and skills.


INTRODUCTION

Infectious disease outbreaks such as malaria and human immunodeficiency virus (HIV)/acquired immune deficiency syndrome (AIDS) have resulted in a growing demand for medical care in sub-Saharan Africa. In recent years, the incidence of road traffic injuries and cardiovascular disease has also increased due to rapid urbanization, industrial development, and changes in health behavior. However, the emergency medical services (EMS) system is still underdeveloped, and mortality and disability from acute medical conditions and injuries are markedly higher than in developed countries (1,2). The rising burden of acute medical conditions places more emphasis on emergency medical care in prehospital as well as hospital settings (3).
In 2009, the African Federation for Emergency Medicine (AFEM) was established through the joint efforts of South Africa, Ghana, and Ethiopia to develop emergency care systems in partnership with advanced countries. In addition, countries including Madagascar, Rwanda, Sudan, Uganda, and Zambia developed residency training programs in emergency medicine (4). Through international partnerships, several African countries have tried to strengthen their emergency care systems by training local directors in emergency medicine, developing training modules, establishing emergency training centers, and creating academic training programs for residents and nurses (4-7).
Development of the emergency care system in Cameroon, however, is still at a nascent stage. Total health expenditure in Cameroon has been about 5% of the gross domestic product over the past 10 years, and poor coordination between stakeholders has impeded the development of quality services in emergency departments (8). Recently, the Cameroonian government has taken an initiative to establish training schools for emergency medicine to address the lack of health resources, revise education curricula, promote permanent employment of healthcare workers in the public sector, and increase funds from external sponsorships (9).
Well-trained emergency physicians serve as key human resources in an emergency care system. Emergency physicians should have well-rounded competence in both the diagnosis and treatment of emergency patients, be able to manage patients, effectively utilize emergency resources, cooperate with the local community, and develop and collaborate with the prehospital EMS system. In Cameroon, however, a residency training program in emergency medicine has not yet been implemented, and development of a training curriculum to strengthen the competence of emergency physicians remains an urgent task for providing reliable emergency care.
Competence is the ability of an individual to apply one's knowledge, understanding, skills, and judgment to perform effectively in the field of professional practice (10). Development of a competence-based curriculum has become central to the education and training of healthcare professionals (11,12). However, tools to evaluate the competence of emergency personnel have not yet been developed. Therefore, this study aimed to develop competence assessment tools in emergency medicine and to validate them by applying them to Cameroonian healthcare professionals.

MATERIALS AND METHODS

Study design and setting

This was a cross-sectional, descriptive study. The Korea International Cooperation Agency (KOICA), a government-run bilateral aid agency in Korea, has been carrying out a project with the Cameroon Ministry of Public Health to construct the Yaoundé National Emergency Center. Between October 27 and 31, 2014, a short-term educational program was provided to 58 healthcare professionals (13 specialists, 10 general physicians, and 35 nurses) at the Central Hospital of Yaoundé who were expected to work at the Yaoundé National Emergency Center after its opening in 2015. The educational program for doctors included Advanced Trauma Life Support (ATLS), emergency ultrasound, and an electrocardiogram course, among others.
In Cameroon, the undergraduate medical education program is a 7-year course including 1 year of research activity at a college, and continuing medical education programs consist of a 2-year general physician training program and 4-year residency programs. For emergency medicine, a 2-year special training program is offered but is not recognized as a certified residency program.

Development of competence evaluation tools

We developed a self-survey questionnaire and a scenario-based assessment tool to evaluate competence in clinical knowledge and self-confidence to perform clinical practices or procedures.
The self-survey questionnaire was developed based on the European curriculum for emergency medicine, the core curriculum of the International Federation for Emergency Medicine, the learning objectives of the Korean Academy of Medical Sciences, and the emergency resident training programs of the Korean Society of Emergency Medicine (13,14). It comprised a total of 113 questions: 17 domains with 82 questions on core clinical knowledge, 4 domains with 11 questions on specific aspects of emergency medicine, and 2 domains with 20 questions on clinical procedures and skills.
A two-step pretest was conducted to improve the validity of the competence assessment tool. The first step was an expert consensus survey with 7 Korean emergency physicians who had previously contributed to the educational program for healthcare professionals in Yaoundé. Questions on which more than 4 emergency specialists reached consensus were selected as appropriate items for the survey, and those that more than 4 emergency specialists found confusing were modified as appropriate. As a result of the expert consensus survey, a total of 94 questions were selected as candidate items for the competence assessment tool; seventeen questions on core clinical knowledge and two questions on clinical procedures and skills were excluded. The second step assessed the discrimination power of each question between junior (1st or 2nd year) and senior (3rd or 4th year) emergency residents. Korean junior (n = 7) and senior (n = 5) emergency residents participated in the evaluation. All 94 questions in the 23 domains showed high discrimination power and, as a result, were selected for the competence assessment tool for emergency healthcare professionals in Yaoundé (Table 1). Full results of the two-step pretest are reported in Supplementary Table 1.
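The filtering logic of this two-step pretest can be sketched as follows. This is a minimal illustration, not the study's actual procedure: the question records and identifiers are hypothetical, the consensus threshold (more than 4 of the 7 experts) comes from the description above, and the AUC > 0.7 discrimination threshold is described under Statistical analysis.

```python
# Hypothetical per-question records; the real selection was performed manually
# from the expert consensus survey and the resident pretest.
questions = [
    {"id": "cardio-01", "consensus": 6, "auc": 0.95},  # consensus = experts (of 7) rating the item appropriate
    {"id": "toxico-99", "consensus": 3, "auc": 0.60},  # would be dropped: no expert consensus
    {"id": "resus-02",  "consensus": 5, "auc": 0.88},  # auc = discrimination between junior and senior residents
]

selected = [q["id"] for q in questions
            if q["consensus"] > 4     # step 1: more than 4 of the 7 experts agreed
            and q["auc"] > 0.7]       # step 2: AUC > 0.7 indicates high discrimination power

print(selected)  # ['cardio-01', 'resus-02']
```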
Table 1

Comparison of self-survey competency assessment results in Korean emergency residents and Cameroonian healthcare professionals

Topic-domain; Question No.; Korean emergency residents: Junior (n = 7), Senior (n = 5), AUC (95% CI); Cameroonian healthcare professionals: Total (n = 19), GP (n = 9), Specialist (n = 10), P value
I. Core clinical knowledge
 1. Cardiovascular 5 2.8 ± 0.2 4.2 ± 0.1 1.00 (1.00–1.00) 3.8 ± 0.6 3.4 ± 0.5 4.1 ± 0.5 0.007
 2. Pulmonary 5 3.5 ± 0.2 4.2 ± 0.1 0.85 (0.61–1.00) 3.8 ± 0.5 3.6 ± 0.4 4.0 ± 0.5 0.114
 3. Gastrointestinal 5 3.5 ± 0.3 4.2 ± 0.3 1.00 (1.00–1.00) 3.9 ± 0.6 3.9 ± 0.4 4.0 ± 0.7 0.753
 4. Renal and genitourinary 6 2.8 ± 0.3 3.9 ± 0.2 1.00 (1.00–1.00) 3.7 ± 0.5 3.5 ± 0.5 3.9 ± 0.4 0.116
 5. Obstetrics and gynecology 1 2.4 ± 0.3 3.9 ± 0.4 1.00 (1.00–1.00) 4.2 ± 0.6 4.4 ± 0.5 4.1 ± 0.6 0.275
 6. Pediatrics 5 3.2 ± 0.2 3.9 ± 0.1 0.90 (0.69–1.00) 4.1 ± 0.4 4.1 ± 0.3 4.1 ± 0.5 0.692
 7. Infectious disease 5 2.5 ± 0.4 3.5 ± 0.3 0.94 (0.78–1.00) 4.6 ± 0.4 4.6 ± 0.4 4.6 ± 0.4 0.629
 8. Neurological disorder 6 3.1 ± 0.2 4.0 ± 0.1 0.92 (0.73–1.00) 4.0 ± 0.5 3.8 ± 0.5 4.2 ± 0.5 0.126
 9. Toxicology 2 2.3 ± 0.3 3.6 ± 0.0 0.96 (0.84–1.00) 3.3 ± 0.6 3.2 ± 0.7 3.4 ± 0.5 0.512
 10. Endocrine and metabolic 3 2.5 ± 0.4 3.8 ± 0.3 1.00 (1.00–1.00) 4.0 ± 0.5 3.9 ± 0.6 4.1 ± 0.5 0.384
 11. Hematologic and oncologic 4 2.8 ± 0.5 3.8 ± 0.3 0.98 (0.90–1.00) 4.2 ± 0.5 4.0 ± 0.4 4.4 ± 0.5 0.103
 12. Eyes, ears, nose, throat, oral, and neck 8 3.1 ± 0.2 3.8 ± 0.1 0.79 (0.50–1.00) 3.2 ± 0.6 3.3 ± 0.5 3.1 ± 0.6 0.429
 13. Dermatologic 1 3.4 ± 0.4 4.6 ± 0.4 0.92 (0.74–1.00) 4.1 ± 0.6 4.1 ± 0.5 4.2 ± 0.7 0.626
 14. Trauma 4 2.8 ± 0.3 3.9 ± 0.5 0.92 (0.74–1.00) 3.7 ± 0.5 3.5 ± 0.4 3.8 ± 0.5 0.195
 15. Musculoskeletal 2 2.9 ± 0.4 4.1 ± 0.3 0.98 (0.90–1.00) 3.8 ± 0.6 3.4 ± 0.6 4.1 ± 0.5 0.032
 16. Psychiatric and behavior 1 2.6 ± 1.0 3.6 ± 0.5 0.88 (0.65–1.00) 2.8 ± 1.0 3.0 ± 1.0 2.7 ± 0.9 0.511
 17. Resuscitation 2 3.0 ± 0.3 4.3 ± 0.3 0.94 (0.78–1.00) 3.3 ± 0.7 2.9 ± 0.6 3.7 ± 0.5 0.017
II. Specific aspects of emergency medicine
 18. Disaster 1 1.9 ± 0.9 3.0 ± 1.4 0.77 (0.43–1.00) 2.8 ± 1.1 2.6 ± 1.0 3.1 ± 1.1 0.277
 19. Abuse and assault 1 2.6 ± 1.0 3.5 ± 0.6 0.75 (0.44–1.00) 2.8 ± 1.0 2.4 ± 0.9 3.2 ± 1.0 0.106
 20. Environmental injuries 4 2.4 ± 0.3 3.6 ± 0.4 0.83 (0.57–1.00) 3.1 ± 0.6 2.8 ± 0.4 3.4 ± 0.6 0.045
 21. Prehospital care 5 1.6 ± 0.3 3.1 ± 0.2 0.79 (0.45–1.00) 3.2 ± 0.6 3.0 ± 0.4 3.3 ± 0.6 0.204
III. Clinical procedures and skills
 22. CPR skills 9 2.1 ± 0.3 4.3 ± 0.3 1.00 (1.00–1.00) 3.0 ± 0.7 2.6 ± 0.4 3.4 ± 0.7 0.007
 23. Procedure 9 2.4 ± 0.4 3.5 ± 0.1 0.92 (0.73–1.00) 3.5 ± 0.5 3.3 ± 0.5 3.8 ± 0.4 0.028
Total 94 2.7 ± 0.4 3.8 ± 0.3 1.00 (1.00–1.00) 3.6 ± 0.4 3.5 ± 0.3 3.8 ± 0.3 0.038
Values are presented as mean ± SD.
SD = standard deviation, AUC = area under the curve, CI = confidence interval, GP = general physician, CPR = cardiopulmonary resuscitation.
To test the validity of the self-survey and to objectively assess competence in emergency medicine, we also developed a scenario-based competence assessment tool based on a simulation workbook and a relevant website (15-17). Five life-threatening diagnoses that frequently present in the emergency room of the Central Hospital of Yaoundé were selected as scenario topics: ST-elevation myocardial infarction (STEMI), diabetic ketoacidosis (DKA), septic shock due to necrotizing fasciitis, multiple trauma from road traffic injury, and anaphylaxis. For these scenarios, 14 questions were chosen from among the self-survey questions, and a total of 26 sub-questions (3 for STEMI, 6 for DKA, 11 for septic shock, 3 for multiple trauma, and 3 for anaphylaxis) and their answers were developed to assess competence in the diagnosis and treatment of the 5 diseases. Detailed information on the scenario-based competence assessment tool is reported in Supplementary Table 2.

Study population and study protocols

The eligible population comprised the Cameroonian doctors (13 specialists and 10 general physicians) who had taken the educational program in 2014. Competence was assessed in two steps among the healthcare professionals who agreed to participate in the study.
The first step of the competence assessment was conducted in November 2014 using a 17-page self-survey on clinical knowledge and self-confidence to perform clinical practices or procedures. The survey took roughly 40 minutes to complete, including an introduction to the purpose and methods of the survey. Each question was answered on a 5-point Likert scale. All results were coded in Microsoft Excel (ver. 14.0; Microsoft Corporation, Redmond, WA, USA) by an appointed data entry clerk.
The second, scenario-based assessment was conducted on December 10 and 11, 2014, with 6 doctors (3 general physicians and 3 specialists) who had previously participated in the self-survey, in the form of individual interviews with two Korean emergency physicians. One emergency physician explained each scenario, and the physicians evaluated the answers that the Cameroonian respondent provided regarding his or her medical knowledge of patient assessment, interpretation, diagnosis, and treatment. To increase objectivity, both emergency physicians independently rated the respondent's answers on a 5-point Likert scale.

Main outcomes

The main outcome was competence in emergency medicine, captured by the self-survey and the scenario-based competence assessment on a 5-point Likert scale: 1) do not know at all; 2) do not know; 3) average; 4) know; and 5) know well. Secondary outcomes were the agreement rate and the difference in scores between the two methods, the self-survey and the scenario-based competence assessment.

Statistical analysis

Descriptive statistics of the self-survey and scenario-based competence assessment results on the 5-point Likert scale were expressed as means and standard deviations (SDs). Differences in assessment results between groups were compared using Student's t-test.
The discrimination power of each self-survey question between junior and senior residents was measured as the area under the receiver operating characteristic (ROC) curve (AUC), and an AUC greater than 0.7 was considered to indicate high discrimination power. For the scenario-based competence assessment, inter-rater reliability between the two Korean emergency physicians was analyzed using the weighted kappa.
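As an illustration of this discrimination analysis, the AUC for a single question could be computed as below. This is a sketch only: it assumes scikit-learn's roc_auc_score, and the resident responses shown are hypothetical (the study's per-question results are given in Supplementary Table 1).

```python
from sklearn.metrics import roc_auc_score

# 0 = junior (1st/2nd year) resident, 1 = senior (3rd/4th year) resident
seniority   = [0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1]   # 7 juniors and 5 seniors, as in the pretest
item_scores = [3, 2, 3, 3, 2, 3, 2, 4, 4, 5, 4, 4]   # hypothetical Likert responses to one question

auc = roc_auc_score(seniority, item_scores)
print(f"AUC = {auc:.2f}")  # questions with AUC > 0.7 were regarded as having high discrimination power
```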
For every doctor who participated in both competence assessment methods, we compared the score differences for all 14 questions between the first self-survey and the second scenario-based competence assessment. The difference between the two assessment results was analyzed using the paired t-test, the agreement rate, and the weighted kappa. When the difference between the result of the first self-survey and that of the second scenario-based assessment was within one point, the two results were considered to be in agreement. The level of statistical significance was defined as P < 0.05.
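A minimal sketch of this paired comparison for a single participant follows, with hypothetical scores. It assumes scipy's ttest_rel for the paired t-test and scikit-learn's cohen_kappa_score for the weighted kappa; linear weights are used here as an assumption, since the weighting scheme is not stated in the text.

```python
import numpy as np
from scipy.stats import ttest_rel
from sklearn.metrics import cohen_kappa_score

# Hypothetical paired scores (1-5 Likert) on the 14 shared questions for one participant.
self_survey = np.array([4, 3, 5, 2, 4, 3, 4, 5, 2, 3, 4, 3, 2, 4])
scenario    = np.array([3, 3, 4, 2, 5, 2, 4, 4, 3, 3, 3, 4, 2, 3])

t_stat, p_value = ttest_rel(scenario, self_survey)                  # paired t-test of the score difference
kappa = cohen_kappa_score(self_survey, scenario, weights="linear")  # weighting scheme assumed
agreement_rate = np.mean(np.abs(self_survey - scenario) <= 1)       # "within one point" agreement rule

print(f"mean difference = {np.mean(scenario - self_survey):+.2f}, P = {p_value:.3f}")
print(f"weighted kappa = {kappa:.2f}, agreement rate = {agreement_rate:.1%}")
```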

Ethics statement

The study was reviewed and approved by the Institutional Review Board of Seoul National University Hospital (IRB No. E-1506-048-679). Written informed consent was obtained from all study participants.

RESULTS

The first competence assessment: self-survey

Among a total of 23 doctors, 10 specialists and 9 general physicians participated in the first competence assessment (response rate, 82.6%). The mean (SD) age of the 19 participants was 33.1 (5.6) years, and 6 (31.6%) were male. The main areas of practice of the 10 specialists were anesthesia (n = 3), internal medicine (n = 3), general surgery (n = 2), gynecology and emergency medicine (n = 1), and laboratory medicine (n = 1).
In the self-survey competence assessment, the Cameroonian healthcare professionals scored highest in the domain of infectious disease (mean ± SD, 4.6 ± 0.4), followed by obstetrics and gynecology (4.2 ± 0.6) and hematology and oncology (4.2 ± 0.5); in contrast, they scored lowest in the domains of disaster, abuse and assault, and psychiatric and behavior disorder (each with a mean of 2.8). The domains of cardiovascular, musculoskeletal, procedure, resuscitation, cardiopulmonary resuscitation (CPR) skills, and environmental injuries had significantly higher scores among specialists than among general physicians (all P < 0.05). The average scores of general physicians were higher than those of specialists in only 3 domains (obstetrics and gynecology; eyes, ears, nose, throat, oral, and neck; and psychiatric and behavior), but these differences were not statistically significant (Table 1).

The second competence assessment: scenario-based

For the 14 questions of the scenario-based competence assessment answered by the 6 study participants, the inter-rater reliability between the two interviewers corresponded to a weighted kappa of 0.88 (95% confidence interval [CI], 0.82–0.95) (Table 2). The highest score in the scenario-based assessment was observed for knowledge of multiple trauma (mean ± SD, 4.3 ± 1.2), followed by anaphylaxis (3.4 ± 1.4), DKA (3.3 ± 1.0), and STEMI (2.5 ± 1.4). Knowledge of septic shock scored the lowest (mean ± SD, 2.2 ± 1.1) (Table 3).
Table 2

Inter-rater reliability between two interviewers (Korean emergency physicians)

Interviewer 1 score (rows), Interviewer 2 score (columns)
  1 2 3 4 5 Total
1 12 2 0 1 0 15
2 1 12 3 0 0 16
3 0 2 13 2 0 17
4 0 0 2 18 0 20
5 0 0 0 0 16 16
Total 13 16 18 21 16 84*
Agreement rate, 84.5%; weighted kappa, 0.88; 95% CI, 0.82 to 0.95.
*A total score of 84 was derived from 14 questions answered by 6 Cameroonian doctors.
Gray cells in the original table indicate the number of items for which the two interviewers gave the same score.
Table 3

Scenario-based competency assessment between GPs and specialists in Cameroon

Clinical scenario Question No. Detailed item No. Total (n = 6) GP (n = 3) Specialist (n = 3) P value
 1. Septic shock 3 11 2.2 ± 1.1 1.9 ± 0.8 2.6 ± 1.2 0.186
 2. Diabetic ketoacidosis 3 6 3.3 ± 1.0 2.8 ± 1.0 3.8 ± 0.7 0.018
 3. STEMI 3 3 2.5 ± 1.4 2.0 ± 1.3 3.0 ± 1.4 0.136
 4. Multiple trauma 2 3 4.3 ± 1.2 4.2 ± 1.6 4.5 ± 0.8 0.656
 5. Anaphylaxis 3 3 3.4 ± 1.4 3.6 ± 1.5 3.3 ± 1.3 0.740
Total 14 26 3.1 ± 1.4 2.8 ± 1.5 3.4 ± 1.3 0.062
Values are presented as number or mean ± SD.
STEMI = ST-elevation myocardial infarction, SD = standard deviation, GP = general physician.

Comparison between first and second competence assessment

The mean difference between the self-survey and the scenario-based assessment was statistically insignificant (mean, −0.02; 95% CI, −0.41 to 0.36). For individual participants, the mean difference (95% CI) between the scenario-based assessment and the self-survey ranged from −1.21 (−2.15 to −0.28) to 0.64 (−0.25 to 1.54) (Table 4). Across the 14 questions, the self-survey and the scenario-based assessment showed an agreement rate of 58.3%, an under-estimation rate of 17.9%, and an over-estimation rate of 23.8% (Table 5).
Table 4

Difference between scenario-based test and self-survey test

Subjects Position Mean difference* 95% CI
Doctor 1 GP −0.50 −1.80 to 0.80
Doctor 2 S 0.00 −0.82 to 0.82
Doctor 3 S 0.36 −0.54 to 1.25
Doctor 4 S 0.57 −0.43 to 1.58
Doctor 5 GP 0.64 −0.25 to 1.54
Doctor 6 GP −1.21 −2.15 to −0.28
Total −0.02 −0.41 to 0.36
GP = general physician, S = specialist, CI = confidence interval.
*Mean difference was calculated as the average of the scenario-based scores minus the average of the self-survey scores across the 14 questions.
Table 5

Comparison of scores between self-survey and scenario-based competency assessments

Self-survey score Scenario-based score
1 2 3 4 5 Total
1 4 2 0 2 1 9
2 0 5 3 3 4 15
3 6 2 6 9 5 28
4 3 4 6 5 5 23
5 2 3 2 1 1 9
Total 15 16 17 20 16 84*
Agreement rate, 58.3%; under-estimation rate, 17.9%; over-estimation rate, 23.8%.
*A total score of 84 was derived from 14 questions answered by 6 Cameroonian doctors.
Gray cells in the original table indicate the number of items with similar self-survey and scenario-based scores.
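As a cross-check, the three rates reported above can be reproduced directly from the counts in Table 5. The short sketch below is not part of the original analysis; it simply applies the within-one-point agreement rule to the published cross-tabulation.

```python
import numpy as np

# Rows = self-survey score 1-5, columns = scenario-based score 1-5 (counts from Table 5).
table5 = np.array([
    [4, 2, 0, 2, 1],
    [0, 5, 3, 3, 4],
    [6, 2, 6, 9, 5],
    [3, 4, 6, 5, 5],
    [2, 3, 2, 1, 1],
])

self_score, scenario_score = np.indices(table5.shape) + 1  # score values 1..5 for each cell
diff = self_score - scenario_score                         # positive = self-survey rated higher
total = table5.sum()                                       # 84 answers (14 questions x 6 doctors)

agreement = table5[np.abs(diff) <= 1].sum() / total        # 49/84 = 58.3%
over_est  = table5[diff >= 2].sum() / total                # 20/84 = 23.8% (self-survey >= 2 points higher)
under_est = table5[diff <= -2].sum() / total               # 15/84 = 17.9% (self-survey >= 2 points lower)

print(f"agreement {agreement:.1%}, over-estimation {over_est:.1%}, under-estimation {under_est:.1%}")
```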

DISCUSSION

This study involved the development of a self-survey and a scenario-based competence assessment tool for emergency medicine through literature review, expert consensus, and a discrimination test with a pilot survey. We administered the self-survey to Cameroonian healthcare professionals and later conducted an objective, scenario-based competence assessment to test the validity of the self-survey. The mean difference between the self-survey and the scenario-based assessment was negligible, and the scores from the two assessments showed a moderate agreement rate (58%). Competence-based education has become a key element of curricula for healthcare professionals, which highlights the need for reliable, valid, and feasible competence assessment tools in mid- and long-term education courses (18-20). The areas of practice that showed poor competence in this study should comprise the main contents of an emergency medicine curriculum for Cameroonian healthcare professionals.
The self-survey is a competence assessment that can serve both formative and summative purposes. Repeated self-surveys can benefit both learners and educators through continuous monitoring of self-assessed competence and of educational impact, respectively. In this study, the Cameroonian participants showed high competence in the domains of infectious disease, obstetrics and gynecology, and hematology and oncology. Specialists who had completed advanced education and training showed high competence in the domains of clinical procedures, CPR skills, and resuscitation. However, all Cameroonian participants had relatively low competence in specific aspects of emergency medicine, including disaster and prehospital care. Specific aspects of emergency medicine, as well as clinical skills and procedures, function as key competences of an emergency physician (14); therefore, the domains that showed poor competence should be of particular interest in developing an emergency medicine curriculum for Cameroonian healthcare professionals. Furthermore, periodic monitoring of physicians' competence can also be used to evaluate the effects of the curriculum in the future.
The self-survey has the advantages of convenience and cost-effectiveness, but its results may be subjective. When the self-survey is combined with objective assessment methods, more valid and practical results can be expected (14,21). In this study, we developed scenario-based assessment tools for 5 life-threatening diseases that frequently present in emergency rooms in Cameroon. When the results of the subjective self-survey and the objective scenario-based competence assessment were compared, the scores were comparable, with a mean difference of −0.02 (−0.41 to 0.36) and a moderate agreement rate of 58.3%. However, one of the study participants rated his or her own competence higher in the self-survey than in the scenario-based assessment, and the overall overestimation rate (23.8%) was higher than the underestimation rate (17.9%). In the previous literature, the correlation between self-surveys and externally observed measurements of competence has been controversial. Some studies reported that self-assessment in selected fields or categories was a reliable predictor of clinical performance (15,22-24). In a meta-analysis, however, only 35% of studies showed a positive correlation between self-assessments and external assessments (21,25). In our study, the self-survey had a moderate correlation with the externally observed measurement. The methodologic quality of the self-survey, the knowledge level and training experience of the respondent, and the quality and skills of the evaluator are important factors to consider for an accurate evaluation of competence using a self-survey.
A self-survey of medical knowledge should accurately measure competence as well as actual performance, as reflected by an external assessment (18). Many external, objective competence assessment methods can compensate for the weaknesses of a self-survey, such as oral examinations, procedure tests, simulation examinations, the objective structured clinical examination (OSCE), standardized patient examinations, and clinical record reviews (14,18). In this study, we developed a self-survey and a scenario-based oral examination designed to assess competence in clinical performance in an environment resembling an actual situation. Furthermore, we applied both assessment tools to the Cameroonian healthcare providers, and the results of the two showed a moderate correlation. Using the two competence assessment tools, we expect to develop emergency medicine curricula that focus on the specific aspects of emergency medicine and the clinical skills and procedures in which the Cameroonian healthcare professionals showed poor competence.
This study has several limitations. First, only 14 selected questions among the 94 self-survey questions were used in the scenario-based test. Although the selected topics were chosen for their high frequency in Cameroon, not all questions in the self-survey were evaluated with the objective scenario-based assessment, which limits the validation of the self-survey. Second, because Korea and Cameroon have different medical environments, some questions may not have been considered important and may have been excluded during the expert consensus and discrimination tests in Korea. Supplementary assessments such as clinical behavior observation might help improve the validity of the competence evaluation tools. Third, there was a one-month gap between the administration of the self-survey and the scenario-based assessment; therefore, we could not determine whether there had been any individual efforts to improve competence in the interim.
In conclusion, we developed self-survey and objective scenario-based competence assessment tools and administered them to evaluate competence in emergency medicine among Cameroonian healthcare professionals. The results of the two evaluation tools showed a moderate correlation, and the study participants showed relatively low competence in specific aspects of emergency medicine and in clinical procedures and skills. The results of this assessment can be used to develop emergency medicine curricula tailored to the needs of Cameroonian healthcare professionals, and the assessment tools can serve as valuable resources for assessing competence and developing need-based emergency medicine curricula in developing countries, including those in Africa.

Notes

Funding This study was financially supported by the Korea International Cooperation Agency (KOICA).

DISCLOSURE The authors have no potential conflicts of interest to disclose. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Author Contributions

  • Conceptualization: Kim SC, Ro YS, Shin SD, Wi DH, Jeong J, Park JO.

  • Data curation: Kim SC, Ro YS, Sun KM, Bae K.

  • Formal analysis: Kim SC, Ro YS.

  • Funding acquisition: Shin SD, Jeong J.

  • Investigation: Kim SC, Sun KM, Bae K.

  • Resources: Sun KM, Bae K.

  • Supervision: Ro YS, Shin SD, Wi DH, Jeong J.

  • Writing - original draft: Kim SC, Ro YS.

  • Writing - review & editing: Shin SD, Jeong J.

References

1. Alagappan K, Holliman CJ. History of the development of international emergency medicine. Emerg Med Clin North Am. 2005; 23:1–10.
2. Ro YS, Shin SD, Jeong J, Kim MJ, Jung YH, Kamgno J, Alain EM, Hollong B. Evaluation of demands, usage and unmet needs for emergency care in Yaoundé, Cameroon: a cross-sectional study. BMJ Open. 2017; 7:e014573.
3. Terry B, Bisanzo M, McNamara M, Dreifuss B, Chamberlain S, Nelson SW, Tiemeier K, Waters T, Hammerstedt H. Task shifting: meeting the human resources needs for acute and emergency care in Africa. Afr J Emerg Med. 2012; 2:182–187.
4. Aufderheide TP, Nolan JP, Jacobs IG, van Belle G, Bobrow BJ, Marshall J, Finn J, Becker LB, Bottiger B, Cameron P, et al. Global health and emergency care: a resuscitation research agenda--part 1. Acad Emerg Med. 2013; 20:1289–1296.
5. Busse H, Azazh A, Teklu S, Tupesis JP, Woldetsadik A, Wubben RJ, Tefera G. Creating change through collaboration: a twinning partnership to strengthen emergency medicine at Addis Ababa University/Tikur Anbessa Specialized Hospital--a model for international medical education partnerships. Acad Emerg Med. 2013; 20:1310–1318.
6. Osei-Ampofo M, Oduro G, Oteng R, Zakariah A, Jacquet G, Donkor P. The evolution and current state of emergency care in Ghana. Afr J Emerg Med. 2013; 3:52–58.
7. Reynolds TA, Mfinanga JA, Sawe HR, Runyon MS, Mwafongo V. Emergency care capacity in Africa: a clinical and educational initiative in Tanzania. J Public Health Policy. 2012; 33:Suppl 1. S126–S137.
8. The World Bank. Health expenditure, total (% of GDP): Cameroon [Internet]. accessed on 27 September 2017. Available at https://data.worldbank.org/indicator/SH.XPD.TOTL.ZS.
9. Kingue S, Rosskam E, Bela AC, Adjidja A, Codjia L. Strengthening human resources for health through multisectoral approaches and leadership: the case of Cameroon. Bull World Health Organ. 2013; 91:864–867.
10. Kane MT. The assessment of professional competence. Eval Health Prof. 1992; 15:163–182.
11. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002; 287:226–235.
12. Jung HY, Kim JW, Lee S, Yoo SH, Jeon JH, Kim TW, Park JS, Jeong SY, Oh SJ, Kim EJ, et al. A study of core humanistic competency for developing humanism education for medical students. J Korean Med Sci. 2016; 31:829–835.
13. Hsia RY, Mbembati NA, Macfarlane S, Kruk ME. Access to emergency and surgical care in sub-Saharan Africa: the infrastructure gap. Health Policy Plan. 2012; 27:234–244.
14. International EM Core Curriculum and Education Committee for the International Federation for Emergency Medicine. International Federation for Emergency Medicine model curriculum for emergency medicine specialists. CJEM. 2011; 13:109–121.
15. Jansen JJ, Tan LH, van der Vleuten CP, van Luijk SJ, Rethans JJ, Grol RP. Assessment of competence in technical clinical skills of general practitioners. Med Educ. 1995; 29:247–253.
16. Thoureen TL, Scott SB. Emergency Medicine Simulation Workbook: a Tool for Bringing the Curriculum to Life. Chichester: Wiley-Blackwell;2013.
17. Howard Z, Siegelman J, Guterman E, Hayden EM, Gordon JA. Simulation casebook [Internet]. accessed on 27 September 2017. Available at http://mycourses.med.harvard.edu/ResUps/GILBERT/pdfs/HMS_7607.pdf.
18. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001; 357:945–949.
19. Sherbino J, Bandiera G, Frank JR. Assessing competence in emergency medicine trainees: an overview of effective methodologies. CJEM. 2008; 10:365–371.
20. Epstein RM. Assessment in medical education. N Engl J Med. 2007; 356:387–396.
21. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006; 296:1094–1102.
22. Kramer AW, Jansen JJ, Zuithoff P, Düsman H, Tan LH, Grol RP, van der Vleuten CP. Predictive validity of a written knowledge test of skills for an OSCE in postgraduate training for general practice. Med Educ. 2002; 36:812–819.
23. Ram P, van der Vleuten C, Rethans JJ, Schouten B, Hobma S, Grol R. Assessment in general practice: the predictive value of written-knowledge tests and a multiple-station examination for actual medical performance in daily practice. Med Educ. 1999; 33:197–203.
24. Remmen R, Scherpbier A, Denekens J, Derese A, Hermann I, Hoogenboom R, van Der Vleuten C, van Royen P, Bossaert L. Correlation of a written test of skills and a performance based test: a study in two traditional medical schools. Med Teach. 2001; 23:29–32.
25. Liaw SY, Scherpbier A, Rethans JJ, Klainin-Yobas P. Assessment for simulation learning outcomes: a comparison of knowledge and self-reported confidence with observed clinical performance. Nurse Educ Today. 2012; 32:e35–e39.

Supplementary Materials

Supplementary Table 1

Self-survey competency assessment tool and its results in Korean emergency residents and Cameroonian doctors

Supplementary Table 2

Scenario-based competency assessment tool of 26 sub-questions developed for 14 questions of the self-survey competency assessment
ORCID iDs

Sang Chul Kim
https://orcid.org/0000-0001-9377-4993

Young Sun Ro
https://orcid.org/0000-0003-3634-9573

Sang Do Shin
https://orcid.org/0000-0003-4953-2916

Dae Han Wi
https://orcid.org/0000-0002-5658-1137

Joongsik Jeong
https://orcid.org/0000-0003-1682-0161

Ju Ok Park
https://orcid.org/0000-0002-1024-3626

Kyong Min Sun
https://orcid.org/0000-0002-2977-0880

Kwangsoo Bae
https://orcid.org/0000-0001-5140-4601
