
Monroe, Evans, Mukkamala, Williamson, Jabaley, Mariano, and O’Reilly-Shah: Moving anesthesiology educational resources to the point of care: experience with a pediatric anesthesia mobile app

Abstract

Background

Educators in all disciplines recognize the need to update tools for the modern learner. Mobile applications (apps) may be useful, but real-time data are needed to demonstrate the patterns of utilization and engagement amongst learners.

Methods

We examined the use of an anesthesia app by two groups of learners (residents and anesthesiologist assistant students [AAs]) during a pediatric anesthesiology rotation. The app calculates age and weight-based information for clinical decision support and contains didactic materials for self-directed learning. The app transmitted detailed usage information to our research team.

Results

Over a 12-month period, 39 participants consented; 30 completed primary study procedures (18 residents, 12 AAs). AAs used the app more frequently than residents (P = 0.025) but spent less time in the app (P < 0.001). The median duration of app usage was 2.3 minutes. App usage decreased over the course of the rotation. ‘Succinylcholine’ was the most accessed drug, while ‘orientation’ was the most accessed teaching module. Ten participants (33%) believed that operating room staff and surgeons perceived their use of the app to be distracting.

Conclusions

Real-time in-app analytics helped elucidate the actual usage of this educational resource and will guide future decisions regarding development and educational content. Further research is required to determine learners’ preferred choice of device, user experience, and content in the full range of clinical and nonclinical purposes.

Introduction

The current generation of learners has matured in an ever-advancing technological world, with access to practically limitless digital information ‘at their fingertips’ [1]. Educators must now consider learners’ preferences for technological approaches when adapting to different learning styles [2,3]. As a large portion of learning occurs at the bedside, medicine exhibits an ever-present need for educational resources at the point of care. Readily accessible mobile devices and applications are commonly used for real-time clinical decision support [4,5]. Prior work has explored the use of mobile devices for the education of medical students, residents, and other health professions students. However, metrics regarding these devices and applications have generally been limited to post hoc or periodic surveys to assess usage and user satisfaction [6–13]. Thus, we lack prospectively gathered real-time data to demonstrate how these tools are actively used.
Using a mobile application (‘app’) that was custom built as an educational resource for pediatric anesthesia, we designed this study to describe and evaluate patterns of utilization in a mixed sample of learners (‘trainees’): anesthesiology residents and anesthesiologist assistant (AA) students. We used quantifiable data collected by an in-app analytics platform. Our primary goal was to test the null hypothesis that app usage would not differ between these groups of trainees.

Materials and Methods

This study was approved by Emory University’s Institutional Review Board (IRB#00083784), including a waiver of written informed consent, because the study was anonymous and minimal risk and a signed consent form would have been the only identifiable information collected. Participants consented electronically and anonymously via the app; trainees could use the tablet and app regardless of their status as a participant. Our reporting protocol adheres to applicable Enhancing the Quality and Transparency Of Health Research (EQUATOR) guidelines for a cohort study.

App development

We customized a free Android-based anesthesia calculator app for this work. The app was written in Java (Oracle, USA), using the Android software development kit (Google, USA) and the Eclipse integrated development environment (Eclipse Foundation, Canada); details of its development have been described previously [14]. The app accepts patient age and weight as inputs; it then calculates appropriate airway equipment sizes, normal ranges for physiological parameters, and weight-based drug doses. To customize the app, we included didactic information that we considered to be foundational knowledge for a pediatric anesthesiology rotation, including orientation materials, protocols for certain procedures (e.g., management of spine fusions), and lecture materials provided by faculty at our institution. The app was loaded onto Acer Iconia tablet computers (Acer, Taiwan) for distribution to trainees. Screenshots are provided in Fig. 1.
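The app’s source code is not reproduced in this report, but the calculation performed on each age/weight entry is, at its core, a per-kilogram lookup. The following is a minimal sketch of such a weight-based dose calculation in Java (the language the app was written in), assuming a hypothetical drug-to-mg/kg table; the class name, drug entries, and dose values are illustrative only and are neither the app’s actual code nor clinical guidance.

```java
import java.util.HashMap;
import java.util.Map;

public class DoseCalculator {
    // Hypothetical per-kilogram doses (mg/kg), keyed by drug name.
    // Values are placeholders for illustration, not clinical guidance.
    private static final Map<String, Double> MG_PER_KG = new HashMap<>();
    static {
        MG_PER_KG.put("succinylcholine", 2.0);
        MG_PER_KG.put("ondansetron", 0.1);
    }

    /** Returns the weight-based dose in mg, or throws if the drug is unknown. */
    public static double doseMg(String drug, double weightKg) {
        Double perKg = MG_PER_KG.get(drug);
        if (perKg == null) {
            throw new IllegalArgumentException("Unknown drug: " + drug);
        }
        return perKg * weightKg;
    }

    public static void main(String[] args) {
        // A 20 kg patient at 2.0 mg/kg -> 40.0 mg.
        System.out.printf("succinylcholine: %.1f mg%n", doseMg("succinylcholine", 20.0));
    }
}
```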
The app was fitted with the Survalytics module, which allows customized, real-time, cloud-based collection of survey data and information regarding app use [15]. Information is transmitted to and from the app, utilizing cloud services provided by Amazon Web Services (Amazon, USA). The platform stores each ‘event’ (e.g., consent, survey response, in-app click, app closure) to an on-device database. When internet connectivity is detected, the information is uploaded from the app to a cloud database. Each packet contains relevant details of the event (e.g., what was clicked), as well as a generic set of information, including the event time and an anonymous globally unique identifier generated when the app is first started. The identifier allows for subsequent collation of events by device; together with the timestamps, the sequence of events for each mobile device can be reconstructed from the cloud database. See Online Resource 1 and the publication describing Survalytics for further details [15].
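Survalytics itself is open source [15]; the sketch below illustrates only the general store-and-forward pattern described above (buffer each timestamped, GUID-tagged event locally, then upload when connectivity is detected), not the module’s actual API. All class and method names here are hypothetical.

```java
import java.time.Instant;
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.UUID;

public class EventLogger {

    /** Stand-in for the cloud endpoint (the study used Amazon Web Services). */
    public interface CloudClient {
        void upload(String json);
    }

    // Generated once on first app start and attached to every event, allowing
    // anonymous collation of one device's events without identifying the user.
    private final String deviceGuid = UUID.randomUUID().toString();

    // Stand-in for the on-device database that buffers events until upload.
    private final Queue<String> pending = new ArrayDeque<>();

    /** Records one event (e.g., consent, survey response, in-app click, app closure). */
    public void log(String eventType, String detail) {
        pending.add(String.format(
                "{\"guid\":\"%s\",\"time\":\"%s\",\"type\":\"%s\",\"detail\":\"%s\"}",
                deviceGuid, Instant.now(), eventType, detail));
    }

    /** Called when internet connectivity is detected: uploads and drains the buffer. */
    public void flush(CloudClient cloud) {
        while (!pending.isEmpty()) {
            cloud.upload(pending.poll());
        }
    }
}
```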
Pre- and post-rotation surveys were developed to assess trainee demographics and attitudes. A 25-question multiple-choice didactic test was also developed to assess knowledge at the start and end of the rotation; it was scored by summing the number of correct responses, for a maximum possible score of 25. Surveys and the didactic test are available in Online Resource 1 (Supplementary materials), Tables S1–S3.

Participants

Anesthesiology residents and AA students rotating at Children’s Healthcare of Atlanta (Egleston campus) were invited to participate. Trainees were provided study information via email prior to beginning their rotations. No randomization was performed. Inclusion criteria were: rotating anesthesiology residents and AA students. Exclusion criteria were: any prior pediatric anesthesiology rotation experience; inability or unwillingness to use the tablet. All trainees were permitted to use the tablet without participating. After receiving the tablets, trainees who opted in completed the knowledge test and pre-rotation survey. Trainees participated for the duration of their rotation, a one- or two-month block. Near the end of the rotation, participants were asked to complete the post-test and survey.

Outcomes

The primary outcome was the frequency of app usage, calculated using the methodology described in Online Resource 1 (Supplementary materials). This methodology was used instead of a simpler calculation to compensate for the fact that the date of discontinuation of app use was not known (a consequence of the anonymous recruitment protocol). Secondary outcomes included data related to patterns of usage (e.g., time of day, content accessed), participants’ performance on post-tests, and aggregated survey results. Usage of the tablet outside the app could not be tracked.
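The exact estimator is specified in Online Resource 1 and is not reproduced here. Purely for exposition, one naive approach (an assumption of ours, not the authors’ formula) is to divide the number of observed sessions by the span between the first and last recorded events, which sidesteps the unknown discontinuation date by restricting the denominator to the observed window:

```java
import java.time.Duration;
import java.time.Instant;
import java.util.List;

public class UsageFrequency {

    /**
     * Illustrative only: sessions per day over the observed window.
     * Assumes session-start timestamps are sorted in ascending order.
     * This is NOT the estimator from Online Resource 1.
     */
    public static double usesPerDay(List<Instant> sessionStarts) {
        if (sessionStarts.size() < 2) {
            return sessionStarts.size(); // 0 or 1 sessions: no meaningful window
        }
        Instant first = sessionStarts.get(0);
        Instant last = sessionStarts.get(sessionStarts.size() - 1);
        // Floor the window at one day so a burst of same-day sessions
        // is not divided by a near-zero span.
        double days = Math.max(1.0, Duration.between(first, last).toHours() / 24.0);
        return sessionStarts.size() / days;
    }
}
```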

Statistical analysis

Data analysis was conducted in R v. 3.3.2 [16]. Differences in age were compared with a t-test; differences in gender and ‘comfort with technology’ survey responses were analyzed using Fisher’s exact test. The Mann-Whitney U test was used to compare the frequency of app use between trainee roles (residents vs. AA students), as well as the total time of app use for each trainee role. Matched-pair data regarding pre- and post-test outcomes were compared with the Wilcoxon signed-rank test.
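The study’s analysis was run in R [16]. As a self-contained illustration in the same language as the app, the equivalent nonparametric comparisons can be sketched with the Apache Commons Math library (org.apache.commons:commons-math3); the data below are invented placeholders, not the study’s measurements.

```java
import org.apache.commons.math3.stat.inference.MannWhitneyUTest;
import org.apache.commons.math3.stat.inference.WilcoxonSignedRankTest;

public class UsageStats {
    public static void main(String[] args) {
        // Hypothetical uses-per-day for each trainee, by role (placeholder data).
        double[] residents  = {0.3, 0.4, 0.4, 0.5, 0.7};
        double[] aaStudents = {0.9, 1.0, 1.0, 1.4, 1.8};

        // Unpaired comparison of app-use frequency between roles.
        double pFrequency = new MannWhitneyUTest().mannWhitneyUTest(residents, aaStudents);
        System.out.printf("Mann-Whitney U (frequency): P = %.3f%n", pFrequency);

        // Paired pre-/post-rotation knowledge scores for the same participants.
        double[] preTest  = {13, 15, 14, 17, 12};
        double[] postTest = {15, 16, 17, 18, 14};
        double pScores = new WilcoxonSignedRankTest()
                .wilcoxonSignedRankTest(preTest, postTest, true); // exact P for small n
        System.out.printf("Wilcoxon signed-rank (scores): P = %.3f%n", pScores);
    }
}
```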

Results

The participant flow is shown in Fig. 2. Notably, a substantial number of participants (16/30, 53%) did not complete the post-rotation survey. Basic demographics of the 30 participants who completed the initial survey, as well as their attitudes about technology and apps, are shown in Tables 1 and 2.

Primary outcome

The median app usage frequency for AA students was 1.0 times per day (interquartile range, 0.9–1.8 times per day). This was significantly more frequent than the usage exhibited by residents (P = 0.025), whose median was 0.4 times per day (interquartile range, 0.3–0.7 times per day).
Conversely, the total amount of time spent in the app was significantly longer amongst residents than amongst AA students (P < 0.001). The median in-app time for residents was 3.4 minutes (interquartile range, 0.6–12.1 min). The median in-app time for AA students was 1.4 minutes (interquartile range, 0.2–4.3 min). Total aggregated in-app time per trainee was 112 minutes amongst AA students and 236 minutes amongst residents.
Fig. 3 demonstrates the decline in the use of the app over the course of the rotation, both overall and by trainee type. After the first week of the rotation, app usage stabilized at approximately 15 uses per day for the entire cohort. Many of the devices were returned with their batteries completely discharged, although specific data regarding this were not recorded.

Secondary outcomes

Participants’ scores on the didactic post-test improved, but the improvement was not statistically significant (P = 0.104). Excluding the participant with a post-test score of 5 (manual review showed that this participant did not complete the post-test), the mean score for residents (n = 6) increased from 16.8 to 18.8, whereas the mean score for AA students (n = 6) increased from 11.3 to 13.6. The overall mean score increased from 14.1 to 16.3, and participants demonstrated an overall improvement of 2.2 ± 3.3 (mean ± SD).
We used alteration of age and weight in the app by the user as a proxy for patient care/clinical decision support use. As shown in Fig. 4D, the performance of these calculations peaked in the morning.
In-app clicks on drugs provided us with information regarding areas where trainees felt they needed additional information. These clicks were primarily focused on drugs that are commonly used in anesthesia, such as succinylcholine and ondansetron (Table 3). We also collected information regarding which didactics were accessed and when. Lectures tended to be accessed more frequently in the 12 pm–2 pm hours and in the 4 pm–5 pm hour (Fig. 4C). Didactic materials that were accessed most often (Table 4) included orientation materials and information regarding core topics (e.g., ‘Preoperative Evaluation of the Pediatric Patient’).
We assessed general attitudes towards various educational modalities for instruction in the field of anesthesia (Table S4, top). We also assessed whether participants felt that their use of the app was viewed as distracting by staff or surgeons (Table S4, bottom).

Discussion

We found that, while on clinical rotation, AA students used a mobile app providing both clinical decision support and educational resources customized for pediatric anesthesia more frequently than anesthesiology residents did. In both groups, there was notable attrition in app usage by the end of the rotation. On a daily basis, use of the app occurred primarily at the beginning of the workday (when participants were likely preparing for the day’s cases); the most frequently accessed educational resources were related to clinical care (e.g., protocols) and core didactics (e.g., preoperative evaluation).
Detailed analytics related to the use of educational apps have not been previously described. Where usage data have been collected, the collection has typically occurred through self-reports or other forms of retrospective survey [5,10]. Therefore, the phenomenon of attrition has not been reported, nor have details regarding the specific timing or content accessed. Consistent with a recent review of the use of mobile devices by health professions students, we found that participants used our app both for clinical decision support and to support self-directed learning [11]. Our survey of attitudes towards various educational modalities also supported previous findings by Ellaway and colleagues; specifically, even with the increased ease of access at the point of care, the use of apps on mobile devices may augment conventional learning approaches in medical education, but will likely not replace them [10,17].
AAs are a group of anesthesia providers unique to the United States; we have included in Table 5 a summary of the educational requirements, training experience, and healthcare system roles filled by AAs, certified registered nurse anesthetists, and physician anesthesiologists in the United States. This context matters because the difference in the rate of app use between residents and AA students may have several potential explanations. First, at the time of the rotation, AA trainees had objectively less medical training than their resident counterparts, which could lead to greater use of adjuncts for decision support in a new environment such as a pediatric anesthesia rotation. Second, the group of AA students was younger (P = 0.075), which may also partly explain greater app use in this group; however, the two groups reported similar levels of comfort with mobile technology (Table 2, P = 0.613), which argues against this explanation.
The contrast between the frequency of app use (higher amongst AA students) and the total time spent in the app (longer amongst residents) was also an interesting observation. The duration of time spent in the app, on the order of minutes, is consistent with known app-use patterns [18]. The net result was that residents exhibited nearly double the exposure to the app compared with AA students. We speculate that residents tended to use the app less frequently but for deeper investigation into specific topics; however, more work is needed to truly understand these differences.
An overall decline in app use after the first week of the rotation was not completely unexpected, and our finding that many of the devices were returned with discharged batteries was consistent with this decline. The decline may reflect the accumulation of knowledge by the trainee during the rotation, with a decreasing need to access the app for reference. Alternatively, it may reflect participant preference for accessing resources on personal devices rather than tracking and charging a separate device. Learners’ preference for personal devices has been reported by others; further, the use of technological devices in the workplace is influenced by ease of use, speed of access, and reliability [10].
Although the majority of app use occurred during the day, participants also used the app late in the evening and overnight, suggesting a role for the app in providing just-in-time support for on-call and emergency cases. Our data also suggest that participants utilized nonclinical downtime to access didactic materials. These resources were primarily accessed prior to the start of the first case, at midday (e.g., during the lunch break), and in the early evening (after scheduled cases were complete). Survey results suggest that trainees are not completely beholden to digital learning: participants provided a higher rating to a variety of other educational resources, including traditional lectures and intraoperative teaching.
Given the importance of vigilance to the safe practice of anesthesia, we assessed the perceptions of tablet usage as a distraction in the operating room. It is concerning that any trainee, let alone one-third of our participants, believed that operating room staff and surgeons considered their use of the device to be distracting. Although recent literature supports the concept that the use of mobile devices tends to occur during times of low cognitive load (e.g., during the maintenance phase of the anesthetic) [19], it is important to note that the use of these devices may adversely affect the perception of trainees by our perioperative colleagues.
This study has several limitations. A significant limitation that we encountered was the low survey response rate, which may have led to sampling bias. This may have been a result of several factors. From a process perspective, participants were reminded within the app to complete the post-rotation survey one week prior to completing their rotation. However, completion of the survey was voluntary and, to preserve participant anonymity, we included no mechanism to identify who had completed their surveys; unfortunately, this limited our ability to follow up. This voluntary aspect of the survey, likely combined with the aforementioned decline in usage of the app, may have resulted in the observed attrition. We plan to investigate whether post-rotation survey participation improves if the app is available on a participant’s own device, rather than on the tablet computer, or if survey completion can be converted into a mandatory function.
There are several other noteworthy study limitations. First and most significantly, the use of the mobile app was added to the existing educational curriculum for the pediatric anesthesiology rotation at our institution. As we did not collect pre-rotation and post-rotation didactic test information from a control group (non-users), we cannot draw conclusions regarding the effectiveness of the app beyond our within-group comparisons. Given the complexities inherent in analyzing educational interventions and learning preferences, however, the inclusion of a control group would not necessarily have provided widely generalizable data [12,20]. Second, the study involved a relatively small number of participants, which could represent another source of sampling bias. Third, the results related to patterns of use are only applicable to the app studied and should not be extrapolated to other medical education apps.
In conclusion, this study employed quantifiable data provided by in-app analytics to characterize the usage patterns of a pediatric anesthesia mobile app; notably, it showed a greater frequency of use by AA students compared with anesthesiology residents. Further research is needed to determine trainees’ preferred choice of device, user experience, and content for the full range of clinical and nonclinical purposes. More work is also warranted to establish whether the use of mobile apps in the operating room is distracting and presents risks to patient care.

ACKNOWLEDGMENTS

The authors would like to thank Dr. Bruce Miller and Children’s Healthcare of Atlanta for support of this project, including the purchase of the tablet computers used in this study.

Notes

Edward R. Mariano has received unrestricted educational program funding (past) paid to his institution from Halyard Health (Alpharetta, GA, USA). This company has no connection to the presented work and had absolutely no input into any aspect of the present study’s conceptualization, design, and implementation; data collection, analysis, and interpretation; or manuscript preparation.

Katherine S. Monroe, Michael A. Evans, Shivani G. Mukkamala, Julie L. Williamson, Craig S. Jabaley, and Vikas N. O’Reilly-Shah have no conflicts of interest.

Supplementary Materials

Further details are presented in the online version of this article (available from https://doi.org/10.4097/kja.d.18.00014).

References

1. Vasilopoulos T, Chau DF, Bensalem-Owen M, Cibula JE, Fahy BG. Prior podcast experience moderates improvement in electroencephalography evaluation after educational podcast module. Anesth Analg. 2015; 121:791–7.
2. Chu LF, Erlendson MJ, Sun JS, Clemenson AM, Martin P, Eng RL. Information technology and its role in anaesthesia training and continuing medical education. Best Pract Res Clin Anaesthesiol. 2012; 26:33–53.
3. Bahner DP, Adkins E, Patel N, Donley C, Nagel R, Kman NE. How we use social media to supplement a novel curriculum in medical education. Med Teach. 2012; 34:439–44.
4. Boruff JT, Storie D. Mobile devices in medicine: a survey of how medical students, residents, and faculty use smartphones and other mobile devices to find information. J Med Libr Assoc. 2014; 102:22–30.
5. Nuss MA, Hill JR, Cervero RM, Gaines JK, Middendorf BF. Real-time use of the iPad by third-year medical students for clinical decision support and learning: a mixed methods study. J Community Hosp Intern Med Perspect. 2014; 4.
6. Patel BK, Chapman CG, Luo N, Woodruff JN, Arora VM. Impact of mobile tablet computers on internal medicine resident efficiency. Arch Intern Med. 2012; 172:436–8.
7. Luo N, Chapman CG, Patel BK, Woodruff JN, Arora VM. Expectations of iPad use in an internal medicine residency program: is it worth the "hype"? J Med Internet Res. 2013; 15:e88.
8. Robinson RL, Burk MS. Tablet computer use by medical students in the United States. J Med Syst. 2013; 37:9959.
9. Soma DB, Homme JH, Jacobson RM. Using tablet computers to teach evidence-based medicine to pediatrics residents: a prospective study. Acad Pediatr. 2013; 13:546–50.
10. Ellaway RH, Fink P, Graves L, Campbell A. Left to their own devices: medical learners' use of mobile technologies. Med Teach. 2014; 36:130–8.
11. Mi M, Wu W, Qiu M, Zhang Y, Wu L, Li J. Use of mobile devices to access resources among health professions students: a systematic review. Med Ref Serv Q. 2016; 35:64–82.
12. Frye AW, Hemmer PA. Program evaluation models and related theories: AMEE guide no. 67. Med Teach. 2012; 34:e288–99.
13. Hardyman W, Bullock A, Brown A, Carter-Ingram S, Stacey M. Mobile technology supporting trainee doctors' workplace learning and patient care: an evaluation. BMC Med Educ. 2013; 13:6.
14. O'Reilly-Shah V, Easton G, Gillespie S. Assessing the global reach and value of a provider-facing healthcare app using large-scale analytics. BMJ Glob Health. 2017; 2:e000299.
15. O'Reilly-Shah V, Mackey S. Survalytics: an open-source cloud-integrated experience sampling, survey, and analytics and metadata collection module for Android operating system apps. JMIR Mhealth Uhealth. 2016; 4:e46.
16. R Core Team. R: A language and environment for statistical computing [Internet]. Vienna: R Foundation for Statistical Computing; 2016. Available from https://www.R-project.org.
17. Rothman BS, Gupta RK, McEvoy MD. Mobile technology in the perioperative arena: rapid evolution and future disruption. Anesth Analg. 2017; 124:807–18.
18. Average mobile app session length as of 4th quarter 2015, by category (in minutes) [Internet]. New York (NY): Statista; [updated 2016 Mar; cited 2018 Feb]. Available from https://www.statista.com/statistics/202485/average-ipad-app-sessionlength-by-app-categories.
19. Slagle JM, Porterfield ES, Lorinc AN, Afshartous D, Shotwell MS, Weinger MB. Prevalence of potentially distracting noncare activities and their effects on vigilance, workload, and nonroutine events during anesthesia care. Anesthesiology. 2018; 128:44–54.
20. Kirkpatrick JD, Kirkpatrick WK. Kirkpatrick’s Four Levels of Training Evaluation. Alexandria: Association for Talent Development; 2016. p. 152.

Fig. 1.
Screenshots of the customized anesthesia calculator app, loaded with didactic information.
kja-d-18-00014f1.tif
Fig. 2.
Participant flow diagram. The attrition of participants at various stages of the study process is delineated.
kja-d-18-00014f2.tif
Fig. 3.
Normalized number of app uses per study day, per number of trainees in each category. Typically, the first study day coincided with the first day of the trainee’s pediatric anesthesia rotation. The data demonstrated a very high rate of app usage during the first week of the rotation, which decreased over time to reach a low steady-state rate of app usage by the last week of the rotation.
kja-d-18-00014f3.tif
Fig. 4.
Time of day when the content was accessed: (A) All data, (B) survey completion, (C) accessing didactic lectures, (D) performing a calculation for a new patient (new age/weight combination). Trainees tended to make age and weight entries early in the day, as well as sporadically throughout the day, whereas they tended to access lectures more frequently near lunchtime and in the early evening.
kja-d-18-00014f4.tif
Table 1.
Basic Demographics and Use of Apps
Residents (n = 18) AAs (n = 12) P value
Age 0.075
 Mean ± SD 29.7 ± 1.8 27.2 ± 4.3
 Range 28–34 22–36
Gender 1.000
 Female 8 (44.4%) 6 (50.0%)
 Male 9 (50.0%) 5 (41.7%)
 Prefer not to answer 1 (5.6%) 1 (8.3%)
Uses anesthesiology-specific apps 1.000
 Yes 8 (44.4%) 5 (41.7%)
 No 10 (55.6%) 7 (58.3%)

Data are presented as number of participants (%) or mean ± SD. AAs: anesthesiologist assistant students.

Table 2.
Basic Attitudes towards Technology and Apps
Residents AAs P value
Early adopter of new technology 0.439
 Strongly disagree 1 (5.6%) 1 (8.3%)
 Disagree 3 (16.7%) 3 (25.0%)
 Neutral 6 (33.3%) 2 (16.7%)
 Agree 5 (27.8%) 6 (50.0%)
 Strongly agree 3 (16.7%) 0 (0.0%)
Not comfortable using mobile apps/smartphones 0.613
 Strongly disagree 12 (66.7%) 7 (58.3%)
 Disagree 3 (16.7%) 3 (25.0%)
 Neutral 0 (0.0%) 1 (8.3%)
 Agree 3 (16.7%) 1 (8.3%)
Medical apps are useful 0.719
 Disagree (Any) 4 (22.2%) 2 (16.7%)
 Neutral 3 (16.7%) 4 (33.3%)
 Agree 8 (44.4%) 3 (25.0%)
 Strongly agree 3 (16.7%) 3 (25.0%)

Data are presented as number of participants (%). AAs: anesthesiologist assistant students.

Table 3.
Top 10 in-App Clicks on Various Drugs within the List in the App
Drug name Click frequency
Succinylcholine 13
Ondansetron 9
Dexmedetomidine 7
Glycopyrrolate 7
Epinephrine 5
Atropine 4
Ketorolac 4
Neostigmine 4
Rocuronium 4
Bupivacaine 3
Table 4.
In-App Lectures and Orientation Materials That Were Accessed by Trainees Most Frequently
Name of lecture Number of times accessed
Orientation: Rotation expectations and protocols 36
20 questions: Instrument for guided pediatric anesthesia learning 34
Preoperative evaluation of the pediatric patient 33
Egleston pediatric anesthesia education manual 32
Airway lecture 25
Pediatric obstructive sleep apnea 23
Anesthesia for the ex-premature infant 22
Asthma and reactive airway disease 21
Pediatric trauma 19
Malignant hyperthermia 18
Orientation: Maps, forms, electronic medical record tips 18
Gastroschisis and omphalocele 17
Pacemakers 16
Pediatric pain 16
Faculty phone numbers and portraits 15
Muscle disorders 13
Temperature regulation 12
Intraoperative management of children with congenital heart diseases 12
Table 5.
Basic Description of Education, Training, and Role of Anesthesia Providers in the USA
Physician anesthesiologist (MD, DO, MB ChB)
 Eligibility to apply (typical): 4-year undergraduate degree; 4-year medical school
 Educational period: 1-year internship plus 3 years of anesthesiology
 Healthcare system role (authors’ institutions): Anesthesia care team supervisor or IP
 Healthcare system role (USA-at-large): Anesthesia care team supervisor or IP

Certified registered nurse anesthetist (CRNA) (Master’s in Nursing, Doctor of Nursing Practice)
 Eligibility to apply (typical): Bachelor of Science in Nursing; one year of critical care/ICU experience
 Educational period: 24–36 months
 Healthcare system role (authors’ institutions): Anesthesia care team APP
 Healthcare system role (USA-at-large): Anesthesia care team APP or IP (varies state-by-state and frequently changing)

Certified anesthesiologist assistant (CAA) (MSc)
 Eligibility to apply (typical): 4-year undergraduate degree (specified prerequisite courses required)
 Educational period: 24–28 months
 Healthcare system role (authors’ institutions): Anesthesia care team APP
 Healthcare system role (USA-at-large): Anesthesia care team APP (no pathway to independent practice)

APP: advanced practice provider, IP: independent practitioner.
