Abstract
Objectives
To provide accurate personalized medical care, it is necessary to gather individual-related data or contextual information regarding the target person. Nowadays, a large number of people carry smartphones, which makes it possible to use the embedded sensors for lifelogging. The objective of this study was to analyze human activity patterns using a lifelog agent that cooperates with the Health Avatar platform.
Methods
Using lifelog data measured by the accelerometer and gyroscope of a smartphone at a 50-Hz rate, the agent determines how long the user walks, runs, sits, stands, and lies down, and this information is summarized by the hour. The summaries are sent to the Health Avatar platform and are finally written in the Continuity of Care Record (CCR) format.
Results
The lifelog agent operates successfully with the Health Avatar platform. In addition, we implemented an application that displays the user's activity patterns in a graph and calculates metabolic equivalent of task (MET)-based calories burned by hour or by day using the lifelog in CCR form, to show that the lifelog can be used as a medical record.
Personalized diagnostics are important in modern medicine. To make diagnoses personal, it is essential to obtain individual-related data or contextual information regarding the person; such information, considered together, is called a lifelog. In particular, records of the everyday activities performed by people are very valuable resources for identifying their health status. Such records can be used for chronic disease management [1,2], rehabilitation systems [3], disease prevention [4,5], and as personal indicators of health status [6]. However, it is very difficult to monitor users' activities. A naïve approach is to have users record every activity by themselves or with assistance, but this is essentially infeasible due to limitations of time and resources.
Healthcare systems for various purposes have exploited lifelog data measured by sensors. Fujita et al. [7] presented a human activity monitoring system that identifies human activities by lifelogging every day with four microelectromechanical systems (MEMS) sensors. Shahriyar et al. [8] presented the Intelligent Mobile Health Monitoring System (IMHMS), which considers the biomedical and environmental data of patients as lifelogs and sends feedback via mobile devices. Mena et al. [9] built a mobile personal health application for ambulatory blood pressure monitoring, called ARV-mobile. Blood pressure and heart rate signals, measured by ambulatory blood pressure sensors, are treated as lifelogs and sent to mobile devices to detect any unexpected status. These systems, however, require an additional device in which the sensors are embedded, which may be uncomfortable or inconvenient for users.
Recently, smartphones have become so pervasive and popular that a large number of people carry these devices almost all the time. Since the latest smartphones contain a number of useful sensors, such as accelerometers, gyroscopes, and global positioning system (GPS) receivers, the signals detected by these sensors can be exploited as lifelog data that measure the activities of the user and capture contextual information about the user. Lifelog data, however, are raw and extremely large in size; thus, it is difficult to use them directly as medical records. Hence, a system is needed to interpret and summarize lifelog data so that the summaries become useful as medical records.
In addition, cooperation with medical institutes is necessary to make lifelog data practically useful for medical or healthcare purposes. As a starting point, we implemented our system based on the Health Avatar platform [10], which is a personalized healthcare service platform that connects users and service providers in a simple and safe way. In this platform, lifelog summaries are written as a type of FunctionalStatus in the Continuity of Care Record (CCR) format [11] and can be used by other service providers as needed. Therefore, we believe that use of the Health Avatar platform will enable lifelog data to be used by a number of healthcare-related service providers.
The Health Avatar platform [10] is a personalized healthcare service platform that provides an interaction channel between users and service providers. The platform handles easy-to-use, on-demand (or match-making), and secure communication processes. It comprises three components: the health agent, the health avatar, and the broker. The health agent is a service-provider application that tracks, summarizes, and analyzes the user's personal data, such as medical records, genomic sequences, and lifelog data. The health avatar is a user-side application that presents useful information supplied by health agents. Note that the health avatar keeps all medical and genomic data related to the user and sends them when an authorized health agent requests them. The broker, positioned between the health avatar and the health agent, supervises the registration of health avatars and health agents, the establishment of interaction channels between them, and all data that are sent or received. Figure 1 shows the architecture of the Health Avatar platform.
For agent developers, the Health Avatar platform provides application programming interfaces (APIs), such as APIs to read and write CCR records. Health agents can use these APIs to load the user's data and to send back summaries or analysis results. Our lifelog agent also exploits these APIs to cooperate with the Health Avatar platform. The lifelog agent, however, differs from other agents in an important aspect: since the Health Avatar platform does not keep raw lifelog data, the agent must generate lifelog data on the user side. In general, lifelog data are too large; therefore, it is necessary to reduce their size by summarization. The lifelog agent then uses the summaries to provide services.
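The sketch below illustrates how a health agent might call such read and write APIs over HTTP; the endpoint URL, function names, and token handling are hypothetical placeholders, since the actual Health Avatar API is not reproduced here.

# Minimal sketch of how a health agent might call read/write CCR APIs.
# The URL, paths, and token handling are hypothetical placeholders,
# not the actual Health Avatar platform API.
import requests

BROKER_URL = "https://broker.example.org/api"  # hypothetical endpoint

def read_ccr(avatar_id: str, token: str) -> str:
    """Fetch the avatar's CCR document (hypothetical endpoint)."""
    resp = requests.get(
        f"{BROKER_URL}/avatars/{avatar_id}/ccr",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.text  # CCR XML as a string

def write_ccr(avatar_id: str, token: str, ccr_fragment: str) -> None:
    """Send a summary (e.g., a FunctionalStatus fragment) back to the avatar."""
    resp = requests.post(
        f"{BROKER_URL}/avatars/{avatar_id}/ccr",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/xml"},
        data=ccr_fragment.encode("utf-8"),
        timeout=10,
    )
    resp.raise_for_status()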
Figure 2 illustrates the interaction between our lifelog agent and the Health Avatar platform. The lifelog agent comprises two components: the lifelogging application and the lifelog server. When the lifelogging application is executed, both the accelerometer and the gyroscope in the smartphone are activated, and the application begins recording the sensor values at a 50-Hz rate. In an ideal situation, the number of lifelog records in a day exceeds 4 million, and the size of the records exceeds 300 MB. Although smartphones are capable devices, it is still difficult to keep and interpret such a large amount of data on them. Hence, in our implementation, lifelog records are transferred to the lifelog server, which has the capability to keep private information secure and to interpret an enormous number of records.
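As a rough check of these figures, the arithmetic is reproduced in the short sketch below; the per-record size of about 70 bytes is an assumption made for illustration, since the storage format is not specified.

# Back-of-the-envelope estimate of daily lifelog volume at a 50-Hz rate.
SAMPLING_RATE_HZ = 50
SECONDS_PER_DAY = 24 * 60 * 60                          # 86,400 s
records_per_day = SAMPLING_RATE_HZ * SECONDS_PER_DAY    # 4,320,000 records

# One record: timestamp + 3 accelerometer axes + 3 gyroscope axes.
# ~70 bytes per record is an assumption about the storage format.
BYTES_PER_RECORD = 70
megabytes_per_day = records_per_day * BYTES_PER_RECORD / (1024 ** 2)

print(records_per_day)           # 4,320,000 -> "more than 4 million"
print(round(megabytes_per_day))  # ~288 MB   -> on the order of 300 MB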
During the conveyance of such a large amount of lifelog data, it is very important to keep the data secure, because they contain very sensitive and private information. For security, we adopt the safe and reliable protocol of the personal lifelog upstreaming system (PLUS) [12]. The protocol consists of four phases: security establishment, user authentication, preparation, and upstreaming. In the security establishment phase, PLUS first establishes a safe communication channel for secure message exchange between the smartphone and the server. The two parties share the same encryption key for exchanging confidential messages or data with the aid of well-established algorithms, such as RSA [13] or AES [14]. The user authentication phase validates the user who holds the smartphone; in this case, the avatar ID serves as the user ID, and the server checks whether the avatar ID is valid. In the next phase, the preparation phase, some necessary actions are performed before the lifelog data are transferred: the server checks a list of logs written in previous transmissions for the recovery and preparation processes, and the smartphone checks its own lifelog database to determine which data should be sent. Finally, in the upstreaming phase, the smartphone encrypts each piece of lifelog data to be transferred to the server.
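The sketch below illustrates the hybrid RSA/AES pattern behind the security establishment and upstreaming phases, using the Python cryptography package; it is a minimal sketch of the idea rather than the actual PLUS implementation, and the key sizes, cipher mode (AES-GCM), and message framing are assumptions.

# Illustration of the hybrid RSA/AES pattern used in PLUS-style upstreaming:
# an AES session key is shared under RSA, and each lifelog chunk is encrypted
# with AES before upload. Key sizes, modes, and framing are assumptions.
import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# --- security establishment (assumed): server RSA key pair, client AES key ---
server_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
server_public = server_private.public_key()

aes_key = AESGCM.generate_key(bit_length=256)       # session key on the phone
wrapped_key = server_public.encrypt(                # sent to the server
    aes_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)

# --- upstreaming (assumed framing): encrypt one lifelog chunk with AES-GCM ---
chunk = b"20130805,070001,0.12,-0.98,9.81,0.01,0.00,-0.02"  # example record
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, chunk, associated_data=None)

# --- server side: unwrap the AES key and decrypt the chunk ---
recovered_key = server_private.decrypt(
    wrapped_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)
assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == chunk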
After the lifelog transmission, the second component, the lifelog server, stores the lifelog data in a database in a regularized form. At a scheduled time, the server begins to identify the user's activities from the collected lifelog. To recognize the user's activity, we first compute a set of feature vectors for each day. Each feature vector is composed of 24 distinct features, which are generated from a fixed-width window of 64 consecutive sensor values with a 50% overlap between successive windows. For each window, 12 features are generated by computing the averages and standard deviations with respect to each axis. The other 12 features come from the frequency domain after a fast Fourier transform [15]; they are likewise generated by computing the averages and standard deviations in the frequency domain. After feature extraction, each feature vector is treated as a basic unit of activity recognition.
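A minimal sketch of the windowing and feature computation is given below, assuming the raw signal is an (N, 6) array of three accelerometer and three gyroscope axes; the normalization and FFT conventions of the actual implementation may differ.

import numpy as np

def extract_features(signal: np.ndarray,
                     window: int = 64,
                     overlap: float = 0.5) -> np.ndarray:
    """Turn an (N, 6) sensor stream (3 accel + 3 gyro axes) into 24-D feature
    vectors: mean and std per axis in the time domain (12 features) plus mean
    and std per axis of the FFT magnitude (12 features)."""
    step = int(window * (1 - overlap))          # 32 samples -> 50% overlap
    features = []
    for start in range(0, len(signal) - window + 1, step):
        w = signal[start:start + window]        # (64, 6) window
        time_feats = np.concatenate([w.mean(axis=0), w.std(axis=0)])
        spectrum = np.abs(np.fft.fft(w, axis=0))          # (64, 6) magnitudes
        freq_feats = np.concatenate([spectrum.mean(axis=0),
                                     spectrum.std(axis=0)])
        features.append(np.concatenate([time_feats, freq_feats]))  # 24 values
    return np.array(features)

# Example: 10 s of synthetic data at 50 Hz -> (14, 24) feature matrix
fake = np.random.randn(500, 6)
print(extract_features(fake).shape)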
In our agent, we chose five basic activities: walking, running, sitting, standing, and lying down. These are activities that most people perform frequently in their daily lives. To recognize the activities, we first collect a training dataset: before deploying the agent, we record ten minutes of sensor data for each activity at a 50-Hz rate. We use a support vector machine (SVM)-based activity recognition method to identify which activity is being performed at each point in time. Other activity recognition algorithms, such as swarm-based optimization methods [16,17], could be applied instead of the SVM. Figure 3 shows the process of activity recognition.
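The sketch below illustrates the recognition step with a scikit-learn SVM standing in for the SVM-based method used by the agent; the kernel, feature scaling, and randomly generated stand-in training data are assumptions for illustration.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

ACTIVITIES = ["walking", "running", "sitting", "standing", "lying"]

# X_train: 24-D feature vectors from the ~10-minute recordings per activity,
# y_train: the corresponding activity labels. Random data stands in here.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(500, 24))
y_train = rng.choice(ACTIVITIES, size=500)

# RBF kernel with feature scaling is an assumption; the kernel and
# hyper-parameters of the actual agent are not specified in the text.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)

# Each incoming 24-D feature vector is mapped to one of the five activities.
X_new = rng.normal(size=(3, 24))
print(clf.predict(X_new))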
Since the results of activity recognition are a list of activities at each moment, they are difficult to display or use directly. We therefore carry out an additional step, lifelog summarization, in which the results are aggregated by the hour. For instance, a summary may indicate that the user walks for ten minutes, runs for one minute, sits for forty-two minutes, and stands for seven minutes from 4 PM to 5 PM. The summaries are sent to the broker in the Health Avatar platform and finally delivered to the avatar in the CCR format.
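The aggregation itself can be a simple group-by over (date, hour, activity), as sketched below; the per-window duration of 640 ms (32 new samples at 50 Hz) is an assumption about how window-level predictions are converted to time.

from collections import defaultdict
from datetime import datetime

# Each prediction covers one window; at 50 Hz with a 64-sample window and
# 50% overlap, each new window adds 32 samples = 640 ms of signal (assumed).
MS_PER_WINDOW = 640

def summarize(predictions):
    """predictions: iterable of (timestamp, activity) pairs, one per window.
    Returns {(date, hour, activity): total_milliseconds}."""
    totals = defaultdict(int)
    for ts, activity in predictions:
        key = (ts.strftime("%Y%m%d"), ts.strftime("%H"), activity)
        totals[key] += MS_PER_WINDOW
    return dict(totals)

# Example: a few windows classified between 4 PM and 5 PM
preds = [(datetime(2013, 8, 5, 16, 0, 1), "sitting"),
         (datetime(2013, 8, 5, 16, 0, 1), "sitting"),
         (datetime(2013, 8, 5, 16, 0, 3), "walking")]
print(summarize(preds))
# {('20130805', '16', 'sitting'): 1280, ('20130805', '16', 'walking'): 640}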
Although the lifelog summary is itself useful for analyzing the user's lifestyle, it can also be used for various other services. For instance, it can serve as a reference for healthcare services or be used in third-party applications. One application of the lifelog summaries is the calculation of calories burned. In our application, calories burned are calculated from the metabolic equivalent of task (MET) of each activity [18]; assuming that the activities are performed in a typical manner, the MET value of each activity is chosen from the Compendium of Physical Activities [18], as sketched below.
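A standard MET-based formulation, consistent with the Compendium [18], is shown below; the body weight $w$ and the per-activity MET values are illustrative assumptions, since the exact constants used in the implementation are not reproduced here.

% MET-based estimate of energy expenditure (Compendium-style [18]);
% the listed MET values are typical entries, given only as illustrative assumptions.
\[
  \text{Calories (kcal)} \;=\; \mathrm{MET}_{a} \times w\,(\mathrm{kg}) \times t_{a}\,(\mathrm{h}),
\]
where $\mathrm{MET}_a$ is the MET value of activity $a$ (e.g., walking $\approx 3.5$, running $\approx 8.0$, sitting $\approx 1.3$, standing $\approx 1.8$, lying down $\approx 1.0$), $w$ is the user's body weight in kilograms, and $t_a$ is the time spent on activity $a$ in hours.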
In our implementation, each summary has the form <ID, date, hour, activity, time, unit>. For instance, a summary <20130805, 07, "sitting", 1427106, ms> (with the user ID omitted) indicates that the user sat for 1,427,106 milliseconds (about 24 minutes) from 7 AM to 8 AM on August 5, 2013. The summaries are inserted into the "FunctionalStatus" attribute of the CCR format. "FunctionalStatus" indicates the ability of a patient to manage daily activities, such as ambulatory ability, activities of daily living, mental status, and the ability to care for oneself. The summarized lifelog data are regarded as a type of "Activities of Daily Living". Figure 4 shows an example of lifelog data in the CCR form.
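As an illustration of how a summary tuple might be embedded, the sketch below builds a simplified FunctionalStatus fragment; the element layout is a reduced, illustrative subset and does not follow the full ASTM E2369-12 CCR schema used by the platform.

import xml.etree.ElementTree as ET

def summary_to_functional_status(date, hour, activity, millis):
    """Wrap one hourly summary in a simplified FunctionalStatus fragment.
    The element names below are a reduced, illustrative subset of CCR,
    not the full ASTM E2369-12 schema."""
    fs = ET.Element("FunctionalStatus")
    fn = ET.SubElement(fs, "Function")
    ET.SubElement(fn, "Type").text = "Activities of Daily Living"
    ET.SubElement(fn, "Description").text = activity
    ET.SubElement(fn, "DateTime").text = f"{date}T{hour}:00:00"
    value = ET.SubElement(fn, "Value")
    value.text = str(millis)
    value.set("Units", "ms")
    return ET.tostring(fs, encoding="unicode")

print(summary_to_functional_status("20130805", "07", "sitting", 1427106))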
Figure 5 shows screenshots of our application, based on the Health Avatar platform, displaying the results of lifelog summarization and the calculation of calories burned. In this experiment, a user collected lifelog data for 7 days in August. In the calendar of our application, the days on which the user collected lifelog data are highlighted in color. If one of those days is selected, a graph of the user's activity patterns is displayed on the screen. Each bar indicates the proportions of the five activities, and the line in the middle shows the calories burned during each hour. Thus, this graph shows the user's activity patterns and lifestyle in a day. There is also an option to display a graph that shows the patterns for each day, thereby showing the user's activity patterns and lifestyle during a week.
In this study, we implemented a lifelog agent for human activity pattern analysis that works with the Health Avatar platform. The lifelog agent comprises two components. One is the lifelogging application, which activates the accelerometer and gyroscope in a smartphone and records the sensor values at a 50-Hz rate. The application transfers the lifelog data to a lifelog server in a reliable and secure way at an appropriate time. The other component is the lifelog server, which receives lifelog data and analyzes them to identify five basic human activities: walking, running, sitting, standing, and lying down. The server summarizes the results by the hour and then sends them to the Health Avatar platform. Furthermore, we implemented an application that displays the user's activity patterns in a graph and calculates calories burned by hour or by day, to show how the summarized lifelog data can be used.
We believe that our agent provides a guide for embodying lifelog data in medical records so that practitioners or healthcare-related service providers may leverage lifelogs in a simpler way to provide more accurate diagnoses, personalized fitness planning, or other applications. As future work, we plan to collect not only activity-related data but also other types of lifelog data, such as sleep or dietary data, and to exploit them for object, linkage, and cluster analysis.
Acknowledgments
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIP) (No. 2010-0028631).
References
1. Amft O, Troster G. Recognition of dietary activity events using on-body sensors. Artif Intell Med. 2008; 42(2):121–136.
2. Zwartjes D, Heida T, van Vugt J, Geelen J, Veltink P. Ambulatory monitoring of activities and motor symptoms in Parkinson's disease. IEEE Trans Biomed Eng. 2010; 57(11):2778–2786.
3. Sazonov ES, Fulk G, Sazonova N, Schuckers S. Automatic recognition of postures and activities in stroke patients. Conf Proc IEEE Eng Med Biol Soc. 2009; 2009:2200–2203.
4. Sazonov ES, Fulk G, Hill J, Schutz Y, Browning R. Monitoring of posture allocations and activities by a shoe-based wearable sensor. IEEE Trans Biomed Eng. 2011; 58(4):983–990.
5. Warren JM, Ekelund U, Besson H, Mezzani A, Geladas N, Vanhees L, et al. Assessment of physical activity: a review of methodologies with reference to epidemiological research: a report of the exercise physiology section of the European Association of Cardiovascular Prevention and Rehabilitation. Eur J Cardiovasc Prev Rehabil. 2010; 17(2):127–139.
6. Arcelus A, Herry CL, Goubran RA, Knoefel F, Sveistrup H, Bilodeau M. Determination of sit-to-stand transfer duration using bed and floor pressure sequences. IEEE Trans Biomed Eng. 2009; 56(10):2485–2492.
7. Fujita T, Masaki K, Maenaka K. Human activity monitoring system using MEMS sensors and machine learning. J Jpn Soc Fuzzy Theory Intell Inform. 2008; 20(1):3–8.
8. Shahriyar R, Bari MF, Kundu G, Ahamed SI, Akbar MM. Intelligent Mobile Health Monitoring System (IMHMS). Int J Control Autom. 2009; 2(3):13–28.
9. Mena LJ, Felix VG, Ostos R, Gonzalez JA, Cervantes A, Ochoa A, et al. Mobile personal health system for ambulatory blood pressure monitoring. Comput Math Methods Med. 2013; 2013:598196.
10. Chung HJ. Health avatar: an informatics platform for personal and private big data of avatars and health agents. In : Medical Informatics Symposium in Bio-Medical Informatics for Future Medicine; 2013 Nov 11-13; Busan, Korea.
11. ASTM. Standard specification for Continuity of Care Record (CCR). West Conshohocken (PA): ASTM International;2012. (ASTM Standard E2369-12).
12. Kwon Y, Heo S, Kang K, Bae C. Personal lifelog upstreaming system for secure and reliable lifelog transmission. Int J Inf Proc Manag. 2013; 4(2):36–44.
13. Rivest RL, Shamir A, Adleman L. A method for obtaining digital signatures and public-key cryptosystems. Commun ACM. 1978; 21(2):120–126.
14. Daemen J, Rijmen V. Rijndael for AES. In : Proceedings of the 3rd AES Candidate Conference; 2000 Apr 13-14; New York, USA. p. 343–348.
15. Cooley JW, Tukey JW. An algorithm for the machine calculation of complex Fourier series. Math Comp. 1965; 19(90):297–301.
16. Bae C, Yeh WC, Shukran MA, Chung YY, Hsieh TJ. A novel anomaly-network intrusion detection system using ABC algorithm. Int J Innov Comput Inf Control. 2012; 8(12):8231–8248.
17. Kwon Y, Heo S, Kang K, Bae C. Particle swarm optimization using adaptive boundary correction for human activity recognition. In : Proceedings of the International Conference on Internet; 2013 Dec 12-16; Pattaya, Thailand.
18. Ainsworth BE, Haskell WL, Herrmann SD, Meckes N, Bassett DR Jr, Tudor-Locke C, et al. 2011 Compendium of Physical Activities: a second update of codes and MET values. Med Sci Sports Exerc. 2011; 43(8):1575–1581.