by Jessie Chuang (Co-founder, Classroom Aid)
The movie “Big Hero 6” is a fun Disney movie. Its main character, Baymax, is a personal health companion – a huggable robot that takes personalized care of your health. If you could, wouldn’t you want to give a sick family member their own personal health companion? My first boss taught me to “state the conclusion at the beginning of your report,” so here is my conclusion – xAPI is a very effective tool for mobile health, and it can help us build a Baymax!
Background and Motivation
Mobile health (mHealth) solutions have flourished in recent years. According to research (Saxon, 2016), the average American checks their mobile phone 150 times a day, and 60% of those checks are for health purposes, like tracking fitness goals. Americans are expected to use health apps even more over the next five years. Mobile phones have become our most intimate companions. Apple offers ResearchKit and CareKit for healthcare support and research, and Google has the Google Fit SDK, too. Hospitals have started to use mobile portals to engage and serve patients.
Two thirds of mHealth apps only provide health information; the other third monitor physiological values. While this is fantastic, these apps are not enough for patients with special needs. 92% of spinal cord injury patients need to use wheelchairs. Within 10-15 years of injury, 50-70% of those patients develop upper limb overuse problems, and their probability of chronic illness increases significantly. Our proposed solution was to develop a Smartchair App that provides instructional rehabilitation videos for patients to follow along, and lets them report how they feel. It monitors context and result information, and it also includes a social platform for patients to connect with peers. From the data collected, the system behind it, called the Context Awareness Suggestion Engine (iCASE), builds context awareness and then dynamically provides recommendations to support patients.
At the center is the Smartchair App (from ICAN Lab, National Taiwan University); data from the motorized power wheelchair and the prescription from the physician or therapist are integrated with the data collected by Smartchair. The app is offered to spinal cord injury patients in two healthcare organizations: Taoyuan Potential Development Center and National Taiwan University Hospital.
A successful xAPI implementation enables people to connect and work together, and it is purposeful about what data is collected. With these priorities, the following challenges were addressed.
- For Smartchair users, frequent clicks lead to chronic injuries. The solution is to build context awareness from the user’s history records, the prescription, and other context data, and to dynamically prompt the recommended action. This reduces clicks and provides guidance at the same time.
- Different services need to communicate with each other – for example, the therapists’ site and the wheelchair vendor’s system. To solve this, xAPI is leveraged for data transfer and integration.
- System developers need a more efficient way to collect user behaviors as feedback for improving the system, and after each revision users must relearn a newly designed interface. The solution is a recommendation engine that prompts recommended actions to support users while also collecting feedback; developers update the system design only after the hit rate of the recommendations exceeds a threshold.
xAPI is a user-centered data format. Using Actor-Verb-Object statements along with other rich contextual and result data, an experience can be captured with the necessary details. You might know better than me that for medical data, context information is crucial. This approach allows data to be collected and transferred between heterogeneous platforms through an intermediary server – the Learning Record Store (LRS).
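To make the Actor-Verb-Object idea concrete, here is a minimal sketch of what a Smartchair-style statement could look like, written as a Python dictionary. The verb and activity IRIs, extension keys, and patient ID are illustrative placeholders, not the project’s actual vocabulary.

```python
# A minimal sketch of an xAPI statement for a rehabilitation event.
# All IRIs and IDs below are illustrative placeholders.
import json

statement = {
    "actor": {
        "objectType": "Agent",
        "name": "patient-042",  # pseudonymous patient identifier
        "account": {"homePage": "https://example.org/smartchair",
                    "name": "patient-042"},
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.org/activities/shoulder-stretch",
        "definition": {"name": {"en-US": "Shoulder stretch exercise"}},
    },
    # result and context carry the "rich detail" the article describes:
    # how long, how well, where, and with whom
    "result": {"duration": "PT10M", "success": True},
    "context": {
        "extensions": {
            "https://example.org/ext/location": "home",
            "https://example.org/ext/with-peers": False,
        }
    },
}

# A real implementation would POST this JSON to the LRS statements endpoint.
print(json.dumps(statement, indent=2))
```

Because every service emits the same Actor-Verb-Object shape, the LRS can store wheelchair telemetry, app interactions, and therapist prescriptions side by side without custom format conversion.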
Now let’s look at context-awareness modeling. Contexts can be static (e.g., personal information, location information) or dynamic (e.g., user behavior, weather), and they fall into many categories that matter to our analytics (e.g., user context, time context, physical context, computing context). You can imagine sensors as our five senses, and computing as our brain. Only with xAPI can we record heterogeneous data in a coherent way, so the data from those five senses can be put together immediately for the computing brain to build context awareness. For example, it’s as if Baymax observes when, for how long, and with whom the patient did exercises.
How does the computer model the data? It’s as if Baymax can reason that the patient walks longer in the morning than in the afternoon, or exercises better together with peers. There are many well-developed data science and machine learning methods, and even open-source tools, that let the machine do reasoning and prediction. For example:
- Statistics: statistical inference from frequency, average, etc. – the most popular method
- Sequential patterns: analyzing data patterns on a timeline
- Association rules: investigating the probability that when one event appears, another event also appears
- Classification: classifying old data, then predicting future data
- Clustering: grouping data by property similarity
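As a toy illustration of the first and third methods above, the sketch below computes a simple frequency statistic and an association-rule confidence over a small, invented event log. The session format and action names are assumptions made for the example.

```python
# Toy illustration of two of the methods listed above on an invented log:
# statistical inference (frequency) and an association rule P(B | A).
from collections import Counter

# each session is the ordered list of actions a patient performed
sessions = [
    ["login", "stretch", "report-pain"],
    ["login", "stretch"],
    ["login", "wheelie-practice", "report-pain"],
    ["login", "stretch", "report-pain"],
]

# Statistics: how often does each action appear overall?
freq = Counter(action for s in sessions for action in s)
print(freq["stretch"])  # 3

# Association rule: when "stretch" appears in a session, how often
# does "report-pain" appear in the same session?
with_stretch = [s for s in sessions if "stretch" in s]
confidence = sum("report-pain" in s for s in with_stretch) / len(with_stretch)
print(round(confidence, 2))  # 0.67
```

Real deployments would run these computations over xAPI statements pulled from the LRS, and would use mature libraries rather than hand-rolled counting, but the reasoning pattern is the same.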
This is the conceptual system architecture showing how the computer “brain” processes data to support users. xAPI collects data from the functionality interface to build a behavior model, which is sent to the context-awareness model for calculation; combined with expert knowledge, the data are then processed by a filter model to sort out suggested next steps and offer them to the patient. A cost-benefit analysis checks whether the recommendation engine really improves the user experience and solves the problems identified.
Behavior modeling involves how we segment the user interface, name the blocks (in a way meaningful to both humans and machines), and map behavior events to xAPI statements with the necessary contexts and results. The user’s behavior records are then processed together with the context-awareness model, using the machine learning methods mentioned above and taking into account contexts such as the current time and location.
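The event-to-statement mapping step can be sketched as a small function. The block names, verb, and IRI scheme below are hypothetical examples of “meaningful for human and machine” naming, not the project’s real scheme.

```python
# Sketch: mapping a named UI block interaction to an Actor-Verb-Object
# record. Block names and IRIs are hypothetical placeholders.
from datetime import datetime, timezone

def event_to_statement(user_id: str, block_name: str, verb: str) -> dict:
    """Turn one UI interaction into an xAPI-style statement dict."""
    return {
        "actor": {"name": user_id},
        "verb": {"id": f"https://example.org/verbs/{verb}"},
        "object": {"id": f"https://example.org/ui/{block_name}"},
        # timestamp gives the context-awareness model its time context
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

stmt = event_to_statement("patient-042", "rehab-video-player", "played")
print(stmt["object"]["id"])  # https://example.org/ui/rehab-video-player
```

Naming each UI block with a stable, human-readable identifier is what lets both the developers and the context-awareness model interpret the resulting records without a lookup table.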
The filter model filters and sorts the recommended next steps for the user. Usually this kind of recommendation engine only factors in the user’s habits, but for a patient we also need to put the therapist’s prescription into the filter model. All recommended actions refer to the prescription list, and their priority is adjusted according to the user’s history and context (for example, time). The engine adjusts the weight of each recommended action according to the user’s history data: for behaviors with lower frequency the weight is increased, for behaviors with higher frequency the weight is reduced, and an action is removed entirely if its frequency exceeds the threshold set by the therapist. The recommendation engine prompts the recommended actions upfront (after user login) in the calculated sequence, on top of the current hierarchy. As mentioned, this reduces clicks and provides guidance as well.
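The frequency-based weighting just described can be sketched as follows. The function name, the weighting formula, and the example threshold are assumptions for illustration; the article does not specify the actual formula used.

```python
# Sketch of the filter model's frequency-based weight adjustment.
# The 1/(1+n) weighting and the threshold value are illustrative choices.
def rank_actions(prescribed, history_counts, removal_threshold):
    """Rank prescribed actions: rarely-done actions are boosted,
    frequently-done ones are demoted or dropped entirely."""
    ranked = []
    for action in prescribed:
        n = history_counts.get(action, 0)
        if n >= removal_threshold:   # therapist-set cutoff: done often enough
            continue
        weight = 1.0 / (1 + n)       # lower frequency -> higher weight
        ranked.append((action, weight))
    # highest-weight actions are prompted first after login
    return [a for a, _ in sorted(ranked, key=lambda x: -x[1])]

prescription = ["stretch", "wheelie-practice", "pressure-relief"]
history = {"stretch": 5, "wheelie-practice": 1}
print(rank_actions(prescription, history, removal_threshold=5))
# ['pressure-relief', 'wheelie-practice']
```

Note that “stretch” is dropped because its count reached the threshold, while the never-performed “pressure-relief” is promoted to the top – exactly the balancing between prescription and habit the filter model is meant to provide.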
We can add voice recognition and voice control, and wearables for monitoring purposes – for example, measuring quantified data about the patient’s movement, or monitoring eye movement, brainwaves, and so on, depending on the specific purpose.
If we have more sensors, their data might give us more context information, so we can answer more questions. Some context data will also be collected by designing functionalities such as interactions between patients and caregivers or physicians, and we can then analyze the impact of such social context.
Based on the same system architecture, one related project helps cancer patients choose the right foods and balance preferences against nutrition requirements; another related project is in the learning and training domain.
The strategy here is collaboration between the machine and experts (therapists or teachers). First, the machine builds recommendations from the expert’s prescription and the user’s behavior history, balancing both dynamically. The output of the machine learning is monitored by the expert, who can modify the recommendations if needed; the machine then adjusts its model to fit that label each time. In other words, the machine learns continuously from its human collaborator and from real data.
The benefit for experts using this system is that the collected data give them a better picture of their patients or students, so they can act at the moment of need; and with enough training, the machine’s recommendations can be very helpful and share the experts’ workload.
xAPI Serves Humanity
Many medical devices and sensors can collect data around a patient, including physiological sensors, biokinetic sensors, and ambient sensors – even a swallowable pill can take pictures inside the body. With data so abundantly available, there are two things we care about most here.
- As mentioned before, for medical data the context surrounding a data point is very important. xAPI can record contexts in a standard way. This means experience tracking is no longer described only as a number; now we can record human behaviors in a more human way.
- When data come in different formats, we can NOT make sense of them or use them UNTIL time and computing power are committed to integrating and interpreting them. xAPI data, by contrast, are highly structured in an intentional, pre-designed way. Thus they can be integrated meaningfully as soon as they are collected: the data can be put to use right away for humans to read (data visualization as a cognitive agent) and for machines to compute and respond to (less guesswork). This means services can talk to each other and work together in real time to serve humanity ASAP.
- xAPI is a very effective tool for enabling apps to serve humanity, because it connects heterogeneous data immediately.
- xAPI is about people working together. Among the lessons learned, this is the number-one factor in a successful xAPI implementation, usually driven by people who care about patient-centered or learner-centered service. Talk to an xAPI expert about the problems you need to solve and the questions you need to answer, then plan the xAPI implementation together. To reduce the technological barrier, xAPI wrappers and connectors are available.
- xAPI is about connecting current technologies instead of reinventing wheels (that’s the power of an API). Maybe you’ve heard of the API economy; xAPI is an API for experience data. With proper xAPI profile/recipe design, the data can be leveraged with machine learning APIs or with our iCASE system, which works for patients or learners. Consider that “personalized advertisements” surround us every day: can we bring that personalization technology to learners and patients? It’s up to all of us.
Jia-Ru Ho, Yun Yen Chuang, Ray-I Chang, “SmartChair APP – Mobile Technologies for Supporting Patients with Spinal Cord Injury,” The 11th E-Learning and Information Technology Symposium, 2016.
JESSIE CHUANG (Co-founder, Classroom Aid, Inc.)
Jessie Chuang is co-founder of Classroom Aid Inc., Lead of the ADL xAPI Chinese Community of Practice, and a consultant for Visca, an xAPI visual analytics service. She has provided consulting services and courses in OER (Open Educational Resources), mobile learning design, learning standards, educational technology product/solution design, and visualization design for educators, researchers, and vendors. Recently she has been passionate about xAPI implementation design and analysis, data-driven learning design, and how analytics and machine learning work in different industries. She often connects ideas from different domains; in her past career in high-tech R&D she obtained more than 20 patents for new inventions.