Assistive Dressing System: A Capabilities Study for Personalized Support of Dressing Activities for People Living with Dementia

Dressing is one of the most common activities caregivers provide, and also one of the most stressful due to its complexity and privacy challenges. Dementia, a brain condition that interferes with memory, thinking, and social ability, adds frustration as normally simple tasks such as dressing become difficult. We present an evolving prototype of an automated responsive emotive sensing system (DRESS) that provides much-needed support for independence and privacy to people living with advanced stages of dementia (PWD) and their caregivers. In preparation for in-home trials with PWD, we evaluated the system's ability to detect dressing events by asking 11 healthy participants to simulate common correct and incorrect dressing scenarios, such as donning a shirt and pants inside out, back in front, or only partially, in a laboratory setting. Although the tracking system missed a few expected detections, it was generally highly capable of detecting dressing phases for both pants and shirt. Our study suggests that a tracking system applied to the dressing process has the potential to automatically recognize dressing states and to generate prompts and feedback that help PWD, or people with related cognitive disorders, dress themselves correctly with little or, ideally, no assistance from their caregivers.


Introduction
Dementia describes a series of symptoms involving memory loss and difficulty with tasks such as planning, organizing, communicating, and motor function. A number of progressive illnesses, such as Alzheimer's, can trigger cognitive changes associated with dementia that affect memory, thinking, behavior, and the ability to perform everyday activities [4]. The cognitive neurodegenerative disorder Alzheimer's disease (AD) is most commonly associated with dementia [3]. The World Health Organization estimates that 7.7 million new cases of dementia were diagnosed in 2010, one new case every four seconds [36]. Additionally, the population aged 60 or over is growing at a faster rate than the total population in almost all world regions [33], and the likelihood of an individual over age 65 developing dementia roughly doubles every five years [36]. At this rate, an estimated 682 million people will be living with dementia (PWD) in the next 40 years, significantly more than the entire population of North America and nearly as many as that of Europe [34].
Prior research reports extensive involvement of caregivers (86%) in supporting Activities of Daily Living (ADL) for patients in the early and middle stages of dementia. Dressing, in particular, is of greatest concern and involves the greatest percentage of caregivers (61%), followed by activities such as feeding (52%), bathing (37%), toileting (34%), and incontinence care (25%) [8,27]. Dressing is a complex activity with implications for privacy and independence for PWD and their caregivers. These issues can often lead to frustration and anxiety for both PWD and their caregivers, especially when the caregivers are adult children of the PWD, and even more so when the adult children and the PWD are of opposite genders.
While attempts have been made to automate real-time assistance for activities such as washing hands [24], cooking [25], and taking medications [15], we are not aware of any system in the fields of context-aware computing and human-computer interaction that uses real-time prompting to assist the dressing process.

Background and Prior Work
Though various assistive technologies exist for many daily needs and activities, few of these systems target the specific needs of PWD. Mihailidis et al. [24] developed a system to help persons with moderate to severe dementia wash their hands. Examples of other behaviors targeted by assistive efforts include cooking [25] and taking medications [15]. Other studies of patients with Alzheimer's support the use of assistive efforts to improve ADL through cognitive interventions [7]. While there have been several recent efforts to advance statistical pattern recognition as a technique to identify and respond to human behaviors, these efforts have not yet proved sufficiently reliable for the needs of PWD [2,6,14,17,29,32].
Wu et al. [16] presented an activity recognition system that combined RFID and video feedback in a kitchen setting, achieving a recognition rate of 80% for 16 activities with 33 subjects. The system developed by Mihailidis et al. [23] uses video processing to recognize context and prompt actions performed in hand washing. An important contribution in advancing systems that identify ADL is Proact [30], a project that addresses recognition of 14 everyday activities; the system reports not only the activity being performed but also the extent to which it is performed. Dalton et al. [9] and Fleury et al. [13] have also advanced related research.
The earliest work in dressing assistance for people with one of the most common forms of dementia, Alzheimer's disease (AD), was conducted by Namazi and Johnson [26]. They demonstrated how modifying closet arrangement to organize clothing in a visible and preplanned sequential order can help improve independent activity by AD patients. Engelman et al. [10] showed how an increase in dressing independence for persons with AD might be accomplished using prompting procedures.
Popleteev and Mayora [31] developed a smart assistive buttoning system for people with mild cognitive decline. Their system detects whether a button is "locked" with its correct counterpart; if not (unlocked or locked with a wrong counterpart), an event is triggered and the system alerts the user, or records the event details for further analysis by caregivers. Matic et al. [21] developed a system that detects dressing activity failures using RFID and video. However, this system neither provides feedback nor assists in rectifying mistakes during the dressing process.
While each of these systems provides significant contributions, we have not identified another system that employs a comprehensive approach: monitoring dressing activity, identifying correct dressing and dressing failures, and providing feedback and guidance to rectify mistakes.

Dress System Architecture and Detection
To facilitate acceptance during in-home deployment, the DRESS system we have developed is built into a standard five-drawer dresser. Information from the system's sensors is used to identify whether clothing orientation is correct (e.g., front of the pants) or incorrect (e.g., back of the pants), and to infer the current stage of the dressing process. This in turn provides the basis for offering appropriate feedback, guidance, and reinforcement during the process. Finally, Radio Frequency Identification (RFID) tags and receivers located inside the drawer are used to immediately initiate the clothing guidance process when the clothing is removed from the drawer. Additional sensors include X10 door/window motion sensors attached to each drawer to detect opening and closing, and a sensor on the chair placed in front of the dresser to detect sitting and standing; future plans include adding a small skin conductivity wrist sensor worn by the user.
The system uses a Mac OS X server running Indigo 4.0 to manage and acquire data from the sensors [34]. The sensors communicate their ID and state to a radio frequency receiver in the base station, which relays this information to the central Indigo server. The Indigo home automation application receives the sensor data, assesses status and, depending on context and the state of the dressing process or the caregiver's input, executes embedded AppleScript trigger actions.
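The rule-based, stage-dependent dispatch described above can be illustrated with a minimal sketch. This is not the authors' implementation (which uses Indigo and AppleScript); the sensor names, stages, and prompts below are assumptions chosen for illustration.

```python
# Hypothetical sketch of stage-dependent event dispatch: each incoming
# (sensor_id, state) pair is matched against the current dressing stage,
# and a matching rule advances the stage and queues a prompt.
STAGE_IDLE = "idle"
STAGE_DRAWER_OPEN = "drawer_open"
STAGE_DONNING = "donning"

class DressDispatcher:
    def __init__(self):
        self.stage = STAGE_IDLE
        self.log = []  # prompts that would be sent to speakers / iPod touch

    def on_event(self, sensor_id, state):
        """Route one sensor event; return the (possibly updated) stage."""
        if sensor_id == "drawer_1" and state == "open" and self.stage == STAGE_IDLE:
            self.stage = STAGE_DRAWER_OPEN
            self.log.append("prompt: take out the shirt")
        elif sensor_id == "rfid_shirt" and state == "removed" and self.stage == STAGE_DRAWER_OPEN:
            self.stage = STAGE_DONNING
            self.log.append("prompt: put on the shirt")
        return self.stage
```

Keeping the stage explicit is what lets the same sensor event (e.g., a drawer opening) mean different things at different points in the dressing sequence.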
These supportive interactive experiences are delivered through: mobile devices, such as Apple's iPod touch, to provide visual cues; wireless speakers to provide audio feedback; and changes in room lighting to attract attention.
A native iOS mobile application for the iPod touch device was developed for the caregiver to control and monitor the dressing sequence (Figure 3). With this application, caregivers can check whether the PWD is inside the room; edit user information such as name and gender; select the clothing items and the preferred customary dressing order for the items the PWD will wear; and set the prompt or intervention frequency. To give specific guidance throughout the dressing process, the system currently depends on the fiducial tracking system to detect progress at each step. Detection events are rule based and reflect the appearance of particular fiducials in the context of the current stage of the dressing sequence. For example, to verify that the shirt is closed and worn correctly, DRESS searches for the 4 markers placed near the Velcro. The proximity between matching markers on the two sides of the shirt, together with their orientation, determines whether there is any misalignment. The proximity threshold between matching markers is fixed, determined beforehand by capturing this distance on a correctly worn shirt. If the alignment conditions are not met, the system indicates a misalignment error.
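The shirt-closure rule can be sketched as follows. This is an illustrative reconstruction, not the system's code: the marker data format, the pixel threshold, and the angle threshold are all assumptions; the actual thresholds were calibrated on a correctly worn shirt as described above.

```python
import math

# Assumed thresholds, standing in for the calibrated values.
DIST_THRESHOLD = 30.0   # max pixel distance between matching Velcro markers
ANGLE_THRESHOLD = 20.0  # max orientation difference in degrees

def shirt_closed_correctly(left_markers, right_markers):
    """Check shirt closure from tracked fiducials.

    left_markers / right_markers: dicts mapping a shared pair id to
    (x, y, angle_deg) for the markers on each side of the shirt.
    Returns True only if every pair is close together and similarly oriented.
    """
    for marker_id, (lx, ly, la) in left_markers.items():
        if marker_id not in right_markers:
            return False  # counterpart occluded or missing
        rx, ry, ra = right_markers[marker_id]
        if math.hypot(lx - rx, ly - ry) > DIST_THRESHOLD:
            return False  # sides too far apart: misaligned
        if abs((la - ra + 180) % 360 - 180) > ANGLE_THRESHOLD:
            return False  # orientation mismatch between the two sides
    return True
```

A `False` result here corresponds to the system's misalignment error, which would trigger a corrective prompt.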
Dressing errors such as wearing the shirt back in front or inside out are identified using specific markers attached to those parts of the clothing. For example, markers attached to the back and inside areas of the shirt are used to identify the corresponding error cases when donning the shirt.
Partial-dressing and error detection for donning pants follows a protocol similar to that for the shirt. To identify whether the user has indeed stood, pulled the pants up, and worn them correctly, the system looks for markers in the upper half of the pants. As with the shirt, errors such as back-to-front reversal or wearing the pants inside out are detected with markers attached to those parts of the clothing.
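The marker-placement logic for the pants can be summarized as a small classifier over the set of fiducials currently visible to the camera. The marker labels and the rule ordering below are assumptions for illustration, not the system's actual vocabulary.

```python
# Hypothetical classification of pants state from visible fiducial labels.
# Inside/back markers signal errors; front upper-leg markers signal progress.
def classify_pants_state(visible_markers):
    """visible_markers: set of fiducial labels currently detected."""
    inside = {"pants_inside_L", "pants_inside_R"}
    back = {"pants_back_L", "pants_back_R"}
    left_upper = {"pants_front_upper_L"}
    right_upper = {"pants_front_upper_R"}

    if visible_markers & inside:
        return "inside_out"       # inside-facing markers should never show
    if visible_markers & back:
        return "back_in_front"    # back markers facing the camera
    left = bool(visible_markers & left_upper)
    right = bool(visible_markers & right_upper)
    if left and right:
        return "worn_correctly"   # both upper-front markers visible
    if left or right:
        return "partial"          # one leg worn, the other not visible
    return "unknown"              # nothing visible: occlusion or not donned
```

As the Results section notes, the "partial" case is the hardest in practice, since folds in partially worn clothing often occlude the very markers this rule depends on.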

System Evaluation Methodology
We conducted a study in a laboratory setting to evaluate the DRESS detection capabilities. We were specifically interested in observing dressing patterns through fiducial marker detection at different stages of 9 dressing scenarios common to PWD [13]. These included the clothing worn: correctly (shirt and pants); partially, i.e., on one arm or one leg (shirt and pants); backwards, with the back in front (shirt and pants); inside out (shirt and pants); and misaligned (shirt only). For an initial assessment of system functionality, eleven healthy participants (7 female / 4 male, ages 19 to 41, average age 25) were engaged to emulate a range of dressing errors in a single one-hour session. Pre- and post-study surveys regarding common dressing practices were also included.
Participants were instructed that each trial would consist of the following steps: 1) wait for the experimenter's cue of when to start and which dressing condition to perform; 2) pick the respective clothing item from the drawer; 3) put it on in the way prescribed for the condition; 4) once completed, wait 3 seconds; 5) if the condition was an acted error, engage in the DRESS-prompted corrections; 6) once the correct dressing was completed, wait 3 seconds. All participants performed each of the 9 dressing conditions twice.

Results
Results indicated that the system was most reliable at reporting expected detections for acted errors of inside-out pants and shirt in phase 1, followed by detecting each limb worn in phases 2 and 4 for both clothing items, missing only 4 out of 388 expected detections across all conditions (99% accuracy). Furthermore, the system identified several initially unexpected detections, e.g., the inside of the shirt or the back of the pants in the transition phases.
With respect to the shirt, in one trial the participant appeared confused about the orientation of the shirt and turned it inside out several times before completing donning, resulting in an unexpected but accurate recognition. Other conditions showed similar detection reliability, missing 5, 6, and 7 detections for the partial, back-in-front, and misalignment conditions, respectively.
Unexpected detections included partial detections that occurred after the second arm was worn, just before completion of donning (phase 5). Video inspection revealed that these were primarily due to lengthy adjustments by the subjects before closing the shirt: opening and closing the shirt several times to bring the two sides together, holding the neck or Velcro and thereby occluding the markers, and, for female participants, adjusting hair. In one case, the shirt was too large for the participant and frequent folds impeded marker detection.
When participants were asked to put on pants correctly, we expected the following detections through the different phases of the process: sporadic detection of the back of the pants during adjustment just before putting the pants on (phase 1); then that the right or left leg was worn (phase 2); then that the other leg was worn (phase 4); and finally that dressing was correctly completed (phase 6). The system succeeded at all detections except in phase 6, where completion detection was missed in 5 out of 22 events.
In terms of detecting the acted errors in phase 1, the system was most reliable at detecting when pants were donned inside out (100%), and missed only 3 back-in-front detections. For the partial dressing condition, the system did not directly record partial dressing; however, inspection of the data showed that no completion detection was recorded when dressing was partial. Video examination indicated that partial dressing could be detected on some occasions, when the middle-to-upper fiducials on one pant leg were visible while none of the fiducials on the other leg were. However, folding of the clothing when it is partially worn during donning makes detection of partial dressing challenging.
Missing detections occurred while subjects put on the second leg (phase 4) and upon correct completion of the dressing process. Visual inspection of the videos indicated that missing detections were due to: inability of the camera to see the markers; suboptimal position of the participant with respect to the camera (tilted, too close, too short); occlusion by cloth folds when the clothes were too large for the participant; failure of the Indigo server to recognize the visible markers or record the detection events on time; and participants donning the clothing too quickly for the markers to be captured and recognized.

Limitations
Dressing patterns observed in our study may not directly apply to dressing patterns of PWD. Even within this study's population, there might be slight differences in how participants naturally make and correct dressing mistakes outside the laboratory environment used in the study. For example, some participants reported awkwardness in intentionally making mistakes (2 participants), putting pants on while seated (3 participants), and dealing with the shirt's Velcro when it got stuck in their hair (2 participants). Most participants reported that they never, or rarely, put on shirts (9 participants) or pants (11 participants) the wrong way (e.g., backwards). Only field studies with PWD during the next phase of development will enable verification of DRESS capabilities in natural settings for the target audience. Nonetheless, this study demonstrated the system's capability to detect multiple diverse dressing patterns.
Although some caregivers stated that their loved ones with dementia did care about what they wore, many stated that PWD not only prefer whatever is most comfortable but also do not care about the clothing type. Nevertheless, it is important to consider ways to reduce the number of fiducial markers on clothes and to make them less intrusive, so the clothes look as natural as possible. Possibilities being explored include the use of infrared markers and collaboration with designers to create trackable clothing items that are more fashionable, and thus more acceptable.

Conclusion and Future Work
The technology used in this dressing assistance system is already highly effective and, in most cases, reliable, and it provides the prerequisite capabilities needed to advance greater levels of independence and wellbeing for PWD and their caregivers. The system also has the potential to provide effective verbal prompting when dressing errors are identified and to assist by providing guidance on how to correctly complete the dressing activity. Our next step is to refine and improve the algorithms used to analyze and predict user actions prior to in-home deployment. We will also test the system with target users (older adults diagnosed with mild to moderate dementia), extend its capability to detect additional articles of clothing, and seek to customize the system for use in the context of additional ADLs beyond dressing.
The evaluation of DRESS in a laboratory setting has shown that it is possible to detect clothing orientation and position, and to infer a user's current state of dressing, using a combination of sensors, adaptive software, and fiducial tracking. The system has thus demonstrated its initial capability to provide the core foundational features required of an automated dressing support system that helps PWD dress while maintaining their independence and privacy, and provides their caregivers with much-needed respite.