The project’s mission is to develop and validate an unobtrusive and modular system.
Human movement as observed in natural settings is highly variable and versatile, allowing us to interact with an extensive range of objects and to deal with very different situations. This variability far exceeds what is captured by experimentally predefined tasks or clinical assessment scales, which are usually restricted to the experimentalist’s implementation assumptions and may overlook important aspects of the relationship between brain signals and human motor behaviour. Consequently, this naturally occurring variability has the potential to inform us how the brain controls our movements; in particular, changes in the structure of the variability may be powerful biomarkers for neurological disease. In addition, understanding the structure of human movement and its relationship to external factors can improve the control of assistive technology such as prostheses and orthoses by making it more human-like, and thus increase the adoption and usage of such technology.
The objective of the eNHANCE project is to enhance and train upper extremity motor function during daily life in people with physical disabilities. eNHANCE will empower users to perform their daily-life functional interaction with their environment through an intelligent multimodal adaptive interface, controlled by a high-performance intention detection input interface and personalised by an individual behavioural model.
Scenario 1: For the past six months, Mrs. Jansen has had very limited arm and hand use due to the stroke she suffered at age 61. Fortunately, her cognitive functions have recovered, but despite intensive rehabilitation in the first six months after her stroke, she still cannot use her affected right arm effectively during all types of personal and household tasks. She has not been able to return to her job as a nurse practitioner.
Now that she is back at home, she uses the eNHANCE system, which supports her affected arm and hand very smartly. As she learned during her rehabilitation, she tries to use her right arm as much as possible, but she still had trouble carrying large objects, like a tray with coffee from the kitchen to the living room. With the smart arm and hand support, she is again able to use both hands. It is amazing to her that the intelligent orthotic support automatically follows her intentions without her having to give explicit commands. Moreover, the system empowers her via the environment, for example by opening the door when she approaches it while carrying the tray in both hands. She can now invite her friends over again for coffee or tea and to play bridge, her new hobby. She is also very happy that she is much more self-supporting in daily life than she was some time ago. The system additionally stimulates her to be physically and cognitively active in a manner that matches her personal preferences very well. She realises that this is very important for preserving her cognitive and physical abilities. For example, she has started practising her card game skills online with some of her friends to prepare for matches, and uses the system to make appointments and discuss their game strategies. She enjoys her regained independence and physical abilities every day.
Scenario 2: Mr. Baker suffers from Duchenne muscular dystrophy and is now 25 years old. Over the years, his motor functions have gradually declined because of his progressive disease. As a consequence, he has been unable to walk since the age of 15 and is wheelchair-bound. Reaching and grasping have become increasingly difficult in recent years. To his great distress, this makes him more and more dependent on the support of caregivers. He knows that the resulting physical inactivity contributes to his declining physical condition. It also reduces his self-esteem tremendously.
For the past few months, he has been using the newly developed eNHANCE system, which supports his arm and hand function in a very natural way. The system quickly recognises his intentions via his gaze and movements, without interfering with his activities. Based on this, the arm and hand devices give him just enough support to reach for his glasses and put them on while he is editing his video clips on his laptop. He really likes that the system allows him to be as active as he can, while still giving him enough support to handle objects like his video camera. He is also again able to prepare and eat his breakfast and lunch without requiring assistance. The intelligent eNHANCE system fits his personal preferences amazingly well, which the system has learnt adaptively since it was first used. For example, he likes to share his edited film clips with his friends or to invite them over to watch them together, which he can do easily via the system. Mr. Baker enjoys his regained independence, which has improved his quality of life tremendously.
Key eNHANCE concepts
1) Multimodal interface
User control input interface
Eye tracking to derive user control input is key to the multimodal interface. It is ideally suited to deriving user motion intention in a natural manner without adding cognitive load, has high information capacity and is low cost (Figure 2). It will be supplemented by sensory information derived from the existing arm and hand control, including muscle activity (electromyographic or mechanoacoustic sensing), movement (inertial sensing) and interaction forces between the body, the support system and the environment. This supplementary sensing is especially important for natural identification of grasp intention.
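The combination of a gaze-derived intention estimate with supplementary on-body evidence can be pictured as a simple probability fusion. The sketch below is purely illustrative (it is not the project's implementation); the function names, weighting scheme and threshold are all hypothetical assumptions.

```python
# Illustrative sketch: fusing a gaze-derived grasp-intention estimate
# with supplementary on-body evidence (e.g. EMG onset) into a single
# probability. Weights, names and threshold are hypothetical.

def fuse_intention(p_gaze: float, p_emg: float, w_gaze: float = 0.7) -> float:
    """Weighted combination of two intention estimates.

    p_gaze: probability of grasp intention from eye tracking (0..1)
    p_emg:  probability from muscle-activity sensing (0..1)
    w_gaze: relative weight given to the gaze channel
    """
    if not (0.0 <= p_gaze <= 1.0 and 0.0 <= p_emg <= 1.0):
        raise ValueError("probabilities must lie in [0, 1]")
    return w_gaze * p_gaze + (1.0 - w_gaze) * p_emg

# A grasp is triggered only when the fused evidence is strong enough.
GRASP_THRESHOLD = 0.6

def grasp_intended(p_gaze: float, p_emg: float) -> bool:
    return fuse_intention(p_gaze, p_emg) >= GRASP_THRESHOLD
```

Giving the gaze channel the larger weight reflects its role as the primary input, while the supplementary channel mainly disambiguates grasp onset.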
Mechatronic arm and hand support interface
Mechatronic support of arm and hand function enables the user to functionally interact with the environment during daily-life tasks such as taking a meal or handling objects. eNHANCE will integrate existing arm and hand support systems and improve their actuation to make them controllable based on user intention detection. The improved support system will be designed to be minimally obtrusive to the end-user. The intelligent mechatronic arm and hand support system is supplemented by additional multimodal motivational communication, including auditory and visual cues, skin vibration and suggestive movements of the arm.
Environment observation interface
The environmental context has an important impact on our motor behaviour and interaction with the environment. We will apply a portable head-mounted 3D scene camera in combination with inertial sensing of head orientation to capture the natural active vision of the user, and derive context information, including social interactions with other persons, using supervised learning approaches.
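The supervised-learning step for context derivation can be illustrated schematically. Real scene understanding would rely on a trained vision model; in the toy sketch below, a nearest-centroid classifier over two made-up scene features stands in for it. Labels, features and data are hypothetical.

```python
# Hypothetical sketch of supervised context classification from scene
# features. A nearest-centroid classifier over toy feature vectors
# stands in for the project's actual supervised-learning approach.
import math

def centroid(vectors):
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(labelled):
    """labelled: dict mapping a context label to a list of feature vectors."""
    return {label: centroid(vs) for label, vs in labelled.items()}

def classify(model, features):
    """Return the context label whose centroid is nearest to `features`."""
    return min(model, key=lambda label: math.dist(model[label], features))

# Toy features: (number of detected faces, table-surface area fraction)
training_data = {
    "social_interaction": [(2, 0.1), (3, 0.2)],
    "meal_at_table":      [(0, 0.8), (1, 0.7)],
}
model = train(training_data)
```

The detected context label (e.g. an ongoing social interaction) would then be one input to the personalised behavioural model described below.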
2) Assessment system
Automatic intention detection that does not require conscious inputs from the user is an essential feature of a functional support system. We will base the arm support end-point control on gaze-based decoding of action intention using the presented high-performance low-cost eye tracking system. Reach and grasp intention detection will additionally be based on supplementary on-body sensing of muscle activation (EMG/MMG), body movement and interface forces, and on a head-mounted environmental observation system.
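One common building block for gaze-based intention decoding is a dwell-time rule: if gaze stays within a small dispersion window for long enough, a fixation on an object, and hence a likely reach target, is inferred. The sketch below illustrates this idea only; the thresholds and the I-DT-style dispersion criterion are assumptions, not the project's decoder.

```python
# Illustrative dwell-time rule for gaze-based reach-intention detection.
# Thresholds and the dispersion criterion are hypothetical assumptions.

DISPERSION_DEG = 1.5   # max gaze spread counted as one fixation (degrees)
DWELL_SAMPLES = 30     # samples (e.g. 0.5 s at 60 Hz) needed to trigger

def detect_reach_intention(gaze_samples):
    """gaze_samples: list of (x, y) gaze angles in degrees.

    Returns the fixation centre if the last DWELL_SAMPLES samples form
    a stable fixation, else None.
    """
    if len(gaze_samples) < DWELL_SAMPLES:
        return None
    window = gaze_samples[-DWELL_SAMPLES:]
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    # Dispersion = horizontal spread + vertical spread.
    if (max(xs) - min(xs)) + (max(ys) - min(ys)) <= DISPERSION_DEG:
        return (sum(xs) / len(xs), sum(ys) / len(ys))
    return None
```

The returned fixation centre would then be mapped onto a detected object in the scene to define the end-point target for the arm support.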
Assessment of motor performance relative to capacity informs the personalised motor support intelligence about the performance reserve of the user. It will be based on information derived from the multimodal sensory interface with the person and the environment. Performance measures will be derived from sensing the task execution in relation to the level of support: to what extent does the user contribute to the generation of movements, how frequently does the user functionally interact with the environment, and how effective is the user in contributing to functional tasks. Capacity is conceived as the maximum performance the user can achieve. The first approach is to assess capacity from separate clinical tests. Subsequently, capacity will be estimated implicitly from daily-life functioning as the maximum level of user performance when the user is maximally motivated by the system.
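The performance-versus-capacity idea above can be sketched as a minimal tracker: capacity starts from a clinical estimate and is then implicitly updated as the running maximum of observed daily-life performance, with the performance reserve as their difference. Scores and names are illustrative assumptions, not the project's measures.

```python
# Hedged sketch of performance vs. capacity assessment. Capacity starts
# from a clinical test and is then implicitly estimated as the running
# maximum of observed daily-life performance. Scores are hypothetical.

class CapacityTracker:
    def __init__(self, clinical_capacity: float):
        # First approach: capacity from a separate clinical test.
        self.capacity = clinical_capacity

    def update(self, performance: float) -> float:
        """Record one observed daily-life performance score and return
        the current performance reserve (capacity - performance)."""
        # Implicit estimate: capacity is the running maximum.
        self.capacity = max(self.capacity, performance)
        return self.capacity - performance
```

The reserve returned here is exactly the quantity the support intelligence needs in order to ask more of the user without exceeding what the user can deliver.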
The personalised behavioural model will predict user performance depending on the arm and hand support level and supplementary motivational communication, taking into account context, including social interactions, and environment. This will be the basis for the support intelligence to decide on the minimal level of support given to the user to realise the intended movements. The behavioural model will be iteratively identified and adapted, based on the comparison between predicted and subsequently observed user performance.
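The iterative identification loop can be sketched minimally, assuming a linear relation between support level and predicted performance. The error-driven parameter update below stands in for whatever identification method is actually used; all parameters are hypothetical.

```python
# Minimal sketch of an adaptive behavioural model: predict performance
# from the support level, then adjust parameters from the prediction
# error. The linear form and all values are assumptions.

class BehaviouralModel:
    def __init__(self, baseline=0.3, support_gain=0.5, lr=0.1):
        self.baseline = baseline          # performance with no support
        self.support_gain = support_gain  # effect of support level
        self.lr = lr                      # adaptation rate

    def predict(self, support_level: float) -> float:
        """Predicted user performance for a support level in [0, 1]."""
        return self.baseline + self.support_gain * support_level

    def adapt(self, support_level: float, observed: float) -> None:
        """Compare prediction with observation and adjust parameters."""
        error = observed - self.predict(support_level)
        self.baseline += self.lr * error
        self.support_gain += self.lr * error * support_level
```

Repeated predict/observe/adapt cycles shrink the prediction error, which is the comparison-driven adaptation described above.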
3) Personalised motor support intelligence
The personalised motor support intelligence controls the mechatronic arm and hand support and generates supplementary motivational communication with the user, in order to support the intended movements while requiring maximum user performance and motivating the user to be active in generating new intentions for functional interactions with the environment. For this purpose, the personalised motor support intelligence requires inputs about the user’s intentions and performance. The provided support level and supplementary motivational communication will be decided by the support intelligence based on their predicted influence on user performance, as given by the adaptively identified personalised behavioural model. The motor support intelligence will generate additional random excitation of user support and motivational communication to improve the adaptive identification of the user behavioural model.
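The decision logic above, minimal sufficient support plus random excitation, can be sketched as follows. The prediction function, the discrete support levels and the excitation magnitude are all assumptions for illustration.

```python
# Hypothetical sketch of the support-intelligence decision: pick the
# lowest support level whose predicted performance reaches the task
# target (maximising the user's own contribution), then add a small
# random excitation to aid identification of the behavioural model.
import random

def choose_support(predict, target, levels=None, excitation=0.05, rng=None):
    """predict: maps a support level (0..1) to predicted performance.
    target:  performance required to complete the intended movement.
    Returns the minimal sufficient support level plus a perturbation.
    """
    rng = rng or random.Random()
    levels = levels or [i / 10 for i in range(11)]
    # Minimal support level predicted to achieve the task.
    for level in levels:
        if predict(level) >= target:
            break
    else:
        level = levels[-1]  # task beyond reach even at full support
    # Random excitation keeps the model identification well conditioned.
    return min(1.0, max(0.0, level + rng.uniform(-excitation, excitation)))
```

Choosing the minimal sufficient level realises the "assist as needed" principle, while the perturbation ensures the observed support/performance pairs remain informative for adapting the behavioural model.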
The eNHANCE project envisions a new generation of intelligent and adaptive multimodal interfaces to support people with severe motor impairments. These interfaces will enable these people to improve their participation in society, and are expected to form the basis for improving the innovation capacity of European companies in this domain, as well as in a much wider range of applications.