ISSN 1392-1215 ELECTRONICS AND ELECTRICAL ENGINEERING. 2011. No. 5(111)

MULTIMODAL WHEELCHAIR CONTROL FOR THE PARALYZED PEOPLE

R. Maskeliunas, R. Simutis
SIGNAL TECHNOLOGY
T 121
A multimodal approach is also common: for example, a system that controls the movement of an electric wheelchair using a small-vocabulary word recognition system and a set of sensors to detect and avoid obstacles [7], or a combination of eye tracking with simple obstacle recognition based on detecting special markers in compound spaces [8]. Path finding and autonomous driving are very important parts of the robotic guidance being integrated into wheelchairs, ranging from simple solutions, such as sonar or infrared sensors providing partial guidance similar to car parking systems [9, 10], to more complex solutions offering fully autonomous guidance [11] built on physical, local and global levels: free-space detection, wall detection, free-space search, direction following, wall following, motion control, obstacle avoidance, etc. However, some researchers still prefer to emphasize human-centered manipulation instead of fully autonomous operation [12]. Various manipulators compensating for arm disabilities are also used in wheelchair designs, ranging from basic robotic arms capable of autonomously tracking a steady target and grasping it via a visual servo controller [13] to advanced carbon-fiber modular devices incorporating a DC servo drive with actuator hardware at each individual joint and allowing reconfigurable link lengths with seven degrees of freedom, meeting the needs of mobility-impaired persons whose limitations are confined to the upper extremities [14]. AI is also an important factor in autonomous wheelchair designs. Novel approaches, for example [15], are based on user plan recognition, allowing the system to estimate not only the user's low-level intent (e.g. go to a point), but also the high-level intent (e.g. open a door).

Multimodal wheelchair control algorithm for the paralyzed

Most researchers are developing interfaces for the widest audience of partially disabled people (those capable of speaking, moving a little, etc.), while we target the most unfortunate ones (the fully and partially paralyzed), integrating various types of inputs and outputs into one associative, modular control package in which all the modalities supplement one another or are enabled and disabled based on a person's needs and abilities. At the current stage we are concentrating on the following three main control modalities:

1) a speech recognition system capable of recognizing a predefined list of Lithuanian voice commands;
2) an eye tracking system capable of tracking the movement of the eye pupil, i.e. gaze tracking;
3) a special cursor-centric touch recognition system capable of detecting the direction of finger or tongue movement on a sensor surface.

A simple but very effective control algorithm was developed to control the wheelchair (Fig. 1). At the beginning of a dialog the user issues a control command (either by swiping a finger across a touch plate, by uttering a voice command, or by moving an eye). Depending on the configuration and his abilities, the user might be able to use more than one input modality.
In this case the overriding (primary) input is chosen, based on a preset or even selected by the user himself, if he is capable. Next, the signal is sent to the appropriate processing block.
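The paper does not publish source code, so the following Python fragment is only a minimal sketch of how this arbitration step could be organized; the modality names, the priority preset and the handler signatures are illustrative assumptions, not the authors' implementation:

    from typing import Callable, Dict, Optional, Tuple

    # Preset priority: the overriding (primary) modality wins when several
    # inputs arrive in the same dialog turn.  A capable user or a supervisor
    # could reorder this tuple (an assumed configuration mechanism).
    PRIORITY = ("touch", "speech", "gaze")

    def arbitrate(events: Dict[str, object]) -> Optional[Tuple[str, object]]:
        """Return (modality, raw_input) for the highest-priority event."""
        for modality in PRIORITY:
            if modality in events:
                return modality, events[modality]
        return None  # no input this cycle

    def dispatch(events: Dict[str, object],
                 handlers: Dict[str, Callable[[object], Optional[str]]]) -> Optional[str]:
        """Send the chosen signal to its processing block and return a motor
        command such as "TURN_LEFT", or None if the input was rejected."""
        chosen = arbitrate(events)
        if chosen is None:
            return None
        modality, raw = chosen
        return handlers[modality](raw)

Each handler implements one processing block of Fig. 1 (touch gesture decoding, speech recognition, gaze recognition) and either returns a motor command or rejects the input.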
For the traditional input (finger gestures) a safety feature was introduced: a person must lift his finger (or tongue) after each swipe. If he does not lift the finger within a preset time, the command is rejected and he is asked to repeat it.
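A minimal sketch of this rule, together with the mapping of a swipe vector to a direction command, is given below; the timeout value and the four-way classification are assumptions made for illustration:

    # Hypothetical preset: how long the pointer may stay down after a swipe
    # before the command is rejected (would be tuned per user in practice).
    LIFT_TIMEOUT_S = 1.5

    def classify_swipe(dx: float, dy: float) -> str:
        """Map a swipe vector to one of four direction commands.
        Screen coordinates are assumed, i.e. y grows downwards."""
        if abs(dx) >= abs(dy):
            return "TURN_RIGHT" if dx > 0 else "TURN_LEFT"
        return "FORWARD" if dy < 0 else "BACKWARD"

    def accept_swipe(dx: float, dy: float,
                     swipe_end_t: float, lift_t: float | None) -> str | None:
        """Return the command only if the finger (or tongue) was lifted
        within the preset time; otherwise reject and ask to repeat."""
        if lift_t is None or lift_t - swipe_end_t > LIFT_TIMEOUT_S:
            return None
        return classify_swipe(dx, dy)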
An easy option to enable or disable the additional modalities is also offered to the user (for example, if his tongue or finger is tired of swiping): the user may enable speech recognition or gaze tracking simply by touching the left or the upper side of the input pad. For safety reasons the additional modalities are disabled after each successful recognition of an input and must be enabled again by touching the appropriate zone. In the case of speech recognition and eye tracking, the confidence measure of the uttered voice command or of the recognized gaze is compared to a model, and if it is high enough the system issues a command to a wheelchair motor (for example, TURN LEFT if the user said so). If the recognition confidence is low, the user is asked to repeat the input.

Implementation

We have chosen a modular architecture for the technical implementation of the mobility device. The wheelchair itself is a standard-issue wheelchair with added gears, dual motors (one for each wheel), an automation controller (relay and USB interface) block and a set of I/O modules (Fig. 2). At the current stage all processing operations are done on a netbook PC.

Touch was chosen as the traditional input; in principle it is compatible with most touch input devices capable of detecting the movement and direction of a pointer (a finger or a tongue), such as smartphones, computer touchpads and touchscreens, thus providing the direction vector used for wheelchair control. This modality was designed under eyes-free interface design guidelines, meaning that graphical feedback is not necessary and control is possible just by swiping the surface (the virtual cursor always returns to the center). For safety reasons the system does not react to user input if the finger is kept pressed down continuously. The software allows enabling the other two modalities if the user wishes so or is able to use them.

A voice command recognizer capable of recognizing simple phrases, for example direction commands such as "(važiuok) pirmyn" (go forward), "(važiuok) atgal" (go back), "(suk) kairėn" (turn left), "(suk) dešinėn" (turn right), etc. (the words in brackets are optional), was implemented to control the wheelchair. A proprietary Lithuanian speech recognition algorithm [16, 17] was chosen; it can be adapted to any foreign-language-based recognizer, thus allowing easier porting to integrated devices with built-in recognition systems (usually English only).

The video recognition (eye tracking) was realized with the ANN-based software developed by other Lithuanian scientists [5], enabling the device to turn left or right and to move forwards and backwards based on the movement of the eye pupil.

The software can be configured to allow all three control modalities to be used together, complementing each other, although for safety and concentration reasons eye tracking should be enabled only if the other controls are unusable. Each modality can also be configured independently by a supervisor or enabled by the person himself (using the special zones of the touch surface) if he is capable of using more than one modality.
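Whenever speech or gaze input is active, the recognized command passes the confidence gate described above before anything is sent to the motors. A minimal sketch follows; the threshold values and the result structure are assumptions, since the paper only states that the confidence measure is compared to a model:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Recognition:
        command: str       # e.g. "TURN_LEFT" for "(suk) kairėn"
        confidence: float  # 0.0 .. 1.0, as reported by the recognizer

    # Hypothetical per-modality thresholds standing in for the model
    # comparison mentioned in the text.
    CONFIDENCE_THRESHOLD = {"speech": 0.85, "gaze": 0.90}

    def gate(modality: str, result: Recognition) -> Optional[str]:
        """Forward the command to the motor controller only if the
        confidence is high enough; otherwise ask the user to repeat."""
        if result.confidence >= CONFIDENCE_THRESHOLD[modality]:
            return result.command
        return None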
The control algorithm is still at the prototype stage: experimental testing, final end-user evaluation, fine-tuning of the control and response of the device itself, and the use of reliable navigation and mapping tools remain to be investigated in the near future in order to implement movement assistance and autonomy in confined spaces.
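For completeness, the sketch below shows one way a gated command could be mapped onto the dual-motor relay block of Fig. 2. The per-wheel mapping is an assumption for illustration; the paper only specifies that each wheel has its own motor switched through a relay and USB automation controller:

    # (left_motor, right_motor): +1 forward, 0 stop, -1 reverse.
    # Differential drive: turning is achieved by driving one wheel only.
    WHEEL_STATES = {
        "FORWARD":    (+1, +1),
        "BACKWARD":   (-1, -1),
        "TURN_LEFT":  (0, +1),   # right wheel only, so the chair turns left
        "TURN_RIGHT": (+1, 0),
        "STOP":       (0, 0),
    }

    def to_relay_states(command: str) -> tuple:
        """Translate a motor command into per-wheel drive states that the
        relay controller could switch over USB; unknown commands stop."""
        return WHEEL_STATES.get(command, WHEEL_STATES["STOP"])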
Conclusions and future work

The traditional input (touch) was combined with voice recognition and eye tracking into one associative interface aimed at controlling a mobility device for fully and partially paralyzed people, creating a scalable and very intuitive HCI. The solution is not universal, as every disabled person is different: some might expect a real-life companion from a speech interface; some might be unable to stick to a predefined control scheme; some might have a psychological fear of technology (being unable to use a system if no human is nearby).

Experimental analysis and fine-tuning remain to be done in the near future, in order to determine and overcome various issues: how the device performs in different environmental conditions, how the recognition accuracy affects overall usability, and how good and natural the designed interface is for the final end user (a disabled person). The possibilities of integrating autonomous navigation features, such as environment sensing, path finding, and remote audio-visual tracking and control for the care personnel, also need to be determined.

Acknowledgements

This work is part of the research project "Lietuviškų balso komandų atpažinimui orientuoto, multimodalinio išmaniųjų įrenginių asociatyvinio valdymo algoritmo sukūrimas ir modeliavimas", No. 20101216-90, funded by the European Union Structural Funds project "Postdoctoral Fellowship Implementation in Lithuania" within the framework of the Measure for Enhancing Mobility of Scholars and Other Researchers and the Promotion of Student Research (VP1-3.1-ŠMM-01) of the Program of Human Resources Development Action Plan.
References
1. Ding D., Cooper R. A. Electric Powered Wheelchairs // IEEE Control Systems Magazine. IEEE, 2005. Vol. 25. No. 2. P. 22-34.
2. Thiang. Implementation of speech recognition on MCS51 microcontroller for controlling wheelchair // Intelligent and Advanced Systems 2007. IEEE, 2007. P. 1193-1198.
3. Qidwai U., Ibrahim F. Arabic speech-controlled wheelchair: A fuzzy scenario // Information Sciences Signal Processing and their Applications 2010 (ISSPA 2010), 2010. P. 153-156.
4. Yi Zh., Quanjie L., Yanhua L., Li Z. Intelligent wheelchair multimodal human-machine interfaces in lip contour extraction based on PMM // IEEE proceedings of Robotics and Biomimetics (ROBIO 2009). IEEE, 2009. P. 2108-2113.
5. Proscevičius T., Raudonis V., Kairys A., Lipnickas A., Simutis R. Autoassociative gaze tracking system based on artificial intelligence // Electronics and Electrical Engineering. Kaunas: Technologija, 2010. No. 5(101). P. 67-72.
6. Chung-Hsien K., Yi-Chang Ch., Hung-Chyun Ch., Jia Wun S. Eyeglasses based electrooculography human-wheelchair interface // IEEE proceedings of Systems, Man and Cybernetics 2009. IEEE, 2009. P. 4746-4751.
7. Fezari M., Bousbia-Salah M., Bedda M. Speech and sensor in guiding an electric wheelchair // Proceedings of ICSC Congress on Computational Intelligence Methods and Applications (CIMA 2005), 2005. P. 1-5.
8. Kairys A., Raudonis V., Simutis R. iHouse for advanced environment control // Electronics and Electrical Engineering. Kaunas: Technologija, 2010. No. 4(100). P. 37-42.
9. Kuruparan J., Jayanthan T., Ratheeskanth V., Denixavier S., Munasinghe S. Semiautonomous Low Cost Wheelchair for Elderly and Disabled People // Proceedings of International Conference on Information and Automation 2006 (ICINFA 2006), 2006. P. 104-108.
10. Murai A., Mizuguchi M., Nishimori M., Saitoh T., Osaki T., Konishi R. Voice activated wheelchair with collision avoidance using sensor information // ICCAS-SICE 2009. P. 4232-4237.
11. Bourhis G., Horn O., Habert O., Pruski A. An Autonomous Vehicle for People with Motor Disabilities // IEEE Robotics & Automation Magazine. IEEE, 2001. Vol. 8. P. 20-28.
12. Chung-Hsien K., Chen H. H. W. Human-Oriented Design of Autonomous Navigation Assisted Robotic Wheelchair for Indoor Environments // IEEE proceedings of IEEE International Conference on Mechatronics 2006. IEEE, 2006. P. 230-235.
13. Zhu Zh., Ye A., Wen F., Dong X., Yuan K., Zou W. Visual servo control of intelligent wheelchair mounted robotic arm // IEEE proceedings of IEEE International Conference on Mechatronics, 2006. IEEE, 2006. P. 230-235.
14. Schrock P., Farelo F., Alqasemi R., Dubey R. Design, simulation and testing of a new modular wheelchair mounted robotic arm to perform activities of daily living // IEEE proceedings of International Conference on Rehabilitation Robotics 2009. IEEE, 2009. P. 518-523.
15. Su Y., Du Zh., Yao Y. Key Technology of Semi-autonomous Control Based on Plan Recognition for Intelligent Wheelchair // Proceedings of International Conference on Measuring Technology and Mechatronics Automation 2010 (ICMTMA 2010), 2010. P. 545-549.
16. Maskeliunas R., Rudzionis A., Rudzionis V. Advances on the Use of the Foreign Language Recognizer // LNCS 5967: Development of Multimodal Interfaces: Active Listening and Synchrony. Springer, 2010. P. 217-224.
17. Maskeliunas R., Rudzionis A., Rudzionis V. Investigation of Foreign Languages Models for Lithuanian Speech Recognition // Electronics and Electrical Engineering. Kaunas: Technologija, 2009. No. 3(91). P. 37-42.

Received 2011 02 21
R. Maskeliunas, R. Simutis. Multimodal Wheelchair Control for the Paralyzed People // Electronics and Electrical Engineering. Kaunas: Technologija, 2011. No. 5(111). P. 81-84.

Serious accidents and injuries often result in various motor disabilities, usually leading to limited control of the muscles of various body parts and, in the worst case, even of the whole body. Today's governmental support programs are targeted at providing technological means that enable handicapped persons to live as independently as possible. In the presented research the authors concentrate on the development of a mobility device for paralyzed people, aiming at an effective multimodal HCI algorithm that combines traditional input with speech and video recognition technologies into one associative package. The paper reviews some of the associated research on the topic, the proposed algorithm for multimodal wheelchair control and a scheme of its implementation. Ill. 2, bibl. 17 (in English; abstracts in English and Lithuanian).

R. Maskeliūnas, R. Simutis. Multimodalinis paralyžiuotųjų vežimėlio valdymas // Elektronika ir elektrotechnika. Kaunas: Technologija, 2011. No. 5(111). P. 81-84.

[Lithuanian summary, translated:] Many serious accidents and mishaps end in impairments of a person's motor functions; usually the muscle activity of an arm, a leg, the face or, in the worst case, even the whole body is disturbed. In practice it is very difficult to standardize the technical solutions for one or another impairment, since the specific symptoms of each person must be examined and different treatment or assistance methods applied. For this reason the authors of this study restrict themselves to the cases of full and partial body paralysis, in order to create an effective human-machine interface, using three main input and output modalities (touch, speech and vision) for control and integrating them into one associative control algorithm. The paper presents the technologies applied worldwide, the solution proposed by the authors and the scheme of its technical implementation. Ill. 2, bibl. 17 (in English; abstracts in English and Lithuanian).