Understanding Emotions in Children with Developmental Disabilities during Robot Therapy Using EDA
Abstract
1. Introduction
1.1. Human–Robot Interaction and Robot Therapy
1.2. Robots for Children with Autism
- KASPAR is a child-sized robot developed by the Adaptive Systems Group at the University of Hertfordshire that can produce human-like facial expressions and gestures. KASPAR was recently updated to its fifth version to work more effectively in autism therapy [19]. With this fifth version, four children with autism participated in a game about emotions.
- Keepon is a robot with a minimal, yellow, round-shaped body that was developed by the CareBots Project. Its upper body is a head with left and right eyes, each equipped with a wide-angle video camera, and a nose with a built-in microphone. With four degrees of freedom, Keepon expresses its state of mind through gestures [20]. Twenty-five children interacted with Keepon for 10 to 20 min each, and their emotional communication was observed: children made eye contact with, touched, and talked to Keepon [21].
- Muu is a simply designed robot with only one eye on its body. It is covered with a soft material and moves irregularly by means of an internal spring. In the experiment with Muu [22], children with autism played with it for 5 to 10 min, once a month over six months. The case study showed that Muu, described in the study as an inanimate robot, works as a social agent for children with autism.
- QTrobot, created by LuxAI, is an artificial-intelligence robot with an LCD face and robot arms. It improves the emotion-recognition ability of autistic children by imitating their emotions with expressive faces and arm movements. The robot’s behavior is limited to a certain extent, and it never rejects the child. Using the attached tablet terminal, an individual can interact with QTrobot to increase interest in the other party and to reduce confusion [23].
- CASTOR is a small humanoid produced by Casas-Bocanegra et al. It offers a replicable robot platform that integrates soft actuators and compliant mechanisms, and is aimed at real-world therapies that include physical interaction between children and robots [24].
- NAO is a small humanoid robot developed by Aldebaran Robotics (SoftBank Robotics) that walks independently on two legs and has verbal and nonverbal communication functions [25]. The robot has been used widely, including in autism therapy, as in the study by Shamsuddin et al. [26]. They conducted a case study of interaction between an autistic child and the robot and, by observation, compared his reactions with those in a normal class with people. The results show that the child interacted more actively when playing with the robot than in the normal class.
1.3. EDA in HRI
1.4. Contributions and Paper Overview
2. Methods
- Break-in part: The therapist showed the child a facial expression card for each of the four emotions and the child practiced expressing the emotions.
- Facial-expression-game part: The NAO showed the child the facial expressions that were practiced in the break-in part. Thereafter, the therapist showed the child the facial-expression card identified by the NAO and paired it with the facial expression performed by the NAO. The therapist then asked the child to perform the facial expressions specified by the NAO. The NAO instructed the child, “Show me one of the four facial expressions,” and the therapist encouraged the child to exhibit one. The NAO recognized the facial expression and conveyed it to the subject. This was repeated until the correct answer was provided for all four facial expressions.
- Free-structure-play part: When the facial-expression-game part was over, the therapist could freely operate the NAO using the numeric keypad. At this time, the NAO could indicate four facial expressions and compliment the subject.
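The repeat-until-correct flow of the facial-expression-game part can be sketched as follows. This is an illustrative stand-in only: `recognize` represents the NAO's facial-expression recognition, and the helper names are hypothetical, not the authors' implementation.

```python
# Hypothetical sketch of the game loop; `recognize` stands in for the
# NAO's facial-expression recognizer (not the actual NAO API).
EMOTIONS = ["joy", "anger", "sadness", "surprise"]

def facial_expression_round(recognize, target):
    """One round: the child keeps showing expressions until the
    recognizer reports the target emotion; returns the attempt count."""
    attempts = 0
    while True:
        attempts += 1
        if recognize() == target:
            return attempts

def run_game(recognize, targets=EMOTIONS):
    """The game repeats until a correct answer is given for all four
    facial expressions."""
    return {t: facial_expression_round(recognize, t) for t in targets}
```

For example, a scripted "child" who shows "anger" before "joy" needs two attempts for the "joy" round.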
2.1. Facial Expression Game Part
2.2. Devices
2.3. Integration of Systems
2.3.1. NAO Interactive Application
2.3.2. Recording Systems
2.3.3. E4 Wristband System
2.4. Analysis
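The results below compare raw EDA with EDA after a wavelet transform. As a minimal stand-in for that smoothing step (the paper's actual wavelet family, decomposition level, and thresholding rule are not restated here, so all of those choices are assumptions), a one-level Haar shrinkage looks like:

```python
import numpy as np

def haar_smooth(x, thr):
    """One-level Haar wavelet shrinkage: split the signal into
    approximation and detail coefficients, soft-threshold the details
    (small fluctuations such as sensor noise), and reconstruct.
    This is an illustrative stand-in, not the paper's exact transform."""
    x = np.asarray(x, dtype=float)
    assert len(x) % 2 == 0, "even-length signal assumed for simplicity"
    s = np.sqrt(2.0)
    approx = (x[0::2] + x[1::2]) / s
    detail = (x[0::2] - x[1::2]) / s
    # Soft threshold: shrink detail coefficients toward zero
    detail = np.sign(detail) * np.maximum(np.abs(detail) - thr, 0.0)
    out = np.empty_like(x)
    out[0::2] = (approx + detail) / s
    out[1::2] = (approx - detail) / s
    return out
```

With a zero threshold the reconstruction is exact; with a large threshold each sample pair collapses to its average, which is the smoothing effect the raw-versus-wavelet comparison relies on.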
Features in EDA
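The per-emotion feature set reported in the results tables (mean, SD, max, min, integral, slope, peak count, zero crossings) can be computed along these lines. The 4 Hz default matches the E4's EDA sampling rate; the plain local-maximum peak rule and the linear-fit slope are assumptions, since the paper's exact detectors are not restated here.

```python
import numpy as np

def eda_features(x, fs=4.0):
    """Features per emotion segment, as listed in the results tables.
    Peak counting is a simple local-maximum rule (an assumption)."""
    x = np.asarray(x, dtype=float)
    t = np.arange(len(x)) / fs
    slope = np.polyfit(t, x, 1)[0]  # slope of a linear fit over time
    peaks = int(np.sum((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:])))
    zero_c = int(np.sum(np.diff(np.signbit(x)) != 0))  # sign changes
    return {
        "mean": float(x.mean()), "sd": float(x.std()),
        "max": float(x.max()), "min": float(x.min()),
        "integral": float(np.trapz(x, t)),  # trapezoidal area over time
        "slope": float(slope),
        "peak": peaks, "zero_c": zero_c,
    }
```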
2.5. Hypothesis
3. Results
3.1. Features in Each Subject
- When the average values of the raw data of subject 1 and the EDA data following wavelet transform were arranged in descending order, both were in the order of “joy,” “anger,” “sadness,” and “surprise.” Thus, “surprise,” which has the highest arousal value in Russell’s circumplex model, was the lowest.
- When the average values of the EDA data from the raw data of subject 2 were arranged in descending order, the order was “surprise,” “anger,” “sadness,” and “joy.” However, when the EDA data following wavelet transform were sorted, the order was “surprise,” “sadness,” “joy,” and “anger.” In both results, the arousal values of Russell’s circumplex model were not in descending order, and the ordering of the mean EDA values changed between the raw data and the wavelet-transformed data.
- When the average values of the raw data of subject 3 and the EDA data following wavelet transform were arranged in descending order, both were in the order of “joy,” “surprise,” “sadness,” and “anger.” This result was not in descending order of the arousal value in Russell’s model.
- When the average values of the raw data of subject 4 and the EDA data following wavelet transform were arranged in descending order, both were in the order of “sadness,” “anger,” “joy,” and “surprise.” As a result, “sadness,” which is lowest on the arousal axis of Russell’s model, was the highest, and “surprise,” which is highest on that axis, was the lowest.
- The EDA data of subject 5 had higher EDA values overall than those of the other subjects. When the average values of the raw data and the EDA data following wavelet transform were arranged in descending order, they were both in the order of “sadness,” “surprise,” “joy,” and “anger.” In this result, “sadness,” which should have the lowest arousal value in Russell’s model, was the highest.
- When the average values of the raw data of subject 6 and the EDA data following wavelet transform were arranged in descending order, they were both in the order of “surprise,” “sadness,” “joy,” and “anger.” This result demonstrates that “sadness,” which had the lowest arousal value in Russell’s model, was the second highest.
- The EDA data of subject 7 did not visually differ between emotions. When the average values of the raw data and the EDA data following wavelet transform were arranged in descending order, they were both in the order of “joy,” “anger,” “sadness,” and “surprise.” In this result, “surprise,” which had the highest arousal value in Russell’s model, was the lowest.
- When the average values of the raw data of subject 8 and the EDA data following wavelet transform were arranged in descending order, both were in the order of “surprise,” “sadness,” “anger,” and “joy.” As a result, “sadness,” which was the lowest arousal value in Russell’s model, was the second highest.
- When the average values of the raw data of subject 9 and the EDA data following wavelet transform were arranged in descending order, both were in the order of “sadness,” “anger,” “surprise,” and “joy.” This result was the highest for “sadness,” which had the lowest arousal value in Russell’s model.
- When the average values of the raw data of subject 10 and the EDA data following wavelet transform were arranged in descending order, both were in the order of “anger,” “joy,” “surprise,” and “sadness.” In this result, “sadness,” which had the lowest arousal value in Russell’s model, was also the lowest; subject 10 was the only subject for whom this was the case.
- When the average values of the raw data of subject 11 and the EDA data following wavelet transform were arranged in descending order, both were in the order of “surprise,” “sadness,” “joy,” and “anger.” This result was not in descending order of the arousal values in Russell’s model. Moreover, although the time required to acquire the emotions was not constant in this experiment, subject 11 exhibited a significant change for each emotion according to the Z-value.
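Each per-subject comparison above follows the same recipe: sort the four emotions by mean EDA, highest first, and check the order against the arousal ranking of Russell's circumplex model. A minimal sketch, where the full arousal ordering surprise > anger > joy > sadness is an assumption consistent with "surprise" highest and "sadness" lowest in the text:

```python
# Assumed arousal ranking (high to low) in Russell's circumplex model,
# consistent with "surprise" highest and "sadness" lowest in the text.
RUSSELL_AROUSAL_ORDER = ["surprise", "anger", "joy", "sadness"]

def descending_emotion_order(mean_eda):
    """Emotions sorted by mean EDA value, highest first."""
    return sorted(mean_eda, key=mean_eda.get, reverse=True)

# Subject 1, raw means from the results table:
s1 = {"anger": 0.43149, "joy": 0.46136,
      "sadness": 0.41464, "surprise": 0.35291}
order = descending_emotion_order(s1)      # joy, anger, sadness, surprise
matches = order == RUSSELL_AROUSAL_ORDER  # False: arousal order not followed
```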
3.2. Average of EDA Data
3.3. Recognition Rate
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Conflicts of Interest
References
- WHO. World Report on Disability 2011, Chapter 2: Disability—A Global Picture; WHO: Geneva, Switzerland, 2011.
- WHO. World Report on Disability 2011, Chapter 5: Assistance and Support; WHO: Geneva, Switzerland, 2011.
- Martinez-Martin, E.; Escalona, F.; Cazorla, M. Socially assistive robots for older adults and people with autism: An overview. Electronics 2020, 9, 367.
- Lee, J.; Takehashi, H.; Nagai, C.; Obinata, G.; Stefanov, D. Which robot features can stimulate better responses from children with autism in robot-assisted therapy? Int. J. Adv. Robot. Syst. 2012, 9, 72.
- Wood, L.J.; Dautenhahn, K.; Lehmann, H.; Robins, B.; Rainer, A.; Syrdal, D.S. Robot-mediated interviews: Do robots possess advantages over human interviewers when talking to children with special needs? In Social Robotics, Proceedings of the 5th International Conference on Social Robotics, ICSR 2013, Bristol, UK, 27–29 October 2013; Springer: Cham, Switzerland, 2013; pp. 54–63.
- Wood, L.J.; Dautenhahn, K.; Rainer, A.; Robins, B.; Lehmann, H.; Syrdal, D.S. Robot-mediated interviews: How effective is a humanoid robot as a tool for interviewing young children? PLoS ONE 2013, 8, e59448.
- Goodrich, M.A.; Schultz, A.C. Human-robot interaction: A survey. Found. Trends Hum. Comput. Interact. 2008, 1, 203–275.
- Hentout, A.; Aouache, M.; Maoudj, A.; Akli, I. Human–robot interaction in industrial collaborative robotics: A literature review of the decade 2008–2017. Adv. Robot. 2019, 33, 764–799.
- Vasco, V.; Antunes, A.G.P.; Tikhanoff, V.; Pattacini, U.; Natale, L.; Gower, V.; Maggiali, M. HR1 Robot: An assistant for healthcare applications. Front. Robot. AI 2022, 9, 12.
- Imai, M.; Ono, T. Human-robot interaction. J. Soc. Instrum. Control Eng. 2005, 44, 846–852. (In Japanese)
- Kawashima, K. A trial of case-study classification and extraction of therapeutic effects of robot-therapy: Literature review with descriptive-analysis. Clin. Psychol. Dep. Res. Rep. 2013, 6, 155–167.
- Wada, K.; Shibata, T.; Tanie, K. Robot therapy at a health service facility for the aged. Trans. Soc. Instrum. Control Eng. 2006, 42, 386–392.
- Troncone, A.; Saturno, R.; Buonanno, M.; Pugliese, L.; Cordasco, G.; Vogel, C.; Esposito, A. Advanced assistive technologies for elderly people: A psychological perspective on older users’ needs and preferences (Part B). Acta Polytech. Hung. 2021, 18, 29–44.
- Kanner, L. Autistic disturbances of affective contact. Nervous Child 1943, 2, 217–250.
- American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders, 5th ed. (DSM-5®); American Psychiatric Association: Arlington, VA, USA, 2013.
- Sadock, B.J. Kaplan and Sadock’s Synopsis of Psychiatry: Behavioral Sciences/Clinical Psychiatry; Lippincott Williams & Wilkins: Baltimore, MD, USA, 2007.
- Alabdulkareem, A.; Alhakbani, N.; Al-Nafjan, A. A systematic review of research on robot-assisted therapy for children with autism. Sensors 2022, 22, 944.
- Rudovic, O.; Lee, J.; Dai, M.; Schuller, B.; Picard, R.W. Personalized machine learning for robot perception of affect and engagement in autism therapy. Sci. Robot. 2018, 3, 19.
- Wood, L.J.; Zaraki, A.; Robins, B.; Dautenhahn, K. Developing Kaspar: A humanoid robot for children with autism. Int. J. Soc. Robot. 2021, 13, 491–508.
- Kozima, H.; Nakagawa, C.; Yasuda, Y. Children–robot interaction: A pilot study in autism therapy. Prog. Brain Res. 2007, 164, 385–400.
- Kozima, H.; Michalowski, M.P.; Nakagawa, C. Keepon: A playful robot for research, therapy, and entertainment. Int. J. Soc. Robot. 2009, 1, 3–18.
- Miyamoto, E.; Lee, M.; Fujii, H.; Okada, M. How can robots facilitate social interaction of children with autism? Possible implications for educational environments. In Proceedings of the Fifth International Workshop on Epigenetic Robotics: Modeling Cognitive Development in Robotic Systems, Nara, Japan, 22–24 July 2005.
- Costa, A.P.; Steffgen, G.; Lera, F.R.; Nazarikhorram, A.; Ziafati, P. Socially assistive robots for teaching emotional abilities to children with autism spectrum disorder. In Proceedings of the 3rd Workshop on Child-Robot Interaction at HRI2017, Vienna, Austria, 6 March 2017.
- Casas-Bocanegra, D.; Gomez-Vargas, D.; Pinto-Bernal, M.J.; Maldonado, J.; Munera, M.; Villa-Moreno, A.; Stoelen, M.F.; Belpaeme, T.; Cifuentes, C.A. An open-source social robot based on compliant soft robotics for therapy with children with ASD. Actuators 2020, 9, 91.
- Shamsuddin, S.; Ismail, L.I.; Yussof, H.; Zahari, N.I.; Bahari, S.; Hashim, H.; Jaffar, A. Humanoid robot NAO: Review of control and motion exploration. In Proceedings of the 2011 IEEE International Conference on Control System, Computing and Engineering, Penang, Malaysia, 25–27 November 2011; pp. 511–516.
- Shamsuddin, S.; Yussof, H.; Ismail, L.I.; Mohamed, S.; Hanapiah, F.A.; Zahari, N.I. Initial response in HRI: A case study on evaluation of child with autism spectrum disorders interacting with a humanoid robot NAO. Procedia Eng. 2012, 41, 1448–1455.
- Boucsein, W. Electrodermal Activity; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2012.
- Stern, R.M.; Ray, W.J.; Quigley, K.S. Psychophysiological Recording; Oxford University Press: Oxford, UK, 2001.
- Bradley, M.M.; Miccoli, L.; Escrig, M.A.; Lang, P.J. The pupil as a measure of emotional arousal and autonomic activation. Psychophysiology 2008, 45, 602–607.
- Dawson, M.E.; Schell, A.M.; Filion, D.L. The electrodermal system. In Handbook of Psychophysiology; Cacioppo, J.T., Tassinary, L.G., Berntson, G.G., Eds.; Cambridge University Press: Cambridge, UK, 2017; pp. 217–243.
- Dawson, M.E.; Schell, A.M.; Courtney, C.G. The skin conductance response, anticipation, and decision-making. J. Neurosci. Psychol. Econ. 2011, 4, 111.
- Caruelle, D.; Gustafsson, A.; Shams, P.; Lervik-Olsen, L. The use of electrodermal activity (EDA) measurement to understand consumer emotions–A literature review and a call for action. J. Bus. Res. 2019, 104, 146–160.
- Russell, J.A. A circumplex model of affect. J. Personal. Soc. Psychol. 1980, 39, 1161.
- Ayata, D.; Yaslan, Y.; Kamaşak, M. Emotion recognition via galvanic skin response: Comparison of machine learning algorithms and feature extraction methods. J. Electr. Electron. Eng. 2017, 17, 3147–3156.
- Akansu, A.N.; Serdijn, W.A.; Selesnick, I.W. Emerging applications of wavelets: A review. Phys. Commun. 2010, 3, 1–18.
- Feng, H.; Golshan, H.M.; Mahoor, M.H. A wavelet-based approach to emotion classification using EDA signals. Expert Syst. Appl. 2018, 112, 77–86.
- Kelley, J.F. An iterative design methodology for user-friendly natural language office information applications. ACM Trans. Inf. Syst. 1984, 2, 26–41.
- Taheri, A.; Meghdari, A.; Alemi, M.; Pouretemad, H.; Poorgoldooz, P.; Roohbakhsh, M. Social robots and teaching music to autistic children: Myth or reality. In Social Robotics, Proceedings of the 8th International Conference on Social Robotics, ICSR 2016, Kansas City, MO, USA, 1–3 November 2016; Springer: Cham, Switzerland, 2016; pp. 541–550.
- Rudovic, O.; Lee, J.; Mascarell-Maricic, L.; Schuller, B.W.; Picard, R.W. Measuring engagement in robot-assisted autism therapy: A cross-cultural study. Front. Robot. AI 2017, 4, 36.
- Gross, T.F. The perception of four basic emotions in human and nonhuman faces by children with autism and other developmental disabilities. J. Abnorm. Child Psychol. 2004, 32, 469–480.
- Baron-Cohen, S.; Leslie, A.M.; Frith, U. Does the autistic child have a “theory of mind”? Cognition 1985, 21, 37–46.
- Robins, B.; Ferrari, E.; Dautenhahn, K.; Kronreif, G.; Prazak-Aram, B.; Gelderblom, G.J.; Bernd, T.; Caprino, F.; Laudanna, E.; Marti, P. Human-centred design methods: Developing scenarios for robot assisted play informed by user panels and field trials. Int. J. Hum. Comput. Stud. 2010, 68, 873–898.
| S1 | Emotion | Mean | SD | Max. | Min. | Integral | Slope | Peak | Zero c. |
|---|---|---|---|---|---|---|---|---|---|
| Raw | Anger | 0.43149 | 0.00203 | 0.43654 | 0.42758 | 4.20634 | −0.00050 | 7 | 0 |
| | Joy | 0.46136 | 0.01170 | 0.52360 | 0.43782 | 6.79693 | −0.00096 | 10 | 0 |
| | Sadness | 0.41464 | 0.00817 | 0.44039 | 0.39814 | 10.2608 | 0.00036 | 17 | 0 |
| | Surprise | 0.35291 | 0.01088 | 0.38790 | 0.32773 | 16.1393 | 0.00035 | 42 | 0 |
| Wavelet | Anger | 0.43150 | 0.00197 | 0.43669 | 0.42761 | 4.20636 | −0.00050 | 6 | 0 |
| | Joy | 0.46117 | 0.00949 | 0.49402 | 0.43901 | 6.79779 | −0.00103 | 8 | 0 |
| | Sadness | 0.41465 | 0.00807 | 0.44204 | 0.39984 | 10.2610 | 0.00036 | 12 | 0 |
| | Surprise | 0.35291 | 0.01075 | 0.38882 | 0.33504 | 16.1395 | 0.00034 | 20 | 0 |

| S2 | Emotion | Mean | SD | Max. | Min. | Integral | Slope | Peak | Zero c. |
|---|---|---|---|---|---|---|---|---|---|
| Raw | Anger | 0.52904 | 0.04389 | 0.57608 | 0.43910 | 6.21269 | −0.00947 | 7 | 0 |
| | Joy | 0.47970 | 0.02236 | 0.50951 | 0.38790 | 3.25197 | −0.00155 | 6 | 0 |
| | Sadness | 0.50582 | 0.23713 | 0.66058 | 0 | 4.47646 | −0.05765 | 6 | 0 |
| | Surprise | 0.68445 | 0.02883 | 0.73381 | 0.61860 | 5.30536 | 0.00308 | 5 | 0 |
| Wavelet | Anger | 0.43150 | 0.00197 | 0.43669 | 0.42761 | 4.20636 | −0.00050 | 6 | 0 |
| | Joy | 0.47925 | 0.02133 | 0.50300 | 0.40257 | 3.24552 | −0.00093 | 4 | 0 |
| | Sadness | 0.50591 | 0.23201 | 0.68548 | −0.00423 | 4.47763 | −0.05770 | 6 | 2 |
| | Surprise | 0.68445 | 0.02851 | 0.73536 | 0.61731 | 5.30490 | 0.00314 | 4 | 0 |

| S3 | Emotion | Mean | SD | Max. | Min. | Integral | Slope | Peak | Zero c. |
|---|---|---|---|---|---|---|---|---|---|
| Raw | Anger | 0.01057 | 0.00342 | 0.01537 | 0.00129 | 0.08179 | −0.00065 | 8 | 0 |
| | Joy | 1.34860 | −0.05213 | 1.52668 | 0.32300 | 8.54974 | −0.05213 | 6 | 0 |
| | Sadness | 0.13925 | 0.01050 | 0.16262 | 0.12036 | 0.66230 | −0.00079 | 4 | 0 |
| | Surprise | 0.19759 | 0.02413 | 0.25865 | 0.16774 | 2.16917 | 0.00464 | 12 | 0 |
| Wavelet | Anger | 0.01058 | 0.00307 | 0.01495 | 0.00315 | 0.08181 | −0.00063 | 3 | 0 |
| | Joy | 1.35171 | 0.21511 | 1.59836 | 0.50761 | 8.54695 | −0.04914 | 5 | 0 |
| | Sadness | 0.13916 | 0.00998 | 0.15575 | 0.11871 | 0.66207 | −0.00076 | 2 | 0 |
| | Surprise | 0.19887 | 0.02437 | 0.25580 | 0.16911 | 2.23338 | 0.00496 | 6 | 0 |

| S4 | Emotion | Mean | SD | Max. | Min. | Integral | Slope | Peak | Zero c. |
|---|---|---|---|---|---|---|---|---|---|
| Raw | Anger | 2.84252 | 0.19124 | 3.30177 | 2.67971 | 24.8388 | 0.05799 | 6 | 0 |
| | Joy | 2.51112 | 0.11441 | 2.76190 | 2.36640 | 19.4837 | −0.01499 | 5 | 0 |
| | Sadness | 3.13211 | 0.09746 | 3.37641 | 3.02315 | 24.2489 | 0.02746 | 5 | 0 |
| | Surprise | 2.25934 | 0.16982 | 2.59158 | 2.02948 | 19.7752 | −0.05478 | 4 | 0 |
| Wavelet | Anger | 2.84270 | 0.19162 | 3.29200 | 2.67752 | 24.8380 | 0.05815 | 5 | 0 |
| | Joy | 2.51067 | 0.11261 | 2.74637 | 2.37213 | 19.4820 | −0.01502 | 3 | 0 |
| | Sadness | 3.13210 | 0.09633 | 3.37472 | 3.05084 | 24.2487 | 0.02746 | 2 | 0 |
| | Surprise | 2.25934 | 0.16982 | 2.59158 | 2.02948 | 19.7752 | −0.05478 | 4 | 0 |

| S5 | Emotion | Mean | SD | Max. | Min. | Integral | Slope | Peak | Zero c. |
|---|---|---|---|---|---|---|---|---|---|
| Raw | Anger | 33.9771 | 0.88651 | 36.0453 | 32.9277 | 127.196 | −0.62724 | 2 | 0 |
| | Joy | 34.3520 | 0.34325 | 34.8394 | 33.4283 | 163.152 | 0.16487 | 4 | 0 |
| | Sadness | 41.1273 | 0.44909 | 41.6308 | 39.7299 | 359.940 | −0.08252 | 6 | 0 |
| | Surprise | 37.5190 | 1.59681 | 39.7418 | 34.0201 | 178.622 | 0.77806 | 2 | 0 |
| Wavelet | Anger | 33.9707 | 0.87512 | 36.0630 | 33.0540 | 127.000 | −0.64781 | 2 | 0 |
| | Joy | 34.3571 | 0.33700 | 34.7807 | 33.5023 | 163.000 | 0.15909 | 2 | 0 |
| | Sadness | 41.1272 | 0.43593 | 41.5292 | 39.8615 | 360.000 | −0.08425 | 5 | 0 |
| | Surprise | 37.5162 | 1.60449 | 39.5258 | 34.0479 | 179.000 | 0.78431 | 1 | 0 |

| S6 | Emotion | Mean | SD | Max. | Min. | Integral | Slope | Peak | Zero c. |
|---|---|---|---|---|---|---|---|---|---|
| Raw | Anger | 1.11767 | 1.05655 | 4.30188 | 0 | 9.37177 | −0.32224 | 5 | 0 |
| | Joy | 4.50357 | 1.62758 | 6.06718 | 0.80439 | 39.1786 | 0.47677 | 3 | 0 |
| | Sadness | 6.14001 | 0.10694 | 6.24970 | 5.82469 | 23.0507 | −0.07339 | 0 | 0 |
| | Surprise | 7.87500 | 0.30894 | 6.47427 | 4.79894 | 92.4318 | −0.03553 | 9 | 0 |
| Wavelet | Anger | 1.11760 | 1.05880 | 4.29840 | −0.16029 | 9.38515 | −0.32420 | 4 | 4 |
| | Joy | 4.50490 | 1.62674 | 6.03391 | 0.96924 | 39.1920 | 0.47578 | 3 | 0 |
| | Sadness | 6.14143 | 0.10092 | 6.24910 | 5.87374 | 23.0504 | −0.07152 | 1 | 0 |
| | Surprise | 5.87119 | 0.29414 | 6.47482 | 5.25587 | 92.4337 | −0.03549 | 6 | 0 |

| S7 | Emotion | Mean | SD | Max. | Min. | Integral | Slope | Peak | Zero c. |
|---|---|---|---|---|---|---|---|---|---|
| Raw | Anger | 0.11154 | 0.00396 | 0.11904 | 0.09728 | 3.42962 | −0.00034 | 23 | 0 |
| | Joy | 0.11540 | 0.01626 | 0.14848 | 0.07808 | 4.58717 | −0.00020 | 27 | 0 |
| | Sadness | 0.10286 | 0.01197 | 0.12544 | 0.04480 | 3.78928 | 0.00033 | 29 | 0 |
| | Surprise | 0.09654 | 0.01340 | 0.12928 | 0.05504 | 3.83536 | −0.00054 | 29 | 0 |
| Wavelet | Anger | 0.11153 | 0.00383 | 0.11988 | 0.09826 | 3.42974 | −0.00034 | 18 | 0 |
| | Joy | 0.11540 | 0.01612 | 0.14951 | 0.08161 | 4.58725 | −0.00020 | 22 | 0 |
| | Sadness | 0.10286 | 0.01176 | 0.11850 | 0.04423 | 3.78926 | 0.00033 | 25 | 0 |
| | Surprise | 0.09654 | 0.01310 | 0.12113 | 0.05889 | 3.83521 | −0.00054 | 22 | 0 |

| S8 | Emotion | Mean | SD | Max. | Min. | Integral | Slope | Peak | Zero c. |
|---|---|---|---|---|---|---|---|---|---|
| Raw | Anger | 5.91103 | 0.17873 | 6.25988 | 5.60335 | 116.714 | 0.00314 | 9 | 0 |
| | Joy | 4.68259 | 0.11362 | 4.84751 | 4.35351 | 69.0545 | −0.01744 | 12 | 0 |
| | Sadness | 6.43958 | 0.12888 | 6.68935 | 6.19407 | 94.9965 | −0.00588 | 7 | 0 |
| | Surprise | 6.70039 | 0.34390 | 7.77854 | 6.17104 | 219.547 | 0.01438 | 21 | 0 |
| Wavelet | Anger | 5.91138 | 0.17695 | 6.19766 | 5.61145 | 117.000 | 0.00329 | 12 | 0 |
| | Joy | 4.68262 | 0.11302 | 4.84765 | 4.39869 | 69.1000 | −0.01745 | 9 | 0 |
| | Sadness | 6.43941 | 0.12853 | 6.68754 | 6.21155 | 95.0000 | −0.00590 | 6 | 0 |
| | Surprise | 6.70031 | 0.34300 | 7.69505 | 6.15898 | 220.000 | 0.01438 | 18 | 0 |
| S9 | Emotion | Mean | SD | Max. | Min. | Integral | Slope | Peak | Zero c. |
|---|---|---|---|---|---|---|---|---|---|
| Raw | Anger | 12.8038 | 0.50481 | 13.6106 | 11.5710 | 137.666 | −0.09282 | 8 | 0 |
| | Joy | 12.4684 | 0.82407 | 13.6375 | 9.82250 | 155.723 | 0.04463 | 7 | 0 |
| | Sadness | 13.0196 | 0.65215 | 14.0324 | 11.8252 | 153.147 | 0.01768 | 9 | 0 |
| | Surprise | 12.5225 | 0.33536 | 13.2030 | 12.0830 | 134.619 | 0.08715 | 7 | 0 |
| Wavelet | Anger | 12.8047 | 0.49672 | 13.5953 | 11.6240 | 138.000 | −0.09210 | 6 | 0 |
| | Joy | 12.4711 | 0.80853 | 13.6875 | 9.95330 | 159.000 | 0.04415 | 5 | 0 |
| | Sadness | 13.0218 | 0.63702 | 14.0203 | 11.9910 | 153.000 | 0.01878 | 6 | 0 |
| | Surprise | 12.5225 | 0.33543 | 13.2109 | 12.0873 | 135.000 | 0.08741 | 5 | 0 |

| S10 | Emotion | Mean | SD | Max. | Min. | Integral | Slope | Peak | Zero c. |
|---|---|---|---|---|---|---|---|---|---|
| Raw | Anger | 4.62988 | 0.54054 | 5.56541 | 3.50872 | 68.3935 | 0.09091 | 5 | 0 |
| | Joy | 4.39484 | 0.55553 | 5.06021 | 3.48950 | 73.6648 | 0.09339 | 12 | 0 |
| | Sadness | 2.42979 | 0.16078 | 2.69310 | 2.12594 | 23.6907 | −0.04779 | 3 | 0 |
| | Surprise | 3.63377 | 0.22659 | 4.04169 | 3.12761 | 57.2970 | 0.01891 | 7 | 0 |
| Wavelet | Anger | 4.62950 | 0.53832 | 5.51474 | 3.49822 | 68.3901 | 0.09107 | 4 | 0 |
| | Joy | 4.39519 | 0.55496 | 5.05378 | 3.51319 | 73.6676 | 0.09337 | 7 | 0 |
| | Sadness | 2.42985 | 0.15934 | 2.69442 | 2.16082 | 23.6916 | −0.04788 | 2 | 0 |
| | Surprise | 3.63373 | 0.22605 | 4.02052 | 3.11828 | 57.2966 | 0.01894 | 4 | 0 |

| S11 | Emotion | Mean | SD | Max. | Min. | Integral | Slope | Peak | Zero c. |
|---|---|---|---|---|---|---|---|---|---|
| Raw | Anger | 4.77608 | 0.11609 | 5.00440 | 4.58069 | 75.2565 | 0.00348 | 8 | 0 |
| | Joy | 4.98669 | 0.14246 | 5.29370 | 4.78678 | 118.464 | 0.00435 | 13 | 0 |
| | Sadness | 5.82886 | 0.18590 | 6.20250 | 5.62175 | 62.6857 | 0.02628 | 5 | 0 |
| | Surprise | 6.32382 | 0.26272 | 6.99116 | 6.03171 | 93.2778 | 0.00233 | 4 | 0 |
| Wavelet | Anger | 4.77612 | 0.11578 | 5.00377 | 4.58275 | 75.2560 | 0.00350 | 7 | 0 |
| | Joy | 4.98670 | 0.14240 | 5.29604 | 4.78298 | 118.000 | 0.00435 | 9 | 0 |
| | Sadness | 5.82887 | 0.18581 | 6.19785 | 5.62209 | 62.6856 | 0.02627 | 2 | 0 |
| | Surprise | 6.32389 | 0.26278 | 6.98930 | 6.03438 | 93.2793 | 0.00228 | 3 | 0 |
| Confusion Matrix | Emotion | Surprise | Anger | Happy | Sad |
|---|---|---|---|---|---|
| Raw | Surprise | 4 | 5 | 2 | 0 |
| | Anger | 0 | 7 | 4 | 0 |
| | Happy | 1 | 4 | 5 | 1 |
| | Sad | 1 | 8 | 1 | 1 |
| Wavelet | Surprise | 9 | 0 | 2 | 0 |
| | Anger | 7 | 1 | 3 | 0 |
| | Happy | 6 | 0 | 5 | 0 |
| | Sad | 10 | 1 | 0 | 0 |
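The overall recognition rate follows from a confusion matrix as the correct classifications (the diagonal) over all trials. A sketch using the raw-data matrix above:

```python
def recognition_rate(confusion):
    """Overall recognition rate: diagonal sum over total trial count."""
    correct = sum(confusion[e][e] for e in confusion)
    total = sum(sum(row.values()) for row in confusion.values())
    return correct / total

# Raw-data confusion matrix from the table above (rows: true emotion,
# columns: recognized emotion).
raw = {
    "surprise": {"surprise": 4, "anger": 5, "happy": 2, "sad": 0},
    "anger":    {"surprise": 0, "anger": 7, "happy": 4, "sad": 0},
    "happy":    {"surprise": 1, "anger": 4, "happy": 5, "sad": 1},
    "sad":      {"surprise": 1, "anger": 8, "happy": 1, "sad": 1},
}
rate = recognition_rate(raw)  # 17 correct of 44 trials
```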
Nagae, T.; Lee, J. Understanding Emotions in Children with Developmental Disabilities during Robot Therapy Using EDA. Sensors 2022, 22, 5116. https://doi.org/10.3390/s22145116