Abstract—Smart homes have grown in popularity not only as a luxury but also because of the numerous benefits they provide. In this research, a home automation system is developed for the elderly: as the number of elderly people rises, so does the prevalence of geriatric problems, an issue society must address. Such a system is especially beneficial for senior citizens and for children with disabilities. A great deal of research and innovation is being conducted in the field of gesture recognition. In this project, home automation is performed through hand gestures that control appliances. We deliberately avoid computer vision approaches, since an elderly person cannot be expected to maintain the conditions such techniques require, namely proper lighting and camera angles. Instead, this study presents a sensor-embedded hand glove that captures hand motion. The wearable device detects and records the tilting, rotation, and acceleration of hand movements using accelerometers and gyroscopes. Our proposed human gestures recognition (HGR) system recognizes nine different hand gestures taken from a benchmark dataset. We used a combination of feature extraction algorithms and a random forest classifier, and compared our system's performance with that of other well-known classifiers, achieving an accuracy of 94% over the benchmark HGR dataset. Experiments have shown that the proposed approach is capable of recognizing gestures for controlling home appliances and can be used in healthcare, residences, offices, and educational environments.

Keywords—Genetic algorithm, hand gestures recognition, inertial sensors, t-distributed stochastic neighbor embedding.
I. INTRODUCTION advancements in microelectronic that has increased the
Hand gestures have received a lot of attention in the past few years because of their value to a variety of applications, including healthcare and rehabilitation [1]. Hand motion tracking is one of the most popular forms of human-computer interaction [2]. Hand gestures are a powerful and natural way for people to communicate with one another, and they are traditionally detected and recognized using camera-based computer vision algorithms [3]. Such techniques, however, can be slow and demand substantial processing power, which means they consume a lot of energy. For resource-constrained devices, using embedded sensors, i.e., inertial sensors, to conduct gesture recognition is a better option. With embedded sensors, gesture detection can be done on the device itself, and the accuracy is unaffected by illumination or space constraints; cost and energy consumption are also reduced [4].

The hand glove prototype that we have designed to recognize gestures is shown in Fig. 1. The glove includes an inertial sensor unit (MPU 9250) connected to an Arduino UNO. The MPU 9250 is configured with the Arduino UNO, and its signals are accessed through a port via serial communication. The signals are mapped to nine gestures (zero, one, six, move up, move down, move left, move right, counterclockwise circle, and clockwise circle) that are used to change device states, for example mode selection, left/right, and on/off. Furthermore, we enable continuous hand gesture recognition that detects the starting and ending of each gesture, moving towards a model where gestures are recognized and the corresponding action is performed accordingly.

Fig. 1. Prototype of the hand glove; the inertial sensor MPU 9250 is used for gesture recognition.
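As a concrete illustration of this pipeline, the sketch below shows how the glove's samples could be read and segmented on the host side. It assumes the Arduino streams comma-separated accelerometer and gyroscope readings over USB serial; the port name, baud rate, line format, and the gyroscope-energy threshold used to detect gesture boundaries are illustrative assumptions, not the exact firmware protocol of our prototype.

```python
# Sketch: read MPU 9250 samples streamed by the Arduino UNO over serial and
# segment continuous motion into gestures with a simple energy threshold.
# Assumptions (not from the paper): port name, 115200 baud, and lines of the
# form "ax,ay,az,gx,gy,gz".
import math
import serial  # pyserial

PORT, BAUD = "/dev/ttyACM0", 115200   # hypothetical port settings
GYRO_THRESHOLD = 50.0                 # deg/s; motion-energy cutoff (assumed)
MIN_SAMPLES = 20                      # ignore segments shorter than this

def read_samples(port=PORT, baud=BAUD):
    """Yield (ax, ay, az, gx, gy, gz) tuples parsed from the serial stream."""
    with serial.Serial(port, baud, timeout=1) as ser:
        while True:
            line = ser.readline().decode("ascii", errors="ignore").strip()
            parts = line.split(",")
            if len(parts) == 6:
                try:
                    yield tuple(float(p) for p in parts)
                except ValueError:
                    continue  # skip malformed lines

def segment_gestures(samples):
    """Yield lists of samples between a detected gesture start and end."""
    segment = []
    for s in samples:
        gyro_mag = math.sqrt(s[3] ** 2 + s[4] ** 2 + s[5] ** 2)
        if gyro_mag > GYRO_THRESHOLD:
            segment.append(s)          # hand is moving: gesture in progress
        elif segment:
            if len(segment) >= MIN_SAMPLES:
                yield segment          # motion stopped: gesture ended
            segment = []

if __name__ == "__main__":
    for gesture in segment_gestures(read_samples()):
        print(f"captured gesture with {len(gesture)} samples")
```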
Two common approaches for activity recognition are computer vision-based [5] and sensor-based [6,7]. Motion tracking via inertial measurement units (IMUs) on mobile and wearable devices has attracted significant interest in recent years [8]. The processes involved in identifying human activities include feature extraction, categorization [9], and subject detection. Despite the extensive advances in inertial sensors and their roles in indoor environments, it remains difficult to reliably identify ordinary human activities and interactions. Smartphones have now evolved into wearable mini-computers with numerous sensors: advancements in microelectronics have increased processing power and enabled new input interfaces, such as gesture-based user interfaces [10].

The Internet of Things (IoT) is a system in which connected devices (such as fans, air conditioners, and other home appliances) are integrated with sensors to exchange data and provide services [11]. Improving the home automation experience is one of the most active current IoT [12] research initiatives [13], which emphasizes the importance of such devices to users [14]. Many IoT-related solutions have been proposed to give everyday objects [15] communication capabilities [16]. One of these is HGR, which has emerged as a feasible method of communicating with automated home systems [17]. A hand gesture and its recognition express an idea associated with some action or reaction that the user performs in order to achieve a goal [18]. As a result, hand gestures are becoming more popular for controlling home automation systems. Gesture recognition techniques can be divided into vision-based techniques, which involve cameras, and sensor-based techniques, which involve inertial sensors. Computer vision (CV)-based HGR recognizes gestures by evaluating pictures acquired from videos [19-23], whereas inertial-measurement-based HGR evaluates motion signals captured by wearable sensors.
Per-gesture precision, recall, and F-measure for the Naïve Bayes, decision tree, and random forest classifiers over the HGR dataset:

Dynamic Activities | Naïve Bayes (Prec. / Rec. / F) | Decision Tree (Prec. / Rec. / F) | Random Forest (Prec. / Rec. / F)
Zero (0)           | 0.63 / 0.39 / 0.48             | 0.64 / 0.45 / 0.53              | 0.98 / 0.85 / 0.87
One (1)            | 0.48 / 0.79 / 0.59             | 0.54 / 0.63 / 0.58              | 0.85 / 0.96 / 0.90
Six (6)            | 0.61 / 0.43 / 0.51             | 0.48 / 0.47 / 0.48              | 0.94 / 0.91 / 0.92
CCW Circle         | 0.48 / 0.44 / 0.46             | 0.41 / 0.44 / 0.42              | 0.85 / 0.84 / 0.85
CW Circle          | 0.66 / 0.59 / 0.63             | 0.62 / 0.70 / 0.66              | 0.98 / 0.93 / 0.96
Move Down          | 0.92 / 0.97 / 0.94             | 0.99 / 0.96 / 0.98              | 1.00 / 1.00 / 1.00
Move Left          | 0.87 / 0.99 / 0.93             | 0.90 / 0.90 / 0.89              | 1.00 / 1.00 / 1.00
Move Right         | 1.00 / 0.99 / 0.99             | 0.99 / 0.99 / 0.99              | 1.00 / 1.00 / 1.00
Move Up            | 1.00 / 0.99 / 1.00             | 1.00 / 0.99 / 1.00              | 1.00 / 1.00 / 1.00
CCW = Counter Clockwise Circle, CW = Clockwise Circle
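The per-gesture scores above follow the standard definitions. For a given gesture class, with TP, FP, and FN denoting true positives, false positives, and false negatives:

\[
\text{Precision} = \frac{TP}{TP + FP}, \qquad
\text{Recall} = \frac{TP}{TP + FN}, \qquad
\text{F-measure} = \frac{2 \cdot \text{Precision} \cdot \text{Recall}}{\text{Precision} + \text{Recall}}
\]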
During the comparison, we evaluated the accuracy of Naïve Bayes [55], random forest [56], and decision tree [57] classifiers to examine the system's performance [58]. The comparison is visualized in Fig. 10.

Fig. 10. Comparison chart of accuracy, precision, recall, and F-score for the proposed classifiers over the HGR dataset.
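To illustrate how such a comparison can be run, the following is a minimal sketch that trains the three classifiers on extracted feature vectors and reports accuracy, error rate, and per-gesture precision, recall, and F-measure. The feature matrices and labels (X_train, y_train, X_test, y_test) are placeholders for the output of the feature extraction stage, and the hyperparameters shown are assumptions, not the tuned settings behind the reported results.

```python
# Sketch: compare Naïve Bayes, decision tree, and random forest classifiers
# on extracted gesture features, mirroring the comparison reported above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, classification_report
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

def compare_classifiers(X_train, y_train, X_test, y_test):
    models = {
        "Naive Bayes": GaussianNB(),
        "Decision Tree": DecisionTreeClassifier(random_state=0),
        "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    }
    for name, model in models.items():
        model.fit(X_train, y_train)
        y_pred = model.predict(X_test)
        acc = accuracy_score(y_test, y_pred)
        # Error rate (as in Fig. 11) is simply 1 - accuracy.
        print(f"{name}: accuracy={acc:.2f}, error rate={1 - acc:.2f}")
        # Per-gesture precision, recall, and F-measure (as in the table above).
        print(classification_report(y_test, y_pred))

if __name__ == "__main__":
    # Smoke test with synthetic data standing in for real glove features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 12))
    y = rng.integers(0, 9, size=300)   # nine gesture classes
    compare_classifiers(X[:200], y[:200], X[200:], y[200:])
```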
Fig. 11 lists the error rate for each of the corresponding classifiers. The classifiers utilized in the proposed method, random forest, decision tree, and naïve Bayes, were included for comparison and accuracy-improvement purposes.

Fig. 11. Error rate calculated for the different classifiers.

In this paper, we presented an approach for recognizing hand gestures to automate appliances. The proposed HGR system targets human-computer interaction for controlling home appliances and is applicable in healthcare, homes, offices, and other educational settings. Compared with computer vision-based recognition, wearable-sensor (hand glove) based gesture recognition requires only lightweight processing, is low in cost and power consumption, does not violate users' privacy, and is unaffected by the lighting environment. The results showed that the random forest classifier delivered the best gesture recognition accuracy among the evaluated approaches.
REFERENCES
[1] S. Perez-Gamboa, Q. Sun and Y. Zhang, "Improved Sensor Based Human Activity Recognition via Hybrid Convolutional and Recurrent Neural Networks," IEEE International Symposium on Inertial Sensors and Systems (INERTIAL), 2021.
[2] P. Meier, K. Rohrmann, M. Sandner and M. Prochaska, "Application of magnetic field sensors for hand gesture recognition with neural networks," 2019 IEEE 1st Global Conference on Life Sciences and Technologies (LifeTech), 2019.
[3] N. Dawar and N. Kehtarnavaz, "Real-Time Continuous Detection and Recognition of Subject-Specific Smart TV Gestures via Fusion of Depth and Inertial Sensing," IEEE Access, vol. 6, 2018.
[4] Y.-T. Wang and H.-P. Ma, "Real-Time Continuous Gesture Recognition with Wireless Wearable IMU Sensors," IEEE 20th International Conference on e-Health Networking, Applications and Services (Healthcom), 2018.
[5] A. Jalal, S. Kamal and D. Kim, "Shape and Motion Features Approach for Activity Tracking and Recognition from Kinect Video Camera," 2015 IEEE 29th International Conference on Advanced Information Networking and Applications Workshops, 2015, pp. 445-450, doi: 10.1109/WAINA.2015.38.
[6] A. Jalal and M. A. K. Quaid, "Wearable sensors based human behavioral pattern recognition using statistical features and reweighted genetic algorithm," Multimed. Tools Appl., vol. 79, pp. 6061-6083, 2020, doi: 10.1007/s11042-019-08463-7.
[7] A. Jalal and S. Kamal, "A Hybrid Feature Extraction Approach for Human Detection, Tracking and Activity Recognition Using Depth Sensors," Arab. J. Sci. Eng., vol. 41, pp. 1043-1051, 2016, doi: 10.1007/s13369-015-1955-8.
[8] M. Javeed, A. Jalal and K. Kim, "Wearable Sensors based Exertion Recognition using Statistical Features and Random Forest for Physical Healthcare Monitoring," 2021 International Bhurban Conference on Applied Sciences and Technologies (IBCAST), 2021, pp. 512-517, doi: 10.1109/IBCAST51254.2021.9393014.
[9] A. Jalal, M. A. K. Quaid and K. Kim, "A Wrist Worn Acceleration Based Human Motion Analysis and Classification for Ambient Smart Home System," J. Electr. Eng. Technol., vol. 14, pp. 1733-1739, 2019, doi: 10.1007/s42835-019-00187-w.
[10] A. Jalal, M. A. K. Quaid and A. S. Hasan, "Wearable Sensor-Based Human Behavior Understanding and Recognition in Daily Life for Smart Environments," 2018 International Conference on Frontiers of Information Technology (FIT), 2018, pp. 105-110, doi: 10.1109/FIT.2018.00026.
[11] K. Feng and F. Yuan, "Static hand gesture recognition based on HOG characters and support vector machines," 2013 2nd International Symposium on Instrumentation and Measurement, Sensor Network and Automation (IMSNA), 2013, pp. 936-938.
[12] A. Jalal and S. Kim, "The Mechanism of Edge Detection using the Block Matching Criteria for the Motion Estimation," Proc. HCI, pp. 484-489, 2005.
[13] A. Jalal and S. Kim, "A complexity removal in the floating point and rate control phenomenon," Proc. KMS, pp. 48-51, 2005.
[14] A. Jalal and S. Kamal, "Real-time life logging via a depth silhouette-based human activity recognition system for smart home services," 2014 11th IEEE International Conference on Advanced Video and Signal Based Surveillance (AVSS), 2014, pp. 74-80, doi: 10.1109/AVSS.2014.6918647.
[15] A. Jalal, Y. Kim and D. Kim, "Ridge body parts features for human pose estimation and recognition from RGB-D video data," ICCNT, pp. 1-6, 2014.
[16] A. Jalal and Y. Kim, "Dense Depth Maps-based Human Pose Tracking and Recognition in Dynamic Scenes Using Ridge Data," AVSS, pp. 119-124, 2014.
[17] T. Kim, A. Jalal, H. Han, H. Jeon and J. Kim, "Real-Time Life Logging via Depth Imaging-based Human Activity Recognition towards Smart Homes Services," ISRESHB, p. 63, 2013.
[18] H. Zou, Y. Zhou, J. Yang, H. Jiang, L. Xie and C. J. Spanos, "WiFi-enabled Device-free Gesture Recognition for Smart Home Automation," 2018 IEEE 14th International Conference on Control and Automation (ICCA), 2018, pp. 476-481, doi: 10.1109/ICCA.2018.8444331.
[19] A. Yang, S. M. Chun and J. Kim, "Detection and recognition of hand gesture for wearable applications in IoMT," 20th International Conference on Advanced Communication Technology (ICACT), pp. 1046-1053, 2018.
[20] P. Mahwish, A. Jalal and K. Kim, "Hybrid algorithm for multi people counting and tracking for smart surveillance," IEEE IBCAST, 2021.
[21] A. Ahmed, A. Jalal and K. Kim, "Multi-objects detection and segmentation for scene understanding based on Texton forest and kernel sliding perceptron," JEET, 2020.
[22] A. Jalal, A. Ahmed, A. Rafique and K. Kim, "Scene semantic recognition based on modified Fuzzy c-mean and maximum entropy using object-to-object relations," IEEE Access, 2021.
[23] B. W. Min, H. S. Yoon, J. Soh, Y. M. Yang and T. Ejima, "Hand gesture recognition using hidden Markov models," IEEE International Conference on Systems, Man, and Cybernetics, vol. 5, pp. 4232-4235, 1997.
[24] V. I. Pavlovic, R. Sharma and T. S. Huang, "Visual interpretation of hand gestures for human-computer interaction: a review," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 19, no. 7, 1997.
[25] R. Bisma and A. Jalal, "Object detection and segmentation for scene understanding via multi-features and random forest," IEEE Conference on Advancements in Computational Sciences, 2023.
[26] A. Hamid, S. A. Chelloug and A. Jalal, "3D shape estimation from RGB data using 2.5D features and deep learning," IEEE ICACS, 2023.
[27] A. Faisal, S. A. Chelloug and A. Jalal, "Multi-pedestrians anomaly detection via conditional random field and deep learning," IEEE Conference on Advancements in Computational Sciences, 2023.
[28] A. Shuja and A. Jalal, "Vehicle detection and tracking from Aerial imagery via YOLO and centroid tracking," IEEE ICACS, 2023.
[29] W. Manahil, J. Madiha and A. Jalal, "A novel deep learning model for understanding two-person interactions using depth sensors," Proc. ICIC, 2021.
[30] J. Madiha and A. Jalal, "Body-worn hybrid-sensors based motion patterns detection via bag-of-features and Fuzzy logic optimization," Proc. ICIC, 2021.
[31] A. Ayesha and A. Jalal, "Automated body parts estimation and detection using salient maps and Gaussian matrix model," Proc. IBCAST, 2021.
[32] A. Shahzad and A. Jalal, "A smart surveillance system for pedestrian tracking and counting using template matching," Proc. ICRAI, 2021.
[33] T. Kim, A. Jalal, H. Han, H. Jeon and J. Kim, "Real-time life logging via depth imaging-based human activity recognition towards smart homes services," ISRESHB, 2013.
[34] J. Madiha, S. A. Chelloug and A. Jalal, "Deep activity recognition based on patterns discovery for healthcare monitoring," IEEE Conference on Advancements in Computational Sciences, 2023.
[35] T. Fatima, H. Rahman and A. Jalal, "A novel framework for human action recognition based on features fusion and decision tree," IEEE ICACS, 2023.
[36] A. Israr, S. A. Chelloug and A. Jalal, "Abnormal action recognition in crowd scenes via deep data mining and random forest," IEEE Conference on Advancements in Computational Sciences, 2023.
[37] M. Asifa, A. H. Butt and A. Jalal, "Highway traffic surveillance over UAV dataset via Blob detection and histogram of gradient," IEEE ICACS, 2023.
[38] N. Amir, A. Jalal and K. Kim, "Automatic human posture estimation for sport activity recognition with robust body parts detection and entropy markov model," MTA, 2021.
[39] I. Akhter, A. Jalal and K. Kim, "Adaptive pose estimation for Gait event detection using context-aware model and hierarchical optimization," JEET, 2021.
[39] U. Azmat and A. Jalal, "Smartphone inertial sensors for human locomotion activity recognition based on template matching and codebook generation," ICT, 2021.
[40] A. Jalal and S. Kamal, "Improved behavior monitoring and classification using cues parameters extraction from camera array images," IJIMAI, 2018.
[41] A. Jalal, S. Kamal and C. Cecer, "Depth maps-based human segmentation and action recognition using full-body plus body color cues via recognizer engine," JEET, 2018.
[42] H. AlShu'eili, G. S. Gupta and S. Mukhopadhyay, "Voice recognition based wireless home automation system," 2011 4th International Conference on Mechatronics (ICOM), 2011, pp. 1-6, doi: 10.1109/ICOM.2011.5937116.
[43] Y. Zhao, W. Wang and Y. Wang, "A real-time hand gesture recognition method," International Conference on Electronics, Communications and Control (ICECC), pp. 2475-2478, 2011.
[44] F.-T. Liu, Y.-T. Wang and H.-P. Ma, "Gesture recognition with wearable 9-axis sensors," IEEE International Conference on Communications (ICC), pp. 1-6, 2017.
[45] J. Wang and F. Chuang, "An Accelerometer-Based Digital Pen With a Trajectory Recognition Algorithm for Handwritten Digit and Gesture Recognition," IEEE Transactions on Industrial Electronics, vol. 59, no. 7, pp. 2998-3007, 2012.
[46] G. E. Hinton and S. T. Roweis, "Stochastic neighbor embedding," Advances in Neural Information Processing Systems, pp. 833-840, 2002.
[47] F. H. M. Oliveira, A. R. P. Machado and A. O. Andrade, "On the Use of t-Distributed Stochastic Neighbor Embedding for Data Visualization and Classification of Individuals with Parkinson's Disease," Computational and Mathematical Methods in Medicine, vol. 2018, Article ID 8019232, 17 pages, 2018.
[48] M. R. Ahsan, M. I. Ibrahimy and O. O. Khalifa, "Electromyography (EMG) Signal based Hand Gesture Recognition using Artificial Neural Network (ANN)," 4th International Conference on Mechatronics (ICOM), 2011.
[49] R. Islam, S. A. Khan and J.-M. Kim, "Discriminant Feature Distribution Analysis-Based Hybrid Feature Selection for Online Bearing Fault Diagnosis in Induction Motors," Journal of Sensors, Article ID 7145715, 16 pages, 2016.
[50] M. A. K. Quaid and A. Jalal, "Wearable sensors based human behavioral pattern recognition using statistical features and reweighted genetic algorithm," Multimed. Tools Appl., 2020.
[51] S. Rakesh, G. Kovács, H. Mokayed, R. Saini and U. Pal, "Static Palm Sign Gesture Recognition with Leap Motion and Genetic Algorithm," 2021 Swedish Artificial Intelligence Society Workshop (SAIS), 2021.
[52] S. Canavan, W. Keyes, R. Mccormick, J. Kunnumpurath, T. Hoelzel and L. Yin, "Hand gesture recognition using a skeleton-based feature representation with a random regression forest," 2017 IEEE International Conference on Image Processing (ICIP), 2017.
[53] W. Liu, Y. Fan, T. Lei and Z. Zhang, "Human gesture recognition using orientation segmentation feature on random forest," 2014 IEEE China Summit & International Conference on Signal and Information Processing (ChinaSIP), 2014.
[54] K. N. Trong, H. Bui and C. Pham, "Recognizing hand gestures for controlling home appliances with mobile sensors," 2019 11th International Conference on Knowledge and Systems Engineering (KSE), 2019.
[55] A. Jalal, M. Quaid, S. Tahir and K. Kim, "A study of accelerometer and gyroscope measurements in physical life-log activities detection systems," Sensors, 2020.
[56] A. Rafique, A. Jalal and K. Kim, "Automated sustainable multi-object segmentation and recognition via modified sampling consensus and Kernel sliding perceptron," Symmetry, 2020.
[57] A. Jalal, I. Akhtar and K. Kim, "Human posture estimation and sustainable events classification via Pseudo-2D stick model and K-ary tree hashing," Sustainability, 2020.
[58] A. Jalal, M. Batool and K. Kim, "Sustainable wearable system: Human behavior modeling for life-logging activities using K-Ary tree hashing classifier," Sustainability, 2020.