Design and Experimental Analysis of Touchless Interactive Mirror
Attiya Baqai et al., Mehran University of Engineering and Technology, Jamshoro
International Journal of Advanced Computer Science and Applications, Vol. 11, No. 5, 2020
DOI: 10.14569/IJACSA.2020.0110562
Abstract—A prototype of a smart gesture-controlled mirror with enhanced interactivity is proposed and designed in this paper. With the help of hand gestures, the mirror provides basic amenities such as the time, news and weather. The designed system uses a Pi camera for image acquisition to perform functions like gesture recognition, and an ultrasonic sensor for presence detection. This paper also discusses an experimental analysis of human gesture interaction using parameters such as the detectable angle, which ranges from 0 to 37 degrees when tilting the forearm up and down and from 0 to 15 degrees when the forearm is twisted from right to left and vice versa, measured against yellow, pink and white background colours. Additionally, the detection range of the ultrasonic sensor is restricted to the active region of 69.6 to 112.5 degrees. Moreover, the time delay is about half a second for retrieving the time, which is kept with the system, and 6 to 10 seconds for fetching headlines and weather information from the internet. These analyses are taken into account to subsequently improve the design algorithm of the gesture-controlled smart mirror. The framework developed comprises three different gesture defects, according to which the mirror displays the corresponding information on its screen.

Keywords—Smart Mirror; Raspberry Pi; Pi-camera; Application Programming Interface; Hue Saturation Value; Region of Interest

I. INTRODUCTION

Human beings have always striven to create new technology to make their lives easy, swift and comfortable. Smart devices [1-5], including mobile phones, TVs, refrigerators, etc., are part of that chain. In today's fast-paced life, one is strained to multitask. This research involves making the mirror interactive so that users can not only beautify themselves but can also perform regular chores like checking the weather, accessing their inbox or looking over a to-do list.

Keeping in view the limitations of voice-controlled devices, the interaction here is gesture-controlled, making the proposed design also useful for the speech impaired. The proposed framework utilizes finger count gestures to fetch and display the time, weather and news from a specified internet location. The efficiency of the interactive mirror is analyzed by observing the effect of different gestures from different angles, using different backgrounds, and their timely response in displaying the intended information.

Moreover, relative to other touch- or voice-controlled smart mirrors, this device is also useful for hearing- and speech-impaired users. Furthermore, it does not restrict the user to the close vicinity, unlike conventional touch-based smart mirrors. The paper is organized as follows: work related to smart mirrors and gesture-controlled hardware and software is reviewed in Section 2, whereas Section 3 illustrates the design steps and development of the prototype. Section 4 illustrates the software algorithm used in the system. Section 5 describes the analysis of the developed system and the interpretation of the results obtained. The paper is concluded in Section 6.

II. LITERATURE REVIEW

Many researchers have proposed various methods and interaction techniques for smart mirrors. The intervention of gestures to interact with devices gave a new direction to smart homes and smart devices. The author in [1] proposed SmartReflect, a software platform for developing smart mirror applications. The authors in [2-3] use external computers, additional controllers, sensors and Application Program Interface (API) integration to operate and control smart mirrors. With the use of a spy mirror foil, [4] uses the Wii Balance Board and Kinect to reflect the user's position and gestures. The author in [5] presents a survey of smart notice board mirrors. Most of the smart mirrors developed so far are thoroughly based on displaying information, with less emphasis on a quick and easy way of doing so. Despite using fewer components, additional sensors, external computers, controllers and software platforms are needed to get it done. The author in [6] uses face recognition authentication to provide access to data feeds on the smart mirror. The devices in [7,8] take voice commands to control the mirror. Over time, different interaction mechanisms like touch and voice have been introduced to provide a better experience.
People prefer to interact with devices with as little effort as possible. A survey in [9] illustrates convenient ways of gesture interaction over touch and voice with different algorithms. To formulate the best technique for mid-air interaction, several methods have been considered: [10] uses the dollar method with the best trajectory, like multi-finger, for processing short command gestures. The author in [11] proposed a finger transformation algorithm for controlling mobile devices. Moreover, to select commands on a multi-touch display, a finger count gesture technique and its analysis are introduced in [12]. The main limitation of these methods is that simple point-to-point trajectory comparison works well with short gestures and few features, but would probably not scale well to long single or multiple trajectories, as required by different application domains. Besides that, the external environment plays a vital role in mid-air interaction between human and machine. An algorithm in [13] is proposed to overcome the effect of lighting conditions on the gesture interface using unit-gradient vectors (UGV), background subtraction methods and Hue Saturation Value (HSV) thresholding. To this end, gestures are also performed using different objects around users. The author in [14] uses experimental results to formulate a method that helps users interact through tailored gesture profiles for objects existing in the omnipresent environment. However, the omission of environmental interference in interaction is still a challenge. The authors in [15,16] used a tracking device and reveal temporal-division and spatial-division gesture techniques for interacting with large public displays. To formulate the best technique for hand gestures, [17] used the object contour method for a fast and accurate hand detection and tracking algorithm. The author in [18] improves gesture recognition by focusing spatial channels on the hands. The author in [19] uses a Raspberry Pi with an external camera to capture finger gestures using colour markers. Likewise, [20] illustrates a remote-free approach to controlling an LED using hand gestures. Besides that, [21,22] proposed the Patch-Levy-based Bees Algorithm (PLBA) and a Median-Average Otsu's thresholding method to overcome the limitations of the traditional Otsu's thresholding method. In contrast to these complex algorithms, we have used the traditional Otsu's method and derived better results.

In this paper, we have critically analyzed the responses of gestures under a finger count mechanism using techniques like background/foreground segmentation and colour information of the hand to subtract hand images. Although several robust techniques and algorithms have been proposed to optimally execute these steps, we deliberately use contours, convex hulls, convexity defects and traditional techniques to design robust algorithms that provide a better interactive experience.

III. IMPLEMENTATION OF THE PROPOSED SYSTEM

This section illustrates the hardware components used to design the interactive mirror. The designed interactive mirror is composed of a Raspberry Pi controller, a display module, a Pi camera and an ultrasonic sensor. The mirror displays information on its surface when connected to the internet via WiFi. The system conceptual diagram is shown in Fig. 1(a). The proposed mirror supports comfort and additional flexibility through its gesture-controlled mechanism. A detailed description of the hardware and software is presented as follows.

A. Hardware

The mirror is composed of a Samsung LED display module upon which a 1 mm acrylic sheet is pasted, such that it reflects the person when no light passes through it and displays information when light does pass through. With such a small acrylic sheet thickness, as shown in Fig. 1(b), it offers high-quality visualization of the information displayed on the screen, as shown in Fig. 2. Furthermore, the Pi camera module v2 is used to capture hand gestures, which are analyzed with computer vision techniques to interpret the different gestures performed by human fingers and display the corresponding information. The system also has the capability of detecting a person with an ultrasonic sensor, handled by a separate processing control based on an Arduino microcontroller. The power of the display monitor is controlled by a passive infrared sensor placed right at the bottom of the device. Taking input from the ultrasonic sensor's measurement of the person's distance from the mirror, it triggers an LED light transmitter to send the ON/OFF code to the built-in receiver of the monitor, turning the monitor on or off as shown in Fig. 2. The system with all its components is controlled by a Raspberry Pi.

Fig 1. (a) Conceptual Diagram of the Interactive Mirror; (b) The Interpretation of the Acrylic Two-Way Sheet.
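The paper runs the presence gating on a separate Arduino, as described above. Purely as a language-consistent illustration, the sketch below shows equivalent distance-gating logic in Python with RPi.GPIO; the pin numbers, the presence threshold and the simple on/off signalling toward the LED transmitter are assumptions, not the authors' implementation.

```python
# Sketch of the ultrasonic presence gating described above (illustrative only;
# the actual system implements this on an Arduino).
import time
import RPi.GPIO as GPIO

TRIG, ECHO, IR_TX = 23, 24, 25   # assumed BCM pin assignments

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG, GPIO.OUT)
GPIO.setup(ECHO, GPIO.IN)
GPIO.setup(IR_TX, GPIO.OUT)

def distance_cm():
    """Time an ultrasonic echo and convert it to a distance in cm."""
    GPIO.output(TRIG, True)
    time.sleep(10e-6)                 # 10 microsecond trigger pulse
    GPIO.output(TRIG, False)
    start = end = time.time()
    while GPIO.input(ECHO) == 0:      # wait for the echo to start
        start = time.time()
    while GPIO.input(ECHO) == 1:      # wait for the echo to end
        end = time.time()
    return (end - start) * 34300 / 2  # speed of sound, out and back

try:
    while True:
        present = distance_cm() < 150  # assumed presence threshold (cm)
        GPIO.output(IR_TX, present)    # drive the LED transmitter that sends
        time.sleep(0.5)                # the monitor's ON/OFF code
finally:
    GPIO.cleanup()
```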
IV. GESTURE DETECTION ALGORITHM

A basic contour detection method in the Region of Interest (ROI) is implemented for gesture recognition. The program algorithm is depicted in Fig. 3: it first imports the necessary packages along with the Date & Time, Weather and Newsfeed files. Next, video frames are captured, and the image is segmented to the hand region by cropping the captured frame. This cropped image is our Region of Interest. The ROI is further processed to improve quality by converting it to grayscale and subsequently to binary, i.e. high-intensity pixels are treated as one and low-intensity pixels as zero. Based on these binary values, the algorithm decides whether a pixel is of interest or not. After that, a Gaussian blur is applied to reduce noise and recover the shape of the tracked objects. Besides, Otsu's binarization thresholding method, through OpenCV, is used to highlight the particular colour range and automatically approximate the threshold value of a bimodal image from its histogram.
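A minimal OpenCV sketch of this preprocessing chain is given below; the ROI coordinates and the Gaussian kernel size are illustrative assumptions, not the authors' exact values.

```python
# Sketch of the ROI preprocessing chain: crop, grayscale, blur, Otsu threshold.
import cv2

def preprocess(frame):
    roi = frame[100:300, 100:300]                  # assumed hand-region crop
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)   # grayscale for binarization
    blurred = cv2.GaussianBlur(gray, (35, 35), 0)  # suppress noise
    # Otsu's method picks the threshold of the bimodal histogram automatically.
    _, binary = cv2.threshold(blurred, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    return roi, binary
```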
To this end, from the processed image in the ROI, the maximum contour is found in the region, and within that contour the convex hull and convexity defects are identified. The convex hull points are generally the tips of the fingers; from them, the convexity defects, the deepest points of deviation on the contour, are subsequently found. Later on, the cosine rule is applied to find the angles of all defects (which give the number of fingers extended); these angles must be greater than 90 degrees, as the system cannot interpret angles of less than 90 degrees. Moreover, it examines whether the maximum contour area falls in a range between 2000 and 10000 pixels and thereby decides whether the hand is within the specified range. When the hand is within the detection area, the system gives an indication and acknowledges the user to make the gesture. The system then counts the number of defects and, according to the number of defects shown in Fig. 4, fetches the different information summarized in Table I.

Fig 3. The Algorithm for Gesture Detection and Implementation (flow: activate camera → organize imports → capture video → segment hand → count convexity defects; 1 defect displays time, 2 display news, 3 display weather).

TABLE I. INFORMATION DISPLAYED ON DIFFERENT HAND GESTURES

S. No. | Gesture (no. of fingers) | Information Fetched
1      | Two                      | Time and Date
2      | Three                    | News Feed
3      | Four                     | Weather

Fig 4. Finger Count Gestures.
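A sketch of this counting stage, following the common OpenCV contour/convex-hull/convexity-defect recipe, is shown below. The contour-area bounds come from the text; the acute-angle test at the defect point follows the usual convention for separating extended fingers, and the dispatch dictionary mirrors Table I. Helper names are illustrative.

```python
# Sketch of convexity-defect counting and the Table I dispatch (illustrative).
import math
import cv2
import numpy as np

INFO = {1: "time & date", 2: "news feed", 3: "weather"}  # defects -> content

def count_defects(binary):
    contours, _ = cv2.findContours(binary, cv2.RETR_TREE,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    cnt = max(contours, key=cv2.contourArea)       # assume largest = hand
    if not 2000 <= cv2.contourArea(cnt) <= 10000:  # hand out of range
        return 0
    hull = cv2.convexHull(cnt, returnPoints=False)
    defects = cv2.convexityDefects(cnt, hull)
    if defects is None:
        return 0
    count = 0
    for s, e, f, _ in defects[:, 0]:
        start, end, far = cnt[s][0], cnt[e][0], cnt[f][0]
        a = np.linalg.norm(end - start)            # cosine rule on the
        b = np.linalg.norm(far - start)            # triangle at the defect
        c = np.linalg.norm(end - far)
        cos_angle = np.clip((b**2 + c**2 - a**2) / (2 * b * c), -1.0, 1.0)
        if math.degrees(math.acos(cos_angle)) <= 90:
            count += 1                             # valley between two fingers
    return count

# e.g. INFO.get(count_defects(binary), "no gesture")
```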
Fig 7. Experimental Setup for Presence Detection Analysis.

Fig 8. Time Delay Measurement for Calling Time & Date, Weather and Headlines Information.

Fig 9. Performance of Hand Gesture Recognition Accuracy with Different Backgrounds (accuracy percentage, 50-100, versus gesture type 2-5, for detecting the number of fingers in 18 trials).

V. RESULTS AND DISCUSSION

This section discusses the results obtained for the time taken to display information, the effect of finger count gestures on hand recognition accuracy, the effect of a user's deviation from the line of sight facing the mirror, variation in distance range, variation in ambient light, and the effect of changing background colours on hand detection accuracy.

From Fig. 8, one can observe the time delay for the different information displayed on the surface of the mirror, calculated using the time module in Python. Fig. 8 shows that the interactive mirror took on average 0.5 seconds to display the date and time at the specified location on the mirror screen, which is the least time since it is retrieved from the operating system's date and time. Moreover, it can also be observed that fetching the news headlines and weather information from the internet takes 6 to 10 seconds.
In order to analyze the gesture interactivity by contour, convex hull and convexity defects, a total of 18 readings were taken for each gesture to examine its accuracy. In Fig. 9, the 18 readings are taken on six different colour backgrounds (dark green, skin colour, white, pink, yellow, light green) with varying light intensity, as mentioned in the experimental setup.
The accuracy is calculated and plotted against gesture type using equation (1):

Accuracy = (Number of times the gesture is recognized easily / Total trials) × 100    (1)

The term "easily" refers to the best possible defect identification with a time delay of less than about 1 second. From Fig. 9, it is deduced that accuracy decreases from 94% to 55% as more fingers, or defects, must be recognized; for instance, a gesture recognized easily in 17 of the 18 trials scores about 94%, whereas 10 of 18 gives about 55%. The background colour and light intensity are interlinked, as every background is vulnerable for some gesture at some value of light intensity, especially when the light intensity is greater than 100 lux. The yellow background (fixed for the setup in Fig. 9) yielded better results than the other colours.

Fig. 10 shows the minimum distance between two fingers required for the 2-, 3- and 4-finger count gestures to be detected. For 2 fingers (the index and middle), the minimum distance should be 25 millimetres (mm) or greater; as per the algorithm, the two-finger gesture is not detected up to a distance of 21 mm. However, readings are abrupt from 21 mm to 24 mm, meaning that within this distance the finger count gesture is only sometimes detected. Moreover, for the 3-finger gesture, the gesture defect is not accepted up to 15 mm; it remains abrupt between 16 mm and 18 mm, and the count-3 gesture is detected at 18 mm or higher. Additionally, for the 4-finger count gesture, the no-detection band runs from 0 to 18 mm, the abrupt band from 19 to 21 mm, and the detection zone is 21 mm or higher. The prime reason for the difference in distance between the finger count gestures is the angle at which each finger is extended: for a normal human being, the index and middle fingers are extended at 90 degrees. In contrast, the ring finger is extended at 70 to 80 degrees when the remaining finger(s) are held by the thumb. In addition, when four fingers are extended, the degree of extension of the index and middle fingers is 90 degrees and the angle of extension of the ring finger ranges from 87 to 90 degrees, which differs from person to person. Consequently, the angle at which the little finger is extended is 88 to 90 degrees. Additionally, one limitation to note is that in this paper we deliberately analyze the basic contour detection algorithm without subtracting the background; this helps developers understand the effects of background interference and design robust contour detection algorithms for the gesture defects.

Fig 10. Minimum Distance between the Fingers for Finger Count Gesture Detection.
The finger count gesture is also prone to inclination of the forearm on the yellow, white and pink backgrounds. To take this into consideration, we attached a Samsung S7 Edge phone to the forearm and took readings; Fig. 11 gives the forward and backward angles of the forearm lifted at 3.5 feet (the usual placement of a mirror) with a light intensity of 100 lux, considered normal room light. For the yellow background, the detection range spans 0-19 degrees backward; once the backward angle of inclination exceeds 19 degrees, the gesture defect is no longer detected. Consequently, for the same background, the forward angle ranges from 0 to 37 degrees, as after 37 degrees the system does not detect the gesture. Moreover, for the pink background, the backward angle of inclination is 0 to 9 degrees and the forward angle 0 to 20 degrees for the gesture to be detected. However, for the white background, the angle of the forearm for gesture detection ranges from 0 to 28 degrees backward and 0 to 36 degrees forward.

Fig 11. Forward and Backward Tilt Angle with Different Colour Backgrounds.
From Fig. 12 it can be seen that the device functionality is also susceptible to the twist of the wrist: as the wrist is curled away from its actual position, gesture detection becomes unreliable, and this also differs with the background. To check this, we took 100 trials. For the yellow background, the detection range is 0 to 15 degrees to the right; once the right twist exceeds 15 degrees, the camera does not detect any gesture. Similarly, as the wrist twists to the left side, detection ranges from 0 to 10 degrees and there is no detection afterwards. Further, for the pink background, the range of twist angle at the left curl is 0 to 13 degrees, with no detection after 13 degrees.
Furthermore, for the right curl, the range is limited to 0 to 6 degrees. Additionally, for the white background, the left twist angle is restricted to 0 to 23 degrees and the right twist angle ranges from 0 to 15 degrees.

Fig 12. Right and Left Twist Angle of the Wrist with Different Backgrounds.

We took only these three backgrounds into consideration because on sharp red, or Red-Green-Blue (RGB) combinations where the red component exceeds 190 and the hue and saturation exceed 220, the system exhibits abrupt or no gesture detection. A similar behaviour is observed when the light intensity exceeds 120 lux: as is commonly understood, sharp light strongly affects the camera's shutter, and the existing algorithm is prone to light intensity. Similarly, the width of the fingers should be at least 1.8 cm for swift gesture count detection.
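For quick reference, these measured envelopes can be gathered into a small lookup; the sketch below encodes the limits reported above, and the helper itself is an illustrative assumption, not part of the system.

```python
# Detection envelopes measured above, keyed by background colour (degrees).
TILT_LIMITS = {   # background: (max backward tilt, max forward tilt)
    "yellow": (19, 37),
    "pink":   (9, 20),
    "white":  (28, 36),
}
TWIST_LIMITS = {  # background: (max left twist, max right twist)
    "yellow": (10, 15),
    "pink":   (13, 6),
    "white":  (23, 15),
}

def pose_detectable(bg, back=0, fwd=0, left=0, right=0):
    """True if the forearm pose lies inside the measured envelope."""
    back_max, fwd_max = TILT_LIMITS[bg]
    left_max, right_max = TWIST_LIMITS[bg]
    return (back <= back_max and fwd <= fwd_max and
            left <= left_max and right <= right_max)

print(pose_detectable("pink", fwd=25))  # False: pink allows only 0-20 deg forward
```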
Fig. 13 presents the presence detection analysis. We connected an ultrasonic sensor at the bottom centre of the mirror; the sensor sweeps a cone region of 0-180 degrees. At a cone angle of 0-56.5 degrees, the sensor does not detect the person and the mirror remains OFF. In the region of 56.6 to 69.5 degrees, the mirror's response is abrupt: at some instants it turns ON, and at similar instants within the same range it remains OFF. However, when the person comes within the range of 69.6 to 112.5 degrees, the mirror remains ON. Additionally, in the range of 112.6 to 123.75 degrees, the mirror again gives abrupt readings when it finds a person within this range. Further, in the angle range of 123.76 to 180 degrees, the mirror does not detect the person and remains OFF.

Fig 13. Presence Detection Analysis.
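A compact way to express the cone zones just described is a single classification helper; the sketch below is illustrative, with the zone boundaries taken from the measurements above.

```python
# Sketch of the presence-zone mapping described above (assumed helper).
def mirror_state(angle_deg):
    """Map the user's angular position in the sensor cone to mirror behaviour."""
    if 69.6 <= angle_deg <= 112.5:
        return "ON"        # reliable detection zone
    if 56.6 <= angle_deg <= 69.5 or 112.6 <= angle_deg <= 123.75:
        return "ABRUPT"    # intermittent / false detections
    return "OFF"           # 0-56.5 and 123.76-180: no detection

print(mirror_state(90))    # ON
print(mirror_state(120))   # ABRUPT
```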
VI. CONCLUSION

A touch-free reflective interface that displays informational data is designed in this paper. The designed interactive mirror acts as a traditional mirror while also providing instant access to information on the mirror surface, efficiently and effectively, saving time for the user. We have also carried out an analysis to improve the interaction mechanism, and we have used a simple and cost-effective method of interaction. The results show that the mirror performs better with light-coloured backgrounds; results worsen as the background colour becomes sharper, and could be further improved for sharp backgrounds by implementing background refinement algorithms to make the system robust. The presence mechanism works accurately when a user appears within the 69.6° to 112.5° cone of the ultrasonic sensor, with an abrupt response between 56.6° and 69.5°; when a user turns to 112.6° to 123.75°, it can produce false detections. This problem can be overcome by using a tunnel propagation sensor or multiple ultrasonic sensors to cater for side poses as well. It is recommended to carry out further study on improving the gesture recognition algorithm for continuous inputs from the user; artificial intelligence can also be added to this interactive mirror. Moreover, the background subtraction method can be used to overcome the gesture predisposition. Besides, with the help of this analysis, new techniques can be used to overcome gesture vulnerability. Additionally, swipe gestures and 3D dressing amenities can be designed into a future device.

ACKNOWLEDGMENTS

This research work has been carried out in the Top Quality Lab, IIT Building, and the Innovation and Entrepreneurship Center (IEC), Mehran University of Engineering and Technology (MUET), Jamshoro. The authors are thankful to the Institute of Information & Communication Technologies (IICT), Mehran University of Engineering and Technology, Jamshoro, for facilitating this research.
REFERENCES
[1] D. Gold, D. Sollinger, and Indratmo, "SmartReflect: A modular smart mirror application platform," 7th IEEE Annu. Inf. Technol. Electron. Mob. Commun. Conf. (IEMCON), 2016.
[2] S. Yong, G. Liqing, and D. Ke, "Design of Smart Mirror Based on Raspberry Pi," 2018 International Conference on Intelligent Transportation, Big Data & Smart City, pp. 6–9, 2018.
[3] V. Mehta, B. Walia, V. Rathod, and M. Kothari, "Smart and Interactive Home Using Raspberry Pi," International Journal of Computer Sciences and Engineering, vol. 6, no. 3, pp. 241–245, 2018.
[4] D. Besserer, J. Bäurle, A. Nikic, F. Honold, F. Schüssel, and M. Weber, "FitMirror: A Smart Mirror for Positive Affect in Everyday User Morning Routines," Proc. Work. Multimodal Anal. Enabling Artif. Agents Human-Machine Interact., pp. 48–55, 2016.
[5] A. R. Barse, "A Survey Paper on Smart Mirror Notice Board," International Journal of Science Technology Management and Research, vol. 2, no. 10, pp. 1–3, 2017.
[6] M. Anwar Hossain, P. K. A., and A. E. S., "Smart Mirror for Ambient Home Environment," J. Can. Dent. Assoc., vol. 70, no. 3, pp. 156–157, Mar. 2004.
[7] S. Athira, F. Francis, R. Raphel, N. S. Sachin, S. Porinchu, and S. Francis, "Smart mirror: A novel framework for interactive display," Proc. IEEE Int. Conf. Circuit, Power Comput. Technol. (ICCPCT), 2016.
[8] S. Paszkiel and R. Kania, "Interactive, Multifunctional Mirror – Part of a Smart Home," Informatics, Control, Meas. Econ. Environ. Prot., vol. 6, no. 1, pp. 66–68, 2016.
[9] J. B. Panchal and K. P. Kandoriya, "Survey on Hand Gesture Recognition Methods," International Journal of Advanced Research in Electrical, Electronics and Instrumentation Engineering, vol. 4, no. 11, pp. 271–278, 2014.
[10] F. M. Caputo, P. Prebianca, A. Carcangiu, L. D. Spano, and A. Giachetti, "Comparing 3D trajectories for simple mid-air gesture recognition," Comput. Graph., vol. 73, pp. 17–25, 2018.
[11] F. Chen, J. Deng, Z. Pang, M. Baghaei Nejad, H. Yang, and G. Yang, "Finger Angle-Based Hand Gesture Recognition for Smart Infrastructure Using Wearable Wrist-Worn Camera," Appl. Sci., vol. 8, no. 3, p. 369, 2018.
[12] G. Bailly, J. Müller, and E. Lecolinet, "Design and evaluation of finger-count interaction: Combining multitouch gestures and menus," Int. J. Hum. Comput. Stud., vol. 70, no. 10, pp. 673–689, 2012.
[13] J. Dulayatrakul, P. Prasertsakul, T. Kondo, and I. Nilkhamhang, "Robust implementation of hand gesture recognition for remote human-machine interaction," Proc. 7th Int. Conf. Inf. Technol. Electr. Eng. (ICITEE), pp. 247–252, 2015.
[14] A. Atia, S. Takahashi, and J. Tanaka, "Smart gesture sticker: Smart hand gestures profiles for daily objects interaction," Proc. 9th IEEE/ACIS Int. Conf. Comput. Inf. Sci. (ICIS), pp. 482–487, 2010.
[15] C. Jeffrey and B. Terra, "Touchless Tactile Displays for Digital Signage: Mid-air Haptics meets Large Screens," SIGCHI Conf. Hum. Factors Comput. Syst., pp. 1–4, 2018.
[16] R. Walter, G. Bailly, and J. Müller, "StrikeAPose: revealing mid-air gestures on public displays," SIGCHI Conf. Hum. Factors Comput. Syst., pp. 841–850, 2013.
[17] D. Lee, J. S. Kim, and H. J. Lee, "Fast hand and finger detection algorithm for interaction on smart display," Displays, 2018.
[18] P. Narayana, J. R. Beveridge, and B. A. Draper, "Gesture Recognition: Focus on the Hands," IEEE/CVF Conf. on Computer Vision and Pattern Recognition, 2018.
[19] G. S. N. and S. Jayanthy, "Virtual Control Hand Gesture Recognition System Using Raspberry Pi," Asian Res. Publ. Netw. J. Eng. Appl. Sci., vol. 10, no. 7, pp. 2989–2993, 2015.
[20] M. Singh and G. Singh, "Interfacing Raspberry Pi With LED to interact with hand gestures in MATLAB: A Research," vol. 3, no. 1, pp. 414–417, 2015.
[21] W. A. Hussein, S. Sahran, and S. N. H. S. Abdullah, "A fast scheme for multilevel thresholding based on a modified bees algorithm," Knowledge-Based Syst., vol. 101, pp. 114–134, 2016.
[22] C. Sha, J. Hou, and H. Cui, "A robust 2D Otsu's thresholding method in image segmentation," J. Vis. Commun. Image Represent., vol. 41, pp. 339–351, 2016.