

AR Watch Try-On Application for Android Devices


1Madan Mohan M., 2D. Santhosh Kumar, 3Ajay K.
1Assistant Professor, 1,2,3Department of Computer Science Engineering,
Nehru Institute of Engineering and Technology, Coimbatore.
[email protected], [email protected], [email protected]

Abstract: In recent days, Augmented Reality has become an emerging trend in marketing and sales
strategies. Augmented reality ads are immersive, which means they help marketers create a
certain emotional connection with customers. Unlike images or banners, for example, AR ads
are interactive and lifelike; consumers can see and even interact with them. Nowadays people
prefer online shopping to traditional window shopping, and Augmented Reality allows brands to
give customers unique experiences with the convenience of tapping into their mobile devices.
The main purpose of the “AR Watch Try-On Application” is therefore to develop an Android
application for trying on different watches virtually using a mobile device that supports an AR
camera. This application can be used on online watch shopping websites and applications such
as Titan, Fastrack, Sonata and so on. The application eliminates the effort of physically visiting
watch shops, which is a very time-consuming activity. The user can try out multiple watches and
different variants of those watches.

Keywords: Augmented Reality, emerging trend, Augmented Reality ads, interactive and
lifelike, virtual component, AR camera, time-consuming activity.
Introduction
Augmented reality has been a hot topic in software development circles for a number
of years, but it’s getting renewed focus and attention with the release of products like Google
Glass. Augmented reality is a technology that works on computer vision-based recognition
algorithms to augment sound, video, graphics and other sensor-based inputs on real world objects
using the camera of your device.

Corresponding Author: M. Madan Mohan, Asst. Professor, Department of CSE,
Nehru Institute of Engineering and Technology, Coimbatore.
Mail: [email protected]


It is a good way to render real world information and present it in an interactive way so that
virtual elements become part of the real world.
Augmented reality displays superimpose information in your field of view and can take
you into a new world where the real and virtual worlds are tightly coupled. It is not just limited
to desktop or mobile devices.
A simple augmented reality use case: a user captures the image of a real-world object, the
underlying platform detects a marker, and this triggers the platform to add a virtual object on
top of the real-world image and display it on the camera screen.
LITERATURE SURVEY
Research on augmented reality technology has led to the development of various
applications in the field of computer science. This literature review shows how augmented
reality has been implemented in various fields using Unity 3D.
[1] Santosh Sharma, Yash Kaikini, Parth Bhodia and Sonali Vaidya have proposed a
technique named “Markerless Augmented Reality based Interior Designing System”,
which uses markerless augmented reality as a basis for enhancing user experience and for a
better perception of things. Its advantage is that no markers are needed on the surface area; its
disadvantage is that the object is aligned with the camera, so it moves as the camera moves.
[2] Snehal Mangale, Nabil Phansopkar, Safwaan Mujawar and Neeraj Singh have
proposed a technique named “Virtual Furniture Using Augmented Reality”, a web-based
application in which the user has to place a marker in the room where they want to try out
furniture items. The user’s webcam is turned on and captures a live feed of the room. The
application captures the image and passes it through a predefined marker detection algorithm,
which is based on image processing techniques and uses colour and other properties as input
to detect the marker. The user initially selects the furniture to be placed from the given
database. The application then superimposes the furniture on the original image with its centre
coinciding with the marker’s centre in both directions. Furniture objects are overlaid onto the
two-dimensional image frame acquired from the webcam, so they appear as if actually placed
in the real world. Finally, the user can view how the area looks with the furniture present.
[3] Khushal Khairnar, Kamleshwar Khairnar, Sanketkumar Mane and Rahul
Chaudhari have proposed a technique named “Furniture Layout Application Based on
Marker Detection and Using Augmented Reality” to develop an application in which the user
has to place a marker in the room where they want to try out furniture items. The user’s
webcam is turned on and captures the live feed of the room. The application then searches for
the marker using a fiducial marker detection algorithm.

EXISTING SYSTEM
Traditional methods of advising and assisting customers have relied upon a combination
of verbal explanations and 2D drawings in online shopping applications. However, this medium
is clearly restricted to the explanations provided to the customer for a particular watch model,
leaving the customer less certain and more confused about buying the watch.
The main drawbacks of the existing system are:
• A static view of the design, which is unable to convey the product fully.
• The customer cannot determine whether the watch will fit their needs.
• Information such as size and comfort cannot be known.
PROBLEM DEFINITION
Customers purchase various types of watches online, but online stores show only a photo
or a 2D image, so customers cannot determine whether a watch is suitable for them. To
overcome this, the proposed application lets the customer check whether a watch is suitable by
placing it on their own hand using augmented reality.
Our application is a step in this direction, allowing users to view a 3D rendered model,
a virtual resemblance of the watch, which can be viewed and configured in real time using our
augmented reality application.
This study proposes a new method for applying Augmented Reality technology to
watches, where a user can view virtual watches and interact with 3D virtual watch data using a
dynamic and flexible user interface.
PROPOSED SYSTEM
With an augmented reality application, this can be easily achieved. People today are
well versed with the technology and operate smartphones that support AR. Thus, creating a
watch try-on application brings the designer a step closer to being technologically advanced.
Better cameras and more accurate sensors have recently emerged in soon-to-be
mainstream devices. In our current implementation of the application, we use the Vuforia
framework to accurately detect the real-world environment, such as the locations of walls and
points of intersection, allowing users to place virtual objects into a real context.
The proposed system uses image-tracking augmented reality as a basis for enhancing user
experience and for a better perception of things. In image tracking, the user points the phone at
an image; the app scans and recognises that image and overlays a 3D model on top of it.
The basic premise of the proposed system is to overlay digital 3D models on top of a
target image using the camera.

• This application uses an AR-supported mobile phone to scan the target image and
display the augmented watch object, so the user can check whether it fits their hand and
preferences (see the sketch after this list).
• Blender is software that offers a comprehensive creative feature set for 3D computer
animation, modelling, simulation, rendering and compositing.
• The next step involves setting up the light, shadow and camera positioning of these
models using various components of Unity 3D.
• Next, the watch model is selected, and the selected model is rendered and processed to
be loaded onto the scanned target image by the Vuforia framework.
• Mapping of the 3D model onto the smartphone screen then takes place, which decides
the dimensions and appearance of the model that is rendered and displayed on the screen.
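The first step above can be illustrated with a small Unity script. The following is a minimal
sketch in C#, the language used elsewhere in this project; the class name, the serialized fields
and the way the two public methods are wired to Vuforia's target-found and target-lost events
(for example through its default trackable/observer event handler) are assumptions for
illustration, not the published implementation.

using UnityEngine;

// Minimal sketch: shows or hides the augmented watch model when the image
// target is found or lost. The methods below are intended to be wired to
// whatever target-found / target-lost events the installed Vuforia version
// exposes; that wiring is an assumption, not part of this paper.
public class WatchTargetHandler : MonoBehaviour
{
    [SerializeField] private GameObject watchModel;   // 3D watch imported from Blender
    [SerializeField] private GameObject detailsPanel; // UI panel with name, price, variants

    private void Start()
    {
        // Hide everything until the target image is recognised.
        watchModel.SetActive(false);
        detailsPanel.SetActive(false);
    }

    public void OnTargetFound()
    {
        // Overlay the watch on the scanned target and show its details.
        watchModel.SetActive(true);
        detailsPanel.SetActive(true);
    }

    public void OnTargetLost()
    {
        watchModel.SetActive(false);
        detailsPanel.SetActive(false);
    }
}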

FLOW DIAGRAM

MODEL IMPLEMENTATION
Creating Augmented Reality Objects.
First, we establish the virtual models: Blender is used to create the 3D watch models,
which mainly use polygon and NURBS modelling methods, while the animation mainly uses
keyframe and expression animation techniques. After the models are established in Blender,
four important pieces of information are stored in each model’s file: the model’s vertex
coordinates, texture coordinates, normal coordinates and the total number of polygons. These
are the main data used when rendering the model.
The application stores them in memory and reads them to render the models when the
rendering function is called. The data quantity of a model is very large, so a loading module is
needed to load it into the program conveniently. We then export the model data; the file
exported by Blender is an .obj file, which stores the above information. Next, a model loader
converts this information into a form usable by the program, and the model data is obtained
by calling the header file. After loading the model data, we can render and display it in the
scene through Unity 3D.
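As an illustration of this loading step, the following is a minimal C# sketch; the
WatchModelLoader class name is hypothetical, the file is assumed to be a well-formed
Blender-exported .obj, and in practice Unity can also import .obj or .fbx assets directly through
its asset pipeline.

using System;
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using UnityEngine;

// Minimal sketch of the model-loading step: reads vertex ("v"),
// texture ("vt") and normal ("vn") records from a Blender-exported
// .obj file into lists that can later be used to build a mesh.
public static class WatchModelLoader
{
    public static void Load(string objPath,
                            List<Vector3> vertices,
                            List<Vector2> uvs,
                            List<Vector3> normals)
    {
        foreach (string line in File.ReadLines(objPath))
        {
            string[] t = line.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries);
            if (t.Length == 0) continue;

            switch (t[0])
            {
                case "v":   // vertex coordinate
                    vertices.Add(new Vector3(P(t[1]), P(t[2]), P(t[3])));
                    break;
                case "vt":  // texture coordinate
                    uvs.Add(new Vector2(P(t[1]), P(t[2])));
                    break;
                case "vn":  // normal coordinate
                    normals.Add(new Vector3(P(t[1]), P(t[2]), P(t[3])));
                    break;
            }
        }
    }

    private static float P(string s) =>
        float.Parse(s, CultureInfo.InvariantCulture);
}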

The image above shows how the 3D objects for this application are developed using
Blender; it presents the front view and the 3D view of the object that has been created.
Developing Scenes for User Interface
In this module we create a scene for every slide of the application using Unity 3D. The main
interface is operated by sliding (browsing) and by selecting keys. The main interface contains the
watch models, buttons that change colours, and descriptions of the watches such as model name,
price and variants. The watch column stores the key of every watch, displays one watch at a time
and also supports sliding browsing. In order to implement these functions, we match the scene
display to the aspect ratio of the Android display and add buttons to the scene that move to the
next scene, as sketched below.
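A minimal sketch of this scene navigation follows; the scene name "Watch_01" and the field
names are placeholders for the per-model scenes actually built in the project.

using UnityEngine;
using UnityEngine.SceneManagement;
using UnityEngine.UI;

// Minimal sketch of the scene navigation described above: a "Next"
// button loads the scene of the next watch model.
public class WatchSceneNavigator : MonoBehaviour
{
    [SerializeField] private Button nextButton;
    [SerializeField] private string nextSceneName = "Watch_01"; // placeholder scene name

    private void Awake()
    {
        // Hook the UI button up to the scene change.
        nextButton.onClick.AddListener(() => SceneManager.LoadScene(nextSceneName));
    }
}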


The image above shows the Unity 3D editor used to create the scene for a watch model
imported from Blender. An individual scene is created for every watch model, and finally all the
scenes are combined.
Place the Virtual Object on the Surface Area
In this scenario, we use the Vuforia package, which is imported into Unity 3D and
modified so that it scans the target image where the virtual object is to be placed in the real
world. Once the modifications to Vuforia are done, we create the scene so that, after the surface
area has been scanned and the user taps the touch screen, the virtual 3D model is rendered and
registered with the user’s hand, letting the user verify whether the watch model suits their needs.
The user can drag and drop the virtual watch model as desired in the real scene via the user
interface provided at this stage, as sketched below.
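The tap-and-drag interaction can be sketched as follows; the surface collider layer is an
assumption standing in for whatever collider the Vuforia target provides in a given project, and
the script is illustrative rather than the published implementation.

using UnityEngine;

// Minimal sketch of the tap-and-drag interaction described above:
// once the target has been scanned, a touch ray cast against the detected
// surface repositions the virtual watch at the touched point.
public class WatchPlacementController : MonoBehaviour
{
    [SerializeField] private Transform watchModel;
    [SerializeField] private LayerMask surfaceMask; // layer of the tracked surface collider

    private void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began && touch.phase != TouchPhase.Moved) return;

        // Cast a ray from the camera through the touch point onto the surface.
        Ray ray = Camera.main.ScreenPointToRay(touch.position);
        if (Physics.Raycast(ray, out RaycastHit hit, 100f, surfaceMask))
        {
            // Drag-and-drop: move the watch to where the user touched.
            watchModel.position = hit.point;
        }
    }
}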
Verification of placed objects
Once the user feels that the object suits their needs, they can check the description by
selecting the information button, which describes the colour, model and price of the watch. To
display this description, we create another scene that shows all the required information. We also
add a button that changes the colour of the model; this colour change is programmed in C#, as
sketched below.
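A minimal C# sketch of this colour-change button is given below; the renderer, button and
colour list are placeholders configured in the Unity Inspector.

using UnityEngine;
using UnityEngine.UI;

// Minimal sketch of the colour-change button described above: each press
// cycles the watch material through a list of colours.
public class WatchColorSwitcher : MonoBehaviour
{
    [SerializeField] private Renderer watchRenderer;
    [SerializeField] private Button colorButton;
    [SerializeField] private Color[] colors = { Color.black, Color.red, Color.blue };

    private int index;

    private void Awake()
    {
        colorButton.onClick.AddListener(NextColor);
    }

    private void NextColor()
    {
        index = (index + 1) % colors.Length;
        // Use material (not sharedMaterial) so only this watch instance changes colour.
        watchRenderer.material.color = colors[index];
    }
}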
OUTPUT SCREENS
Home page
When the application is launched, the home page appears, where the user can select the
watch model they want to buy and verify whether it suits them, and the target image is shown
in the camera view.

Placing the object scene
Once the target image has been scanned and its feature points obtained, the watch becomes
visible on the screen and the UI with the watch details pops up. Along with that, UI buttons for
changing colours and viewing the information of the watch are shown.

CONCLUSION
The main objective of this “Augmented Reality Watch Try-on Application” is to analyse
the use of augmented reality to render a watch model in the real world. Augmented reality
technology allows customers to decide on and interact with watches in the real world, offering
new possibilities for online shopping. It helps the customer to view the watches and understand
whether they meet their requirements, and customers come to know that they can buy watches
whenever and wherever they want. Augmented reality support for watches helps create many
new opportunities for future research and new ideas in the field of online shopping, as customers
benefit from these types of applications, which give them a better understanding and better
decision making for purchasing a watch in an efficient way. Augmented reality is a new and
evolving technology in the field of computer science and will be much more helpful than
traditional technologies.
FUTURE SCOPE
In the future, the dataset and scope of our “Augmented Reality Watch Try-on Application”
will be scalable. Users might not only try out different watch models but also use the same
approach to try on garments, goggles, hair styles and so on. It can also be used for various
applications in shopping malls, interior design, medical science, etc. New technologies may
emerge in the future that help in developing 3D models automatically.

REFERENCES
1. A Rapid Deployment Indoor Positioning Architecture based on Image Recognition. In
Proceedings of the IEEE 7th International Conference on Industrial Engineering and
Applications (ICIEA), Bangkok, Thailand, 16–21 April 2020; pp. 784–789; Gerstweiler, G.;
Vonach, E.; Kaufmann, H. HyMoTrack.
2. Tahir Ahmed T.; Vijaya Shetty S.; R. Samira Simha; Sushmitha Bedere J. Performance
Evaluation of Augmented Reality based 3D Modelling Furniture Application, 2018.
3. (c2018). Language Integrated Query (LINQ), [Online]. Available:
https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/concepts/linq/
(visited on 05/12/2018).
4. (2017). Unity ARKit Plugin, [Online]. Available:
https://assetstore.unity.com/packages/essentials/tutorial-projects/unity-arkit-plugin-92515
(visited on 05/02/2018).
5. M. Lanham, Augmented Reality Game Development. Packt Publishing, 2017, ISBN:
1787122883 9781787122888.
6. Mami Mori, Jason Orlosky, Kiyoshi Kiyokawa, Haruo Takemura. (2016, Sep.). A Transitional
AR Furniture Arrangement System with Automatic View Recommendation. IEEE
Adjunct. [Online]. 21(3).pp.21-24. ISBN: 978-1-5090-3740-7. Available:
https://ieeexplore.ieee.org/document/7836488
7. Snehal Mangale, Nabil Phansopkar, Safwaan Mujawar, Neeraj Singh. (2016, May). Virtual
Furniture Using Augmented Reality. IOSR Journal of Computer Engineering. [Online].
e-ISSN: 2278-0661, p-ISSN: 2278-8727, pp. 42-46. Available:
http://www.iosrjournals.org/iosr-jce/papers/Conf.16051/Volume-1/9.%204246.pdf?id=7557
8. Khushal Khairnar, Kamleshwar Khairnar, Sanketkumar Mane, Rahul Chaudhari. (2015, Oct.).
Furniture Layout Application Based on Marker Detection. International Research Journal of
Engineering and Technology. [Online]. 02(07). p-ISSN: 2395-0072, e-ISSN: 2395-0056.
Available: https://www.irjet.net/archives/V2/i7/IRJET-V2I780.pdf
9. Indoor localization and navigation using smartphones augmented reality and inertial
tracking. In Proceedings of the IEEE International Conference on Electronics Circuits and
Systems, Abu Dhabi, UAE, 8–11 December 2013; pp. 929–932; Nam, G.H.; Seo, H.S.; Kim,
M.S.; Gwon, Y.K.; Lee, C.M.; Lee, D.M. AR-based Evacuation Route Guidance.
10. Elizabeth Carvalho, Gustavo Maçães, Isabel Varajão, Nuno Sousa and Paulo Brito. (2011,
Nov.). Use of Augmented Reality in the furniture industry. Presented at Center for
Computer Graphics. [Online]. Available:
https://www.researchgate.net/publication/236863499_Use_of_Augmented_Reality_in_the_furniture_industry

11. Billinghurst, M. (2002) “Augmented reality in education” in New Horizons for Learning,
2nd ed., vol. 3, New York: McGraw-Hill, 2010, pp. 123-135.
12. Karthick, R., et al. “Overcome the challenges in bio-medical instruments using IOT–A
review.” Materials Today: Proceedings (2020). https://doi.org/10.1016/j.matpr.2020.08.420
13. Karthick, R., et al. “A Geographical Review: Novel Coronavirus (COVID-19) Pandemic.” A
Geographical Review: Novel Coronavirus (COVID-19) Pandemic (October 16, 2020). Asian
Journal of Applied Science and Technology (AJAST)(Quarterly International Journal) Volume
4 (2020): 44-50.
14. Sathiyanathan, N. “Medical Image Compression Using View Compensated Wavelet
Transform.” Journal of Global Research in Computer Science 9.9 (2018): 01-04.
15. Karthick, R., and M. Sundararajan. “SPIDER-based out-of-order execution scheme for Ht-
MPSOC.” International Journal of Advanced Intelligence paradigms 19.1 (2021): 28-41.
https://doi.org/10.1504/IJAIP.2021.114581
16. Sabarish, P., et al. “An Energy Efficient Microwave Based Wireless Solar Power Transmission
System.” IOP Conference Series: Materials Science and Engineering. Vol. 937. No. 1. IOP
Publishing, 2020. doi:10.1088/1757-899X/937/1/012013
17. Vijayalakshmi, S., et al. “Implementation of a new Bi-Directional Switch multilevel Inverter
for the reduction of harmonics.” IOP Conference Series: Materials Science and Engineering.
Vol. 937. No. 1. IOP Publishing, 2020. doi:10.1088/1757-899X/937/1/012026
18. Karthick, R., and M. Sundararajan. “Hardware Evaluation of Second Round SHA-3
Candidates Using FPGA (April 2, 2014).” International Journal of Advanced Research in
Computer Science & Technology (IJARCST 2014) 2.2.
19. Karthick, R., et al. “High resolution image scaling using fuzzy based FPGA implementation.”
Asian Journal of Applied Science and Technology (AJAST) 3.1 (2019): 215-221.
20. P. Sabarish, R. Karthick, A. Sindhu, N. Sathiyanathan, Investigation on performance of solar
photovoltaic fed hybrid semi impedance source converters, Materials Today: Proceedings,
2020, https://doi.org/10.1016/j.matpr.2020.08.390
21. Karthick, R., A. Manoj Prabaharan, and P. Selvaprasanth. “Internet of things based high
security border surveillance strategy.” Asian Journal of Applied Science and Technology
(AJAST) Volume 3 (2019): 94-100.
22. Karthick, R., and M. Sundararajan. “A novel 3-D-IC test architecture-a review.” International
Journal of Engineering and Technology (UAE) 7.1.1 (2018): 582-586.
23. Karthick, R., and M. Sundararajan. “Design and implementation of low power testing using
advanced razor based processor.” International Journal of Applied Engineering Research
12.17 (2017): 6384-6390.

24. Karthick, R., and M. Sundararajan. “A Reconfigurable Method for Time Correlated MIMO
Channels with a Decision Feedback Receiver.” International Journal of Applied Engineering
Research 12.15 (2017): 5234-5241.
25. Karthick, R., and M. Sundararajan. “PSO based out-of-order (ooo) execution scheme for HT-
MPSOC.” Journal of Advanced Research in Dynamical and Control Systems 9 (2017): 1969.
26. Karthick, R. “Deep Learning For Age Group Classification System.” International Journal Of
Advances In Signal And Image Sciences 4.2 (2018): 16-22.
27. Karthick, R., and P. Meenalochini. “Implementation of data cache block (DCB) in shared
processor using field-programmable gate array (FPGA).” Journal of the National Science
Foundation of Sri Lanka 48.4 (2020). http://doi.org/10.4038/jnsfsr.v48i4.10340
28. Suresh, Helina Rajini, et al. “Suppression of four wave mixing effect in DWDM system.”
Materials Today: Proceedings (2021). https://doi.org/10.1016/j.matpr.2020.11.545
29. M. Sheik Dawood, S. Sakena Benazer, N. Nanthini, R. Devika, R. Karthick, Design of rectenna
for wireless sensor networks, Materials Today: Proceedings, 2021.
https://doi.org/10.1016/j.matpr.2020.11.905
30. M. Sheik Dawood, S. Sakena Benazer, R. Karthick, R. Senthil Ganesh, S. Sugirtha Mary,
Performance analysis of efficient video transmission using EvalSVC, EvalVid-NT, EvalVid,
Materials Today: Proceedings,2021. https://doi.org/10.1016/j.matpr.2021.02.287

ACKNOWLEDGEMENTS
The authors would like to thank our guide, Mr. M. Madan Mohan, for serving as the
backbone of the entire project, helping us from idea selection until the end, and for providing
alternative solutions whenever we faced obstacles. The authors would also like to thank
Dr. S. Subasree for her continuous encouragement and motivation.
