A Virtual Reality Soldier Simulator with Body Area Networks for Team Training
Abstract
1. Introduction
- Cost-Effective Design: We designed and implemented our own sensor nodes, so that the fundamental operations of the inertial sensors can be adaptively adjusted for data acquisition and integration, giving the system a competitive advantage in cost.
- System Capacity: Based on the proposed simulator, a six-man squad is able to conduct military exercises that are very similar to real missions. Soldiers hold mission rehearsals in the virtual environment such that leaders can conduct tactical operations, communicate with their team members, and coordinate with the chain of command.
- Error Analysis: This work analyzes the quaternion error and further explores the sensing measurement errors. Based on the quaternion-driven rotation, the measurement relation between the earth frame and the sensor frame, including the quaternion error, can be fully described.
- System Delay Time: The update rate of inertial sensors is about 160 Hz (i.e., the refresh time of inertial sensors has a time delay of about 6 ms). The simulator is capable of training six men at a 60 Hz system update rate (i.e., the refresh time of the entire system needs about 16 ms), which is acceptable for human awareness with delay (≤40 ms).
- System Feedback: Instructors can provide feedback after mission rehearsals using the visual after action review (AAR) function in the simulator, which provides different views of portions of the action and a complete digital playback of the scenario, allowing a squad to review details of the action. Furthermore, instructors can analyze data from digital records and make improvements to overcome the shortcomings in the action (Figure 1). Accordingly, in the immersive virtual environment, soldiers and leaders improve themselves with respect to situational awareness, decision-making, and coordination skills.
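The timing budget claimed in the bullets above can be checked with simple arithmetic. In this sketch, the rates (160 Hz sensors, 60 Hz system) and the 40 ms human-awareness threshold are taken from the text; the variable names are ours.

```python
# Timing budget of the simulator, per the figures quoted above.
SENSOR_RATE_HZ = 160   # inertial-sensor update rate
SYSTEM_RATE_HZ = 60    # full-system update rate for a six-man squad
HUMAN_DELAY_MS = 40    # delay still acceptable for human awareness

sensor_refresh_ms = 1000 / SENSOR_RATE_HZ   # ~6.25 ms per inertial update
system_refresh_ms = 1000 / SYSTEM_RATE_HZ   # ~16.7 ms per full-system frame

# Both refresh times sit comfortably under the 40 ms awareness threshold.
assert sensor_refresh_ms <= HUMAN_DELAY_MS
assert system_refresh_ms <= HUMAN_DELAY_MS
```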
2. System Description
2.1. Sensor Modeling
2.2. Sensor Calibration
- Step 1: Given a fixed gesture, we measure the sensing data (i.e., the raw data) and calculate the measurement offsets.
- Step 2: Remove the offset and normalize the modified raw data to the maximum resolution of the sensor’s analog-to-digital converter. In this work, the calibrated results (CR) of the sensors are described by Equations (4)–(6).
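The two calibration steps above can be sketched as a single function. This is a minimal illustration, not the paper's exact formulation (its Equations (4)–(6)); the function and argument names are ours.

```python
import numpy as np

def calibrate(raw, offset, adc_max):
    """Step 1: subtract the offset measured in the fixed gesture.
    Step 2: normalize the result to the ADC's maximum resolution."""
    return (np.asarray(raw, dtype=float) - offset) / adc_max
```

For example, with a mid-scale offset of 512 on a 10-bit ADC (`adc_max = 1024`), a raw reading of 600 calibrates to (600 − 512) / 1024 ≈ 0.086.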
2.3. Information Processing
2.4. Communication and Node Authentication Procedures
2.5. System Initialization
3. Quaternion Representation
4. Performance Analysis
4.1. Rotation Matrix
4.2. Error Analysis
5. System Operating Procedures
5.1. Participants
5.2. Virtual Environment
5.3. Procedure
6. Experimental Results
6.1. Error Analysis
6.2. Simulated Training Performance
6.3. Discussion
7. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
Axis | | |
---|---|---|---
x | 0.95 | 0.98 | 0.96
y | 0.97 | 0.99 | 0.96
z | 0.99 | 0.98 | 0.94
Field | Description | Size
---|---|---
Header | Data to show packet number | 8 bits
Payload | Bone data size | 144 bits
| Total data length | |
| Soldier no. | |
| T-pose status | |
| Sensor node ID | |
| Total bones of a skeleton | |
| Yaw value of the bone | |
| Pitch value of the bone | |
| Roll value of the bone | |
Tail | Data to show end of packet | 8 bits
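The packet layout above can be framed in code. Only the 8-bit header, 8-bit tail, and the 144-bit (18-byte) payload total come from the table; the per-field widths inside the payload are illustrative assumptions, since the table does not give them individually, and the function name is ours.

```python
import struct

HEADER_FMT = "<B"           # packet number (8 bits)
PAYLOAD_FMT = "<HBBBBfff"   # 2+1+1+1+1+4+4+4 bytes = 18 bytes = 144 bits
TAIL_FMT = "<B"             # end-of-packet marker (8 bits)

def pack_bone_packet(packet_no, total_length, soldier_no, t_pose, node_id,
                     total_bones, yaw, pitch, roll, end_marker=0xFF):
    """Frame one bone-data packet: header, 144-bit payload, tail."""
    return (struct.pack(HEADER_FMT, packet_no)
            + struct.pack(PAYLOAD_FMT, total_length, soldier_no, t_pose,
                          node_id, total_bones, yaw, pitch, roll)
            + struct.pack(TAIL_FMT, end_marker))
```

With these assumed widths, each framed packet is 1 + 18 + 1 = 20 bytes on the wire.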
Data | Unit | Range
---|---|---
Soldier no. | N/A | 0~255
Friend or foe | N/A | 0~2
External control | N/A | 0/1
Team no. | N/A | 0~255
Rank | N/A | 0~15
Appearance | N/A | 0~255
BMI | kg/m² | 18~32
Health | Percentage | 0~100
Weapon | N/A | 0~28
Vehicle | N/A | 0~5
Vehicle seat | N/A | 0~5
Position X | Meter | N/A
Position Y | Meter | N/A
Position Z | Meter | N/A
Heading | Degree | −180~180
Movement | N/A | 0~255
Behavior | N/A | 0~255
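The avatar state table above maps naturally onto a record type. In this sketch the field names and range comments follow the table; the class name and the Python types are our assumptions.

```python
from dataclasses import dataclass

@dataclass
class AvatarState:
    """One avatar's state record, mirroring the table above."""
    soldier_no: int         # 0~255
    friend_or_foe: int      # 0~2
    external_control: bool  # 0/1
    team_no: int            # 0~255
    rank: int               # 0~15
    appearance: int         # 0~255
    bmi: float              # kg/m², 18~32
    health: int             # percentage, 0~100
    weapon: int             # 0~28
    vehicle: int            # 0~5
    vehicle_seat: int       # 0~5
    position_x: float       # meters
    position_y: float       # meters
    position_z: float       # meters
    heading: float          # degrees, −180~180
    movement: int           # 0~255
    behavior: int           # 0~255
```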
Fan, Y.-C.; Wen, C.-Y. A Virtual Reality Soldier Simulator with Body Area Networks for Team Training. Sensors 2019, 19, 451. https://doi.org/10.3390/s19030451