A Virtual Reality Soldier Simulator with Body Area Networks for Team Training

Yun-Chieh Fan, Chih-Yu Wen

Abstract

Soldier-based simulators have attracted increasing attention recently, with the aim of making training in complex military tactics more effective, so that soldiers can respond rapidly and logically to battlespace situations and to the commander's decisions on the battlefield. Moreover, body area networks (BANs) can collect training data that give greater insight into soldiers' physical actions and postures as they occur during routine training. Accordingly, to overcome the limited physical space of training facilities, an efficient soldier-based training strategy is proposed that integrates a virtual reality (VR) simulation system with a BAN to capture body movements such as walking, running, shooting, and crouching in a virtual environment. The performance evaluation shows that the proposed VR simulation system provides complete and substantial information throughout the training process, including detection, estimation, and monitoring capabilities.
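To make the data flow concrete, the minimal Python sketch below shows how per-segment orientation packets arriving from a BAN might drive an avatar in a VR engine. Every name in it (SensorPacket, Avatar, the sink and renderer interfaces) is an illustrative assumption, not the authors' implementation.

    # Sketch of the BAN-to-VR data flow described in the abstract.
    # All names are hypothetical; `sink` and `renderer` stand in for the
    # BAN driver and the VR engine, respectively.
    from dataclasses import dataclass

    @dataclass
    class SensorPacket:
        node_id: int       # which body segment the node is strapped to
        quaternion: tuple  # (w, x, y, z) orientation estimate after sensor fusion

    class Avatar:
        """Maps BAN node IDs to avatar joints and stores their orientations."""
        def __init__(self, node_to_joint):
            self.node_to_joint = node_to_joint  # e.g., {3: "right_upper_arm"}
            self.joint_rotations = {}

        def apply(self, packet):
            joint = self.node_to_joint.get(packet.node_id)
            if joint is not None:
                self.joint_rotations[joint] = packet.quaternion

    def training_frame(sink, avatar, renderer):
        """One frame: drain packets from the sink node, update the avatar, render."""
        for packet in sink.read_packets():     # sink-node driver, assumed interface
            avatar.apply(packet)
        renderer.draw(avatar.joint_rotations)  # VR engine call, assumed interface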

Keywords: body area network; training simulator; virtual reality.

Conflict of interest statement

The authors declare no conflict of interest.

Figures

Figure 1
System training effectiveness process.
Figure 2
System architecture: single-soldier layout (a); multi-soldier network (b).
Figure 3
Top view of the sensor nodes (left); the wearable sensor node (right).
Figure 4
Deployment of sensor nodes and the sink node.
Figure 5
Mean and variance of the gyroscope in the x, y and z directions.
Figure 6
Mean and variance of the accelerometer in the x, y and z directions.
Figure 7
Mean and variance of the magnetometer in the x, y and z directions.
Figure 8
The attitude angles of a sensor node placed on the upper arm when standing in a T-pose.
Figure 9
The sensor fusion process.
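A complementary filter is a common way to fuse gyroscope and accelerometer readings into a drift-corrected attitude estimate, and is one plausible reading of the fusion step pictured here. The Python sketch below illustrates the idea for the pitch axis only; the blend factor and the formula are textbook choices, not necessarily the authors' exact filter.

    import math

    def complementary_pitch(angle_prev, gyro_rate, accel, dt, alpha=0.98):
        """One update of a complementary filter for pitch (radians).

        angle_prev: previous pitch estimate
        gyro_rate:  angular rate about the pitch axis (rad/s)
        accel:      (ax, ay, az) accelerometer sample
        alpha:      blend factor; close to 1 trusts the smooth gyro short-term
        """
        ax, ay, az = accel
        # Gravity gives an absolute but noisy pitch reference.
        accel_pitch = math.atan2(-ax, math.hypot(ay, az))
        # Integrating the gyro tracks fast motion but drifts over time.
        gyro_pitch = angle_prev + gyro_rate * dt
        # Blend: gyro for high-frequency motion, accelerometer to cancel drift.
        return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch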
Figure 10
Operation mode during each time slot. (a) Step 1: automatic packet synchronization; (b) Step 2: identification check on valid packets by the sink node.
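As a rough illustration of step 2, the sink node can accept a packet only when it is well formed, comes from a node registered to this soldier's BAN, and arrives in that node's assigned time slot. The slot schedule and packet fields below are hypothetical.

    # Sketch of the sink node's per-slot identification check (step 2 above).
    REGISTERED_NODES = {1, 2, 3, 4, 5, 6, 7, 8, 9}  # node IDs bound to this BAN

    def slot_owner(slot_index, num_nodes=9):
        """Illustrative TDMA schedule: slot k belongs to node (k mod num_nodes) + 1."""
        return (slot_index % num_nodes) + 1

    def accept_packet(packet, slot_index):
        if packet.node_id not in REGISTERED_NODES:
            return False                    # foreign or spoofed node
        if packet.node_id != slot_owner(slot_index):
            return False                    # transmitted outside its assigned slot
        return packet.checksum_ok()         # integrity check, assumed available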
Figure 11
Sensor nodes communicate with the sink node in two domains.
Figure 12
A fully equipped soldier.
Figure 13
System initialization flow diagram.
Figure 14
The kinematic chain of a human body.
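In a kinematic chain, each body segment's global orientation is the composition of its parent's global orientation with the segment's local rotation, accumulated from the root (e.g., the pelvis) outward. A minimal quaternion sketch of that recursion (Hamilton convention; the dictionaries are illustrative):

    def quat_mul(a, b):
        """Hamilton product of quaternions a = (w, x, y, z) and b."""
        aw, ax, ay, az = a
        bw, bx, by, bz = b
        return (aw*bw - ax*bx - ay*by - az*bz,
                aw*bx + ax*bw + ay*bz - az*by,
                aw*by - ax*bz + ay*bw + az*bx,
                aw*bz + ax*by - ay*bx + az*bw)

    def global_orientation(joint, local_rot, parent_of, cache):
        """Compose rotations from the root of the chain down to `joint`."""
        if joint not in cache:
            parent = parent_of.get(joint)
            if parent is None:              # root segment
                cache[joint] = local_rot[joint]
            else:
                cache[joint] = quat_mul(
                    global_orientation(parent, local_rot, parent_of, cache),
                    local_rot[joint])
        return cache[joint]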
Figure 15
Virtual environment on an HMD. (a) An indoor view. (b) An outdoor view.
Figure 16
Sensing measurement errors of the sensor nodes during T-pose calibration. (a) All sensor nodes were calibrated correctly during the T-pose procedure. (b) One sensor node, attached to the right thigh, was not calibrated correctly, producing a sensing error in the pitch direction.
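A T-pose calibration of this kind is typically implemented by capturing, for each node while the pose is held, a constant offset between the measured orientation and the known T-pose reference, and removing that offset from all later readings. A sketch under that assumption, reusing quat_mul from the kinematic-chain sketch above (for unit quaternions the conjugate equals the inverse):

    def quat_conj(q):
        w, x, y, z = q
        return (w, -x, -y, -z)

    def tpose_offset(q_measured, q_reference):
        """Offset such that quat_mul(offset, q_measured) == q_reference."""
        return quat_mul(q_reference, quat_conj(q_measured))

    def calibrated(q_offset, q_raw):
        """Apply the stored calibration offset to a live reading."""
        return quat_mul(q_offset, q_raw)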
Figure 17
The scalar and vector parts of the quaternion error for a small rotation caused by a pitch-angle measurement error.
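The scalar/vector split that the figure plots follows from the standard small-angle expansion of a unit quaternion. Writing the error quaternion between the measured and reference attitudes for a small rotation by angle \delta\theta about unit axis \mathbf{n}:

    \delta q = q_{\mathrm{meas}} \otimes q_{\mathrm{ref}}^{-1}
             = \begin{bmatrix} \cos(\delta\theta/2) \\ \mathbf{n}\,\sin(\delta\theta/2) \end{bmatrix}
             \approx \begin{bmatrix} 1 \\ \tfrac{1}{2}\,\mathbf{n}\,\delta\theta \end{bmatrix}

so the scalar part stays near one while the vector part grows linearly with the small pitch-angle error.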
Figure 18
Snapshot of the system.
Figure 19
Mean times of experienced and inexperienced participants under various experimental conditions (horizontal axis: single-man, two-man, and three-man teams; vertical axis unit: seconds).
Figure 20
Mean times of three-man teams with and without voice communication under the experimental conditions (horizontal axis: experienced/inexperienced participants; vertical axis unit: seconds).
Figure 21
Mean times with different numbers of participants under the experimental conditions (horizontal axis: number of participants; vertical axis unit: seconds).


Source: PubMed
