Earthquake Safety Training through Virtual Drills

Changyang Li 1,2

Wei Liang 1

Chris Quigley 2

Yibiao Zhao 3

Lap-Fai Yu 2

1 Beijing Institute of Technology

2 University of Massachusetts Boston

3 Massachusetts Institute of Technology



Abstract

The recent popularity of consumer-grade virtual reality devices, such as the Oculus Rift and the HTC Vive, has enabled household users to experience highly immersive virtual environments. We take advantage of the commercial availability of these devices to provide a novel, immersive virtual reality training approach designed to teach individuals how to survive earthquakes in common indoor environments. Our approach uses virtual environments realistically populated with furniture objects for training. During a training session, a virtual earthquake is simulated. The user navigates in, and interacts with, the virtual environment to avoid getting hurt, while learning the observation and self-protection skills needed to survive an earthquake. We demonstrate our approach on common scene types such as offices, living rooms, and dining rooms. To test its effectiveness, we conducted an evaluation in which users trained in several rooms of a given scene type and were then tested in a new room of the same type. The results show that our virtual reality training approach is effective: participants trained by our approach performed better, on average, than those trained by alternative approaches in their ability to avoid physical harm and to detect potentially dangerous objects.

Index Terms: Virtual reality, modeling and simulation, virtual worlds training simulations

Acknowledgements:

The authors would like to thank Ana Aravena for her help with narration. We thank Mengyao Jia for serving as the model for the illustrations in the main paper and video. We also thank Haikun Huang and Yumeng Wang for helping with our experiments. This research is supported by the National Science Foundation under award number 1565978, by the University of Massachusetts Boston StartUp Grant P20150000029280, and by the Joseph P. Healey Research Grant Program provided by the Office of the Vice Provost for Research and Strategic Initiatives & Dean of Graduate Studies of the University of Massachusetts Boston. We also acknowledge NVIDIA Corporation for the donation of a graphics card.


An office scene used as an illustrative example of our approach



Scenes in our work