Chris Harrison

EyeSpyVR: Interactive Eye Sensing Using Off-the-Shelf, Smartphone-Based VR Headsets

Low-cost virtual reality (VR) headsets powered by smartphones are becoming ubiquitous. Their unique position on the user's face opens interesting opportunities for interactive sensing. In this paper, we describe EyeSpyVR, a software-only eye sensing approach for smartphone-based VR, which uses a phone's front-facing camera as a sensor and its display as a passive illuminator. Our proof-of-concept system, using a commodity Apple iPhone, enables four sensing modalities: detecting when the VR headset is worn, detecting blinks, recognizing the wearer's identity, and coarse gaze tracking - features typically found in high-end or specialty VR headsets. We demonstrate the utility and accuracy of EyeSpyVR in a series of studies with 70 participants, finding a worn detection accuracy of 100%, a blink detection rate of 95.3%, a family user identification accuracy of 81.4%, and a mean gaze tracking error of 10.8° when calibrated to the wearer (12.9° without calibration). These sensing abilities can be used by developers to enable new interactive features and more immersive VR experiences on existing, off-the-shelf hardware.
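To make the sensing setup concrete, below is a minimal Swift sketch of how a developer might stream frames from an iPhone's front-facing camera while the VR app renders normally (the screen itself acting as the passive illuminator described above). The EyeSpyVR paper's actual processing pipeline (worn detection, blink detection, identification, gaze estimation) is not shown; the `EyeCameraSource` class and `onFrame` hook are hypothetical names for illustration, not part of any released EyeSpyVR code.

```swift
import AVFoundation

// Minimal sketch: capture frames from the front-facing camera so a downstream
// eye-analysis step (not shown) could classify worn state, blinks, identity,
// or coarse gaze. Names here are illustrative, not from the EyeSpyVR release.
final class EyeCameraSource: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "eyespyvr.frames")

    // Hypothetical hook: hand each frame to an eye-region classifier.
    var onFrame: ((CVPixelBuffer) -> Void)?

    func start() throws {
        // The front camera faces the wearer's eye region inside the headset.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .front) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)

        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()
        session.startRunning()
    }

    // Delegate callback: forward each captured frame for analysis.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        onFrame?(pixelBuffer)
    }
}
```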

Download

Reference

Ahuja, K., Islam, R., Parashar, V., Dey, K., Harrison, C. and Goel, M. 2018. EyeSpyVR: Interactive Eye Sensing Using Off-the-Shelf, Smartphone-Based VR Headsets. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (UbiComp ’18), 2, 2, Article 57 (July 2018). ACM, New York, NY. 10 pages.

© Chris Harrison