Retargeted Self-Haptics for Increased Immersion in VR without Instrumentation

The developer and user base of virtual reality (VR) has grown tremendously in recent years due to successful consumer-oriented devices like the Oculus Quest 2 and HTC VIVE Cosmos. These systems offer immersive graphics and audio, but physical touch feedback continues to be limited. Even the highest-end consumer systems feature only dual handheld controllers with integrated vibrotactile actuators. Of course, a buzzing sensation applied to one’s palm falls short of any realistic interaction with a physical object or surface. Research systems utilizing, e.g., special room infrastructure and body exoskeletons are expensive, heavy, and generally limit mobility and consumer viability. More lightweight and mobile-friendly VR experiences prefer to avoid encumbering the user's hands and operate entirely in the air, offering no means for haptic feedback (e.g., Waltz of the Wizard on the Oculus Quest). In short, although we can build highly-detailed, near-photo-realistic digital worlds, we cannot yet reach out and feel them in ways practical for consumer adoption. For this reason, innovative haptic feedback approaches in VR are an active area of research across several communities.

In this paper, we describe a new take on haptic delivery in VR – using one’s own body for physical feedback. This “Self-Haptics” approach is highly practical, as it requires no additional hardware to achieve a haptic effect. Instead, through graphical (i.e., software) manipulation of a user’s virtual hand position and pose, we can guide the actual hands together in a realistic way during bimanual tasks – a technique known as retargeting or redirection in the literature. At this intersection point, with careful design, one hand can provide a physical shape or surface that corresponds to an interactive element in VR, allowing the other hand to physically feel and interact with a virtual element. We note that perfect realism is not our goal – our haptic feedback is inherently coarse (i.e., lacking appropriate texture, compliance, and contour). However, we found the visuo-audio-haptic fusion makes the interaction surprisingly fun and immersive, as deemed by our study participants. We also note that our effect is not illusory, as users can readily sense their own body is being co-opted for haptic effect. In some use cases, we can even provide appropriate haptic feedback to both hands – a haptic category that was found to be most successful in our user study.
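To illustrate the general idea of retargeting, the sketch below warps a rendered hand position so that the virtual hand arrives at a virtual target exactly when the real hand arrives at a physical target (here, a spot on the other hand). This is a minimal, hypothetical implementation of the common linear-blend redirection scheme, not the paper's actual algorithm; the function and parameter names (`retarget`, `radius`) are our own.

```python
import numpy as np

def retarget(real_pos, phys_target, virt_target, radius=0.3):
    """Warp the rendered hand position toward virt_target as the real
    hand approaches phys_target (all positions are 3D, in meters).

    Far from the target (distance >= radius), the virtual hand tracks
    the real hand exactly; at contact, the full offset is applied, so
    virtual and physical contact coincide. A simple linear blend of the
    offset is one common redirection scheme (assumption, for illustration).
    """
    real_pos = np.asarray(real_pos, dtype=float)
    d = np.linalg.norm(real_pos - phys_target)
    # Blend factor: 0 outside the warp radius, 1 at contact.
    alpha = np.clip(1.0 - d / radius, 0.0, 1.0)
    offset = np.asarray(virt_target, dtype=float) - phys_target
    return real_pos + alpha * offset
```

Because the offset is introduced gradually over the approach, the per-frame visual discrepancy stays below the user's detection threshold, which is what makes the redirection feel natural.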

After reviewing key related work, we describe a design space we used to categorize our explorations. In total, we built twelve interactive demos that exemplify three haptic categories. To investigate whether these paradigms were successful, we selected two demos per haptic category to show to participants and gather feedback. At a high level, our study shows immediate feasibility; the haptic feedback works and serves to boost realism and immersion compared to experiences with no haptics. That said, the interaction design tends to be highly bespoke (unlike vibration feedback, which can be easily applied to, e.g., any collision), and thus will not be applicable in all VR contexts and interactions. Nonetheless, we believe our approach is an intriguing new way to deliver haptics to users in a practical manner, given that it requires no new or extra hardware. Indeed, any VR headset with uninstrumented hand tracking (such as the Oculus Quest 2) could support games and other experiences incorporating self-haptic interactions. We hope this paper serves to stimulate new ideas, and we have no doubt that creative VR developers could build unique and compelling experiences around this concept.



Fang, C. and Harrison, C. 2021. Retargeted Self-Haptics for Increased Immersion in VR without Hand Instrumentation. In Proceedings of the 34th Annual ACM Symposium on User Interface Software and Technology (October 10 - 13, 2021). UIST '21. ACM, New York, NY. 1109–1121.

© Chris Harrison