Chris Harrison

SurfaceSight: A New Spin on Touch, User, and Object Sensing for IoT Experiences

IoT appliances are gaining consumer traction, from smart thermostats to smart speakers. These devices generally have limited user interfaces, most often small buttons and touchscreens, or rely on voice control. Further, these devices know little about their surroundings, unaware of the objects, people, and activities around them. Consequently, interactions with these “smart” devices can be cumbersome and limited.

In this work, we describe SurfaceSight, an approach that augments IoT experiences with rich touch and object sensing, offering a complementary input channel and increased contextual awareness. For sensing, we incorporate LIDAR into the base of IoT devices, providing an expansive, ad hoc plane of sensing just above the surface on which these devices rest. We can recognize and track a wide array of objects, as well as finger input and hand gestures. We can also track people and estimate which way they are facing. We evaluate the accuracy of these new capabilities and illustrate how they can be used to power novel and contextually aware interactive experiences.
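
To give a rough sense of how a spinning 2D LIDAR mounted at a device's base could be turned into touch and object blobs on the surrounding surface, the sketch below converts polar range readings to points and clusters nearby points into blobs. This is only an illustrative sketch under assumed conventions, not the paper's implementation: the scan format, thresholds, and helper names are all hypothetical.

```python
import math

# Hypothetical scan format: a list of (angle_deg, distance_m) returns from a
# 2D LIDAR spinning in a plane just above the surface. All names, thresholds,
# and the scan format are illustrative assumptions, not the paper's code.
MAX_RANGE_M = 4.0       # returns at or beyond this range are treated as "no hit"
CLUSTER_GAP_M = 0.03    # points closer than this are grouped into one blob

def to_cartesian(scan):
    """Convert polar LIDAR returns to (x, y) points, dropping out-of-range hits."""
    points = []
    for angle_deg, dist_m in scan:
        if 0 < dist_m < MAX_RANGE_M:
            a = math.radians(angle_deg)
            points.append((dist_m * math.cos(a), dist_m * math.sin(a)))
    return points

def cluster_points(points, gap=CLUSTER_GAP_M):
    """Greedy clustering: attach each point to a nearby existing blob, else start a new one."""
    blobs = []
    for p in points:
        for blob in blobs:
            if any(math.dist(p, q) < gap for q in blob):
                blob.append(p)
                break
        else:
            blobs.append([p])
    return blobs

def summarize(blob):
    """Return a blob's centroid and rough diameter (a cue for finger vs. larger object)."""
    xs, ys = zip(*blob)
    cx, cy = sum(xs) / len(xs), sum(ys) / len(ys)
    diameter = 2 * max(math.dist((cx, cy), p) for p in blob)
    return (cx, cy), diameter

if __name__ == "__main__":
    # Fake scan: a tight cluster of returns near 30 degrees at ~0.5 m (e.g., a fingertip),
    # plus a sweep of max-range readings that get discarded.
    scan = [(29.0, 0.51), (29.5, 0.50), (30.0, 0.50), (30.5, 0.51)]
    scan += [(a, MAX_RANGE_M) for a in range(0, 360, 5)]
    for blob in cluster_points(to_cartesian(scan)):
        (cx, cy), diameter = summarize(blob)
        print(f"blob at ({cx:.2f}, {cy:.2f}) m, ~{diameter * 100:.1f} cm across")
```

In a full system, blob size, shape, and motion over successive scans would feed recognizers for touches, objects, gestures, and people, but those stages are beyond this sketch.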

Reference

Laput, G. and Harrison, C. 2019. SurfaceSight: A New Spin on Touch, User, and Object Sensing for IoT Experiences. In Proceedings of the 37th Annual SIGCHI Conference on Human Factors in Computing Systems (Glasgow, UK, May 4 - 9, 2019). CHI '19. ACM, New York, NY. Paper 329, 12 pages.

© Chris Harrison