Supporting Responsive Cohabitation Between Virtual Interfaces and Physical Objects on Everyday Surfaces

Researchers and practitioners have long articulated the vision and promise of digitally augmented desks. Seminal work emerged in the early 1990s, most notably Xerox PARC's DigitalDesk. Since then, dozens of systems have been proposed and built, demonstrating superposition of content onto physical artifacts, the use of physical objects for tangible interaction, in situ remote collaboration, and more generally, interactive applications on desk surfaces. However, a notable commonality of these futuristic systems is the minimalist nature of the surfaces used – often lacking keyboards, mice, mugs, papers, knickknacks, and other contemporary and commonplace items.

Of course, today's desk surfaces play host to a wide variety of items of varying shape and size. Moreover, these objects rarely conform to a grid or even a common orientation. Desks are also constantly in flux, with items moving, stacking, appearing, and disappearing. Example events include sliding a laptop out of the way to make room for new work, or resting a fresh cup of coffee on the work surface. If digital desks do not account for these basic physical actions, applications can become brittle (e.g., does a mug placed on top of a virtual keyboard inject spurious touch input?) or inaccessible (e.g., if a book is placed over an interface, how does one access it?). Further, because physical objects cannot move or resize on their own, the burden of responsiveness falls to the digital elements. Thus, digital applications must employ a variety of strategies to successfully cohabit a work surface with physical artifacts.

To help close this gap, we conducted an elicitation study and derived a list of ten fundamental interactive behaviors that responsive desk-bound virtual applications should exhibit. To demonstrate that these behaviors can be achieved practically and in real time, we built a proof-of-concept system with the technical advances necessary to support each behavior. This system had to move beyond prior work in several key ways; for example, our system requires no calibration to the world, allowing the desk scene to be in flux (i.e., there is no notion of a "background"). Further, our touch tracking approach distinguishes human "objects" (arms, hands, fingers) from other objects. This ability is critical for responsive interfaces, which must respond to user movement and input differently from changes to the physical environment (e.g., interfaces should evade your coffee mug, but not your hands).

Additional media can be found on Robert Xiao's site.

This research was generously supported with funding from Intel and The David and Lucile Packard Foundation.



Xiao, R., Hudson, S.E. and Harrison, C. 2017. Supporting Responsive Cohabitation Between Virtual Interfaces and Physical Objects on Everyday Surfaces. In Proceedings of the 9th ACM SIGCHI Symposium on Engineering Interactive Computing Systems (Lisbon, Portugal, June 26 – 29, 2017). EICS ’17. ACM, New York, NY. Article 11.

© Chris Harrison