Wall++: Room-Scale Interactive and Context-Aware Sensing

Human environments are typified by walls: homes, offices, schools, museums, hospitals, and nearly every other indoor context has them. In many cases, walls make up a majority of readily accessible indoor surface area, and yet they are static; their primary function is simply to be a wall, separating spaces and hiding infrastructure. We present Wall++, a low-cost sensing approach that turns ordinary walls into smart infrastructure. Instead of merely separating spaces, walls can now enhance rooms with sensing and interactivity. Our wall treatment and sensing hardware can track users' touch and gestures, as well as estimate body pose when users are nearby. By capturing airborne electromagnetic noise, Wall++ can also detect which appliances are active and where they are located. Through a series of evaluations, we demonstrate that Wall++ can enable robust room-scale interactive and context-aware applications.

Additional media can be found on Yang Zhang's site.

This research was conducted at Disney Research.



Zhang, Y., Yang, C., Hudson, S., Harrison, C. and Sample, A. 2018. Wall++: Room-Scale Interactive and Context-Aware Sensing. In Proceedings of the 36th Annual SIGCHI Conference on Human Factors in Computing Systems (Montreal, Canada, April 21-26, 2018). CHI '18. ACM, New York, NY. Paper 273, 15 pages.

© Chris Harrison