LumiWatch: On-Arm Projected Graphics and Touch Input

Compact, worn computers with projected, on-skin touch interfaces have been a long-standing yet elusive goal, largely written off as science fiction. Such devices offer the potential to mitigate the significant human input/output bottleneck inherent in worn devices with small screens. In this work, we present the first fully functional and self-contained projection smartwatch implementation, containing the requisite compute, power, projection and touch-sensing capabilities. Our watch offers roughly 40 square centimeters of interactive surface area – more than five times that of a typical smartwatch display. We demonstrate continuous 2D finger tracking with interactive, rectified graphics, transforming the arm into a touchscreen. We discuss our hardware and software implementation, as well as evaluation results regarding touch accuracy and projection visibility.
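The paper's own rectification pipeline is not detailed in this abstract. As a generic illustration only, projecting rectified graphics onto a roughly planar patch of skin can be modeled with a planar homography estimated from four calibration correspondences; all coordinates below are hypothetical, and this sketch is not the authors' implementation:

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 homography mapping src -> dst from four
    point correspondences using the direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A (last row of V^T from the SVD).
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def warp(H, pt):
    """Apply homography H to a 2D point (homogeneous divide)."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical calibration: corners of the projector's image (pixels)
# and where they land on the arm (millimetres along/across the arm,
# skewed by the oblique projection angle and arm geometry).
proj_corners = [(0, 0), (640, 0), (640, 360), (0, 360)]
arm_corners = [(0, 0), (80, 5), (78, 35), (-2, 30)]

H = homography(proj_corners, arm_corners)
```

With `H` in hand, graphics can be pre-warped by its inverse so they appear undistorted on the arm, and sensed touch positions can be mapped into the same arm-aligned coordinate frame.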

This research was done in collaboration with ASU Tech.

Additional media can be found on Robert Xiao's site.


Reference

Xiao, R., Cao, T., Guo, N., Zhuo, J., Zhang, Y. and Harrison, C. 2018. LumiWatch: On-Arm Projected Graphics and Touch Input. In Proceedings of the 36th Annual SIGCHI Conference on Human Factors in Computing Systems (Montreal, Canada, April 21-26, 2018). CHI '18. ACM, New York, NY. Paper 95, 11 pages.

© Chris Harrison