Chris Harrison

TapSense: Enhancing Finger Interaction on Touch Surfaces

At present, finger input on touch screens is handled very simplistically, essentially reduced to an X/Y coordinate. However, human fingers are remarkably sophisticated, both in their anatomy and motor capabilities. TapSense is an enhancement to touch interaction that allows conventional screens to identify how the finger is being used for input. This is achieved by segmenting and classifying the sounds resulting from a finger’s impact. Our system can recognize different finger locations – including the tip, pad, nail and knuckle – without the user having to wear any electronics. This opens several new and powerful interaction opportunities for touch input, especially on mobile devices, where input bandwidth is limited by small screens and fat fingers. For example, a knuckle tap could serve as a “right click” for mobile touch interaction, effectively doubling input bandwidth. Our system can also be used to identify different sets of passive tools. We conclude with a comprehensive investigation of classification accuracy and training implications. Results show our proof-of-concept system can support sets of four input types at around 95% accuracy. Small but useful input sets of two (e.g., pen and finger discrimination) can operate in excess of 99% accuracy.
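The segment-and-classify pipeline described above can be sketched in miniature. This is an illustrative toy, not the paper's implementation: it synthesizes impact-like waveforms, segments them with a simple amplitude threshold, extracts crude band-energy spectral features via a direct DFT, and labels them with a nearest-centroid classifier. All names, frequencies, and parameters here are assumptions chosen for the sketch.

```python
# Toy sketch of a TapSense-style pipeline: segment an impact sound,
# extract spectral features, classify the finger part producing it.
# Frequencies, band edges, and class labels are illustrative only.
import math
import random

SAMPLE_RATE = 4000  # Hz (illustrative)

def synth_impact(freq, n=256, decay=10.0, noise=0.02, rng=None):
    """Generate a toy impact: an exponentially decaying sinusoid plus noise."""
    rng = rng or random.Random(0)
    return [math.exp(-decay * i / n) * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
            + rng.uniform(-noise, noise) for i in range(n)]

def segment(signal, threshold=0.1, length=256):
    """Return a window starting at the first sample exceeding the threshold."""
    for i, s in enumerate(signal):
        if abs(s) > threshold:
            return signal[i:i + length]
    return None

def band_energies(window, bands=((0, 300), (300, 700), (700, 1500))):
    """Crude spectral features: normalized energy per band via a direct DFT."""
    n = len(window)
    feats = []
    for lo, hi in bands:
        energy = 0.0
        for k in range(n // 2):
            f = k * SAMPLE_RATE / n
            if lo <= f < hi:
                re = sum(window[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
                im = sum(window[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
                energy += re * re + im * im
        feats.append(energy)
    total = sum(feats) or 1.0
    return [f / total for f in feats]

def nearest_centroid(feats, centroids):
    """Label a feature vector by distance to per-class mean feature vectors."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(feats, centroids[label]))

if __name__ == "__main__":
    rng = random.Random(42)
    # Hypothetical training data: soft "pad" taps ring low, hard "nail"
    # taps ring high. Real impact spectra are measured, not assumed.
    classes = {"pad": 200, "nail": 1000}
    centroids = {}
    for label, freq in classes.items():
        samples = [band_energies(segment(synth_impact(freq + rng.uniform(-20, 20), rng=rng)))
                   for _ in range(5)]
        centroids[label] = [sum(col) / len(samples) for col in zip(*samples)]
    test = band_energies(segment(synth_impact(980, rng=rng)))
    print(nearest_centroid(test, centroids))  # prints "nail"
```

The design point the sketch illustrates is that the classifier never needs instrumentation on the finger: the impact's spectral signature alone distinguishes soft, fleshy contacts from hard ones, which is what lets passive tools be identified the same way.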

This technology has been commercialized by CMU spinoff Qeexo under the name FingerSense. Robert Xiao made innumerable contributions to the machine learning engine that powers the product.

Reference

Harrison, C., Schwarz, J. and Hudson, S. E. 2011. TapSense: Enhancing Finger Interaction on Touch Surfaces. In Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology. UIST '11. ACM, New York, NY. 627-636.
