Chris Harrison

Classroom Digital Twins with Instrumentation-Free Gaze Tracking

Classroom sensing is an important and active area of research with great potential to improve instruction. Complementing professional observers (the current best practice), automated pedagogical professional development systems can attend every class and capture fine-grained details of all occupants. One particularly valuable facet to capture is class gaze behavior. For students, certain gaze patterns have been shown to correlate with interest in the material, while for instructors, student-centered gaze patterns have been shown to increase approachability and immediacy. Unfortunately, prior classroom gaze-sensing systems have limited accuracy and often require specialized external or worn sensors.

In this work, we put forward the idea of a classroom "digital twin" (a concept borrowed from Internet of Things (IoT) research), which we believe can serve as an important contextual container for classroom sensor data, on top of which future end-user applications can be built. More formally, a digital twin is a "dynamic virtual representation of a physical system, using real-time data to enable understanding, learning and reasoning". The concept is akin to a simulation, but employs authentic sensed data from actual physical environments. Importantly, it allows this measured data to be better contextualized in a rich, three-dimensional scene that can be viewed and manipulated in space and time. A classroom is a great exemplar of such a complex physical environment, containing objects of various functions (whiteboards, projection screens, podiums, seats, tables) and occupants in at least two different roles. There are strong contextual and spatial relationships between these physical elements that can be (re)played and analyzed in a digital twin in a way that rows in a database or lines on a chart cannot so easily provide. As a specific proof-of-concept data source for investigation, we digitize classroom gaze: a feature made richer by being contextualized in a dynamic 3D scene. Apart from providing a rich data source for modeling, gaze also provides psychological signals of great importance for both studying and improving classroom teaching.
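To make the "contextual container" idea concrete, the sketch below shows one minimal way such a twin could be structured: timestamped gaze samples logged alongside a 3D scene of classroom objects, so a gaze ray can be resolved to what it was pointed at. All names here (`ClassroomTwin`, `GazeSample`, the cone-angle heuristic) are illustrative assumptions, not the paper's actual implementation.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch only: class and method names are illustrative,
# not the system described in the paper.

@dataclass
class SceneObject:
    name: str
    position: tuple  # (x, y, z) in meters, room coordinates

@dataclass
class GazeSample:
    t: float          # timestamp in seconds
    occupant: str     # e.g. "student_07" or "instructor"
    origin: tuple     # 3D head position
    direction: tuple  # gaze vector (normalized internally)

def _normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

class ClassroomTwin:
    """Container that contextualizes timestamped gaze in a 3D scene."""

    def __init__(self):
        self.objects = []   # whiteboards, screens, seats, ...
        self.samples = []   # time-ordered gaze log

    def add_object(self, name, position):
        self.objects.append(SceneObject(name, position))

    def log_gaze(self, sample):
        self.samples.append(sample)

    def gaze_target(self, sample, max_angle_deg=10.0):
        """Best-aligned scene object within an angular cone of the gaze ray."""
        best, best_cos = None, math.cos(math.radians(max_angle_deg))
        d = _normalize(sample.direction)
        for obj in self.objects:
            to_obj = _normalize(tuple(p - o for p, o in
                                      zip(obj.position, sample.origin)))
            cos_angle = sum(a * b for a, b in zip(d, to_obj))
            if cos_angle >= best_cos:
                best, best_cos = obj.name, cos_angle
        return best
```

Because each sample carries a time and an occupant alongside the scene geometry, queries like "who was looking at the whiteboard during minute 12" fall out naturally, which is exactly the kind of spatially contextualized question a flat table of gaze angles makes awkward.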


Reference

Karan Ahuja, Deval Shah, Sujeath Pareddy, Franceska Xhakaj, Amy Ogan, Yuvraj Agarwal, and Chris Harrison. 2021. Classroom Digital Twins with Instrumentation-Free Gaze Tracking. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI '21). Association for Computing Machinery, New York, NY, USA, Article 484, 1–9. DOI: https://doi.org/10.1145/3411764.3445711

© Chris Harrison