Abstract

Interactive learning environments with body-centric technologies lie at the intersection of the design of embodied learning activities and multimodal learning analytics. Sensing technologies can automatically capture large amounts of fine-grained data from student movements. Researchers can use these fine-grained data to create a high-resolution picture of the activity that takes place during these student–computer interactions and explore whether the sequence of movements has an effect on learning. We present a use case of modelling temporal data in an interactive learning environment with hand gestures, and discuss some validity threats that arise if temporal dependencies are not accounted for. In particular, we assess how ignoring the temporal dependencies in the measurement of hand gestures affects both the goodness of fit of the statistical model and the measurement of the similarity between elicited and enacted movements. Our findings show that accounting for temporality is crucial for finding a meaningful fit to the data. In using temporal analytics, we are able to create a high-resolution picture of how sensorimotor coordination correlates with learning gains in our learning system.
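As a minimal sketch of the underlying point (not the authors' pipeline), the snippet below simulates a per-frame gesture-similarity series with an assumed AR(1) structure and compares the goodness of fit of a model that ignores temporal dependence with one that includes a lagged term; all variable names, the AR(1) setup, and the coefficients are illustrative assumptions.

```python
# Hypothetical illustration: why ignoring temporal dependence in a
# frame-by-frame gesture-similarity series can distort goodness of fit.
import numpy as np

rng = np.random.default_rng(0)

# Simulate a similarity score between elicited and enacted gestures,
# where frames close in time are correlated (assumed AR(1) process).
n_frames = 500
phi = 0.8                                   # assumed autoregressive coefficient
elicited_cue = rng.normal(size=n_frames)    # stand-in for the elicited movement signal
noise = rng.normal(scale=0.5, size=n_frames)
similarity = np.zeros(n_frames)
for t in range(1, n_frames):
    similarity[t] = phi * similarity[t - 1] + 0.3 * elicited_cue[t] + noise[t]

def ols_r2(X, y):
    """Ordinary least squares via the normal equations; returns R^2."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

# Model 1: ignores temporal dependence (elicited cue only).
r2_static = ols_r2(elicited_cue[1:], similarity[1:])

# Model 2: adds the lagged similarity score as a predictor.
X_temporal = np.column_stack([elicited_cue[1:], similarity[:-1]])
r2_temporal = ols_r2(X_temporal, similarity[1:])

print(f"R^2 without temporal term: {r2_static:.3f}")
print(f"R^2 with lagged (temporal) term: {r2_temporal:.3f}")
```

Running this toy comparison shows a markedly better fit once the lagged term is included, which is the kind of gap the abstract refers to when it argues that temporality must be accounted for.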
