Student Performance Prediction with Eye-Gaze Data in Embodied Educational Context

Date

2022-07-07

Publisher

Kluwer Academic Publishers

Series Info

Education and Information Technologies;

Abstract

Recent advances in sensor technology, including eye-gaze tracking, have introduced the opportunity to incorporate gaze into student modelling within an embodied learning context. The resulting multimodal data are used to uncover cognitive, behavioural, and affective processes during the embodied learning activity. However, the use of eye-tracking data capturing visual attention to understand students' behaviours and learning performance during engagement with a tangible learning activity remains largely unexplored. Therefore, this paper explores the integration of eye-gaze features to predict students' learning performance during an embodied activity. We present an in-situ study in which 110 primary school students (aged 8–9 years) solved a tangible learning activity on human body anatomy. During the experiment, students' learning experience was monitored by collecting their eye-tracking data, learning profiles, academic performance, and time to complete the activity. We applied predictive modelling to identify the synergies between eye-gaze features and students' learning performance. The results suggest that combining eye-gaze tracking with learning traces and behaviour attributes may support an accurate prediction of students' learning performance. This research sheds light on the opportunities offered at the intersection of eye-gaze tracking and learning traces, and its possible contribution to investigating students' behaviour within an embodied learning context.
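
As a rough illustration of the kind of pipeline the abstract describes (predicting learning performance from gaze features combined with learning traces and behaviour attributes), the sketch below trains a classifier on synthetic per-student features. The feature names, the binary performance label, and the random-forest model are assumptions for illustration only, not the authors' actual feature set or method.

# Minimal sketch, assuming scikit-learn and synthetic stand-in data.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 110  # cohort size reported in the abstract

# Hypothetical per-student features: gaze metrics plus learning traces.
data = pd.DataFrame({
    "fixation_count": rng.integers(50, 400, n),    # assumed gaze feature
    "mean_fixation_ms": rng.normal(250, 60, n),    # assumed gaze feature
    "saccade_rate": rng.normal(3.0, 0.8, n),       # assumed gaze feature
    "completion_time_s": rng.normal(300, 90, n),   # time to complete the activity
    "prior_grade": rng.integers(1, 6, n),          # assumed academic-record attribute
})
# Synthetic binary target: high vs. low post-activity performance.
labels = rng.integers(0, 2, n)

model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, data, labels, cv=5, scoring="accuracy")
print(f"5-fold CV accuracy: {scores.mean():.2f} ± {scores.std():.2f}")

With real study data, the synthetic features and labels would be replaced by the collected eye-tracking measures, learning profiles, and measured performance outcomes.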

Keywords

Embodied learning, Tangible user interfaces, Multimodal data, Eye tracking
