Understanding how adolescents with autism respond to facial expressions in virtual reality environments

Esubalew Bekele, Zhi Zheng, Amy Swanson, Julie Crittendon, Zachary Warren, Nilanjan Sarkar

Abstract

Autism Spectrum Disorders (ASD) are characterized by atypical patterns of behavior and impairments in social communication. Among the fundamental social impairments in the ASD population are challenges in appropriately recognizing and responding to facial expressions. Traditional intervention approaches often require intensive support from well-trained therapists to address core deficits; many individuals with ASD have tremendous difficulty accessing such care due to a lack of available trained therapists as well as intervention costs. As a result, emerging technologies such as virtual reality (VR) have the potential to offer useful technology-enabled intervention systems. In this paper, an innovative VR-based facial emotional expression presentation system was developed that allows monitoring of eye gaze and physiological signals related to emotion identification, in order to explore new, efficient therapeutic paradigms. A usability study of this new system was performed with ten adolescents with ASD and ten typically developing adolescents as a control group. The eye tracking and physiological data were analyzed to determine intragroup and intergroup variations in gaze and physiological patterns. Performance data, eye tracking indices, and physiological features indicated that adolescents with ASD process and recognize emotional faces differently from their typically developing peers. These results will be used in the future for an online adaptive VR-based multimodal social interaction system to improve the emotion recognition abilities of individuals with ASD.
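The gaze analysis described above compares how long each group looks at particular facial regions of interest (ROIs). A minimal sketch of one such index — the fraction of gaze samples falling inside each ROI — is shown below; the ROI names, rectangle coordinates, and function name are illustrative assumptions, not the paper's actual definitions.

```python
# Hypothetical sketch: fraction of gaze samples inside rectangular facial
# ROIs. ROI names and pixel coordinates are illustrative only.
ROIS = {
    "forehead": (100, 40, 220, 90),   # (x1, y1, x2, y2) in screen pixels
    "mouth":    (130, 200, 190, 240),
}

def roi_dwell_fractions(gaze_points, rois=ROIS):
    """Return, for each ROI, the fraction of gaze samples that fall inside it."""
    counts = {name: 0 for name in rois}
    for x, y in gaze_points:
        for name, (x1, y1, x2, y2) in rois.items():
            if x1 <= x <= x2 and y1 <= y <= y2:
                counts[name] += 1
    n = max(len(gaze_points), 1)  # avoid division by zero on empty input
    return {name: c / n for name, c in counts.items()}
```

Group-level indices of this kind (e.g., mouth vs. forehead dwell fractions) can then be averaged per participant and compared across the ASD and typically developing groups.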

Figures

Fig. 1
VR-based facial expression presentation system.
Fig. 2
The eye tracking application and its components.
Fig. 3
Representative characters used in the study.
Fig. 4
Anger (top) and surprise (bottom) with two arousal levels.
Fig. 5
The five facial ROIs defined on the face region.
Fig. 6
Gaze towards mouth and forehead regions.
Fig. 7
Gaze towards face and non-face regions.
Fig. 8
Gaze towards mouth and forehead regions.
Fig. 9
Gaze towards face and non-face regions.
Fig. 10
Gaze towards mouth and forehead regions.
Fig. 11
Gaze towards face and non-face regions.
Fig. 12
Gaze towards mouth and forehead regions.
Fig. 13
Gaze towards face and non-face regions.
Fig. 14
Comparisons of behavioral eye indices.
Fig. 15
Comparisons of physiological indices.
Fig. 16
(a) Top left: original ground-truth clusters. (b) Top right: the Gaussian mixtures used in the GM clustering overlaid on the ground-truth clusters. (c) Bottom left: the result of k-means clustering. (d) Bottom right: the result of GM clustering.
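Fig. 16 contrasts k-means with Gaussian mixture (GM) clustering on the same data. The practical difference can be sketched as follows; the synthetic two-cluster data and all parameter values are illustrative assumptions, not the paper's dataset or settings.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture

# Hypothetical 2-D feature data drawn from two elongated Gaussian clusters.
# A GM model can capture each cluster's anisotropic covariance, whereas
# k-means implicitly assumes spherical clusters of equal size.
rng = np.random.default_rng(0)
cluster_a = rng.multivariate_normal([0, 0], [[4.0, 1.5], [1.5, 1.0]], size=200)
cluster_b = rng.multivariate_normal([6, 1], [[1.0, -0.8], [-0.8, 2.0]], size=200)
X = np.vstack([cluster_a, cluster_b])

# Fit both models and obtain per-sample cluster labels.
km_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
gm_labels = GaussianMixture(n_components=2, random_state=0).fit_predict(X)
```

On elongated or overlapping clusters like these, the two label assignments typically disagree near the cluster boundary, which is the kind of difference the four panels of Fig. 16 visualize.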

Source: PubMed
