Evaluating the relevance of gaze cues in context: A study of virtual joint attention interactions in autism and typical development
Joint attention, the ability to attend to the same thing as others, is an important social ability, and its delayed development is characteristic of autism. This delay may reflect a specific difficulty in identifying relevant communicative cues (e.g., eye movements) embedded in a realistic social interaction. This project investigated this hypothesis using an ecologically valid joint attention paradigm. Participants played a co-operative interactive game with an on-screen avatar that required them to evaluate and respond to their partner's eye gaze behaviour. In Experiment 1, neurotypical adults played three versions of this game that manipulated the contextual cues available prior to the joint attention bid. The context conditions either included non-communicative eye movements that were (1) informative, being predictive of the target's location (Predictive Search), or (2) non-informative, not predictive of the target's location (Random Search), or (3) contained no non-communicative eye movements before the joint attention bid (NoSearch). Each context was performed once with each stimulus type (Eyes and Arrows). Data were analysed for accuracy and saccadic reaction times (SRT) in response to joint attention bids. Results revealed that, overall, participants made more errors in the Random Search context than in either the NoSearch or the Predictive Search context. They were also significantly faster to respond in the Predictive Search context than in the NoSearch and Random Search contexts with both Eyes and Arrows stimuli. Critically, relative to the NoSearch baseline, the disadvantage (i.e., slower reaction time) of embedding the cue in the Random Search context was smaller for Eyes than for Arrows. Additionally, the advantage (i.e., faster reaction time) of embedding the cue in the Predictive Search context was larger for Eyes than for Arrows. Together, these findings suggest a relative advantage for identifying relevant Eyes over Arrows when embedded in a realistic context. This relative advantage could be attributed to the unique role of eye contact as an ostensive signal.
In Experiment 2, we asked young autistic people to play the same task, but with only two contexts (Random Search and NoSearch) and the same two stimuli (Eyes and Arrows). Comparisons between the responsivity of the autistic group and that of a neurotypical comparison group revealed no significant overall group differences in accuracy or SRT. Participants in both groups made more errors in the Random Search context than in the NoSearch context. They were also slower to respond in the Random Search context than in the NoSearch context. Examining the pattern of SRT effects in each group separately revealed that the neurotypical group showed the same relative advantage for Eyes over Arrows as in Experiment 1, again characterised by a smaller effect of the Random Search context on responsivity for Eyes than for Arrows. This relative advantage, however, was not replicated in the autistic group. This finding suggests that although young autistic individuals were able to complete the task with performance comparable to that of their neurotypical peers, they seem to lack sensitivity to eye contact as an ostensive signal. These findings are critical to understanding the specific factors that contribute to the difficulty faced by autistic people in responding to joint attention. More work is needed to verify these findings using larger sample sizes. Future studies should also investigate the influence of a Predictive Search context (as in Experiment 1) on joint attention to further understand how informative contextual cues affect responsivity in autism.