Posted on 2021-06-09, 11:13. Authored by D Souto, O Marsh, C Hutchinson, S Judge, KB Paterson
The last twenty years have seen the development of gaze-controlled computer interfaces for augmentative communication and other assistive technology applications. In many applications, the user needs to look at symbols on a virtual on-screen keyboard and maintain gaze to make a selection. Executive control is essential to learning to use gaze control, and it affects the uptake of the technology. Specifically, the user of a gaze-controlled interface must suppress looking for its own sake, the so-called "Midas touch" problem. In a pre-registered study (https://osf.io/2mak4), we tested whether gaze-typing performance depends on executive control and whether learning-dependent plasticity leads to improved executive control as measured using the antisaccade task. Forty-two university students were recruited as participants. After five 30-min training sessions, we found shorter antisaccade latencies in a gaze-control group compared to a mouse-control group, and similar error rates. Subjective workload ratings were also similar across groups, indicating that the task was matched for difficulty between groups. These findings suggest that executive control contributes to gaze-typing performance and that gaze-typing practice leads to learning-induced plasticity.
Citation
Computers in Human Behavior
Volume 122, September 2021, 106831
Author affiliation
Department of Neuroscience, Psychology and Behaviour, College of Life Sciences