
Dissertation/Thesis Abstract

The role of attention in eye gaze cuing accuracy
by Truong, Bao Quoc, Ph.D., University of California, Irvine, 2009, 154; 3355939
Abstract (Summary)

How accurately can humans infer the direction of another person's gaze? To study this question, I presented viewers with images of faces that gazed at locations marked by grey spheres. Viewers had to respond as soon as they could detect that one of the spheres turned black, which marked the appearance of the target. Viewers responded significantly faster when the face gazed at the target (congruent gaze) than when the face gazed in the opposite direction (incongruent gaze). The design of this study was based on a target-detection paradigm. Unlike target-detection paradigms in previous eye gaze research, I presented the target and locations in stereo and varied their position in direction and depth. The face stimuli varied in: (1) stereo cuing (stereoscopic vs. non-stereoscopic) and (2) gaze direction (congruent vs. incongruent). I also varied facial attributes by: (1) rotating the face, (2) occluding one of the eyes, (3) removing all facial features except the eyes, and (4) reversing the contrast values of the colors in the face. Under certain conditions, viewers responded significantly faster when the target appeared on the left side of the display than when it appeared on the right side. This pattern indicated a left visual field (LVF) bias, a phenomenon commonly associated with face processing. The LVF bias occurred only when: (1) the face had stereo cues, (2) the face was viewed frontally, (3) the face had both eyes exposed, and (4) the face continued to gaze at the viewer before and after the target appeared. According to EEG studies, the facial attributes present during these effects were also those that elicit the N170 response, a scalp-recorded event-related potential produced by the synchronous activation of the fusiform gyrus, superior temporal sulcus, and occipital regions.
Findings suggest that viewers infer gaze direction to locations in 3D by calculating a 3D vector from each eye in the gaze stimuli.

Key findings that provide novel contributions to current eye gaze research include: (1) viewers attended more to 3D locations in the visual field pointed to by a rotated head; (2) viewers attended more to 3D locations in the visual field of the occluded eye when viewing one-eyed gaze stimuli; (3) viewers oriented accurately to 3D locations pointed to by the exposed eye when viewing one-eyed gaze stimuli; (4) removing facial features so that only two eyeballs remained in the stimuli resulted in less attention to 3D locations than viewing a frontal face with full gaze; and (5) responses to eye gaze stimuli in which the eye in the LVF was occluded were significantly slower than responses to stimuli in which both eyes were exposed.

Findings demonstrate how viewing certain stimulus classes (i.e., faces and eye gaze) that are processed by specialized cognitive systems influences how attention is distributed throughout the visual field. Such knowledge can be leveraged by professionals in the field of visual attention (e.g., researchers and marketers) when developing products and tools that convey crucial visual information.

Indexing (document details)
Advisor: Hoffman, Donald D.
Committee: Iverson, Geoffrey J., Sperling, George
School: University of California, Irvine
Department: Psychology - Ph.D.
School Location: United States -- California
Source: DAI-B 70/05, Dissertation Abstracts International
Subjects: Cognitive psychology
Keywords: Accuracy, Attention, Eye gaze, Eye occlusion, Left visual field bias, Stereoscopic
Publication Number: 3355939
ISBN: 978-1-109-15546-4
Copyright © 2021 ProQuest LLC. All rights reserved.