References of "Webster, M.A."
Full Text
Peer Reviewed
All-or-none face categorization in the human brain
Retter, Talia; Jiang, F.; Webster, M.A. et al.

in NeuroImage (2020)


Visual categorization is integral for our interaction with the natural environment. In this process, similar selective responses are produced to a class of variable visual inputs. Whether categorization is supported by partial (graded) or absolute (all-or-none) neural responses in high-level human brain regions is largely unknown. We address this issue with a novel frequency-sweep paradigm probing the evolution of face categorization responses between the minimal and optimal stimulus presentation times. In a first experiment, natural images of variable non-face objects were progressively swept from 120 to 3 Hz (8.33–333 ms duration) in rapid serial visual presentation sequences. Widely variable face exemplars appeared every 1 s, enabling an implicit frequency-tagged face-categorization electroencephalographic (EEG) response at 1 Hz. Face-categorization activity emerged with stimulus durations as brief as 17 ms (17–83 ms across individual participants) but was significant with 33 ms durations at the group level. The face categorization response amplitude increased until 83 ms stimulus duration (12 Hz), implying graded categorization responses. In a second EEG experiment, faces appeared non-periodically throughout such sequences at fixed presentation rates, while participants explicitly categorized faces. A strong correlation between response amplitude and behavioral accuracy across frequency rates suggested that dilution from missed categorizations, rather than a decreased response to each face stimulus, accounted for the graded categorization responses found in Experiment 1. This was supported by (1) the absence of neural responses to faces that participants failed to categorize explicitly in Experiment 2 and (2) equivalent amplitudes and spatio-temporal signatures of neural responses to behaviorally categorized faces across presentation rates. Overall, these observations provide original evidence that high-level visual categorization of faces, starting at about 100 ms following stimulus onset in the human brain, is variable across observers tested under tight temporal constraints, but occurs in an all-or-none fashion.
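
Note on the measure: because faces appear periodically (every 1 s), a face-selective response shows up as a peak at exactly 1 Hz (and its harmonics) in the EEG amplitude spectrum. As a rough illustration only, the sketch below shows one generic way a frequency-tagged response can be quantified from a single channel (amplitude at the tag frequency minus the mean amplitude of neighboring bins). The function name, neighbor count, and synthetic data are assumptions for illustration, not the paper's analysis pipeline.

    import numpy as np

    def tagged_response(eeg, sfreq, tag_freq=1.0, n_neighbors=10):
        """Estimate a frequency-tagged response as a baseline-subtracted amplitude.

        eeg      : 1-D array, single-channel EEG for one sequence
        sfreq    : sampling rate in Hz
        tag_freq : frequency at which the tagged category (faces) appears
        Returns amplitude at tag_freq minus the mean amplitude of neighboring bins.
        """
        n = len(eeg)
        amp = np.abs(np.fft.rfft(eeg)) / n          # amplitude spectrum
        freqs = np.fft.rfftfreq(n, d=1.0 / sfreq)   # frequency of each FFT bin

        tag_bin = int(np.argmin(np.abs(freqs - tag_freq)))
        # Neighboring bins on both sides, excluding the tag bin and its immediate neighbors
        neighbors = np.r_[tag_bin - n_neighbors - 1 : tag_bin - 1,
                          tag_bin + 2 : tag_bin + n_neighbors + 2]
        baseline = amp[neighbors].mean()
        return amp[tag_bin] - baseline

    # Example: 60 s of synthetic data at 512 Hz with a weak 1 Hz component plus noise
    sfreq = 512
    t = np.arange(0, 60, 1.0 / sfreq)
    eeg = 0.5 * np.sin(2 * np.pi * 1.0 * t) + np.random.randn(t.size)
    print(tagged_response(eeg, sfreq, tag_freq=1.0))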

Full Text
Peer Reviewed
Neural correlates of perceptual color inferences as revealed by #thedress
Retter, Talia; Gwinn, O.S.; O'Neil, S.F. et al.

in Journal of Vision (2020), 20(3), 7


Color constancy involves disambiguating the spectral characteristics of lights and surfaces, for example to distinguish red in white light from white in red light. Solving this problem appears especially challenging for bluish tints, which may be attributed more often to shading, and this bias may underlie the individual differences in whether people described the widely publicized image of #thedress as blue-black or white-gold. To probe these higher-level color inferences, we examined neural correlates of the blue-bias, using frequency-tagging and high-density electroencephalography to monitor responses to 3-Hz alternations between different color versions of #thedress. Specifically, we compared relative neural responses to the original “blue” dress image alternated with the complementary “yellow” image (formed by inverting the chromatic contrast of each pixel). This image pair produced a large modulation of the electroencephalography amplitude at the alternation frequency, consistent with a perceived contrast difference between the blue and yellow images. Furthermore, decoding topographical differences in the blue-yellow asymmetries over occipitoparietal channels predicted blue-black and white-gold observers with over 80% accuracy. The blue-yellow asymmetry was stronger than for a “red” versus “green” pair matched for the same component differences in L versus M or S versus LM chromatic contrast as the blue-yellow pair and thus cannot be accounted for by asymmetries within either precortical cardinal mechanism. Instead, the results may point to neural correlates of a higher-level perceptual representation of surface colors.
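
Note on the decoding step: predicting blue-black versus white-gold observers from occipitoparietal response topographies is, in general form, a cross-validated classification over per-channel amplitudes at the alternation frequency. The sketch below is a schematic illustration only, assuming a leave-one-participant-out logistic-regression decoder over random placeholder data; the classifier, channel count, and cross-validation scheme are assumptions, not the authors' exact method.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    # X: per-participant topographies of the blue-yellow asymmetry
    #    (one value per occipito-parietal channel at the 3 Hz alternation
    #    frequency); random placeholders here.
    # y: reported percept, 0 = blue-black, 1 = white-gold.
    rng = np.random.default_rng(0)
    n_participants, n_channels = 30, 32
    X = rng.normal(size=(n_participants, n_channels))
    y = rng.integers(0, 2, size=n_participants)

    # Leave-one-participant-out decoding of the reported percept.
    clf = LogisticRegression(max_iter=1000)
    scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
    print(f"decoding accuracy: {scores.mean():.2f}")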
