Perceptual averaging by eye and ear: computing summary statistics from multimodal stimuli

Atten Percept Psychophys. 2012 Jul;74(5):810-5. doi: 10.3758/s13414-012-0293-0.

Abstract

Beyond perceiving the features of individual objects, we also have the intriguing ability to efficiently perceive average values of collections of objects across various dimensions. Over what features can perceptual averaging occur? Work to date has been limited to visual properties, but perceptual experience is intrinsically multimodal. As an initial exploration of how this process operates in multimodal environments, we studied statistical summarizing in audition (averaging pitch from a sequence of tones) and vision (averaging size from a sequence of discs), as well as their interaction. We observed two primary results. First, auditory averaging was not only robust but, if anything, more accurate than visual averaging in the present study. Second, when uncorrelated visual and auditory information was simultaneously present, observers showed little cost for averaging in either modality, even though they did not know until the end of each trial which average they had to report. These results illustrate that perceptual averaging can span different sensory modalities, and they also show how vision and audition can both cooperate and compete for resources.

MeSH terms

  • Attention*
  • Discrimination, Psychological
  • Humans
  • Judgment
  • Pattern Recognition, Visual*
  • Pitch Discrimination*
  • Problem Solving*
  • Size Perception
  • Speech Perception*
  • Students / psychology