Published September 26, 2007 | Published + Supplemental Material
Journal Article | Open Access

Causal Inference in Multisensory Perception

Abstract

Perceptual events derive their significance to an animal from their meaning about the world, that is, from the information they carry about their causes. The brain should thus be able to efficiently infer the causes underlying our sensory events. Here we use multisensory cue combination to study causal inference in perception. We formulate an ideal-observer model that infers whether two sensory cues originate from the same location and that also estimates their location(s). This model accurately predicts the nonlinear integration of cues by human subjects in two auditory-visual localization tasks. The results show that humans can indeed efficiently infer the causal structure as well as the location of causes. By combining insights from the study of causal inference with the ideal-observer approach to sensory cue combination, we show that the capacity to infer causal structure is not limited to conscious, high-level cognition; it is also performed continually and effortlessly in perception.
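The ideal-observer model described in the abstract can be sketched in a few lines: Bayes' rule gives the posterior probability that the visual and auditory cues share a common cause, and the final position estimate averages the fused and segregated estimates weighted by that posterior. This is a minimal sketch assuming Gaussian likelihoods and a zero-mean Gaussian prior over position; the numerical parameter values below are illustrative placeholders, not the fitted values reported in the paper.

```python
import numpy as np

def causal_inference_estimate(x_v, x_a, sigma_v=2.0, sigma_a=9.0,
                              sigma_p=12.0, p_common=0.3):
    """Model-averaged estimate of the auditory source position.

    x_v, x_a : noisy visual and auditory measurements (degrees).
    sigma_v, sigma_a : sensory noise standard deviations.
    sigma_p : std of the Gaussian spatial prior (mean assumed 0).
    p_common : prior probability that both cues share one cause.
    Returns (position estimate, posterior prob. of a common cause).
    """
    jv, ja, jp = 1 / sigma_v**2, 1 / sigma_a**2, 1 / sigma_p**2

    # Likelihood of both measurements under a common cause (C = 1),
    # with the shared source position integrated out analytically.
    var_c1 = (sigma_v**2 * sigma_a**2 + sigma_v**2 * sigma_p**2
              + sigma_a**2 * sigma_p**2)
    like_c1 = np.exp(-0.5 * ((x_v - x_a)**2 * sigma_p**2
                             + x_v**2 * sigma_a**2
                             + x_a**2 * sigma_v**2) / var_c1) \
              / (2 * np.pi * np.sqrt(var_c1))

    # Likelihood under two independent causes (C = 2): product of
    # the two marginals, each broadened by the prior variance.
    var_v = sigma_v**2 + sigma_p**2
    var_a = sigma_a**2 + sigma_p**2
    like_c2 = (np.exp(-0.5 * x_v**2 / var_v) / np.sqrt(2 * np.pi * var_v)
               * np.exp(-0.5 * x_a**2 / var_a) / np.sqrt(2 * np.pi * var_a))

    # Posterior probability of a common cause (Bayes' rule).
    post_c1 = (like_c1 * p_common
               / (like_c1 * p_common + like_c2 * (1 - p_common)))

    # Optimal estimate under each causal structure, then model averaging.
    s_hat_c1 = (jv * x_v + ja * x_a) / (jv + ja + jp)   # fused
    s_hat_a_c2 = ja * x_a / (ja + jp)                   # auditory alone
    return post_c1 * s_hat_c1 + (1 - post_c1) * s_hat_a_c2, post_c1
```

The key qualitative prediction is visible directly: coincident cues yield a high posterior probability of a common cause (and hence strong fusion), while widely separated cues drive that posterior down, producing the nonlinear, disparity-dependent integration observed in the localization tasks.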

Additional Information

© 2007 Körding et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Received: June 15, 2007; Accepted: September 3, 2007; Published: September 26, 2007.

We would like to thank David Knill and Philip Sabes for inspiring discussions and Jose L. Pena for help with the setup of the experiment in Figure 2. K.P.K., U.B., and W.J.M. contributed equally to this paper. After completion of this work, we became aware of a paper by Yoshiyuki Sato, Taro Toyoizumi, and Kazuyuki Aihara, who independently developed a similar model (Neural Computation, in press).

KPK was supported by a DFG Heisenberg Stipend. UB and SQ were supported by the David and Lucile Packard Foundation as well as by the Moore Foundation. JBT was supported by the P. E. Newton Career Development Chair. LS was supported by UCLA Academic Senate and Career Development grants.

Author Contributions: Conceived and designed the experiments: LS SQ JT. Performed the experiments: UB. Analyzed the data: UB KK WM. Wrote the paper: UB KK LS WM.

Competing interests: The authors have declared that no competing interests exist.

Supporting Information:
Text S1. Supporting information for "Causal inference in multisensory perception" (0.11 MB DOC)
Figure S1. The interaction priors, when fit to our dataset, are shown for the causal inference model, the Roach et al. prior [1], and the Bresciani et al. prior [3]. (1.15 MB EPS)
Figure S2. The average auditory bias, i.e., the relative influence of the visual position on the perceived auditory position, is shown as a function of the absolute spatial disparity (solid line, as in Fig. 2 of the main text), along with the model predictions (dashed lines). Red: causal inference model. Green: behavior derived using the Roach et al. prior. Purple: behavior derived using the Bresciani et al. prior. (0.94 MB EPS)

Attached Files

Published - KORplosone07.pdf

Supplemental Material - KORplosone07figS1.pdf

Supplemental Material - KORplosone07figS2.pdf

Supplemental Material - KORplosone07supptext.doc

Supplemental Material - KORplosone07supptxt.pdf

Supplemental Material - journal.pone.0000943.s002.ps

Supplemental Material - journal.pone.0000943.s003.ps


Additional details

Created: August 22, 2023
Modified: October 16, 2023