Bimodal moment-by-moment coupling in perceptual multistability

Multistable perception occurs in all sensory modalities, and there is ongoing theoretical debate about whether overarching mechanisms drive multistability across modalities. Here we study whether multistable percepts are coupled across vision and audition on a moment-by-moment basis. To assess perception simultaneously in both modalities without inducing a dual-task situation, we query auditory perception by direct report while measuring visual perception indirectly via eye movements. A support-vector-machine (SVM)–based classifier allows us to decode visual perception from the eye-tracking data on a moment-by-moment basis. For each timepoint, we compare the visual percept (SVM output) with the auditory percept (report) and quantify the co-occurrence of integrated (one-object) or segregated (two-object) interpretations in the two modalities. Our results show above-chance coupling of auditory and visual perceptual interpretations. By titrating stimulus parameters toward an approximately symmetric distribution of integrated and segregated percepts for each modality and individual, we minimize the amount of coupling expected by chance. Because of the nature of our task, we can rule out that the coupling stems from postperceptual levels (i.e., decision or response interference). Our results thus indicate moment-by-moment perceptual coupling in the resolution of visual and auditory multistability, lending support to theories that postulate joint mechanisms for multistable perception across the senses.
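The abstract describes two computational steps that lend themselves to a brief illustration: decoding the visual percept from eye-tracking features with an SVM, and quantifying moment-by-moment co-occurrence against the chance level implied by the marginal percept distributions. The following Python sketch (using scikit-learn) is a minimal illustration of that logic, not the authors' pipeline: the data are synthetic stand-ins, and the feature set, labels, and variable names are assumptions. It also shows why a symmetric (roughly 50/50) percept distribution in each modality minimizes the coupling expected by chance.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_predict

# Hypothetical data: eye-movement features per timepoint, plus a binary
# percept label for each modality (0 = integrated, 1 = segregated).
rng = np.random.default_rng(0)
n = 1000
X_eye = rng.normal(size=(n, 4))                       # stand-in eye-tracking features
visual_labels = (X_eye[:, 0] > 0).astype(int)         # stand-in visual ground truth
auditory_report = (rng.random(n) < 0.5).astype(int)   # stand-in auditory reports

# Decode the visual percept from eye-tracking features, timepoint by
# timepoint, using cross-validated SVM predictions to avoid overfitting.
svm = SVC(kernel="rbf")
visual_decoded = cross_val_predict(svm, X_eye, visual_labels, cv=5)

# Moment-by-moment coupling: fraction of timepoints at which both
# modalities share the same interpretation (integrated or segregated).
observed = np.mean(visual_decoded == auditory_report)

# Chance coupling from the marginals: the agreement expected if the two
# modalities were independent. With a symmetric 50/50 split in each
# modality this reaches its minimum of 0.5, which is why titrating the
# stimuli toward symmetric percept distributions minimizes chance coupling.
p_v = visual_decoded.mean()
p_a = auditory_report.mean()
chance = p_v * p_a + (1 - p_v) * (1 - p_a)

print(f"observed coupling: {observed:.3f}, chance level: {chance:.3f}")
```

An above-chance result in this scheme would correspond to `observed` exceeding `chance` reliably across participants; the synthetic data here are independent by construction, so the two values should roughly coincide.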

This article is published in the "Journal of Vision" (2024).

Bibliographic information

Title:  Bimodal moment-by-moment coupling in perceptual multistability

Written by:  J. Grenzebach, T. G. G. Wegner, W. Einhäuser, A. Bendixen

in: Journal of Vision, Volume 24, Issue 5, 2024, pages 1-18. DOI: 10.1167/jov.24.5.16

Download file "Bimodal moment-by-moment coupling in perceptual multistability" (PDF, 5 MB, not barrier-free)