Abstract
Studies of memory and learning have usually focused on a single sensory modality, although human perception is multisensory in nature. In the present study, we investigated the effects of audiovisual encoding on later unisensory recognition memory performance. Participants memorized auditory or visual stimuli (sounds, pictures, spoken words, or written words), each of which co-occurred during encoding with a semantically congruent, a semantically incongruent, or a neutral (non-semantic noise) stimulus in the other modality. Subsequent memory performance was overall better when the stimulus to be memorized had initially been accompanied by a semantically congruent stimulus in the other modality than when it had been accompanied by a neutral stimulus. These results suggest that semantically congruent multisensory experiences enhance the encoding of both nonverbal and verbal materials, improving their later recognition.