037-22 – Hippocampal theta tracks audio-visual integration in natural speech and predicts episodic memory formation

Emmanuel Biau
University of Liverpool
Abstract

Recent studies have shown that the brain exploits the synchrony between rhythmic visual and auditory stimulation peaking at a 4 Hz theta rate to form new episodic memories. We extended this principle to the audio-visual speech domain, where syllable onsets temporally align lip movements with speech sounds on a common theta phase. We hypothesised that corresponding theta activity in the hippocampus tracks audio-visual synchrony during speech perception to bind information and form new speech memories. We designed a memory task presenting participants with short audio-visual clips of speakers’ faces engaged in conversation, in which lip movements and auditory speech signals were either synchronous or asynchronous. In a subsequent retrieval test, participants were visually cued with a face picture taken from one of the audio-visual clips and had to recall which auditory speech had been associated with the face cue during encoding. We recorded participants’ magnetoencephalography (MEG) signals to address whether hippocampal oscillatory activity during encoding predicted memory performance, depending on audio-visual speech synchrony. Results revealed that synchronous stimuli elicited greater theta activity in the language networks at the cortical level. Downstream, audio-visual asynchrony decreased theta power responses in the hippocampus during speech encoding, which predicted differences in memory accuracy between the synchronous and asynchronous conditions. Finally, phase-coupling analyses revealed that the visual cortex and the hippocampus synchronised more strongly through theta activity during synchronous speech encoding, which also predicted subsequent memory accuracy. Altogether, these results show that theta oscillations in the hippocampus track audio-visual speech integration to form new episodic memories.

Additional Authors
Danying Wang
Hyojin Park
Ole Jensen
Simon Hanslmayr
Additional Institutions
University of Glasgow
University of Birmingham