Predictive feedback to V1 dynamically updates with sensory input

Grace Edwards, Petra Vetter, Fiona McGruer, Lucy Petro, Lars Muckli

Research output: Contribution to journal › Article › peer-review

Abstract

Predictive coding theories propose that the brain creates internal models of the environment to predict upcoming sensory input. Hierarchical predictive coding models of vision postulate that higher visual areas generate predictions of sensory inputs and feed them back to early visual cortex. In V1, sensory inputs that do not match the predictions lead to amplified brain activation, but does this amplification process dynamically update to new retinotopic locations with eye movements? We investigated the effect of eye movements on predictive feedback using functional brain imaging and eye-tracking whilst presenting an apparent motion illusion. Apparent motion induces an internal model of motion, during which sensory predictions of the illusory motion feed back to V1. We observed attenuated BOLD responses to predicted stimuli at the new post-saccadic location in V1. Therefore, pre-saccadic predictions update their retinotopic location in time for post-saccadic input, validating dynamic predictive coding theories in V1.
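The core logic described in the abstract can be illustrated with a minimal sketch: a fed-back prediction is compared against sensory input, the response scales with the mismatch (prediction error), and a saccade shifts the prediction to the new retinotopic location so that expected post-saccadic input still matches. This is purely illustrative, not the authors' model; the functions `v1_response` and `remap_prediction`, the array sizes, and the shift amount are all hypothetical.

```python
import numpy as np

# Illustrative sketch of the predictive-coding idea in the abstract
# (hypothetical functions and values, not the authors' model):
# V1 "activation" scales with the mismatch between input and the
# prediction fed back from higher visual areas.

def v1_response(sensory_input: np.ndarray, feedback_prediction: np.ndarray) -> float:
    """Return a scalar activation proportional to the prediction error."""
    error = sensory_input - feedback_prediction
    return float(np.linalg.norm(error))

def remap_prediction(prediction: np.ndarray, saccade_shift: int) -> np.ndarray:
    """Shift the predicted pattern to the new post-saccadic retinotopic location."""
    return np.roll(prediction, saccade_shift)

# Retinotopic space as a 1-D array of positions (hypothetical discretisation).
stimulus = np.zeros(10)
stimulus[7] = 1.0            # stimulus arrives at the post-saccadic location

prediction = np.zeros(10)
prediction[3] = 1.0          # pre-saccadic predicted location

stale = v1_response(stimulus, prediction)                         # prediction not updated
updated = v1_response(stimulus, remap_prediction(prediction, 4))  # prediction remapped by the saccade

print(f"response with stale prediction:   {stale:.2f}")    # larger (mismatch)
print(f"response with updated prediction: {updated:.2f}")  # attenuated (match)
```

Running the sketch gives a larger response when the prediction is left at its pre-saccadic location and an attenuated response when it is remapped, mirroring the attenuated BOLD responses to predicted stimuli at the new post-saccadic location reported in the abstract.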
Original language: English
Article number: 16538
Pages (from-to): 1-12
Number of pages: 12
Journal: Scientific Reports
Volume: 7
DOIs
Publication status: Published - 28 Nov 2017

Keywords

  • fMRI, predictive coding, eye movements, saccades, vision, early visual cortex, V1, apparent motion, feedback, top-down
