Attractors in Song
Cognitive Neuroimaging Seminar, Monday 18th

Abstract: This talk summarizes our recent attempts to integrate action and perception within a single optimization framework. We start with a statistical formulation of Helmholtz's ideas about neural energy to furnish a model of perceptual inference and learning that can explain a remarkable range of neurobiological facts. Using constructs from statistical physics, it can be shown that the problems of inferring the causes of our sensory inputs and learning regularities in the sensorium can be resolved using exactly the same principles. Furthermore, inference and learning can proceed in a biologically plausible fashion. The ensuing scheme rests on Empirical Bayes and hierarchical models of how sensory information is generated. The use of hierarchical models enables the brain to construct prior expectations in a dynamic and context-sensitive fashion. This scheme provides a principled way to understand many aspects of the brain's organization and responses. We will demonstrate the brain-like dynamics that this scheme entails by using models of bird songs that are based on chaotic attractors with autonomous dynamics. This provides a nice example of how nonlinear dynamics can be exploited by the brain to represent and predict dynamics in the sensorium.
Overview
- Inference and learning under the free energy principle
- Hierarchical Bayesian inference
- Bird songs (inference)
- Perceptual categorisation
- Prediction and omission
- Bird songs (learning)
- Repetition suppression
- The mismatch negativity
- A simple experiment
Exchange with the environment
[Figure: an agent (m) and its environment, separated by a Markov blanket; external and internal states, coupled through sensation and action]
The free-energy principle
- Action to minimise a bound on surprise
- Perception to optimise the bound
- The conditional density and separation of scales: perceptual inference, perceptual learning and perceptual uncertainty
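To make the bound explicit, the free energy for a generative model m, sensory data y and a recognition density q over unknown causes \vartheta can be written as follows (the notation is generic, not the deck's own):

    F = -\langle \ln p(y, \vartheta \mid m) \rangle_q + \langle \ln q(\vartheta) \rangle_q
      = -\ln p(y \mid m) + D_{KL}\left[ q(\vartheta) \,\|\, p(\vartheta \mid y, m) \right]

Because the divergence term is non-negative, F is an upper bound on surprise, -\ln p(y \mid m). Perception optimises the bound by changing q, driving it towards the true posterior; action minimises the bound on surprise by changing the sensory samples y.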
Mean-field partition
- Synaptic activity: perception and inference (functional specialization)
- Synaptic efficacy: learning and memory (activity-dependent plasticity)
- Synaptic gain: attention and uncertainty (attentional gain; enables plasticity)
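Under a mean-field assumption, the recognition density factorises over these three sets of quantities; writing u for states (encoded by synaptic activity), \theta for parameters (synaptic efficacy) and \gamma for precisions (synaptic gain) gives, schematically:

    q(u, \theta, \gamma) \approx q(u)\, q(\theta)\, q(\gamma)

The symbols here are chosen for illustration; the point is that inference, learning and the encoding of uncertainty optimise the same free energy, but over different partitions and on different timescales.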
Hierarchical (deep) dynamic models
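A hierarchical dynamic model of the kind referred to here can be sketched as follows (superscripts index levels; the specific functional forms are placeholders):

    y = g^{(1)}(x^{(1)}, v^{(1)}) + z^{(1)}
    \dot{x}^{(1)} = f^{(1)}(x^{(1)}, v^{(1)}) + w^{(1)}
    v^{(1)} = g^{(2)}(x^{(2)}, v^{(2)}) + z^{(2)}
    \dot{x}^{(2)} = f^{(2)}(x^{(2)}, v^{(2)}) + w^{(2)}
    ...

Causal states v^{(i)} link levels, hidden states x^{(i)} endow each level with dynamics and memory, and z^{(i)}, w^{(i)} are random fluctuations whose precisions determine how much each level's prediction errors are trusted.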
Empirical Bayes and DEM
- D-Step (perceptual inference): recurrent message passing among neuronal populations, with top-down predictions changing to suppress bottom-up prediction error
- E-Step (perceptual learning): associative plasticity, modulated by precision
- M-Step (perceptual uncertainty): encoding of precision through classical neuromodulation or plasticity of lateral connections
[Figure: bottom-up, lateral and top-down message passing among levels]
Friston K, Kilner J, Harrison L (2006). A free energy principle for the brain. J. Physiol. Paris.
Message passing in neuronal hierarchies
[Figure: laminar circuit with supragranular (SG), granular (L4) and infragranular (IG) layers; backward connections convey predictions, forward connections convey prediction error]
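As a toy illustration of this message passing (not the deck's own scheme or code; a minimal linear sketch with fixed weights and unit precisions), recognition can be implemented as a gradient descent on prediction error, with backward connections carrying predictions and forward connections carrying errors:

    # Minimal two-level predictive coding sketch: backward connections carry predictions,
    # forward connections carry prediction errors; expectations descend the error gradient.
    import numpy as np

    rng = np.random.default_rng(0)
    W1 = rng.normal(size=(8, 4))   # generative mapping: level-1 causes -> sensory data
    W2 = rng.normal(size=(4, 2))   # generative mapping: level-2 causes -> level-1 causes

    def recognise(y, n_steps=600, lr=0.02):
        v1 = np.zeros(4)           # level-1 expectation
        v2 = np.zeros(2)           # level-2 expectation
        for _ in range(n_steps):
            e0 = y - W1 @ v1       # sensory prediction error (forward)
            e1 = v1 - W2 @ v2      # level-1 prediction error (forward)
            v1 += lr * (W1.T @ e0 - e1)   # top-down prediction from level 2 constrains level 1
            v2 += lr * (W2.T @ e1)        # level 2 adjusts to explain away level-1 error
        return v1, v2, e0, e1

    v2_true = rng.normal(size=2)
    y = W1 @ (W2 @ v2_true) + 0.01 * rng.normal(size=8)
    v1, v2, e0, e1 = recognise(y)
    print("residual sensory error:", np.round(np.abs(e0).mean(), 4))

The suppression of e0 and e1 over iterations is the linear analogue of top-down predictions quenching bottom-up prediction error in the figure above.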
Overview: Bird songs (inference) and perceptual categorisation
Synthetic song-birds
[Figure: vocal centre driving the syrinx; sonogram showing frequency over time (sec)]
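A minimal sketch of how such a song could be synthesised, assuming (as the abstract suggests) a chaotic attractor with autonomous dynamics, here a Lorenz system, whose states are mapped onto the amplitude and frequency of chirps; the particular mapping and parameters below are illustrative, not the deck's:

    # Synthetic song-bird sketch: a Lorenz attractor drives the chirp amplitude and frequency.
    # The mapping from attractor states to the sonogram contour is an illustrative assumption.
    import numpy as np

    def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = state
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    dt, n_steps = 0.005, 4000
    states = np.zeros((n_steps, 3))
    states[0] = [1.0, 1.0, 25.0]
    for t in range(n_steps - 1):                      # simple Euler integration
        states[t + 1] = states[t] + dt * lorenz(states[t])

    amplitude = np.abs(states[:, 1])                  # assumed: 2nd state modulates chirp amplitude
    frequency_hz = 2000.0 + 100.0 * states[:, 2]      # assumed: 3rd state modulates chirp frequency
    # Rendering audio would interpolate these slowly varying traces to an audio sampling rate;
    # plotting frequency_hz over time, weighted by amplitude, gives a chirp-like sonogram.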
Hierarchical recognition
[Figure: stimulus sonogram over time (seconds); recognised causal states, hidden states, and prediction and error over time (bins); backward predictions and forward prediction error]
Perceptual categorization
[Figure: sonograms of Songs A, B and C, frequency (Hz) over time (seconds)]
Overview: Prediction and omission
Sequences of sequences
[Figure: a neuronal hierarchy driving the syrinx; sonogram showing frequency (kHz) over time (sec)]
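One way to generate such sequences of sequences is sketched below, under the assumption that a slow attractor at the higher level sets a control parameter of a fast attractor at the lower level; the use of Lorenz systems at both levels, and coupling through the Rayleigh parameter, is illustrative rather than the deck's exact construction:

    # 'Sequences of sequences' sketch: the slow (higher) level modulates a parameter of the
    # fast (lower) level, which generates the individual chirps. Illustrative assumptions only.
    import numpy as np

    def lorenz(state, rho, sigma=10.0, beta=8.0 / 3.0):
        x, y, z = state
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    dt, n_steps = 0.005, 8000
    slow = np.array([1.0, 1.0, 25.0])
    fast = np.array([1.0, 1.0, 25.0])
    fast_states = np.zeros((n_steps, 3))
    for t in range(n_steps):
        slow = slow + (dt / 16.0) * lorenz(slow, rho=28.0)   # higher level: slower timescale
        rho_fast = 24.0 + 0.2 * slow[2]                       # top-down: sets lower-level parameter
        fast = fast + dt * lorenz(fast, rho=rho_fast)         # lower level: fast chirp dynamics
        fast_states[t] = fast
    # fast_states can be mapped onto chirp amplitude and frequency as in the earlier sketch.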
Hierarchical perception
[Figure: stimulus and percept sonograms over time (seconds); causes, hidden states at both levels, and prediction and error over time]
Prediction error is encoded by superficial pyramidal cells that generate ERPs.
… omitting the last chirps
[Figure: prediction and error, hidden states at both levels and level-2 causes over time]
Omission and violation of predictions
[Figure: stimulus sonograms, frequency (Hz) over time (sec), and the corresponding percepts, with and without the last syllable; LFP (microvolts) over peristimulus time (ms) showing the ERP (prediction error) with omission; panels contrast "stimulus but no percept" and "percept but no stimulus"]
Overview: Bird songs (learning), repetition suppression and the mismatch negativity
Repetition suppression and the MMN
[Figure: suppression of inferotemporal responses to repeated faces; main effect of faces (Henson et al., 2000)]
The MMN is an enhanced negativity seen in response to any change (deviant) compared to the standard response.
Hierarchical learning
[Figure: hidden states, causes, and prediction and error over time; sonogram, frequency (Hz) over time (sec); synaptic adaptation and changes in synaptic efficacy]
Simulating ERPs to repeated chirps
Perceptual inference: suppressing error over peristimulus time. Perceptual learning: suppression over repetitions.
[Figure: for successive presentations, the percept (sonogram, frequency in Hz over time in sec), hidden states and prediction error (LFP, microvolts) over peristimulus time (ms)]
Synthetic MMN
[Figure: prediction error at the first presentation (before learning) and the last presentation (after learning); changes in parameters (synaptic efficacy) and hyperparameters (synaptic gain) over presentations; difference waveforms over peristimulus time (ms) at the primary level (N1/P1) and the secondary level (MMN)]
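The logic behind these repetition effects can be caricatured in a few lines, assuming a single stimulus feature and a simple delta rule standing in for associative plasticity; this is only a caricature of the learning that the simulations above implement with full hierarchical dynamic models:

    # Caricature of repetition suppression: between presentations, learning updates the model's
    # parameter, so the prediction error evoked by the same chirp shrinks over repetitions.
    true_feature = 4.0        # repeated stimulus feature (arbitrary units)
    estimate = 0.0            # model parameter before learning
    learning_rate = 0.4

    for presentation in range(1, 7):
        prediction_error = true_feature - estimate     # evoked response scales with this error
        estimate += learning_rate * prediction_error   # perceptual learning between trials
        print(f"presentation {presentation}: |prediction error| = {abs(prediction_error):.2f}")

The difference between the first and last presentation of such errors is the analogue of the difference waveform (MMN) in the synthetic and real ERPs.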
Synthetic and real ERPs
[Figure: changes in parameters (synaptic efficacy) and hyperparameters (synaptic gain) over presentations; repetition effects on intrinsic and extrinsic connections in a network with subcortical input to A1 and STG]
Overview: A simple experiment
A brain imaging experiment with sparse visual stimuli
Coherent and predictable versus random and unpredictable stimuli: is there top-down suppression of prediction error when the stimulus is coherent?
[Figure after Angelucci et al.: classical receptive field (CRF) of V1 (~1°); horizontal V1 connections (~2°); feedback from V2 (~5°) and V3 (~10°); classical and extra-classical receptive fields of V1 and V2]
Suppression of prediction error with coherent stimuli
[Figure: regional responses (90% confidence intervals), increases and decreases, in V1, V2, V5 and pCG for random, stationary and coherent stimuli]
Harrison et al., NeuroImage 2006
Predictions about the brain
This model of brain function explains a wide range of anatomical and physiological facts; for example, the hierarchical deployment of cortical areas and recurrent architectures using forward and backward connections, with their functional asymmetries (Angelucci et al., 2002). In terms of synaptic physiology, it predicts Hebbian or associative plasticity and, for dynamic models, spike-timing-dependent plasticity. In terms of electrophysiology, it accounts for classical and extra-classical receptive field effects and for long-latency or endogenous components of evoked cortical responses (Rao and Ballard, 1998). It predicts the attenuation of responses encoding prediction error with perceptual learning, and explains phenomena such as repetition suppression, the mismatch negativity and the P300 in electroencephalography. In psychophysical terms, it accounts for the behavioural correlates of these physiological phenomena, e.g. priming and global precedence.
Summary
- A free energy principle can account for several aspects of action and perception.
- The architecture of cortical systems speaks to hierarchical generative models.
- Estimation of hierarchical dynamic models corresponds to a generalised deconvolution of inputs to disclose their causes.
- This deconvolution can be implemented in a neuronally plausible fashion by constructing a dynamic system that self-organises, when exposed to inputs, to suppress its free energy.
Thank you
Thanks to collaborators: Jean Daunizeau, Lee Harrison, Stefan Kiebel, James Kilner, Klaas Stephan
And colleagues: Peter Dayan, Jörn Diedrichsen, Paul Verschure, Florentin Wörgötter