[Comp-neuro] Network Plasticity as Bayesian Inference

David Kappel david at igi.tugraz.at
Thu Apr 30 13:00:40 CEST 2015


We would like to announce our new paper on stable network function through
stochastic plasticity.

Summary:

Networks of neurons in the brain are exposed to a multitude of internal and external changes and perturbations, to which they have to respond quickly in order to maintain stable functionality. In addition, experimental data suggest that these networks are simultaneously able to maintain structural constraints, such as the empirically found connection probability between specific types of neurons and heavy-tailed distributions of synaptic weights. Other experimental data point to surprising ongoing fluctuations in dendritic spines and spine volumes, to some extent even in the adult brain and in the absence of synaptic activity.

In our paper "Network Plasticity as Bayesian Inference" we have shown that stochasticity of synaptic connections may support stable network function. It enables networks to sample parameters from a low-dimensional manifold in a high-dimensional parameter space that represents attractive combinations of structural constraints and a good fit to empirical evidence (e.g., sensory inputs). The resulting new theory of network plasticity explains, from a functional perspective, the experimentally observed ongoing fluctuations and structural priors that previously appeared quite puzzling, and provides a viable alternative to existing models that propose convergence of parameters to point estimates of their optimal values, e.g. to maximum likelihood values.
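To give a flavor of the idea of sampling parameters from a posterior rather than converging to a point estimate, here is a minimal, hypothetical 1-D sketch (not the network model from the paper): unadjusted Langevin dynamics on a single parameter, whose drift follows the gradient of the log posterior while noise keeps it fluctuating. All variable names and the Gaussian setup are illustrative assumptions chosen so the posterior has a closed form for comparison.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not from the paper): Gaussian prior as a
# "structural constraint", Gaussian likelihood from synthetic evidence.
sigma_p = 2.0                            # prior std
sigma_l = 0.5                            # likelihood std
data = rng.normal(1.0, sigma_l, size=50) # synthetic "sensory inputs"

# Closed-form Gaussian posterior, for comparison with the samples.
post_var = 1.0 / (1.0 / sigma_p**2 + len(data) / sigma_l**2)
post_mean = post_var * data.sum() / sigma_l**2

def grad_log_posterior(theta):
    """d/dtheta of [log prior + log likelihood]."""
    return -theta / sigma_p**2 + (data - theta).sum() / sigma_l**2

# Langevin dynamics: gradient drift plus noise. The parameter never
# settles to a point; it keeps sampling the posterior distribution.
theta, dt = 0.0, 1e-3
samples = []
for step in range(20000):
    theta += dt * grad_log_posterior(theta) + rng.normal(0.0, np.sqrt(2 * dt))
    if step > 2000:                      # discard burn-in
        samples.append(theta)

samples = np.array(samples)
print(post_mean, samples.mean())         # sample mean tracks posterior mean
print(post_var, samples.var())           # sample variance tracks posterior variance
```

The point of the sketch is that the ongoing fluctuations are not a nuisance to be averaged away: they are exactly what makes the trajectory sample the posterior instead of collapsing to a maximum likelihood point.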

A preprint of the paper is available at: http://arxiv.org/abs/1504.05143v1
The supplement is available at: http://www.igi.tugraz.at/kappel/pdfs/ms-541-suppl.pdf

-- 
David Kappel
Institute for Theoretical Computer Science
Graz University of Technology
Inffeldgasse 16b,  A-8010 Graz,  Austria
Tel.:  ++43/316/873-5847
http://www.igi.tugraz.at/kappel/



