[Comp-neuro] Axonal backpropagation in real neural circuits

kdharris at andromeda.rutgers.edu
Fri Mar 21 20:07:02 CET 2008

Dear Computational Neuroscientists,

The "backprop" algorithm is the most successful neural network algorithm
for real-world applications, but is not considered a serious model for
neuronal plasticity in the brain. Experimental evidence shows that
information can indeed flow backward along axons, but it cannot do so fast
enough to implement the backprop algorithm.
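For readers less familiar with the algorithm, here is a minimal pure-Python
sketch of backprop (the network size, learning rate, and XOR task are
arbitrary illustrative choices, not taken from the article). The step to
note is the hidden-layer update, which requires the output error to be sent
backward through the output weights:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# A tiny 2-3-1 network trained on XOR with hand-derived gradients.
H = 3
W1 = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
W2 = [random.uniform(-1, 1) for _ in range(H)]
b2 = 0.0
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
lr = 0.5

def forward(x):
    h = [sigmoid(W1[j][0] * x[0] + W1[j][1] * x[1] + b1[j]) for j in range(H)]
    y = sigmoid(sum(W2[j] * h[j] for j in range(H)) + b2)
    return h, y

def total_loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

before = total_loss()
for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        # Output error, propagated backward through the output weights --
        # the step with no sufficiently fast biological counterpart.
        dy = (y - t) * y * (1 - y)
        dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(H)]
        for j in range(H):
            W2[j] -= lr * dy * h[j]
            b1[j] -= lr * dh[j]
            for i in range(2):
                W1[j][i] -= lr * dh[j] * x[i]
        b2 -= lr * dy
after = total_loss()
```

The backward pass must deliver a precise, per-synapse error signal within
each update cycle, which is what the timescale of real retroaxonal
signaling rules out.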

In a recent article, I review the experimental evidence on such
"retroaxonal" signaling in the nervous system, and suggest how retroaxonal
signals propagating at physiologically reasonable speeds could control
learning in neuronal networks, based on the hypothesis that strengthening
of a neuron's output synapses stabilizes recent changes in the same
neuron's inputs. As a consequence, neural representations that provide
useful information to their downstream targets, and thus guide behavior, are
stabilized. A candidate molecular mechanism for this process, involving
the activation of CREB by retrograde neurotrophin signals, is proposed.
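To make the hypothesis concrete, here is a deliberately simplified toy
simulation (the "usefulness" measure, target weights, and consolidation
rule are my own illustrative inventions, not taken from the article):
a tentative change to a neuron's input weights is consolidated only if the
neuron's output synapses strengthen afterwards, here modeled as the change
proving useful downstream; otherwise the change decays.

```python
import random

random.seed(1)

target = [0.8, -0.3, 0.5]   # hypothetical "useful" input-weight vector
w_in = [0.0, 0.0, 0.0]      # the neuron's current input weights

def usefulness(w):
    # Stand-in for downstream output-synapse strengthening: how well
    # these input weights serve the neuron's targets (invented metric).
    return -sum((wi - ti) ** 2 for wi, ti in zip(w, target))

for step in range(2000):
    # Tentative plasticity: a small random perturbation of the inputs.
    trial = [wi + random.gauss(0, 0.05) for wi in w_in]
    if usefulness(trial) > usefulness(w_in):
        # Output synapses strengthened -> consolidate the input change.
        w_in = trial
    # else: the tentative change decays; the old weights are retained.
```

Note that this slow accept-or-decay loop needs only a scalar "did my
outputs strengthen?" signal, compatible with physiologically slow
retroaxonal transport, rather than backprop's fast per-synapse error.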

The article may be found in the latest issue of Trends in Neurosciences,
or online at: http://andromeda.rutgers.edu/~kdharris/backprop.pdf

"Stability of the fittest: organizing learning with retroaxonal signals",
TINS 31:130-136 (2008).

Kenneth D. Harris

Kenneth D. Harris, Ph.D.
Assistant Professor
Center for Molecular and Behavioral Neuroscience
Rutgers, The State University of New Jersey
197 University Avenue
Newark  NJ 07102, USA
phone: 973 353 1080, x3331
fax: 973 353 1272

Visiting Assistant Professor
Smilow Neuroscience Program and Dept of Otolaryngology
NYU Medical Center 550 1st Avenue, New York NY 10016
phone: 212 263 9295

email: kdharris at andromeda.rutgers.edu
web: http://qneuro.rutgers.edu
