[Comp-neuro] DISCUSSION: Re: Attractors, variability and noise

Todd Troyer todd.troyer at utsa.edu
Fri Aug 22 15:06:35 CEST 2008


Jim claims that it is harder to adjust realistic models than abstract models
to get them to do what you want.  I'm not so sure, but it's certainly true
that it's going to be harder to find an optimal location in the much bigger
state space of the realistic model.  This of course argues that evolution is
more likely to act at a level closer to abstract models than at the level of
realistic models.  We may even be able to understand important properties
and principles that govern behavior on the dominant low-dimensional space.
Another cause for optimism is that the genetic code is simply too small to
specify all the details.  So again, evolution is likely to act on the
'rules' that govern self-organization rather than on the organization of the
brain itself.
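
Just to make the 'dominant low-dimensional space' idea concrete, here is a
toy sketch (my own illustration, nothing anyone has fit to data; the units,
latent modes and noise level are all made up).  A few hundred units are
driven by a handful of latent modes plus noise, and PCA on the simulated
activity finds that almost all the variance lives in ~3 dimensions even
though the nominal state space has 200.

import numpy as np

rng = np.random.default_rng(0)
T, N, K = 2000, 200, 3                    # time steps, units, latent modes

t = np.arange(T) * 0.001                  # 1 ms time steps
freqs = (2.0, 5.0, 11.0)                  # arbitrary latent frequencies (Hz)
latents = np.stack([np.sin(2 * np.pi * f * t) for f in freqs])    # (K, T)
mixing = rng.normal(size=(N, K))          # each unit mixes the latent modes
rates = mixing @ latents + 0.2 * rng.normal(size=(N, T))          # (N, T)

# PCA via SVD of the mean-centered activity matrix
X = rates - rates.mean(axis=1, keepdims=True)
s = np.linalg.svd(X, compute_uv=False)
var_explained = s**2 / np.sum(s**2)
print("variance explained by first 5 PCs:", np.round(var_explained[:5], 3))
# nearly all variance sits in the first 3 components, despite 200 nominal
# dimensions

Obviously the brain is not a linear mixture of sinusoids; the point is only
that a big state space and low-dimensional dominant dynamics are not in
conflict.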

These factors suggest that evolution is likely to act predominantly at a
level of functional simplicity just so that it can operate with a degree of
flexibility.   KCC aside, we have a chance at reasonably capturing some
(many?) important principles of brain function within simplified, abstract
models. Note also that 'understanding' is not a binary concept - it's not
like we understand the brain or we don't.  My personal opinion is that
we'll never have a 'complete' understanding of the brain, whether or not
that is theoretically possible. Given where we're at now, I'm not too
concerned with the endgame.

The flip-side of this positive story is that evolution also has the nasty
habit of linking mechanisms across levels of analysis in devilishly
complicated ways.  To cook up an example, suppose that '40 Hz' oscillations
are important for some aspect of brain function.  To optimize this function,
evolution will alter mechanisms that have preferential effects at 40 Hz
without regard to which 'level of analysis' label we'd place on that
mechanism.  If gamma band oscillations are important, we should EXPECT
evolution to come up with very specific values for network dynamics
(connection densities, excitatory/inhibitory balance, etc.), synaptic
dynamics (receptor dynamics, time constants of
depression/facilitation/desensitization, etc.), neuronal dynamics
(voltage-dependent channels, adaptation dynamics, propagation speeds between
soma and dendrite, etc.), all in such a way to influence the strength and/or
presence of 40 Hz oscillations.  And there is no reason to stop with
electrical signaling. One might expect calcium and other chemical cascades
to be particularly sensitive to 40 Hz oscillations and to act on them (e.g.
via Ca-dependent K channels), potentially regulated in turn by the spatial
distribution and size of spines, the degree of molecular crowding, etc.
Worse, we should expect evolution to CO-regulate these values, leading to
specific (i.e. non-random) cross-level dependencies.
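
To put some numbers on the cross-level point, here is a deliberately
over-simplified sketch (my own toy example; the linear E-I rate model and
every parameter value below are made up, not fit to anything).  The damped
oscillation frequency of the circuit is set jointly by the time constants
and the connection weights, so a 'cellular' parameter (the inhibitory time
constant) and a 'network' parameter (the E-to-I weight) both move the
resonance around within the gamma band.

import numpy as np

def oscillation_freq_hz(tau_e, tau_i, w_ee, w_ei, w_ie, w_ii):
    """Damped-oscillation frequency (Hz) of the linear E-I rate model
         tau_e dE/dt = -E + w_ee*E - w_ei*I
         tau_i dI/dt = -I + w_ie*E - w_ii*I
    taken from the imaginary part of the eigenvalues of the Jacobian."""
    A = np.array([[(w_ee - 1.0) / tau_e, -w_ei / tau_e],
                  [w_ie / tau_i, -(w_ii + 1.0) / tau_i]])
    eig = np.linalg.eigvals(A)
    return np.abs(eig.imag).max() / (2.0 * np.pi)

base = dict(tau_e=0.005, tau_i=0.010, w_ee=1.5, w_ei=2.0, w_ie=2.5, w_ii=0.5)
print("baseline: %.1f Hz" % oscillation_freq_hz(**base))

# a 'cellular-level' knob (inhibitory time constant) ...
for tau_i in (0.008, 0.010, 0.012):
    print("tau_i = %2.0f ms -> %.1f Hz"
          % (tau_i * 1e3, oscillation_freq_hz(**{**base, "tau_i": tau_i})))

# ... and a 'network-level' knob (E-to-I weight) both shift the resonance
for w_ie in (2.0, 2.5, 3.0):
    print("w_ie  = %.1f    -> %.1f Hz"
          % (w_ie, oscillation_freq_hz(**{**base, "w_ie": w_ie})))

Nothing hinges on this particular model; any circuit with a resonance will
show the same kind of joint dependence, which is exactly why we should
expect evolution to tune (and co-tune) parameters at every level.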

A related example might be the neural Turing test mentioned by Bard, where
one asks whether you can tell the difference between the response of a
simple model and a realistic model (or a real neuron) to a complex pattern of
input.  The difficulty here is that evolution has the tendency to co-evolve
neural properties and input patterns so that (a) neurons are most sensitive
to the input patterns they actually receive and (b) the brain generates
input patterns that are most effective in controlling neural firing
patterns.  So to really run the neural Turing test you need to consider both
the neuron model and the inputs.  Personally, I think the field could use
more systematic comparisons between simplified and more complex models.  But
great care has to be taken to choose the framework for the comparison (in
particular, the nature of the inputs).
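
For what it's worth, here is the kind of comparison I have in mind, boiled
down to a toy (everything below is made up for illustration: the 'simple'
and 'complex' models are just a leaky integrate-and-fire neuron without and
with spike-triggered adaptation, and the similarity measure is just the
correlation of boxcar-smoothed spike trains).  The particular numbers don't
matter; what matters is that the answer depends on the input ensemble, so
the inputs have to be specified as part of the test.

import numpy as np

rng = np.random.default_rng(1)
dt, T = 1e-4, 2.0                        # 0.1 ms steps, 2 s of simulated time
n = int(T / dt)

def lif(I, adapt=0.0):
    """Leaky integrate-and-fire neuron driven by current I, with optional
    spike-triggered adaptation.  All parameters are illustrative only."""
    tau_m, tau_w, v_th, v_reset = 0.02, 0.2, 1.0, 0.0
    v, w = 0.0, 0.0
    spikes = np.zeros(n, dtype=bool)
    for i in range(n):
        v += dt * (-v - w + I[i]) / tau_m
        w += dt * (-w) / tau_w
        if v >= v_th:
            v = v_reset
            w += adapt                   # adaptation current jumps at spikes
            spikes[i] = True
    return spikes

def similarity(s1, s2, width=0.02):
    """Correlation of two spike trains after smoothing each with a
    'width'-second boxcar (a crude stand-in for fancier spike metrics)."""
    k = np.ones(int(width / dt))
    r1 = np.convolve(s1.astype(float), k, "same")
    r2 = np.convolve(s2.astype(float), k, "same")
    return np.corrcoef(r1, r2)[0, 1]

# two input ensembles with the same mean and variance but very different
# correlation times: fast 'white-noise-like' vs. slow 'naturalistic-like'
white = 1.2 + 0.8 * rng.normal(size=n)
kernel = np.exp(-np.arange(0.0, 0.2, dt) / 0.05)     # ~50 ms correlation time
slow = np.convolve(rng.normal(size=n), kernel, "same")
slow = 1.2 + 0.8 * slow / slow.std()

for name, I in [("white", white), ("slow", slow)]:
    s_simple = lif(I, adapt=0.0)         # the 'simple' model
    s_complex = lif(I, adapt=0.3)        # stand-in for a more complex model
    print("%-5s input: similarity = %.2f"
          % (name, similarity(s_simple, s_complex)))

Swap in a compartmental model and a proper spike-train metric and the logic
is the same: whatever number comes out is a property of the (model, input
ensemble) pair, not of the model alone.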

Todd

------------------------------------------------------
Todd Troyer
UT San Antonio, Biology Dept.
One UTSA Circle
San Antonio TX 78249
Todd.troyer at utsa.edu
W: 210-458-5487, FAX: 210-458-5658
www.utsa.edu/troyerlab
------------------------------------------------------



