[Comp-neuro] CFP Special Session "Learning in Spiking Neural Networks: Beyond Hebbian Learning" at IJCNN 2012

Andre Gruning a.gruning at surrey.ac.uk
Mon Dec 5 16:03:11 CET 2011


Call for Papers: 
Special Session "Learning in Spiking Neural Networks:
Beyond Hebbian Learning" 

IJCNN 2012, Brisbane 10--15/06/2012

Deadline: 19/12/2011

Submission: via the IJCNN submission system at the IJCNN 2012 homepage at
http://www.ieee-wcci2012.org/

Homepage: 
http://www.computing.surrey.ac.uk/personal/st/A.Gruning/ijcnn/special_session_spiking.pdf

Keywords:
Spiking Neural Networks
Learning Algorithms
Supervised Learning
Unsupervised Learning


Organizers:

Andre Gruning
Scott Notley
Yaochu Jin

Nature-Inspired Computing and Engineering (NICE)	
Department of Computing, University of Surrey
Email: [a.gruning|s.notley|yaochu.jin]@surrey.ac.uk

This special session aims to bring together researchers from
computational neuroscience, computational intelligence, machine
learning and cognitive science to discuss new ideas and present
efficient learning algorithms that go beyond Hebbian learning for
feed-forward, recurrent and reservoir based spiking neural
networks. 

Much evidence from neuroscience indicates that learning in biological
neural networks is correlation-based (Hebbian-style, e.g. spike-timing-
dependent plasticity, STDP). However, cognitive behaviour is often
considered to be target-driven, which points to a supervised rather
than a purely correlation-based approach to learning.
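For readers less familiar with the terminology, the correlation-based STDP rule mentioned above can be sketched as a simple pair-based weight update. This is a minimal illustration only; the amplitudes and time constant below are hypothetical placeholders, not taken from any specific model in this call:

```python
import math

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for one pre/post spike pair (times in ms).

    Pre-before-post (t_post > t_pre) potentiates the synapse (LTP);
    post-before-pre depresses it (LTD). Parameter values are illustrative.
    """
    dt = t_post - t_pre
    if dt > 0:
        return a_plus * math.exp(-dt / tau)    # potentiation (LTP)
    elif dt < 0:
        return -a_minus * math.exp(dt / tau)   # depression (LTD)
    return 0.0

# A pre-spike 5 ms before a post-spike strengthens the synapse;
# the reverse order weakens it.
print(stdp_dw(t_pre=0.0, t_post=5.0) > 0)   # True
print(stdp_dw(t_pre=5.0, t_post=0.0) < 0)   # True
```

Note that the rule depends only on relative spike timing, not on any target signal, which is exactly why it is "unsupervised" in the sense discussed above.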

While a large number of efficient supervised and unsupervised learning
algorithms have been developed for artificial neural networks, along
with a wide range of applications, most learning algorithms for
spiking neural networks are still correlation-based, with few
exceptions, and only limited success has been reported in applying
spiking neural networks to real-world problems.

Topics of interest include but are not limited to: 

How can correlation-based learning on the neural level lead to
supervised learning behaviour on a higher functional level? 

Can we implement learning algorithms that are technically efficient in
a biologically plausible way in networks of spiking neurons?  

How do synaptic, homeostatic and intrinsic plasticity rules influence
the dynamics and learning performance of spiking neural networks?

What are typical applications of spiking neural networks in which
spiking behaviour offers a real advantage over the standard use of
rate neurons?

Program Committee

Sander Bohte, Centrum Wiskunde & Informatica (CWI), Netherlands.
Andre Gruning, University of Surrey, UK.
Razvan Florian, Coneural – Center for Cognitive and Neural Studies, Romania.
Yaochu Jin, University of Surrey, UK.
Nikola Kasabov, AUT, New Zealand.
Jian Liu, University of Goettingen, Germany.
Yan Meng, Stevens Institute of Technology, USA.
Scott Notley, University of Surrey, UK.
Filip Ponulak, Brain Corporation, San Diego, USA.
Peter Tino, University of Birmingham, UK.
Pierre Yger, University College London, UK.
