[Comp-neuro] Special issue: Information Decomposition of Target Effects from Multi-Source Interactions
joseph.lizier at sydney.edu.au
Fri Mar 24 13:14:57 CET 2017
We are pleased to announce the following Special Issue of Entropy on
information decompositions. We hope that you will consider submitting a
new research paper, review, or other contribution on this topic.
If you are planning to submit, we would appreciate it if you could let
us know in advance.
Special Issue: "Information Decomposition of Target Effects from
Multi-Source Interactions"
Submission Deadline: May 31, 2017 (open for submission now!)
Guest Editors:
* Dr. Joseph Lizier; Centre for Complex Systems, Faculty of
Engineering and IT, The University of Sydney, Australia
* Dr. Nils Bertschinger; Frankfurt Institute of Advanced Studies
(FIAS), Frankfurt, Germany
* Prof. Juergen Jost; Max Planck Institute for Mathematics in the
Sciences, Leipzig, Germany and Santa Fe Institute, NM, USA
* Prof. Michael Wibral; MEG Unit, Brain Imaging Center, Goethe
University, Frankfurt, Germany
Shannon information theory has provided rigorous ways to capture our
intuitive notions regarding uncertainty and information, and made an
enormous impact in doing so. One of the fundamental measures here is
mutual information, which captures the average information contained in
one variable about another, and vice versa. If we have two source
variables and a target, for example, we can measure the information held
by one source about the target, the information held by the other source
about the target, and the information held by those sources together
about the target. Any other notion about the directed information
relationship between these variables that can be captured by classical
information-theoretic measures (e.g., conditional mutual information
terms) is linearly redundant with those three quantities.
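As a minimal sketch of these three classical quantities, consider a hypothetical AND-gate system (two uniform binary sources, target T = S1 AND S2); the `mutual_information` helper below is an illustrative plug-in estimator written for this example, not taken from any particular library:

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """I(X;Y) in bits, estimated from a list of equiprobable (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)                  # joint counts
    px = Counter(x for x, _ in pairs)     # marginal counts of X
    py = Counter(y for _, y in pairs)     # marginal counts of Y
    return sum(c / n * log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# AND gate: four equiprobable joint states (s1, s2, t = s1 & s2)
samples = [(0, 0, 0), (0, 1, 0), (1, 0, 0), (1, 1, 1)]
i1  = mutual_information([(s1, t) for s1, s2, t in samples])        # I(S1; T)
i2  = mutual_information([(s2, t) for s1, s2, t in samples])        # I(S2; T)
i12 = mutual_information([((s1, s2), t) for s1, s2, t in samples])  # I(S1,S2; T)
# i1 = i2 ≈ 0.311 bits; i12 ≈ 0.811 bits
```

Here the joint mutual information (≈0.811 bits) exceeds the sum-of-parts intuition one might form from the individual terms, which is exactly the gap the decompositions discussed below aim to account for.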
However, intuitively, there is a strong desire to measure further notions
of how this directed information interaction may be decomposed, e.g.,
how much information the two source variables hold redundantly about the
target, how much each source variable holds uniquely, and how much
information can only be discerned by synergistically examining the two
sources together. These notions go beyond the traditional
information-theoretic view of a channel serving the purpose of reliable
communication, considering now the situation of multiple communication
streams converging on a single target. This is a common situation in
biology, and in particular in neuroscience, where, say, the ability of a
target to synergistically fuse multiple information sources in a
non-trivial fashion is likely to have its own intrinsic value,
independently of the reliability of communication.
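Two worked micro-examples make these notions concrete, using the same kind of illustrative plug-in mutual information estimator as above (redefined here so the snippet is self-contained): an XOR target is purely synergistic, while a duplicated source is purely redundant, yet classical measures offer no term that names the redundant, unique, or synergistic parts themselves:

```python
from collections import Counter
from math import log2

def mi(pairs):
    """I(X;Y) in bits from equiprobable (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(c / n * log2(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

# Pure synergy: T = S1 XOR S2 over uniform binary sources.
# Each source alone tells us nothing; the pair determines T exactly.
xor = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
xor_i1  = mi([(s1, t) for s1, s2, t in xor])        # 0 bits
xor_i12 = mi([((s1, s2), t) for s1, s2, t in xor])  # 1 bit

# Pure redundancy: S1 = S2 = T (duplicated source).
# Each source alone already gives the full bit; together they add nothing.
copy = [(0, 0, 0), (1, 1, 1)]
copy_i1  = mi([(s1, t) for s1, s2, t in copy])        # 1 bit
copy_i12 = mi([((s1, s2), t) for s1, s2, t in copy])  # 1 bit
```

In the XOR case the 1 bit of joint information is entirely synergistic; in the copy case it is entirely redundant. Separating these contributions in general is precisely what the decomposition frameworks discussed in this Special Issue set out to do.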
The absence of measures for such decompositions into redundant, unique
and synergistic information is arguably the most fundamental missing
piece in classical information theory. Triggered by the formulation of
the Partial Information Decomposition framework by Williams and Beer in
2010, the past few years have witnessed a concentration of work by the
community in proposing, contrasting, and investigating new measures to
capture these notions of information decomposition. Other theoretical
developments consider how these measures relate to concepts of
information processing in terms of storage, transfer and modification.
Meanwhile, computational neuroscience has emerged as a primary
application area due to significant interest in questions surrounding
how target neurons integrate information from large numbers of sources,
as well as the availability of data sets to investigate these questions.
This Special Issue seeks to bring together these efforts, to capture a
snapshot of the current research, as well as to provide impetus for and
focused scrutiny on newer work. We also seek to present progress to the
wider community and attract further research. We welcome research
articles proposing new measures or pointing out future directions,
review articles on existing approaches, commentary on properties and
limitations of such approaches, philosophical contributions on how such
measures may be used or interpreted, applications to empirical data
(e.g., neural imaging data), and more.
Please see the special issue website for full details.
Manuscripts can be submitted until the deadline. Papers will be
published continuously (as soon as accepted) and will be listed together
on the special issue website. Research articles, review articles as well
as communications are invited. Submitted manuscripts should not have
been published previously, nor be under consideration for publication
elsewhere (except conference proceedings papers).
For planned papers, a title and short abstract (about 100 words) can be
sent to the Editorial Office for announcement on the website.
Entropy is an open access journal that maintains a rigorous and fast
peer-review process; accepted papers are published online immediately.
Entropy's 2015 Impact Factor is 1.743, and it is fully covered
by the leading indexing and abstracting services, including Google
Scholar, MathSciNet, Scopus and Science Citation Index Expanded (Web of
Science). The Article Processing Charge (APC) for publication in this
open access journal is 1500 CHF (Swiss Francs).
Joe, Nils, Michael and Juergen
Dr. JOSEPH LIZIER | ARC DECRA Fellow | Senior Lecturer
Complex Systems Research Group
Faculty of Engineering and IT
THE UNIVERSITY OF SYDNEY
Rm 338A, Building J05 | The University of Sydney | NSW | 2006
T+61 2 9351 3208 | F+61 2 9351 3343 | M+61 408 186 901
E joseph.lizier at sydney.edu.au | W sydney.edu.au
TW @jlizier | W lizier.me/joseph