[Comp-neuro] Solomonoff 85th memorial conf', Nov/Dec 2011,
1st Call for Papers
David Dowe (Infotech)
david.dowe at monash.edu
Mon Dec 6 21:21:44 CET 2010
(Apologies for cross-postings.)
RAY SOLOMONOFF (1926-2009) 85th MEMORIAL CONFERENCE
1st Call for Papers
Wedn 30/Nov/2011 - Fri 2/Dec/2011
Submission deadline: 20 May 2011
Ray Solomonoff (1926-2009) was the originator (in 1964) of algorithmic
information theory. Solomonoff's (1964) work preceded the slightly
later independent work of Kolmogorov (1965) [from whom we have the term
Kolmogorov complexity], shortly before the not unrelated work of the
then teenage G. J. Chaitin (1966). But, unlike the slightly later
Kolmogorov and Chaitin, Solomonoff (1964) also saw the relevance of this
new area to statistics, machine learning, artificial intelligence and
prediction - and coined the term algorithmic probability (ALP). Given
a body of data, the algorithmic probability distribution behind
Solomonoff prediction is obtained by taking a posterior-weighted average
of the outputs of all available computable theories - with the prior
probabilities of theories depending (monotonically decreasingly) upon the
lengths of their encodings on the chosen Universal Turing Machine (UTM).
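The weighting scheme described above can be sketched in miniature. True algorithmic probability enumerates all programs on a UTM and is incomputable, so the sketch below substitutes a tiny hand-picked set of "theories" with illustrative code lengths (both the theories and their lengths are assumptions for illustration, not part of Solomonoff's construction):

```python
# Toy sketch of Solomonoff-style prediction over binary sequences.
# Each "theory" is (code length in bits, P(next bit = 1 | history)).
# Real ALP sums over all programs on a UTM; this set is illustrative.
theories = [
    (3, lambda h: 0.5),   # "fair coin"
    (5, lambda h: 0.9),   # "coin biased towards 1"
    (7, lambda h: 1.0 if len(h) % 2 == 0 else 0.0),  # "alternate 1,0,1,0,..."
]

def likelihood(theory, data):
    """P(data | theory), computed bit by bit."""
    _, p1 = theory
    like = 1.0
    for i, bit in enumerate(data):
        p = p1(data[:i])
        like *= p if bit == 1 else (1.0 - p)
    return like

def predict_next(data):
    """Posterior-weighted probability that the next bit is 1.
    Prior probability decreases monotonically with code length: 2^(-length)."""
    posts = [2.0 ** (-l) * likelihood((l, f), data) for (l, f) in theories]
    z = sum(posts)
    return sum(w * f(data) for w, (l, f) in zip(posts, theories)) / z

print(predict_next([1, 1, 1, 1]))  # the biased-coin theory now dominates
```

After observing `1, 1, 1, 1` the alternating theory has zero likelihood and drops out, while the biased-coin theory, despite its longer code (smaller prior), dominates the mixture through its higher likelihood.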
Independently of and shortly after the above was the Minimum Message
Length (MML) work of Wallace and Boulton (1968), based on very similar
Bayesian information-theoretic principles but instead focussing on the
one single best model for statistical and inductive inference (and
whose relationship with algorithmic information theory was formalised
in the 1990s). The related Minimum Description Length (MDL) principle
followed a decade later in Rissanen (1978), coincidentally taking the
same form as Schwarz's (1978) Bayesian Information Criterion (BIC) of
the same year - and with some approaches [such as the still popular
but largely unrelated Akaike's Information Criterion (AIC)] formed
after MML but before MDL. The (algorithmic) information theory behind
both Solomonoff prediction and (the two-part form of) MML inference
(or model selection and point estimation) leads to a variety of
statistical consistency (or convergence) results - apparently more
general than for other approaches - and likewise makes the results of
both approaches statistically invariant to re-parameterisation.
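The coincidence of form noted above can be made concrete. Schwarz's (1978) BIC is k*ln(n) - 2*ln(L-hat) (lower is better), whose (k/2)*ln(n) per-parameter penalty on negative log-likelihood matches the codelength penalty of Rissanen's 1978 MDL form. A minimal sketch, with purely illustrative numbers (the coin-flip data below is an assumption, not from the source):

```python
import math

def bic(n, k, max_log_likelihood):
    """Schwarz (1978) criterion: k*ln(n) - 2*ln(L-hat); lower is better.
    k = number of free parameters, n = sample size."""
    return k * math.log(n) - 2.0 * max_log_likelihood

# Illustrative comparison: 70 heads in n = 100 flips.
# Model A: fair coin, k = 0 free parameters.
# Model B: Bernoulli(p), p fitted by maximum likelihood, k = 1.
n, heads = 100, 70
logL_fair = n * math.log(0.5)
p_hat = heads / n
logL_bern = heads * math.log(p_hat) + (n - heads) * math.log(1 - p_hat)

print(bic(n, 0, logL_fair))   # fair-coin model
print(bic(n, 1, logL_bern))   # fitted Bernoulli: smaller BIC on this data
```

On this data the fitted Bernoulli model achieves the lower (better) score: its gain in log-likelihood outweighs the ln(n)/2 codelength cost of stating one extra parameter.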
These approaches - both the MML inductive or inferential approach to
choosing the single ``best'' model and the Solomonoff predictive
approach of weighting over the posterior to form a predictive
distribution - are two of several approaches from (Kolmogorov
complexity or) algorithmic information theory which have been applied
to a range of areas. Such areas include (e.g.) statistical inference
(and model selection and point estimation) and prediction, machine
learning, econometrics (including time series and panel data), in
principle proofs of financial market inefficiency, knowledge discovery
and ``data mining'', theories of (quantifying) intelligence and new
forms of (universal) intelligence test (for robotic, terrestrial and
extra-terrestrial life), philosophy of science, the problem of
induction, bioinformatics, linguistics, evolutionary (tree) models in
biology and linguistics, geography, climate modelling and bush-fire
detection, environmental science, image processing, spectral analysis,
engineering, arguments that entropy is not the arrow of time, etc.
Of course, this list will continue to grow and is not exhaustive.
Perhaps Solomonoff's next main contribution was the notion of
``infinity point'' (Solomonoff, 1985), later referred to as the
``singularity'', where machine intelligence catches up to and
overtakes human intelligence - an increasingly discussed scenario
which forms the basis of many science fiction films.
Solomonoff's obituary from the New York Times (January 2010) is at
In the year in which Ray Solomonoff would have had his 85th birthday and
some weeks before the year in which Alan Turing (upon whose Universal
Turing Machines much of Solomonoff's work is based) would have turned
100, this multi-disciplinary conference is timed for late 2011. It also
follows on 15 years after the Information, Statistics and Induction in
Science (ISIS) conference in 1996, which was also held in Melbourne, Australia
- whose invited speakers included Ray Solomonoff, (Turing Award winner
and fellow artificial intelligence pioneer) Marvin Minsky, Jorma Rissanen
(of Minimum Description Length [MDL]) and (prominent machine learning
researcher) J. Ross Quinlan.
The contributions sought for this Solomonoff 85th memorial conference
are on the abovementioned themes and/or on anything else directly or at
least indirectly comparing with or building upon Solomonoff's work.
This inter-disciplinary conference will be held in Melbourne, Australia.
The conference will run for three days, from Wedn 30 November 2011
to Friday 2 December 2011, but might possibly be preceded by a day
or half-day of workshops and/or tutorials on Tues 29 November 2011.
Conference proceedings will be fully-refereed and published with a
suitably prestigious publisher. Selected papers on suitable topics
might be chosen to be expanded upon for journal special issues.
Andrew Barron, Statistics, Yale Univ, U.S.A.
Greg Chaitin, IBM T.J. Watson Research, U.S.A.
Fouad Chedid, Notre Dame Univ, Lebanon
Bertrand Clarke, Medical Statistics, Univ Miami, U.S.A.
A. Phil Dawid, Statistics, Cambridge University, U.K.
David Dowe (Conference and Program chair), Monash Univ
Peter Gacs, Boston University, U.S.A.
Alex Gammerman, Royal Holloway Univ London, England
Marcus Hutter, Australian National Univ (ANU)
Leonid Levin, Boston University, U.S.A.
Ming Li, Mathematics, U Waterloo, Canada
Kee Siong Ng, ANU (Australia) & EMC Corp
Juergen Schmidhuber, IDSIA, Switzerland
Farshid Vahid, Econometrics, Monash Univ, Australia
Paul Vitanyi, CWI, Amsterdam, Holland
Vladimir Vovk, Royal Holloway Univ London, England
Submission deadline: 20 May 2011
Conference dates: Wedn 30/Nov/2011 - Fri 2/Dec/2011