**Bayesics**

John Skilling, Cambridge UK.

John will give a basic overview of Bayesian methodology, from its axiomatic foundation through the conventional development of data analysis and model selection to its role in quantum mechanics, ending with some comments on inference in general human affairs. The central theme is that probability calculus is the unique language within which we can develop models of our surroundings that have predictive power. These models are patterns of belief; there is no need to claim external reality.
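As a concrete anchor for the probability calculus the tutorial develops, the following is Bayes' theorem in standard notation; the symbols are generic placeholders added here for orientation, not notation from the talk.

```latex
% Posterior belief in hypothesis \theta after seeing data D:
% posterior \propto likelihood \times prior
p(\theta \mid D) = \frac{p(D \mid \theta)\, p(\theta)}{p(D)},
\qquad
p(D) = \int p(D \mid \theta)\, p(\theta)\, \mathrm{d}\theta .
```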

**Markov chain Monte Carlo in Practice**

G. Larry Bretthorst, Washington University, USA.

Larry will introduce Markov chain Monte Carlo methods and demonstrate their application to parameter estimation and model selection in concrete problems.
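A minimal sketch of the kind of sampler such a tutorial demonstrates: a random-walk Metropolis algorithm targeting a one-dimensional posterior. The target density and all names here are illustrative assumptions, not material from the tutorial itself.

```python
import math
import random

def log_posterior(theta):
    # Illustrative target: a standard normal log-density (up to a constant).
    return -0.5 * theta * theta

def metropolis(log_post, theta0, n_steps, step_size=1.0, seed=0):
    """Random-walk Metropolis: propose symmetrically around the current
    point, accept with probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    theta = theta0
    samples = []
    for _ in range(n_steps):
        proposal = theta + rng.gauss(0.0, step_size)
        log_alpha = log_post(proposal) - log_post(theta)
        if rng.random() < math.exp(min(0.0, log_alpha)):
            theta = proposal  # accept; otherwise keep the current state
        samples.append(theta)
    return samples

samples = metropolis(log_posterior, theta0=0.0, n_steps=20000)
burned = samples[5000:]  # discard burn-in
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
```

For the standard-normal target the sample mean and variance should approach 0 and 1; in practice one would monitor acceptance rates and convergence diagnostics, which real MCMC tutorials cover in detail.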

**Probability and Entropy: Statements and Questions**

Kevin H. Knuth, NASA Ames Research Center, USA.

Kevin will demonstrate that probability theory and information theory are literally dual to each other via a duality relation called Birkhoff's Representation Theorem. Probability is a generalization of implication among a set of logical statements, so that probability describes the degree to which one statement implies another. Similarly, a quantity called relevance generalizes the notion of answering among questions, so that relevance describes the degree to which one question answers another. The Boolean lattice of logical statements gives rise to the free distributive lattice of questions, the two lattices being related by Birkhoff's theorem. Requiring that the relevance of one question to another can be expressed in terms of the probabilities of their answers leads directly to relevances being expressed in terms of entropies. This reveals the precise nature of the duality between probability theory and a natural generalization of information theory.
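A sketch of the two quantities involved, in illustrative notation (the abstract does not fix a notation): probability as degree of implication between statements, and the Shannon entropy through which relevance among questions is ultimately expressed.

```latex
% Degree to which statement b implies statement a:
p(a \mid b) \in [0, 1],
\qquad p(a \mid b) = 1 \iff b \rightarrow a .

% Relevance is expressed through entropies of the probabilities
% of a question's possible answers a_i:
H = -\sum_i p(a_i \mid b)\, \log p(a_i \mid b) .
```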

**Information Theory based Inference, and Applications for Massive Data**

Mihai Datcu, German Aerospace Center DLR, Germany.

Traditionally, information theory has focused on applications in [tele]communications: mainly the coding, transmission, and compression of signals. Implicitly, however, information theory has been closely related to statistics and machine learning from its very beginning. Thus many other fields, such as stochastic inference, estimation and decision theory, optimization, communication, and knowledge representation, benefit from its basic results. The goal of the tutorial is to survey applications and new developments in information theory relevant to inference, as well as general methods for information processing and understanding. Among the topics envisaged are: applications and extensions of rate-distortion theory, the information bottleneck method, ICA and source separation, MDL and related methods, information- and/or complexity-based estimation and inference, and applications such as entropy of shape, coding of image semantics, and image information mining.
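Of the topics listed, MDL is the easiest to illustrate in a few lines. The sketch below is a crude two-part description length (a BIC-style approximation, chosen here for brevity) that selects a polynomial degree for synthetic data; the data, model family, and penalty form are assumptions for illustration, not the tutorial's methods.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a quadratic signal plus Gaussian noise (illustrative only).
n = 200
x = np.linspace(-1.0, 1.0, n)
y = 1.0 - 2.0 * x + 3.0 * x**2 + rng.normal(0.0, 0.2, size=n)

def description_length(degree):
    """Two-part MDL in a crude form: (k/2) log n bits-worth of code for the
    k parameters, plus (n/2) log(RSS/n) for the data given the model."""
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((y - np.polyval(coeffs, x)) ** 2))
    k = degree + 1
    return 0.5 * k * math.log(n) + 0.5 * n * math.log(rss / n)

lengths = {d: description_length(d) for d in range(6)}
best_degree = min(lengths, key=lengths.get)
```

The parameter penalty grows with model complexity while the residual term shrinks, so the minimum description length balances fit against complexity; here it should recover a low-order model rather than the most flexible one.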