Markovian modeling and Bayesian learning, fall 2010

Lecturer

Jukka Corander

Scope

5 cu.

Type

The course provides an introduction to the principles of various types of Markovian probability models, such as ordinary discrete-time Markov chains, continuous-time Markov chains, higher-order Markov chains, variable-order Markov chains, hidden Markov models and graphical Markov models. Markovian independence assumptions of different forms are ubiquitous in modern probabilistic modeling. A central aim is to gain an understanding of how models can be built for various phenomena by using ordinary and hidden Markov assumptions. An introduction to Bayesian learning and its uses in applications is presented side by side with the Markov models. However, due to time constraints, deeper philosophical and general issues of Bayesian learning will not be considered in this course; such material is presented in the course 'Bayesian theory with applications'. Those interested in the philosophical issues of probability and randomness are encouraged to study the book by Jaynes (see the bibliography below).

This course is:

• Johdatus tilastolliseen päättelyyn, osa 2 (Introduction to statistical inference, part 2).

• One of the compulsory courses in the EuroBayes Master's degree program.

• One of the optionally compulsory courses in the Bioinformatics Master's degree program (MBI).

Prerequisites

Basic calculus and linear algebra, and an introductory course on probability. Those lacking such a course can take the intensive course Introduction to probability with R:

9.9 - 23.9, Thursday 10-14, Friday 14-16, B120

This accessory course/reading group is intended for foreign students. Finnish students are expected to have the prerequisites from the probability courses of the ordinary teaching program: Johdatus todennäköisyyslaskentaan and/or Todennäköisyyslaskenta. The accessory course does not substitute for these courses.

Lectures

28.9 - 14.10, Tuesday 14-17, Wednesday 16-18, Thursday 14-17, D122

NB! No lecture on Wed Sept 29th, as I will give the public colloquium lecture of the Department of Mathematics and Statistics. The title is 'Have I seen You before? Principles of predictive classification revisited'. Course participants are welcome to attend; check the exact time and place on the department webpages.

Lecture diary:

Tue 28.9.

Teaser trailer: an eye-opener on conditional probabilities and Bayes' theorem, and the basic properties of discrete-time Markov chains (definitions, hitting probabilities, invariant distribution). To get going with the basics of simulating Markov chains, you might find these Matlab codes useful.
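
The linked Matlab codes are not reproduced here, but the basic recipe fits in a few lines. Below is a minimal Python sketch, using an arbitrary two-state transition matrix of my own choosing rather than an example from the lecture, that simulates a chain and checks that the visit frequencies approach the invariant distribution.

    import numpy as np

    # Arbitrary two-state transition matrix (rows sum to one); purely illustrative.
    P = np.array([[0.9, 0.1],
                  [0.4, 0.6]])

    def simulate_dtmc(P, x0, n_steps, rng):
        """Simulate a DTMC path: at each step, draw the next state from row P[x]."""
        path = [x0]
        for _ in range(n_steps):
            path.append(rng.choice(len(P), p=P[path[-1]]))
        return np.array(path)

    rng = np.random.default_rng(0)
    path = simulate_dtmc(P, x0=0, n_steps=100_000, rng=rng)

    # The empirical visit frequencies approximate the invariant distribution pi,
    # which solves pi = pi P; for this P, pi = (0.8, 0.2).
    print(np.bincount(path) / len(path))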

Thu 30.9.

Mechanics of Bayesian learning: Bayesian statistics without tears (article) and how the updating of probabilities can be understood in terms of simulation; model comparison, Bayes factors and model averaging (article 'Bayes factors' by Kass & Raftery).
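
As a small illustration of the Bayes factor machinery from the Kass & Raftery article, here is a Python sketch, with data and models made up for this page rather than taken from the lecture, comparing a fair-coin model M0 against a model M1 with an unknown success probability under a uniform prior.

    import numpy as np
    from scipy.special import betaln

    # Toy data: s successes in n Bernoulli trials (illustrative numbers only).
    n, s = 50, 35

    # M0: theta = 0.5 exactly, so the marginal likelihood is the plain likelihood.
    log_m0 = n * np.log(0.5)

    # M1: theta ~ Beta(1, 1) (uniform). The marginal likelihood integrates the
    # likelihood against the prior: int theta^s (1-theta)^(n-s) dtheta = B(s+1, n-s+1).
    log_m1 = betaln(s + 1, n - s + 1)

    # Bayes factor B_10 = p(data | M1) / p(data | M0); Kass & Raftery interpret
    # 2*log(B_10) on roughly the same scale as a likelihood-ratio statistic.
    print("2 log B_10 =", 2 * (log_m1 - log_m0))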

Tue 5.10.

Continuous-time Markov chains (CTMCs): general properties such as the Chapman-Kolmogorov equations, and applications to bioinformatics (n-coalescent, Poisson process, nucleotide substitution models). Properties of the generator and transition probability matrices for CTMCs. For details, see chapter 2 of Koski's book online.
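
As one concrete instance of these ideas, the sketch below assumes the Jukes-Cantor substitution model (a standard textbook choice, not necessarily the lecture's example), builds its generator, obtains transition probabilities via the matrix exponential, and verifies the Chapman-Kolmogorov property numerically.

    import numpy as np
    from scipy.linalg import expm

    # Jukes-Cantor generator over the states (A, C, G, T): every off-diagonal
    # rate equals mu/3 and each row sums to zero.
    mu = 1.0
    Q = mu / 3 * (np.ones((4, 4)) - 4 * np.eye(4))

    # The transition probabilities solve P'(t) = P(t) Q, hence P(t) = expm(Q t).
    P = lambda t: expm(Q * t)

    # Chapman-Kolmogorov: P(s + t) = P(s) P(t).
    s, t = 0.3, 0.7
    print(np.allclose(P(s + t), P(s) @ P(t)))  # True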

Wed 6.10.

CTMCs continued: ergodicity, holding times, reversibility, the general time-reversible model of molecular evolution, and rate heterogeneity by compounding with a gamma distribution (see chs. 2-4 in Koski's online book).
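
A minimal sketch of two of these notions, assuming a small made-up birth-death generator rather than a lecture example: the stationary distribution solves pi Q = 0 with the normalization sum(pi) = 1, and reversibility amounts to the detailed-balance condition pi_i q_ij = pi_j q_ji.

    import numpy as np

    # Made-up birth-death generator on three states (rows sum to zero).
    Q = np.array([[-1.0,  1.0,  0.0],
                  [ 2.0, -3.0,  1.0],
                  [ 0.0,  2.0, -2.0]])

    # Stationary distribution: solve pi Q = 0 together with sum(pi) = 1 by
    # appending the normalization constraint to the transposed system.
    A = np.vstack([Q.T, np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("pi =", pi)  # (4/7, 2/7, 1/7)

    # Detailed balance: the matrix with entries pi_i * q_ij must be symmetric.
    flows = pi[:, None] * Q
    print("reversible:", np.allclose(flows, flows.T))  # True for birth-death chains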

Thu 7.10.

Mathematical basis of learning CTMCs, likelihood expressions and recursive calculation, Pulley principle (see ch. 6 in Koski's online book).
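
For a completely observed trajectory the likelihood factorizes into exponential holding-time terms and jump-intensity terms, which also shows that the transition counts and the total holding times are the sufficient statistics. A minimal Python sketch with a made-up generator and path (not from the lecture):

    import numpy as np

    def ctmc_path_loglik(Q, states, times):
        """Log-likelihood of a fully observed CTMC path.

        states: visited states x_0, ..., x_m
        times:  holding times spent in each state before the next jump.
        Each segment contributes q_{ij} * exp(-q_i * t), with q_i = -Q[i, i].
        """
        ll = 0.0
        for x, x_next, t in zip(states[:-1], states[1:], times):
            ll += Q[x, x] * t           # exp(-q_i t) term, since Q[i, i] = -q_i
            ll += np.log(Q[x, x_next])  # jump intensity q_{ij}
        return ll

    # Illustrative generator and observed path.
    Q = np.array([[-1.0,  1.0],
                  [ 2.0, -2.0]])
    print(ctmc_path_loglik(Q, states=[0, 1, 0, 1], times=[0.5, 0.2, 1.1]))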

Tue 12.10.

Learning for ordinary discrete-time Markov chains (DTMCs), ML and Bayesian inference (notes by Gu; Koski. Hidden Markov models for bioinformatics. Kluwer, 2001). Higher-order DTMCs and their learning; these two excerpts from the HMM book were discussed in detail during the lecture: part 1, part 2.
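
A sketch of the conjugate analysis recapped in the Gu notes, using a made-up path: each row of the transition matrix gets an independent Dirichlet prior, so the posterior of a row is again Dirichlet, with the observed transition counts added to the prior pseudocounts.

    import numpy as np

    def transition_counts(path, k):
        """Count the observed transitions n[i, j] along a path with k states."""
        n = np.zeros((k, k))
        for a, b in zip(path[:-1], path[1:]):
            n[a, b] += 1
        return n

    # Illustrative observed path over the states {0, 1}.
    path = [0, 0, 1, 0, 1, 1, 1, 0, 0, 1]
    n = transition_counts(path, k=2)

    # ML estimate: normalize the rows of the count matrix.
    P_ml = n / n.sum(axis=1, keepdims=True)

    # Bayesian estimate: a Dirichlet(alpha, ..., alpha) prior on each row gives
    # the posterior Dirichlet(n[i, :] + alpha), whose mean is computed below.
    alpha = 1.0
    P_bayes = (n + alpha) / (n + alpha).sum(axis=1, keepdims=True)
    print(P_ml, P_bayes, sep="\n")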

Wed 13.10.

Variable-length Markov chains, theory and learning; see Mächler & Bühlmann. DTMCs continued; examples of weight matrices and word distributions are presented in these slides, which were kindly provided by Professor Timo Koski at KTH.
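
Koski's slides are not reproduced here; as an assumed illustration of the weight-matrix idea, the sketch below scores a DNA word against a made-up position weight matrix relative to a uniform background, treating the motif positions as independent.

    import numpy as np

    # Made-up position weight matrix for a motif of length 4: rows index the
    # bases (A, C, G, T) and column j gives the base probabilities at position j.
    pwm = np.array([[0.7, 0.1, 0.1, 0.1],   # A
                    [0.1, 0.6, 0.1, 0.2],   # C
                    [0.1, 0.2, 0.7, 0.1],   # G
                    [0.1, 0.1, 0.1, 0.6]])  # T
    base_index = {"A": 0, "C": 1, "G": 2, "T": 3}
    background = np.array([0.25, 0.25, 0.25, 0.25])  # uniform background model

    def log_odds(word):
        """Log-odds of a word under the motif model versus the background."""
        idx = [base_index[b] for b in word]
        return sum(np.log(pwm[i, j] / background[i]) for j, i in enumerate(idx))

    print(log_odds("ACGT"))  # high score: the most probable base at each position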

Thu 14.10.

Hidden Markov models: general theory and applications, e.g. the noisy Markov model and HMMs in DNA segmentation (Braun & Müller; Koski. Hidden Markov models for bioinformatics. Kluwer, 2001). The central mathematical properties related to the backward and forward recursions for Derin's algorithm are proven. These two excerpts from the HMM book were discussed in detail during the lecture: part 1, part 2.
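
A minimal sketch of the forward recursion for a small made-up two-state HMM (the parameters are illustrative, not from the lecture): alpha_t(i) = p(o_1, ..., o_t, X_t = i), updated with one matrix-vector product per observation; the observation likelihood is the sum of the final alphas.

    import numpy as np

    # Made-up two-state HMM: A[i, j] are hidden-state transition probabilities,
    # B[i, k] are emission probabilities, init is the initial distribution.
    A = np.array([[0.8, 0.2],
                  [0.3, 0.7]])
    B = np.array([[0.9, 0.1],
                  [0.2, 0.8]])
    init = np.array([0.5, 0.5])

    def forward(obs):
        """Forward recursion; returns p(o_1, ..., o_T) = sum_i alpha_T(i)."""
        alpha = init * B[:, obs[0]]
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
        return alpha.sum()

    print(forward([0, 0, 1, 0]))  # likelihood of a short observation sequence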

Exams

NB! The deadline for the assignments is extended to the end of the 3rd period!

There will be no written exam; the credits can be gained by successfully completing at least 60% of the assignments listed below. Grading is based on the number of returned assignments and the quality of the solutions. Solutions must be returned to the lecturer by the end of study period III (2011).

Assignments I.
Assignments II.
Assignments III.
Assignments IV.

Bibliography

Various references will be used during the course. Parts of the following books will be considered:

Baclawski, K. Introduction to probability with R. Chapman & Hall, 2008.
Koski, T. Hidden Markov models for bioinformatics. Kluwer, 2001.
Koski, T. & Noble, J.M. Bayesian networks: An introduction. Wiley, 2009.
Koski, T. Lectures at RNI on Probabilistic Models and Inference for Phylogenetics. Free e-book available here.
Jaynes, E.T. Probability theory: The logic of science. Cambridge University Press, 2003. Available in print and online.

In addition, we will consider a number of articles and tutorials (articles not directly linked here are generally available from the JSTOR collection or are otherwise online):

Braun, J.V. & Müller, H.-G. Statistical methods for DNA sequence segmentation. Statistical Science, 13, 142-162, 1998.
Sirl, D. Markov Chains: An Introduction/Review. pdf.
Norris, J. Markov chains. Cambridge University Press, 1997; see online resource.
Gu, L. Notes on Dirichlet distribution with relatives. This document provides a concise recapitulation of some of the central formulas needed in the assignments for Bayesian learning; more comprehensive derivations can be found in several books on Bayesian modeling, e.g. in Koski & Noble (2009), listed above.
Mächler, M. & Bühlmann, P. Variable length Markov chains: Methodology, computing and software. Journal of Computational and Graphical Statistics, 13, 435-455, 2004. Preprint available here.
Kass, R.E. & Raftery, A.E. Bayes factors. Journal of the American Statistical Association, 90, 773-795, 1995.
Smith, A.F.M. & Gelfand, A.E. Bayesian statistics without tears: A sampling-resampling perspective. The American Statistician, 46, 84-88, 1992.
Jordan, M.I. Graphical models. Statistical Science, 19, 140-155, 2004. Preprint available here.
Frank, O. & Strauss, D. Markov graphs. Journal of the American Statistical Association, 81, 832-842, 1986.
Rabiner, L.R. A tutorial on hidden Markov models and selected applications in speech recognition. Proceedings of the IEEE, 77, 257-286, 1989.

Registration for Markovian modeling and Bayesian learning

Registration for Introduction to probability with R

Did you forget to register? What to do.
