
= Markovian modelling and Bayesian learning, fall 2012 =

=== Lecturer ===

[[Jukka Corander>>doc:mathstatHenkilokunta.Corander, Jukka]]

=== Scope ===

5 cu.

=== Type ===

Advanced studies

=== Prerequisites ===

Basic calculus, linear algebra, and an introductory course on probability and statistical inference are absolutely necessary. First-course-level knowledge of algebra, probability and inference is recommended for many parts of the course.

=== Lectures ===

Weeks 44-50, Tuesday 12-14 and Thursday 12-14 in room B120. NB! No lectures on Nov 1st, Nov 15th, or Dec 4th. These lectures are replaced by ADDITIONAL lectures whose time and place will be announced later on this webpage.

=== Exercises ===

During weeks 45-49 there will be a weekly exercise session in room B120 on Thursdays 14-16, except on Dec 6th, which is a public holiday. The exercise session for that week is held in room C124 on Friday, Dec 7th, 12-14. The teacher responsible for the exercise sessions is [[Lu Cheng>>url:http://wiki.helsinki.fi/display/mathstatHenkilokunta/Cheng%2C+Lu||shape="rect"]] (first.last@helsinki.fi).

Exercises for week 45 are available [[here>>url:http://www.helsinki.fi/bsg/filer/Exercises1.pdf||shape="rect"]]
Exercises for week 46 are available [[here>>url:http://www.helsinki.fi/bsg/filer/Exercises2.pdf||shape="rect"]]
Exercises for week 47 are available [[here>>url:http://www.helsinki.fi/bsg/filer/Exercises3.pdf||shape="rect"]]
Exercises for week 48 are available [[here>>url:http://www.helsinki.fi/bsg/filer/Exercises4.pdf||shape="rect"]]
Exercises for week 49 are available [[here>>url:http://www.helsinki.fi/bsg/filer/Exercises5.pdf||shape="rect"]]

=== Exams ===

To gain the credits for this course, it is necessary to complete at least 50% of the exercises and a home exam. Additional solved exercises yield bonus points for the grade. The home exam consists of a number of larger assignments that must be returned to the lecturer by May 1st, 2013. Home exam assignments are available [[here>>url:http://www.helsinki.fi/bsg/filer/HomeExam.pdf||shape="rect"]].

=== Preliminary lecture diary ===

Week 44:
Tue
[[Teaser trailer>>url:http://www.helsinki.fi/bsg/filer/Journey2MarkovianLands.pdf||shape="rect"]], [[Eye-opener>>url:http://www.helsinki.fi/bsg/filer/SistersParadoxByBayes.pdf||shape="rect"]] on conditional probabilities and Bayes' theorem, basic properties of Markov chains. [[This excerpt>>url:http://www.helsinki.fi/bsg/filer/Koski2.pdf||shape="rect"]] from the HMM book by T. Koski is mainly used during the lectures, together with [[this short excerpt>>url:http://www.helsinki.fi/bsg/filer/IsaacsonBasics.pdf||shape="rect"]] on periodicity from the book Markov Chains by Isaacson & Madsen. For further illustrations and mathematical details on Markov chains, see the links to Sirl and Norris in the Bibliography.
Thu
Basic properties of Markov chains continued. To get going with the basics of simulating Markov chains, you might find [[these Matlab codes>>url:http://www-math.bgsu.edu/z/ap/||shape="rect"]] useful; a small Python sketch is also given below.
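
As a complement to the Matlab codes linked above, here is a minimal Python sketch of simulating a discrete-time Markov chain from a given transition matrix; the two-state matrix used here is only an illustrative example, not one from the lectures.

{{code language="python"}}
import numpy as np

def simulate_dtmc(P, init, n_steps, rng=None):
    """Simulate a discrete-time Markov chain with transition matrix P and
    initial distribution init for n_steps steps; returns the visited states."""
    rng = np.random.default_rng() if rng is None else rng
    states = np.arange(P.shape[0])
    x = rng.choice(states, p=init)        # draw the initial state
    path = [x]
    for _ in range(n_steps):
        x = rng.choice(states, p=P[x])    # next state drawn from row x of P
        path.append(x)
    return np.array(path)

# Illustrative two-state chain (not taken from the course material)
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
path = simulate_dtmc(P, init=np.array([1.0, 0.0]), n_steps=5000,
                     rng=np.random.default_rng(1))
print("empirical state frequencies:", np.bincount(path, minlength=2) / len(path))
{{/code}}

For this particular chain the stationary distribution is (0.8, 0.2), so for a long enough simulated path the printed empirical frequencies should be close to these values.
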
Week 45:
Tue
Properties of Markov chains continued. Basics of ML and Bayesian learning, see [[this excerpt>>url:http://www.helsinki.fi/bsg/filer/Koski1.pdf||shape="rect"]] from the HMM book by T. Koski.
Thu
Statistical learning for DTMCs, see [[this excerpt>>url:http://www.helsinki.fi/bsg/filer/Koski3.pdf||shape="rect"]] from the HMM book by T. Koski. Also, [[this appendix>>url:http://www.helsinki.fi/bsg/filer/KoskiCh3Appendix.pdf||shape="rect"]] from the HMM book is useful for refreshing details on various distributions. A small estimation sketch is given below.
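
As an illustration of this week's estimation theme, here is a hedged Python sketch of maximum-likelihood and Bayesian estimation of DTMC transition probabilities from a single observed path. The uniform Dirichlet(1, ..., 1) prior and the toy path are assumptions made only for this example, not choices taken from the lectures or exercises.

{{code language="python"}}
import numpy as np

def transition_counts(path, n_states):
    """Count the observed transitions n_ij along a single observed path."""
    N = np.zeros((n_states, n_states))
    for a, b in zip(path[:-1], path[1:]):
        N[a, b] += 1
    return N

def ml_estimate(N):
    """Row-wise ML estimate p_ij = n_ij / n_i (rows with no visits stay zero)."""
    rows = N.sum(axis=1, keepdims=True)
    return np.divide(N, rows, out=np.zeros_like(N), where=rows > 0)

def posterior_mean(N, alpha=1.0):
    """Posterior mean under an independent Dirichlet(alpha, ..., alpha) prior on
    each row: E[p_ij | data] = (n_ij + alpha) / (n_i + K * alpha)."""
    K = N.shape[0]
    return (N + alpha) / (N.sum(axis=1, keepdims=True) + K * alpha)

# Toy observed path over the states {0, 1, 2} (illustrative only)
path = np.array([0, 0, 1, 2, 1, 0, 0, 1, 1, 2, 2, 0])
N = transition_counts(path, n_states=3)
print("ML estimate:\n", ml_estimate(N))
print("Posterior mean (alpha = 1):\n", posterior_mean(N, alpha=1.0))
{{/code}}

The posterior-mean formula (n_ij + alpha) / (n_i + K alpha) follows from the conjugacy of the Dirichlet prior with the multinomial likelihood of each transition-matrix row.
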
Week 46:
Tue
[[A primer on Occam's razor and Bayesian model comparison for Markov chains>>url:http://www.helsinki.fi/bsg/filer/PrimerOnOccham.pdf||shape="rect"]]; [[the information theory book by D. MacKay, whose Ch. 28 contains a detailed explanation of the Occam's razor principle and Bayesian model comparison>>url:http://www.inference.phy.cam.ac.uk/mackay/itila/||shape="rect"]]; Bayesian learning of the order of a DTMC; continuous-time Markov chains (see the [[e-book by Koski>>url:http://www.ep.liu.se/ea/lsm/2004/001/||shape="rect"]]). A small numerical sketch of order comparison is given at the end of this week's entry.
Thu
Continuous-time Markov chains.
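
As a numerical companion to the Occam's razor primer, here is a minimal Python sketch comparing a zeroth-order (iid) model and a first-order DTMC for a binary sequence via their marginal likelihoods under Dirichlet(1, ..., 1) priors. The closed-form Dirichlet-multinomial marginal likelihood used below is standard; the prior choice and the simulated toy data are assumptions made only for this illustration.

{{code language="python"}}
import numpy as np
from scipy.special import gammaln

def log_marglik_order1(path, K, alpha=1.0):
    """Log marginal likelihood of x_2, ..., x_T given x_1 under a first-order DTMC
    with independent Dirichlet(alpha, ..., alpha) priors on the rows of P."""
    N = np.zeros((K, K))
    for a, b in zip(path[:-1], path[1:]):
        N[a, b] += 1
    out = 0.0
    for i in range(K):
        out += gammaln(K * alpha) - gammaln(K * alpha + N[i].sum())
        out += np.sum(gammaln(alpha + N[i]) - gammaln(alpha))
    return out

def log_marglik_order0(path, K, alpha=1.0):
    """Same data (x_2, ..., x_T), but modelled as iid categorical draws with a
    single Dirichlet(alpha, ..., alpha) prior, i.e. a zeroth-order chain."""
    counts = np.bincount(path[1:], minlength=K).astype(float)
    out = gammaln(K * alpha) - gammaln(K * alpha + counts.sum())
    out += np.sum(gammaln(alpha + counts) - gammaln(alpha))
    return out

# Toy binary sequence simulated with clear serial dependence (illustrative only)
rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1], [0.2, 0.8]])
x = [0]
for _ in range(200):
    x.append(rng.choice(2, p=P[x[-1]]))
x = np.array(x)

log_bf = log_marglik_order1(x, K=2) - log_marglik_order0(x, K=2)
print("log Bayes factor (order 1 vs order 0):", log_bf)
{{/code}}

A positive log Bayes factor favours the first-order chain, which is what one should expect here because the data were simulated with strong serial dependence; for data without such dependence the simpler zeroth-order model typically wins, in the spirit of Occam's razor.
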
Week 47:
Continuous-time Markov chains, basic properties of hidden Markov models, see [[Ch. 10>>url:http://www.helsinki.fi/bsg/filer/KoskiBookHMM1.pdf||shape="rect"]], [[Ch. 12>>url:http://www.helsinki.fi/bsg/filer/KoskiCh12.pdf||shape="rect"]], [[Ch. 13>>url:http://www.helsinki.fi/bsg/filer/KoskiBookHMM2.pdf||shape="rect"]], [[Ch. 14>>url:http://www.helsinki.fi/bsg/filer/KoskiCh14.pdf||shape="rect"]] from the HMM book. [[An example of using HMMs in classification>>url:http://dx.doi.org/10.1016/j.jspi.2012.07.013||shape="rect"]]. A small CTMC simulation sketch is given below.
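
For the continuous-time part, here is a minimal Python sketch of simulating a CTMC from a generator (rate) matrix Q by drawing exponential holding times and jumping according to the embedded jump chain. The three-state generator below is an arbitrary illustrative example, not one discussed in the lectures.

{{code language="python"}}
import numpy as np

def simulate_ctmc(Q, x0, t_max, rng=None):
    """Simulate a CTMC with generator Q, starting from state x0, up to time t_max.
    Returns the jump times and the states entered at those times."""
    rng = np.random.default_rng() if rng is None else rng
    times, states = [0.0], [x0]
    t, x = 0.0, x0
    while True:
        rate = -Q[x, x]                   # total rate of leaving state x
        if rate <= 0:                     # absorbing state: stay forever
            break
        t += rng.exponential(1.0 / rate)  # exponential holding time in state x
        if t > t_max:
            break
        probs = Q[x].copy()
        probs[x] = 0.0
        probs /= probs.sum()              # jump chain: proportional to off-diagonal rates
        x = rng.choice(len(probs), p=probs)
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)

# Illustrative 3-state generator (each row sums to zero)
Q = np.array([[-1.0,  0.7,  0.3],
              [ 0.5, -0.9,  0.4],
              [ 0.2,  0.6, -0.8]])
times, states = simulate_ctmc(Q, x0=0, t_max=10.0, rng=np.random.default_rng(2))
print(list(zip(np.round(times, 2), states)))
{{/code}}
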
Weeks 48-49:
Week 50: CTMCs and HMMs continued, variable length Markov chains (see the article by Mächler & Bühlmann in the Bibliography). A small sketch of the HMM forward algorithm is given below.
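
To accompany the HMM chapters, here is a minimal Python sketch of the forward algorithm (with scaling) for computing the log-likelihood of an observation sequence under a discrete HMM. The two-state, three-symbol model below is an illustrative assumption, not an example taken from the HMM book.

{{code language="python"}}
import numpy as np

def hmm_forward_loglik(obs, pi, A, B):
    """Forward algorithm with scaling: returns log p(obs) for a discrete HMM with
    initial distribution pi, transition matrix A and emission matrix B."""
    alpha = pi * B[:, obs[0]]          # joint of the first observation and hidden state
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()               # rescale to avoid numerical underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate the hidden state, weight by emission
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik

# Illustrative 2-state HMM over an alphabet of 3 symbols
pi = np.array([0.6, 0.4])
A = np.array([[0.8, 0.2],
              [0.3, 0.7]])
B = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.3, 0.6]])
obs = np.array([0, 0, 1, 2, 2, 1, 0])
print("log-likelihood of the observation sequence:", hmm_forward_loglik(obs, pi, A, B))
{{/code}}

The scaling constants collected at each step are the conditional likelihoods p(o_t | o_1, ..., o_(t-1)), so their logarithms sum to the full log-likelihood of the sequence.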

=== Bibliography ===

Various references will be used during the course. The lecture diary will also include links to some additional materials. Parts of the following books will be considered:

Baclawski, Kenneth. Introduction to probability with R. Chapman & Hall, 2008.
Koski, Timo. Hidden Markov models for bioinformatics. Kluwer, 2001.
Koski, Timo & Noble, John M. Bayesian networks: An introduction. Wiley, 2009.
Koski, Timo. Lectures at RNI on Probabilistic Models and Inference for Phylogenetics. Free e-book available [[here>>url:http://www.ep.liu.se/ea/lsm/2004/001/||shape="rect"]].

In addition, we will consider a number of articles & tutorials (articles not directly linked here are generally available from the JSTOR collection or are otherwise online):

Braun, J.V. & Müller, H.-G. Statistical methods for DNA sequence segmentation. Statistical Science, 13, 142-162, 1998.
Sirl, D. Markov Chains: An Introduction/Review. [[pdf>>url:http://www.maths.uq.edu.au/MASCOS/Markov05/Sirl.pdf||shape="rect"]].
Norris, J. Markov chains. CUP, [[see online resource>>url:http://www.statslab.cam.ac.uk/~~james/Markov/||shape="rect"]].
Gu, L. [[Notes on Dirichlet distribution with relatives>>url:http://www.cs.cmu.edu/~~epxing/Class/10701-08s/recitation/dirichlet.pdf||shape="rect"]]. This document provides a concise recapitulation of some of the central formulas needed in the exercises and assignments when doing Bayesian learning. More comprehensive derivations can be found in several books on Bayesian modelling, e.g. in Koski & Noble (2009), listed above.
Mächler, M. & Bühlmann, P. Variable length Markov chains: Methodology, computing and software. Journal of Computational and Graphical Statistics, 13, 435-455, 2004. Preprint available [[here>>url:ftp://stat.ethz.ch/Research-Reports/104.html||shape="rect"]].
Kass, R.E. & Raftery, A.E. Bayes factors. Journal of the American Statistical Association, 90, 773-795, 1995.
Smith, A.F.M. & Gelfand, A.E. Bayesian statistics without tears: A sampling-resampling perspective. The American Statistician, 46, 84-88, 1992.
Jordan, M.I. Graphical models. Statistical Science, 19, 140-155, 2004. Preprint available [[here>>url:http://www.cs.berkeley.edu/~~jordan/papers/statsci.ps||shape="rect"]].

=== [[Registration>>url:https://oodi-www.it.helsinki.fi/hy/opintjakstied.jsp?html=1&Tunniste=57059||shape="rect"]] ===

Did you forget to register? [[What to do>>doc:mathstatOpiskelu.Kysymys4]].

=== Exercise groups ===

|=Group|=Day|=Time|=Place|=Instructor
|1.| | | | 