Study Group, spring 2012

# Bayesian core: a practical approach to computational Bayesian statistics (study group)


### Scope


4 weekly exercises (1 for everybody plus 3 exercises of your own choice) are needed to collect credits.

### Prerequisites


The minimal prerequisites for this course are mastery of basic probability theory for discrete and continuous variables and of basic statistics (MLE, sufficient statistics).
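As a refresher on the MLE prerequisite, here is a minimal sketch (not from the book; the sample data are invented) of the maximum-likelihood estimates for an i.i.d. normal sample: the MLE of the mean is the sample average, and the MLE of the variance divides by n, without the n-1 correction.

```python
# MLE for an i.i.d. normal sample N(mu, sigma^2):
# mu_hat = sample mean, sigma2_hat = average squared deviation (biased, no n-1).
data = [2.0, 4.0, 6.0]  # toy data for illustration
n = len(data)
mu_hat = sum(data) / n
sigma2_hat = sum((x - mu_hat) ** 2 for x in data) / n
print(mu_hat, sigma2_hat)  # 4.0 and 8/3
```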

### Lectures


Room C131 on Thursdays 10-12 during period IV (on May 3 in room C130).

### Content scheme


The purpose of this book is to provide a self-contained entry to practical & computational Bayesian statistics using generic examples from the most common models. The emphasis on practice is a strong feature of this book in that its primary audience is made of graduate students that need to use (Bayesian) statistics as a tool to analyze their experiments and/or datasets. The book should also appeal to scientists in all fields, given the versatility of the Bayesian tools. It can also be used for a more classical statistics audience when aiming at teaching a quick entry to Bayesian statistics at the end of an undergraduate program for instance.


The chapters of the book and their topics are:

2. Normal models
   • Conditional distributions, priors, posteriors, improper priors, conjugate priors, exponential families, tests, Bayes factors, decision theory, importance sampling
3. Regression and variable selection
   • G-priors, noninformative priors, Gibbs sampling, variable selection
4. Generalised linear models
   • Probit, logit and log-linear models, Metropolis-Hastings algorithms, model choice
5. Capture-recapture experiments
   • Sampling models, open populations, accept-reject algorithm, Arnason-Schwarz model
6. Mixture models
   • Completion, variable dimensional models, label switching, tempering, reversible jump MCMC
7. Dynamic models
   • AR, MA and ARMA models, state-space representation, hidden Markov models, forward-backward algorithm
8. Image analysis
   • k-nearest-neighbor, supervised classification, segmentation, Markov random fields, Potts model
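To give a flavour of the simulation methods listed above, here is a minimal random-walk Metropolis-Hastings sketch in Python (a hedged illustration, not the book's own R code; the toy target, data, and step scale are invented for this example). It samples the posterior of a normal mean with known unit variance and a flat prior, so the acceptance ratio reduces to the ratio of posterior densities.

```python
import math
import random

def metropolis_hastings(log_post, x0, n_iter=20000, scale=1.0, seed=0):
    """Random-walk Metropolis-Hastings: propose x' = x + scale * eps with
    eps ~ N(0, 1); accept with probability min(1, post(x') / post(x))."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, scale)
        lp_prop = log_post(prop)
        # Symmetric proposal, so the Hastings ratio is just the posterior ratio.
        if rng.random() < math.exp(min(0.0, lp_prop - lp)):
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Toy target: posterior of a normal mean mu with sigma = 1 known and a flat
# prior, given the data below; analytically this is N(mean(data), 1/len(data)).
data = [1.2, 0.8, 1.5, 0.9, 1.1]
def log_post(mu):
    return -0.5 * sum((d - mu) ** 2 for d in data)

draws = metropolis_hastings(log_post, x0=0.0)
burned = draws[5000:]  # discard burn-in
print(sum(burned) / len(burned))  # close to mean(data) = 1.1
```

The same skeleton carries over to the probit and logit models of the generalised linear models chapter: only `log_post` changes.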
Readings (and required exercise for everyone):

• Week 1 (31.01.2012): Chapter 2
• Week 2 (07.02.2012): - (exercise 2.22)
• Week 3 (14.02.2012): Chapter 3 up to 3.3 (exercise 3.4)
• Week 4 (21.02.2012): Chapter 3 from 3.3 to end (exercise 3.13)
• Week 5 (28.02.2012): No meeting because of exams
• Week 6 (06.03.2012): No meeting because of period break
• Week 7 (15.03.2012): All of Chapter 4 (exercise 4.4)
• Week 8 (22.03.2012): All of Chapter 5 (exercise 5.5)
• Week 9 (29.03.2012): Chapter 6 up to 6.6 (exercise 6.1)
• Week 10 (05.04.2012): No meeting because of Easter break
• Week 11 (12.04.2012): Chapter 6 from 6.6 to end (exercise 6.17)
• Week 12 (19.04.2012): Chapter 7 up to 7.2.2 (exercise 7.9)
• Week 13 (26.04.2012): Chapter 8 up to 8.3

### Bibliography
