Computational statistics, fall 2016
- The website will be continuously updated from now on.
- We will have to change the lecture times for Nov 14-17 of the II teaching period because I'll be abroad.
- On Wednesday Sep 20 we finished conjugate analysis and looked at Gibbs sampling.
- Exercise 0 (solutions)
- Exercise 1 (solutions)
- Exercise 2
- Exercise 3
- Exercise 4
- Exercise 5
- Assignment (deadline is Dec 9, 2016)
Exercises are to be solved before each session. The solutions, their implementation, and related theory concepts will be discussed during each session. Solving exercises earns additional points, which will be added to your points from the course exams according to the formula max( 0, floor( ( n - 2 ) / 5 ) ). A list will be circulated during each session.
- Predictive density
- Inverse transform sampling
- Accept-reject sampling
- Gibbs sampling and mean-field VB approximation
- Matched curvature candidate density in independent Metropolis-Hastings sampling
The exam for the I-period part will be held on Oct 24, 2016 at 10:00 in D122.
Home assignment (II-period part)
The home assignment report (2-3 pages, not including the title page) documents, together with your C++ code, how you solved the problem. The deadline is Dec 9, 2016.
There will be several examples showing how the methods can be implemented in the R system for statistical computing. R is convenient for us since it is freely available and widely used, it enables easy visualization of results, and it contains simulation functions for many distributions. However, the methods are in no way tied to the R environment and can be used just as easily in many other environments (such as Matlab together with its statistics toolbox).
- Petri Koistinen, Computational statistics. 2013. Chapters 1-4.
- Petri Koistinen, Computational statistics. 2013. Chapters 5-6.
- Petri Koistinen, Computational statistics. 2013. Chapters 7-11.
- Review of probability and Bayesian inference
- Methods for generating independent samples from distributions
- Classical Monte Carlo integration and importance sampling
- Approximating the posterior distribution using numerical quadrature or Laplace expansion
- MCMC methods: Gibbs and Metropolis-Hastings sampling
- Auxiliary variable methods in MCMC
- EM algorithm
- Multi-model inference
- MCMC theory
This part of the course is about implementing a computational method for statistical variable selection. The implementation will be carried out in the C++ programming language, but you do not need to know C++ as a prerequisite for participating in the course. Several examples and datasets will be used to illustrate the underlying methodology and to get you started in C++.
Course feedback can be given at any point during the course.