Nonlinear dimensionality reduction
Nonlinear dimensionality reduction, Fall 2015
Teacher: Zhirong Yang, HIIT (zhirong.yang at helsinki.fi)
Scope: 5 credits (ECTS)
Type: advanced studies
Teaching: seminar. To pass the course:
- attend at least 80% of the classes
- make an oral presentation on a chosen topic
- small project: implement the method from another topic and write a report
- review two reports by other students
Grading: attendance (10%), reviewing (20%), small project (30%), presentation (40%)
Topics: Many real-world data objects are represented by a large number of values. For example, images are originally given as a large number of pixel values. In data mining and machine learning, we need to discover the intrinsic embedding manifold of the data, which is often nonlinear and low-dimensional. This helps us understand and extract the patterns of interest. If the manifold has a low enough dimension, the data can be visualized in the low-dimensional space. This seminar aims to equip students with the basic knowledge of nonlinear dimensionality reduction (NLDR). We will also discuss some recent methods at the frontier of this field.
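As a concrete illustration (not part of the official course material), here is a minimal sketch of NLDR used for visualization. It assumes Python with scikit-learn and matplotlib installed, and it embeds 64-dimensional handwritten-digit images into two dimensions with Isomap, one of the classical NLDR methods covered in the Lee and Verleysen textbook.

    # Minimal NLDR sketch (assumes scikit-learn and matplotlib are available):
    # embed 8x8 handwritten-digit images (64 dimensions) into 2 dimensions
    # with Isomap and plot the result, colored by digit class.
    import matplotlib.pyplot as plt
    from sklearn.datasets import load_digits
    from sklearn.manifold import Isomap

    digits = load_digits()              # 1797 images, each a 64-dimensional vector
    X, y = digits.data, digits.target

    # Isomap: build a k-nearest-neighbor graph, approximate geodesic distances
    # along the graph, then embed the points with classical MDS.
    embedding = Isomap(n_neighbors=10, n_components=2)
    X_2d = embedding.fit_transform(X)   # shape (1797, 2)

    plt.scatter(X_2d[:, 0], X_2d[:, 1], c=y, cmap="tab10", s=5)
    plt.colorbar(label="digit class")
    plt.title("Isomap embedding of the digits data (illustration only)")
    plt.show()

If the intrinsic manifold of the digit images is indeed low-dimensional, images of the same digit tend to form clusters in the resulting 2-D plot; swapping Isomap for another method (e.g., locally linear embedding or t-SNE) only changes the single estimator line.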
Material: textbook (PDF): John A. Lee and Michel Verleysen, Nonlinear Dimensionality Reduction, Springer, 2007, plus a collection of research articles.
Teaching schedule
Periods I and II, Wednesdays 14-16, classroom C129, Exactum
The first class (introductory lecture + practicals) is on 9 September 2015. (It was originally scheduled for 2 September but was postponed due to an overlap with the COIN day event.)
More up-to-date information is available in Moodle.
Registration
Did you forget to register? What to do?
Course feedback
Course feedback can be given at any point during the course.