
Nonlinear dimensionality reduction, Fall 2015


Teacher: Zhirong Yang, HIIT (zhirong.yang at …)
Scope: 5 op

Type: advanced studies

Teaching: seminar. To pass the course:

  1. attend at least 80% of the classes
  2. make an oral presentation on a chosen topic
  3. small project: implement the method in another topic and write a report
  4. review two reports by other students
  • Grading: attendance (10%), reviewing (20%), small project (30%), presentation (40%)

Topics: Many real-world data objects are represented by high-dimensional vectors. For example, images are originally given as large numbers of pixel values. In data mining and machine learning, we need to discover the intrinsic embedding manifold of the data, which is often nonlinear and low-dimensional. This helps us understand and extract the patterns of interest. If the manifold is of low enough dimension, the data can be visualized in the low-dimensional space. This seminar aims to equip students with the basic knowledge of nonlinear dimensionality reduction (NLDR). We will also discuss some frontier methods in this field.
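As an illustration of the kind of method the seminar covers (not part of the course material), here is a minimal Isomap-style sketch in plain NumPy: build a k-nearest-neighbor graph, approximate geodesic distances along the manifold with Floyd-Warshall, then embed with classical MDS. The `isomap` function name and its parameters are illustrative choices, not taken from the textbook.

```python
import numpy as np

def isomap(X, n_neighbors=8, n_components=2):
    """Sketch of Isomap: kNN graph -> geodesic distances -> classical MDS."""
    n = X.shape[0]
    # Pairwise Euclidean distances in the original high-dimensional space.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # kNN graph: keep distances to the nearest neighbors, inf elsewhere.
    G = np.full((n, n), np.inf)
    idx = np.argsort(D, axis=1)[:, 1:n_neighbors + 1]  # skip self at column 0
    for i in range(n):
        G[i, idx[i]] = D[i, idx[i]]
        G[idx[i], i] = D[i, idx[i]]  # symmetrize the graph
    np.fill_diagonal(G, 0.0)
    # Geodesic (shortest-path) distances via Floyd-Warshall.
    for k in range(n):
        G = np.minimum(G, G[:, [k]] + G[[k], :])
    # Classical MDS on the squared geodesic distances.
    H = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * H @ (G ** 2) @ H
    w, V = np.linalg.eigh(B)
    order = np.argsort(w)[::-1][:n_components]  # largest eigenvalues first
    return V[:, order] * np.sqrt(np.maximum(w[order], 0.0))

# Usage: unroll a noisy 3-D spiral into 2-D.
t = np.linspace(0.0, 3 * np.pi, 60)
rng = np.random.default_rng(0)
X = np.column_stack([t * np.cos(t), t * np.sin(t), rng.normal(0, 0.05, 60)])
Y = isomap(X, n_neighbors=8, n_components=2)
```

The geodesic step is what makes the method nonlinear: distances are measured along the neighborhood graph rather than straight through the ambient space, so a curled-up manifold can be flattened.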

Material: PDF textbook: John A. Lee and Michel Verleysen, Nonlinear Dimensionality Reduction, Springer, 2007, plus a collection of research articles.


Teaching schedule

I and II periods, Wednesdays 14-16, classroom C129, Exactum

The first class (introductory lecture + practicals) is on 9 September, 2015. (Originally it was 2 September, but it was postponed due to the overlap with the COIN day event.)

Up-to-date information is available in Moodle.



Did you forget to register? What to do?


Course feedback

Course feedback can be given at any point during the course.
