Pattern Recognition and Exercises

Lecturer: Kenichiro ISHII, Professor
Department: School of Engineering / Graduate School of Engineering, 2011 Spring
Recommended for: School of Informatics and Science (33 hours / session, one session / week, 15 weeks / semester)

Key Features

  1. To provide a better understanding of the lectures, two kinds of exercises are given: solving pattern recognition problems by hand, and solving them using a computer. These are assigned in the first hour (lecture time) and the second hour (exercise time), respectively.
  2. The exercises use hand-printed characters written by the students themselves during the course, so that students understand the technology as fully as possible.
  3. During each lecture, some application examples and related topics are introduced through audio-visual demonstrations.
  4. After several sessions have been completed, students assess the course through questionnaires, and the results are used to improve the remaining sessions.

Course Objectives

The major objectives of this course are to understand the basic ideas of pattern recognition and to acquire the skills to solve actual problems using classification/learning algorithms.

Course Outline

This course consists of two parts. The first is a lecture, in which technical explanations are accompanied by exercise problems. The second is a computer exercise, in which students solve pattern recognition problems on a computer.

Sufficient knowledge of linear algebra, probability theory, and statistics is required. Some programming skill is preferable in order to carry out the computer simulations in the exercises.
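As a rough illustration of the kind of computer exercise described above, and not material distributed in the course, the following Python sketch applies the nearest neighbor rule covered in sessions 1-2 to toy two-dimensional feature vectors; the data, class labels, and function name are assumptions invented for this example.

    import numpy as np

    def nearest_neighbor_classify(prototypes, labels, x):
        """Assign x the label of the closest prototype (Euclidean distance)."""
        distances = np.linalg.norm(prototypes - x, axis=1)
        return labels[int(np.argmin(distances))]

    # Toy feature vectors standing in for features extracted from hand-printed characters.
    prototypes = np.array([[0.2, 0.9],   # prototype of hypothetical class "A"
                           [0.8, 0.1]])  # prototype of hypothetical class "B"
    labels = ["A", "B"]

    test_vector = np.array([0.3, 0.7])
    print(nearest_neighbor_classify(prototypes, labels, test_vector))  # prints "A"

In an actual exercise, the hand-coded prototypes above would be replaced by feature vectors extracted from the students' own hand-printed characters.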

Course Schedule

Session  Contents
1 pattern recognition system, feature extraction, feature vector
2 prototype, nearest neighbor rule, linear discriminant function
3 perceptron learning rule, weight space, solution region
4 perceptron convergence theorem, dimension size, sample size
5 majority voting, piecewise linear discriminant function
6 Widrow-Hoff learning rule, multiple regression analysis
7 error estimation and perceptron
8 back propagation method, neural network, feature evaluation
9 transformation of feature space, Fisher's method
10 K-L expansion
11 empirical probability, subjective probability, Bayes theorem
12 Bayesian updating, Bayesian estimation
13 Bayes decision rule, Bayes error
14 parameter estimation by maximum likelihood method
15 examination
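
As a similarly hedged companion to sessions 3 and 4 in the schedule above, the following minimal Python sketch implements the perceptron learning rule on a small linearly separable data set; the data, labels, and function name are assumptions made for illustration only.

    import numpy as np

    def perceptron_train(X, y, epochs=100):
        """Perceptron learning rule: whenever a sample (with a bias component
        appended) is misclassified, add y_i * x_i to the weight vector."""
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # augment with bias term
        w = np.zeros(Xb.shape[1])
        for _ in range(epochs):
            errors = 0
            for xi, yi in zip(Xb, y):
                if yi * np.dot(w, xi) <= 0:  # misclassified or on the boundary
                    w += yi * xi             # perceptron update
                    errors += 1
            if errors == 0:                  # converged: every sample is correct
                break
        return w

    # Linearly separable toy data with labels in {-1, +1}.
    X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
    y = np.array([1, 1, -1, -1])
    w = perceptron_train(X, y)
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    print(np.sign(Xb @ w))  # expected: [ 1.  1. -1. -1.]

The perceptron convergence theorem discussed in session 4 guarantees that this training loop terminates for linearly separable data.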

Grading

Grading is based on reports and the final examination.


Last updated

May 10, 2020