# DON BOSCO INSTITUTE OF TECHNOLOGY

## DEPARTMENT OF INFORMATION SCIENCE & ENGINEERING

| Course Code | Course Title | Core/Elective | Prerequisite |
| --- | --- | --- | --- |
| 10IS664 | Pattern Recognition | Core | Probability Theory, Linear Algebra, Multivariate Statistics |

| Contact Hours (L) | T | P | Total Hrs/Sessions |
| --- | --- | --- | --- |
| 4 | 1 | – | 52 |

## Course Objectives
On completion of this subject, students will be expected to:

- Explain and compare a variety of pattern classification, structural pattern recognition, and pattern classifier combination techniques.
- Apply performance evaluation methods for pattern recognition, and critique comparisons of techniques made in the research literature.
- Apply pattern recognition techniques to real-world problems such as document analysis and recognition.
- Implement simple pattern classifiers, classifier combinations, and structural pattern recognizers.
## Syllabus

### PART – A

#### UNIT – 1 (6 Hours)
Introduction: Machine perception, an example; Pattern Recognition System; The Design Cycle;
Learning and Adaptation.
#### UNIT – 2 (7 Hours)

Bayesian Decision Theory: Introduction; Bayesian Decision Theory – Continuous Features; Minimum-Error-Rate Classification; Classifiers, Discriminant Functions, and Decision Surfaces; The Normal Density; Discriminant Functions for the Normal Density.
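As an illustration of the discriminant functions for the normal density covered above, here is a minimal sketch of minimum-error-rate classification with Gaussian class-conditional densities. The means, covariances, and priors are illustrative values, not taken from the syllabus:

```python
import numpy as np

def gaussian_discriminant(x, mean, cov, prior):
    """g_i(x) = ln p(x|w_i) + ln P(w_i) for a multivariate normal density."""
    d = len(mean)
    diff = x - mean
    inv = np.linalg.inv(cov)
    return (-0.5 * diff @ inv @ diff
            - 0.5 * d * np.log(2 * np.pi)
            - 0.5 * np.log(np.linalg.det(cov))
            + np.log(prior))

# Two illustrative classes with equal priors
x = np.array([1.0, 1.0])
g1 = gaussian_discriminant(x, np.array([0.0, 0.0]), np.eye(2), 0.5)
g2 = gaussian_discriminant(x, np.array([3.0, 3.0]), np.eye(2), 0.5)
decision = 1 if g1 > g2 else 2  # decide the class with the larger g_i(x)
```

With equal covariances and priors, as here, this rule reduces to assigning x to the nearest class mean.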
#### UNIT – 3 (7 Hours)
Maximum-likelihood and Bayesian Parameter Estimation: Introduction; Maximum-likelihood
estimation; Bayesian Estimation; Bayesian parameter estimation: Gaussian Case, general theory; Hidden
Markov Models.
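Maximum-likelihood estimation in the Gaussian case, as covered in this unit, has a closed form: the sample mean and the (biased) sample variance. A minimal sketch with illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.normal(loc=5.0, scale=2.0, size=10_000)

# Maximum-likelihood estimates for a univariate normal:
# mu_hat = sample mean, sigma2_hat = biased sample variance (divide by n)
mu_hat = samples.mean()
sigma2_hat = ((samples - mu_hat) ** 2).mean()
```

With 10,000 samples the estimates land close to the true parameters (mean 5, variance 4), illustrating the consistency of the ML estimator.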
#### UNIT – 4 (6 Hours)

Non-parametric Techniques: Introduction; Density Estimation; Parzen Windows; k_n-Nearest-Neighbor Estimation; The Nearest-Neighbor Rule; Metrics and Nearest-Neighbor Classification.
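A minimal sketch of Parzen-window density estimation from this unit, using a Gaussian kernel; the data, window width, and evaluation point are illustrative:

```python
import numpy as np

def parzen_estimate(x, samples, h):
    """Parzen-window density estimate at x with a Gaussian kernel of width h."""
    n = len(samples)
    k = np.exp(-0.5 * ((x - samples) / h) ** 2) / np.sqrt(2 * np.pi)
    return k.sum() / (n * h)

rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, size=5_000)
p_hat = parzen_estimate(0.0, data, h=0.3)  # true standard-normal density at 0 is ~0.3989
```

Smaller h gives a spikier, lower-bias estimate; larger h smooths more — the bias-variance trade-off that also appears in the Course Outcomes.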
### PART – B

#### UNIT – 5 (7 Hours)
Linear Discriminant Functions: Introduction; Linear Discriminant Functions and Decision Surfaces; Generalized Linear Discriminant Functions; The Two-Category Linearly Separable Case; Minimizing the Perceptron Criterion Function; Relaxation Procedures; Non-separable Behavior; Minimum Squared-Error Procedures; The Ho-Kashyap Procedures.
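The fixed-increment rule for minimizing the perceptron criterion, applied to a small linearly separable set (the data here are illustrative), can be sketched as:

```python
import numpy as np

def perceptron_train(X, y, lr=1.0, epochs=100):
    """Fixed-increment rule for the perceptron criterion J_p(a).
    X: samples (n, d); y: labels in {-1, +1}. Returns an augmented weight vector."""
    Xa = np.hstack([np.ones((len(X), 1)), X])  # augment with a bias component
    a = np.zeros(Xa.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(Xa, y):
            if yi * (a @ xi) <= 0:             # misclassified sample
                a += lr * yi * xi
                errors += 1
        if errors == 0:                        # converged: separable case
            break
    return a

X = np.array([[0.0, 0.0], [0.0, 1.0], [2.0, 2.0], [3.0, 2.0]])
y = np.array([-1, -1, 1, 1])
a = perceptron_train(X, y)
preds = np.sign(np.hstack([np.ones((4, 1)), X]) @ a)
```

For linearly separable data this procedure is guaranteed to converge in a finite number of corrections; for the non-separable case the unit's squared-error and Ho-Kashyap procedures apply instead.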
#### UNIT – 6 (6 Hours)
Stochastic Methods: Introduction; Stochastic Search; Boltzmann Learning; Boltzmann Networks and
Graphical Models; Evolutionary Methods.
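One stochastic search method in the spirit of this unit is simulated annealing, which accepts uphill moves with probability exp(-Δ/T) under a cooling schedule. A minimal sketch with an illustrative objective function and parameters (not from the syllabus):

```python
import math
import random

def anneal(f, x0, t0=5.0, cooling=0.995, steps=4000, seed=42):
    """Simulated annealing over the reals: accept worse moves with prob exp(-delta/T)."""
    rng = random.Random(seed)
    x, t = x0, t0
    best = x
    for _ in range(steps):
        cand = x + rng.uniform(-0.5, 0.5)       # random local perturbation
        delta = f(cand) - f(x)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x = cand
            if f(x) < f(best):
                best = x
        t *= cooling                            # cooling schedule
    return best

# Illustrative multimodal objective with many local minima
f = lambda x: x * x + 3 * math.sin(5 * x)
x_best = anneal(f, x0=4.0)
```

At high temperature the search explores broadly; as T falls it behaves greedily, which is what lets it escape the local minima that trap pure descent methods.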
#### UNIT – 7 (6 Hours)
Non-Metric Methods: Introduction; Decision Trees; CART; Other Tree Methods; Recognition with
Strings; Grammatical Methods.
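Recognition with strings rests on string-distance computations such as the Levenshtein edit distance. A minimal sketch, with a nearest-prototype rule over illustrative strings:

```python
def edit_distance(s, t):
    """Levenshtein distance via dynamic programming over a (m+1) x (n+1) table."""
    m, n = len(s), len(t)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

# Nearest-neighbor recognition over strings: pick the closest prototype
prototypes = {"pattern": "class_A", "recognition": "class_B"}
query = "patern"
label = min(prototypes, key=lambda p: edit_distance(query, p))
```

Here the misspelled query is one edit away from "pattern", so it is assigned that prototype's class.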
#### UNIT – 8 (7 Hours)

Unsupervised Learning and Clustering: Introduction; Mixture Densities and Identifiability; Maximum-Likelihood Estimates; Application to Normal Mixtures; Unsupervised Bayesian Learning; Data Description and Clustering; Criterion Functions for Clustering.
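The sum-of-squared-error criterion function from this unit is minimized locally by k-means, which alternates nearest-center assignment with mean updates. A minimal sketch on illustrative two-cluster data:

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Plain k-means: alternate nearest-center assignment and mean updates."""
    # Deterministic init for the demo: k samples evenly spaced through the data
    centers = X[:: max(1, len(X) // k)][:k].astype(float).copy()
    for _ in range(iters):
        # assign each sample to its nearest center (squared Euclidean distance)
        labels = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        # move each center to the mean of its assigned samples
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)),   # cluster near (0, 0)
               rng.normal(5, 0.5, (50, 2))])  # cluster near (5, 5)
centers, labels = kmeans(X, k=2)
```

Because k-means only finds a local minimum of the criterion, results depend on initialization; the evenly spaced initialization here is just a simple deterministic choice for the demo.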
## Text Books

1. Richard O. Duda, Peter E. Hart, and David G. Stork: Pattern Classification, 2nd Edition, Wiley-Interscience, 2001.
## Reference Books

1. Earl Gose, Richard Johnsonbaugh, Steve Jost: Pattern Recognition and Image Analysis, PHI, Indian Reprint 2008.
## URLs – Text Books, Notes, Multimedia Content

1. http://www.cs.rit.edu/~rlaz/20092/
2. http://www.cs.rit.edu/~rlaz/prec20092/Resources.html
## Course Outcomes

On completion of this subject, students will be able to:

- Formulate and describe various applications in pattern recognition.
- Understand the Bayesian approach to pattern recognition.
- Mathematically derive, construct, and utilize Bayesian and non-Bayesian classifiers, both theoretically and practically.
- Understand basic concepts such as the central limit theorem, the curse of dimensionality, the bias-variance dilemma, and cross-validation.
- Validate and assess different clustering techniques.
- Apply various dimensionality reduction methods, whether through feature selection or feature extraction.
- Assess classifier complexity and regularization parameters.
- Combine various classifiers using fixed rules or trained combiners, and boost their performance.
Internal Assessment Marks (50): Three internal assessment tests are conducted during the semester; marks are allotted based on the average of the best two performances, reduced to 25 marks.

External Marks (100): Students have to answer 5 out of 8 questions, choosing at least 2 of the 4 questions from PART – A, at least 2 of the 4 questions from PART – B, and 1 question from either part.
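Assuming each internal test is scored out of 50 (an assumption; the sheet does not state the per-test maximum), the internal assessment scheme above can be computed as:

```python
def internal_assessment(test_marks, max_per_test=50, reduced_to=25):
    """Average of the best two of three test scores, scaled to `reduced_to` marks.
    max_per_test=50 is an assumed per-test maximum, not stated in the scheme."""
    best_two = sorted(test_marks, reverse=True)[:2]
    avg = sum(best_two) / 2
    return avg * reduced_to / max_per_test

marks = internal_assessment([38, 42, 30])  # three illustrative test scores
```

For the illustrative scores, the best two (42 and 38) average to 40 out of 50, which scales to 20 out of 25.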
## Program Outcomes Mapping with Course

| Subject Name | a | b | c | d | e | f | g | h | i | j | k | l |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| PATTERN RECOGNITION | 4 | 4 | 2 | 4 | 4 | 3 | 2 | 4 | 4 | 2 | 1 | 1 |

Note: 4 = Strong Contribution, 3 = Average Contribution, 2 = Weak Contribution, 1 = No Contribution.
## Program Educational Objectives Mapping with Course

| Subject Name | PEO1 | PEO2 | PEO3 | PEO4 | PEO5 |
| --- | --- | --- | --- | --- | --- |
| PATTERN RECOGNITION | 4 | 3 | 4 | 2 | 4 |

Note: 4 = Strong Contribution, 3 = Average Contribution, 2 = Weak Contribution, 1 = No Contribution.