Monday, 23 December 2013

BM2401 PATTERN RECOGNITION AND NEURAL NETWORKS SYLLABUS R-2008




BM2401 PATTERN RECOGNITION AND NEURAL NETWORKS L T P C
3 0 0 3

UNIT I INTRODUCTION AND SIMPLE NEURAL NET 9
Elementary neurophysiology and biological neural networks - Artificial neural networks: architecture, biases and thresholds, Hebb net, Perceptron, Adaline and Madaline.
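The Hebb net, Perceptron and Adaline covered in this unit are single-layer nets trained by simple error-driven weight updates. Below is a minimal sketch of the perceptron learning rule; the toy AND-gate data, bipolar coding and learning rate are illustrative assumptions, not part of the syllabus.

```python
import numpy as np

# Toy AND-gate data (assumed): bipolar inputs and targets
X = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]])
t = np.array([1, -1, -1, -1])

w = np.zeros(2)   # weights
b = 0.0           # bias (threshold treated as a trainable bias)
lr = 1.0          # learning rate (assumed)

for epoch in range(10):
    for x_i, t_i in zip(X, t):
        y = 1 if (w @ x_i + b) >= 0 else -1   # step activation
        if y != t_i:                           # update only on misclassification
            w += lr * t_i * x_i
            b += lr * t_i

print("weights:", w, "bias:", b)
```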

UNIT II BACK PROPAGATION AND ASSOCIATIVE MEMORY 9
Back propagation network, generalized delta rule, Bidirectional associative memory, Hopfield network.
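A compact sketch of the generalized delta rule for a one-hidden-layer back propagation network trained on XOR; the network size, learning rate and epoch count are assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)
lr = 0.5                        # learning rate (assumed)

for epoch in range(10000):
    # forward pass
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    # generalized delta rule: error signal times sigmoid derivative
    dY = (Y - T) * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    # gradient-descent weight updates
    W2 -= lr * H.T @ dY;  b2 -= lr * dY.sum(axis=0)
    W1 -= lr * X.T @ dH;  b1 -= lr * dH.sum(axis=0)

print(np.round(Y, 2))   # usually approaches [0, 1, 1, 0]
```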

UNIT III NEURAL NETWORKS BASED ON COMPETITION 9
Kohonen self-organising map, Learning Vector Quantisation, counter propagation network.
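The competitive nets in this unit share a winner-take-all step followed by a neighbourhood update. A minimal sketch of the Kohonen self-organising map update for a 1-D chain of units; the map size, neighbourhood radius and learning-rate schedule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.random((200, 2))          # assumed 2-D input samples
weights = rng.random((10, 2))        # 10 map units arranged in a 1-D chain

for step in range(1000):
    x = data[rng.integers(len(data))]
    lr = 0.5 * (1 - step / 1000)                    # decaying learning rate
    radius = max(1, int(3 * (1 - step / 1000)))     # shrinking neighbourhood
    winner = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
    for j in range(len(weights)):
        if abs(j - winner) <= radius:               # units inside the neighbourhood
            weights[j] += lr * (x - weights[j])     # move toward the input

print(np.round(weights, 2))
```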

UNIT IV UNSUPERVISED LEARNING AND CLUSTERING ANALYSIS 9
Patterns and features, training and learning in pattern recognition, discriminant functions, different types of pattern recognition. Unsupervised learning - hierarchical clustering, partitional clustering. Neural pattern recognition approach - perceptron model.
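Partitional clustering in this unit is commonly illustrated with k-means. A short sketch follows; the value of k, the toy data and the iteration count are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
# Two assumed Gaussian blobs as toy data
data = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])

k = 2
centers = data[rng.choice(len(data), k, replace=False)]   # random initial centres

for _ in range(20):
    # assign each point to its nearest centre
    labels = np.argmin(np.linalg.norm(data[:, None] - centers[None], axis=2), axis=1)
    # recompute each centre as the mean of its assigned points (keep old centre if empty)
    centers = np.array([data[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])

print(np.round(centers, 2))
```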

UNIT V SUPERVISED LEARNING USING PARAMETRIC AND NON-PARAMETRIC APPROACH 9
Bayesian classifier, non-parametric density estimation, histograms, kernels, window estimators, k-nearest neighbour classifier, estimation of error rates.
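A minimal sketch of the k-nearest neighbour classifier from this unit, classifying one query point by majority vote; the toy two-class data and k = 3 are assumptions.

```python
import numpy as np
from collections import Counter

# Toy two-class training data (assumed)
train_X = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, 0.3],
                    [1.0, 1.0], [0.9, 1.2], [1.1, 0.8]])
train_y = np.array([0, 0, 0, 1, 1, 1])

def knn_predict(x, k=3):
    # Euclidean distance from the query to every training sample
    d = np.linalg.norm(train_X - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]          # labels of the k closest points
    return Counter(nearest).most_common(1)[0][0]  # majority vote

print(knn_predict(np.array([0.95, 0.9])))   # expected class 1
```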

TOTAL: 45 PERIODS

TEXT BOOKS:
1. Hagan, Demuth and Beale, "Neural Network Design", Vikas Publishing House Pvt. Ltd., New Delhi, 2002.
2. Freeman J.A. and Skapura D.M., "Neural Networks: Algorithms, Applications and Programming Techniques", Addison-Wesley, 2003.
3. Duda R.O. and Hart P.E., "Pattern Classification and Scene Analysis", Wiley, 2000.
4. Earl Gose, Richard Johnsonbaugh and Steve Jost, "Pattern Recognition and Image Analysis", Prentice Hall of India Pvt. Ltd., New Delhi, 1999.

REFERENCES:
1. Robert Schalkoff, "Pattern Recognition: Statistical, Structural and Neural Approaches", John Wiley and Sons (Asia) Pte. Ltd., Singapore, 2005.
2. Laurene Fausett, "Fundamentals of Neural Networks: Architectures, Algorithms and Applications", Prentice Hall, 1994.
