Volkan Cevher



ENEE 633 Statistical and Neural Pattern Recognition



DHS 10. Unsupervised Learning and Clustering. I also encourage all of you to focus on graph-theoretic approaches to clustering problems. Recommended reading: manifold learning methods. I enjoyed the talk on the inherent dimensionality of problems that Prof. Alfred Hero presented earlier this semester at CFAR.
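
To make the graph-theoretic angle concrete, here is a minimal spectral-embedding sketch (my own illustration, not from DHS; the Gaussian affinity and the bandwidth are arbitrary assumptions):

```python
import numpy as np

def spectral_embedding(X, sigma=1.0, k=2):
    """Embed points with eigenvectors of the normalized graph Laplacian
    of a Gaussian similarity graph; clustering the rows of the embedding
    (e.g., with k-means) gives a graph-based partition of the data."""
    # Pairwise squared distances and Gaussian affinities W.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Normalized Laplacian L = I - D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))
    L = np.eye(len(X)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    # Eigenvectors for the k smallest eigenvalues form the embedding.
    _, vecs = np.linalg.eigh(L)
    return vecs[:, :k]

# Two well-separated blobs embed into clearly separated rows.
rng = np.random.default_rng(6)
X = np.vstack([rng.standard_normal((30, 2)) - 3, rng.standard_normal((30, 2)) + 3])
print(spectral_embedding(X).shape)  # (60, 2)
```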


DHS 6.7 and 6.8. Practical aspects of neural networks.


DHS 6.1-6.6. Multilayer Neural Networks. Recommended reading: Mehrotra, Mohan, and Ranka. Also, take a look at the books and links section here.
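
A toy two-layer sigmoid network trained with plain backpropagation, for anyone who wants a numerical anchor before the lectures (my own sketch; the architecture, step size, and data are arbitrary choices, not DHS's):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))              # inputs
t = (X.sum(axis=1, keepdims=True) > 0) * 1.0   # targets in {0, 1}
W1 = 0.1 * rng.standard_normal((3, 5))         # input-to-hidden weights
W2 = 0.1 * rng.standard_normal((5, 1))         # hidden-to-output weights
lr = 0.5

for epoch in range(500):
    # Forward pass: sigmoid hidden layer, sigmoid output.
    h = 1.0 / (1.0 + np.exp(-X @ W1))
    y = 1.0 / (1.0 + np.exp(-h @ W2))
    # Backward pass for the squared error J = 0.5 * ||y - t||^2.
    delta2 = (y - t) * y * (1.0 - y)           # output-layer sensitivity
    delta1 = (delta2 @ W2.T) * h * (1.0 - h)   # hidden-layer sensitivity
    W2 -= lr * h.T @ delta2 / len(X)
    W1 -= lr * X.T @ delta1 / len(X)

print("final mean squared error:", float(((y - t) ** 2).mean()))
```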

Third and Final Homework: DHS: 6.21, 25, 26 (10pts each) and CE2 (20pts). (Due date: 18DEC06)


Project discussions. Project due date is extended. New due date: 4DEC06.

No class till 4DEC06. I will be away for the ASC.

No class till 22NOV06.


Bootstrapping: Introduction to the Metropolis-Hastings algorithm and the Gibbs sampler. Recommended reading for Markov chain Monte Carlo sampling methods: Hastings, Chib and Greenberg, and Casella and George. From B. Efron's bootstrapping paper: "I also wish to thank the many friends who suggested names more colorful than Bootstrap, including Swiss Army Knife, Meat Axe, Swan-Dive, Jack-Rabbit, and my personal favorite, the Shotgun, which, to paraphrase Tukey, 'can blow the head off any problem if the statistician can stand the resulting mess.'"
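
To fix ideas before the reading, a minimal random-walk Metropolis-Hastings sampler (my sketch; the step size and the standard-normal target are assumptions, and the Gibbs sampler is the special case where one proposes from the full conditionals and always accepts):

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_steps=5000, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings with a symmetric Gaussian proposal,
    so the acceptance ratio reduces to target(x') / target(x)."""
    rng = np.random.default_rng(seed)
    x, samples = x0, []
    for _ in range(n_steps):
        x_prop = x + step * rng.standard_normal()
        # Accept with probability min(1, p(x') / p(x)), done in log space.
        if np.log(rng.uniform()) < log_target(x_prop) - log_target(x):
            x = x_prop
        samples.append(x)
    return np.array(samples)

# Target: standard normal, specified by its log density up to a constant.
draws = metropolis_hastings(lambda x: -0.5 * x ** 2, x0=0.0)
print(draws[1000:].mean(), draws[1000:].std())  # roughly 0 and 1
```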

Project Data Files (due date 22NOV06):

data.mat, POSE.mat, GENDER.zip


Bootstrapping: I will give an introduction to particle filters. To obtain a copy of the slides, please bring a flash drive to my office (AVW #4409); including the video files, the presentation material occupies 100MB of space. Recommended reading for particle filters: take a look at the papers on this site. Focus first on the papers by Doucet, in chronological order. Also see Liu and Chen, and Isard and Blake.
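
As a preview, a bootstrap particle filter in a few lines (my own sketch of the Gordon-Salmond-Smith scheme on a toy scalar model; the 0.9 dynamics and unit noise variances are illustrative assumptions, not from the recommended papers):

```python
import numpy as np

def bootstrap_pf(y, n_particles=500, seed=0):
    """Bootstrap particle filter for the toy state-space model
    x_t = 0.9 x_{t-1} + v_t,  y_t = x_t + w_t,  v_t, w_t ~ N(0, 1)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)      # initial particle cloud
    means = []
    for y_t in y:
        # Propagate through the dynamics (proposal = prior: "bootstrap").
        x = 0.9 * x + rng.standard_normal(n_particles)
        # Weight by the observation likelihood, then normalize.
        w = np.exp(-0.5 * (y_t - x) ** 2)
        w /= w.sum()
        means.append(np.sum(w * x))           # filtered mean estimate
        # Multinomial resampling to combat weight degeneracy.
        x = x[rng.choice(n_particles, size=n_particles, p=w)]
    return np.array(means)

# Simulate from the same model and run the filter on it.
rng = np.random.default_rng(1)
x_true, ys = 0.0, []
for _ in range(50):
    x_true = 0.9 * x_true + rng.standard_normal()
    ys.append(x_true + rng.standard_normal())
print(bootstrap_pf(ys)[-5:])
```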


DHS 9. Read the Wikipedia entries on the No Free Lunch Theorem, the Ugly Duckling Theorem, and Berry's paradox. MDL reference: MDL on the web.


DHS 5.11. Support Vector Machines. Recommended reading: Burges.
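
DHS develop SVMs through the margin and dual formulation; as a complementary sketch, here is stochastic subgradient descent on the soft-margin primal (a Pegasos-style trainer; the regularization constant and toy data are my assumptions, not the book's):

```python
import numpy as np

def train_linear_svm(X, y, lam=0.1, n_epochs=20, seed=0):
    """Minimize lam/2 ||w||^2 + mean_i max(0, 1 - y_i w.x_i) by cycling
    through the samples and taking subgradient steps (no bias term)."""
    rng = np.random.default_rng(seed)
    w, t = np.zeros(X.shape[1]), 0
    for _ in range(n_epochs):
        for i in rng.permutation(len(X)):
            t += 1
            eta = 1.0 / (lam * t)        # standard decaying step size
            margin = y[i] * (w @ X[i])
            w *= 1.0 - eta * lam         # gradient of the regularizer
            if margin < 1.0:             # inside the margin: hinge active
                w += eta * y[i] * X[i]
    return w

# Toy data: two Gaussian blobs with labels in {-1, +1}.
rng = np.random.default_rng(2)
y = rng.choice([-1.0, 1.0], size=200)
X = rng.standard_normal((200, 2)) + 2.0 * y[:, None]
w = train_linear_svm(X, y)
print("training accuracy:", float(np.mean(np.sign(X @ w) == y)))
```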


Project discussions.


Project proposals are due 16OCT06.


DHS 5.1-5.8. Recommended reading for optimization methods: Boyd and Vandenberghe (a complete online convex optimization book). My personal favorites on the subject: Nocedal and Wright/Bertsekas (lecture notes)/Nemirovski (lecture notes).
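
To accompany the linear-discriminant sections, a fixed-increment single-sample perceptron in augmented-vector notation (the update rule follows DHS Ch. 5; the toy data and stopping criterion are my own choices):

```python
import numpy as np

def perceptron(X, y, max_epochs=100):
    """Cycle through the samples and correct on each misclassification:
    a <- a + y_i * x_i whenever y_i * a.x_i <= 0 (fixed increment)."""
    Xa = np.hstack([X, np.ones((len(X), 1))])   # append bias feature
    a = np.zeros(Xa.shape[1])
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(Xa, y):
            if yi * (a @ xi) <= 0.0:            # misclassified or on boundary
                a += yi * xi
                errors += 1
        if errors == 0:                         # converged: data separated
            break
    return a

# Linearly separable toy data with labels in {-1, +1}.
rng = np.random.default_rng(3)
X = np.vstack([rng.standard_normal((50, 2)) + 2, rng.standard_normal((50, 2)) - 2])
y = np.hstack([np.ones(50), -np.ones(50)])
a = perceptron(X, y)
Xa = np.hstack([X, np.ones((100, 1))])
print("separated:", bool(np.all(y * (Xa @ a) > 0)))
```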


DHS 4. Recommended reading for Parzen windows: Parzen. Recommended reading for k-nearest neighbor classification: Fukunaga and Hostetler, Short and Fukunaga, and Cover and Hart. On the bias of nearest neighbor error estimates: Fukunaga and Hummels, IEEE PAMI, vol. 9, no. 1, January 1987, pp. 103-112.
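
A one-dimensional Parzen-window estimate with a Gaussian kernel, to make the formula p_n(x) = (1/n) sum_i (1/h) K((x - x_i)/h) concrete (my sketch; the bandwidth and sample size are arbitrary):

```python
import numpy as np

def parzen_density(x_query, samples, h=0.5):
    """Gaussian-kernel Parzen-window density estimate at a single point."""
    u = (x_query - samples) / h
    kernel = np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)
    return kernel.mean() / h

# Estimate the N(0, 1) density from 1000 samples and compare to the truth.
rng = np.random.default_rng(4)
data = rng.standard_normal(1000)
for x in (0.0, 1.0, 2.0):
    truth = np.exp(-0.5 * x ** 2) / np.sqrt(2.0 * np.pi)
    print(x, parzen_density(x, data), truth)
```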


DHS 3.10. Hidden Markov Models. Recommended reading: Rabiner. There is also a good tutorial about the Viterbi algorithm by Forney.
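
A compact Viterbi decoder in log space, as a companion to the tutorials (my sketch; the two-state toy numbers are invented, not taken from Rabiner or Forney):

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Most likely hidden-state path for a discrete HMM.
    pi: initial probs (S,), A: transitions (S, S),
    B: emissions (S, O), obs: list of observation indices."""
    T = len(obs)
    logd = np.log(pi) + np.log(B[:, obs[0]])   # delta_1(s)
    back = np.zeros((T, len(pi)), dtype=int)
    for t in range(1, T):
        cand = logd[:, None] + np.log(A)       # cand[i, j]: best path via i -> j
        back[t] = cand.argmax(axis=0)
        logd = cand.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(logd.argmax())]                # backtrack from best end state
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
print(viterbi(pi, A, B, [0, 0, 1, 1, 0]))
```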

Second Homework: DHS: 17, 29, 36, 39, 40, 44 (10pts each), CE13 (40pts) (download the HMM toolbox for MATLAB). (Due date: 11OCT06; late submissions: -10pts for each day)


DHS 3.8-3.9. Recommended reading: Mixture densities, maximum likelihood, and the EM algorithm by Redner and Walker.

Heads up: Midterm 1 will have one question where I will ask you to use the EM algorithm to calculate the mixture probabilities of a Gaussian mixture distribution. (Hint: Pages 21-24 concentrate on how to determine the mixture probabilities using the EM algorithm.)
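
In the spirit of the hint, a minimal EM iteration that estimates only the mixing probabilities of a one-dimensional Gaussian mixture (my sketch, not the notes' pages 21-24; holding the component means and variances fixed is an assumption made to isolate the mixture weights):

```python
import numpy as np

def em_mixture_weights(x, mu, sigma, n_iters=100):
    """EM for the weights of sum_k w_k N(mu_k, sigma_k^2), mu/sigma known."""
    w = np.full(len(mu), 1.0 / len(mu))          # uniform initialization
    for _ in range(n_iters):
        # E-step: responsibility of component k for each sample
        # (the 1/sqrt(2 pi) constant cancels in the normalization).
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / sigma
        r = w * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: new weights are the average responsibilities.
        w = r.mean(axis=0)
    return w

# 70/30 mixture of N(-2, 1) and N(3, 1); EM should recover [0.7, 0.3].
rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(-2, 1, 700), rng.normal(3, 1, 300)])
print(em_mixture_weights(x, mu=np.array([-2.0, 3.0]), sigma=np.array([1.0, 1.0])))
```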


DHS 3.5-3.7. We will also cover Bernardo's reference priors, the Fisher information matrix, and the information inequality. Recommended reading for sufficient statistics: Ferguson/Lehmann. Recommended reading for the information inequality: Lehmann and Casella/Poor. Recommended reading for reference priors: Bernardo.
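
For reference, the scalar information (Cramér-Rao) inequality in the form given in Lehmann and Casella:

```latex
% For an unbiased estimator \hat{\theta} of \theta, under regularity conditions:
\operatorname{Var}_\theta\bigl(\hat{\theta}\bigr) \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) \;=\; \mathbb{E}_\theta\!\left[
  \left(\frac{\partial}{\partial \theta}\,\log p(X;\theta)\right)^{2}
\right].
```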

First Homework is due.


DHS 3.1-3.4.


DHS 2.6-2.9.

First Homework here (Due date: 18SEPT06)


DHS 2.1-2.5.


First Class. You can find the syllabus here.

Quiz 0.

I will talk about the development of AI, computers, and computer vision. 

Classroom: 4424 @ AV Williams.

Schedule: MW 11:30-1:00

Text book: Duda, Hart, and Stork, "Pattern Classification," 2nd Edition.