ENEE 633 Statistical and
Neural Pattern Recognition
DHS 10 Unsupervised Learning and
Clustering. I also encourage all of you to focus on
graph-theoretic approaches to clustering and on
manifold learning methods. I also enjoyed the talk
on the intrinsic dimensionality of problems that Prof.
Alfred Hero presented earlier this semester at CFAR.
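For the graph-theoretic side, here is a minimal spectral-clustering sketch in Python (numpy). The Gaussian similarity, the bandwidth sigma, and the two-blob test data are illustrative choices of mine, not from the readings:

    import numpy as np

    def spectral_clusters(X, sigma=1.0):
        """Split points into two groups by thresholding the Fiedler vector
        of the unnormalized graph Laplacian L = D - W."""
        # Pairwise squared distances and Gaussian similarities
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        W = np.exp(-d2 / (2.0 * sigma ** 2))
        np.fill_diagonal(W, 0.0)
        L = np.diag(W.sum(1)) - W
        # Eigenvector of the second-smallest eigenvalue (Fiedler vector)
        vals, vecs = np.linalg.eigh(L)
        return (vecs[:, 1] > 0).astype(int)

    # Two well-separated blobs in the plane
    X = np.vstack([np.random.randn(30, 2), np.random.randn(30, 2) + 4.0])
    print(spectral_clusters(X))

The sign of the Fiedler vector approximates the minimum-cut partition of the similarity graph, which is the basic idea behind the graph-theoretic clustering methods mentioned above.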
DHS 6.7 and 6.8 Practical aspects of training multilayer neural networks.
DHS 6.1-6.6 Multilayer Neural Networks.
Recommended reading: Mehrotra, Mohan, and Ranka. Also, take a
look at the books and links section.
Third and Final Homework:
DHS 6.21, 6.25, 6.26 (10pts each) and CE2
(20pts). (Due date: 18DEC06)
Project due date is
extended. New due date: 4DEC06.
No class till 4DEC06. I
will be away for the ASC.
No class till 22NOV06.
Bootstrapping: Introduction to the
Metropolis-Hastings algorithm and the Gibbs sampler
(a small sampler sketch follows the Efron quote below).
Recommended reading for Markov chain Monte Carlo methods:
Chib and Greenberg, and
Casella and George. From
B. Efron's bootstrapping paper: I also wish to
thank the many friends who suggested names more colorful
than Bootstrap, including Swiss Army Knife, Meat Axe,
Swan-Dive, Jack-Rabbit, and my personal favorite, the
Shotgun, which, to paraphrase Tukey, "can blow the
head off any problem if the statistician can stand the recoil."
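As promised above, a minimal random-walk Metropolis-Hastings sketch in Python; the standard normal target and the step size are illustrative assumptions, not part of the readings:

    import numpy as np

    def metropolis_hastings(log_target, n_samples, x0=0.0, step=1.0):
        """Random-walk Metropolis-Hastings for a 1-D target density."""
        rng = np.random.default_rng(0)
        x, samples = x0, []
        for _ in range(n_samples):
            x_new = x + step * rng.standard_normal()
            # Accept with probability min(1, p(x_new)/p(x))
            if np.log(rng.uniform()) < log_target(x_new) - log_target(x):
                x = x_new
            samples.append(x)
        return np.array(samples)

    # Example target: standard normal, known only up to a constant
    samples = metropolis_hastings(lambda x: -0.5 * x * x, 10000)
    print(samples.mean(), samples.std())  # roughly 0 and 1

With a symmetric proposal the Hastings correction cancels, which is why only the ratio of target densities appears in the acceptance test.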
Project Data Files (due
Bootstrapping: I will give an introduction to
particle filters. To obtain a copy of the slides, please
bring a flash drive to my office (AVW #4409). Including
the video files, the presentation material occupies about
100 MB. Recommended reading for particle filters:
take a look at the papers in this list.
Focus on the papers by Doucet first, in chronological order,
then Liu and Chen, and
Isard and Blake.
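A minimal sketch of the bootstrap particle filter in Python, for a toy 1-D random-walk state observed in Gaussian noise; the model and its parameters q and r are illustrative assumptions, not taken from the papers above:

    import numpy as np

    def bootstrap_filter(ys, n_particles=500, q=0.5, r=1.0):
        """Bootstrap particle filter for x_t = x_{t-1} + N(0, q^2),
        y_t = x_t + N(0, r^2). Returns posterior-mean estimates of x_t."""
        rng = np.random.default_rng(0)
        x = rng.standard_normal(n_particles)        # initial particle cloud
        means = []
        for y in ys:
            x = x + q * rng.standard_normal(n_particles)   # propagate
            w = np.exp(-0.5 * ((y - x) / r) ** 2)          # weight by likelihood
            w /= w.sum()
            x = rng.choice(x, size=n_particles, p=w)       # resample
            means.append(x.mean())
        return np.array(means)

    # Simulated track and noisy observations
    rng = np.random.default_rng(1)
    truth = np.cumsum(0.5 * rng.standard_normal(50))
    print(bootstrap_filter(truth + rng.standard_normal(50))[:5])

Resampling at every step is the simplest scheme; the papers above discuss when and how to resample more carefully.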
DHS 9. Read the Wikipedia articles on the No Free Lunch
Theorem, the Ugly Duckling Theorem, and Berry's paradox.
See also MDL (minimum description length) on the web.
DHS 5.11. Support Vector Machines.
Project proposals are due
DHS 5.1-5.8. Recommended reading for convex
optimization: Boyd and Vandenberghe (a complete online convex
optimization book). My personal favorites on the
topic: Nocedal and Wright/Bertsekas.
DHS 4. Recommended reading for Parzen windows:
Parzen. Recommended reading for k-nearest neighbor methods:
Fukunaga and Hostetler,
and Fukunaga, and
Cover and Hart. On the bias of nearest neighbor
error estimates: Fukunaga and Hummels, IEEE PAMI, vol.
9, no. 1, January 1987, pp. 103-112.
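As a small companion to DHS 4, a minimal Parzen-window density estimate in Python with a Gaussian kernel; the bandwidth h and the test data are illustrative choices of mine:

    import numpy as np

    def parzen_density(x, data, h=0.5):
        """Parzen-window estimate p(x) = (1/n) sum_i K((x - x_i)/h) / h
        with a Gaussian kernel K."""
        u = (x - data) / h
        return np.exp(-0.5 * u ** 2).sum() / (len(data) * h * np.sqrt(2 * np.pi))

    data = np.random.randn(1000)        # samples from N(0, 1)
    print(parzen_density(0.0, data))    # close to 1/sqrt(2*pi), about 0.399

A k-nearest-neighbor estimate would instead fix the number of neighbors and let the window volume adapt to the data.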
DHS 3.10. Hidden Markov Models. Recommended
reading: Rabiner. There is also a good tutorial on the
Viterbi algorithm.
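A minimal Viterbi decoder in Python, following the usual notation from Rabiner's tutorial (transition matrix A, emission matrix B, initial distribution pi); the toy two-state model below is my own illustration:

    import numpy as np

    def viterbi(obs, A, B, pi):
        """Most likely state sequence for a discrete HMM (log domain)."""
        T, N = len(obs), len(pi)
        delta = np.log(pi) + np.log(B[:, obs[0]])
        psi = np.zeros((T, N), dtype=int)
        for t in range(1, T):
            scores = delta[:, None] + np.log(A)   # scores[i, j]: state i -> j
            psi[t] = scores.argmax(0)             # best predecessor of each j
            delta = scores.max(0) + np.log(B[:, obs[t]])
        path = [delta.argmax()]
        for t in range(T - 1, 0, -1):             # backtrack
            path.append(psi[t][path[-1]])
        return path[::-1]

    # Toy 2-state, 2-symbol model
    A = np.array([[0.9, 0.1], [0.2, 0.8]])
    B = np.array([[0.8, 0.2], [0.3, 0.7]])
    print(viterbi([0, 0, 1, 1], A, B, np.array([0.5, 0.5])))

Working in the log domain avoids the underflow that plagues the plain product form on long observation sequences.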
Second Homework: DHS 17, 29, 36, 39, 40, 44 (10pts
each), CE13 (40pts) (download the
HMM toolbox for MATLAB). (Due date: 11OCT06; late
submissions: -10pts for each day)
DHS 3.8-3.9. Recommended reading: "Mixture
densities, maximum likelihood, and the EM algorithm" by
Redner and Walker.
Heads up: Midterm 1 will have one
question where I will ask you to use the EM algorithm to
calculate the mixture probabilities of a Gaussian
mixture distribution. (Hint: Pages 21-24 concentrate
on how to determine the mixture probabilities using the
EM algorithm.)
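For practice with that question, a minimal EM sketch in Python that estimates only the mixture probabilities of a two-component Gaussian mixture; holding the means fixed at known values with unit variances is my simplification so that only the mixture-weight update remains:

    import numpy as np

    def em_mixture_weights(x, mus, n_iter=50):
        """EM updates for the mixing probabilities pi_k of a Gaussian
        mixture with known component means and unit variances."""
        K = len(mus)
        pi = np.full(K, 1.0 / K)
        for _ in range(n_iter):
            # E-step: responsibilities gamma[n, k] proportional to
            # pi_k * N(x_n; mu_k, 1); the normalizing constant cancels
            lik = np.exp(-0.5 * (x[:, None] - mus[None, :]) ** 2)
            gamma = pi * lik
            gamma /= gamma.sum(1, keepdims=True)
            # M-step: pi_k is the average responsibility of component k
            pi = gamma.mean(0)
        return pi

    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(-2, 1, 700), rng.normal(2, 1, 300)])
    print(em_mixture_weights(x, np.array([-2.0, 2.0])))  # about [0.7, 0.3]

Each EM iteration does not decrease the likelihood, which is the key property behind the algorithm's convergence.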
DHS 3.5-3.7. We will also cover
Bernardo's reference priors, the Fisher information matrix,
and the information inequality. Recommended reading for
sufficient statistics: Ferguson/Lehmann. Recommended
reading for the information inequality: Lehmann and Casella/Poor.
Recommended reading for reference priors: Bernardo.
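For reference, the information inequality (Cramer-Rao bound) in its standard scalar, unbiased form, written in LaTeX:

    % Fisher information for a scalar parameter theta
    I(\theta) = \mathbb{E}_{\theta}\!\left[\left(\frac{\partial}{\partial\theta}
                \log f(X;\theta)\right)^{2}\right]

    % Information inequality for any unbiased estimator \hat{\theta}
    \operatorname{Var}_{\theta}\!\big(\hat{\theta}(X)\big) \;\ge\; \frac{1}{I(\theta)}

Equality characterizes efficient estimators, and the multiparameter version replaces 1/I(theta) with the inverse of the Fisher information matrix.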
First Homework is available
here. (Due date: 18SEPT06)
First Class. You can find the syllabus here.
I will talk about the development of AI,
computers, and computer vision.
Classroom: 4424 @ AV Williams.
Schedule: MW 11:30-1:00
Textbook: Duda,
Hart, and Stork, "Pattern Classification," 2nd Edition.