This course provides an undergraduate-level introduction to Statistical Learning. It addresses problems such as classification and detection, parameter and model estimation, and clustering, which are common in signal processing, communications, image processing, computer vision, artificial intelligence, speech analysis and recognition, data mining, computational biology, bioinformatics, and related fields.
Instructor: Nuno Vasconcelos (nuno@ece.ucsd.edu), EBU1-5602
Text: Introduction to Machine Learning, Ethem Alpaydin, MIT Press
Syllabus: [pdf]
Homework:
  Problem set 1 [pdf] Not due
  Problem set 2 [pdf, data] Due: Lecture 6
  Problem set 3 [pdf] Due: Lecture 8
  Problem set 4 [pdf, data] Due: Lecture 14
  Problem set 5 [pdf] Due: Lecture 16
  Problem set 6 [pdf] Due: Lecture 18
  Problem set 7 [pdf, libsvm, instructions, example] Due: Lecture 20
Topics:
  Lecture 1: Introduction [slides]
  Lecture 2: Review of linear algebra [slides]
  Lecture 3: Review of linear algebra (continued)
  Lecture 4: Review of probability [slides]
  Lecture 5: Metrics, whitening, nearest neighbors [slides] [NN example]
  Lecture 6: Bayes decision rule [slides]
  Lecture 7: Bayes decision rule [slides]
  Lecture 8: Bayes decision rule (continued)
  Lecture 9: Mid-term review [problems, solutions]
  Lecture 10: Mid-term
  Lecture 11: Maximum likelihood estimation [slides]
  Lecture 12: MLE and regression [slides]
  Lecture 13: MLE and regression (continued)
  Lecture 14: Least squares [slides]
  Lecture 15: Clustering, k-means [slides]
  Lecture 16: Clustering, EM [slides]
  Lecture 17: Principal component analysis [slides]
  Lecture 18: Kernels [slides]
  Lecture 19: Support vector machines [slides]
  Lecture 20: Support vector machines [slides]