UH



Dr. Eick

COSC 6342 Final Exam Review List

Scheduled for Monday, May 5, 2014, 2-4pm

in 201 SEC

last updated: April 25, 5pm

Relevant Topics for the 2014 Final Exam:

1. *** Introduction to Machine Learning; What is Machine Learning? (textbook pages, class transparencies covering this theme; likely an essay question)

2. ** Overfitting, underfitting, generalization error, bias, variance, noise (transparencies, some of which might not be listed below, and the textbook pages that cover these topics)

3. ***** Decision Trees/Regression Trees (class transparencies, textbook pages)

4. **** Hidden Markov Models (class transparencies, textbook pages)

5. ***** Support Vector Machines / Kernels (class transparencies (only those that were actually covered), textbook pages, the first 3 pages of the Smola/Schoelkopf tutorial on SV regression, the Wikipedia pages referenced by the class transparencies)

6. **** Ensembles (class transparencies, mandatory Wikipedia pages listed on the first transparency)

7. **** Non-parametric Density Estimation and Prediction (class transparencies, textbook pages)

8. **** Belief Networks (basic properties; what can they do? How do they differ from naïve Bayesian approaches? Simple probability computations in belief networks, and determining d-separability in simple cases).

9. ** Design and Analysis of Machine Learning Experiments (Textbook pages, class transparencies Topic 11)

There will also be one essay-style question on the exam!

The exam is open book and open notes. The following textbook pages are relevant for the 2014 final exam: 1-14, 37-43, 73-84, 163-172, 174-181, 185-197, 309-338, 363-382 (except 15.8 and 15.9), 387-396, 483-493.

Relevant Powerpoint Slide Sets:

Course Organization ML Spring 2014

Topic 1: Introduction to Machine Learning (Eick/Alpaydin Introduction, Tom Mitchell's Introduction to ML---only slides 1-8 and 15-16 will be used)

Topic 2: Supervised Learning (examples of classification techniques: Decision Trees, k-NN)

Topic 3: Bayesian Decision Theory (excluding Belief Networks) and Naive Bayes (Eick on Naive Bayes)

Topic 4: Using Curve Fitting as an Example to Discuss Major Issues in ML (read Bishop Chapter 1 in conjunction with this material; not covered in 2011)

Topic 5: Parametric Model Estimation

Topic 6: Dimensionality Reduction Centering on PCA (PCA Tutorial, Arindam Banerjee's More Formal Discussion of the Objectives of Dimensionality Reduction)

Topic 7: Clustering1: Mixture Models, K-Means and EM (Introduction to Clustering, Modified Alpaydin transparencies, Top 10 Data Mining Algorithms paper)

Topic 8: Non-Parametric Methods Centering on kNN and Density Estimation (kNN, Non-Parametric Density Estimation, Summary of Non-Parametric Density Estimation, Editing and Condensing Techniques to Enhance kNN, Toussaint's survey paper on editing, condensing, and proximity graphs)

Topic 9: Clustering 2: Density-based Clustering (DBSCAN paper, DENCLUE2 paper)

Topic 10: Decision Trees

Topic 11: Comparing Classifiers

Topic 12: Ensembles: Combining Multiple Learners for Better Accuracy

Topic 13: Support Vector Machines (Eick: Introduction to Support Vector Machines, Alpaydin on Support Vectors and the Use of Support Vector Machines for Regression, PCA, and Outlier Detection (only transparencies that carry the word "cover" will be discussed), Smola/Schoelkopf Tutorial on Support Vector Regression)

Topic 14: More on Kernel Methods (Arindam Banerjee on Kernels, Nuno Vasconcelos Kernel Lecture, Bishop on Kernels; only transparencies 13-25 and 30-35 of the excellent Vasconcelos slides (see his homepage); slides 13-19, 22-24, and 30-35 were covered in 2013)

Topic 15: Naive Bayes and Belief Networks (Eick on Naive Bayes, Eick on Belief Networks (used in the lecture), Bishop on Belief Networks (not used in the lecture, but might be useful for preparing for the final exam))

Topic 16: Successful Application of Machine Learning

Topic 17: Active Learning (might be covered in 2014, if enough time)

Topic 18: Reinforcement Learning (Alpaydin on RL (not used), Eick on RL---try to understand those transparencies; Using Reinforcement Learning for Robot Soccer, Sutton "Ideas of RL" Video (to be shown and discussed in part in 2013), Kaelbling's RL Survey Article---read sections 1, 2, 3, 4.1, 4.2, 8.1 and 9 centering on what was discussed in the lecture)

Topic 19: Brief Introduction to Hidden Markov Models (HMM in BioInformatics)

Topic 20: Computational Learning Theory (Greiner on PAC Learning, ...)

