
AUC: a Better Measure than Accuracy in Comparing Learning Algorithms

Authors:      Charles X. Ling, Department of Computer Science, University of Western Ontario, Canada
              Jin Huang, Department of Computer Science, University of Western Ontario, Canada
              Harry Zhang, Faculty of Computer Science, University of New Brunswick, Canada

Presented by: William Elazmeh, Ottawa-Carleton Institute for Computer Science, Canada


Introduction

- The focus is visualization of a classifier's performance
- Traditionally, performance = predictive accuracy
- Accuracy ignores the probability estimates behind a classification in favor of class labels (a toy sketch follows this list)
- ROC curves show the trade-off between the false positive and true positive rates
- The area under the ROC curve (AUC) is a better measure than accuracy
- AUC serves as a criterion for comparing learning algorithms
- AUC replaces accuracy when comparing classifiers
- Experimental results show that AUC reveals a significant performance difference between decision trees and Naive Bayes
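To make the accuracy-vs-AUC point concrete, here is a minimal Python sketch (the labels and scores are illustrative, not data from the paper): two classifiers that assign identical labels at the 0.5 threshold, and therefore have identical accuracy, can still rank the examples differently and earn different AUC values.

    def auc(y, s):
        """AUC as the probability that a random positive is scored above
        a random negative (ties count one half)."""
        pos = [si for yi, si in zip(y, s) if yi == 1]
        neg = [si for yi, si in zip(y, s) if yi == 0]
        wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
                   for p in pos for n in neg)
        return wins / (len(pos) * len(neg))

    def accuracy(y, s, threshold=0.5):
        """Accuracy after collapsing scores to hard class labels."""
        return sum((si >= threshold) == (yi == 1)
                   for yi, si in zip(y, s)) / len(y)

    y        = [1, 1, 0, 1, 0, 0]
    scores_a = [0.9, 0.8, 0.7, 0.3, 0.2, 0.1]
    scores_b = [0.9, 0.6, 0.8, 0.4, 0.3, 0.2]

    # Both classifiers produce the same labels at threshold 0.5 ...
    print(accuracy(y, scores_a), accuracy(y, scores_b))  # both 4/6 ≈ 0.667
    # ... but A ranks the positives higher, so its AUC is larger.
    print(auc(y, scores_a), auc(y, scores_b))            # 8/9 ≈ 0.889 vs 7/9 ≈ 0.778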

AUC: a Better Measure than Accuracy in Comparing Learning Algorithms 3 /16

Matrices

Confusion Matrix:

                 Actual +    Actual -
   Predicted Y      T+          F+
   Predicted N      F-          T-

where (+) = T+ + F- is the number of actual positives, (-) = F+ + T- the number of actual negatives, and Y = T+ + F+ the number of examples predicted positive.

   F+ Rate           =  F+ / (-)
   T+ Rate (Recall)  =  T+ / (+)
   Precision         =  T+ / Y
   Accuracy          =  (T+ + T-) / ((+) + (-))
   F-Score           =  2 · Precision · Recall / (Precision + Recall)
   Error Rate        =  1 - Accuracy
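These definitions translate directly into code. A minimal Python sketch (the function and variable names are mine, not the slides'):

    def confusion_metrics(tp, fp, fn, tn):
        """Metrics defined on the slide, from the four confusion counts."""
        pos = tp + fn            # (+): all actually positive examples
        neg = fp + tn            # (-): all actually negative examples
        yes = tp + fp            # Y:   all examples predicted positive
        recall = tp / pos        # T+ Rate
        precision = tp / yes
        accuracy = (tp + tn) / (pos + neg)
        return {
            "fpr": fp / neg,     # F+ Rate
            "recall": recall,
            "precision": precision,
            "accuracy": accuracy,
            "f_score": 2 * precision * recall / (precision + recall),
            "error_rate": 1 - accuracy,
        }

    # Example counts, chosen for illustration.
    print(confusion_metrics(tp=40, fp=10, fn=5, tn=45))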


ROC Space

[Figure: ROC space, with True Positive Rate on the y-axis and False Positive Rate on the x-axis, both running from 0 to 1. Classifiers A, B, C, D, E, and F are plotted as points. The trivial classifiers lie on the diagonal, from "all negative" at (0, 0) to "all positive" at (1, 1).]


ROC Curves

[Figure: the ROC curve traced for the 20 scored test examples listed below, True Positive Rate against False Positive Rate, both from 0 to 1. Each point on the curve is labeled with the score threshold (0.9 down to 0.1) at which it is reached.]

     #   Class   Score        #   Class   Score
     1     +     0.9         11     +     0.4
     2     +     0.8         12     -     0.39
     3     -     0.7         13     +     0.38
     4     +     0.6         14     -     0.37
     5     +     0.55        15     -     0.36
     6     +     0.54        16     -     0.35
     7     -     0.53        17     +     0.34
     8     -     0.52        18     -     0.33
     9     +     0.51        19     +     0.30
    10     -     0.505       20     -     0.1
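The table is exactly what is needed to trace the curve: sweep the threshold down the ranked list, moving up for each positive and right for each negative. A Python sketch of that sweep, with the trapezoidal AUC (the 0.68 at the end is my own calculation, not a number from the slides):

    # The 20 (class, score) pairs from the slide, already sorted by score.
    data = [('+', 0.9),  ('+', 0.8),  ('-', 0.7),  ('+', 0.6),  ('+', 0.55),
            ('+', 0.54), ('-', 0.53), ('-', 0.52), ('+', 0.51), ('-', 0.505),
            ('+', 0.4),  ('-', 0.39), ('+', 0.38), ('-', 0.37), ('-', 0.36),
            ('-', 0.35), ('+', 0.34), ('-', 0.33), ('+', 0.30), ('-', 0.1)]

    P = sum(1 for c, _ in data if c == '+')   # 10 positives
    N = len(data) - P                         # 10 negatives

    # Lower the threshold one example at a time: each positive moves the
    # curve up by 1/P, each negative moves it right by 1/N.
    tp = fp = 0
    points = [(0.0, 0.0)]
    for c, _ in data:
        if c == '+':
            tp += 1
        else:
            fp += 1
        points.append((fp / N, tp / P))

    # Trapezoidal rule over the curve (exact here, since all scores are distinct).
    auc = sum((x2 - x1) * (y1 + y2) / 2
              for (x1, y1), (x2, y2) in zip(points, points[1:]))
    print(round(auc, 2))   # 0.68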


ROC Curves

[Figure: an ROC curve plotted in ROC space, True Positive Rate against False Positive Rate, both from 0 to 1.]


Comparing Classifier Performance with ROC

[Figure: the ROC curves of two classifiers plotted in the same ROC space for comparison.]
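In code, this comparison reduces to computing both curves on the same test set. A sketch using scikit-learn (the labels and score vectors are made up for illustration):

    from sklearn.metrics import roc_auc_score, roc_curve

    y_true   = [1, 1, 0, 1, 0, 1, 0, 0, 1, 0]
    scores_a = [0.95, 0.85, 0.75, 0.70, 0.60, 0.55, 0.45, 0.30, 0.25, 0.10]
    scores_b = [0.90, 0.60, 0.80, 0.65, 0.50, 0.70, 0.40, 0.35, 0.55, 0.20]

    # One number per classifier: the area under its ROC curve.
    print("AUC A:", roc_auc_score(y_true, scores_a))
    print("AUC B:", roc_auc_score(y_true, scores_b))

    # The full curves, to check whether one classifier dominates the other
    # at every false positive rate.
    fpr_a, tpr_a, _ = roc_curve(y_true, scores_a)
    fpr_b, tpr_b, _ = roc_curve(y_true, scores_b)

If one curve lies above the other everywhere, that classifier is preferable at every threshold; AUC condenses the comparison into a single number.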


Choosing Between Classifiers with ROC

[Figure: ROC curves of candidate classifiers, used to choose between them; True Positive Rate against False Positive Rate.]
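When two ROC curves cross, neither classifier wins at every threshold, so the choice depends on the operating conditions. One simple selection rule, sketched below as an illustration rather than the slides' prescription, is to fix the false positive rate the application can tolerate and pick the classifier with the higher true positive rate there:

    from sklearn.metrics import roc_curve

    def tpr_at_fpr_budget(y_true, scores, max_fpr):
        """Best achievable true positive rate while keeping the false
        positive rate at or below max_fpr."""
        fpr, tpr, _ = roc_curve(y_true, scores)
        return tpr[fpr <= max_fpr].max()

    # E.g. with at most 10% false positives allowed, prefer the classifier
    # with the higher value of tpr_at_fpr_budget(y_true, scores, 0.10).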
