Receiver Operating Curves


1 Receiver Operating Curves
Organized by Farrokh Alemi, Ph.D. This section provides a brief introduction to Receiver Operating Curves, a tool used to test the predictive accuracy of models. This brief presentation was organized by Dr. Alemi.

2 Sensitivity & Specificity
The accuracy of a predictive model is reported by examining its sensitivity and specificity at various cutoff levels.

3 Posterior Odds True Condition
Posterior Odds:           Present (>Cutoff)   Absent (<=Cutoff)
True Condition Present:   True Positive       False Negative
True Condition Absent:    False Positive      True Negative
Think through a contingency table. On the left side is the true condition that we want to predict. It is either present or absent. When present we call it positive and when absent we call it negative.

4 Posterior Odds True Condition
Posterior Odds:           Present (>Cutoff)   Absent (<=Cutoff)
True Condition Present:   True Positive       False Negative
True Condition Absent:    False Positive      True Negative
On top are the posterior odds of the predictions. This number is determined by the predictive model.

5 Posterior Odds True Condition
Posterior Odds:           Present (>Cutoff)   Absent (<=Cutoff)
True Condition Present:   True Positive       False Negative
True Condition Absent:    False Positive      True Negative
When the odds are above a particular cutoff, we predict that the condition is present.

6 Posterior Odds True Condition
Posterior Odds:           Present (>Cutoff)   Absent (<=Cutoff)
True Condition Present:   True Positive       False Negative
True Condition Absent:    False Positive      True Negative
When the odds are below or equal to the cutoff, we predict that the condition is absent.

7 Posterior Odds True Condition
Posterior Odds:           Present (>Cutoff)   Absent (<=Cutoff)
True Condition Present:   True Positive       False Negative
True Condition Absent:    False Positive      True Negative
True positives refer to times that the model predicts the condition and it has happened: the model predicts that the condition will occur and in truth the condition actually does occur. This count tells us how many times we are accurate in predicting the condition.

8 Posterior Odds True Condition
Posterior Odds:           Present (>Cutoff)   Absent (<=Cutoff)
True Condition Present:   True Positive       False Negative
True Condition Absent:    False Positive      True Negative
True negatives refer to the number of times the condition is absent and we correctly predict that it is absent. The negative here refers to the absence of the condition, not to errors in prediction. The higher the true negatives, the more accurate we are in predicting the absence of the condition.

9 Posterior Odds True Condition
Posterior Odds:           Present (>Cutoff)   Absent (<=Cutoff)
True Condition Present:   True Positive       False Negative
True Condition Absent:    False Positive      True Negative
False negatives refer to errors in which the model predicts the absence of the condition when it is in fact present. False refers to the error in prediction and negative refers to the predicted absence of the condition.

10 Posterior Odds True Condition
Posterior Odds:           Present (>Cutoff)   Absent (<=Cutoff)
True Condition Present:   True Positive       False Negative
True Condition Absent:    False Positive      True Negative
False positives refer to errors in which the model predicts the condition when it is in fact absent. It is called false because it is an erroneous prediction. It is called positive because the model predicts the presence of the condition.

11 Posterior Odds True Condition
Posterior Odds:           Present (>Cutoff)   Absent (<=Cutoff)
True Condition Present:   True Positive       False Negative
True Condition Absent:    False Positive      True Negative
A sensitive index has a high probability of detecting the condition. It is calculated as the portion of cases with the condition present that are correctly detected by the predictive model. The higher the sensitivity, the more accurate we are in predicting the presence of the condition. Sensitivity = True Positives / Total with True Condition Present

12 Posterior Odds True Condition
Posterior Odds:           Present (>Cutoff)   Absent (<=Cutoff)
True Condition Present:   True Positive       False Negative
True Condition Absent:    False Positive      True Negative
A specific index has a high probability of predicting situations where the condition is absent. Specificity is the portion of patients correctly classified among patients who do not have the condition. The higher the specificity, the more accurate we are in predicting the absence of the condition. Specificity = True Negatives / Total with True Condition Absent

13 Accuracy in Predicting Presence, Accuracy in Predicting Absence
Sensitivity: accuracy in predicting presence. Specificity: accuracy in predicting absence. Thus, sensitivity is the accuracy in predicting the condition, which is what we want to predict. High sensitivity means that we detect the condition when it is present. Specificity is accuracy in predicting the opposite: patients who do not have the condition. High specificity means that we do not often mistake other things for the condition.
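The two formulas above can be sketched as short functions. The narration here is in SQL later in the deck; this is just a minimal Python check, using the counts from the prognosis example that follows (46 of 51 dead patients and 39 of 58 alive patients correctly predicted).

```python
# Sensitivity and specificity from the four cells of the contingency table.
def sensitivity(true_positives, false_negatives):
    # Portion of cases with the condition that the model detects.
    return true_positives / (true_positives + false_negatives)

def specificity(true_negatives, false_positives):
    # Portion of cases without the condition that are correctly classified.
    return true_negatives / (true_negatives + false_positives)

# Counts from the worked example later in this deck.
print(round(sensitivity(46, 5), 2))   # 0.9
print(round(specificity(39, 19), 2))  # 0.67
```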

14 Sample Calculations Let us try to calculate sensitivity and specificity for a model that is predicting a patient's prognosis.

15 Posterior Odds of Alive
Posterior Odds of Alive:  Alive (>Cutoff)   Dead (<=Cutoff)
True Condition Alive:     True Positive     False Negative
True Condition Dead:      False Positive    True Negative
If we are predicting whether the patient lives 6 months, then the true condition is the status of the patient at the end of 6 months. The posterior odds of being alive are calculated by the predictive model. If these odds exceed a cutoff constant, then we predict the patient will live 6 months; otherwise we predict that he will die within 6 months. The true positives are the count of patients we predict to live who in fact do live. True negatives are the count of patients who we predict will die and who in fact do die. Sensitivity is the portion of alive patients we accurately predict. Specificity is the portion of dead patients we accurately predict.

16 Probability of Death by True Condition
Probability of Death:   0 to .2   .2 to .4   .4 to .6   .6 to .8   .8 to 1   Total
Alive                        33          6          6         11         2      58
Dead                          3          2          2         11        33      51
Total                        36          8          8         22        35     109
This table gives the predicted probability of death in 6 months at different intervals. We can calculate the sensitivity and specificity of the predictions at different cutoff levels. Each column gives us the count of people who were alive and dead. For example, the first column tells us that of the people who had a probability of death in the range 0 to .2, 33 actually lived and 3 died. Note that in this example death is the condition being predicted, so sensitivity will measure accuracy in predicting dead patients and specificity accuracy in predicting alive patients. We are going to use this table and its data to calculate sensitivity and specificity.

17 Probability of Death by True Condition
Probability of Death:   0 to .2   .2 to .4   .4 to .6   .6 to .8   .8 to 1   Total
Alive                        33          6          6         11         2      58
Dead                          3          2          2         11        33      51
Total                        36          8          8         22        35     109
For example, if we select a cutoff level of .4, then probabilities below it are predicted to live and probabilities above it are predicted to die. Probability levels below .4 are shown in yellow, and above the cutoff of .4 the probability levels are shown in blue.

18 Probability of Death by True Condition
Probability of Death:   0 to .2   .2 to .4   .4 to .6   .6 to .8   .8 to 1   Total
Alive                        33          6          6         11         2      58
Dead                          3          2          2         11        33      51
Total                        36          8          8         22        35     109
There are 33 plus 6, or a total of 39, persons who are predicted to live and who actually do live.

19 Probability of Death by True Condition
Probability of Death:   0 to .2   .2 to .4   .4 to .6   .6 to .8   .8 to 1   Total
Alive                        33          6          6         11         2      58
Dead                          3          2          2         11        33      51
Total                        36          8          8         22        35     109
These 39 people were correctly predicted out of a total of 58 alive patients. The specificity, that is, the portion of alive patients correctly predicted, at this cutoff level is 39 divided by 58, or 0.67. Specificity = 39 / 58 = 0.67

20 Probability of Death by True Condition
Probability of Death:   0 to .2   .2 to .4   .4 to .6   .6 to .8   .8 to 1   Total
Alive                        33          6          6         11         2      58
Dead                          3          2          2         11        33      51
Total                        36          8          8         22        35     109
There are also 2 plus 11 plus 33, or a total of 46, patients who are predicted to die and who actually do die.

21 Probability of Death by True Condition
Probability of Death:   0 to .2   .2 to .4   .4 to .6   .6 to .8   .8 to 1   Total
Alive                        33          6          6         11         2      58
Dead                          3          2          2         11        33      51
Total                        36          8          8         22        35     109
These 46 people were correctly predicted out of a total of 51 dead patients. The sensitivity, that is, the portion of dead patients correctly predicted, at this cutoff level was 46 divided by 51, or 0.90. Sensitivity = 46 / 51 = 0.90
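The calculation at the .4 cutoff can be verified with a few lines of Python. The per-interval counts below are inferred from the table's row and column totals, an assumption worth noting since parts of the original table were garbled.

```python
# Patients per predicted-probability-of-death interval:
# 0-.2, .2-.4, .4-.6, .6-.8, .8-1 (counts inferred from the table totals).
alive = [33, 6, 6, 11, 2]   # 58 alive patients in total
dead  = [3, 2, 2, 11, 33]   # 51 dead patients in total

# Cutoff .4: the first two intervals are predicted alive, the rest dead.
true_negatives = sum(alive[:2])            # alive predicted alive: 39
true_positives = sum(dead[2:])             # dead predicted dead: 46
specificity = true_negatives / sum(alive)  # 39 / 58
sensitivity = true_positives / sum(dead)   # 46 / 51
print(round(specificity, 2), round(sensitivity, 2))  # 0.67 0.9
```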

22 Values Depend on Cutoff
Note that the calculation of the sensitivity and specificity depends on the cutoff value.

23 Predicted Dead if > Cutoff
Predicted Dead if > Cutoff:    >=0    >.2    >.4    >.6    >.8     >1
Correct Alive Predictions:       0     33     39     45     56     58
Correct Death Predictions:      51     48     46     44     33      0
Specificity:                  0.00   0.57   0.67   0.78   0.97   1.00
1 - Specificity:              1.00   0.43   0.33   0.22   0.03   0.00
Sensitivity:                  1.00   0.94   0.90   0.86   0.65   0.00
For any index, the sensitivity and specificity depend on the cutoff point at which we predict the patient will live or die. The column in yellow shows the calculation at the cutoff of .4. We could also calculate sensitivity and specificity at different cutoff levels. Different cutoff points produce different levels of sensitivity and specificity.
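The whole table can be regenerated by sweeping the cutoff across the interval boundaries. A short Python sketch, again using the per-interval counts inferred from the table's totals:

```python
# Counts per predicted-probability interval (inferred from the table totals).
alive = [33, 6, 6, 11, 2]   # 58 alive in total
dead  = [3, 2, 2, 11, 33]   # 51 dead in total

# Predicting dead when probability > cutoff means the first k intervals
# are predicted alive; k sweeps from 0 (cutoff >=0) to 5 (cutoff >1).
specificities, sensitivities = [], []
for k in range(len(alive) + 1):
    specificities.append(round(sum(alive[:k]) / sum(alive), 2))
    sensitivities.append(round(sum(dead[k:]) / sum(dead), 2))

print(specificities)  # [0.0, 0.57, 0.67, 0.78, 0.97, 1.0]
print(sensitivities)  # [1.0, 0.94, 0.9, 0.86, 0.65, 0.0]
```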

24 Predicted Dead if > Cutoff
Predicted Dead if > Cutoff:    >=0    >.2    >.4    >.6    >.8     >1
Correct Alive Predictions:       0     33     39     45     56     58
Correct Death Predictions:      51     48     46     44     33      0
Specificity:                  0.00   0.57   0.67   0.78   0.97   1.00
1 - Specificity:              1.00   0.43   0.33   0.22   0.03   0.00
Sensitivity:                  1.00   0.94   0.90   0.86   0.65   0.00
At a cutoff of 1 we predict that all patients will live, in which case we would have a specific index but not a very sensitive one. We would predict everyone who lived accurately but none of the patients who died.

25 Predicted Dead if > Cutoff
Predicted Dead if > Cutoff:    >=0    >.2    >.4    >.6    >.8     >1
Correct Alive Predictions:       0     33     39     45     56     58
Correct Death Predictions:      51     48     46     44     33      0
Specificity:                  0.00   0.57   0.67   0.78   0.97   1.00
1 - Specificity:              1.00   0.43   0.33   0.22   0.03   0.00
Sensitivity:                  1.00   0.94   0.90   0.86   0.65   0.00
If we set the cutoff to 0, then all patients would be predicted to die. We would be perfectly accurate in predicting dead patients but terribly inaccurate in predicting alive patients. Our specificity would be zero and our sensitivity would be 1.

26 Receiver Operating Curve
The Receiver Operating Curve plots sensitivity against one minus specificity at different cutoff points. It helps us get a sense of the overall accuracy of the predictions.

27 In order to understand changes in specificity and sensitivity we draw a receiver operating curve. In these curves sensitivity is on the Y axis and one minus specificity is on the X axis, plotted at different cutoff points. We show the cutoff points so that you can see how changes in cutoff points lead to different levels of sensitivity and specificity.

28 For example, the cutoff point greater than .4 is one point on the receiver operating curve.

29 The straight line shows what will happen if we predict randomly.

30 The difference between the area under the receiver operating curve and the area under the random-prediction line shows how much better the sensitivity and specificity of the predictive model are than a random prediction. A random prediction has an area of 0.5. A perfect prediction has an area under the receiver operating curve of 1. The extent to which the area under the curve exceeds 0.5 shows how accurate the predictive model is.

31 SQL Code Let us try to calculate sensitivity and specificity using SQL.

32 -- Step 1 decide on a sample of cutoff levels to try
-- Step 2 classify predicted scores by comparison to cutoff
-- Step 3 calculate sensitivity and specificity
We present code that calculates sensitivity and specificity in three steps. First, we select a range of cutoff values we want to try. Then we compare the predictions to the cutoff values, and last we calculate the sensitivity and specificity.

33 -- Step 1 decide on a sample of cutoff levels to try ---
CREATE TABLE #Cutoff (Cutoff FLOAT);
INSERT INTO #Cutoff (Cutoff) VALUES (0.0), (0.2), (0.4), (0.8), (0.9), (0.95), (1.0);
In the first step we generate a set of cutoff values ranging from 0 to 1. Here we show 7 cutoff values, but in practice many more cutoff values are considered, especially near 1 and near 0.

34
SELECT ROW_NUMBER() OVER (ORDER BY Predicted) AS Row
     , [Predicted] AS [Prob]
     , Actual
INTO #OrderedData
FROM [ROC].[dbo].[Data]
ORDER BY [Predicted];

SELECT (b.[Prob] + a.[Prob]) / 2 AS Cutoff
INTO #Cutoffs
FROM #OrderedData b
INNER JOIN #OrderedData a ON a.Row = b.Row + 1;

INSERT INTO #Cutoffs (Cutoff) VALUES (0.0), (1.0);
Another, and in our view preferred, approach is to set the cutoff values as the averages of every two consecutive predictions. First we order the data by predicted value and calculate a row number for each value.

35
SELECT ROW_NUMBER() OVER (ORDER BY Predicted) AS Row
     , [Predicted] AS [Prob]
     , Actual
INTO #OrderedData
FROM [ROC].[dbo].[Data]
ORDER BY [Predicted];

SELECT (b.[Prob] + a.[Prob]) / 2 AS Cutoff
INTO #Cutoffs
FROM #OrderedData b
INNER JOIN #OrderedData a ON a.Row = b.Row + 1;

INSERT INTO #Cutoffs (Cutoff) VALUES (0.0), (1.0);
Then we join the ordered data to itself so as to select two adjacent rows of data.

36
SELECT ROW_NUMBER() OVER (ORDER BY Predicted) AS Row
     , [Predicted] AS [Prob]
     , Actual
INTO #OrderedData
FROM [ROC].[dbo].[Data]
ORDER BY [Predicted];

SELECT (b.[Prob] + a.[Prob]) / 2 AS Cutoff
INTO #Cutoffs
FROM #OrderedData b
INNER JOIN #OrderedData a ON a.Row = b.Row + 1;

INSERT INTO #Cutoffs (Cutoff) VALUES (0.0), (1.0);
We then calculate the average of the two adjacent values as the new cutoff. Finally we insert the 0 and 1 cutoff values so that we have a comprehensive set of cutoffs.
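The midpoint approach can be sketched outside SQL as well. A minimal Python version of the same idea; the four probabilities below are made-up values, used only to show the mechanics:

```python
# Cutoffs as midpoints of consecutive sorted predictions, plus 0 and 1.
preds = sorted([0.10, 0.35, 0.62, 0.80])  # hypothetical predicted probabilities
cutoffs = [(a + b) / 2 for a, b in zip(preds, preds[1:])]
cutoffs = [0.0] + cutoffs + [1.0]
# midpoints 0.225, 0.485, 0.71, bracketed by the endpoints 0 and 1
```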

37 -- Step 2 classify predicted scores by comparison to cutoff --
DROP TABLE #Temp1;
SELECT Cutoff
     , IIF(a.[Prob] > b.[Cutoff], 1., 0.) AS Predicted
     , IIF(a.[AgeAtDeath] IS NULL, 0., 1.) AS Actual
INTO #Temp1
FROM #Data a CROSS JOIN #Cutoff b;
In step 2 we compare the predicted posterior probability to the cutoff value. Any value above the cutoff is assigned a 1 and other values are assigned a 0. If we have calculated posterior odds, we need to first convert the odds to posterior probabilities before comparing to the cutoff values.
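Step 2 is a cross join of every prediction with every cutoff. The same classification can be sketched in Python; the four (probability, actual) pairs below are hypothetical:

```python
# Each prediction paired with each cutoff, as in the SQL cross join.
# Predicted death = 1 when the probability of death exceeds the cutoff.
data = [(0.10, 0), (0.35, 1), (0.62, 1), (0.80, 1)]  # hypothetical pairs
cutoffs = [0.0, 0.4, 1.0]

classified = [(c, 1 if prob > c else 0, actual)
              for c in cutoffs
              for prob, actual in data]

# At cutoff 0.4 only the last two patients are predicted to die.
print([p for c, p, _ in classified if c == 0.4])  # [0, 0, 1, 1]
```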

38 -- Step 3 calculate sensitivity and specificity ---
SELECT Cutoff
     , SUM(CAST(Actual AS FLOAT) * CAST(Predicted AS FLOAT)) / SUM(CAST(Actual AS FLOAT)) AS Sensitivity
     , SUM((1 - Predicted) * (1 - Actual)) / SUM(1 - Actual) AS Specificity
     , ROW_NUMBER() OVER (ORDER BY Cutoff DESC) AS rnum
INTO #sensspec
FROM #Temp1
GROUP BY Cutoff;
In step 3 we calculate the sensitivity and specificity.

39 -- Step 3 calculate sensitivity and specificity ---
SELECT Cutoff
     , SUM(CAST(Actual AS FLOAT) * CAST(Predicted AS FLOAT)) / SUM(CAST(Actual AS FLOAT)) AS Sensitivity
     , SUM((1 - Predicted) * (1 - Actual)) / SUM(1 - Actual) AS Specificity
     , ROW_NUMBER() OVER (ORDER BY Cutoff DESC) AS rnum
INTO #sensspec
FROM #Temp1
GROUP BY Cutoff;
Sensitivity is calculated as the true positives divided by the number with the condition. The true positives are the rows where both the actual and predicted fields are set to 1. This is calculated as the product of the predicted and actual fields; the product is one only when both are one.

40 -- Step 3 calculate sensitivity and specificity ---
SELECT Cutoff
     , SUM(CAST(Actual AS FLOAT) * CAST(Predicted AS FLOAT)) / SUM(CAST(Actual AS FLOAT)) AS Sensitivity
     , SUM((1 - Predicted) * (1 - Actual)) / SUM(1 - Actual) AS Specificity
     , ROW_NUMBER() OVER (ORDER BY Cutoff DESC) AS rnum
INTO #sensspec
FROM #Temp1
GROUP BY Cutoff;
Specificity is calculated as the ratio of true negatives to the total number where the condition is absent. True negatives are the cases where neither predicted nor actual is one. Operationally this is calculated as the product of one minus predicted times one minus actual. This product is one only when neither predicted nor actual is one.
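The product trick used in these formulas is easy to verify with a handful of hypothetical (predicted, actual) pairs:

```python
# (predicted, actual) pairs; values are hypothetical.
rows = [(1, 1), (1, 0), (0, 0), (0, 1), (1, 1)]

# predicted * actual is 1 only for true positives;
# (1 - predicted) * (1 - actual) is 1 only for true negatives.
true_pos = sum(p * a for p, a in rows)              # 2 true positives
true_neg = sum((1 - p) * (1 - a) for p, a in rows)  # 1 true negative
sensitivity = true_pos / sum(a for _, a in rows)        # 2 of 3 with the condition
specificity = true_neg / sum(1 - a for _, a in rows)    # 1 of 2 without it
```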

41 SQL Code for AROC We can also use SQL to calculate the area under the receiver operating curve.

42 Since the area under the receiver operating curve indicates the accuracy of the prediction, we need a method to calculate this area. The area can be calculated in R or other statistical software, but it is best if we can also use SQL to make these calculations.

43 We approximate the curve as a series of adjacent trapezoids.

44 The tops of the trapezoids approximate the curve line

45 The area under the curve is the sum of the areas of the trapezoids

46 Let us look at one of these trapezoids. The base of the trapezoid is on the X-axis. The length of the base is the difference between the values of the two points on the X-axis and is referred to as the run.

47 The height of the trapezoid is uneven and corresponds to the height of each point on the Y-axis. The difference between the two heights is referred to as the rise.

48 The area of the trapezoid consists of two elements: a triangle on top, with area calculated as Run times Rise divided by 2 (Run × Rise / 2).

49 In addition we need to calculate the area of the rectangle below the triangle, which is Run times the minimum height of the trapezoid (Run × Minimum Height).

50 Example: >.2 to >.4
Rise: 0.04   Run: 0.10   Area of Triangle: 0.00   Height of Rectangle: 0.90   Area of Rectangle: 0.09   Total Area: 0.09
For example, between the two cutoff points >.2 and >.4, the triangle has a rise in sensitivity of 0.04 and a run of 0.10. The rectangle below the triangle has a run of 0.10 and a height of 0.90 (the minimum of sensitivity at these two points). The triangle and rectangle areas are 0.00 and 0.09, for a total area of 0.09.

51 Areas at All Cutoff Intervals
                       >0 to >.2   >.2 to >.4   >.4 to >.6   >.6 to >.8   >.8 to >1   Total
Rise                        0.06         0.04         0.04         0.22        0.65
Run                         0.57         0.10         0.11         0.19        0.03
Area of Triangle            0.02         0.00         0.00         0.02        0.01
Height of Rectangle         0.94         0.90         0.86         0.65        0.00
Area of Rectangle           0.54         0.09         0.09         0.12        0.00
Total Area                  0.55         0.09         0.10         0.14        0.01    0.89
Across all cutoff points the area under the receiver operating curve is 0.89, which is relatively large and close to the maximum of 1.
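The trapezoid arithmetic above can be checked in Python, using the (1 - specificity, sensitivity) points from the cutoff table earlier in the deck:

```python
# ROC points at the deck's cutoffs, ordered from cutoff >=0 to >1.
x = [1.00, 0.43, 0.33, 0.22, 0.03, 0.00]  # 1 - specificity
y = [1.00, 0.94, 0.90, 0.86, 0.65, 0.00]  # sensitivity

# Each adjacent pair of points forms a trapezoid:
# a rectangle (run * minimum height) plus a triangle (run * rise / 2).
area = 0.0
for i in range(len(x) - 1):
    run = abs(x[i] - x[i + 1])
    rise = abs(y[i] - y[i + 1])
    area += run * min(y[i], y[i + 1]) + run * rise / 2

print(round(area, 2))  # 0.89
```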

52 -- Calculate the area of each section
SELECT IIF(b.Sensitivity > a.Sensitivity, a.Sensitivity, b.Sensitivity) * ABS(b.Specificity - a.Specificity)
     + ABS(b.Sensitivity - a.Sensitivity) * ABS(b.Specificity - a.Specificity) / 2 AS Area
INTO #Areas
FROM #sensspec a
INNER JOIN #sensspec b ON b.rnum - 1 = a.rnum;
SELECT * FROM #Areas;
-- Calculate the total area under the curve
SELECT SUM(Area) AS Area FROM #Areas;
This SQL code calculates the area within each trapezoid and sums these areas to obtain the area under the receiver operating curve.

53 -- Calculate the area of each section
SELECT IIF(b.Sensitivity > a.Sensitivity, a.Sensitivity, b.Sensitivity) * ABS(b.Specificity - a.Specificity)
     + ABS(b.Sensitivity - a.Sensitivity) * ABS(b.Specificity - a.Specificity) / 2 AS Area
INTO #Areas
FROM #sensspec a
INNER JOIN #sensspec b ON b.rnum - 1 = a.rnum;
SELECT * FROM #Areas;
-- Calculate the total area under the curve
SELECT SUM(Area) AS Area FROM #Areas;
This code takes the sensitivity and specificity of adjacent rows of data. We join the ordered table of calculated sensitivities and specificities with itself, matching each row with the next row in the data. This method of joining assumes that the data are ordered by cutoff values.

54 -- Calculate the area of each section
SELECT IIF(b.Sensitivity > a.Sensitivity, a.Sensitivity, b.Sensitivity) * ABS(b.Specificity - a.Specificity)
     + ABS(b.Sensitivity - a.Sensitivity) * ABS(b.Specificity - a.Specificity) / 2 AS Area
INTO #Areas
FROM #sensspec a
INNER JOIN #sensspec b ON b.rnum - 1 = a.rnum;
SELECT * FROM #Areas;
-- Calculate the total area under the curve
SELECT SUM(Area) AS Area FROM #Areas;
Next we calculate the area for each trapezoid as the sum of its rectangle and triangle portions. The rectangle area is calculated as the minimum of the two sensitivity values times the run. The triangle area is calculated as half of the rise times the run.

55 -- Calculate the area of each section
SELECT IIF(b.Sensitivity > a.Sensitivity, a.Sensitivity, b.Sensitivity) * ABS(b.Specificity - a.Specificity)
     + ABS(b.Sensitivity - a.Sensitivity) * ABS(b.Specificity - a.Specificity) / 2 AS Area
INTO #Areas
FROM #sensspec a
INNER JOIN #sensspec b ON b.rnum - 1 = a.rnum;
SELECT * FROM #Areas;
-- Calculate the total area under the curve
SELECT SUM(Area) AS Area FROM #Areas;
In the final step all of the trapezoid areas are summed to obtain the total area under the curve.

56 The Area under the Receiver Operating Curve (AROC) measures the accuracy of model predictions
The area under the receiver operating curve measures the accuracy of model predictions.

