

Difference in confusion matrix

oscarjung

Hello,

 

I was wondering what the difference is between the 'Global', 'Actual', and 'Predicted' options in the confusion matrix?

 

oscarjung_0-1638415947563.png

@Eu Jin 

 

1 Reply
Eu Jin
Data Scientist

Hey @oscarjung,

 

Thanks for posting your question and welcome to the community! 

 

In short:

  • Global looks at overall accuracy
  • Actual looks at Recall
  • Predicted looks at Precision
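The mapping between the three views and the usual metrics can be sketched with NumPy on a hypothetical 3x3 confusion matrix (rows = actual class, columns = predicted class; all counts here are made up for illustration):

```python
import numpy as np

# Hypothetical confusion matrix: rows = actual (A, B, C), columns = predicted.
cm = np.array([[50,  0,  1],
               [11, 30,  6],
               [16,  7, 40]])

# "Global": correct predictions (the diagonal) over all records.
global_accuracy = np.trace(cm) / cm.sum()

# "Actual": per-class recall -- diagonal over each row total.
recall = np.diag(cm) / cm.sum(axis=1)

# "Predicted": per-class precision -- diagonal over each column total.
precision = np.diag(cm) / cm.sum(axis=0)
```

The row totals are how many records truly belong to each class, and the column totals are how many predictions landed in each class, which is why recall normalizes by rows and precision by columns.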

Let's use this dataset as an example. For simplicity, the targets are A, B, and C, in that order both down and across the matrix:

 

Let's start with the Actual tab.

Screen Shot 2021-12-02 at 2.53.17 PM.png

In the first cell, the score is 98% because we predicted 50 out of the 51 actual Category A records correctly. DataRobot missed 1 record and predicted it as Category C, hence the 2%. Summing across the Category A row gives you 100%. In the image above there's a very small display of the % breakdown; you can export the CSV file to look at it in more detail by clicking the "Export" button I've circled in the image.
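The row-wise normalization behind the Actual tab can be sketched like this, using made-up counts that match the 50-out-of-51 example for Category A (the B and C rows are invented):

```python
import numpy as np

# Hypothetical counts: rows = actual (A, B, C), columns = predicted.
# Actual A has 51 records: 50 predicted as A, 1 predicted as C.
cm = np.array([[50,  0,  1],   # actual A
               [11, 30,  6],   # actual B (invented)
               [16,  7, 40]])  # actual C (invented)

# "Actual" view: normalize each row so it sums to 100%.
row_pct = cm / cm.sum(axis=1, keepdims=True) * 100
print(np.round(row_pct[0]))  # actual-A row: 98% A, 0% B, 2% C
```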

 

The next tab is Predicted, but this time we look column-wise: each category that DataRobot predicted sums to 100%. This lets us see, when DataRobot predicts Category A for example, how many of those predictions hit Category A and how many were misclassified. In the screenshot below, 65% of the predictions DataRobot made for Category A were correct, but 14% were actually Category B and 21% were actually Category C. So of the total predictions made for Category A, only 65% are correct.
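The column-wise normalization behind the Predicted tab can be sketched the same way; the counts below are invented so that the predicted-A column works out to roughly the 65% / 14% / 21% split from the screenshot:

```python
import numpy as np

# Hypothetical counts: rows = actual (A, B, C), columns = predicted.
cm = np.array([[50,  0,  1],   # actual A
               [11, 30,  6],   # actual B (invented)
               [16,  7, 40]])  # actual C (invented)

# "Predicted" view: normalize each column so it sums to 100%.
col_pct = cm / cm.sum(axis=0, keepdims=True) * 100
print(np.round(col_pct[:, 0]))  # predicted-A column: 65% A, 14% B, 21% C
```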

Screen Shot 2021-12-02 at 3.08.04 PM.png

 

Hope that helps! 

 

Regards,

@Eu Jin