AI-900: Interpreting Confusion Matrices to Evaluate Classification Models

Confusion matrices provide key insights into machine learning classification performance. Learn how to read and analyze these tables to improve predictive accuracy.

Question 53

You are developing a model to predict events by using classification. You have a confusion matrix for the model scored on test data as shown in the following exhibit.

Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic. NOTE: Each correct selection is worth one point.

Answer

Explanation

Box 1: 11

TP = True Positive.
The class labels in the training set can take on only two possible values, which we usually refer to as positive or negative. The positive and negative instances that a classifier predicts correctly are called true positives (TP) and true negatives (TN), respectively. Similarly, the incorrectly classified instances are called false positives (FP) and false negatives (FN).

Box 2: 1,033

FN = False Negative.
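As a quick sanity check, the sketch below shows how the four cells of a binary confusion matrix map to TP, FN, FP, and TN. It assumes scikit-learn; the labels and predictions are made-up placeholders, not the values from the exhibit.

```python
# Minimal sketch: reading TP, FN, FP, TN off a binary confusion matrix.
# The y_true / y_pred values are illustrative only.
from sklearn.metrics import confusion_matrix

y_true = [1, 1, 1, 0, 0, 1, 0, 1]   # actual event labels (1 = positive)
y_pred = [1, 0, 1, 0, 1, 1, 0, 0]   # model predictions

# With labels=[1, 0] the matrix is laid out as:
#                 predicted 1   predicted 0
#   actual 1          TP            FN
#   actual 0          FP            TN
tp, fn, fp, tn = confusion_matrix(y_true, y_pred, labels=[1, 0]).ravel()
print(f"TP={tp}, FN={fn}, FP={fp}, TN={tn}")
```

In the exam exhibit, Box 1 (true positives) and Box 2 (false negatives) correspond to the TP and FN cells of this layout.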

Reference

Microsoft Learn (Previous Versions, Azure): Evaluate model performance in Machine Learning Studio (classic)

This Microsoft Azure AI Fundamentals AI-900 certification exam practice question and answer (Q&A), with detailed explanation and reference, is available free and is intended to help you pass the AI-900 exam and earn the Microsoft Azure AI Fundamentals certification.