
False positives and false negatives swapped? #3

@nomadictype

Hi, I found a small mistake in one of the course videos (sorry if this is the wrong place to post such issues; I tried to leave feedback directly on Pluralsight but couldn't figure out how to do so).

Specifically, in the video "Evaluating the Naive Bayes Model", at time 1:30, I believe the FP and FN labels are swapped: FP should be bottom left and FN should be top right. I noticed this when trying to compute the recall, which (if FN were equal to 33) would be TP / (TP + FN) = 52 / (52 + 33) = 0.61, not the 0.65 we should be getting.
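
As a quick sanity check, the recall can also be computed directly with scikit-learn (a minimal sketch, assuming nb_predict_test and y_test are the prediction array and test labels from the course notebook):

from sklearn.metrics import recall_score
import numpy as np

# Recall = TP / (TP + FN); this should come out to roughly 0.65,
# consistent with FN = 28 rather than 33.
print("Recall: {0:.2f}".format(recall_score(np.ravel(y_test), nb_predict_test)))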

I also recomputed the number of true/false positives and negatives from scratch using numpy primitives to verify this:

import numpy as np

# Compare predicted labels against the true labels (y_test is a
# single-column frame, hence the transpose before comparison).
num_tp = np.logical_and(nb_predict_test == 1, y_test.transpose() == 1).sum()
num_fp = np.logical_and(nb_predict_test == 1, y_test.transpose() == 0).sum()
num_fn = np.logical_and(nb_predict_test == 0, y_test.transpose() == 1).sum()
num_tn = np.logical_and(nb_predict_test == 0, y_test.transpose() == 0).sum()
print("Number of true positives: {0}".format(num_tp))
print("Number of false positives: {0}".format(num_fp))
print("Number of false negatives: {0}".format(num_fn))
print("Number of true negatives: {0}".format(num_tn))

Output:

Number of true positives: 52
Number of false positives: 33
Number of false negatives: 28
Number of true negatives: 118
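
For completeness, the same counts can be cross-checked with scikit-learn's confusion_matrix (again a sketch under the same assumptions about nb_predict_test and y_test; passing labels=[1, 0] puts the positive class first, with rows as actual and columns as predicted):

from sklearn.metrics import confusion_matrix
import numpy as np

# With labels=[1, 0] the layout is:
#   [[TP, FN],
#    [FP, TN]]
cm = confusion_matrix(np.ravel(y_test), nb_predict_test, labels=[1, 0])
print(cm)

With these counts, recall = TP / (TP + FN) = 52 / (52 + 28) = 0.65, which matches the value in the video, so 28 must be the false negative count and 33 the false positive count.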
