Description
Is your feature request related to a problem? Please describe.
Currently, all the confusion-matrix-based metrics expect prediction inputs that have already been binarised with a pre-decided threshold. To calculate these metrics at the threshold where some other confusion-matrix metric reaches a given value, there should be methods linking transform:post and the confusion matrix metrics, so that this threshold can be determined and the relevant confusion matrix metrics then computed at it.
Describe the solution you'd like
A function that finds the threshold at which to binarise the predictions so that a chosen confusion-matrix metric reaches a given value, and then computes the other associated confusion-matrix metric values at that threshold (a minimal sketch is given after the list below). The threshold can be derived from the ROC curve, the Precision/Recall curve or the FROC curve. This is necessary for the following:
- PPV@Sensitivity
- Sensitivity@PPV
- Specificity@Sensitivity
- Sensitivity@Specificity
- Sensitivity@FPPI
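
A minimal sketch of the idea for the PPV@Sensitivity case, not a proposed MONAI API: sweep candidate thresholds from highest to lowest, take the first one whose sensitivity meets the requested target, and report the PPV at that same threshold. The function name `ppv_at_sensitivity` and its signature are illustrative assumptions; the other metric pairs would follow the same pattern.

```python
import numpy as np

def ppv_at_sensitivity(y_score: np.ndarray, y_true: np.ndarray, target_sensitivity: float):
    """Return (ppv, threshold) at the highest threshold whose sensitivity
    meets or exceeds `target_sensitivity`."""
    # Candidate thresholds: the unique predicted scores, highest first, so the
    # first match is the most conservative threshold achieving the target.
    thresholds = np.unique(y_score)[::-1]
    for t in thresholds:
        pred = y_score >= t
        tp = np.sum(pred & (y_true == 1))
        fp = np.sum(pred & (y_true == 0))
        fn = np.sum(~pred & (y_true == 1))
        sensitivity = tp / (tp + fn) if (tp + fn) > 0 else 0.0
        if sensitivity >= target_sensitivity:
            ppv = tp / (tp + fp) if (tp + fp) > 0 else 0.0
            return ppv, t
    # Target sensitivity unreachable even at the lowest threshold.
    return 0.0, thresholds[-1]

# Example: PPV at the threshold where sensitivity first reaches 0.8
scores = np.array([0.1, 0.4, 0.35, 0.8, 0.7, 0.2])
labels = np.array([0, 0, 1, 1, 1, 0])
print(ppv_at_sensitivity(scores, labels, target_sensitivity=0.8))  # (0.75, 0.35)
```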