Open
Description
Hello,
In classification, is there any way to interpret the obtained predictive uncertainty? After computing the predictive uncertainty, is there a way to calculate a threshold or cutoff value (as you have shown here: https://camo.githubusercontent.com/e78af0e93f0ea7cc80e38f7b9273486bbf6f37f6/687474703a2f2f7777772e63732e6f782e61632e756b2f70656f706c652f616e67656c6f732e66696c6f732f6173736574732f62646c2d62656e63686d61726b732f646961676e6f7369732e706e67) so that if the predictive variance is above that value we can say the model is uncertain about its prediction, and if it is below that value we can say it is certain?
Uncertain if (predictive variance >= threshold); Certain if (predictive variance < threshold)
How can this threshold be computed? A rough sketch of what I mean is below.
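For context, here is a minimal sketch of the kind of rule I have in mind, using only NumPy. The MC predictions (`mc_probs`) are toy placeholders, predictive entropy stands in for "predictive variance", and the percentile-based cutoff is just one heuristic I am assuming, not something taken from this repo.

```python
import numpy as np

def predictive_entropy(mc_probs):
    """Predictive entropy from stochastic forward passes (e.g. MC Dropout).

    mc_probs: array of shape (n_passes, n_points, n_classes) with softmax outputs.
    """
    mean_probs = mc_probs.mean(axis=0)  # average over stochastic passes
    return -np.sum(mean_probs * np.log(mean_probs + 1e-12), axis=-1)

# Toy MC predictions: 20 stochastic passes, 5 points, 3 classes.
rng = np.random.default_rng(0)
logits = rng.normal(size=(20, 5, 3))
mc_probs = np.exp(logits) / np.exp(logits).sum(axis=-1, keepdims=True)

uncertainty = predictive_entropy(mc_probs)

# One possible heuristic: take a percentile of the uncertainties on a
# held-out validation set as the cutoff (here the toy values are reused).
threshold = np.percentile(uncertainty, 80)

# The decision rule from the question: uncertain if >= threshold, certain otherwise.
is_uncertain = uncertainty >= threshold
print(uncertainty, threshold, is_uncertain)
```

Is a percentile over validation uncertainties a reasonable way to pick this cutoff, or is there a more principled method?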
Thanks!