Description
I have an objective function that can be evaluated but is intrinsically noisy, and I'm quite happy using TBPSA on it. However, for each point I get the value of the objective function together with a good measure of its noise (i.e. x -> y ± sigma). The noise is not homogeneous by default (and in fact can be controlled somewhat, so it is typically larger far away from the optimum).
The question is whether there is any method implemented that can use this information on the noise (i.e. sigma) to obtain a more realistic estimate of the minimum (e.g. a quadratic approximation of the minimum could do 'the fit' with known uncertainties on the data points).
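For context, the quadratic-fit idea mentioned above can be sketched outside of any optimizer: with known per-point uncertainties, an inverse-variance weighted least-squares fit of a parabola gives an estimate of the minimum that down-weights the noisier points. This is a minimal 1-D illustration using NumPy (the function name and the toy objective are made up for the example, not part of any library):

```python
import numpy as np

def weighted_quadratic_minimum(x, y, sigma):
    """Fit y ~ a*x^2 + b*x + c with inverse-variance weights and
    return the estimated minimizer -b / (2a)."""
    x, y, sigma = map(np.asarray, (x, y, sigma))
    # np.polyfit minimizes sum(w**2 * (y - p(x))**2), so passing
    # w = 1/sigma recovers the usual chi-square objective for data
    # with known per-point uncertainties.
    a, b, c = np.polyfit(x, y, deg=2, w=1.0 / sigma)
    if a <= 0:
        raise ValueError("fitted quadratic has no minimum (a <= 0)")
    return -b / (2.0 * a)

# Noisy samples of f(x) = (x - 1)^2, noisier far from the optimum,
# mimicking the situation described in the issue.
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 4.0, 50)
sigma = 0.05 + 0.2 * np.abs(x - 1.0)
y = (x - 1.0) ** 2 + rng.normal(0.0, sigma)
print(weighted_quadratic_minimum(x, y, sigma))  # close to 1.0
```

This only works near the optimum where a quadratic model is reasonable, but it shows how the sigma information could be folded into the final estimate.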
Edit: a naive approach I could see is to move to the ask-and-tell interface and 'tell' the same data point multiple times when the associated error is small.
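That repetition heuristic amounts to inverse-variance weighting: telling a point n times gives it roughly n times the weight of a single observation, so n proportional to 1/sigma^2 mimics a chi-square weighting. A small sketch of the idea (the helper name, `sigma_ref`, and the commented usage of nevergrad's ask/tell interface are assumptions for illustration, not tested code):

```python
def tell_repetitions(sigma, sigma_ref=1.0, max_repeats=20):
    """Number of times to 'tell' a point so its effective weight is
    roughly proportional to 1 / sigma**2 (inverse-variance weighting).

    sigma_ref is the noise level that corresponds to a single tell;
    max_repeats caps the budget spent on any one point."""
    return max(1, min(max_repeats, round((sigma_ref / sigma) ** 2)))

# Hypothetical usage with an ask/tell loop (names assumed):
#   candidate = optimizer.ask()
#   value, sigma = evaluate(candidate.value)
#   for _ in range(tell_repetitions(sigma, sigma_ref=0.5)):
#       optimizer.tell(candidate, value)

print(tell_repetitions(0.5))  # 4
print(tell_repetitions(2.0))  # 1
```

The cap matters: without it, a single very precise point could consume a large fraction of the optimizer's budget in repeated tells.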