Noisy optimization with available error bounds #1700

@vondele

Description

I have a problem with an objective function that can be evaluated but is intrinsically noisy, and I'm quite happy using TBPSA on it. However, for each point I have not only the value of the objective function but also a good measure of the noise (i.e. x -> y ± sigma). The latter is not homogeneous by default (and can in fact be controlled somewhat; it is typically larger far away from the optimum).

The question is whether any implemented method can use that information on the noise (i.e. sigma) to obtain a more realistic estimate of the minimum. For example, a quadratic approximation of the minimum could do 'the fit' with known uncertainties on the data points.
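As a side note on the quadratic idea: such a fit with known per-point uncertainties is ordinary weighted least squares, which NumPy's `polyfit` supports via its `w` argument (weights of `1/sigma` for Gaussian noise). The sketch below is a minimal 1-D illustration with invented sample data and an invented noise model, not nevergrad code:

```python
import numpy as np

# Hypothetical 1-D example: quadratic fit through noisy samples with
# known, inhomogeneous per-point noise sigma. np.polyfit's `w` argument
# expects weights; 1/sigma is the standard choice for Gaussian noise.
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 41)
sigma = 0.05 + 0.2 * np.abs(x)          # noise grows away from the optimum
y = x**2 + rng.normal(0.0, sigma)       # true minimum at x = 0

coeffs = np.polyfit(x, y, deg=2, w=1.0 / sigma)  # weighted least squares
a, b, _ = coeffs                         # highest-degree coefficient first
x_min = -b / (2.0 * a)                   # vertex of the fitted parabola
print(x_min)
```

With the downweighting of the noisier far-away points, the vertex estimate lands close to the true minimum at 0.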

Edit: a naive approach I could see is to move to the ask-and-tell interface and 'tell' the same data point multiple times if the associated error is small.
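That naive approach can be sketched with a small helper that maps a point's sigma to a repetition count via inverse-variance weighting (a point measured k times more precisely counts k² times). The `sigma_ref` scale, the cap, and the `evaluate` function in the commented usage are all illustrative assumptions, not part of nevergrad's API:

```python
def tell_repetitions(sigma: float, sigma_ref: float = 1.0, max_reps: int = 20) -> int:
    """How many times to 'tell' a point with noise level `sigma`.

    Inverse-variance weighting: repetitions ~ (sigma_ref / sigma)**2,
    clamped to [1, max_reps]. `sigma_ref` and `max_reps` are assumed
    tuning knobs, not anything prescribed by nevergrad.
    """
    return max(1, min(max_reps, round((sigma_ref / sigma) ** 2)))

# Usage inside a nevergrad-style ask/tell loop (schematic; `evaluate`
# returning (value, sigma) is hypothetical):
#   cand = optimizer.ask()
#   value, sigma = evaluate(cand.value)
#   for _ in range(tell_repetitions(sigma, sigma_ref=0.1)):
#       optimizer.tell(cand, value)
```

This only mimics weighting through integer repetition counts, so very precise points are capped and very noisy points still count once.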
