Single model, multiple data #132

Open
@fjebaker

Description

There's often a need to fit a single model to several datasets where only, for example, the normalisation changes between them.

At the moment, the model is evaluated once for each dataset, but there's much room for improvement here: we need only evaluate the model once and can apply a different normalisation for each dataset. The caveat is that if the normalisation is used inside a convolution model, its effect is potentially non-linear, but in theory we have all the information needed to detect that at compile time.
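
A minimal sketch of the idea in plain Julia (not the SpectralFitting.jl API); `base_model`, `domain`, and `norms` are hypothetical names for illustration:

```julia
# Toy "model": a power law evaluated over a shared energy grid.
base_model(E; Γ = 2.0) = E .^ (-Γ)

domain = collect(range(0.1, 10.0; length = 512))  # common domain for all datasets
norms  = [1.0, 0.7, 1.3]                          # one normalisation per dataset

# Evaluate the model once...
flux = base_model(domain)

# ...and reuse the single evaluation, applying only each dataset's normalisation.
per_dataset_flux = [K .* flux for K in norms]
```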

For now:

  • A solution would be to introduce a new data wrapper that detects whether the domains are all the same / overlapping, and then uses the transformer function to effectively copy the model n times and apply the normalisations (see the sketch after this list).
  • Abuse or implement an alternative AutoCache.
  • Just concatenate all the datasets into a new dataset wrapper that implements the dataset API, so the model is only evaluated once (but then the normalisations can't be adjusted independently).
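
A hypothetical sketch of the wrapper from the first bullet, again in plain Julia rather than the existing SpectralFitting.jl API; `SharedDomainData`, `shared_domain_data`, `get_domain`, and `evaluate_shared` are all made-up names for illustration:

```julia
# Wrapper grouping datasets that share a domain, so the model is evaluated
# once and only the per-dataset normalisations differ.
struct SharedDomainData{D,T}
    datasets::Vector{D}
    domain::Vector{T}
end

# Build the wrapper only when every dataset is defined on the same grid.
# `get_domain` is a hypothetical accessor returning a dataset's energy grid.
function shared_domain_data(datasets, get_domain)
    domains = map(get_domain, datasets)
    all(d -> d == first(domains), domains) ||
        error("datasets do not share a common domain")
    SharedDomainData(collect(datasets), collect(first(domains)))
end

# Evaluate `model` once on the shared domain and apply each dataset's
# normalisation to the single result.
function evaluate_shared(model, data::SharedDomainData, norms::AbstractVector)
    flux = model(data.domain)
    [K .* flux for K in norms]
end
```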

Metadata

Assignees

No one assigned

Labels

enhancement (New feature or request)

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
