Open
Description
Summary
Update the options for the learning rate scheduler in the scheduler_config() function in autoemulate/experimental/emulators/base.py, which was introduced in #582.
Basic Example
Copilot has suggested a variety of alternative parameter sets, which we should investigate and decide whether to include:
import random

from torch.optim.lr_scheduler import (
    CosineAnnealingLR,
    CyclicLR,
    ExponentialLR,
    OneCycleLR,
    ReduceLROnPlateau,
    StepLR,
)


@classmethod
def scheduler_config(cls) -> dict:
    """
    Returns a random configuration for the learning rate scheduler.

    This should be added to the `get_tune_config()` method of subclasses
    to allow tuning of the scheduler parameters.
    """
    all_params = [
        {
            "scheduler_cls": [ExponentialLR],
            "scheduler_kwargs": [
                {"gamma": 0.9},
                {"gamma": 0.95},
            ],
        },
        # TODO: investigate these suggestions from Copilot
        # {
        #     "scheduler_cls": [CosineAnnealingLR],
        #     "scheduler_kwargs": [{"T_max": 10, "eta_min": 0.01}],
        # },
        # {
        #     "scheduler_cls": [ReduceLROnPlateau],
        #     "scheduler_kwargs": [{"mode": "min", "factor": 0.1, "patience": 5}],
        # },
        # {
        #     "scheduler_cls": [StepLR],
        #     "scheduler_kwargs": [{"step_size": 10, "gamma": 0.1}],
        # },
        # {
        #     "scheduler_cls": [CyclicLR],
        #     "scheduler_kwargs": [{
        #         "base_lr": 1e-3,
        #         "max_lr": 1e-1,
        #         "step_size_up": 5,
        #         "step_size_down": 5,
        #     }],
        # },
        # {
        #     "scheduler_cls": [OneCycleLR],
        #     "scheduler_kwargs": [{
        #         "max_lr": 1e-1,
        #         # note: `self` is not available in a classmethod, so the
        #         # epoch count would need to be passed in or resolved elsewhere
        #         "total_steps": self.epochs,
        #         "pct_start": 0.3,
        #         "anneal_strategy": "linear",
        #     }],
        # },
    ]
    # Randomly select one of the parameter sets
    return random.choice(all_params)
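For context, a minimal sketch of how the result might be merged into a subclass's get_tune_config(), as the docstring suggests. The surrounding hyperparameter names and values here are illustrative assumptions, not the actual tuning config in base.py:

@classmethod
def get_tune_config(cls) -> dict:
    return {
        # illustrative model hyperparameters, not the real ones
        "lr": [1e-3, 1e-2],
        "batch_size": [16, 32],
        # merge in the randomly chosen scheduler class and kwargs
        **cls.scheduler_config(),
    }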
Drawbacks
We may not want to vary scheduler hyperparameters.
Unresolved questions
No response
Implementation PR
No response
Reference Issues
No response