It would be great if Tango provided a simple yet general mechanism for doing hyperparameter searches. Here's an outline of how that could look 👇

We provide a new subcommand: `tango sweep`. This command takes the target experiment config, the name of a step within it, and a sweep config; for example, something like `tango sweep target-config.jsonnet step-name sweep-config.jsonnet`.

`step-name` should correspond to the main step of interest in `target-config.jsonnet` that provides the results we are trying to optimize for (might require #142). For example, this could be a validation/eval step that spits out some metrics of your model on a dataset.
The `sweep-config.jsonnet` would define which hyperparameters to search and how to search over them. By "hyperparameters" I really just mean any fields in `target-config.jsonnet`. There are many ways we could do the search, and this is an active area of research, so I think it would be ideal if we were able to integrate with existing hyperparameter sweep frameworks/platforms, like W&B, Optuna, etc. These integrations should be optional, however, and I think we should provide a simple default search method, which could just be grid search.
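For concreteness, here is a rough sketch of what the default grid search could expand a sweep spec into. The spec shape and the dotted field paths are made up for illustration; nothing here is an existing Tango format.

```python
import itertools
import json

# Hypothetical sweep spec: dotted paths into target-config.jsonnet,
# each mapped to the list of values to try. A real sweep-config.jsonnet
# could compile down to a structure like this.
sweep_spec = {
    "steps.train.optimizer.lr": [1e-5, 3e-5, 1e-4],
    "steps.train.batch_size": [16, 32],
}

def grid(spec):
    """Yield one {field: value} dict per point on the grid."""
    keys = list(spec)
    for values in itertools.product(*(spec[k] for k in keys)):
        yield dict(zip(keys, values))

for overrides in grid(sweep_spec):
    # Each grid point becomes a JSON string suitable for --overrides.
    print(json.dumps(overrides))
```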
Under the hood, `tango sweep` could use the `tango run` subcommand with the `--overrides` parameter to select hyperparameter values. We should also be able to run the search in parallel.
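A naive driver for that could just fan grid points out over a worker pool, one `tango run` child process per point. This is only a sketch: it assumes `--overrides` accepts a JSON string as described above, and the config path and field names are placeholders.

```python
import json
import subprocess
from concurrent.futures import ThreadPoolExecutor

def run_point(overrides: dict) -> int:
    # Shell out to `tango run`, pinning this grid point's values
    # via the (assumed) --overrides flag.
    cmd = [
        "tango", "run", "target-config.jsonnet",
        "--overrides", json.dumps(overrides),
    ]
    return subprocess.run(cmd).returncode

points = [
    {"steps.train.optimizer.lr": lr, "steps.train.batch_size": bs}
    for lr in (1e-5, 3e-5, 1e-4)
    for bs in (16, 32)
]

# Threads suffice here: each worker just blocks on its child process.
with ThreadPoolExecutor(max_workers=4) as pool:
    exit_codes = list(pool.map(run_point, points))
```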
This feature would be very helpful. With allennlp I had been using the ability to register a new subcommand to add a `train-with-wandb` command and use it with sweeps on W&B. All that command did was translate the hyperparameters supplied by the wandb server into a JSON string that could be passed as `--overrides` in a top-level call to the function responsible for training with the `allennlp train` command.

However, I could not find a way to add a new subcommand with the click-based CLI for tango. Is there a way I could replicate the setup mentioned above and call tango's `_run()` function from my subcommand?

Is using forwarding, as described in the click documentation, a good candidate?
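For reference, the kind of bridge I have in mind looks roughly like the sketch below. The `--lr`/`--batch-size` options and the config path are placeholders, it assumes the `--overrides` flag discussed above, and it shells out to the `tango run` CLI rather than calling the internal `_run()`:

```python
import json
import subprocess

import click

@click.command(name="train-with-wandb")
@click.argument("config", type=click.Path(exists=True))
@click.option("--lr", type=float, required=True)
@click.option("--batch-size", type=int, required=True)
def train_with_wandb(config: str, lr: float, batch_size: int) -> None:
    """Translate hyperparameters passed by the W&B agent into an
    --overrides JSON string and hand off to `tango run`."""
    overrides = json.dumps({
        "steps.train.optimizer.lr": lr,
        "steps.train.batch_size": batch_size,
    })
    subprocess.run(
        ["tango", "run", config, "--overrides", overrides],
        check=True,
    )

if __name__ == "__main__":
    train_with_wandb()
```

As far as I can tell from the click docs, `ctx.invoke`/`ctx.forward` only forward between commands registered in the same click application, so shelling out may be the simpler route for a command that lives outside tango's own CLI.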