
Commit bdd853f

gitttt-1234 and claude authored
Mask wandb API key in initial_config.yaml (#372)
## Summary

- Fixed security issue where wandb API key was being saved to `initial_config.yaml` in plain text
- Added API key masking for `_initial_config` alongside existing masking for `training_config.yaml`
- Added test to verify API key is properly masked in both config files

## Details

Previously, only `training_config.yaml` had the wandb API key masked (set to empty string) before saving to disk. However, `initial_config.yaml` retained the actual API key because:

- `_initial_config` is created at line 141 (before API key masking)
- API key masking happens at line 723 only for `self.config`
- `_initial_config` is saved at lines 985-988 without masking

## Changes

- **model_trainer.py (lines 725-726)**: Added API key masking for `_initial_config`
- **test_model_trainer.py (lines 384-388)**: Added assertion to verify API key is masked in `initial_config.yaml`

## Test plan

- [x] Test passes: `test_model_trainer_centered_instance` now verifies both config files have masked API keys
- [x] Linting passes
- [ ] CI/CD pipeline passes

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-authored-by: Claude <[email protected]>
1 parent dacd184 commit bdd853f
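The diffs below add the masking itself; the save step the summary points to (lines 985-988) is not part of this commit's diff. For context, here is a minimal sketch of what that save step roughly looks like, assuming `OmegaConf.save` and the `ckpt_dir/run_name` layout used in the test further down. The function name and signature are illustrative, not taken from the repo.

```python
from pathlib import Path

from omegaconf import OmegaConf


def save_configs(trainer):
    """Rough sketch of the config-saving step referenced at lines 985-988.

    The run-directory layout is inferred from the test below; this helper
    itself is hypothetical and not part of sleap_nn.
    """
    run_dir = (
        Path(trainer.config.trainer_config.ckpt_dir)
        / trainer.config.trainer_config.run_name
    )
    OmegaConf.save(trainer.config, run_dir / "training_config.yaml")
    if trainer._initial_config is not None:
        # Before this commit, the wandb API key had not been cleared on
        # `_initial_config`, so the real key ended up in this file.
        OmegaConf.save(trainer._initial_config, run_dir / "initial_config.yaml")
```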

File tree

2 files changed: +9 −0 lines changed

- sleap_nn/training/model_trainer.py
- tests/training/test_model_trainer.py


sleap_nn/training/model_trainer.py

Lines changed: 3 additions & 0 deletions
```diff
@@ -720,7 +720,10 @@ def _setup_loggers_callbacks(self, viz_train_dataset, viz_val_dataset):
             loggers.append(wandb_logger)

         # save the configs as yaml in the checkpoint dir
+        # Mask API key in both configs to prevent saving to disk
         self.config.trainer_config.wandb.api_key = ""
+        if self._initial_config is not None:
+            self._initial_config.trainer_config.wandb.api_key = ""

         # zmq callbacks
         if self.config.trainer_config.zmq.controller_port is not None:
```

tests/training/test_model_trainer.py

Lines changed: 6 additions & 0 deletions
```diff
@@ -381,6 +381,12 @@ def test_model_trainer_centered_instance(caplog, config, tmp_path: str):
     assert training_config.data_config.skeletons
     assert training_config.data_config.preprocessing.crop_size == 104

+    # Verify API key is also masked in initial_config.yaml
+    initial_config = OmegaConf.load(
+        f"{model_trainer.config.trainer_config.ckpt_dir}/{model_trainer.config.trainer_config.run_name}/initial_config.yaml"
+    )
+    assert initial_config.trainer_config.wandb.api_key == ""
+
     checkpoint = torch.load(
         (
             Path(model_trainer.config.trainer_config.ckpt_dir)
```
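The same check can be applied to any existing run directory. Below is a standalone sketch mirroring the assertions above; the helper is not part of the repo, but the file names and the `trainer_config.wandb.api_key` path come straight from the diffs.

```python
from pathlib import Path

from omegaconf import OmegaConf


def assert_api_key_masked(run_dir: str) -> None:
    """Assert that neither saved config in `run_dir` contains a wandb API key."""
    for name in ("training_config.yaml", "initial_config.yaml"):
        cfg = OmegaConf.load(Path(run_dir) / name)
        assert cfg.trainer_config.wandb.api_key == "", f"API key leaked in {name}"
```

With this commit applied, calling it on the run directory (`ckpt_dir/run_name`) after training should pass for both files.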
