
Commit d106fe7

gitttt-1234claude and claude committed
Fix confusing weight loading logging for legacy models
When loading backbone/head weights separately from a legacy .h5 file, the logging showed confusing counts like "22/26 loaded" instead of "22/22", because it used the total number of legacy weights as the denominator.

Changes:
- Use the PyTorch model's parameter count as the denominator instead of the total legacy weight count
- Downgrade "No matching PyTorch parameter" messages from warning to debug level, since mismatches are expected when loading components separately
- Add a debug message showing the count of unmatched legacy weights

Before: "Successfully mapped 22/26 legacy weights to PyTorch parameters"
After: "Successfully mapped 22/22 PyTorch parameters from legacy weights"

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
1 parent 699a8e2 commit d106fe7

File tree

1 file changed (+7, −2 lines)


sleap_nn/legacy_models.py

Lines changed: 7 additions & 2 deletions
@@ -260,7 +260,7 @@ def map_legacy_to_pytorch_layers(
         if matching_pytorch_name:
             mapping[legacy_path] = matching_pytorch_name
         else:
-            logger.warning(f"No matching PyTorch parameter found for {legacy_path}")
+            logger.debug(f"No matching PyTorch parameter found for {legacy_path}")
 
     # Log mapping results
     if not mapping:
@@ -270,8 +270,13 @@ def map_legacy_to_pytorch_layers(
         )
     else:
         logger.info(
-            f"Successfully mapped {len(mapping)}/{len(legacy_weights)} legacy weights to PyTorch parameters"
+            f"Successfully mapped {len(mapping)}/{len(pytorch_params)} PyTorch parameters from legacy weights"
         )
+        unmatched_count = len(legacy_weights) - len(mapping)
+        if unmatched_count > 0:
+            logger.debug(
+                f"({unmatched_count} legacy weights did not match any parameters in this model component)"
+            )
 
     return mapping
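The corrected counting logic can be sketched as a small standalone example. This is a simplified stand-in, not the actual sleap_nn implementation: the dict shapes and the slash-to-dot name matching are illustrative assumptions, but the denominator choice and log levels mirror the fix above.

```python
import logging

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger("legacy_models")


def map_legacy_to_pytorch(legacy_weights: dict, pytorch_params: dict) -> dict:
    """Map legacy weight paths to PyTorch parameter names (simplified stand-in)."""
    mapping = {}
    for legacy_path in legacy_weights:
        # Hypothetical matcher: a legacy "a/b/c" path becomes a PyTorch "a.b.c" name.
        candidate = legacy_path.replace("/", ".")
        if candidate in pytorch_params:
            mapping[legacy_path] = candidate
        else:
            # Debug, not warning: misses are expected when loading components separately.
            logger.debug(f"No matching PyTorch parameter found for {legacy_path}")
    if mapping:
        # Denominator is the PyTorch parameter count, so a complete load reads "N/N".
        logger.info(
            f"Successfully mapped {len(mapping)}/{len(pytorch_params)} "
            "PyTorch parameters from legacy weights"
        )
        unmatched = len(legacy_weights) - len(mapping)
        if unmatched > 0:
            logger.debug(
                f"({unmatched} legacy weights did not match any parameters "
                "in this model component)"
            )
    return mapping
```

With three legacy weights and a two-parameter model component, this logs "2/2" rather than the old "2/3", while the extra legacy weight is reported only at debug level.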
