Conversation

@gitttt-1234
Collaborator

This PR fixes checkpoint loading in the inference pipeline: `pretrained_backbone_weights` should not be reapplied when constructing the Lightning module for inference. Only the trained checkpoint state is loaded, which avoids accelerator/device mapping errors (e.g., trained on GPU, inferred on CPU) that arose from tying pretrained-weight loading to `trainer_accelerator`. This makes inference device-agnostic.
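For illustration, here is a minimal sketch of the fix's logic using plain dicts in place of real state dicts. All names (`build_state_for_inference`, the checkpoint layout) are hypothetical, not the repo's actual API:

```python
def build_state_for_inference(checkpoint, pretrained_backbone_weights=None):
    """Assemble model weights for inference from a trained checkpoint.

    The fix: pretrained_backbone_weights is accepted for signature
    compatibility but deliberately ignored. The trained checkpoint already
    contains the final backbone weights, and reapplying the pretrained file
    at inference time tied loading to the training accelerator, causing
    device-mapping errors (e.g., GPU-trained checkpoint, CPU inference).
    """
    # Load only the trained weights; do not touch pretrained weights here.
    return dict(checkpoint["state_dict"])


# Usage: the pretrained backbone weights no longer overwrite anything.
ckpt = {"state_dict": {"backbone.w": 1.0, "head.w": 2.0}}
state = build_state_for_inference(
    ckpt, pretrained_backbone_weights={"backbone.w": 0.0}
)
assert state == {"backbone.w": 1.0, "head.w": 2.0}
```

In real PyTorch code the same idea pairs naturally with `torch.load(ckpt_path, map_location="cpu")`, which remaps GPU-saved tensors onto whatever device is running inference.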

@codecov

codecov bot commented Aug 22, 2025

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 95.39%. Comparing base (ff91433) to head (5011a3d).
⚠️ Report is 20 commits behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #305      +/-   ##
==========================================
+ Coverage   95.28%   95.39%   +0.10%     
==========================================
  Files          49       49              
  Lines        6765     6990     +225     
==========================================
+ Hits         6446     6668     +222     
- Misses        319      322       +3     

@gitttt-1234 gitttt-1234 merged commit 78b90e1 into main Aug 22, 2025
8 checks passed
@gitttt-1234 gitttt-1234 deleted the divya/fix-loading-weights-inference branch August 22, 2025 18:42
@gitttt-1234 gitttt-1234 restored the divya/fix-loading-weights-inference branch August 24, 2025 16:44
@gitttt-1234 gitttt-1234 deleted the divya/fix-loading-weights-inference branch October 28, 2025 17:33