Conversation

@SurbhiJainUSC SurbhiJainUSC commented Jan 15, 2026

Description

Tests

Old distillation command (deprecated path, still works):

python3 -m MaxText.distillation.train_distill \
src/MaxText/configs/distillation.yml \
run_name=$RUN_NAME \
base_output_directory=$BASE_OUTPUT_DIRECTORY \
checkpoint_period=5 \
hf_access_token=$HF_TOKEN \
steps=10 \
save_checkpoint_on_completion=True \
teacher_overrides.load_parameters_path=gs://maxtext-model-checkpoints/llama3.1-8b/2025-01-17-04-13/scanned/0/items

The old command still works, but it prints a deprecation warning: 'src.MaxText.distillation.train_distill' is deprecated; use 'src.maxtext.trainers.post_train.distillation.train_distill' instead.
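A backward-compatibility shim of this kind typically emits a `DeprecationWarning` at import time and then re-exports the relocated module. The sketch below illustrates the mechanism only; the module names (`old_pkg`, `new_pkg`) are placeholders, not the actual MaxText implementation.

```python
import sys
import types
import warnings

# Stand-in for the relocated trainer module (placeholder name,
# not the real maxtext.trainers.post_train.distillation.train_distill).
new_mod = types.ModuleType("new_pkg.train_distill")
new_mod.train = lambda: "distilling"
sys.modules["new_pkg.train_distill"] = new_mod


def load_with_deprecation(old_name: str, new_name: str):
    """Warn that old_name is deprecated, then return the module at new_name."""
    warnings.warn(
        f"'{old_name}' is deprecated; use '{new_name}' instead.",
        DeprecationWarning,
        stacklevel=2,
    )
    return sys.modules[new_name]


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    shim = load_with_deprecation("old_pkg.train_distill", "new_pkg.train_distill")

print(shim.train())            # old path still reaches the new code: "distilling"
print(str(caught[0].message))  # the deprecation message users see
```

In the real shim the old module file would run the `warnings.warn(...)` call at the top and then `from <new module> import *`, so both `python3 -m` invocations execute the same code.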

New distillation command:

python3 -m maxtext.trainers.post_train.distillation.train_distill \
src/MaxText/configs/distillation.yml \
run_name=$RUN_NAME \
base_output_directory=$BASE_OUTPUT_DIRECTORY \
checkpoint_period=5 \
hf_access_token=$HF_TOKEN \
steps=10 \
save_checkpoint_on_completion=True \
teacher_overrides.load_parameters_path=gs://maxtext-model-checkpoints/llama3.1-8b/2025-01-17-04-13/scanned/0/items

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code. For an optional AI review, add the gemini-review label.
  • I have necessary comments in my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above, if applicable.
  • I have made or will make corresponding changes to the doc if needed, including adding new documentation pages to the relevant Table of Contents (toctree directive) as explained in our documentation.

@SurbhiJainUSC force-pushed the distillation_restructure branch 2 times, most recently from 5445a78 to 432e882 on January 15, 2026, 23:21

codecov bot commented Jan 15, 2026

Codecov Report

❌ Patch coverage is 44.06780% with 99 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| .../trainers/post_train/distillation/train_distill.py | 44.06% | 94 Missing and 5 partials ⚠️ |


@SurbhiJainUSC force-pushed the distillation_restructure branch from 432e882 to 1bce907 on January 16, 2026, 01:27

shralex commented Jan 16, 2026

@codecov-ai-reviewer review

@github-actions

🤖 Hi @SurbhiJainUSC, I've received your request, and I'm working on it now! You can track my progress in the logs for more details.

@github-actions github-actions bot left a comment


📋 Review Summary

This Pull Request successfully moves the distillation module to a new location (src/maxtext/trainers/post_train/distillation) as part of the repo restructuring. The changes are well-implemented with backward compatibility in mind through a shim, and relevant configuration files have been updated.

🔍 General Feedback

  • The approach of using a shim for backward compatibility is a good practice to minimize disruption during code migration.
  • The updates to .coveragerc, codecov.yml, and GitHub Actions workflows (run_pathways_tests.yml, run_tests_against_package.yml) are thorough and correctly reflect the new module structure.

@gagika gagika left a comment


Thanks
