
[AutoParallel] Change the trigger position of autodp fake r to s #74243


Open

Difers wants to merge 1 commit into develop

Conversation

@Difers (Contributor) commented on Jul 25, 2025

PR Category

Auto Parallel

PR Types

Bug fixes

Description

Pcard-73145
Move auto dp's replicate_grad_to_partial conversion ahead of the sharding optimization, so that it does not interfere with subsequent optimization passes.
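
A minimal sketch (not the actual Paddle source) of the reordering described above: the guarded conversion is hoisted so that it runs before the sharding optimization. The auto_dp_utils calls are the ones visible in the diff below; run_sharding_optimization is a hypothetical placeholder for the real sharding pass.

from paddle.distributed.auto_parallel import auto_dp_utils

def apply_optimization_passes(params_grads, auto_dp):
    # After this PR: convert fake-replicated grads to partial first, so the
    # sharding optimization and any later passes see the correct dist attrs.
    if auto_dp and auto_dp_utils.need_convert_grad_for_auto_dp():
        auto_dp_utils._convert_fake_replicate_grad_to_partial(params_grads)
    run_sharding_optimization(params_grads)  # hypothetical placeholder
    return params_grads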

paddle-bot commented on Jul 25, 2025

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI results first. See the Paddle CI Manual for details.

if auto_dp:
    if (
        paddle.distributed.auto_parallel.auto_dp_utils.need_convert_grad_for_auto_dp()
    ):
        # Convert gradients that are only nominally replicated ("fake
        # replicate") to partial for auto dp.
        paddle.distributed.auto_parallel.auto_dp_utils._convert_fake_replicate_grad_to_partial(
            params_grads
        )
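
(This is the call site the review thread below discusses; per the reply, it is kept so that auto dp still works when shard_optimizer is never called, while the same conversion now also runs earlier, before the sharding optimization.)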
Contributor commented:

Is _convert_fake_replicate_grad_to_partial still needed here? The conversion has already been done earlier, so at this point no scenario should need replicated-to-partial.

Contributor replied:

> Is _convert_fake_replicate_grad_to_partial still needed here? The conversion has already been done earlier, so at this point no scenario should need replicated-to-partial.

It is still needed here. Originally, auto dp mode could be used without calling shard_optimizer; if this call were removed, calling shard_optimizer would become mandatory for using auto dp.
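
A minimal sketch of the situation the reply describes, assuming need_convert_grad_for_auto_dp() returns False once the gradients have already been converted (which is what makes keeping both call sites safe):

from paddle.distributed.auto_parallel import auto_dp_utils

def _maybe_convert_for_auto_dp(params_grads):
    # Assumed idempotent: a no-op when an earlier call site has already
    # converted the fake-replicated gradients to partial.
    if auto_dp_utils.need_convert_grad_for_auto_dp():
        auto_dp_utils._convert_fake_replicate_grad_to_partial(params_grads)

# Reached both from the new early site (before the sharding optimization)
# and from the site discussed above, so auto dp keeps working even when
# shard_optimizer is never called.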

@codecov-commenter commented:

Codecov Report

Attention: Patch coverage is 70.00000% with 3 lines in your changes missing coverage. Please review.

Please upload report for BASE (develop@98204ab). Learn more about missing BASE report.

Files with missing lines                                 Patch %   Lines
python/paddle/distributed/auto_parallel/api.py           50.00%    2 Missing ⚠️
.../paddle/distributed/auto_parallel/auto_dp_utils.py    80.00%    1 Missing ⚠️

❌ Your patch status has failed because the patch coverage (70.00%) is below the target coverage (90.00%). You can increase the patch coverage or adjust the target coverage.

Additional details and impacted files
@@            Coverage Diff             @@
##             develop   #74243   +/-   ##
==========================================
  Coverage           ?   70.00%           
==========================================
  Files              ?        3           
  Lines              ?       10           
  Branches           ?        0           
==========================================
  Hits               ?        7           
  Misses             ?        3           
  Partials           ?        0           

☔ View full report in Codecov by Sentry.

Labels: None yet
Projects: None yet
4 participants