
Create rule S6981: Gradients should be scaled when using mixed precision #3966


Draft
ghislainpiot wants to merge 2 commits into master from rule/add-RSPEC-S6981

Conversation

github-actions bot (Contributor) commented Jun 4, 2024

You can preview this rule here (updated a few minutes after each push).

Review

A dedicated reviewer checked the rule description successfully for:

  • logical errors and incorrect information
  • information gaps and missing content
  • text style and tone
  • PR summary and labels follow the guidelines

@ghislainpiot force-pushed the rule/add-RSPEC-S6981 branch from 4cfbb88 to 0660c9d on June 5, 2024 09:24
@joke1196 (Contributor) left a comment

LGTM. Just a small change to make the CI pass. And a suggestion for the example. Nothing major.


If the gradients underflow, the model might not learn properly and the training might be unstable.

== How to fix it in Pytorch
A Contributor commented:

This should be PyTorch; it is in the list of potential frameworks. But I think as long as we don't have the rule specified for multiple frameworks, we should just leave it with a simple How to fix it.
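
As a side note on the quoted description line above: the underflow it warns about is easy to demonstrate, because float16 cannot represent magnitudes below roughly 6e-8. A minimal sketch (the value 1e-8 is an illustrative stand-in for a small gradient, not taken from the rule description):

import torch

# float16 cannot represent magnitudes below ~6e-8 (its smallest subnormal),
# so tiny gradient values simply flush to zero when kept in half precision.
grad_fp32 = torch.tensor(1e-8, dtype=torch.float32)  # illustrative value, not from the PR
grad_fp16 = grad_fp32.to(torch.float16)

print(grad_fp32.item())  # ~1e-08
print(grad_fp16.item())  # 0.0 -- the gradient has underflowed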

with torch.autocast(device_type="cuda"):
    output = model(x)
    loss = torch.nn.functional.cross_entropy(output, y)
loss.backward()
A Contributor commented:
Here maybe we could just add the Noncompliant comment
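
For context (this is not part of the rule text under review): the compliant counterpart of the snippet above would typically route the backward pass and optimizer step through torch.cuda.amp.GradScaler, which scales the loss so small gradients stay representable in float16. A minimal sketch, assuming model, optimizer, x, and y are defined as in the example:

import torch

# model, optimizer, x, and y are assumed to be defined as in the snippet above
scaler = torch.cuda.amp.GradScaler()  # keeps small gradients representable in half precision

with torch.autocast(device_type="cuda"):
    output = model(x)
    loss = torch.nn.functional.cross_entropy(output, y)

scaler.scale(loss).backward()  # backward pass on the scaled loss
scaler.step(optimizer)         # unscales gradients before the optimizer step
scaler.update()                # adjusts the scale factor for the next iteration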


sonarqube-next bot commented Jun 5, 2024

Quality Gate passed for 'rspec-frontend'

Issues
0 New issues
0 Fixed issues
0 Accepted issues

Measures
0 Security Hotspots
No data about Coverage
No data about Duplication

See analysis details on SonarQube


sonarqube-next bot commented Jun 5, 2024

Quality Gate passed for 'rspec-tools'

Issues
0 New issues
0 Fixed issues
0 Accepted issues

Measures
0 Security Hotspots
No data about Coverage
No data about Duplication

See analysis details on SonarQube

@ghislainpiot requested a review from joke1196 on June 5, 2024 12:41
@joke1196 (Contributor) left a comment

LGTM! Don't forget to add a title to the PR.

@jean-jimbo-sonarsource left a comment

An implementation challenge, but possibly a good example of an issue to write a blog post about?

@ghislainpiot changed the title from "Create rule S6981" to "Create rule S6981: Print statements should not be used in production code" on Jul 24, 2025
@ghislainpiot changed the title from "Create rule S6981: Print statements should not be used in production code" to "Create rule S6981: Gradients should be scaled when using mixed precision" on Jul 24, 2025
@ghislainpiot force-pushed the rule/add-RSPEC-S6981 branch from 1b9243b to 36a7291 on July 24, 2025 15:05
sonarqube-next bot commented

Quality Gate passed for 'rspec-frontend'

Issues
0 New issues
0 Fixed issues
0 Accepted issues

Measures
0 Security Hotspots
0 Dependency risks
No data about Coverage
No data about Duplication

See analysis details on SonarQube
