
Conversation

@ahmed-bsod
Contributor

Motivation

Changed test_silu_mul_quant_fuse and test_rsnorm_quant_fuse to use PyTorch kernels in their reference functions instead of aiter non-Triton kernels.
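
For illustration only, a minimal sketch of what PyTorch-based reference functions of this kind might look like; the function names, the eps default, and the float32 accumulation are assumptions, not the PR's actual code:

```python
import torch
import torch.nn.functional as F


def ref_silu_and_mul(x: torch.Tensor) -> torch.Tensor:
    # Hypothetical reference: split the last dimension into gate and up
    # halves, then compute SiLU(gate) * up with plain PyTorch ops,
    # avoiding any aiter non-Triton kernels.
    gate, up = x.chunk(2, dim=-1)
    return F.silu(gate) * up


def ref_rmsnorm(x: torch.Tensor, weight: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    # Hypothetical reference RMSNorm: normalize by the root mean square
    # over the last dimension (accumulated in float32 for stability),
    # then apply the learned per-channel scale.
    variance = x.float().pow(2).mean(dim=-1, keepdim=True)
    return (x.float() * torch.rsqrt(variance + eps)).to(x.dtype) * weight
```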

@ahmed-bsod ahmed-bsod requested review from a team and Copilot on January 22, 2026 at 23:04
@ahmed-bsod ahmed-bsod force-pushed the ahmed-bsod/test_fused_fp8_quant branch from 4e6eda5 to 4975dd7 on January 22, 2026 at 23:04
Copilot AI (Contributor) left a comment

Pull request overview

This PR refactors the test reference functions to use PyTorch kernels instead of aiter non-Triton kernels, improving portability and maintainability, and adds support for the gfx1250 GPU architecture.

Changes:

  • Replaced aiter-specific rmsnorm2d_fwd and silu_and_mul functions with PyTorch equivalents (torch.nn.functional.silu and a custom rmsnorm implementation)
  • Added gfx1250 architecture support with the float8_e4m3fn dtype (see the sketch after this list)
  • Simplified function signatures by removing unnecessary parameters
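
A rough sketch of the kind of per-architecture dtype mapping this change implies; the table name, the lookup helper, and the gfx942 entry are assumptions for illustration, not the actual contents of aiter/utility/dtypes.py:

```python
import torch

# Hypothetical per-architecture fp8 dtype table. The PR adds a gfx1250
# entry mapped to the OCP float8_e4m3fn format; earlier AMD architectures
# such as gfx942 commonly use the float8_e4m3fnuz variant instead.
FP8_DTYPE_BY_ARCH = {
    "gfx942": torch.float8_e4m3fnuz,  # assumed pre-existing entry
    "gfx1250": torch.float8_e4m3fn,   # new entry described by this PR
}


def get_fp8_dtype(arch: str) -> torch.dtype:
    # Hypothetical lookup helper; fails loudly for unsupported targets.
    try:
        return FP8_DTYPE_BY_ARCH[arch]
    except KeyError:
        raise ValueError(f"no fp8 dtype registered for {arch}") from None
```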

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 3 comments.

  • op_tests/triton_tests/quant/test_fused_fp8_quant.py: Refactored reference functions to use PyTorch kernels; removed aiter imports and simplified test code
  • aiter/utility/dtypes.py: Added gfx1250 architecture with fp8 dtype support
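
For context, a hedged sketch of how such a test might compare the fused Triton kernel's output against a PyTorch reference; the function name and the tolerances are placeholders, not the repository's actual test code:

```python
import torch


def assert_matches_reference(out_triton: torch.Tensor, out_ref: torch.Tensor) -> None:
    # Compare in float32 with loose tolerances, since fp8 quantization
    # introduces rounding error that exact equality would not survive.
    torch.testing.assert_close(
        out_triton.to(torch.float32),
        out_ref.to(torch.float32),
        atol=1e-1,
        rtol=1e-1,
    )
```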


@ahmed-bsod ahmed-bsod force-pushed the ahmed-bsod/test_fused_fp8_quant branch from 4975dd7 to 4c16a9f on January 22, 2026 at 23:11
@ahmed-bsod ahmed-bsod force-pushed the ahmed-bsod/test_fused_fp8_quant branch from 4c16a9f to 5bff35a on January 22, 2026 at 23:11
@azaidy azaidy (Contributor) left a comment

LGTM!
