Fix an issue when piping attn_logits_soft_cap through in vllm. (#8600) #10851

Triggered via push: January 22, 2025 21:07
Status: Failure
Total duration: 1h 43m 3s
Artifacts: 2

Jobs

get-torch-commit: 1s
Build PyTorch/XLA / build: 33m 48s
Matrix: CPU tests / test

Annotations

1 error and 2 warnings
Error: CPU tests / test (python_tests, torch_mp_op)
  Process completed with exit code 134.
Warning: get-torch-commit
  ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
Warning: Build docs / build-docs
  ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636

Artifacts

Produced during runtime
Name               Size
cpp-test-bin       670 MB
torch-xla-wheels   224 MB