Commit f6727da
Raise XPU tolerances for bf16 ResNet & BotNet TorchBench (#170552)
Summary:
Multiple TorchBench models on XPU fail accuracy tests because the numeric tolerances are too strict rather than because of genuine accuracy regressions. Two contributing factors were identified:
1. A measurement methodology change (PyTorch 2.6.0 enforcing cosine_similarity, https://github.com/pytorch/pytorch/blob/main/benchmarks/dynamo/common.py#L2227) made the error check for phlippe_resnet more sensitive and surfaced its limitations.
2. BatchNorm decomposition noise (~1e-5 RMSE per BN in fp16) accumulates across iterations in botnet26t_256, pushing aggregate diffs beyond the current thresholds (see the noise-accumulation sketch below).
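A minimal sketch, not the TorchBench harness itself and with an assumed layer count, of how per-layer noise on the order of the ~1e-5 RMSE attributed to each decomposed BatchNorm can accumulate until a tight absolute tolerance fails even while cosine similarity stays high:

```python
import torch

torch.manual_seed(0)

x = torch.randn(1, 256)
ref = x.clone()
noisy = x.clone()

num_bn_layers = 30      # assumed depth, roughly botnet26t-sized; illustrative only
per_layer_rmse = 1e-5   # per-BN decomposition noise cited in the summary

# Inject a small perturbation per "layer" and let it propagate through the stack.
for _ in range(num_bn_layers):
    noise = torch.randn_like(noisy) * per_layer_rmse
    ref = torch.nn.functional.relu(ref)
    noisy = torch.nn.functional.relu(noisy + noise)

rmse = (ref - noisy).pow(2).mean().sqrt().item()
cos = torch.nn.functional.cosine_similarity(ref.flatten(), noisy.flatten(), dim=0).item()
print(f"aggregate RMSE: {rmse:.2e}, cosine similarity: {cos:.6f}")
print("allclose at 1e-5:", torch.allclose(ref, noisy, atol=1e-5, rtol=1e-5))
```

The aggregate error grows with depth, so a threshold tuned for a single BN can fail for the whole network even though the outputs remain directionally identical.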
**Analysis**
- phlippe_resnet failures reproduce on both CPU and XPU; fp16 already uses a higher tolerance, implying the bf16 thresholds are misaligned (see the sketch after this list for the kind of per-model override involved).
- Disabling BN decomposition brings botnet26t_256 outputs within tolerance; with decomposition enabled, cumulative numeric error is expected.
- CI health indicates the changes are non-disruptive; failures, where present, are unrelated to these PRs.
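The actual diff hunks are not reproduced on this page, so the following is only a hypothetical Python sketch of the kind of per-model, per-dtype tolerance override the patch makes under userbenchmark/dynamo/dynamobench; the set name, default, and relaxed values are illustrative, not the real ones:

```python
# Hypothetical sketch only: names and values below are illustrative,
# not the ones in the patched benchmark files.
BF16_XPU_HIGHER_TOLERANCE = {
    "phlippe_resnet",   # check became more sensitive under cosine_similarity
    "botnet26t_256",    # accumulated BatchNorm-decomposition noise
}

def accuracy_tolerance(model_name: str, dtype: str, device: str) -> float:
    """Return the relative tolerance used by the accuracy check (illustrative)."""
    tolerance = 1e-2  # assumed default
    if device == "xpu" and dtype == "bfloat16" and model_name in BF16_XPU_HIGHER_TOLERANCE:
        tolerance = 4e-2  # relaxed threshold for the affected bf16 models
    return tolerance
```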
Fixes intel/torch-xpu-ops#1799
Fixes intel/torch-xpu-ops#1305
X-link: pytorch/pytorch#170552
Approved by: https://github.com/EikanWang, https://github.com/desertfire
Reviewed By: seemethere
Differential Revision: D89434646
fbshipit-source-id: e5ce062b497201158578abb1bdebaac4b593dbfd
Co-authored-by: Tomasz Bohutyn <tbohutyn@habana.ai>
Parent: c65e4e7
File tree: 2 files changed (+11, -0 lines) under userbenchmark/dynamo/dynamobench