Conversation

xuzhao9 (Contributor) commented Oct 31, 2025

torch.compile has a numeric issue when used with torch.amp.autocast and bias.

The problem goes away when:

  1. Not using amp, or
  2. Not using bias

#607

Reproduce:

```
LD_LIBRARY_PATH="$HOME/.conda/envs/py312/lib" python run.py --op mlp --metrics accuracy --use_bias --num-inputs 1
```

The issue is not actually related to amp. It is caused by the post_grad pass of PT2, which decomposes addmm into mm + a Triton add. This should be easy to reproduce with a simpler standalone test, e.g. the sketch below.
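Not from the PR itself, but a minimal sketch of the kind of standalone repro suggested above: a single bias Linear stands in for the MLP op, run under bf16 autocast and compared between eager and torch.compile. The layer sizes, dtype, and tolerances are arbitrary assumptions.

```python
import torch

torch.manual_seed(0)
device = "cuda"

# Assumption: an MLP-like layer with bias, run under bf16 autocast.
layer = torch.nn.Linear(4096, 4096, bias=True, device=device)
x = torch.randn(4096, 4096, device=device)

compiled = torch.compile(layer)

with torch.amp.autocast(device_type="cuda", dtype=torch.bfloat16):
    ref = layer(x)     # eager path: fused addmm
    out = compiled(x)  # compiled path: post_grad may split addmm into mm + add

# A large max difference (or allclose returning False) would reproduce the issue.
print(torch.max((out - ref).abs()))
print(torch.allclose(out, ref, rtol=1e-2, atol=1e-2))
```

If the mismatch disappears with `bias=False` or with the autocast block removed, that matches the two conditions listed above.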
