Test failure: baseservices/exceptions/stackoverflow/stackoverflowtester/stackoverflowtester.cmd #110173
Failed in: runtime-coreclr outerloop 20241127.3
Failed in: runtime-coreclr outerloop 20241202.1
Failed in: runtime-coreclr outerloop 20241203.2
Failed in: runtime-coreclr outerloop 20241204.3
This issue is also affecting optional pipelines.
CC @mangod9.
Failed in: runtime-coreclr pgostress 20241206.1
Failed in: runtime-coreclr r2r-extra 20241208.1
Failed in: runtime-coreclr pgo 20241211.1
Failed in: runtime-coreclr outerloop 20241212.3
Failed in: runtime-coreclr pgo 20241215.1
Failed in: runtime-coreclr r2r 20241216.1
Failed in: runtime-coreclr outerloop 20241219.3
Failed in: runtime-coreclr jitstressregs 20241221.1
Failed in: runtime-coreclr r2r-extra 20241222.1
Failed in: runtime-coreclr outerloop 20241224.3
Failed in: runtime-coreclr pgo 20241225.1
Failed in: runtime-coreclr crossgen2 20241225.1
Failed in: runtime-coreclr pgostress 20241227.1
Failed in: runtime-coreclr outerloop 20241230.3
Failed in: runtime-coreclr jitstress 20241231.1
Failed in: runtime-coreclr crossgen2 20250104.1
Failed in: runtime-coreclr outerloop 20250107.7
Failed in: runtime-coreclr crossgen2 20250107.1
Failed in: runtime-coreclr jitstress 20250108.2
Failed in: runtime-coreclr pgostress 20250110.1
cc @janvorli, could you take a look at this? It looks like some form of hang.
So far it seems to be Linux-only.
I have looked at this in the past. The infinite loop (the hang @jakobbotsch mentions) is sometimes intentional: it happens when another thread is already handling a stack overflow. We don't allow multiple threads to handle it, and the thread that was handling the stack overflow should have terminated the process.
Failed in: runtime-coreclr pgo 20250113.1
Failed in: runtime-coreclr outerloop 20250115.4
Failed in: runtime-coreclr jitstress 20250115.2
Failed in: runtime-coreclr pgostress 20250117.1
Failed in: runtime-coreclr outerloop 20250120.4
Failed in: runtime-coreclr outerloop 20250121.3
@janvorli - were you able to add the checks? This failed again in this week's run.
Not yet, but it is on my TODO list; I'll get to it as soon as I can.
Failed in: runtime-coreclr outerloop 20250123.3
Failed in: runtime-coreclr jitstress 20250123.4
Failed in: runtime-coreclr jitstress2-jitstressregs 20250126.1
I've created PR #111867 with additional logging to help further investigation.
Failed in: runtime-coreclr pgostress 20250131.2
Failed in: runtime-coreclr jitstress 20250204.1
Failed in: runtime-coreclr outerloop 20250206.1
@kunalspathak, even with the added logging I was unable to figure out how this can occur. I can see that it detects the stack overflow being hit on the main thread a second time, but I don't see how that could happen. The added logging indicates that we never return from the SIGSEGV handler, which is the only way I can imagine we would hit the SIGSEGV at the same place. The stack trace dumped from the hung process suggests that the second SIGSEGV came from the managed code of the test. I have tried to reproduce it locally on both x64 and arm64 Linux and was unable to get a repro even in 10000 iterations. One more idea, which I'll try to create a PR for today, is to trigger actual dump generation when we detect that we got the SIGSEGV on the same thread again. Maybe that will reveal more.
Failed in: runtime-coreclr outerloop 20241125.3

Known issue validation
Build: https://dev.azure.com/dnceng-public/public/_build/results?buildId=878786
Error message validated: baseservices/exceptions/baseservices-exceptions/../stackoverflow/stackoverflowtester/stackoverflowtester.sh Timed Out (timeout in milliseconds: 600000 from variable __TestTimeout)
Result validation: ✅ Known issue matched with the provided build.
Validation performed at: 12/4/2024 10:05:14 PM UTC