Add memory reporting for UI tests and exit for E4Testable on OOM #2433
base: master
Conversation
Looks like the OOMs are now happening later, after the UI test suite is executed.
The last memory output looks like
The only test after that was
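The memory reporting discussed here could look like the following minimal sketch: a helper that prints used/total/max heap after each test, so the log shows where memory grows. The class and method names are illustrative, not the actual code from this PR.

```java
// Hypothetical helper: report current heap usage so the test log shows
// memory consumption over time. Not the actual code from this PR.
class MemoryReport {
    static String usage() {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        long used = rt.totalMemory() - rt.freeMemory();
        // max is the -Xmx limit; total is what the JVM has allocated so far
        return String.format("used=%d MB, total=%d MB, max=%d MB",
                used / mb, rt.totalMemory() / mb, rt.maxMemory() / mb);
    }
}
```

A test suite could print this from a teardown method to produce the "last memory output" mentioned above.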
Force-pushed from 55d06ea to cc079cf
Thank you for investing time to track down this problem.
The last build had no OOMs: https://ci.eclipse.org/platform/job/eclipse.platform.ui/job/PR-2433/2/#showFailuresLink I guess the OOM problem could be related to / fixed by eclipse-jdt/eclipse.jdt.core#3126. I saw that eclipse-jdt/eclipse.jdt.core@9c11818 broke a lot of things in our company-internal tests, including endless loops and test crashes. However, our tests rely a lot on JDT, while the platform UI tests only use JDT indirectly in one or two tests... But that could just be coincidence; I will retrigger the tests once again.
See https://ci.eclipse.org/platform/job/eclipse.platform.ui/job/PR-2433/3/
Hmm, interestingly there are OOM errors on #2438.
Neither the test execution order nor the bundle build time is fixed, which makes it almost impossible to tell why the OOM happens on Jenkins, IMO. My way of working towards this is to reduce tests to the minimum setup needed and to reduce the usage of older versions of libraries, hoping that at some point this pays off not only in easier-to-understand tests but also in less stress on the build machines.
Not sure why you think the test execution order is not fixed; it seems to be defined by the UiTestSuite, at least that is what I observe in the log.
I don't think this is applicable here. So far it looks like the OOMs appear at the end of the suite, and none of the measurements printed so far showed any excessive memory use at all. So either we have *something* that hits memory at the end of the tests (what???), or the JVM is too lazy to run GC in time and crashes with OOMs just because the GC has no free thread/CPU core to do the work. The latter would match the observation that we have a lot of blocked threads, and therefore also a lot of failures due to eclipse-platform/eclipse.platform#1592. So far I have seen no OOMs after adding explicit gc() calls on teardown in this PR. If so, adding explicit gc() calls could stabilize test execution on the poor VMs we have.
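The explicit gc() call on teardown mentioned above could be sketched like this (a hypothetical helper, not the actual PR code). Note that `System.gc()` is only a hint to the JVM, which is exactly why a lazy GC on a loaded machine is a plausible explanation:

```java
// Hypothetical teardown helper: request a GC and report how much heap
// was reclaimed. System.gc() is only a hint; the JVM may ignore it.
class TeardownGc {
    static long gcAndReport() {
        Runtime rt = Runtime.getRuntime();
        long before = rt.totalMemory() - rt.freeMemory();
        System.gc();
        long after = rt.totalMemory() - rt.freeMemory();
        System.out.println("GC freed ~" + (before - after) / (1024 * 1024) + " MB");
        return after; // used heap after the GC request
    }
}
```

A JUnit teardown (e.g. an `@After` method) could simply call `TeardownGc.gcAndReport()`.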
"From version 4.11, JUnit will by default use a deterministic, but not predictable, order." from https://github.com/junit-team/junit4/wiki/test-execution-order . So as long as nothing changes, the order stays the same, but whenever there is some change, the order can become totally different (in my experience).
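The "deterministic, but not predictable" behavior can be illustrated with a small sketch. My understanding (an assumption based on reading the JUnit 4 sources) is that `MethodSorters.DEFAULT` orders test methods by a hash of the method name, so adding or renaming a single test can reshuffle the whole class:

```java
// Sketch: ordering strings by name hash is stable across runs (deterministic)
// but bears no obvious relation to declaration or alphabetical order
// (not predictable). Assumption: this approximates MethodSorters.DEFAULT.
import java.util.Arrays;
import java.util.Comparator;

class HashOrderDemo {
    static String[] hashOrder(String... names) {
        String[] sorted = names.clone();
        sorted = Arrays.copyOf(sorted, sorted.length);
        Arrays.sort(sorted, Comparator.comparingInt(String::hashCode));
        return sorted;
    }
}
```

Running `hashOrder` twice on the same names always yields the same sequence, which matches the quoted wiki statement.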
Sure, but that is about the test methods within a test class; I was talking about test classes.
Force-pushed from cc079cf to 6037f01
Still failing with OOMs, and still no sign of any memory issues up to the end. So it could be a memory spike on shutdown (why?) or in the Tycho/Surefire post-processing code. @laeubi: were there any updates to Surefire recently that could be related? I remember we had a memory leak in the past after a Surefire update, so maybe something similar happened again. To make sure it has no relationship to tests failing because of files that were not deleted, I will wait for the SDK build with the fix for eclipse-platform/eclipse.platform#1593
Also exit E4Testable on OOM. This is supposed to work around and help understand OOM errors on Jenkins. See eclipse-platform#2432
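The "exit on OOM" part of the change could be sketched as follows. This is a hypothetical illustration, not the actual E4Testable code: the idea is that once an `OutOfMemoryError` escapes, the VM is in an unreliable state, so terminating immediately makes the test run fail fast instead of hanging:

```java
// Hypothetical sketch: terminate the VM when an OutOfMemoryError escapes,
// so CI reports a crashed run instead of hanging. Names are illustrative.
class OomExitSketch {
    // Walk the cause chain so wrapped OOMs are detected too.
    static boolean isFatal(Throwable t) {
        while (t != null) {
            if (t instanceof OutOfMemoryError) {
                return true;
            }
            t = t.getCause();
        }
        return false;
    }

    static void install() {
        Thread.setDefaultUncaughtExceptionHandler((thread, error) -> {
            if (isFatal(error)) {
                error.printStackTrace();
                // halt() skips shutdown hooks; the VM is no longer trustworthy
                Runtime.getRuntime().halt(1);
            }
        });
    }
}
```

Calling `install()` once during test startup would cover OOMs thrown on any thread.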
Force-pushed from 6037f01 to 820b875
We should probably simply increase the heap size for tests from the default (1/4 of RAM == 1 GB) to at least 2 GB.
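For Tycho-based test builds like this one, raising the heap could look like the following pom.xml fragment. This is a sketch under the assumption that the tests run via `tycho-surefire-plugin`; the exact plugin version and configuration location in this repository may differ.

```xml
<!-- Hypothetical fragment: raise the test JVM heap from the ~1 GB
     default to 2 GB via the Tycho Surefire argLine. -->
<plugin>
  <groupId>org.eclipse.tycho</groupId>
  <artifactId>tycho-surefire-plugin</artifactId>
  <configuration>
    <argLine>-Xmx2g</argLine>
  </configuration>
</plugin>
```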
Maybe this could help in understanding the OOM errors on Jenkins.
See #2432