setup unittests for tracemonkey builds

Status

RESOLVED FIXED
defect, P1, normal
Opened 11 years ago; last updated 6 years ago

People

(Reporter: joduinn, Assigned: lsblakk)

Tracking

({fixed1.9.1})

Bug Flags:
blocking1.9.1 +

Firefox Tracking Flags

(Not tracked)

Attachments

(2 attachments)

Spinning out from bug#442522.

This bug is to track running unittests, or adding new unittest suites, on each tracemonkey build. From the discussions in bug#442522, it's unclear to me exactly which tests, on which operating systems, are of interest. Is this summary correct, or did I miss something?

Run following suites:
mochitest 
js/tests suite (-L lc2 lc3 spidermonkey-n.tests slow-n.tests)
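
(For illustration only, a minimal sketch of what the extra steps might look like in a buildbot factory of that era, using the 0.7.x-style API. The step names, objdir, make target, workdir, and everything about jsDriver.pl beyond the exclusion lists above are assumptions, not the actual production configuration.)

```python
from buildbot.process.factory import BuildFactory
from buildbot.steps.shell import ShellCommand

factory = BuildFactory()
# ... existing checkout/compile steps for the tracemonkey build go here ...

# Run the mochitest suite against the freshly built objdir.
factory.addStep(ShellCommand(
    name='mochitest',
    command=['make', '-C', 'objdir', 'mochitest-plain'],  # objdir name is a placeholder
    timeout=60 * 60,      # guard against hangs rather than waiting forever
    haltOnFailure=False,  # keep going so the js tests still run
))

# Run the js/tests suite, skipping the lists called out above.
factory.addStep(ShellCommand(
    name='jstests',
    workdir='build/js/tests',  # assumes the default 'build' checkout dir
    command=['perl', 'jsDriver.pl',
             '-L', 'lc2', 'lc3', 'spidermonkey-n.tests', 'slow-n.tests'],
    haltOnFailure=False,
))
```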


Unittest results should be posted on the same waterfall page as the tracemonkey build results... which is still being figured out in bug#442522.
Priority: -- → P3
I think you missed something, though that summary is correct, since https://bugzilla.mozilla.org/show_bug.cgi?id=442522#c0 exactly describes which tests, and which operating systems, are required.  Comments 4 and 5 describe the timeline, answering your comment 3.

What's to figure out about the waterfall page?  The tracemonkey build results are on MozillaTest, and everyone is happy with that AFAICT.  Please don't let refactoring the waterfall block us from actually getting test reports for the first time; we're still flying blind, 2 weeks past the originally requested timeline, and that delay is already going to push back being runnable in the browser and landing in m-c. :/
(What do "Future" and "P3" mean here?  If it's going to be more than another few days to get this up, can I just get a login on the appropriate machines so I can do the work myself?)

Comment 3

11 years ago
Silly question - can we just enable mochi and js tests on the builders already building here? 
Lukas was working on build+unittest on the same slaves -- how far did you get there?

To be honest, I'd be afraid of breaking the other builds going on these slaves (m-c, actionmonkey). Provided we test it well in staging, that'd be OK, but given that we haven't done it before it could end up being more work than bringing up new unittest slaves.

The packages it ships off would end up with testing bits in them, but I don't think anyone is using those anyway... so if the size of them isn't ridiculous, that's not a big deal.
The size of them is not an issue.
Assignee: nobody → lukasblakk
Component: Release Engineering: Future → Release Engineering
Priority: P3 → P1
It seems like the fastest way to unjam this is to take a currently working unittest machine from the mozilla-central unittests and switch it over to tracemonkey instead.

Shaver was interested in a win32 machine, so moving qm-win2k3-unittest-hw from:
- reporting mozilla-central unittests on Firefox tinderbox
...to:
- reporting tracemonkey unittests on MozillaTest tinderbox
...seems like the fastest way to get things going. We have other win32 unittest machines running on mozilla-central, so this should not impact mozilla-central too much. (It might slightly complicate the debugging-intermittent-unittests work, but not too badly, I hope.)
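
(As a rough sketch of the kind of master.cfg change this implies, assuming the 0.7-era Scheduler and TinderboxMailNotifier APIs; the builder name, branch string, tree name, timer, and addresses below are illustrative, not the real config.)

```python
from buildbot.scheduler import Scheduler
from buildbot.status.tinderbox import TinderboxMailNotifier

# Drive the existing win32 unittest builder from the tracemonkey repo
# instead of mozilla-central.
c['schedulers'].append(Scheduler(
    name='tracemonkey-unittest',
    branch='tracemonkey',                 # branch reported by the hg poller
    treeStableTimer=3 * 60,
    builderNames=['WINNT 5.2 tracemonkey unit test'],
))

# Report results to the MozillaTest tinderbox tree rather than Firefox.
c['status'].append(TinderboxMailNotifier(
    fromaddr='buildbot@example.com',      # placeholder addresses
    tree='MozillaTest',
    extraRecipients=['tinderbox-daemon@example.com'],
))
```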
Mochitests were added tonight to the Tracemonkey builds on the unittest-staging setup.  This may look a bit different from what you are used to on Build. 

First off, in the Check step on a windows build the process hangs on:
../../_tests/xpcshell-simple/xpcom/unit/test_bug374754.js:

and I have to connect remotely and say no to debugging an unhandled xpcshell.exe exception (2x)

then, when running Mochitests, there is another unhandled exception that hangs when they are first initiated.  Clearing this makes the mochitests fail out without running.

That's where we are at tonight.

Status: NEW → ASSIGNED
(In reply to comment #6)
> It seems like the fastest way to unjam this is to take a currently working
> unittest machine from the mozilla-central unittests and switch it over to
> tracemonkey instead.
Lukas had a win32-based mac mini, "bm-win2k3-unittest-02-hw", newly up and running unittests for mozilla-central, reporting consistently green to MozillaTest. 

Rather than touching any of the mozilla-central production unittest machines, we decided to switch "bm-win2k3-unittest-02-hw" over to tracemonkey instead.
(In reply to comment #7)
The curious can look at http://tinderbox.mozilla.org/showbuilds.cgi?tree=MozillaTest, and click through to full logs at:
http://tinderbox.mozilla.org/showlog.cgi?log=MozillaTest/1216707048.1216707682.7915.gz&fulltext=1

(In reply to comment #7)
It is entirely possible that these are legitimate failures. Tracemonkey is a pretty invasive set of changes.

I'll leave it to shaver to comment on the likelihood of this, though.
(In reply to comment #7)
> First off, in the Check step on a windows build the process hangs on:
> ../../_tests/xpcshell-simple/xpcom/unit/test_bug374754.js:
> 
> and I have to connect remotely and say no to debugging an unhandled
> xpcshell.exe exception (2x)

There's a registry setting you can flip to turn off the JIT debugger. Ask coop or nthomas about it.
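
(For reference, a hedged sketch of the kind of registry change being suggested: clearing the post-mortem "JIT" debugger registration under AeDebug so a crash no longer pops a debug-or-not dialog. The exact values used on the reference platforms may differ; see the wiki page linked in the next comment. This uses the Python 3 stdlib winreg module and must run as Administrator.)

```python
import winreg

AEDEBUG = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion\AeDebug"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, AEDEBUG, 0,
                    winreg.KEY_SET_VALUE) as key:
    # With no Debugger command registered, Windows stops offering to attach
    # a debugger when xpcshell.exe crashes, so runs no longer hang on a dialog.
    try:
        winreg.DeleteValue(key, "Debugger")
    except FileNotFoundError:
        pass  # already removed
    # "Auto" = "1" would auto-launch a debugger, "0" prompts; with Debugger
    # gone it is moot, but setting it explicitly does no harm.
    winreg.SetValueEx(key, "Auto", 0, winreg.REG_SZ, "0")
```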
Awesome; so glad to see this this morning, thanks a ton!

Tracemonkey's a pretty minor changeset unless you flip the JIT pref, which we haven't yet done by default (so it won't be flipped for that unit test box).  But it's still possible that one of the small number of changes that are active when you don't have the pref flipped is causing a failure, because I am sometimes a sloppy bastard.

"WINNT 5.2 mozilla-central bm-win2k3-unittest-02-hw dep unit test" from MozillaTest (have to click back to the archive because of the ongoing MozillaTest damage tinderbox damage) looks to really be pulling from m-c from the logs; did it get flipped back?

(Hard to tell what's going on from that log -- do we need to merge from m-c to get the improved test summarization, or is this just the unit tests being lame about failure output?)
So, I disabled the JIT debugger as per the instructions here: http://wiki.mozilla.org/ReferencePlatforms/Win32#Disable_JIT_Debugger

and instead of the debugger popping up there are now memory exception errors (see attached)

It's still failing check and mochitests in the same places.
Posted image Memory error
(In reply to comment #14)
> Created an attachment (id=330790) [details]
> Memory error
(In reply to comment #15)
> Created an attachment (id=330791) [details]
> mochitest error - screenshot

Shaver, any ideas on what could be causing these errors?
No, not without a stack trace, but they went away after I merged up to m-c, so I think they were latent on the trunk.  We were green for a while after the merge, but I can't find hide nor hair of the unit test box now on MozillaTest or MozillaExperimental, so I dunno for sure.
I found the unit test box on UnitTest, but it's just yellowing over and over, not ever going green/red/orange?  Is that a fault in the repository (would be surprising, but maybe I merged at a bad time for the build setup?) or something in the machine config?
Yellowing has stopped since rebuilding buildbot on staging-master with correct PYTHONPATH and restarting the tracemonkey-unittest buildbot master.
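
(A quick, illustrative way to confirm the master is picking up the intended buildbot install once PYTHONPATH is corrected is to run a check like this in the same environment the master starts with; nothing here is specific to the actual staging-master setup.)

```python
import os
import buildbot

print("buildbot", buildbot.version,
      "imported from", os.path.dirname(buildbot.__file__))
print("PYTHONPATH =", os.environ.get("PYTHONPATH", "<unset>"))
```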
See tracemonkey unittest results here:

http://tinderbox.mozilla.org/showbuilds.cgi?tree=UnitTest

4 green cycles since 10 am this morning.

Can we close this bug?  Or does it need to stay open while we work out getting js tests working?
Just talked with Shaver; he's happy to close this bug as it's consistently running green now. He will file a separate bug to enable js tests once they have it figured out manually.


Nice work, thanks Lukas! :-)
Status: ASSIGNED → RESOLVED
Last Resolved: 11 years ago
Resolution: --- → FIXED

Comment 22

11 years ago
Where is this tinderbox now?
I thought all those tinderboxes were being moved to MozillaTest, with the other tracemonkey builds (or probably replacing the ones that just run codesighs), but I don't see them there.
Reopening this as this tinderbox seems to have gone feral.
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
blocking1.9.1+
Flags: blocking1.9.1+
"WINNT 5.2 tracemonkey unittest bm-win2k3-unittest-02-hw" is now back on the UnitTest tree, currently green. Someone restarted the buildbot master earlier today, but the builder was hidden on the tinderbox waterfall (probably by bug 390349).
Status: REOPENED → RESOLVED
Last Resolved: 11 years ago
Resolution: --- → FIXED
Build and unittest results for TraceMonkey project branch can now be found on: http://tinderbox.mozilla.org/showbuilds.cgi?tree=TraceMonkey

(the curious can follow details in bug#442522 and bug#453567)
Fixed 1.9.1 as per bhearsum on IRC
Keywords: fixed1.9.1
Product: mozilla.org → Release Engineering