Closed Bug 808631 - Opened 7 years ago, closed 3 years ago

Marionette needs unique error messages

Categories

(Testing :: Marionette, defect)


Tracking

(Not tracked)

RESOLVED INCOMPLETE

People

(Reporter: jgriffin, Unassigned)

References

Details

(Keywords: pi-marionette-runner, Whiteboard: [runner])

Error messages shown in TBPL like:

"07:49:26 INFO -
> TEST-UNEXPECTED-FAIL | test_switch_frame.TestSwitchFrame | AssertionError"

are not too helpful, since the same error could be reported for a number of different failures.  We need to make sure that error messages are unique per failure.

See also bug 804366.
This is not a harness problem but a test problem.  The AssertionError above came from a self.assertEqual call.  The test author didn't specify a message parameter, so failures have this generic message.  The fix is to update all the tests to include message parameters.
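
(For illustration, a minimal sketch of that fix in plain Python unittest; the test class and message text below are made up, not an actual test from the suite.)

> import unittest
> 
> class TestExample(unittest.TestCase):
>     def test_with_message(self):
>         # Without msg this would fail as just "AssertionError: False is not
>         # True", identical for every assertTrue() failure in the suite.
>         # With msg the custom text shows up in the failure output (depending
>         # on TestCase.longMessage it either replaces or is appended to the
>         # default), so the failure is identifiable. The message text is made up.
>         self.assertTrue(False, msg="frame 'content' was not loaded")
> 
> if __name__ == '__main__':
>     unittest.main()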
I don't think it's a test problem. If no message is given, a default one should be shown, like:

> self.assertEqual(range(10), range(3))

results in:

> AssertionError: Lists differ: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] != [0, 1, 2]

So the question is why don't we print that default message?
(In reply to Henrik Skupin (:whimboo) from comment #2)
> I don't think it's a test problem. If no message is given a default one
> should be shown like:
> 
> > self.assertEqual(range(10), range(3))
> 
> results in:
> 
> > AssertionError: Lists differ: [0, 1, 2, 3, 4, 5, 6, 7, 8, 9] != [0, 1, 2]
> 
> So the question is why don't we print that default message?

We do, in fact.  The problem is that Python's unittest doesn't provide a message for all cases (e.g. self.assertEqual(1, 2)), and there is no message for self.assertTrue().

We have a similar issue with JS is/ok assertions in mochitest, and the only solution there, as here, is to provide messages in the test to be used in case of failure.
(In reply to Jonathan Griffin (:jgriffin) from comment #3)
> We do, in fact.  The problem is that Python's unittest doesn't provide
> message for all cases (e.g. self.assertEqual(1, 2)), and there is no message
> for self.assertTrue().

Are you sure? Running a pure Python unittest, I can see those messages:

> import unittest
> 
> class Test(unittest.TestCase):
>     def test1(self):
>         self.assertEqual(1, 2)
> 
>     def test2(self):
>         self.assertTrue(False)
> 
> if __name__ == '__main__':
>     unittest.main()

Results in:

> FF
> ======================================================================
> FAIL: test1 (__main__.Test)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "test.py", line 5, in test1
>     self.assertEqual(1, 2)
> AssertionError: 1 != 2
> 
> ======================================================================
> FAIL: test2 (__main__.Test)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>   File "test.py", line 8, in test2
>     self.assertTrue(False)
> AssertionError: False is not True
> 
> ----------------------------------------------------------------------
> Ran 2 tests in 0.000s

> We have a similar issue with JS is/ok assertions in mochitest, and the only

Do you have a reference for me? I'm interested in it.
Yes, you're right for self.assertEquals().  But for self.assertTrue():

 AssertionError: False is not True

does not result in unique error messages, since the same message will be used for every instance of this kind of failure.

For JS tests, you can look for ok() failures in the TBPL logs; the only thing that distinguishes these is the 'msg' parameter which is supplied by the test author.
(In reply to Jonathan Griffin (:jgriffin) from comment #5)
> Yes you're right for self.assertEquals().  But for self.assertTrue():
> 
>  AssertionError: False is not True
> 
> does not result in unique error messages, since the same will be used for
> every instance of this kind of failure.

Well, my point is that the output you have shown in comment 0 does not contain any message. Do you have a real build log I can have a look at, so I can understand the format we use to write out the log? Ideally a log which contains a failure for a Marionette test like the one given above.
(In reply to Henrik Skupin (:whimboo) from comment #6)
> Well, my point is that the output you have shown in comment 0 does not
> contain any message.

Thanks. So this still stands. Is it by design that we do not want to show such a message for e.g. assertTrue()? If that's the case, we can close this bug and update the tests instead. Otherwise we should find the code in the framework which drops the 'False is not True' part.
I think:

AssertionError

and:

AssertionError: False is not True

are equally useless in terms of distinguishing test failures, since the above would be emitted for every assertTrue() failure.  The best option is to update the tests.

As far as why we're dropping "False is not True", I'm not sure.
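
(As for why the "False is not True" part gets dropped: a purely hypothetical sketch, not the actual Marionette harness code, of how a result formatter can lose it. If the formatter reports only the exception class instead of the exception's string value, every assertion failure collapses to the same text.)

> # Hypothetical illustration only; not the actual Marionette runner code.
> def format_failure_short(exc_info):
>     exc_type, exc_value, tb = exc_info
>     # Reporting only the class name yields "AssertionError" for every failure.
>     return exc_type.__name__
> 
> def format_failure_full(exc_info):
>     exc_type, exc_value, tb = exc_info
>     # Keeping str(exc_value) preserves "False is not True" or a custom msg.
>     return "%s: %s" % (exc_type.__name__, exc_value)
> 
> # e.g. called with sys.exc_info() from inside an except block.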
(In reply to Jonathan Griffin (:jgriffin) from comment #9)
> As far as why we're dropping "False is not True", I'm not sure.

Would you mind pointing me to the code which prints the assertion to stdout/stderr?
Blocks: b2g-testing
Whiteboard: [runner]
Is there anything actionable left to do here?
Flags: needinfo?(dburns)
Probably not. A lot of the harness has been rewritten, so I am going to close this; we can raise specific items as and when we need them.
Status: NEW → RESOLVED
Closed: 3 years ago
Flags: needinfo?(dburns)
Resolution: --- → INCOMPLETE