Closed
Bug 838192
Opened 11 years ago
Closed 9 years ago
Add screenshot functionality to assertion module
Categories
(Testing Graveyard :: Mozmill, defect, P2)
Tracking
(Not tracked)
RESOLVED
WONTFIX
People
(Reporter: whimboo, Unassigned)
Details
Attachments
(1 file, 2 obsolete files)
patch, 11.84 KB
Now that we have updated all the tests for the various kinds of assertions, we can finally enable the screenshot functionality. Whenever an assert/expect fails, let's get a screenshot created and stored. As an example you can take the l10n tests. Those would also need an update, exchanging the direct screenshot creation for assert/expect calls.
Updated•11 years ago
Assignee: nobody → andrei.eftimie
Status: NEW → ASSIGNED
Comment 1•11 years ago
Here is a first implementation of saving screenshots when our tests fail. I've made several guesses along the way, so it is time to ask for some feedback. Here is a breakdown of what has changed and why:

1. We import the screenshot module in assertions.js.
2. When we have a failure we invoke screenshot.create() to save the image.
3. I wanted to have the path to the saved image in the Error Report, so the path is saved into results.stack. Here is an example: http://mozmill-crowd.blargon7.com/#/l10n/report/d96621f14ec6678200ea5c05ea5d52cd
4. To save the screenshot path, I also changed both functions in screenshot.js to return the path.
5. While trying to import screenshot.js in assertions.js I came across a circular import issue. The flow was: assertions.js -> screenshot.js -> utils.js -> assertions.js, which would cause an error (something along the lines of assert overwriting itself). I have fixed this by no longer importing utils.js in screenshot.js, and copied the relevant parts from utils.js (the appInfo part). This is not very DRY, but I have not found a better way to fix this issue. Any advice?
6. I have added the caller function name to the screenshot name; for this I have added another param to the screenshot.create method. If empty or null it is not used; if it exists we prepend it to the screenshot name.
7. By default screenshot.js would require a controller as a param (from where it should take the screenshot). Our assertion module does not have access to any controller, so we take the last opened window and take the screenshot from that. It _should_ be reliable since the error most likely occurred in the last opened window. However I am not 100% sure that this will always be the case. Any better idea on how to solve this? If we want to pass the controller to the screenshot module, we would have to change every expect and assert call and also include the controller as a param, so it's not a feasible solution.

** To see this in action you need a failed test. Run the test with --show-errors or run a testrun. The path to the saved image should be in the top part of the report **
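A minimal sketch of the filename logic described in point 6. The helper name and defaults here are illustrative only, not the actual screenshot.create signature:

```javascript
// Build a screenshot file name, optionally prefixed with the caller's
// function name (hypothetical helper; the real screenshot.create also
// handles controllers, highlight boxes, and the save location).
function buildScreenshotName(aBaseName, aCallerName) {
  // If no caller name is given, fall back to the base name alone
  if (!aCallerName) {
    return aBaseName + ".png";
  }
  // Prepend the caller function name so failures are easy to attribute
  return aCallerName + "-" + aBaseName + ".png";
}

console.log(buildScreenshotName("screenshot", null));      // screenshot.png
console.log(buildScreenshotName("screenshot", "testFoo")); // testFoo-screenshot.png
```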
Attachment #712439 - Flags: feedback?(mario.garbi)
Attachment #712439 - Flags: feedback?(hskupin)
Attachment #712439 - Flags: feedback?(dpetrovici)
Attachment #712439 - Flags: feedback?(dave.hunt)
Attachment #712439 - Flags: feedback?(andreea.matei)
Comment 2•11 years ago
Comment on attachment 712439 [details] [diff] [review]
patch v1

Review of attachment 712439 [details] [diff] [review]:
-----------------------------------------------------------------

Great start, Andrei. I'll leave your feedback requests open to gather some more input.

::: lib/assertions.js
@@ +183,5 @@
>   if (aCondition) {
>     this._logPass(result);
>   } else {
>     result.stack = stack.stripStackInformation(Components.stack);
> +   result.stack.screenshotPath = screenshot.create(null, null, frame.name);

This path won't be accessible from the dashboard. I wonder if we should instead use a data URL with the screenshot base64 encoded. That way we can display the images in the dashboard too. We might even want both options, so that Jenkins can serve the screenshots from the job workspace.

::: lib/screenshot.js
@@ +16,2 @@
>  * @param {array of array of int} boxes
> + * @param {string} aFileName prepend a given name for the file

We may also want to specify a path. If we're gathering screenshots in the CI it would make sense for them to be saved within the job's workspace.

@@ +114,5 @@
> +/**
> + * Get application specific informations
> + * @see http://mxr.mozilla.org/mozilla-central/source/xpcom/system/nsIXULAppInfo.idl
> + */
> +var appInfo = {

Let's find a way to reuse utils.js here, perhaps Henrik has some thoughts.

::: tests/l10n/testCropped/test1.js
@@ +244,5 @@
>   this.removeAttribute("data-preference-backup");
>  }
>
>  // Bug 614579 - Crop test sometimes shows single line for cropped elements
> +// setupModule.__force_skip__ = "Bug 614579 - Crop test sometimes shows single line for cropped elements";

Why are you commenting out this skip?
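The review's data-URL suggestion can be sketched like this. Node's Buffer is used purely for illustration; the actual Mozmill code would encode the image via canvas toDataURL() or XPCOM file APIs, not this helper:

```javascript
// Sketch: embed the screenshot bytes directly in the report as a
// base64 data URL instead of a local file path, so the dashboard can
// display the image without access to the test machine's filesystem.
function toDataURL(imageBytes, mimeType) {
  const base64 = Buffer.from(imageBytes).toString("base64");
  return "data:" + mimeType + ";base64," + base64;
}

// Minimal fake "image" payload (the PNG magic bytes) just to show the
// shape of the result
const fakePng = [0x89, 0x50, 0x4e, 0x47];
console.log(toDataURL(fakePng, "image/png"));
// data:image/png;base64,iVBORw==
```

The trade-off discussed in the following comments is size: base64 inflates the payload by roughly a third, which matters when every failure embeds a full screenshot.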
Attachment #712439 - Flags: feedback?(dave.hunt) → feedback+
Comment 3•11 years ago
(In reply to Dave Hunt (:davehunt) from comment #2)
> This path won't be accessible from the dashboard. I wonder if we should
> instead use a data URL with the screenshot base64 encoded. That way we can
> display the images in the dashboard too. We might even want both options, so
> that Jenkins can serve the screenshots from the job workspace.

I've thought about that. Or using some other service to upload the images. The problem with encoding them in the report is their size.

An average size might be around 70kb (uncompressed PNG), per failure per test. I've calculated an average of 65 failures/day (for February 2013). That would be about 4.5MB/day, or about 135MB/month, 1.6GB/year. How would that impact our DB? It's not a number I particularly like. Even if we compress the images (let's say we get a 50% size reduction), we still end up with 800MB/year. It gets pretty costly :(

We could use a 3rd party service to upload the images (http://imgur.com) and include the URL in our reports.

> Why are you commenting out this skip?

Whoops, I used this test to test the screenshot functionality and forgot to uncomment it for the patch. Sorry for that :)
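The storage estimate above can be checked with simple arithmetic (a 30-day month and the stated averages are the only inputs):

```javascript
// Back-of-the-envelope check of the storage estimate:
// ~70 KB per uncompressed PNG, ~65 failures/day (February 2013 average)
const kbPerShot = 70;
const failuresPerDay = 65;

const mbPerDay = (kbPerShot * failuresPerDay) / 1024; // ~4.4 MB/day
const mbPerMonth = mbPerDay * 30;                     // ~133 MB/month
const gbPerYear = (mbPerDay * 365) / 1024;            // ~1.6 GB/year

console.log(mbPerDay.toFixed(1), "MB/day");
console.log(mbPerMonth.toFixed(0), "MB/month");
console.log(gbPerYear.toFixed(1), "GB/year");
```

This reproduces the figures quoted in the comment (the small differences come from rounding 1024 vs 1000 and 30 vs 31 days).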
Reporter
Comment 4•11 years ago
(In reply to Andrei Eftimie from comment #1)

Just answering the comments for now. I will have a look at the code later.

> 3. I wanted to have the path to the saved image in the Error Report, so the
>    path is saved into results.stack.

The path should be added right below 'fail' and not under the stack. The reason is that the structure of the stack is pre-defined and we shouldn't change it.

> 4. To save the screenshot path, I also changed in screenshot.js both
>    functions to return the path

Please have a look at the latest implementation for Mozmill 2.0 where we do the same. I would like to see that we are as close as possible to that code.

> 5. While trying to import screenshot.js in assertions.js I came across
>    a circular import issue. The flow was:
>    assertions.js -> screenshot.js -> utils.js -> assertions.js

Circular imports shouldn't be a problem. We would run into an issue if we call code which gets called circularly. Is that the case here? Or which methods are affected?

> 6. I have added the caller function name to the screenshot name [...]

I question the usefulness of this extra part. We have the relation to the appropriate check, so we wouldn't need it.

> 7. By default screenshot.js would require a controller as a param [...]
>    Our assertion module does not have access to any controller, so we take
>    the last opened window and take the screenshot from that.

Sounds fine to me. Having more information is better than less.

(In reply to Andrei Eftimie from comment #3)
> An average size might be around 70kb (uncompressed PNG). Per failure per
> test. I've calculated an average of 65 failures/day (for February 2013).
> That would be about 4.5MB/day, or about 135MB/month, 1.6GB/year.

In the screenshot module for Mozmill 2.0 we have moved to JPG, which gives a way smaller size for the images. So we shouldn't have any problem. We can remove old entries regularly if necessary.

> It's not a number I particularly like.
> Even if we compress the images (let's say we get a 50% size reduction), we
> still end up with 800MB/year.

That's nothing compared to the amount of hours of work we can save. Let's go with it.
Reporter
Comment 5•11 years ago
Comment on attachment 712439 [details] [diff] [review]
patch v1

Review of attachment 712439 [details] [diff] [review]:
-----------------------------------------------------------------

I agree. It's a good start, but there are some more things to do.

::: lib/screenshot.js
@@ +16,2 @@
>  * @param {array of array of int} boxes
> + * @param {string} aFileName prepend a given name for the file

We do that by specifying --screenshot-path for the automation scripts.
Attachment #712439 - Flags: feedback?(mario.garbi)
Attachment #712439 - Flags: feedback?(hskupin)
Attachment #712439 - Flags: feedback?(dpetrovici)
Attachment #712439 - Flags: feedback?(andreea.matei)
Attachment #712439 - Flags: feedback+
Comment 6•11 years ago
* We now use the Mozmill 2.0 screenshot code.
* --screenshot-path works correctly to use a custom path.
* Code updated to use Services.jsm instead of Cc[].

Working example: http://mozmill-sandbox.blargon7.com/#/functional/report/2a6536e9db9f5f44ed48c58510492e08
Relevant mozmill-dashboard pull request (to show the images): https://github.com/mozilla/mozmill-dashboard/pull/50

** I do have one privacy concern. In our report we also send the local file path, which might contain sensitive information (such as the username of the machine on a *nix system). In my case on OS X:
"filename":"/Users/andrei.eftimie/Library/Caches/TemporaryItems/mozmill_screenshots/testPasswordNotSaved-3.jpg"
I don't have any real qualms with this, but I felt the need to raise the issue, as some might not want to share their machine username when running a mozmill testrun.
Attachment #712439 - Attachment is obsolete: true
Attachment #724326 - Flags: review?(andreea.matei)
Reporter
Comment 7•11 years ago
(In reply to Andrei Eftimie from comment #6)
> ** I do have one privacy concern. In our report we also send the local file
> path, which might contain sensitive information (such as the username of the
> machine on *nix system).

That's not new. All the test results contain the full path and not paths relative to the test folder. So it's nothing we would have to worry about at this time, but we might want to address it. I don't think that there is a bug for it yet.

Interestingly, I wonder why we store screenshots when no --screenshot-path is given. I thought we wanted to directly add the screenshots as a data URL. Or am I missing something?
Comment 8•11 years ago
ATM we store them in the TmpD path if no --screenshot-path is specified. (So this is only an override for the location). I can easily take that out if no path is specified. Since we have them in our report anyway, it might be better this way.
Reporter
Comment 9•11 years ago
I would suggest so, and we should also update the Mozmill 2.0 code base appropriately. If a user wants to have the screenshots stored locally, I would say that he has to explicitly specify the --screenshot-path option. Dave, what would you say?
Flags: needinfo?(dave.hunt)
Comment 10•11 years ago
Well, as bug 762126 hasn't landed yet we can fix it there. So you're suggesting that screenshots would always be in the report, but only saved to disk if a path is specified?
Flags: needinfo?(dave.hunt)
Reporter
Comment 11•11 years ago
Yes, that's what I'm thinking of.
Comment 12•11 years ago
To summarize our IRC chat: we are going to leave screenshot taking on by default; --screenshot-path will only override the location, as CI will use it to save screenshots in the build folder under "screenshots".
Comment 13•11 years ago
One more note: since screenshot.js is mainly used from the assertions module, requiring the assertions module into the screenshot module would either yield race conditions ("assert is undefined", http://mozmill-sandbox.blargon7.com/#/functional/report/2a6536e9db9f5f44ed48c5851047ddf6) or infinite recursion (see report http://mozmill-sandbox.blargon7.com/#/functional/report/2a6536e9db9f5f44ed48c58510490ba1).
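The recursion hazard can be illustrated in miniature. Everything here is a simplified stand-in for the real Mozmill modules (and the guard exists only so the sketch terminates); the point is that a screenshot module which reports its own failures through assertions re-enters the assertion failure path forever:

```javascript
// If taking a screenshot fails and that failure is reported via an
// assertion, the assertion failure triggers another screenshot, and
// so on without bound.
let depth = 0;
const MAX_DEPTH = 5; // guard so this sketch terminates

function takeScreenshot() {
  // Pretend the screenshot itself fails and is checked via an assertion
  assertFail("screenshot could not be saved");
}

function assertFail(message) {
  depth++;
  if (depth > MAX_DEPTH) {
    throw new Error("recursion guard hit after " + depth + " frames");
  }
  // On failure, the assertions module takes a screenshot -> recursion
  takeScreenshot();
}

try {
  assertFail("some test failure");
} catch (e) {
  console.log(e.message); // recursion guard hit after 6 frames
}
```

This is why comment 16 below settles on throwing plain errors from the low-level screenshot module instead of routing them through assertions.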
Reporter
Comment 14•11 years ago
Why would we need the assertion module included in screenshot.js? I don't see a reason for it.
Comment 15•11 years ago
We had a discussion about throwing errors directly, or using assertions to test them. It appears using assertions does not work, so I have left that out.
Reporter | ||
Comment 16•11 years ago
That discussion was about high-level modules and tests. The screenshot module is a low-level one, so throwing errors is fine.
Comment 17•11 years ago
Comment on attachment 724326 [details] [diff] [review]
patch v2

Review of attachment 724326 [details] [diff] [review]:
-----------------------------------------------------------------

I like this start; I would be willing to have this in our dashboard as a trial and keep improving it along the way. I feel it needs improving, given that I made 2 tests fail (one awesomebar test at the popup suggestions list, and testPreferredLanguage) and the images captured did not catch the actual failure. For the awesomebar test, the screenshot was taken after the popup closed; for testPreferredLanguage, which opens the Preferences dialog and then another modal dialog for changing the language, the capture caught the preferences window instead of the one that failed at a waitFor. Can we make sure that at least the modal dialog where we fail is caught?
Attachment #724326 - Flags: review?(andreea.matei) → review-
Comment 18•11 years ago
* Added support for multiple screenshots for one failure. We take a screenshot of each open window as reported by the windowMediator enumerator (Services.wm.getEnumerator(null)).
* We have 2 issues (failing to report the screenshot):

1. assert.waitFor
   Here we delegate the waitFor directly to Mozmill, giving us no chance to amend the report before a timeout. expect.waitFor also triggers a _test().
2. assert._logFail
   Here we only throw a new Error, while expect._logFail does a mozmillFrame.events.fail. This does not preserve the screenshots in the report. Issuing only the mozmillFrame event will not stop the test, but will correctly show the screenshots. Issuing the mozmillFrame event and also throwing an error will stop the test, but we have the same error reported twice.

I am looking into how this has been solved in Mozmill 2.0, but at the moment we will miss a lot of potential screenshots, as lots of failures are due to assert.waitFor timeouts.
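The "screenshot every open window" idea can be sketched as follows. A plain object stands in for the XPCOM nsIWindowMediator (the real code iterates Services.wm.getEnumerator(null)), and the screenshot call is reduced to building a path; all names here are illustrative:

```javascript
// Simplified stand-in for the nsIWindowMediator enumerator: the real
// one yields live nsIDOMWindow objects in z-order.
const fakeWindowMediator = {
  windows: ["main-window", "preferences-dialog", "modal-dialog"],
  getEnumerator() {
    let i = 0;
    const windows = this.windows;
    return {
      hasMoreElements: () => i < windows.length,
      getNext: () => windows[i++],
    };
  },
};

// Walk every open window and capture each one, so a failure in a
// nested dialog still leaves evidence of the parent windows' state.
function captureAllWindows(mediator) {
  const paths = [];
  const enumerator = mediator.getEnumerator();
  while (enumerator.hasMoreElements()) {
    const win = enumerator.getNext();
    // Real code would call the screenshot module here; we just record
    // the hypothetical file path it would produce.
    paths.push("mozmill_screenshots/" + win + ".jpg");
  }
  return paths;
}

console.log(captureAllWindows(fakeWindowMediator).length); // 3
```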
Attachment #724326 - Attachment is obsolete: true
Reporter
Comment 19•11 years ago
(In reply to Andrei Eftimie from comment #18)
> * Added support for multiple screenshots for 1 fail.
>   We take a screenshot of each window opened as reported by the
>   windowMediator enumerator ( Services.wm.getEnumerator(null) )

Why do we need this? Why isn't it enough to get the currently active window and create the screenshot?

> 1. assert.waitFor
>    Here we delegate the waitFor directly to mozmill, giving us no chance to
>    amend the report before a timeout. expect.waitFor also triggers a _test()

Given that we have more expect.waitFor calls now, it shouldn't happen that often, I hope.

> 2. assert._logFail
>    Here we only throw a new Error, while expect._logFail does a
>    mozmillFrame.events.fail. This does not preserve the screenshots in the report.

Not sure I understand. Can you please rephrase? Why doesn't it add the screenshot to the report?
Comment 20•11 years ago
(In reply to Henrik Skupin (:whimboo) from comment #19)
> Why do we need this? Why isn't it enough to get the currently active window
> and create the screenshot?

I am not sure. It might be enough to take only the last active window. Some thoughts about why we grabbed all of them:

1) We've seen that in some instances we were grabbing the main window instead of a dialog or pane that was open. It is very likely that I was grabbing the wrong window. We figured getting the full stack might help.
2) We have some failures (in particular some update tests) that we suspect might fail because of previously unclosed windows. This might help prove or disprove that theory.
3) Some deeply nested window fails (let's say some prefs) might yield more information on the state of the test by also letting us know how the parent pref pane looked.

I see the reward outweighing the cost.

> Given that we have more expect.waitFor calls now it shouldn't happen that
> often, I hope.

I am counting 146 assert.waitFor instances and 12 expect.waitFor instances. Most of the failures I have seen that impact us are actually assert.waitFor fails. It would be great to be able to also take screenshots for these failures.

> Not sure I understand. Can you please rephrase? Why doesn't it add the
> screenshot to the report?

I am not sure why it doesn't add the screenshot to the report.

Logs
----
expect.fail: http://pastebin.com/yBkhaGLy
assert.fail: http://pastebin.com/tFreYxjk

Code
----
Expect:

> Expect.prototype._logFail = function Expect__logFail(aResult) {
>   mozmillFrame.events.fail({fail: aResult});
> };

Assert:

> Assert.prototype._logFail = function Assert__logFail(aResult) {
>   throw new Error(aResult.message, aResult.fileName, aResult.lineNumber);
> };

Looking into mozmillFrame.events.fail we have:

> events.fail = function (obj) {
>   var error = obj.exception;
>   if (error) {
>     // Error objects aren't enumerable https://bugzilla.mozilla.org/show_bug.cgi?id=637207
>     obj.exception = {
>       name: error.name,
>       message: error.message,
>       lineNumber: error.lineNumber,
>       fileName: error.fileName,
>       stack: error.stack
>     };
>   }
>   // a low level event, such as a keystroke, fails
>   if (events.currentTest) {
>     events.currentTest.__fails__.push(obj);
>   }
>   for each (var timer in timers) {
>     timer.actions.push(
>       {"currentTest": events.currentModule.__file__ + "::" + events.currentTest.__name__,
>        "obj": obj, "result": "fail"}
>     );
>   }
>   events.fireEvent('fail', obj);
> }

It seems we recreate the error stack here and the image is correctly added to the report. Throwing it directly as assert does yields no image.
Reporter
Comment 21•11 years ago
(In reply to Andrei Eftimie from comment #20)
> 1) We've seen that in some instances we were grabbing the main window
>    instead of a dialog or pane that was open.

The most recent window should always be the one which is displayed on top and is active. So if it fails, please let me know where. There shouldn't be a difference between normal and modal windows.

> 2) We have some failures (in particular some update tests) that we suspect
>    might fail because of previously unclosed windows.

If the test flips between different windows and is doing actions in a background window, this will be correct. But I can't remember us doing something like that.

> 3) Some deeply nested window fails (let's say some prefs) might yield more
>    information on the state of the test

How come? Any example here?

> I am counting 146 assert.waitFor instances and 12 expect.waitFor instances.
> Most of the failures I have seen that impact us are actually assert.waitFor
> fails.

Why not put a try/catch around the call and re-throw the assertion after the screenshot code got the data? That sounds simple to me.
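The try/catch suggestion can be sketched like this. Both waitFor and createScreenshot below are simplified stand-ins, not the real Mozmill signatures; the point is only the wrapping pattern:

```javascript
// Stand-in: the real mozmill waitFor polls the callback until it
// returns true or the timeout expires.
function waitFor(callback, timeoutMs) {
  if (!callback()) {
    throw new Error("waitFor: Timeout exceeded");
  }
}

function createScreenshot() {
  console.log("screenshot saved"); // real code would capture the window
}

// Wrap the delegated waitFor so a timeout still propagates to the
// test runner, but a screenshot is taken first.
function waitForWithScreenshot(callback, timeoutMs) {
  try {
    waitFor(callback, timeoutMs);
  } catch (e) {
    createScreenshot(); // grab the state before the failure surfaces
    throw e;            // re-throw so the test still fails as before
  }
}

try {
  waitForWithScreenshot(() => false, 5000);
} catch (e) {
  console.log(e.message); // waitFor: Timeout exceeded
}
```

Because the original exception is re-thrown unchanged, the failure is not reported twice, which addresses the double-reporting concern from comment 18.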
Reporter
Comment 22•11 years ago
Do we have an update here?
Updated•11 years ago
Priority: -- → P2
Reporter
Comment 23•11 years ago
Given that we never got an update here, we missed this feature for Mozmill 2.0. :( Given that for Mozmill 2.0 we will no longer use our own assertions module but the one which comes with Mozmill, I don't see why it should be implemented in our mozmill-tests lib. Let's get this moved to the Mozmill component and get it in for version 2.1.
Assignee: andrei.eftimie → nobody
Component: Mozmill Tests → Mozmill
Product: Mozilla QA → Testing
Whiteboard: [lib] s=130211 u=enhance c=assertions p=1 → [mozmill-2.1?]
Reporter
Comment 24•10 years ago
This bug will not be taken for version 2.1. We might want to re-evaluate for 2.2.
Whiteboard: [mozmill-2.1?] → [mozmill-2.2?]
Reporter
Comment 25•9 years ago
Mozmill will reach its end of life soon. We are currently working on getting all the tests for Firefox ported to Marionette. For status updates please see bug 1080766.
Status: ASSIGNED → RESOLVED
Closed: 9 years ago
Resolution: --- → WONTFIX
Whiteboard: [mozmill-2.2?]
Assignee
Updated•8 years ago
Product: Testing → Testing Graveyard