Closed Bug 1136779 Opened 9 years ago Closed 9 years ago

Criteria document for picking Gaia Integration tests to automate

Categories

(Firefox OS Graveyard :: Gaia::UI Tests, defect)

x86_64
Linux
defect
Not set
normal

Tracking

(Not tracked)

RESOLVED FIXED

People

(Reporter: jlorenzo, Assigned: jlorenzo)

Details

Attachments

(1 file)

QA Whiteboard: [fxosqa-auto-s11]
Attached file Wiki link
I added the isolation and the timing issues we talked about yesterday. I compared the criteria list against the initial document[1], which explains the differences; I don't find that any criterion regarding the nature of a test is missing from there.

[1] https://wiki.mozilla.org/B2G/QA/Automation/UI/Strategy/Integration_vs_Acceptance#The_Solution
Attachment #8570449 - Flags: review?(gmealer)
Comment on attachment 8570449 [details]
Wiki link

Looks good. Let's get started with this, and polish it as we go. 

Thoughts on a couple of comments/questions in the doc:

* General edge case tests *should we use the term negative test case instead?*

Yeah, probably, though it's more than that. I'd suggest "Tests that don't represent a whole user story, such as boundary/edge cases, negative cases, and highly specific or unusual bug verifications."

That would also replace the point that comments on unusual STR.

* Re: searching for a threshold on "take too long," I'd stay away from it. The test runtime differs per target. I'd leave it at "take too long" for now and let it be a judgment call. If we come up with bright-line criteria later we can add it.

Other comments:

* I'd explicitly add to the first section, "Tests that validate binding between the UI state and back end state"

* Also would add something like, "Tests that perform small, isolated checks of the UI, such as a test that only verifies the sort order of a table"

* I'd stay away from QA jargon like STR, if the intention is to let a wide audience read this. I'd at least spell it out as "Steps to Reproduce". I suggested replacing that one anyway, but something to keep in mind in general.

* Re: the first point on mocks, your point comes across great, but the wording is a little awkward. Maybe:

"Tests that use mocks to simulate a portion of the system"
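
To make the wording concrete, here is a minimal, purely illustrative sketch of "a test that uses mocks to simulate a portion of the system." The `battery_label` helper and the battery object are invented for this example (this is not Gaia code); it just uses Python's standard `unittest.mock`:

```python
# Illustrative sketch only -- not actual Gaia code. Shows a test that
# mocks a portion of the system (a hardware-backed battery object)
# instead of driving a real device: integration-level, not acceptance.
import unittest
from unittest import mock


def battery_label(battery):
    """Hypothetical UI helper: formats the battery level for display."""
    return "%d%%" % round(battery.level * 100)


class BatteryLabelTest(unittest.TestCase):
    def test_label_formats_mocked_level(self):
        # The mock stands in for the real battery API; only the
        # formatting logic is exercised.
        fake_battery = mock.Mock(level=0.42)
        self.assertEqual(battery_label(fake_battery), "42%")


if __name__ == "__main__":
    unittest.main()
```

Because the test never touches the UI or a device, it fits the integration bucket rather than the acceptance one.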

* I want to be a little careful on the last two general considerations. Acceptance tests *can* verify back end data, and they *can* do some steps in a non-user-like fashion. 

That last one is especially true of setup/cleanup, and other things peripheral to but not part of the test; it's much more robust usually to go through the back end for everything but the actual test flow. That way any breakage in a UI area in theory only fails test flows that have to traverse that area, not tests that needed it for something else.
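
A hedged sketch of that setup/cleanup point, with all names invented for illustration (there is no real `FakeContactsBackend`): fixtures are created and removed through a back-end API in `setUp`/`tearDown`, so only the flow under test depends on the UI.

```python
# Hypothetical example (invented names, not Gaia code): seed and clean
# up test data through the back end so that breakage in unrelated UI
# areas can't fail this test's setup or teardown.
import unittest


class FakeContactsBackend:
    """Stand-in for a back-end data API, invented for illustration."""
    def __init__(self):
        self.contacts = []

    def add(self, name):
        self.contacts.append(name)

    def clear(self):
        self.contacts = []


class DeleteContactTest(unittest.TestCase):
    def setUp(self):
        # Create the fixture through the back end, not by tapping
        # through the UI.
        self.backend = FakeContactsBackend()
        self.backend.add("Alice")

    def test_delete_contact(self):
        # In a real acceptance test, only this step would go through
        # the UI; here we simulate its effect on back-end state.
        self.backend.contacts.remove("Alice")
        self.assertEqual(self.backend.contacts, [])

    def tearDown(self):
        # Clean up through the back end as well.
        self.backend.clear()
```

The payoff is the robustness described above: a broken, unrelated UI area fails only the tests whose actual flow traverses it.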

That said, I don't know how best to capture that distinction. Maybe "Tests that primarily..." ? Some of this is liable to be a bit fuzzy.

Ultimately the acceptance test distinction is "Tests that automate what a QA engineer runs for acceptance," and integration is "Everything else." It's just kind of hard to capture that in bullets.

Anyway, looks good. I wouldn't get hung up on any of these, and feel you should link this in as a deployed doc ASAP if you haven't already, then follow up with any recommended edits.

I do recommend moving the title to initial capitals though per wiki style guide if possible first.
Attachment #8570449 - Flags: review?(gmealer) → review+
(In reply to Geo Mealer [:geo] from comment #2)
> I'd suggest "Tests that don't represent a whole user story, such as
> boundary/edge cases, negative cases, and highly specific or unusual
> bug verifications. That would also replace the point that comments
> on unusual STR.
Used this sentence and removed the STR sentence.

> I'd leave it at "take too long" for now and let it be a judgment call.
Removed the specific threshold on duration.

> * I'd explicitly add to the first section, "Tests that validate binding
> between the UI state and back end state"
Added. 

> * Also would add something like, "Tests that perform small, isolated checks
> of the UI, such as a test that only verifies the sort order of a table"
Added.

> Maybe: "Tests that use mocks to simulate a portion of the system"
Thanks for the wording! Replaced.

> Acceptance tests *can* verify back end data
Added the detail about setUp/tearDown.

> they *can* do some steps in a non-user-like fashion.
I removed this sentence. Maybe we'll find a better definition when a PR comes in and we conclude that this isn't the correct way to go for acceptance tests.

> you should link this in as a deployed doc ASAP if you haven't already
Linked to https://wiki.mozilla.org/B2G/QA/Automation/UI

> I do recommend moving the title to initial capitals though per wiki style
> guide if possible first.
Per the instructions shown when you create a new page[1], we should avoid that.

[1] "Please DO NOT create pages on MozillaWiki with titles that are CamelCase or ALL UPPER CASE.
Page titles should contain spaces between words, which helps make it easier to search and maintain the wiki."
Status: NEW → RESOLVED
Closed: 9 years ago
Resolution: --- → FIXED
Thanks for the changes! Rolling up the IRC convo into the bug: the wiki page means don't push the words together, like:

AssistWithCiTests

Typically, at Mozilla I see 'Assist with CI Tests' as the casing, i.e. title casing. That said, Wikipedia's style guide and our Tips and Tricks page agree with your casing, and I have no real issue with it one way or the other.