Bug 846166 (Closed): Opened 7 years ago, Closed 3 years ago

Tracking: Write eideticker tests for basic performance use cases

Categories: Core :: General, defect
Hardware: x86_64
OS: Linux
Priority: Not set
Status: RESOLVED INCOMPLETE
People: Reporter: cjones; Assignee: Unassigned

Details

We don't do a /great/ job of perf testing, for any product really.  I'm pretty sure I could check in a patch today that added a sleep(1) to every 2d canvas API entry point and nobody would be sent a regression email.  Similarly, I bet I could land a patch that had our video decoders drop 4 out of every 5 frames and perf-testing automation wouldn't notice.

We should fix that.

We should write tests that fit into two categories: first, ~realistic tests that require pixel-faithful capture.  There are some moderately useful benchmarks floating around like FishIE, Hardware Acceleration Stress Test, Psychedelic Browsing, etc.  We'll want to build some more out of real code like pdf.js.
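For the pixel-faithful category, the signal a regression would move is how many distinct frames actually get rendered during a capture. A minimal sketch of that analysis is below, assuming the capture is already available as an array of frames; the function name, thresholds, and capture rate are illustrative assumptions, not eideticker's actual API.

import numpy as np

def estimate_rendered_fps(frames: np.ndarray, capture_fps: float = 60.0,
                          threshold: float = 0.001) -> float:
    """Hypothetical sketch: estimate rendered frames per second from a
    pixel-faithful capture.

    frames: (N, height, width, 3) uint8 array, e.g. from an HDMI capture.
    Two consecutive capture frames count as the same rendered frame when
    their mean absolute pixel difference is below `threshold` (as a
    fraction of full scale).
    """
    if len(frames) < 2:
        return 0.0
    # Frame-to-frame difference, averaged over all pixels and channels.
    diffs = np.abs(frames[1:].astype(np.int16) - frames[:-1].astype(np.int16))
    changed = (diffs.mean(axis=(1, 2, 3)) / 255.0) > threshold
    unique_frames = 1 + int(changed.sum())
    duration_s = len(frames) / capture_fps
    return unique_frames / duration_s

A patch that makes the video decoder drop 4 out of 5 frames, as in the example above, would cut this number roughly fivefold on a video playback test.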

Second, elemental tests of use cases intended to be captured with noisy hardware such as a camera.  These tests have to have sharp, easily detected edges.  Some examples would be animating divs around on screen, color fills, line drawing, playback of videos with solid-color frames, etc.  The goal here is to ensure that the elemental performance of something like async animation doesn't regress.
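As a sketch of why sharp edges matter for camera capture: if the test page alternates between a few widely separated solid colors, nearest-palette classification of each camera frame tolerates the noise, and counting transitions tells us how many animation steps actually reached the screen. The palette and function names below are assumptions for illustration, not an existing eideticker test.

import numpy as np

# Assumed test palette: colors far enough apart that camera noise
# cannot flip the classification.
PALETTE = np.array([[255, 0, 0],     # red
                    [0, 255, 0],     # green
                    [0, 0, 255]])    # blue

def classify_frames(frames: np.ndarray) -> np.ndarray:
    """Map each noisy camera frame to the index of the nearest palette
    color based on its mean color. frames: (N, H, W, 3) uint8."""
    mean_colors = frames.reshape(len(frames), -1, 3).mean(axis=1)  # (N, 3)
    dists = np.linalg.norm(mean_colors[:, None, :] - PALETTE[None, :, :], axis=2)
    return dists.argmin(axis=1)

def count_transitions(labels: np.ndarray) -> int:
    """Number of observed color changes over the capture; a regression in
    async animation shows up as fewer transitions per unit of wall-clock time."""
    return int((labels[1:] != labels[:-1]).sum())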

Most of the eideticker automation is built around testing on phones, and indeed that's where perf issues are most sensitive.  But the tests apply just as well on "desktop", and we should build towards running them everywhere.
Status: NEW → RESOLVED
Closed: 3 years ago
Resolution: --- → INCOMPLETE