Bug 751347 (Closed): opened 12 years ago, closed 12 years ago

Need to qualify graphics performance of pandaboard ES

Categories: Infrastructure & Operations Graveyard :: CIDuty (task)
Platform: ARM / Android
Priority: Not set
Severity: normal
Tracking: (Not tracked)
Status: RESOLVED FIXED
People: Reporter: wlach; Assigned: wlach

This came up in a conversation I had with Jonathan Griffin and I thought it important enough to file a bug.

In my initial testing of the Pandaboard A3 with Eideticker, I found the graphics performance absolutely abysmal and totally unrepresentative of real devices (we were seeing something like 5 fps on the Eideticker canvas clock demo, whereas a real device like a Galaxy Nexus would get 60 fps).

The ES may well be better, but we should verify this (using Eideticker or Talos). If the gfx performance isn't going to resemble that of a physical device, it's not going to be particularly useful for testing graphics performance with things like Talos (e.g. tcheckerboard) and we may have to consider other solutions.

If someone can provide me a working SDCard image, I can run some preliminary eideticker tests (I have a pandaboard ES here in Montreal). I'm going to tentatively mark this as a dependency of bug 725544.
Component: Release Engineering → Release Engineering: Platform Support
QA Contact: release → coop
Anyone out there who can provide wlach with a working SDcard?
Ping: anyone from a-team have an SDcard for wlach?
(In reply to Chris Cooper [:coop] from comment #2)
> Ping: anyone from a-team have an SDcard for wlach?

Clint put up a working image earlier this week, which I'll be using for testing. Just getting some other last-minute stuff out of the way; hoping to do this Monday.
So I finally got around to doing a bit of testing of this (getting the sdcard image running on the panda was a bit of a bear).

To get started I tried using the unique frames metric with the canvas clock test (I don't have synthetic input going with the panda yet, so couldn't do anything else).

The numbers seem to be a bit unstable (I think mostly due to latency on my internal wireless network; I need to figure out what's going on there), but the story *seems* to be that we're doing ok.
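
To spell out what I mean by "unique frames": count the captured frames that differ from the frame before them, then divide by the capture time to get an effective fps. Roughly along these lines (a minimal TypeScript sketch, not eideticker's actual code; it assumes the HDMI capture has already been decoded into equally-sized per-frame byte buffers, and that the nominal capture rate is known):

// Sketch of a "unique frames" style metric (illustration only, not
// eideticker's real implementation). Assumes frames are decoded greyscale
// byte buffers of equal size and that captureFps is the capture rate.
function framesDiffer(a: Uint8Array, b: Uint8Array, threshold = 0): boolean {
  if (a.length !== b.length) return true;
  let delta = 0;
  for (let i = 0; i < a.length; i++) {
    delta += Math.abs(a[i] - b[i]);
  }
  // A small threshold absorbs capture noise; 0 means any change counts.
  return delta > threshold;
}

function uniqueFrameFps(frames: Uint8Array[], captureFps: number): number {
  if (frames.length === 0) return 0;
  let unique = 1; // the first frame always counts as new
  for (let i = 1; i < frames.length; i++) {
    if (framesDiffer(frames[i], frames[i - 1])) unique++;
  }
  const durationSeconds = frames.length / captureFps;
  return unique / durationSeconds;
}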

Panda: 200 / 501, 126 / 354, 124 / 350 (average fps of about 21)
Galaxy Nexus: 336 / 538, 131 / 197, 247 / 337 (average fps of about 47)

So the Panda seems to be about half as good as the Galaxy Nexus, at least on this metric. Not sure what the underlying reason for that is, but I guess we can tentatively say that the Pandaboard isn't a totally ridiculous platform for testing graphics performance. Might be worth roping in some of the graphics people and asking whether there's a better / more representative test we should pick here.
Assignee: nobody → wlachance
(In reply to William Lachance (:wlach) from comment #4)
> So I finally got around to doing a bit of testing of this (getting the
> sdcard image running on the panda was a bit of a bear).
> 
> To get started I tried using the unique frames metric with the canvas clock
> test (I don't have synthetic input going with the panda yet, so couldn't do
> anything else).

BTW, this is what the canvas clock test looks like:

http://wrla.ch/eideticker/dashboard/videos/video-1337351670.49.webm

It's a pretty ridiculous test that basically just tests the raw refresh rate of canvas. Nonetheless, it can measure things like the benefit of OMTC:

http://wrla.ch/eideticker/dashboard/#/canvas-clock/fps
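
For anyone who doesn't want to watch the video: the page is basically nothing more than redrawing a clock hand on a canvas on every animation frame, so the measured fps is bounded only by how fast canvas paints make it to the screen. Something along these lines, as a stripped-down sketch rather than the actual test page (it assumes a <canvas id="clock"> element):

// Stripped-down sketch of a canvas "clock" animation (not the actual
// eideticker test page). Assumes the page contains <canvas id="clock">.
const canvas = document.getElementById("clock") as HTMLCanvasElement;
const ctx = canvas.getContext("2d")!;

function drawClock(timestampMs: number): void {
  const { width, height } = canvas;
  const radius = Math.min(width, height) / 2 - 10;
  ctx.clearRect(0, 0, width, height);

  // Clock face
  ctx.beginPath();
  ctx.arc(width / 2, height / 2, radius, 0, 2 * Math.PI);
  ctx.stroke();

  // A hand that sweeps around once per second, driven by the frame timestamp,
  // so every frame the browser manages to paint looks different.
  const angle = ((timestampMs % 1000) / 1000) * 2 * Math.PI - Math.PI / 2;
  ctx.beginPath();
  ctx.moveTo(width / 2, height / 2);
  ctx.lineTo(width / 2 + radius * Math.cos(angle),
             height / 2 + radius * Math.sin(angle));
  ctx.stroke();

  requestAnimationFrame(drawClock);
}

requestAnimationFrame(drawClock);
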
The Pandaboard is "serious" hardware, representative of the ~middle-to-high end of the current generation of phones. It's a very reasonable test platform.

The Pandaboard ES has more horsepower, just under the very high end of the current gen.

I think either is a fine choice for automation.
(In reply to Chris Jones [:cjones] [:warhammer] from comment #6)
> The Pandaboard is "serious" hardware, representative of the ~middle-to-high
> end of the current generation of phones. It's a very reasonable test
> platform.
> 
> The Pandaboard ES has more horsepower, just under the very high end of the
> current gen.
> 
> I think either is a fine choice for automation.

I've heard similar things from other people. I guess this means that any issues we're seeing with graphics performance come down to what's driving the hardware, rather than the hardware itself. The questions I really have are:

1. Are there issues with the drivers which are preventing the panda (either the ES or A3) from performing to its full potential?
2. If the answer to question (1) is yes, can the pandas still accurately measure regressions or improvements in our code?

Signs from the benchmark I tried point to the answer to (1) being "yes", and to (2) also being "yes". That is to say, it looks like something is limiting the pandaboard's performance, but the benchmarks we're getting from it still appear to be useful information (as opposed to the ridiculous figures of 3-5 frames per second we saw with an earlier build on the Pandaboard A3).

I guess what I was really looking for from you was an idea for a test/benchmark that would confirm this better than a clock's hands spinning around in a webpage. ;) What really pushes a graphics chipset to its limit, as independently of other factors as possible?
WebGL tests would probably be best, actually. Unfortunately we currently do a readback, so it's not *totally* indicative of the performance, but give http://webglsamples.googlecode.com/hg/aquarium/aquarium.html a try.
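
(For anyone following along: "readback" here means the rendered WebGL frame gets copied back from the GPU before it is composited, so a WebGL benchmark partly measures that copy rather than just the GPU. The sketch below only illustrates the kind of per-frame cost involved; it is not Gecko's actual compositing code.)

// Illustration of why a readback path is costly (not Gecko's code): copying
// the rendered WebGL framebuffer back to the CPU every frame forces a
// GPU->CPU transfer and typically a pipeline stall.
const glCanvas = document.createElement("canvas");
const gl = glCanvas.getContext("webgl");

function compositeWithReadback(ctx: WebGLRenderingContext): void {
  const pixels = new Uint8Array(glCanvas.width * glCanvas.height * 4);
  // ... scene drawing would happen here ...
  // readPixels forces the GPU to finish and copies the frame to system memory.
  ctx.readPixels(0, 0, glCanvas.width, glCanvas.height,
                 ctx.RGBA, ctx.UNSIGNED_BYTE, pixels);
  // The CPU-side buffer then has to be re-uploaded/blitted into the page,
  // instead of the GPU surface being composited directly.
}

if (gl) {
  compositeWithReadback(gl);
}
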
I think we've answered our question of whether these boards are a viable testing platform.
Status: NEW → RESOLVED
Closed: 12 years ago
Resolution: --- → FIXED
Product: mozilla.org → Release Engineering
Component: Platform Support → Buildduty
Product: Release Engineering → Infrastructure & Operations
Product: Infrastructure & Operations → Infrastructure & Operations Graveyard