Closed Bug 917404 Opened 7 years ago Closed 6 years ago

Need to refine frame difference algorithms for pointgrey cameras

Categories

(Testing Graveyard :: Eideticker, defect, P3)

x86_64
Linux
defect

Tracking

(Not tracked)

RESOLVED WONTFIX

People

(Reporter: wlach, Assigned: wlach)

References

(Depends on 1 open bug)

Details

(Keywords: perf, Whiteboard: [c=automation p= s=2014.08.01.t u=])

Attachments

(4 files, 1 obsolete file)

For some reason, the captures I'm getting out of pointgrey cameras show subtle differences between frames that do not actually represent content changing. This seems to happen in some applications more than others -- e.g. in the gallery app startup test, each frame after startup has a difference of about 7000 from the subsequent one. We need to figure out what's going on here and account for it.
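
To make the problem concrete, here is a rough sketch of the kind of per-frame differencing involved (hypothetical helper, not the actual videocapture code):

    import numpy as np

    def framediff_sums(frames):
        # Sum of absolute pixel differences between consecutive frames.
        # With pointgrey captures, sensor noise alone can push this sum
        # into the thousands even when nothing on screen has changed,
        # so "difference > 0" is not a usable signal of new content.
        diffs = []
        for prev, cur in zip(frames, frames[1:]):
            diff = np.abs(cur.astype(np.int32) - prev.astype(np.int32))
            diffs.append(int(diff.sum()))
        return diffs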
I've noticed flickering that apparently increases depending on the contrast or sharpness of the camera.
(In reply to Dave Hunt (:davehunt) from comment #1)
> I've noticed flickering that apparently increases depending on the
> contrast or sharpness of the camera.

I think this might be due to various auto-adjustment features of the pointgrey cameras. These are on by default (and hence you see them in the flycapture visualization tool) but we turn them off in eideticker. I've never noticed a flickering eideticker capture.
(In reply to William Lachance (:wlach) from comment #2)
> (In reply to Dave Hunt (:davehunt) from comment #1)
> > I've noticed flickering that apparently increases depending on the
> > contrast or sharpness of the camera.
> 
> I think this might be due to various auto-adjustment features of the
> pointgrey cameras. These are on by default (and hence you see them in the
> flycapture visualization tool) but we turn them off in eideticker. I've
> never noticed a flickering eideticker capture.

That makes sense.
Here's a patch that uses the entropy of the color histogram of each frame to determine frame differences, as described here:

http://wrla.ch/blog/2013/10/automatically-measuring-startup-load-time-with-eideticker/

It seems to produce much more stable results than what was used before.
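
Roughly, the idea is to score each frame by the entropy of its color histogram and treat entropy changes, rather than raw pixel differences, as evidence of new content. A minimal sketch of the measure (hypothetical helper, not the patch itself):

    import numpy as np

    def frame_entropy(frame, bins=256):
        # Shannon entropy of the frame's color histogram: real content
        # changes shift the histogram (and hence the entropy), while
        # uniform sensor noise mostly does not.
        hist, _ = np.histogram(frame, bins=bins, range=(0, 255))
        p = hist / hist.sum()
        p = p[p > 0]  # drop empty bins; log2(0) is undefined
        return float(-(p * np.log2(p)).sum())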
Attachment #818641 - Flags: review?(dave.hunt)
Attachment #818641 - Flags: review?(dave.hunt) → review+
Pushed: https://github.com/mozilla/eideticker/commit/4f6740fc161903eb91add9d1f053a61fb9496f31
Status: NEW → RESOLVED
Closed: 7 years ago
Resolution: --- → FIXED
Oops, looks like this broke the dashboard.

    frame=  316 fps=  7 q=0.0 size=      90kB time=5.27 bitrate= 139.6kbits/s   
    frame=  327 fps=  8 q=0.0 Lsize=     123kB time=5.45 bitrate= 185.6kbits/s   
    21:17:57 video:121kB audio:0kB global headers:0kB muxing overhead 2.277985%
    21:17:57 Oct 18 21:17:14  Capture Controller | Creating movie ...
    21:17:57 Oct 18 21:17:57  Capture Controller | Writing final capture './bin/../captures/b2g-contacts-startup-None-2013-10-18-1382123753.zip'...
    21:17:57 Traceback (most recent call last):
    21:17:57   File "./bin/update-dashboard.py", line 295, in <module>
    21:17:57     main()
    21:17:57   File "./bin/update-dashboard.py", line 290, in main
    21:17:57     baseline=options.baseline, sync_time=options.sync_time)
    21:17:57   File "/home/mozilla/eideticker-ci/jenkins-master/jobs/b2g.inari.mozilla-central.master.tests/workspace/eideticker/src/videocapture/videocapture/framediff.py", line 134, in get_stable_frame_time
    21:17:57     return get_stable_frame(capture, threshold) / 60.0
    21:17:57 TypeError: unsupported operand type(s) for /: 'NoneType' and 'float'
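
Presumably the proximate cause is that get_stable_frame returns None when no frame difference settles below the threshold, and None / 60.0 raises the TypeError above. A minimal sketch of the missing guard (assuming that return contract):

    def get_stable_frame_time(capture, threshold=5.0):
        # get_stable_frame returns None when the capture never settles
        # below the threshold; propagate that instead of dividing None.
        frame = get_stable_frame(capture, threshold)
        if frame is None:
            return None
        return frame / 60.0  # frame index -> seconds at 60 fps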

Let's try this again, refactoring the code between get-metric-for-build and update-dashboard to be common so we can fix both of them simultaneously.
Status: RESOLVED → REOPENED
Resolution: FIXED → ---
Here's a patch that should fix the issues, as well as make a similar error less likely in the future (by consolidating the analysis code shared between update-dashboard and get-metric-for-build). If it looks good to you, feel free to land!
Attachment #819490 - Flags: review?(dave.hunt)
Comment on attachment 819490 [details] [diff] [review]
Fix capture analysis in update-dashboard, consolidate analysis code in one place

Review of attachment 819490 [details] [diff] [review]:
-----------------------------------------------------------------

::: bin/update-dashboard.py
@@ -115,5 @@
> -        if capture_device == "pointgrey":
> -            # even with median filtering, pointgrey captures tend to have a
> -            # bunch of visual noise -- try to compensate for this by setting
> -            # a higher threshold for frames to be considered different
> -            threshold = 2000

Does this mean we had a different threshold between using get-metric-for-build and update-dashboard? If so, good thing we're consolidating this code!

@@ +116,5 @@
>  
>      framediff_relpath = os.path.join('framediffs', 'framediff-%s.json' % time.time())
>      framediff_path = os.path.join(outputdir, framediff_relpath)
>      with open(framediff_path, 'w') as f:
> +        framediff = videocapture.get_framediff_sums(capture)

This means we'll always use a threshold of 5.0 now instead of being able to override it as we could previously. I can't imagine this is intentional, unless there's a reason I'm not seeing...?
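
For reference, a hypothetical sketch of the kind of device-specific threshold selection the removed hunk implemented (names invented; values taken from the diff above):

    DEFAULT_THRESHOLD = 5.0
    DEVICE_THRESHOLDS = {
        # pointgrey captures carry visual noise even after median
        # filtering, so frames need a much larger difference before
        # they count as "changed" (value from the removed hunk above)
        "pointgrey": 2000,
    }

    def threshold_for(capture_device):
        return DEVICE_THRESHOLDS.get(capture_device, DEFAULT_THRESHOLD)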
Attachment #819490 - Flags: review?(dave.hunt) → review-
Ok, as discussed on IRC, the threshold stuff is a non-issue, as it never worked (we were always using a cached copy). We probably want the raw framediff to go into the data file anyway.

I did notice a minor problem with the getdimensions script, though (fixed with this patch). If you have a minute tomorrow, could you test this out with your camera rig and see if it works? I would myself, but am currently working on fixing support for my new camera in bug 928115 (my old one hasn't come back from RMA yet). I think I'm almost there, so if you don't have time, no big deal; I'll get to it when I finish that bug.
Attachment #819490 - Attachment is obsolete: true
Attachment #820016 - Flags: review?(dave.hunt)
Comment on attachment 820016 [details] [diff] [review]
0001-Bug-917404-Consolidate-and-clarify-capture-analysis-.patch

So r+'ing this based on our conversation earlier. There were a minor fix or two that I also needed to apply for this to work, but work it does now.
Attachment #820016 - Flags: review?(dave.hunt) → review+
Going to leave this bug open for now, because we still have to deal with frame differences for animations etc.
Just realized there was some brokenness in the previous patch which I didn't catch. It sucks that this code is so hard to test :(
Attachment #826786 - Flags: review?(dave.hunt)
Attachment #826786 - Flags: review?(dave.hunt) → review+
Here's a tentative patch which (in my limited testing) seems to give fairly reliable estimates of the number of frame transitions in the eideticker panning tests. Since I'm guessing mchang is going to be involved in this aspect of eideticker in the future, I'm going to assign him for feedback.

Mason: Let's get together over vidyo after the holiday break and I can describe to you how this algorithm works in context.
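
In rough terms, the approach counts a transition whenever the inter-frame difference crosses a noise threshold. A toy sketch of that counting step (hypothetical helper, not the attached patch):

    def count_transitions(framediff_sums, noise_threshold):
        # Count frames whose difference from the previous frame exceeds
        # the noise floor -- a rough proxy for real frame transitions
        # during a panning test.
        return sum(1 for d in framediff_sums if d > noise_threshold)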
Attachment #8350806 - Flags: feedback?(mchang)
Attachment #8350806 - Flags: feedback?(mchang) → feedback+
Keywords: perf
Priority: -- → P3
Whiteboard: [c=automation p= s= u=]
I think the conclusion after a few months is that fixing this problem is not simple. E.g.:

http://wrla.ch/blog/2014/07/measuring-frames-per-second-and-animation-smoothness-with-eideticker/

Let's file bugs on specific approaches we might want to try for specific things; I don't think a bug as general as this is very helpful.
Status: REOPENED → RESOLVED
Closed: 7 years ago → 6 years ago
Resolution: --- → WONTFIX
Whiteboard: [c=automation p= s= u=] → [c=automation p= s=2014.078.01.t u=]
Whiteboard: [c=automation p= s=2014.078.01.t u=] → [c=automation p= s=2014.08.01.t u=]
Product: Testing → Testing Graveyard