Closed Bug 1337641 Opened 8 years ago Closed 8 years ago

Intermittent browser/base/content/test/webrtc/browser_devices_get_user_media_screen.js | application terminated with exit code 3221226505 after Assertion failed: (input_buffer && output_buffer && *input_frames_count + static_cast<int>(samples_to_frames(in

Categories

(Core :: Audio/Video: cubeb, defect, P1)

Tracking

RESOLVED FIXED
mozilla55
Tracking Status
firefox52 --- unaffected
firefox-esr52 --- unaffected
firefox53 --- fixed
firefox54 --- fixed
firefox55 --- fixed

People

(Reporter: intermittent-bug-filer, Assigned: padenot)

References

Details

(Keywords: intermittent-failure, Whiteboard: [stockwell disabled])

Attachments

(1 file)

Filed by: philringnalda [at] gmail.com

https://treeherder.mozilla.org/logviewer.html#?job_id=75341022&repo=mozilla-central
https://archive.mozilla.org/pub/firefox/tinderbox-builds/mozilla-central-win64-debug/1486511876/mozilla-central_win8_64-debug_test-mochitest-browser-chrome-2-bm119-tests1-windows-build7.txt.gz

We regret to inform you that your assertion has not been selected for the Most Succinct Assertion Contest. The rest of that is:

Assertion failed: (input_buffer && output_buffer && *input_frames_count + static_cast<int>(samples_to_frames(internal_input_buffer.length())) >= output_frames) || (output_buffer && !input_buffer && (!input_frames_count || *input_frames_count == 0)) || (input_buffer && !output_buffer && output_frames == 0)
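For context, here is a rough paraphrase of the condition that assert enforces, written out as a standalone sketch. The variable names follow the assertion text; the surrounding function and the buffered_input_frames parameter are illustrative stand-ins, not the actual cubeb source.

// Sketch of the precondition the resampler asserts on entry to its fill
// callback (paraphrased from the assertion text above). Three call shapes
// are considered valid:
//  1. duplex: both buffers present, and buffered + incoming input frames
//     cover the requested output frames;
//  2. output-only: no input buffer and no (or zero) input frame count;
//  3. input-only / drain: no output buffer and zero output frames requested.
static bool valid_fill_call(const void * input_buffer,
                            long * input_frames_count,
                            void * output_buffer,
                            long output_frames,
                            long buffered_input_frames)
{
  // buffered_input_frames stands in for
  // samples_to_frames(internal_input_buffer.length()) in the real assert.
  bool duplex = input_buffer && output_buffer && input_frames_count &&
                *input_frames_count + buffered_input_frames >= output_frames;
  bool output_only = output_buffer && !input_buffer &&
                     (!input_frames_count || *input_frames_count == 0);
  bool input_only = input_buffer && !output_buffer && output_frames == 0;
  return duplex || output_only || input_only;
}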
Rank: 12
Component: WebRTC → Audio/Video: cubeb
Flags: needinfo?(padenot)
Priority: -- → P1
See Also: → 1337571
Assignee: nobody → padenot
Flags: needinfo?(padenot)
I have a fix for this.
https://github.com/kinetiknz/cubeb/pull/238, currently being reviewed upstream.
I see the pull request was merged in, when would we see this land on our trunk trees?
Flags: needinfo?(padenot)
(In reply to Joel Maher ( :jmaher) from comment #16)
> I see the pull request was merged in, when would we see this land on our
> trunk trees?

Today.
Flags: needinfo?(padenot)
Depends on: 1342363
Whiteboard: [stockwell fixed]
These failures are still happening on m-c as of today, so it appears that bug 1342363 didn't fix it.
Flags: needinfo?(achronop)
I'm redirecting the NI to padenot, who did the analysis and the fix. Bug 1342363 only imported the cubeb library from upstream.
Flags: needinfo?(achronop) → needinfo?(padenot)
padenot is on PTO this week. Alex/kinetik - any ideas? Otherwise we'll need to wait for him to come back next week.
Flags: needinfo?(kinetik)
Flags: needinfo?(achronop)
I pushed a new try with a printf to investigate the failure: https://hg.mozilla.org/try/rev/1173d039a5f82dd32b0c7e87099c5658580d0214

I plan to re-trigger the bc4 test until I reproduce it, similar to what padenot did. Below is one similar failure from today: https://treeherder.mozilla.org/#/jobs?repo=try&revision=737d50700baae107a80cf3a8bff5ec5623180e6e&selectedJob=80549497
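(For illustration only, this is the kind of diagnostic helper such a try patch might add right before the failing assert; the actual change is in the hg revision above, and the parameter names here are assumptions carried over from the assertion text.)

#include <cstdio>

// Hypothetical debugging helper, called right before the assert, that logs
// the values involved so the failing case shows up in the test log.
// buffered_input_frames stands in for
// samples_to_frames(internal_input_buffer.length()).
static void dump_fill_state(const void * input_buffer,
                            long * input_frames_count,
                            void * output_buffer,
                            long output_frames,
                            long buffered_input_frames)
{
  std::fprintf(stderr,
               "fill: in=%p out=%p in_frames=%ld buffered=%ld out_frames=%ld\n",
               input_buffer, output_buffer,
               input_frames_count ? *input_frames_count : -1L,
               buffered_input_frames, output_frames);
}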
Flags: needinfo?(achronop)
Flags: needinfo?(kinetik)
Depends on: 1344653
Clearing NI, this has been handled by achronop during my PTO, see the dependent bug.
Flags: needinfo?(padenot)
:achronop, when would this PR make it to mozilla-central?
Flags: needinfo?(achronop)
No idea; it's been on inbound since yesterday. The release merge is probably slowing everything else down.
Flags: needinfo?(achronop)
oh, I will wait a couple days, thanks!
Doesn't look like bug 1344653 worked. Still seeing failures on trunk from today :(
Flags: needinfo?(achronop)
Flags: needinfo?(achronop) → needinfo?(padenot)
Brasstacks tells me those failures are from pushes that don't have the patch (central, graphics, autoland, aurora, and such). No failures on inbound since the patch landed. We should check back later today.
Flags: needinfo?(padenot)
By the time I commented in comment 36, there were already failures on branches that should have had the fix. And yes, they're still happening.
Flags: needinfo?(padenot)
Right, so the frequency has really dropped: 0.08 vs. 0.179. I'll fix it harder soon. Thanks for the heads-up.
Flags: needinfo?(padenot)
thanks for getting a fix for this :padenot!
Whiteboard: [stockwell fixed] → [stockwell needswork]
Checking in here: while this failure rate is milder, it is still trending around 40/week, which is pretty high. We also have bug 1337571 at ~20/week. Could we circle back to this before the end of the month?
Flags: needinfo?(padenot)
See Also: → 1350476
:padenot, thanks for working on this bug. I haven't seen any update here in the last week; can you give us an ETA for when this should be fixed? Given the frequency of this failure and other related failures in the same test, we want to ensure this gets resolved (fixed or disabled) this week if possible.
Flags: needinfo?(padenot)
I've done some try pushes, but I don't quite understand what's happening. I'll do another try push, and disable this assert in the meantime to ease the sheriffs' pain.
Flags: needinfo?(padenot)
OK, I understand now: the test that triggers this got moved to a later chunk, so I wasn't re-triggering the right job.
any updates here? Need any help with sorting out chunks or try pushes?
Joel, I don't understand why my printfs are not showing up in the try builds. Maybe I need to sprinkle some `requestFullLog` thing in the tests?
Flags: needinfo?(jmaher)
Odd; when you are running locally, can you see the printf statements? I see an example of what you are doing here: https://hg.mozilla.org/try/rev/ecedc0310320f92052c5410974cbac04077e9545

Adding:

SimpleTest.requestCompleteLog();

to the test case would probably help. I assume we do not need anything else other than that. :gbrown has been doing a bit more of this type of debugging recently; let's see if he has other input/ideas.
Flags: needinfo?(jmaher) → needinfo?(gbrown)
I would not expect requestCompleteLog() to be required, but it can't hurt. Maybe that condition is not being met? Most recent failures have been on aurora and beta, so it may require a lot of retries to reproduce.
Flags: needinfo?(gbrown)
We are at 3 weeks of very frequent failures. Would there be any concerns with disabling this test temporarily?
Flags: needinfo?(padenot)
I'm starting to understand what is happening here. I'm going to land a patch to disable this assert for now; I've determined that the cause of this failure makes disabling it not too problematic (but I'll work upstream on a better fix). Disabling only this test would be playing whack-a-mole: the failure would pop up elsewhere. Thanks for being patient with this one. I'll land the patch on inbound now and ask for uplift.
Flags: needinfo?(padenot)
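(For readers following along, here is a minimal sketch of what "temporarily disabling" an assert like this can look like. This is not the actual attachment 8856587 patch; the function and variable names are assumptions carried over from the assertion text.)

#include <cassert>

// Sketch only, not the real patch. One common way to temporarily retire an
// assert while keeping it greppable is to compile it out with a pointer
// back to the bug that motivated the change.
static void check_fill_preconditions(const void * input_buffer,
                                     long * input_frames_count,
                                     void * output_buffer,
                                     long output_frames,
                                     long buffered_input_frames)
{
#if 0 // Temporarily disabled, see bug 1337641.
  assert((input_buffer && output_buffer &&
          *input_frames_count + buffered_input_frames >= output_frames) ||
         (output_buffer && !input_buffer &&
          (!input_frames_count || *input_frames_count == 0)) ||
         (input_buffer && !output_buffer && output_frames == 0));
#else
  (void)input_buffer; (void)input_frames_count; (void)output_buffer;
  (void)output_frames; (void)buffered_input_frames;
#endif
}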
Comment on attachment 8856587 [details]
Bug 1337641 - Temporarily disable an assert in cubeb duplex.

https://reviewboard.mozilla.org/r/128542/#review130962

Looks good!
Comment on attachment 8856587 [details]
Bug 1337641 - Temporarily disable an assert in cubeb duplex.

https://reviewboard.mozilla.org/r/128542/#review130966

Looks good!
Attachment #8856587 - Flags: review?(achronop) → review+
Pushed by paul@paul.cx:
https://hg.mozilla.org/integration/autoland/rev/68d1795caf30
Temporarily disable an assert in cubeb duplex. r=achronop
Merging m-c (with bug 1348344) over to autoland caused merge conflicts with this patch in media/libcubeb/update.sh. I think I resolved it correctly, but could you take a look and make sure?
Flags: needinfo?(padenot)
Status: NEW → RESOLVED
Closed: 8 years ago
Resolution: --- → FIXED
Target Milestone: --- → mozilla55
(In reply to Wes Kocher (:KWierso) from comment #68)
> Merging m-c (with bug 1348344) over to autoland caused merge conflicts with
> this patch in media/libcubeb/update.sh. I think I resolved it correctly, but
> could you take a look and make sure?

Yes, you did the right thing, thanks! Sorry about that.
Flags: needinfo?(padenot)
Whiteboard: [stockwell needswork] → [stockwell disabled]
See Also: → 1406427