Closed Bug 1607963 (opened 3 years ago, closed 3 years ago)

Crash in [@ RtlAcquireSRWLockExclusive | mozilla::MozPromise<T>::ChainTo]

Categories

Product/Component: Core :: DOM: Content Processes
Version: 74 Branch
Platform: Unspecified / Windows 10
Type: defect
Priority: Not set
Severity: normal

Tracking

Status: RESOLVED FIXED
Target Milestone: mozilla74

Tracking Status:
firefox-esr68 --- unaffected
firefox72 --- unaffected
firefox73 --- unaffected
firefox74 --- fixed

People

(Reporter: lizzard, Assigned: Yoric)

References

(Regression)

Details

(Keywords: crash, regression)

Crash Data

This bug is for crash report bp-9a1eb0fc-a512-4f1c-b8ba-6e7bc0200108.

This signature started showing up in moderate volume in 74 nightly.

Top 10 frames of crashing thread:

0 ntdll.dll RtlAcquireSRWLockExclusive 
1 xul.dll mozilla::MozPromise<RefPtr<mozilla::dom::ContentParent>, mozilla::ipc::LaunchError, 0>::ChainTo xpcom/threads/MozPromise.h:931
2 xul.dll mozilla::MozPromise<RefPtr<mozilla::dom::ContentParent>, mozilla::ipc::LaunchError, 0>::ThenValue<`lambda at z:/task_1578477160/build/src/dom/ipc/ContentParent.cpp:967:7', `lambda at z:/task_1578477160/build/src/dom/ipc/ContentParent.cpp:985:7'>::DoResolveOrRejectInternal xpcom/threads/MozPromise.h
3 xul.dll mozilla::MozPromise<RefPtr<mozilla::dom::ContentParent>, mozilla::ipc::LaunchError, 0>::ThenValueBase::ResolveOrRejectRunnable::Run xpcom/threads/MozPromise.h:403
4 xul.dll nsThread::ProcessNextEvent xpcom/threads/nsThread.cpp:1246
5 xul.dll NS_ProcessNextEvent xpcom/threads/nsThreadUtils.cpp:486
6 xul.dll mozilla::ipc::MessagePump::Run ipc/glue/MessagePump.cpp:87
7 xul.dll MessageLoop::RunHandler ipc/chromium/src/base/message_loop.cc:308
8 xul.dll MessageLoop::Run ipc/chromium/src/base/message_loop.cc:290
9 xul.dll nsBaseAppShell::Run widget/nsBaseAppShell.cpp:137

The LaunchError in these stacks makes me think this is related to process launching.
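
For context on frames 0 and 1: below is a standalone sketch of the chaining pattern in the stack, not Gecko code. LaunchError and launchProcessAsync are illustrative stand-ins, and the Promise class only models what MozPromise::ChainTo does at a high level: take the promise's internal mutex, record the continuation, and settle if a result is already present. The point is that the mutex acquisition happens first, so a dangling or null promise pointer faults inside the lock routine (RtlAcquireSRWLockExclusive on Windows) before any callback runs.

#include <functional>
#include <iostream>
#include <memory>
#include <mutex>
#include <optional>
#include <string>

// Illustrative stand-in, not the real mozilla::ipc::LaunchError.
struct LaunchError {
  std::string reason;
};

template <typename T>
class Promise {
 public:
  using ResolveFn = std::function<void(const T&)>;
  using RejectFn = std::function<void(const LaunchError&)>;

  void ChainTo(ResolveFn onResolve, RejectFn onReject) {
    // The mutex is taken before anything else; frame 0 of the crash
    // (RtlAcquireSRWLockExclusive) corresponds to this step. If `this`
    // is dangling or null, the fault happens right here, before any
    // callback ever executes.
    std::lock_guard<std::mutex> lock(mMutex);
    mOnResolve = std::move(onResolve);
    mOnReject = std::move(onReject);
    MaybeSettleLocked();
  }

  void Resolve(T value) {
    std::lock_guard<std::mutex> lock(mMutex);
    mValue = std::move(value);
    MaybeSettleLocked();
  }

  void Reject(LaunchError err) {
    std::lock_guard<std::mutex> lock(mMutex);
    mError = std::move(err);
    MaybeSettleLocked();
  }

 private:
  // Runs the recorded continuation once both a result and a
  // continuation are present. Caller must hold mMutex.
  void MaybeSettleLocked() {
    if (mValue && mOnResolve) {
      mOnResolve(*mValue);
      mOnResolve = nullptr;
    } else if (mError && mOnReject) {
      mOnReject(*mError);
      mOnReject = nullptr;
    }
  }

  std::mutex mMutex;
  std::optional<T> mValue;
  std::optional<LaunchError> mError;
  ResolveFn mOnResolve;
  RejectFn mOnReject;
};

// Stand-in for an asynchronous content-process launch that can fail.
std::shared_ptr<Promise<int>> launchProcessAsync(bool succeed) {
  auto promise = std::make_shared<Promise<int>>();
  if (succeed) {
    promise->Resolve(42);  // pretend 42 is the new process's id
  } else {
    promise->Reject(LaunchError{"sandbox initialization failed"});
  }
  return promise;
}

int main() {
  auto promise = launchProcessAsync(/*succeed=*/false);
  promise->ChainTo(
      [](const int& pid) { std::cout << "launched: " << pid << "\n"; },
      [](const LaunchError& err) {
        std::cout << "launch failed: " << err.reason << "\n";
      });
}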

Component: XPCOM → DOM: Content Processes

First build this appears in is 20200106215403. Possible regression range based on build id: https://hg.mozilla.org/mozilla-central/pushloghtml?fromchange=86aa64c6bd540f6e93e8bde71754a03cb343f5b7&tochange=e6427fac5ee8d1d87fb78e917781e85dda119a81

Jim - does anything in that regression range look as if it may have caused this?

Keywords: regression
Flags: needinfo?(jmathies)

Maybe bug 1605086?

Flags: needinfo?(jmathies) → needinfo?(dteller)

Very likely.

I'm currently attempting to land bug 1607530, which should get rid of these crashes. It's not perfect, as there's still a possible race condition that can end up creating too many processes; I'm working on that as a separate patch.
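
To illustrate the remaining race (a standalone sketch under assumed names, not the actual ContentParent logic): the process count is checked before an asynchronous launch completes, so two requests racing through the check can both decide to spawn and overshoot the cap. Atomically reserving a slot before the launch starts closes that window.

#include <atomic>
#include <chrono>
#include <iostream>
#include <thread>
#include <vector>

constexpr int kMaxProcesses = 4;
std::atomic<int> gLiveProcesses{0};

// Broken check-then-act: the count is only bumped after the (simulated)
// asynchronous launch, so several callers can pass the check at once.
void requestProcessRacy() {
  if (gLiveProcesses.load() < kMaxProcesses) {
    // Simulated launch window before the count is updated.
    std::this_thread::sleep_for(std::chrono::milliseconds(1));
    gLiveProcesses.fetch_add(1);
  }
}

// Fixed: atomically reserve a slot before starting the launch, so the
// cap can never be overshot (a failed launch would release its slot).
bool requestProcessReserved() {
  int current = gLiveProcesses.load();
  while (current < kMaxProcesses) {
    if (gLiveProcesses.compare_exchange_weak(current, current + 1)) {
      return true;  // slot reserved; safe to start the async launch
    }
    // compare_exchange_weak reloaded `current`; loop and re-check.
  }
  return false;  // at the cap; reuse an existing process instead
}

int main() {
  std::vector<std::thread> threads;
  for (int i = 0; i < 8; ++i) {
    threads.emplace_back(requestProcessRacy);
  }
  for (auto& t : threads) {
    t.join();
  }
  // With the racy path this frequently prints more than kMaxProcesses.
  std::cout << "live processes: " << gLiveProcesses.load() << "\n";
}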

Flags: needinfo?(dteller)

Ok, patch landed. Do we have any kind of automated way of checking whether the volume decreases?

The crash data table in the header of this bug shows the number of crashes for a given build (with the counts for previous builds to its left).

No crashes in the last week.

Assignee: nobody → dteller
Status: NEW → RESOLVED
Closed: 3 years ago
Depends on: 1607530
Regressed by: 1605086
Resolution: --- → FIXED
Target Milestone: --- → mozilla74
Has Regression Range: --- → yes