Intermittent browser_ext_webRequest.js | resource type is correct - Expected: main_frame, Actual: other -

RESOLVED DUPLICATE of bug 1317619

Status

Priority: P2
Severity: normal
Status: RESOLVED DUPLICATE of bug 1317619
Reported: 2 years ago
Last modified: 3 months ago

People

(Reporter: intermittent-bug-filer, Assigned: mixedpuppy)

Tracking

(Keywords: intermittent-failure)

Firefox Tracking Flags

(Not tracked)

Details

(Whiteboard: [stockwell unknown])

5 failures in 690 pushes (0.007 failures/push) were associated with this bug in the last 7 days.  

Repository breakdown:
* autoland: 3
* try: 1
* graphics: 1

Platform breakdown:
* linux32: 3
* osx-10-10: 2

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-01-16&endday=2017-01-22&tree=all
5 failures in 833 pushes (0.006 failures/push) were associated with this bug in the last 7 days.

Repository breakdown:
* autoland: 3
* mozilla-central: 1
* graphics: 1

Platform breakdown:
* osx-10-10: 3
* linux64: 2

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-02-13&endday=2017-02-19&tree=all
19 failures in 817 pushes (0.023 failures/push) were associated with this bug in the last 7 days.   

Repository breakdown:
* autoland: 8
* try: 6
* mozilla-inbound: 2
* mozilla-central: 2
* oak: 1

Platform breakdown:
* linux64: 15
* linux32: 2
* windows7-32-vm: 1
* osx-10-10: 1

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-04-17&endday=2017-04-23&tree=all
it looks like bug 1353689 introduced this failure:
https://treeherder.mozilla.org/#/jobs?repo=autoland&filter-searchStr=asan%20e10s%20bc13&tochange=9b5e7f9f2816a10c4c1bc9caa5e51d03659e3222&fromchange=a418fa4ad28555c8ea6bea2e6a9f2d079822112f&selectedJob=93788054

there are 39 instances in the last week, primarily on linux64-asan.

:bechen, can you look at fixing this soon?
Blocks: 1353689
Flags: needinfo?(bechen)
Whiteboard: [stockwell needswork]
36 failures in 883 pushes (0.041 failures/push) were associated with this bug in the last 7 days. 

This is the #29 most frequent failure this week.  

** This failure happened more than 30 times this week! Resolving this bug is a high priority. **

** Try to resolve this bug as soon as possible. If unresolved for 2 weeks, the affected test(s) may be disabled. ** 

Repository breakdown:
* autoland: 18
* mozilla-inbound: 13
* graphics: 3
* mozilla-central: 1
* mozilla-beta: 1

Platform breakdown:
* linux64: 35
* windows7-32-vm: 1

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-04-24&endday=2017-04-30&tree=all
(In reply to Joel Maher ( :jmaher) from comment #4)
> it looks like bug 1353689 introduced this failure:
> https://treeherder.mozilla.org/#/jobs?repo=autoland&filter-searchStr=asan%20e10s%20bc13&tochange=9b5e7f9f2816a10c4c1bc9caa5e51d03659e3222&fromchange=a418fa4ad28555c8ea6bea2e6a9f2d079822112f&selectedJob=93788054
> 
> there are 39 instances in the last week, primarily on linux64-asan.
> 
> :bechen, can you look at fixing this soon?

Sorry, I don't have any idea about the failure. Bug 1353689 enabled wpt tests for webvtt, which doesn't seem related to this bug.

Can we find someone who knows the extension tests?
Flags: needinfo?(bechen) → needinfo?(jmaher)
I did more retriggers and see one instance on the prior push; it seems odd that changing web-platform tests would cause this to fail so frequently. This also follows the branches, so I am inclined to rule out environment/infra changes; maybe there is an issue with the build.

:andym, can you help find someone to look into this?
Flags: needinfo?(jmaher) → needinfo?(amckay)
:gbrown, do you have further thoughts here?
Flags: needinfo?(gbrown)
It's worth noting that there have been low-frequency failures for some months. 

It looks like the recent increase in frequency is due to failures on linux64-asan/e10s. The regression range noted in comment 4 seems a little late to me: There were linux64-asan failures on April 21 and April 22. 

The earliest recent linux64-asan failure that I see is April 19, https://treeherder.mozilla.org/#/jobs?repo=autoland&revision=1eb86cbfac794754da6ed1a8f6adbc4150900415&filter-searchStr=asan+e10s+browser-chrome, but retries on that push have not reproduced the failure -- maybe not significant.

Curiously, some of the earliest linux64-asan failures around April 19 are on try...but I don't see any reliable connection between the bugs associated with those try pushes, their pushes to integration branches and subsequent failures.


This may be too random to track down the cause of the change in frequency; it's probably better to just debug the test.
Flags: needinfo?(gbrown)
30 failures in 770 pushes (0.039 failures/push) were associated with this bug in the last 7 days. 

This is the #35 most frequent failure this week.  

** This failure happened more than 30 times this week! Resolving this bug is a high priority. **

** Try to resolve this bug as soon as possible. If unresolved for 2 weeks, the affected test(s) may be disabled. ** 

Repository breakdown:
* autoland: 15
* mozilla-inbound: 11
* mozilla-central: 2
* graphics: 2

Platform breakdown:
* linux64: 25
* windows7-32-vm: 5

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-05-01&endday=2017-05-07&tree=all
Summary: Intermittent browser/components/extensions/test/browser/browser_ext_webRequest.js | resource type is correct - Expected: main_frame, Actual: other - → Intermittent browser_ext_webRequest.js | resource type is correct - Expected: main_frame, Actual: other -
30 failures in 879 pushes (0.034 failures/push) were associated with this bug in the last 7 days. 

This is the #48 most frequent failure this week.  

** This failure happened more than 30 times this week! Resolving this bug is a high priority. **

** Try to resolve this bug as soon as possible. If unresolved for 2 weeks, the affected test(s) may be disabled. ** 

Repository breakdown:
* mozilla-inbound: 11
* autoland: 10
* try: 4
* mozilla-central: 4
* graphics: 1

Platform breakdown:
* linux64: 16
* windows7-32-vm: 13
* osx-10-10: 1

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-05-08&endday=2017-05-14&tree=all
(Assignee) Comment 12, a year ago:
Some observations:

The uptick in this failure matches the landing of bug 1326298, in which I converted our stream listener to C++. I'm guessing that contributes to some other timing-related issue somewhere else, resulting in more frequent failures. I don't think bug 1326298 is the cause, since failures happened prior to it.

On Linux, the failure is during nsIRequestObserver.onStopRequest (WebRequest onCompleted).
On Windows, the failure is during nsIRequestObserver.onStartRequest (WebRequest onResponseStarted).
OSX does not seem to fail.

The offending code in WebRequest.jsm is:

let {loadInfo} = channel;
let policyType = (loadInfo ? loadInfo.externalContentPolicyType
                           : Ci.nsIContentPolicy.TYPE_OTHER);

The test is expecting TYPE_DOCUMENT but is getting TYPE_OTHER, which could mean loadInfo is not available for some reason. It *WAS* available in other stages of the request, and on Windows it is available in stages both before and after onResponseStarted.

What I don't understand is why we would have no loadInfo at this stage, since it was available earlier in the request cycle.
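For reference, this is roughly the shape of the extension-side check involved (a sketch only, not the actual browser_ext_webRequest.js code; the page URL is made up): when the top-level page the test navigates to completes, the webRequest details should report type "main_frame", but in the failing runs it reports "other".

// Sketch of an extension background script doing this check; PAGE is a hypothetical URL.
const PAGE = "http://mochi.test:8888/some/test/page.html";

browser.webRequest.onCompleted.addListener(
  details => {
    if (details.url === PAGE) {
      // The intermittent failure is this assertion seeing "other".
      browser.test.assertEq("main_frame", details.type, "resource type is correct");
      browser.test.notifyPass("webRequest");
    }
  },
  {urls: ["<all_urls>"]}
);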

Shotgun ni? to a few people who may have an idea where I should look further...
Flags: needinfo?(kmaglione+bmo)
Flags: needinfo?(honzab.moz)
Flags: needinfo?(bzbarsky)
Flags: needinfo?(amckay)
I don't have any bright ideas offhand.  It's a bit odd that nsWebRequestListener doesn't have threadsafe refcounting, and I think we should fix that, but I'd expect that to hit asserts in debug builds if it were being a problem in practice right now.

In terms of things I would strongly consider logging (maybe only if some test-only pref is set, so we can examine logs if it fails but don't create noise in general; a sketch follows the list):

1)  Is loadInfo falsy, or is loadInfo.externalContentPolicyType wrong?
2)  Is the "channel" object the one we're expecting?
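A rough sketch of that kind of gated diagnostic logging (the flag, helper name, and output format are made up, not existing WebRequest.jsm code):

// Hypothetical helper; in practice the flag would come from a test-only pref.
const DEBUG_LOADINFO = false;

function logLoadInfo(stage, channel) {
  if (!DEBUG_LOADINFO) {
    return;
  }
  let {loadInfo} = channel;
  // Answers 1): is loadInfo falsy, or is externalContentPolicyType just wrong?
  // Answers 2): log the URI so we can tell whether this is the channel we expect.
  dump(`WebRequest ${stage}: uri=${channel.URI.spec} ` +
       `hasLoadInfo=${!!loadInfo} ` +
       `policyType=${loadInfo ? loadInfo.externalContentPolicyType : "none"}\n`);
}

Calling this from each listener stage (e.g. logLoadInfo("onStopRequest", channel)) would show whether loadInfo disappears at a particular stage or whether the channel itself is not the one expected.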
Flags: needinfo?(bzbarsky)
21 failures in 891 pushes (0.024 failures/push) were associated with this bug in the last 7 days.   

Repository breakdown:
* mozilla-inbound: 10
* autoland: 6
* try: 2
* mozilla-central: 1
* graphics: 1
* cedar: 1

Platform breakdown:
* linux64: 19
* windows7-32-vm: 1
* osx-10-10: 1

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-05-22&endday=2017-05-28&tree=all
48 failures in 820 pushes (0.059 failures/push) were associated with this bug in the last 7 days. 

This is the #25 most frequent failure this week.  

** This failure happened more than 30 times this week! Resolving this bug is a high priority. **

** Try to resolve this bug as soon as possible. If unresolved for 2 weeks, the affected test(s) may be disabled. ** 

Repository breakdown:
* autoland: 31
* mozilla-inbound: 13
* try: 4

Platform breakdown:
* linux64: 40
* windows7-32-vm: 5
* osx-10-10: 1
* linux32: 1
* lint: 1

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-05-29&endday=2017-06-04&tree=all
This picked up a lot last week and settled back down this week. I really don't like that this has been at 20+ failures/week for a couple of months.
(Assignee) Comment 17, a year ago:
I'll try to get to this in the near future.
Assignee: nobody → mixedpuppy
Priority: -- → P2
18 failures in 864 pushes (0.021 failures/push) were associated with this bug in the last 7 days.   

Repository breakdown:
* mozilla-inbound: 9
* autoland: 5
* try: 2
* mozilla-central: 1
* cedar: 1

Platform breakdown:
* linux64: 11
* windows7-32-vm: 4
* windows8-64: 1
* osx-10-10: 1
* linux32: 1

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-06-05&endday=2017-06-11&tree=all
I'm not sure what I could add here except a suggestion to add some logs to the tests (which could influence timing) or to obtain a log (nsHttp:5 at least?) from a faulty run.
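For what it's worth, one way to get such a log from a local run is to set MOZ_LOG when running the test (a sketch; the module list and log file path are just an example):

MOZ_LOG="nsHttp:5,timestamp" MOZ_LOG_FILE=/tmp/netlog.txt \
  ./mach mochitest browser/components/extensions/test/browser/browser_ext_webRequest.js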
Flags: needinfo?(honzab.moz)
:mixedpuppy, any good information from your try push?
(Assignee) Comment 22, a year ago:
Only that channel.loadInfo is occasionally not set; I don't know why yet. This will have to wait until at least the work week.
7 failures in 814 pushes (0.009 failures/push) were associated with this bug in the last 7 days.   

Repository breakdown:
* autoland: 4
* mozilla-inbound: 2
* mozilla-central: 1

Platform breakdown:
* linux64: 6
* windows8-64: 1

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-06-12&endday=2017-06-18&tree=all
6 failures in 892 pushes (0.007 failures/push) were associated with this bug in the last 7 days.   

Repository breakdown:
* mozilla-inbound: 3
* autoland: 3

Platform breakdown:
* linux64: 3
* linux32: 3

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-06-19&endday=2017-06-25&tree=all
5 failures in 718 pushes (0.007 failures/push) were associated with this bug in the last 7 days.   

Repository breakdown:
* autoland: 3
* mozilla-inbound: 2

Platform breakdown:
* linux32: 3
* osx-10-10: 1
* linux64: 1

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-06-26&endday=2017-07-02&tree=all
7 failures in 656 pushes (0.011 failures/push) were associated with this bug in the last 7 days.   

Repository breakdown:
* mozilla-inbound: 4
* try: 1
* mozilla-central: 1
* autoland: 1

Platform breakdown:
* linux32: 5
* linux64: 2

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-07-03&endday=2017-07-09&tree=all
This has become very infrequent.
Whiteboard: [stockwell needswork] → [stockwell unknown]
6 failures in 720 pushes (0.008 failures/push) were associated with this bug in the last 7 days.   

Repository breakdown:
* autoland: 3
* pine: 1
* mozilla-inbound: 1
* mozilla-central: 1

Platform breakdown:
* linux32: 5
* linux64: 1

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-07-10&endday=2017-07-16&tree=all
(Assignee) Updated a year ago:
Component: WebExtensions: Untriaged → WebExtensions: Request Handling
(Assignee) Updated a year ago:
See Also: → bug 1317619
1 failure in 822 pushes (0.001 failures/push) was associated with this bug in the last 7 days.

Repository breakdown:
* autoland: 1

Platform breakdown:
* linux32: 1

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-07-17&endday=2017-07-23&tree=all
1 failure in 1008 pushes (0.001 failures/push) was associated with this bug in the last 7 days.

Repository breakdown:
* mozilla-beta: 1

Platform breakdown:
* windows7-32-vm: 1

For more details, see:
https://brasstacks.mozilla.com/orangefactor/?display=Bug&bugid=1326217&startday=2017-07-24&endday=2017-07-30&tree=all

Comment 31, 11 months ago:
This hasn't happened in ages; I think tests got shuffled around and the duplicate took the place of this one.
Status: NEW → RESOLVED
Last Resolved: 11 months ago
Resolution: --- → DUPLICATE
Duplicate of bug: 1317619
(Assignee) Updated 11 months ago:
Flags: needinfo?(kmaglione+bmo)

Updated 3 months ago:
Product: Toolkit → WebExtensions