"main_frame" chrome.webRequest events fail to fire upon returning to a page that uses service workers
Categories: WebExtensions :: Request Handling, defect, P3
Tracking: Not tracked
People: Reporter: alexeiatyahoodotcom+mzllbgzll; Unassigned
Comment 5•6 years ago
This is also an issue for users of the new version of Twitter with Privacy Badger.
To reproduce (with the latest version of Privacy Badger installed):
- Navigate to https://mobile.twitter.com in a new tab and log in.
- Navigate to a different website (e.g. https://www.eff.org/) in the same tab.
- Via the address bar (in the same tab), navigate directly to a mobile.twitter.com URL, such as https://mobile.twitter.com/Twitter/status/1087791357756956680.
If you have Privacy Badger installed, this should throw the tab into an ugly redirect loop involving the login page that never lets you reach Twitter. This happens because the extension doesn't intercept any main_frame requests; all requests to "mobile.twitter.com" show up as type "xmlhttprequest". Privacy Badger therefore can't see that the first-party host for the tab should be mobile.twitter.com, so it treats requests to *.twitter.com as third-party tracking requests and tries to block them.
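For illustration, here is a minimal sketch of the kind of listener an extension like Privacy Badger relies on (not its actual code; it assumes a background script with the "webRequest" and "<all_urls>" permissions in the manifest). With this bug, the main_frame branch never runs when navigating back to the ServiceWorker-controlled page:

// background.js (illustrative sketch only)
browser.webRequest.onBeforeRequest.addListener(
  (details) => {
    const host = new URL(details.url).hostname;
    if (details.type === "main_frame") {
      // Normally this is where an extension records the tab's new
      // first-party host; with this bug it never fires on the return visit.
      console.log("main_frame navigation to", host);
    } else {
      // Instead, the navigation only shows up as "xmlhttprequest" traffic.
      console.log(details.type, "request to", host);
    }
  },
  { urls: ["<all_urls>"] }
);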
Any updates on this? Is there a timeline?
Comment 6•6 years ago
The ServiceWorker overhaul should be landing in Fx68, although it's not guaranteed we will flip the preference to enable it; we're trying hard. In the meantime, it may be best to temporarily set the dom.serviceWorkers.enabled preference to false to disable ServiceWorkers in your profile, keeping in mind that the preference will need to be flipped back manually later on.
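If you go that route, one way to pin the preference (a sketch of the workaround, not a permanent recommendation) is a line in user.js in your profile directory:

// user.js in the Firefox profile directory (temporary workaround; remove
// this line again once the ServiceWorker overhaul ships, or the preference
// will stay pinned to false).
user_pref("dom.serviceWorkers.enabled", false);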
Comment 7•6 years ago
I'm not absolutely certain, but I suspect this bug is responsible for Firefox showing "Corrupted Content Error" on sites reached from search engine results, yet working fine when the exact same URL is pasted into a new tab.
Here is a short discussion:
https://forum.bytemark.co.uk/t/bytemark-website-fails-to-display-when-i-use-google-to-find-it/3053
I've seen this "Corrupted Content Error" only occasionally before. Until now I assumed it was a bug in Firefox's TLS or HTTP implementation, or a bug in the site's implementation, and I didn't think to copy the URL and try it in a new tab!
This time, I thought it was just another site with a bad HTTP server, or an annoying "we only test our HTTP server on Chrome" type of issue, until I noticed the console message about a ServiceWorker and thought to look there. Kudos to whoever added the ServiceWorker message to Firefox :-)
Note the error message indicates more than just Privacy Badger misclassifying a page when following a cross-site link:
Failed to load ‘https://www.bytemark.co.uk/’. A ServiceWorker passed a promise to FetchEvent.respondWith() that rejected with ‘TypeError: NetworkError when attempting to fetch resource.’.
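For context, a message like that comes from a fetch handler whose respondWith() promise rejects; a bare-bones sketch (not the site's actual worker) looks like this:

// sw.js (illustrative sketch). If fetch(event.request) rejects, for example
// because the request was blocked before reaching the network, the promise
// handed to respondWith() rejects and Firefox shows "Corrupted Content Error".
self.addEventListener("fetch", (event) => {
  event.respondWith(fetch(event.request));
});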
When that error occurs, it would be helpful if Firefox displayed a better error page than "Corrupted Content Error".
Comment 8•6 years ago
If any site that uses ServiceWorker could suffer from connection errors due to this bug, is there a way for the site developer to detect the condition and turn off the ServiceWorker, or otherwise work around the glitch?
ServiceWorkers are sometimes used just to cache and accelerate.
If there are detectable situations where a site will break despite a correctly written ServiceWorker (even if it's due to a browser bug, or an unfortunate interaction with an add-on), it would be preferable for the site to catch the error and skip using ServiceWorker altogether.
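As far as the "turn off the ServiceWorker" part goes, a page script can unregister its workers so that later navigations bypass them. Whether that sidesteps this particular webRequest bug for affected extensions isn't established here; this is only a sketch of the escape hatch:

// Page script (illustrative sketch): unregister this origin's ServiceWorkers
// so subsequent navigations are handled by the network directly.
if ("serviceWorker" in navigator) {
  navigator.serviceWorker.getRegistrations().then((registrations) => {
    for (const registration of registrations) {
      registration.unregister();
    }
  });
}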
Comment 9•6 years ago
Bug 1503072 covers handling corrupted content errors better. Competing priorities have delayed implementation.
Comment 10•6 years ago
(In reply to Andrew Sutherland [:asuth] (he/him) from comment #9)
> Bug 1503072 covers handling corrupted content errors better. Competing priorities have delayed implementation.
Thanks.
That's useful, though it's disappointing that such a catastrophic issue (for any site permanently affected by it, since the 24-hour recovery timeout described in that bug never happens) is marked as a mere "enhancement".