Open Bug 1385415 Opened 7 years ago Updated 2 years ago

Add excludeUrls property to webRequest.RequestFilter

Categories

(WebExtensions :: Request Handling, enhancement, P5)

Tracking

(Not tracked)

UNCONFIRMED

People

(Reporter: kernp25, Unassigned)

Details

I noticed while developing my add-on [1] (it redirects PDF URLs to the Google viewer page) that the viewer page itself loads many URLs which trigger the webRequest.onHeadersReceived event. I can safely ignore these URLs, because they come from the viewer page and aren't even PDF links.

For example:
https://accounts.google.com/o/oauth2/
https://clients6.google.com/
https://content.googleapis.com/

I'm not interested in these URLs, so I think the following code should work:
webRequest.onHeadersReceived.addListener(
  processHeaders,
  {
    urls: ["*://*/*"],
    excludeUrls: [
      "https://accounts.google.com/o/oauth2/*",
      "https://clients6.google.com/*",
      "https://content.googleapis.com/*"
    ],
    types: ["main_frame", "sub_frame", "xmlhttprequest"]
  },
  ["blocking", "responseHeaders"]
);
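Until such a property exists, the exclusion can be emulated inside the listener by bailing out early for matching URLs. A minimal sketch, assuming a hand-rolled patternToRegExp helper (this helper and shouldIgnore are illustrative, not part of the webRequest API, and the helper only handles the "*" wildcard, not scheme or host wildcards):

```javascript
// Convert a simple match pattern like "https://clients6.google.com/*"
// into a RegExp. Only the "*" wildcard is handled in this sketch.
function patternToRegExp(pattern) {
  const escaped = pattern
    .split("*")
    .map(part => part.replace(/[.+?^${}()|[\]\\]/g, "\\$&"))
    .join(".*");
  return new RegExp("^" + escaped + "$");
}

const excludePatterns = [
  "https://accounts.google.com/o/oauth2/*",
  "https://clients6.google.com/*",
  "https://content.googleapis.com/*",
].map(patternToRegExp);

function shouldIgnore(url) {
  return excludePatterns.some(re => re.test(url));
}

// Inside the listener, skip excluded URLs before doing any real work:
// function processHeaders(details) {
//   if (shouldIgnore(details.url)) {
//     return;
//   }
//   ...
// }
```

Of course this still pays the cost of the listener being invoked at all, which is exactly what a native excludeUrls filter would avoid.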

I think this would also improve performance, because the code after this.shouldRunListener [2] would not be called for the excluded URLs.

[1] https://addons.mozilla.org/firefox/addon/google-pdf-viewer/
[2] http://searchfox.org/mozilla-central/source/toolkit/modules/addons/WebRequest.jsm#878
What do you think of this idea?
Flags: needinfo?(mixedpuppy)
If you're listening on all URLs like that (and why a pattern rather than &lt;all_urls&gt;?), I don't see excluding a few as providing any real performance benefit.

I'm personally not really against having an exclude list, even if just for convenience, but I don't see a big gain.
Flags: needinfo?(mixedpuppy)
(In reply to Shane Caraveo (:mixedpuppy) from comment #2)
> If you're listening on all urls like that (and why a pattern rather than
> all_urls?), I don't see excluding a few as providing any real benefit in
> performance.

I'm only interested in URLs with the http/https schemes, because other schemes (e.g. ftp) don't work with the Google viewer.

> I'm personally not really against having an exclude list even if just for
> convenience, but I don't see a big gain.

The same situation exists with content_scripts [1] (e.g. exclude_matches): it also only applies to a few URLs, but the option is there.

[1] https://developer.mozilla.org/en-US/Add-ons/WebExtensions/manifest.json/content_scripts
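For comparison, this is roughly what the existing content_scripts shape looks like in manifest.json (the match patterns and file name here are illustrative, not taken from a real add-on):

```json
"content_scripts": [
  {
    "matches": ["*://*.example.org/*"],
    "exclude_matches": ["*://*.example.org/private/*"],
    "js": ["cs.js"]
  }
]
```

An excludeUrls key on webRequest.RequestFilter would mirror exclude_matches here.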
Flags: needinfo?(amckay)
This is a spin off from bug 1367320.
Flags: needinfo?(amckay)
Priority: -- → P5
Product: Toolkit → WebExtensions
Severity: normal → S3