Closed Bug 1496075 Opened 6 years ago Closed 6 years ago

Support loading search extensions on startup

Categories

(Firefox :: Search, enhancement, P1)


Tracking


RESOLVED FIXED
Firefox 68
Tracking Status
firefox68 --- fixed

People

(Reporter: daleharvey, Assigned: daleharvey)

References

(Depends on 1 open bug)

Details

Attachments

(7 files, 7 obsolete files)

11.03 KB, patch
rhelmer
: feedback+
Details | Diff | Splinter Review
91.60 KB, patch
daleharvey
: feedback+
Details | Diff | Splinter Review
47 bytes, text/x-phabricator-request
Details | Review
47 bytes, text/x-phabricator-request
Details | Review
47 bytes, text/x-phabricator-request
Details | Review
47 bytes, text/x-phabricator-request
Details | Review
47 bytes, text/x-phabricator-request
Details | Review
Assignee: nobody → dharvey
Blocks: 1486820
Priority: -- → P1
See Also: → webextensions-startup
not related.
See Also: webextensions-startup
Bug 1496075 - Load WebExtensions in nsSearchService (MozReview-Commit-IDs: 4eQFFULmutG, C9I9onbhJ2A)
Attached patch fixBuild (obsolete) — Splinter Review
This fixes and simplifies the build patch in phab, placing unzipped extensions into the omnijar.
Attachment #9020228 - Flags: feedback?(dharvey)
This is some work towards loading extensions from the omnijar. It works, but we probably want to go further with it somehow (rather than the XPInstall hack I did with installAddon). I just want to get some rough thoughts on where this is going, and any input on something better/easier.
Attachment #9020229 - Flags: feedback?(rhelmer)
Attachment #9020229 - Flags: feedback?(aswan)
Attached patch fixBuild (obsolete) — Splinter Review
Attachment #9020228 - Attachment is obsolete: true
Attachment #9020228 - Flags: feedback?(dharvey)
Attachment #9020424 - Flags: feedback?(dharvey)
Attached patch fixBuild — Splinter Review
Attachment #9020425 - Flags: feedback?(dharvey)
Attachment #9020424 - Attachment is obsolete: true
Attachment #9020424 - Flags: feedback?(dharvey)
Comment on attachment 9020229 [details] [diff] [review]
nonTemporaryInstall

Looks reasonable to me - nice to see how small the patch is. It's been so long since I've done any significant work on addons code that whatever aswan says should totally override my opinion here :)
Attachment #9020229 - Flags: feedback?(rhelmer) → feedback+
Summary: Support loading WebExtensions on startup → Support loading search extensions on startup
Comment on attachment 9020425 [details] [diff] [review]
fixBuild

This is a much nicer way to define what build files are needed and to pick up the resource path, cheers
Attachment #9020425 - Flags: feedback?(dharvey) → feedback+
reminder to myself to produce demo code for installing from chrome:/resource: urls
Flags: needinfo?(aswan)
Just an update on this based on a product meeting we just had: it seems very likely that the addons we want to load will be signed .xpi files within omni.ja
(In reply to Dale Harvey (:daleharvey) from comment #10)
> Just an update on this based on a product meeting we just had, it seems very
> likely that the addons we want to load will be signed .xpi files within
> omni.ja

Wait, signed .xpi files inside omni.ja is different from what we talked about last week(?). If that's the plan there are some open questions, but I think it makes the request from comment 9 irrelevant?
Flags: needinfo?(aswan) → needinfo?(dharvey)
We will land signed XPIs into m-c (maybe not initially signed in order to get other stuff working). Since we have almost 100 of them, I think it might still be beneficial to have them in omni.jar, but if you feel otherwise, I'm not attached to it.
I thought you were waiting on me to produce a prototype in which the search engine contents would come directly from omni.ja, not as xpi files embedded in omni.ja. The requirements here have never been very clearly stated, so I don't have an opinion on whether putting xpi files into omni.ja is an appropriate implementation or not.

If we do end up putting xpi files into omni.ja, that doesn't mean we need to check xpi files into the tree. And given that checked-in xpi files are terrible for looking through history/blame, let's not do that.
Product wants these signed. Getting signing into the production builds is problematic. Our builtin search engines rarely change. The easiest path is to drop signed XPIs into the tree.
Just as easy is to check in the components of a signed XPI and actually put together signed XPIs at build time. That is much friendlier for history/blame. And to be very clear, that means you're not waiting on me for anything on this bug any more, correct?
Ah, right, we can zip them at build time. I'm guessing we don't need anything new...though I haven't looked at what to do to load a signed xpi.
Yup, apologies; as we mentioned during the last meeting, this is a thing where we had ideas about where to go but not a concrete product decision, and we got the product decision tonight. Sorry if it wasted any of your time. Checking in the components as explained @ https://wiki.mozilla.org/Add-ons/Extension_Signing sounds good.
Flags: needinfo?(dharvey)
This, based on top of https://bugzilla.mozilla.org/show_bug.cgi?id=1492475, puts the webextension source files in the tree, then builds them into xpi files which are loaded by nsSearchService on startup.

Only the engines loaded by default in English are included, for the sake of a readable diff (splitting them into 2 commits didn't work great before).

The main open question is where to go from here: this doesn't handle updates, removing outdated engines, caching at startup, etc. (it simply tries to install every engine on every startup). We can have nsSearchService handle all that; I think Shane had mentioned it may be a good idea for XPIDatabase(?) or somewhere to manage the actual addons installation and send queries to nsSearchService about what should actually be installed.

Also trying to think of a nicer way to handle the deadlock between installing engines inside the init. The build files obviously aren't going to land like that, but I'm not sure there are many unknown complexities in there; I will ask someone in #build about what it should look like and start to clean that up.
Attachment #9020227 - Attachment is obsolete: true
Hi Andrew. I was looking for some guidance based on what we discussed before.

We no longer need to support loading unzipped extensions from the omni jar, as these extensions will be signed .xpi files in release builds and unsigned .xpi files in non-release builds. We are currently planning on nsSearchService handling the installation and keeping them up to date.

Currently the patch @ https://phabricator.services.mozilla.com/D12737 does the building of the (unsigned xpi) files and uses:

let install = await AddonManager.getInstallForFile(file);
await install.install();

to install, but we do need to differentiate these web extensions from plain user-installed extensions: they will not show up in about:addons, and as I previously mentioned they will be unsigned in non-release builds, so we need a way to disable the signature check in that case. Shane had started a patch in https://bug1496075.bmoattachments.org/attachment.cgi?id=9020229 to do some of this, but via XPIInstall, which was suggested against, so I was wondering how you think that would best be achieved. Cheers
Flags: needinfo?(aswan)
A couple of things.

First, that patch is going to copy search engines from omni.ja into the profile, is that what you intend?

As for distinguishing built-in search engines from other extensions for the purpose of showing them in about:addons and deciding whether to require signing, I guess one option is bug 1491074. However, if you want to load the search engines directly from omni.ja without copying them into the profile, I think we'll need to create a new install location inside the addon manager, at which point we could establish rules for signing for everything in that location.

This all might be easier to discuss interactively?
Flags: needinfo?(aswan)
(In reply to Andrew Swan [:aswan] from comment #20)
> A couple of things.
>
> First, that patch is going to copy search engines from omni.ja into the
> profile, is that what you intend?

No, they should install in-place.

> As for distinguishing built-in search engines from other extensions for the
> purpose of showing them in about:addons and deciding whether to require
> signing, I guess one option is bug 1491074.

Preferably not, at least until we have more product and policy direction on this.

> However, if you want to load the search engines directly from omni.ja
> without copying them into the profile, I think we'll need to create a new
> install location inside the addon manager, at which point we could establish
> rules for signing for everything in that location.

So that would be along the lines of the BuiltInSearchLocation class I created in attachment 9020229?
(In reply to Shane Caraveo (:mixedpuppy) from comment #21)
> So that would be along the lines of the BuiltInSearchLocation class I
> created in attachment 9020229?

Something like that. We discussed this before and I don't remember where we left off -- we can obviously update these when a new app version is installed, but do these need to be able to be updated independently of app updates?
(In reply to Andrew Swan [:aswan] from comment #22)
> (In reply to Shane Caraveo (:mixedpuppy) from comment #21)
> > So that would be along the lines of the BuiltInSearchLocation class I
> > created in attachment 9020229?
>
> Something like that. We discussed this before and I don't remember where we
> left off -- we can obviously update these when a new app version is
> installed but do these need to be able to be updated independently of app
> updates?

The only goal is to be able to load signed extensions from omni.jar and not have them appear in about:addons. SearchService may want to change which extensions are loaded, so unload is also necessary.
Yup, we will need to be able to remove them. Updating actually isn't an issue as far as I can tell: neither list.json nor absearchdata has any notion of versions, so I think an update is either a wipe and reinstall of everything, or more often just making a new plugin (we are moving to making a new engine for every new code). Adding code for nsSearch to keep track of what should be installed, but yeah, it would be great to have some guidance on how to install (and remove) signed .xpi files in place without them showing in about:addons (and allowing unsigned .xpis in Nightly). Can talk on irc or vidyo when is good for y'all; does 4PM UTC today work? irc or vidyo works for me
(In reply to Dale Harvey (:daleharvey) from comment #24)
> Yup, we will need to be able to remove them. Updating actually isnt an issue
> as far as I can tell, list.json nor absearchdata have any notion of versions
> so I think an update is either a wipe and reinstall all, or more often just
> making a new plugin (we are moving to making a new engine for every new
> code).

I didn't really follow all of that, but would it work to have a fixed (at build time) list of addons that are all permanently installed? The search service can then enable and disable these as needed, but cannot change them outside of an update to the application.

Also, Shane and I talked the other night and we're pretty certain that the existing signature verification code is not going to work here. That code is maintained by the security/nss folks, so it's probably worth filing a separate bug to get that code added.
> I didn't really follow all of that, but would it work to have
> a fixed (at build time) list of addons that are all permanently
> installed. The search service can then enable and disable these
> as needed, but cannot change them outside of an update to the application?

That should work fine if that's what's best from the addons side of things.

> existing signature verification code is not going to work here.

Because of the .xpi's being inside the omni jar, or something else? They don't need to be kept inside there. I am not super familiar with how we build the firefox distro and that was the code example I found to follow; if it will 'just work' if we put the .xpi files elsewhere, that would be fine
(In reply to Dale Harvey (:daleharvey) from comment #26)
> > existing signature verification code is not going to work here.
>
> Because of the .xpi's being inside the omni jar or something else?

Because the signature verification code only knows how to verify the contents of an nsIFile, i.e. an actual file on the filesystem: https://searchfox.org/mozilla-central/rev/e22c0d152060f4f8d4ca8904094f15f65a1b6f93/security/manager/ssl/nsIX509CertDB.idl#264-266

> They don't need to be kept inside there. I am not super familiar with how we
> build the firefox distro and that was the code examples I found to follow, if
> it will 'just work' if we put the .xpi files elsewhere that would be fine

I'm confused, I thought Shane said the opposite in comment 21.
I think he would prefer omni.ja if it doesn't cause problems, but if it does it's not a requirement (https://bugzilla.mozilla.org/show_bug.cgi?id=1496075#c12). As for copying into the profile vs installing in-place, I don't know what tradeoffs are involved, but we definitely want install to happen as fast as possible, so if in-place is faster then that may be a reason to lean that way
Hey mconnor,

We currently have the ability for distributions to load their own engines in distribution directories @ https://dxr.mozilla.org/mozilla-central/source/toolkit/components/search/nsSearchService.js#2917. Is this something we still need to support in the WebExtensions world?
Flags: needinfo?(mconnor)
> Is this something we still need to support in Web Extensions world?

Yes. We can't easily migrate legacy distributions over to WebExtensions.
Comment on attachment 9020229 [details] [diff] [review]
nonTemporaryInstall

This has moved over to bug 1512436
Attachment #9020229 - Flags: feedback?(aswan)
Depends on: 1528186
Depends on: 1529374
Depends on: 1529321

Hi Mike

We are looking to replace openSearch plugins with WebExtensions, currently we package the plugins @ https://searchfox.org/mozilla-central/source/browser/components/search and expose them via resource://search-plugins. Basically we want to s/plugins/extensions

This patch does that; however, I needed to add a patch to the packager to avoid these extensions being recognised as extensions and having something else happen to them @ https://phabricator.services.mozilla.com/D17212#change-0ZE6dsZ1Knq1. My patch works, but I'm guessing it's not what you would want; could you suggest another way?

Cheers
Dale

Flags: needinfo?(mh+mozilla)

Unfortunately, I don't have much more information to work from than what you gave me on irc, and I'm still confused why you need those to actually be full fledged webextensions with a manifest.json, if you read them from resource://search-plugins.

Flags: needinfo?(mh+mozilla)

Sorry if I am being confusing, I don't really understand your questions.

We want them to be webextensions, and webextensions have a manifest. https://phabricator.services.mozilla.com/D17213 is the commit to add the search extensions; we are going to add search extensions and remove search plugins. The extensions are read from resource://search-extensions, and resource://search-plugins is going to go away

We need them to be extensions for product reasons: a single path to install / control engines in a way that we can update / revoke them etc., which we can't do with opensearch (as well as it being confusing to do that with multiple entrypoints)

it might be better to exclude based on the directory name, then. ni? me again on the bug.

Excluding based on the directory name works fine for me; I could do with a pointer on how to do it

Cheers

Flags: needinfo?(mh+mozilla)

More fallout from having to use multilocale extensions. First off:

https://searchfox.org/mozilla-central/source/browser/components/search/searchplugins/google-b-1-d.xml
https://searchfox.org/mozilla-central/source/browser/components/search/searchplugins/google-2018.xml

These engines have different params + mozParams. Extension localisation doesn't just let me say manifest.params = locale.params; I have to specify them individually in manifest.json, which obviously doesn't work when different 'locales' have different sets of parameters. I have tried encoding the parameters in the search_url and only using params for mozParams, but that 1. doesn't help, because different google 'locales' have different mozParams, and 2. seems to break some functionality in SearchService that's expecting to get the params as an array. This could be a bigger problem (the same issue would happen if only one locale had a suggest_url) but I've only seen it for mozParams at the moment

And then:

https://searchfox.org/mozilla-central/source/browser/components/search/searchplugins/amazondotcn.xml
https://searchfox.org/mozilla-central/source/browser/components/search/searchplugins/amazondotcom-de.xml

We have some completely different engines that use the same icons; rolling these into the same extension means changes to absearchdata, which fairly definitely rules this out of landing for this release

Happy for any suggestions on how to get this stuff done; I will carry on looking into it over the weekend, but I think it's safe to say that the multilocale stuff means this isn't going to make it by Monday

Shane any suggestions for ^?

Flags: needinfo?(mixedpuppy)

(In reply to Dale Harvey (:daleharvey) from comment #42)

> Shane any suggestions for ^?

A couple.

  1. In _addEngineForManifest you could cull any param that has an empty value. That way you can include all possible params, the locale file would simply have an empty value for params it does not need.

or

  2. You could add a paramsString to the schema and parse that into an array in _addEngineForManifest (as I think other places in SearchService expect those as an array):

schema:

search_provider: {
  paramsString: {
    "type": "string",
    "description": "a string of all params",
    "preprocess": "localize"
  },
}

manifest.json:

search_provider: {
  paramsString: "client=firefox-b-1-d&q={searchTerms}" // can be localized
}

Regarding the icon, if you really must have separate extensions, just add the same icon under different names within the extension. But if you use either approach above, I see no reason that the amazon extension has to be two separate extensions.

Flags: needinfo?(mixedpuppy)

Hrm, option 2 above won't work for MozParam differences, so you probably would have to go with option 1 to handle that.
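For what it's worth, option 1 could be sketched as a small standalone function. This is a hypothetical illustration only: it assumes a search_provider object whose params is an array of {name, value} entries, and the real _addEngineForManifest shape may well differ.

```javascript
// Hypothetical sketch of option 1: every locale's manifest declares the full
// superset of params, and any param whose localized value came back empty is
// dropped before the engine is created.
function cullEmptyParams(searchProvider) {
  const params = searchProvider.params || [];
  return Object.assign({}, searchProvider, {
    // Keep only params that have a non-empty value for this locale.
    params: params.filter(p => p.value !== undefined && p.value !== ""),
  });
}

// Example: a locale that leaves "channel" empty simply loses that param.
const provider = {
  search_url: "https://www.example.com/search",
  params: [
    { name: "client", value: "firefox-b-1-d" },
    { name: "channel", value: "" }, // left empty by this locale
    { name: "q", value: "{searchTerms}" },
  ],
};

const culled = cullEmptyParams(provider);
console.log(culled.params.map(p => p.name)); // ["client", "q"]
```

With this, a locale opts out of a param simply by localizing its value to the empty string, which also covers per-locale MozParam differences.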

(In reply to Dale Harvey (:daleharvey) from comment #39)

> > it might be better to exclude based on the directory name, then. ni? me again on the bug,
>
> Excluding based on the directory name works fine for me, could do with a pointer on how to do it
>
> Cheers

I'd go with something like:

diff --git a/python/mozbuild/mozpack/packager/__init__.py b/python/mozbuild/mozpack/packager/__init__.py
index 4b9126e44db45..0c8cb434ba57b 100644
--- a/python/mozbuild/mozpack/packager/__init__.py
+++ b/python/mozbuild/mozpack/packager/__init__.py
@@ -275,7 +275,9 @@ class SimplePackager(object):
             self._queue.append(self.formatter.add_interfaces, path, file)
         else:
             self._file_queue.append(self.formatter.add, path, file)
-            if mozpath.basename(path) == 'install.rdf':
+            if mozpath.basedir(path, ['search-extensions']):
+                pass
+            elif mozpath.basename(path) == 'install.rdf':
                 addon = True
                 install_rdf = file.open().read()
                 if self.UNPACK_ADDON_RE.search(install_rdf):
Flags: needinfo?(mh+mozilla)

I gave that a try run but the packaging failed; I'm not sure I described the directory layout right, and I'm having a hard time finding docs for basedir to try the right thing myself. The path here is

Nightly.app/Contents/Resources/browser/chrome/browser/search-extensions/wikipedia/manifest.json

Flags: needinfo?(mh+mozilla)
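The reason the basedir check misses this path can be shown with simplified stand-ins for the two behaviours involved (these are illustrative reimplementations, not the actual mozpack code): basedir only matches a leading path component, while a glob like '**/search-extensions/*/manifest.json' effectively keys on the directory appearing anywhere in the path.

```python
def basedir(path, bases):
    # Simplified stand-in for mozpath.basedir: return the first base that is
    # a *leading* component of `path`, else None.
    for base in bases:
        if path == base or path.startswith(base + "/"):
            return base
    return None

def has_component(path, name):
    # What a '**/search-extensions/*/manifest.json' style pattern effectively
    # relies on: the directory appears *somewhere* in the path.
    return name in path.split("/")

p = ("Nightly.app/Contents/Resources/browser/chrome/browser/"
     "search-extensions/wikipedia/manifest.json")

print(basedir(p, ["search-extensions"]))      # None: not a leading component
print(has_component(p, "search-extensions"))  # True
```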

Since this goes in chrome, there's actually a more "elegant" solution that doesn't involve hardcoding the search-extensions location, but I still need to think how this should be hooked up.

Robert, do you have any suggestions on what to do for https://phabricator.services.mozilla.com/D17212#inline-125141 just to ensure I don't break the test with this patch?

I added isBuiltin to the isSystem flag because test_telemetryEnvironment.js checked for the signed state of addons, and these built-in ones are not signed (https://searchfox.org/mozilla-central/source/toolkit/components/telemetry/tests/unit/test_TelemetryEnvironment.js#719). I originally added it as a separate flag, but then it was mentioned that would need a data collection review, which I figured I would avoid for a change purely to get tests passing

Happy to do whatever's best here

Flags: needinfo?(rhelmer)

(In reply to Dale Harvey (:daleharvey) from comment #48)

> Robert do you have any suggestions on what to do for https://phabricator.services.mozilla.com/D17212#inline-125141 just to ensure I dont break the test with this patch?
>
> I added isBuiltin to the isSystem flag because test_telemetryEnvironment.js checked for the signed state of addons and these built in ones are not signed (https://searchfox.org/mozilla-central/source/toolkit/components/telemetry/tests/unit/test_TelemetryEnvironment.js#719). I originally added it as a seperate flag but then it was mentioned would need a data collection review which I figured would avoid for a change purely to get tests passing
>
> Happy to do whatevers best here

Oh, I see what you mean about changing the field. I think you're right that adding a new field would be ideal, but that's going to take some dedicated effort.

OK I think going forward with setting isSystem for search extensions is fine with me. I'll file a separate bug about cleaning up the "built-in" vs. "system" naming, that doesn't need to block this work.

Flags: needinfo?(rhelmer) → needinfo?(dharvey)

Ok, that's great, cheers

Flags: needinfo?(dharvey)

Hey Raphael

I was wondering if you could take a look at this; it's the only failure for this try run @ https://treeherder.mozilla.org/#/jobs?repo=try&revision=a8779512b4987482c21750d382f03ffdd2421c47&selectedJob=231980784

I modified the script to wait for SearchService to start up, which looks to be working, but the test is still failing and I'm having a hard time figuring out what could have broken it. https://treeherder.mozilla.org/logviewer.html#/jobs?job_id=231980784&repo=try&lineNumber=2953 is the failure.

Flags: needinfo?(rpierzina)

(In reply to Mike Hommey [:glandium] from comment #47)

> Since this goes in chrome, there's actually a more "elegant" solution that doesn't involve hardcoding the search-extensions location, but I still need to think how this should be hooked up.

There is no simple way to do this the elegant way. It would require something like bug 1158018 first.

For now, I guess something like:

--- a/python/mozbuild/mozpack/packager/__init__.py
+++ b/python/mozbuild/mozpack/packager/__init__.py
@@ -285,7 +285,8 @@ class SimplePackager(object):
                 if self.UNPACK_ADDON_RE.search(install_rdf):
                     addon = 'unpacked'
                 self._add_addon(mozpath.dirname(path), addon)
-            elif mozpath.basename(path) == 'manifest.json':
+            elif mozpath.basename(path) == 'manifest.json' and \
+                    not mozpath.match(path, '**/chrome/browser/search-extensions/*/manifest.json'):
                 manifest = file.open().read()
                 try:
                     parsed = json.loads(manifest)

would do it.

Flags: needinfo?(mh+mozilla)

Awesome thanks glandium, updated the commit

I have been debugging the failure mentioned @ https://bugzilla.mozilla.org/show_bug.cgi?id=1496075#c51

The main test I am looking at is telemetry/marionette/tests/client/test_search_counts_across_sessions.py, it fails with

File "/builds/worker/workspace/build/tests/telemetry/marionette/tests/client/test_search_counts_across_sessions.py", line 83, in test_search_counts
[task 2019-03-05T22:40:14.842Z] 22:40:14 INFO - self.assertIsNone(ping1_info["previousSubsessionId"])

The issue is that the new search engine installs trigger startNewSubsession(), which means the hardcoded subsession values expected in the test are no longer the right values

It seems either telemetry should ignore the installs, or the test needs updating with new expected values; the former seems to me like the best approach

Telemetry by default does not generate multiple subsessions from multiple addon installs in startup.
I suspect the test you mention is overriding a testing preference, probably toolkit.telemetry.minSubsessionLength.

If that is true, we could:

  • not override that preference, if it is not needed in that test
  • accept multiple subsessions being generated, if the preference is needed for the test

The preference is definitely set in the tests and is required, the test is also installing addons itself to trigger new subsessions.

I wonder... if these new addons are considered builtin & are always installed, maybe we should just not trigger new subsessions for them, based on an addon id whitelist?

Chris, what do you think?

Flags: needinfo?(chutten)

Dale mentioned that these addons always have isBuiltin set, so we could maybe also just filter on that.
We could always update the environment data, but only trigger new subsessions when !isBuiltin?

I think the minimal set of changes is:

Are you able to test this out?
Or should we follow up with a patch in a separate bug early next week?

There's a discussion we could have about how, since no extensions are Legacy Extensions any more, we don't need to cut subsessions on addon changes of any kind... but let's defer that.

For now having a check on isBuiltin sounds like it'd have minimal code impact and not manipulate subsessions too drastically, making it an ideal solution for this case. We'll need to take a close look at environment-change topic listeners to see if some still need to be notified but others must not be notified, but that comes with every code change to TelemetryEnvironment anyway.

Flags: needinfo?(chutten)
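A minimal sketch of the isBuiltin check being discussed (illustrative only: the real change would live in TelemetryEnvironment's addon-change handling, and these addon objects are mocked; the built-in search extension ID is given as an example):

```javascript
// Illustrative sketch, not the actual TelemetryEnvironment code: update the
// environment for every addon change, but only cut a new subsession when the
// changed addon is not a built-in one.
function shouldStartNewSubsession(addon) {
  // Built-in search extensions always ship with isBuiltin set, so a change
  // to one updates the environment without splitting the subsession.
  return !addon.isBuiltin;
}

const builtinSearch = { id: "google@search.mozilla.org", isBuiltin: true };
const userAddon = { id: "some-user-addon@example.com", isBuiltin: false };

console.log(shouldStartNewSubsession(builtinSearch)); // false
console.log(shouldStartNewSubsession(userAddon));     // true
```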

We don't check for specific subsession IDs, but rather verify that when new subsessions are created it is reflected in the telemetry ping data. The suggested minimal set of changes from :gfritzsche sounds sensible to me!

Flags: needinfo?(rpierzina)

So due to the above test I have run into a fairly serious issue with the patch; glad we had a marionette test here, and I will be adding some dedicated marionette tests for this next

The issue is with how we interact with the AddonManager on startup. On first load everything is fairly easy, since we install an addon and then the addon gives us the details via addEnginesFromExtension; on restart we are usually loading from cache and everything is fine then. The issue is when we restart with an invalidated cache.

For example: first run, install google and yahoo; restart with new engines: keep google, uninstall yahoo, install ddg

I assumed (based on basic testing) that on restart with an invalidated cache, the AddonManager would boot and report all installed engines via addEnginesFromExtension before SearchService could possibly init; we would also be able to use AddonManager.getAddonByID inside SearchService init to find the engines that are not installed and install them

That assumption was wrong. I found a few bugs using the AddonManager on startup (getAddonByID returns null for an installed addon; getLocalizedManifest crashes @ https://gist.github.com/daleharvey/9acf6e21ad8ac4c9a778f7c0843c52f5#file-gistfile1-txt-L265), and while asking about those bugs in IRC, kmag told me I shouldn't be using the AddonManager on startup at all as it invokes the add-on DB and is very expensive; similarly, the AddonManager doesn't report the installed extensions on its startup but when the add-on DB is invoked

This leaves me a little stuck. When SearchService inits I obviously need to be able to get the details of the built-in engines, which right now means getting them via addEnginesFromExtension. If I can't use getAddonByID to check whether an addon is installed, then we have 2 options as far as I can see

  1. Keep track of the install state of addons + the parsed manifest details across restarts in the SearchService cache
  2. Make built-in extension installs stateless: they don't get persisted across firefox restarts (or store any data at all, really), and we can installBuiltInAddon and get back the parsed manifest details at any time

I am trying to see how far I can get with the former; we already cache the engine details, however I am worried about us keeping what is essentially an addons database in our cache, with the possibility of it going out of sync with the actual state of the addons and getting tricky bugs out the other end. The latter sounds much, much cleaner to me, but would need new addons work.

Happy to hear any suggestions for any other options here, or on how feasible getting stateless installs would be

Flags: needinfo?(mixedpuppy)
Flags: needinfo?(kmaglione+bmo)

Alright, so I'm assuming there's some way that language packs work on startup that somehow doesn't require the whole lot of machinery that we do? What would that be, and can we get the same treatment for search WebExtensions?

Also, is it possible to do something like await waitForNotification('addon-manager-startup') if such a thing exists, instead of explicitly initializing the AddonManager?

I'm quite surprised that nobody was able to mention this particular caveat to us before. Perhaps even disappointed.

I don't really understand comment 60 (e.g., is "the cache" something in the addons manager or in the search service? Not sure what "the AddonManager doesn't report the installed extensions on its startup" means, etc.)

However, perhaps I can give some background that will help. The addons manager starts relatively early during browser startup, and it loads the minimal set of information it needs to actually activate addons (including extensions and language packs as well as other types). During startup it does not load metadata about addons (i.e., full description, author information, etc.). So, when the search service is initializing, information about which addons are installed is readily available, but getAddonByID is the API for getting the detailed information that is loaded later, so it's not the right API to use during startup.

We provided a couple of suggestions on IRC yesterday that as far as I can tell, haven't been tried yet. One option is to use AddonManager.getActiveAddons() which can be used safely during startup. Another option is to use the patch that Kris provided on IRC yesterday to do "stateless" installs, it really amounts to the same thing as checking the thing you want to install against the results of getActiveAddons().
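To make the getActiveAddons() suggestion concrete, here is a mocked sketch of the decision logic. The AddonManager call itself is Firefox-internal, so plain ID lists stand in for its result here, and the engine IDs are illustrative:

```javascript
// Hypothetical sketch: given the IDs reported by a (mocked)
// AddonManager.getActiveAddons() and the list of engines the build ships,
// work out which engines still need installing on this startup.
function enginesNeedingInstall(activeAddonIds, builtinEngineIds) {
  const active = new Set(activeAddonIds);
  return builtinEngineIds.filter(id => !active.has(id));
}

// First run: nothing is active yet, so every shipped engine needs installing.
console.log(enginesNeedingInstall([], ["google@search.mozilla.org"]));

// Restart with an invalidated cache: google is already active, ddg is new in
// this build, and yahoo is active but no longer shipped (so something else
// would need to uninstall it).
const missing = enginesNeedingInstall(
  ["google@search.mozilla.org", "yahoo@search.mozilla.org"],
  ["google@search.mozilla.org", "ddg@search.mozilla.org"]
);
console.log(missing); // ["ddg@search.mozilla.org"]
```

This amounts to the same check as the "stateless" install patch mentioned above, just expressed against the getActiveAddons() result.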

(In reply to Mike de Boer [:mikedeboer] from comment #61)

> I'm quite surprised that nobody was able to mention this particular caveat to us before. Perhaps even disappointed.

Let's save the finger-pointing until after we actually get a clear picture of what is going on. I don't really see any serious problem here.

(In reply to Andrew Swan [:aswan] from comment #62)

> Lets save the finger-pointing until after we actually get a clear picture of what is going on. I don't really see any serious problem here.

True, it seems premature of me to do so indeed. My apologies! I think it's perhaps due to the number of back-and-forths this project has seen thus far that made me react in this way.

Thanks for the explanation and for providing alternatives for Dale to use. I think using getActiveAddons() would be a fine alternative implementation method.

After some conversation on IRC I think we have a way forward here; filed https://bugzilla.mozilla.org/show_bug.cgi?id=1534710, which would ensure addEnginesFromExtension gets called for all installed search extensions before SearchService.init is called

Flags: needinfo?(mixedpuppy)
Flags: needinfo?(kmaglione+bmo)
Depends on: 1534710

Depends on D17213

Attachment #9052186 - Attachment is obsolete: true
Attachment #9038859 - Attachment is obsolete: true
Depends on: 1539355
Attachment #9052190 - Attachment is obsolete: true
Attachment #9052189 - Attachment is obsolete: true

Ok, so I updated https://phabricator.services.mozilla.com/D25245 to undelete the assertions and additions to tab_scalars.py; the failures this causes will show up @ https://treeherder.mozilla.org/#/jobs?repo=try&revision=3aa00eae6e2f0443382620bb09b62345212ac4a1&selectedJob=238424866

Now taking a look at proper fixes: tab_scalars.py works with toolkit.telemetry.minSubsessionLength = 1000, and testing the search count will also work if I do that, but the subsession assertions will all fail. I think we should have tab_scalars.py and test_search_count.py (which only tests the search count values) run with minSubsessionLength = 1000, and add a new test_subsession_creation.py that runs with minSubsessionLength = 0 and does something along the lines of

    ping = self.wait_for_ping(
        self.install_addon, MAIN_ENVIRONMENT_CHANGE_PING
    )
    self.assertNotEqual(currentSubsessionID, previousSubsessionId)
    ping = self.wait_for_ping(
        self.restart_browser, MAIN_ENVIRONMENT_CHANGE_PING
    )
    self.assertNotEqual(sessionID, previousSessionId)

etc. How does that sound?

Flags: needinfo?(rpierzina)
Depends on: 1542383
No longer depends on: 1542383

(In reply to Dale Harvey (:daleharvey) from comment #72)

The code changes in test_search_counts_across_sessions.py in the latest revision of D25245 look good to me!

I agree that it makes sense to move checks related to Telemetry subsession management in that test to a new test. We have created bug 1539874 for that.

Flags: needinfo?(rpierzina)
Pushed by dharvey@mozilla.com:
https://hg.mozilla.org/integration/autoland/rev/7efb082cc25d
Part 1: Extensions changes to support search extensions. r=mixedpuppy,robwu
https://hg.mozilla.org/integration/autoland/rev/e07b4ceea077
Part 2: Telemetry changes to support search extensions. r=chutten,raphael
https://hg.mozilla.org/integration/autoland/rev/9e0b9cbb8982
Part 3: Use webextensions in SearchServices. r=mikedeboer,mixedpuppy
https://hg.mozilla.org/integration/autoland/rev/6534d80a3329
Part 4: Delete searchplugins. r=mixedpuppy
Regressions: 1543973
Regressions: 1544214
Regressions: 1544369
Regressions: 1544486
Flags: needinfo?(mconnor)
Regressed by: 1544392
No longer regressed by: 1544392
Regressions: 1544392
Regressions: 1544835
Regressions: 1545395
Regressions: 1545517
Regressions: 1547662
Regressions: 1549349
Regressions: 1552185
Regressions: 1552193
Regressions: 1555661
Regressions: 1565779
Regressions: 1570700
See Also: → 1929830