Closed Bug 598697 Opened 14 years ago Closed 6 years ago

<ext-xpi-root>/searchplugins/ doesn't work with packed extensions

Categories

(Firefox :: Search, defect, P5)

Other
All

Tracking


RESOLVED WONTFIX

People

(Reporter: BenB, Unassigned)

References

Details

(Keywords: dev-doc-complete, Whiteboard: [fxsearch])

Attachments

(1 file, 1 obsolete file)

In FF3.6, an extension could add search plugins by placing them in the searchplugins/ folder of the XPI. They'd be picked up automatically by Firefox, and the extension could use them, set one as the default (after user confirmation with a custom dialog), etc.

This is now broken, because XPIs are no longer extracted since bug 533038. I assume the code that locates the search plugins needs to be adapted to that change.

This is serious, because the addEngine() API is seriously broken for other reasons (see the description of bug 598623).
Either extensions bundling search plugins need to always unpack, or we just fix the regression.
I'll take a look, but it's known that not every extension will work without unpacking. em:unpack is provided for those extensions using features that require extracted files. I don't consider this a regression, since extensions need to be updated for Firefox 4 anyway, and determining whether em:unpack is required is part of that process.
Assignee: nobody → mwu
Keywords: regression
Summary: [regression] <ext-xpi-root>/searchplugins/ no longer works → <ext-xpi-root>/searchplugins/ doesn't work with packed extensions
What other parts of an addon won't work if it is packed? Where is this documented?
Binary components and ctypes DLLs, obviously. Perhaps other things.

mwu, I don't think you should spend time on this for FF4. Extensions with searchplugins can just use <em:unpack>.
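For reference, a minimal install.rdf sketch of that opt-out (the id is a placeholder; only the em:unpack line matters here):

    <RDF xmlns="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:em="http://www.mozilla.org/2004/em-rdf#">
      <Description about="urn:mozilla:install-manifest">
        <em:id>myextension@example.com</em:id>
        <!-- ask the add-on manager to extract the XPI instead of running it packed -->
        <em:unpack>true</em:unpack>
        <!-- rest of the manifest unchanged -->
      </Description>
    </RDF>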
Assignee: mwu → nobody
I checked MDC and can't find a document which describes "em:unpack". Can we make sure to get it added?
(In reply to comment #5)
> Binary components and ctypes DLLs, obviously. Perhaps other things.

Who is going to figure out what those other things are?
If it breaks, you do <em:unpack>.
bsmedberg, you got this the wrong way around.

We didn't even *know* we were broken by this. I'll spare you the details, but we didn't even intend to use <xpiroot>/searchplugins/, so I had no idea that *this* change broke some subtle stage during installation. We thought our particular problem had an entirely different cause and were trying to solve that, and two engineers wasted two days on the solution to the wrong problem. And I did know about bug 533038. What about those extension authors who don't know about that bug, <em:unpack>, and what it affects?

*First* you document
- what will change,
- what the effects will be (all of them), and
- how to solve them,
then commit. Just breaking people is double-un-good. Responses like "well you should know" are not particularly helpful.

As an aside, I'm all for mwu's change, but if many extensions need to disable it, we didn't really win much, did we?
Also, would it be feasible to detect which extensions would be broken by this, and have them unpack automatically? In this case it should be trivial: if the XPI has searchplugins/, it's likely to break. Same for components/.

Also, you could then change that code once you do support loading searchplugins/ from inside XPIs, which I think is totally possible and not even hard, as you just need to feed a different URL to addEngine() or similar.
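Roughly what that could look like (a sketch only - the jar: path is made up, and whether addEngine() actually accepts such a URL is exactly what's in question):

    Components.utils.import("resource://gre/modules/Services.jsm");
    // DATA_XML = load an OpenSearch description file
    Services.search.addEngine(
      "jar:file:///path/to/myextension.xpi!/searchplugins/myengine.xml",
      Components.interfaces.nsISearchEngine.DATA_XML,
      "",    // iconURL
      false  // don't show the confirmation dialog
    );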
The documentation and outreach piece was a failing, sure, cc'ing Jorge and Nick to figure out what went awry there. A workaround exists, though, right?
Yeah, something fell through the cracks here. Sheppy, can you prioritize the stuff from this bug and bug 533038?

I explicitly do not want a guessing/autodetection system, because it means that minor changes to the contents of an extension can change its behavior significantly. I prefer the explicit opt-out mechanism.

I'm fine with leaving this bug open for somebody to implement loading searchplugins from extension JARs, but I don't think it's a current priority.
Keywords: dev-doc-needed
> something fell through the cracks here

That would be me and my fellow dev :). I feel like I'm in a gully. (Just kidding.)
Could we have this fixed for 4.0? This seems easy to do.
We don't want to use unpack, because the jarred extensions are really useful in our case.
And the addEngine() API is seriously broken: you can't add a search engine without making it the default, and AMO doesn't allow you to change the default without user consent, so you're in trouble if you just want to add one. We tried to restore the default after addEngine(), but that's async and doesn't tell you when it completes. A total mess. Those are the two days mentioned above.
I can try to let my employer give me the time, but not sure that he will.
Also, when this is finally fixed, extensions will have to choose between being slower on new versions or not working at all on old versions (if they notice at all).
No, we're not going to "fix" this for FF4, you'll have to use em:unpack. James, I have no clue what your comment means. If you use em:unpack old versions should work just fine.
Was there a technical reason not to include searchplugins? If a patch was made, would you take it?

Why were some directories disallowed from packing?
It's probably just that the search engine code assumes that the files are in a certain location and nobody had the time to look into changing it. If a patch were made and it works, I don't see why it wouldn't be accepted.
This turns out to be a quite difficult problem. I'm looking for suggestions.

The way search plugins are loaded from add-ons today is via the search code getting the special set of directories NS_APP_SEARCH_DIR_LIST which is loaded here:

http://mxr.mozilla.org/mozilla-central/source/browser/components/dirprovider/DirectoryProvider.cpp#218

It's a union of the extension search plugin locations, the distro search plugin locations, the user search plugin locations (profile), and the app search plugin location (shipped engines).

These locations are placed in a very specific order to allow certain locations to win over other locations.

In the search engine code itself, it goes through those dirs searching for engines:

http://mxr.mozilla.org/mozilla-central/source/toolkit/components/search/nsSearchService.js#2641
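Roughly how that enumeration works (a sketch; "SrchPluginsDL" is my reading of the NS_APP_SEARCH_DIR_LIST key string, so treat it as an assumption):

    const Cc = Components.classes, Ci = Components.interfaces;
    let dirSvc = Cc["@mozilla.org/file/directory_service;1"].getService(Ci.nsIProperties);
    // NS_APP_SEARCH_DIR_LIST is a directory *list*, so ask for an enumerator
    let locations = dirSvc.get("SrchPluginsDL", Ci.nsISimpleEnumerator);
    while (locations.hasMoreElements()) {
      let dir = locations.getNext().QueryInterface(Ci.nsIFile);
      // each dir is scanned for engine files; the first engine loaded wins
    }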

Engines that are in zipped XPIs would theoretically go before or after "extension search plugin locations." But it's not really feasible to search the zipped XPI file to find search plugins.

So we need a way to tell the search service about these searchplugins.

I thought about using chrome.manifest, but I'm not sure how the search service would get the information from the chrome.manifest of every installed extension.


Anyone have any other thoughts?
It would make sense to use chrome.manifest for this. Since the components code works similarly (for JS components, at least), I'd look into what that code is doing.
The component manager has a custom API called XRE_AddJarManifestLocation that is invoked by the directory service.

It then adds those locations to a private array inside of the component manager.

The function LoadExtensionDirectories never gives any of the XPI information back to callers, just the actual directories.

The directory provider has no mechanism to give information about packed XPIs back to any caller.
(In reply to Michael Kaply (mkaply) from comment #20)
> Engines that are in zipped XPIs would theoretically go before or after
> "extension search plugin locations." But it's not really feasible to search
> the zipped XPI file to find search plugins.

Why not? Tweaking the existing _findJAREngines to work on arbitrary archives and not rely on list.txt doesn't seem like it would be too difficult.
> Why not? Tweaking the existing _findJAREngines to work on arbitrary archives and not rely on list.txt doesn't seem like it would be too difficult.

I wasn't sure you could use zipreader to enumerate directories/files, plus I thought it might be a performance issue, but I'll look into it.

What is the format of list.txt?

I'd like to get findJAREngines working on a desktop build to see how it works.
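For what it's worth, nsIZipReader.findEntries() can enumerate entries by wildcard pattern, so a rough sketch of scanning a packed XPI (not the attached patch; the file names are illustrative) could be:

    Components.utils.import("resource://gre/modules/Services.jsm");
    const Cc = Components.classes, Ci = Components.interfaces;
    let zipReader = Cc["@mozilla.org/libjar/zip-reader;1"].createInstance(Ci.nsIZipReader);
    zipReader.open(xpiFile); // xpiFile: an nsIFile pointing at the extension's XPI
    try {
      // returns a string enumerator of matching entry names
      let entries = zipReader.findEntries("searchplugins/*.xml");
      while (entries.hasMore()) {
        let entry = entries.getNext(); // e.g. "searchplugins/myengine.xml"
        let url = "jar:" + Services.io.newFileURI(xpiFile).spec + "!/" + entry;
        // hand `url` to the engine-loading code
      }
    } finally {
      zipReader.close();
    }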
Attached patch First pass at a patch (obsolete) — Splinter Review
First pass at a patch.

Per Gavin's advice, I've modified findJAREngines to look into XPIs to find search engines.

I had to make two other changes - there was an explicit check for "chrome" in initFromURISync, and there was code that didn't allow "jar" as a scheme.

The main downside to this method is that these engines won't have the same priority as other extension engines, in particular user engines.
Assignee: nobody → mozilla
Attachment #660103 - Flags: review?(gavin.sharp)
> The main downside to this method is that these engines won't have the same priority as other extension engines, in particular user engines.

I think I'm wrong. These are loaded last, so I think they win over anything. I'll check.
Search engine loading is "first loaded wins".

So in the current code, unpacked wins over packed and system wins over packed as well.

I believe the right thing to do is to switch these lines:

      LOG("_loadEngines: Absent or outdated cache. Loading engines from disk.");
      loadDirs.forEach(this._loadEnginesFromDir, this);

      this._loadFromChromeURLs(chromeURIs);

to

      this._loadFromChromeURLs(chromeURIs);
      loadDirs.forEach(this._loadEnginesFromDir, this);

This will cause the JAR engines to have priority over the other engines.

We also should probably remove the code that treats JAR engines as app engines in get _id(), but I don't completely understand why it was put in.
Comment on attachment 660103 [details] [diff] [review]
First pass at a patch

I don't think we want to allow arbitrary jar: URIs as Engine URLs generally - we'll need some way to restrict that to engines loaded from add-ons.

The _initFromURISync assertion being triggered is kind of worrying - omni.jar is quick to read from, typically, whereas the add-on XPI might not be. I don't know that there's a good way around this, short of finishing the re-work of search service initialization to be entirely async and getting rid of the sync initialization path. On the other hand, it's not really much worse than what we're already doing, I guess.

Rather than making _findJAREngines do everything, factor out the relevant code into a helper ("_findEnginesInArchive" or something), and have the existing _findJAREngines and a new method (_findAddonEngines) call it. Then you can leave the _findJAREngines call in _loadEngines alone, and just put a call to _findAddonEngines wherever makes most sense to maintain proper ordering.
Attachment #660103 - Flags: review?(gavin.sharp) → review-
> I don't think we want to allow arbitrary jar: URIs as Engine URLs generally - we'll need some way to restrict that to engines loaded from add-ons.

A couple thoughts come to mind.

1. Adding an extra parameter to the Engine constructor that we pass in the add-on case.
2. Explicitly parsing the URL and looking for .xpi (sketched below).
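A rough sketch of option 2 (engineURLSpec is hypothetical, and the check is deliberately conservative):

    Components.utils.import("resource://gre/modules/Services.jsm");
    const Ci = Components.interfaces;
    let uri = Services.io.newURI(engineURLSpec, null, null);
    let fromAddon = false;
    if (uri.scheme == "jar") {
      // jar: URIs wrap an inner URI; only accept engines whose container is an .xpi
      let jarURI = uri.QueryInterface(Ci.nsIJARURI);
      fromAddon = /\.xpi$/i.test(jarURI.JARFile.spec);
    }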

> The _initFromURISync assertion being triggered is kind of worrying

initFromURISync is called directly by the loadFromChromeURLs function to load the search data from the JAR/XPI.

        engine._initFromURISync();

So we're hitting that assertion because we're calling it with a jar: URI and it says it only supports chrome URIs.
Attached patch Second passSplinter Review
Addressing gavin's comments. I still don't have a good solution for allowing just XPIs to add engines.
Attachment #660103 - Attachment is obsolete: true
Attachment #663066 - Flags: review?(gavin.sharp)
(In reply to Michael Kaply (mkaply) from comment #29)
> > The _initFromURISync assertion being triggered is kind of worrying

> So we're hitting that assertion because we're calling it with a jar and it
> says it only supports chrome URIs.

Yes, I know. I'm saying that that's a potential problem, because loading from omni.jar is pretty much guaranteed to be "fast" (we need to load a bunch of other stuff from it, it has been optimized, etc.), while loading from a random extension jar might not have that same guarantee (though given that we need to load the extension anyway, it probably won't be too bad).
I now think the *only* way to do this is to require explicit registration via chrome.manifest. My proposal in bug 793093 would make that pretty easy.

Rationale: 

The patch currently searches through *all* packed extensions - enabled, disabled, incompatible, and blocklisted. It's also treating restartless add-ons the same as non-restartless - if we support restartless add-ons at all here, it needs to properly handle enabling/disabling of restartless add-ons at runtime. To handle all that, it needs to use the Add-on Manager APIs. But doing that would cause the Add-ons Manager to always open its database on startup, which needs to be avoided.

Now, you could potentially hack around that by parsing the extensions.installCache and extensions.bootstrappedAddons prefs, parsing the contents of the extensions.ini file, and listening for various Add-on Manager events. But it would be an awful awful hack, and puppies would cry. So would I.

Using the APIs that bug 793093 would add for chrome.manifest directives would solve all the ickyness.
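Something like this in the add-on's chrome.manifest (the "searchengine" directive is hypothetical - no such instruction exists today, it's what the bug 793093 API would let us register):

    # register an OpenSearch description shipped inside the packed XPI
    searchengine searchplugins/myengine.xml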
Unfocused, that sounds great. That's what we needed here.
Unfortunately bug 793093 was WONTFIXed. Not sure where that leaves us...
gavin, given that the "register handler" idea was a dead end, can we revive Mike's patch, please?
Comment on attachment 663066 [details] [diff] [review]
Second pass

I don't think the synchronous loading in this patch is acceptable, and I'm not really convinced this is worth fixing (particularly after it's been broken for so long with seemingly little impact).
Attachment #663066 - Flags: review?(gavin.sharp) → review-
I don't know about others, but for us the impact was huge. We use packed extensions, and thus had to write search engine installation/uninstallation manually, and had lots of serious bugs in that code, due to the 1000 edge cases to consider.
I agree this isn't a reason to notably slow down Firefox startup (i.e. more than a few milliseconds). I don't know whether it would.
This is still a huge problem for us. Our workaround of installing and uninstalling search engines manually has all kinds of edge cases and error cases that keep causing bugs.

The main problem here is enumerating the search engines.

Options to get list of search engines:
1. Enumerate all files in a directory of a ZIP/JAR/XPI file
   * There currently is no API for that in Mozilla.
   * It shouldn't be a perf hit: in order to read any file in a ZIP file, you already have to
     have the ZIP index (the files contained, and their exact byte offsets) anyway,
     so the information we need is already in RAM somewhere.
   * We just need an API to enumerate the files in a ZIP file dir.
     This would be useful in many other places as well.
2. Add explicit "searchengine <url>" lines to chrome.manifest.
   * We would read all lines starting with "searchengine" and get the absolute URLs of the OSDs.
   * Requires us to add new directives to chrome.manifest.
   * We're not allowed to hardcode search-specific handling in the chrome.manifest reader, so we need
     a generic API to register new directives.
   * This was Blair's approach; he filed bug 793093 for it.
   * It got WONTFIXed.
3. Manually read chrome.manifest
   * Load <XPIbaseURL>/chrome.manifest as an absolute URL.
   * Read it as a plaintext file, iterate over all lines, look at the first word,
     and keep only the "searchengine" directives.
   * From those lines, take the relative URL and, with XPIbaseURL, build an absolute URL.
   * Should be fairly small (about 10 lines of code?) and self-contained
     (can live solely in the search service - but the chrome.manifest parser could be factored out).
   * The data should be in cache, so it should be fast.
4. Make our own manifest
   * Load <XPIbaseURL>/searchplugins/searchplugins.manifest as an absolute URL.
   * Read it as a plaintext file, iterate over all lines, look at the first word,
     and keep only the "searchengine" directives.
   * From those lines, take the relative URL and, with XPIbaseURL, build an absolute URL.
   * Should be fairly small (about 10 lines of code?) and self-contained
     (can live solely in the search service - but the manifest parser could be factored out).
   * Avoids touching chrome.manifest.

Options 3 and 4 should be trivial to implement and are self-contained; a rough sketch of option 3 follows.
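A minimal sketch of option 3, using the NetUtil.jsm helper of that era (the "searchengine" directive is still hypothetical, xpiBaseURL would be something like "jar:file:///.../myaddon.xpi!/", and error handling is omitted):

    Components.utils.import("resource://gre/modules/NetUtil.jsm");

    function findAddonEngines(xpiBaseURL, callback) {
      NetUtil.asyncFetch(xpiBaseURL + "chrome.manifest", function (stream, status) {
        if (!Components.isSuccessCode(status)) {
          callback([]);
          return;
        }
        let text = NetUtil.readInputStreamToString(stream, stream.available());
        let urls = [];
        text.split("\n").forEach(function (line) {
          let parts = line.trim().split(/\s+/);
          // keep only the (hypothetical) "searchengine <relative-OSD-path>" lines
          if (parts[0] == "searchengine" && parts[1])
            urls.push(xpiBaseURL + parts[1]);
        });
        callback(urls); // absolute jar: URLs, ready to hand to the engine loader
      });
    }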
This seems like a valid request, but given some of the comments here, it seems this is more work than it's worth, so I don't see it coming close to the top of a prioritized list any time soon.
Priority: -- → P5
Whiteboard: [fxsearch]
Rank: 59
Assignee: mozilla → nobody
Florian, FWIW, this bug is one of the main reasons why we're still using normal, unpacked XPI extensions. (There are other reasons, but this one is a hard blocker.)

As mentioned above, I had tried to implement the search engine installation manually, but had too many bugs; the code was just too fragile with all the AMO requirements about uninstalling, deactivating, etc.
s/I/we/ (Mike did, too) :)
Not an issue for WebExtensions
Status: NEW → RESOLVED
Closed: 6 years ago
Resolution: --- → WONTFIX