Closed Bug 378216 Opened 17 years ago Closed 17 years ago

Disable insecure extension updates by default

Categories

(Toolkit :: Add-ons Manager, defect)


Tracking


VERIFIED FIXED
mozilla1.9alpha8

People

(Reporter: dveditz, Assigned: mossop)


Details

(Keywords: dev-doc-complete, relnote, Whiteboard: [sg:want P2])

Attachments

(3 files, 6 obsolete files)

We went to some lengths to make sure extension updates from addons.mozilla.org are secure (SSL update URL, rejecting add-ons with a custom update URL, rejecting self-updating add-ons that bypass the normal system), but other people hosting extensions haven't taken all the same steps. In particular they're not using SSL for the updateURL, and that is insecure: all it takes is someone with a popular extension firing up their browser at a wireless hotspot where someone is running AirPwn or similar.

Not entirely sure how this should work, here are some possibilities:

1) at the simplest level, don't check any non-https update url unless the user has set the pref extensions.update.enable_insecure_update to true. All the mozdev fans are likely to do so, but it'll force big players trying for mainstream use (e.g. google) to do the right thing and implement an SSL update link.

2) disable automatic updates, put a red icon of some sort overlaid on the item(s) in the EM dialog, and require manual update checks (per item).

3) as in 2, but allow overrides of individual items through some sort of context-menu item to enable insecure updates of that item.

All of these options mean some people won't be getting updates for their extensions which might put them at risk, but that has to be weighed against the known risk of getting trojaned through the update process itself.

Of the above options I prefer 1) for its simplicity and lack of UI. Sites can try to walk users through turning it on step by step, but the name "enable_insecure_updates" should give them pause.

Maybe even with 1) we'd want some sort of "updates disabled" indication for each such addon, or a warning message when it gets installed.
Flags: blocking-firefox3?
2 points:

Firstly, the current rumours are that hosting from AMO will require a signed extension, putting it out of reach for those of us who can't afford to spend money on our hobby. And now you're suggesting to disable auto-updates for us as well? Please make sure there is a prominent notification about this, preferably on install, that the extension cannot be updated.

Secondly, presumably you should also require that the actual link to the xpi in the update rdf be on a secure server, otherwise what's the point? If I was going to try to hijack extension updates in the way you describe I'd pretend to be releases.mozilla.org and then I'd hit any extension update or install with no problem.
1) alone is certainly not enough, there must be some sort of notification for the user. If users are stuck with old extensions and don't even know that something is wrong nobody will be helped. And even if extension authors change to a secure connection, they cannot change updateURL in extensions that are already installed.

Dave, that's what SSL is for - it prevents you from pretending that you are releases.mozilla.org. As to rumors you are citing here, I haven't heard of any such idea, so I would assume it is only rumors. There is certainly a number of showstoppers for requiring extensions to be signed (though certificates are not it, there is a provider who will give you valid certificates to sign open source software for free).
How about allowing "sandboxed" extensions to update using AMO, if the user downloaded the extension from outside of AMO or through the AMO sandbox?  Then authors of less-popular extensions wouldn't have to argue with DreamHost about TLS SNI support, etc.
Dave, sorry, I misunderstood you. Yes, that's a valid point - if the connection to releases.mozilla.org isn't secured then updates still can be compromised, even though it becomes less likely.
(In reply to comment #4)
> Dave, sorry, I misunderstood you. Yes, that's a valid point - if the connection
> to releases.mozilla.org isn't secured then updates still can be compromised,
> even though it becomes less likely.

How does it become less likely?
Ok so the update.rdf can specify a hash which means the xpi can be on an insecure server. So the requirement I guess would be that the update rdf be secure and the xpi be secure, or the update rdf specify the hash.
Whiteboard: [sg:want P2]
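For reference, a simplified update manifest fragment showing that combination (an updateHash vouching for an xpi served over plain http) might look something like this; the IDs and URLs are made up, and the hash value is a placeholder, but the element names follow the documented update.rdf format:

```xml
<RDF:Description about="urn:mozilla:extension:sample@site:1.1">
  <em:version>1.1</em:version>
  <em:targetApplication>
    <RDF:Description>
      <em:id>{ec8030f7-c20a-464f-9b0e-13a3a9e97384}</em:id>
      <em:minVersion>2.0</em:minVersion>
      <em:maxVersion>3.0a8</em:maxVersion>
      <!-- The xpi itself may live on plain http... -->
      <em:updateLink>http://www.example.com/sample-1.1.xpi</em:updateLink>
      <!-- ...because the hash, fetched over a secure channel, vouches for it -->
      <em:updateHash>sha256:(placeholder)</em:updateHash>
    </RDF:Description>
  </em:targetApplication>
</RDF:Description>
```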
So, an extension that uses https for the updateURL and uses https for the url to the xpi or specifies a hash for the xpi would not have update disabled and all others would?

The user experience without changing UI is pretty bad IMHO... UI / process flow for this will need to be defined.
((In reply to comment #1)
> Firstly the current rumours are that hosting from AMO will require a signed
> extension, so out of reach for those of us that can't afford to spend money on
> our hobby.

This really isn't the bug in which to deal with such rumours, but I haven't heard any rumours that we would require such things without finding a way to accommodate individual developers who are not willing or able to buy a certificate.  Perhaps I should start a rumour that that's the reason we haven't already made it a requirement, if people are so quick to trade in rumours!)

We should be warning people who are installing unsigned software over an unsecured connection, though, I agree, and we should not continue to do it behind their backs in Firefox 3.  I don't think that silently disabling updates is going to do users any favours, especially when traded off against the desire to get them security updates for just those add-ons.

I think we need a stronger story here for all of our assisted-install cases, including search plugin updates (a juicy fiscal target if ever there was one), and PFS-linked URLs.  Link fingerprints in update.rdf would help with the last leg, but not solve the initial update.rdf-fetching problem.  We could make self-signed certificates work for signing add-ons, and then add a signature block to update.rdf that had to use the same cert as was on the original add-on.  That then limits the exposure to initial install, which is still a little spooky, but not as easy to exploit broadly.

I want to move away from plain-http serving of our add-on updates from AMO as well, but we have to get some infrastructure set up on the cluster before we can do that without nuking web heads.
I should have noted in the initial comment that this bug was filed in response to an April 15 mail that security@mozilla.org received from Christopher Soghoian pointing out that several big-name extensions not hosted at AMO are potentially vulnerable to this problem.
(In reply to comment #8)
> Link fingerprints in update.rdf would help with the last leg,
> but not solve the problem initial update.rdf-fetching problem.

update.rdf already supports <updateHash> (used by AMO), we don't need to wait for a link fingerprints implementation.

Flags: blocking-firefox3? → blocking-firefox3+
Target Milestone: --- → Firefox 3 alpha6
> This bug was filed in response  to an email received from Christopher Soghoian
> pointing out that several big-name extensions not hosted at AMO are potentially
> vulnerable to this problem.

Christopher Soghoian has now released this publicly at http://paranoia.dubfire.net/2007/05/remote-vulnerability-in-firefox.html and on full-disclosure
Perhaps a better solution would be to improve the code signing feature. Specifically:
* Allow self-signed certificates (but warn!)
* Store the certificate used to sign the extension upon install
* Verify that the certificate is the same on update, and if not, throw up a conspicuous warning.

Finally, completely reject installation (perhaps allow installation without auto-update with a preference, for testing extensions) for any extensions which are not signed at all, to force extension authors to at least use a self-signed cert.

This way, the security of the update server itself doesn't matter, and people can use their own cheap shared hosting if they still want to.
Using self-signing as described above certainly seems viable from my point of view (extension author's hat on). It does what I think is the point here, to ensure the updated extension comes from the same person as the original does. Whether that source is trusted or not is something you need to decide on initial install of course.
Just to add another thing to think about on this issue: on both this thread and Soghoian's blog entry, it is stated that this vulnerability does not apply to extensions downloaded from addons.mozilla.org. This is incorrect! 

A common trick that some extension developers have been utilizing for some time now is to programmatically set a per-extension Firefox preference that allows an extension to specify an arbitrary URL to check for updates, without having to specify the updateURL property in install.rdf. This is documented here: http://kb.mozillazine.org/Manage_extension_updates_by_locale. In this manner, an extension developer can have a single .xpi package for the extension, and upload that both to addons.mozilla.org (that doesn't allow the updateURL property to be set) and to their own Web page. When a new version is available, users do not have to wait for it to be approved by the addons.mozilla.org staff, since update can be performed directly from the developer's Web page. Therefore, extensions hosted on addons.mozilla.org that use this trick are also vulnerable to the attack.

Here is some sample code for doing this, assuming the extension ID is "sample@site":
const updateURL = "http://www.site.com/update.rdf";
const prefs = Components.classes["@mozilla.org/preferences-service;1"]
                        .getService(Components.interfaces.nsIPrefBranch);
prefs.setCharPref("extensions.sample@site.update.url", updateURL);

Regarding the general issue of how to best handle extension updates, I personally agree with Comment 6. I think that requiring the use of an SSL-enabled Web server or signed extensions is too much of a burden on the developers, most of whom are hobbyists. It seems to me that the best solution would be the following:
1. First, disable the aforementioned extension-specific update preference.
2. Developers should be allowed to upload the update.rdf file to addons.mozilla.org at the same time as they upload the new extension version.
3. The uploaded update.rdf file should include the updateHash property for the .xpi package (described here: http://developer.mozilla.org/en/docs/Extension_Versioning%2C_Update_and_Compatibility#Security_Concerns); the actual package, however, can be hosted on any arbitrary server.
4. Updates from any server other than addons.mozilla.org are disallowed.

In this manner, it would be guaranteed that all updates go through an SSL-enabled server, i.e. addons.mozilla.org, thwarting the attack. The actual extension packages though could be hosted anywhere, probably on the extension developer's Web page, or even on addons.mozilla.org.

The only downside I see in this scheme is scalability issues for addons.mozilla.org, but given the small size of update.rdf files, I trust this won't be that serious a problem.
(In reply to comment #14)
> A common trick that some extension developers have been utilizing for some time
> now is to programmatically set a per-extension Firefox preference that allows
> an extension to specify an arbitrary URL to check for updates

Previously this would have been against addons.mozilla.org's review criteria as I understood it. I'm not sure how that applies with Remora though.
(In reply to comment #15)
> Previously this would have been against addons.mozilla.org's review criteria as
> I understood it. I'm not sure how that applies with Remora though.

This trick works with the current version of the software running on addons.mozilla.org. I'm not sure if it worked in the previous incarnation though. 

(In reply to comment #14)
> A common trick that some extension developers have been utilizing for some time
> now is to programmatically set a per-extension Firefox preference that allows
> an extension to specify an arbitrary URL to check for updates, without having
> to specify the updateURL property in install.rdf.

If you don't mind, please file this as a new bug in addons.mozilla.org:Administration. I was unaware of this and will be fixing that loophole as soon as possible.
In a newsgroup, Dennis Henderson wrote:
> SpoofStick is available at AMO and it connects to www.corestreet.com
> using http when it checks for updates, so either AMO doesn't have any
> effective restrictions in that regard or SpoofStick is so old that it
> was 'grandfathered in' or something. Not good...
> https://addons.mozilla.org/firefox/addon/121

(In reply to comment #18)
> In a newsgroup, Dennis Henderson wrote:
> > SpoofStick is available at AMO and it connects to www.corestreet.com
> > using http when it checks for updates, so either AMO doesn't have any
> > effective restrictions in that regard or SpoofStick is so old that it
> > was 'grandfathered in' or something. Not good...
> > https://addons.mozilla.org/firefox/addon/121

It is using the updateURL property, and not the trick I mentioned, so it was probably 'grandfathered in' as you say. The last version is from February 2005.
(In reply to comment #17)
> If you don't mind, please file this as a new bug in
> addons.mozilla.org:Administration. I was unaware of this and will be fixing
> that loophole as soon as possible.

Filed a bug for this: https://bugzilla.mozilla.org/show_bug.cgi?id=382708
One more thing. From a brief look at the code of Google Browser Sync, it seems that it completely circumvents the Firefox update framework, and instead performs updates behind the scenes with an XMLHttpRequest. I'm not sure how this is exactly done, but it seems to me that its mere feasibility renders any effort to ensure secure update delivery useless.

Before deciding on the best way to solve the issue mentioned in this thread, it's probably a good idea to look at exactly how this new method of updates is done.
Dave, when you're done with the locales stuff, can you look at what we need to do to tighten this up?
Assignee: nobody → dtownsend
Essentially there are two stages to the update process. The update discovery through update.rdf files, and the update retrieval where we grab the xpi.

The update retrieval stage is relatively simple to secure. We can just make updateHash entries a necessity. AMO already provides them and they are pretty trivial for home authors to make. Assuming the update discovery was secure then we can use the hash to be fairly confident that there has been no intercepting of the xpi. It would probably be sensible to limit which hash functions are allowed, probably not allowing md2 and maybe md5.

The update discovery stage is the more difficult problem. The simplest option is to rule out any update.rdf that was retrieved from a non-https source. This is fine for AMO and businesses, but bad for home addon authors. For the non-https cases I suggest that a digital signature be used. The public key for this digital signature would have been distributed with the extension (possibly the key used for signed extensions but I believe that that is not sufficient) and thus can be used by the update system to ensure that the update.rdf is for the given extension.

Assuming we can make this signing process a reasonably simple task all addon authors should be able to manage it meaning that we can disable any insecure updates as standard and users need not be bothered by what's going on.
Do we know why some authors are serving their own updates? I would suspect that quite a few did it just because it seemed interesting to do, or because early versions of AMO were unreliable/broken. Those folks probably wouldn't be interested in hosting their own when doing so securely means extra work.

OTOH, if folks are doing this because of some ongoing shortcoming of AMO, it might be a better use of resources to patch AMO to meet their needs (vs creating a second non-SSL path that would need to be implemented, tested, documented, etc). If there's some philosophical difference that prevents that, perhaps those folks could band together to create an AMO alternative that has SSL (a la extensionsmirror.nl?).
The main reason should be that AMO is often slow to review updates. And that's something that cannot be patched...
There are also different reasons:
1. Hosting the add-on on your own website with a good infrastructure gives better feedback. E.g., you can offer a beta or nightly branch there, and people are more likely to search and submit to a bug tracker if you have one and they have seen it during install on your page.
2. Some developers want to have control over their add-on; from time to time people use their source code without permission and even try (but only try) to become the maintainer of an add-on on AMO.
3. Some developers don't want to be hosted with add-ons which contain binary components ( http://forums.mozillazine.org/viewtopic.php?p=2241571#2241571 ) or transfer user data to company servers.

It should be the developer's choice whether the add-on is hosted on AMO, and they shouldn't be forced to do so.
I could reel off a bunch of differing reasons on why some authors don't host their add-ons on AMO (and I should note at this point that I am one of them, so take this as you will). But really I don't think the reasons matter so much. I believe that we should be preserving choice for the add-on authors. AMO will never be the perfect delivery solution for everyone. No solution that the author doesn't come up with themselves ever could be. If the author chooses to not host on AMO then we should be allowing them to do so. AMO will always be the easy path of course.

Hopefully what I can get working here is something that is about as easy to do as signing an xpi, not a piece of cake but easy enough for those that are committed to doing things themselves.
I should have been a little more focused with my last question, to avoid reopening the "why AMO sucks" can of worms! :-)

My line of thinking was that it might be possible to have AMO handle the update notification (securely), without hosting the extension itself -- thus no need for approvals. I'm approaching this from the angle of helping authors get access to an SSL host for secure update notification, and avoid adding a new codepath in Firefox for non-AMO extensions.

Looking back through the comments, I see that Jesse proposed exactly this in comment #3.

I think "preserving choice" is a red herring, because extension authors have been (and will continue to be) free to host their own extensions, completely free from AMO if desired. The issue is that when Firefox requires secure updates, how can we help minimize the burden of doing so?

I don't think that a certificate based non-SSL method is wrong, but that's still a difficult process for an independent developer to use. Creating an account on some SSL host, and filling in a simple automated form, would be much easier. AMO already has the infrastructure for most of that, so having Mozilla provide this service there would seem a natural option.
Issues:
1. There are sometimes different updateURLs (e.g. release channels, but also from different add-on hosting sites). So a freely-choosable channel tag would be necessary.
2. The user _could_ be tracked (every add-on installed is reported to AMO), so the reputation of Mozilla would sink.
FWIW, with my extension author's hat on I would actually find it more difficult to have to submit my update.rdf to AMO than to host it myself, possibly because I have fairly well-featured extension hosting already. What I'm proposing is that once the update.rdf has been written they run a simple command to generate a signature for it. I'm not sure that that's particularly difficult. The difficult part would be the key generation in the first place, but even that shouldn't be all that complicated and could easily be automated by a script of some kind.

I don't think that the preserving choice bit is a red herring. Currently I am free to either host on AMO, host on some other secure platform (Around £50 per year for the certificate plus hosting costs), or host on my own insecure website (for whatever hosting costs). If we flat out deny non-ssl updates then you are taking that third option away from me, and since I make no money and often a lot of hassle from my extensions I certainly wouldn't take the second option.

The issue of users being tracked through AMO is I think a bit of a non-issue. I've heard no suggestion that people are worried about this from add-ons already hosted there, and nothing similar coming from the app's auto-update mechanisms.
Mossop pinged me on IRC just now to talk about ways to warn the user about addons with insecure updates, and it just reconvinced me that down that way lies sadness.   

(In reply to comment #30)
> FWIW, with my extension author's hat on I would actually find it more difficult
> to have to submit my update.rdf to AMO than to host it myself, possibly that's
> because I have fairly well featured extensions hosting already. What I'm
> proposing is that once the update.rdf has been written they run a simple
> command to generate a signature for it. I'm not sure that that's particularly
> difficult. The difficult part would be the key generation in the first place,
> but even that shouldn't be all that complicated and could easily be automated
> by a script of some kind.

This was my gut reaction too.  Adding generate_signing_key.exe and sign_my_addon_or_update.exe to our bevy of extension developer tools (or, more likely, a page of docs on how to do it with existing tools) seems like a straightforward way to get to a point where an update can be verified as having come from the original author, and having been unaltered in transit.  SSL is another good path for that, but my gut tells me that if our only solution is to expect all addon authors to have a dependency - however low touch - on AMO specifically, we will create friction.  And Stephen Colbert tells me to trust my gut. 

I hear dolske's point, that it seems unpleasant to have multiple codepaths here, though.  I am out of gut, I'm afraid, since I don't know the addons code at all.      Can someone who does give us some meat around how big an impact it would be?  Specifically, how much pain would it be to say:

 - Updates have to be verifiable, so either:
   a) Updates have to be retrieved over a secure connection
   b) Updates have to be signed by a key already associated with the addon (I get that this envisions new metadata in the add-on.  More sadfaces)
   c) Update.rdf has to be retrieved over a secure connection, and the update payload has to have a verified signature.

And then in all cases where none of a), b) or c) is met, we either disable updates, or disable install of the addon, or at least make a whole lot of grumpy noises.
b) is hard... I think c) is the easiest solution, and matches what we do with application updates and on AMO, i.e.

Require update.rdf to always be signed (which in the current world means HTTPS, sucky but true). The update.rdf contains a download URI (which can be HTTP) and a hash for verification.

There is a fourth option d) which is require addons to be signed (the XPI itself). I know shaver's been talking about this, trying to work out how it would be possible to provide extension developers with reasonably-priced code-signing certs.
Given that I've just discovered the wonders of GoDaddy and its insanely cheap SSL certs (free for a year for open source projects) I'm going to do some postings on the extensions forums and newsgroups and see what kind of response I get to requiring SSL for the update files.
The response will be pretty bad. Certificates are not enough, the actual signing process is currently ridiculously complicated.
(In reply to comment #34)
> The response will be pretty bad. Certificates are not enough, the actual
> signing process is currently ridiculously complicated.

I think you are talking about signing the xpi, which is not what I am suggesting. Signed XPIs would be lovely etc. but they do very little in terms of keeping a good auto-update process secure. I am talking about having SSL on the web server where you host your update.rdf file.
Why not sign the update.rdf itself in that case? This is analogous to debian's secure-apt system - there's a signature for the central Release file, which contains a hash of the package list, which contains hashes for the packages themselves. That said, why would signing xpis not be sufficient to prevent malicious code from being installed? (provided you sanity-check the download to ensure it's the version/ID given by the update.rdf perhaps)
How do you propose to sign the update.rdf without HTTPS? In a perfect world we wouldn't need HTTPS to serve signed content, but...

Getting a website-signing cert is significantly less expensive and troublesome than getting a code-signing cert and then signing an XPI.
As I suggested in comment #12, allow code to be signed by a self-signed certificate, but warn (prominently!) the first time it's used, and tie it to the extension for which the user accepted it (i.e., only accept it for that particular extension's update.rdf). If the update.rdf isn't signed, or the key changes, warn /prominently/.
1) You register as an ext author at mozilla.org providing identity details and you get a unique secret key from mozilla.org. 

2) When installation/update is offered a handshake occurs. A challenge token (similar to a nonce) is generated by the client.

3) Combined with a hash of the key, the update URL (or some such mechanism) responds with the key-hash encrypted with the nonce.

4) Client accordingly authenticates the installation/update.

Similar to Windows Integrated Authentication in IIS.
Wouldn't it just be easier for mozilla to become an issuer of object signing
certs for use only with extensions from AMO?  Really.  I'll bet RedHat would
be willing to donate a copy of their cool Certificate Management System 
(a great CA software product, if I may praise the competition :) to Moz for
such a purpose.  
Is that sarcastic? No offence intended.

Well, actually, if Google can distribute their Google API Keys, I believe Mozilla too can do the same. Developers don't sign up in bulk; at most 5 a day. Review each and respond accordingly.

Regards
Shivanand Sharma
No sarcasm intended.  Scope absolutely limited to extension signing, nothing more.
(In reply to comment #40)
> Wouldn't it just be easier for mozilla to become an issuer of object signing
> certs for use only with extensions from AMO?  Really.  I'll bet RedHat would
> be willing to donate a copy of their cool Certificate Management System 
> (a great CA software product, if I may praise the competition :) to Moz for
> such a purpose.  

Add-ons installed from AMO are not an issue here. Ensuring that updates are secure is extremely simple if we assume all the add-ons are hosted at AMO. Unfortunately they aren't and we have to deal with the needs of people who host their own add-ons.

However if Mozilla were able to be a CA for code signing of any add-ons then it could be useful and I have previously asked shaver if this is a possibility.

In reply to comment 43, 
Dave, we have a nomenclature problem here.  From a user's perspective, he
gets addons from AMO, even if the actual file is downloaded from somewhere
else.  He goes to AMO, finds a page that describes the addon he wants,
and clicks download.  He typically never knows what server the file actually
comes from.  

I'm suggesting that mozilla could make code signing certs that could be used
ONLY for downloading addons that are found through AMO, regardless of which
server the XPI actually comes from.   

But here's another alternative.  The StartCom CA is talking about offering
code signing certs for ~$20 each.  Does that solve this problem entirely,
making it finally reasonable to require XPI's to be signed to be installed?
See news://news.mozilla.org:119/mailman.2003.1182341049.7364.dev-tech-crypto@lists.mozilla.org
(In reply to comment #44)
> In reply to comment 43, 
> Dave, we have a nomenclature problem here.  From a user's perspective, he
> gets addons from AMO, even if the actual file is downloaded from somewhere
> else.  He goes to AMO, finds a page that describes the addon he wants,
> and clicks download.  He typically never knows what server the file actually
> comes from.  

No I don't think we do. I was referring to add-ons from AMO as being those listed on AMO's pages (all of which are installed from one of Mozilla's mirrors). The issue is with add-ons that are hosted on the author's own hosting, not listed on AMO at all.

> But here's another alternative.  The StartCom CA is talking about offering
> code signing certs for ~$20 each.  Does that solve this problem entirely,
> making it finally reasonable to require XPI's to be signed to be installed?

It's possible that it's a help, but really it's not much cheaper than their free SSL cert and the cost for a unique IP, but there is still resistance to even that, and code signing is more difficult than getting SSL hosting up and running.
I'm a little wary of posting figures on the responses I've had to my postings to the forums and newsgroups since the survey was hardly scientific, but here's an overview.

Every author that responded said they did not generally host on AMO, but that is unremarkable since I specifically asked people to respond if they didn't.

Very few (around 10%) of responses said that they already hosted updates on SSL sites.

When asked whether enforcing updates over SSL was a good idea, about 20% said they would be happy with it. A number spoke about the difficulty of getting a certificate, the majority were against the cost, some claimed that their hosting wouldn't support SSL.

Everyone that answered said they would be happy for a free option to secure updates even if it meant more work for them.

I think the biggest danger with these results is that the questions basically asked whether they would be happy paying for secure updates or not paying with extra work. I guess it's obvious which most add-on authors would take given that choice; they are already doing extra work by hosting themselves.

In terms of implementation difficulty, enforcing updates over SSL only is a simple option requiring minimal time to implement. The proposal I made about public/private key signing of the update file so far appears to be possible (a posting to moz.dev.tech.crypto resulted in no suggested problems), however it would to a certain extent be reinventing the wheel, and implementation time would be somewhat longer; in particular I gather that the necessary NSS APIs aren't available to JS, so a new C++ component would need to come in for the EM to use.
Target Milestone: Firefox 3 alpha6 → Firefox 3 beta1
I've written up the proposal for how I intend to tackle this. There are a couple of minor questions still to be answered but nothing that is blocking me starting real work here.
(In reply to comment #46)
> ...implementation time
> would be somewhat longer, in particular I gather that the necessary NSS api's
> aren't available to JS so a new C++ component would need to come in for the EM
> to use.

Oh, so what specific methods are you referring to here?

I just wanted to post a note to let you know that Mozdev.org is following this bug fairly closely to see what it means for their hosting environment and their mirrors.

If I read the proposal correctly, it could be possible to serve updates securely without using SSL at all by doing:
1) Sign the updates.rdf file with your private key; install your public key with your .xpi file during the initial install.
Question: where would the signature be downloaded from?  updates.rdf.asc?
2) Provide updateHash for each file in updates.rdf.

This would be the best scenario for Mozdev since we use public mirrors for our downloads which would probably not take kindly to needing SSL all-of-a-sudden, and our project owners generate the updates.rdf manually right now as it is.
(In reply to comment #49)
> Question: where would the signature be downloaded from?  updates.rdf.asc?

I'm currently planning on the signature being included in the update.rdf file.
Target Milestone: Firefox 3 M7 → Firefox 3 M8
Depends on: 389816
Depends on: 390615
Attached patch patch rev 1 (obsolete) — Splinter Review
This implements the feature as described on the wiki.

The work essentially hits 3 parts.

On install we check whether an add-on has the appropriate secure update information (at least one of an updateKey or an https updateURL). If not, the install is rejected. We ignore this requirement if compatibility checking is disabled.
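That install-time rule can be modelled as a small predicate (names are illustrative, not the patch's actual identifiers; a missing updateURL is acceptable because the app's default update URL is used instead):

```javascript
// Simplified model of the install-time security check described above.
function passesInstallSecurityCheck(updateURL, updateKey) {
  if (updateKey)
    return true;   // signed updates are acceptable even over plain http
  if (!updateURL)
    return true;   // falls back to the app-defined update URL
  return updateURL.substring(0, 6) === "https:";
}
```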

On app update, any add-ons that do not appear usable are checked for an updateKey that the previous version of the app may not have installed. This import is far from perfect; however, I'm not sure of a better way to do it unless we think about introducing a schema version for the EM datasource or something similar.

On update we ensure that if we need an SSL transport then we have one for the duration of the request. This is the case when we don't have an updateKey and the update URL is https. It still allows an insecure request for add-ons that remain installed from a previous version of the app and should be app-disabled (unless compatibility checking is off).
Once we have the update RDF we perform a signature check if we have an updateKey. The string to check against the signature is generated as specified at http://wiki.mozilla.org/User:Mossop:Fx-Docs:AddonUpdateSignature and verified with the signature checker from bug 390615.
When finding updates in the manifest we ignore those that don't provide an appropriate way to verify their security: we require either an https url or a hash of sufficient strength (currently any of the sha set we support).
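The per-entry rule can be sketched as follows (an illustrative model of the check described above, not the patch's actual code):

```javascript
// An updateLink found in the manifest is usable only if it is https
// or is accompanied by a hash from the sha family.
function isUpdateEntrySecure(updateLink, updateHash) {
  if (updateLink.substring(0, 6) === "https:")
    return true;
  return !!updateHash && updateHash.substring(0, 3) === "sha";
}
```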

Note that since bug 390615 is not actually reviewed yet it's possible the part of the code touching that could change slightly, however it should be minor.
Attachment #278409 - Flags: review?(robert.bugzilla)
Attached patch testcases rev 1 (obsolete) — Splinter Review
Testcases, split out to make the review a bit easier.

There are 13 different test extensions in total, making up 6 different install and 8 different update scenarios.

Unfortunately we cannot test https update retrieval since the test http server cannot speak https. If I get time I hope to look at bug 242448, which would make that possible.
Attachment #278410 - Flags: review?(robert.bugzilla)
In reply to comment 52 and bug 242448:
Dave, why is a test https server required to have browser internals?

There are numerous good, free, open source https servers available, so why does testing of secure extension updates require a specific one that uses mozilla browser internals?
Developer resources for Thunderbird may be scarce, but they're relatively plentiful compared to developer resources for servers based on mozilla client internals!
(In reply to comment #53)
> In reply to comment 52 and bug 242448:
> Dave, why is a test https server required to have browser internals?

Well, a few people have mentioned wanting bug 242448 anyway, and it would improve the already existing test http server we have in our source. I also didn't particularly want to introduce a dependency on an external webserver for all the tinderboxen and developers running unit tests, or to write an interface to control such a webserver from xpcshell.
Attached patch patch rev 1 (obsolete) — Splinter Review
Updated to trunk. Note that the testcases for bug 299716 will fail at the moment; I need to either sign the install.rdf and update.rdf files there or make SSL tests work.
Attachment #278409 - Attachment is obsolete: true
Attachment #278763 - Flags: review?(robert.bugzilla)
Attachment #278409 - Flags: review?(robert.bugzilla)
Comment on attachment 278763 [details] [diff] [review]
patch rev 1

>Index: toolkit/locales/en-US/chrome/mozapps/extensions/extensions.properties
>===================================================================
>RCS file: /cvsroot/mozilla/toolkit/locales/en-US/chrome/mozapps/extensions/extensions.properties,v
>retrieving revision 1.37
>diff -u8 -p -r1.37 extensions.properties
>--- toolkit/locales/en-US/chrome/mozapps/extensions/extensions.properties	29 Aug 2007 08:34:34 -0000	1.37
>+++ toolkit/locales/en-US/chrome/mozapps/extensions/extensions.properties	29 Aug 2007 12:51:10 -0000	
>@@ -67,16 +67,17 @@ incompatibleTitle=Incompatible %S
> incompatibleMsg=%S %S could not be installed because it is not compatible with %S %S. (%S %S will only work with %S versions from %S to %S)
> incompatibleMsgSingleAppVersion=%S %S could not be installed because it is not compatible with %S %S. (%S %S will only work with %S %S)
> incompatibleMessageNoApp=%S %S could not be installed because it is not compatible with %S.
> incompatibleOlder=versions 0.8 or older.
> incompatibleThemeName=this Theme
> incompatibleExtension=Disabled - not compatible with %S %S
> incompatibleAddonMsg=Not compatible with %S %S
> blocklistedDisabled=Disabled for your protection
>+insecureUpdatesMessage="%S" will not be installed because it allows potentially insecure updates.
This should be singular and not plural... true?

>Index: toolkit/mozapps/extensions/public/nsIExtensionManager.idl
>===================================================================
>RCS file: /cvsroot/mozilla/toolkit/mozapps/extensions/public/nsIExtensionManager.idl,v
>retrieving revision 1.52
>diff -u8 -p -r1.52 nsIExtensionManager.idl
>--- toolkit/mozapps/extensions/public/nsIExtensionManager.idl	29 Aug 2007 08:34:36 -0000	1.52
>+++ toolkit/mozapps/extensions/public/nsIExtensionManager.idl	29 Aug 2007 13:03:52 -0000	
>@@ -530,16 +530,21 @@ interface nsIUpdateItem : nsISupports
>    */
>   readonly attribute AString  iconURL;
> 
>   /**
>    * The URL of the update RDF file for this item.
>    */
>   readonly attribute AString  updateRDF;
> 
>+  /**
>+   * The public key to verify updates for this item.
>+   */
>+  readonly attribute AString  updateKey;
Please add detail regarding the key being used to sign the manifest, etc.

>Index: toolkit/mozapps/extensions/src/nsExtensionManager.js.in
>===================================================================
>RCS file: /cvsroot/mozilla/toolkit/mozapps/extensions/src/nsExtensionManager.js.in,v
>retrieving revision 1.241
>diff -u8 -p -r1.241 nsExtensionManager.js.in
>--- toolkit/mozapps/extensions/src/nsExtensionManager.js.in	29 Aug 2007 08:34:36 -0000	1.241
>+++ toolkit/mozapps/extensions/src/nsExtensionManager.js.in	29 Aug 2007 13:01:56 -0000	
>@@ -146,16 +146,17 @@ const URI_EXTENSION_LIST_DIALOG       = "chrome://mozapps/content/extensions/lis
> 
> const INSTALLERROR_SUCCESS               = 0;
> const INSTALLERROR_INVALID_VERSION       = -1;
> const INSTALLERROR_INVALID_GUID          = -2;
> const INSTALLERROR_INCOMPATIBLE_VERSION  = -3;
> const INSTALLERROR_PHONED_HOME           = -4;
> const INSTALLERROR_INCOMPATIBLE_PLATFORM = -5;
> const INSTALLERROR_BLOCKLISTED           = -6;
>+const INSTALLERROR_INSECURE_UPDATES      = -7;
Shouldn't this be INSTALLERROR_INSECURE_UPDATE since it is for one update?

>@@ -2741,16 +2744,17 @@ ExtensionManager.prototype = {
>                       location.name,
>                       targetAppInfo ? targetAppInfo.minVersion : "",
>                       targetAppInfo ? targetAppInfo.maxVersion : "",
>                       getManifestProperty(installManifest, "name"),
>                       "", /* XPI Update URL */
>                       "", /* XPI Update Hash */
>                       getManifestProperty(installManifest, "iconURL"),
>                       getManifestProperty(installManifest, "updateURL"),
>+                      getManifestProperty(installManifest, "updateKey"),
>                       installData.type,
>                       targetAppInfo ? targetAppInfo.appID : gApp.ID);
>     }
>     return null;
>   },
> 
>   /**
>    * Check for changes to items that were made independently of the Extension
>@@ -2814,16 +2818,18 @@ ExtensionManager.prototype = {
>             case INSTALLERROR_INVALID_GUID:
>               return "Invalid GUID";
>             case INSTALLERROR_INVALID_VERSION:
>               return "Invalid Version";
>             case INSTALLERROR_INCOMPATIBLE_VERSION:
>               return "Incompatible Version";
>             case INSTALLERROR_INCOMPATIBLE_PLATFORM:
>               return "Incompatible Platform";
>+            case INSTALLERROR_INSECURE_UPDATES:
>+              return "Insecure Updates";
Same as above along with "Insecure Update"

>@@ -3938,22 +3962,31 @@ ExtensionManager.prototype = {
>       }
>     }
> 
>     // Validate the Item ID
>     if (!gIDTest.test(installData.id)) {
>       installData.error = INSTALLERROR_INVALID_GUID;
>       return installData;
>     }
>-
>-    // Check the target application range specified by the extension metadata.
>-    if (gCheckCompatibility &&
>-        !this.datasource.isCompatible(installManifest, gInstallManifestRoot, undefined))
>-      installData.error = INSTALLERROR_INCOMPATIBLE_VERSION;
>-
>+    
>+    // Check the target application range and update security.
>+    if (gCheckCompatibility) {
>+      if ((installData.updateURL && installData.updateURL.substring(0, 6) != "https:")
>+          && !getManifestProperty(installManifest, "updateKey")) {
>+        installData.error = INSTALLERROR_INSECURE_UPDATES;
>+        return installData;
>+      }
>+      
>+      if (!this.datasource.isCompatible(installManifest, gInstallManifestRoot, undefined)) {
>+        installData.error = INSTALLERROR_INCOMPATIBLE_VERSION;
>+        return installData;
>+      }
>+    }
hmmm... what's the rationale for using the checkCompatibility pref vs. a new pref? I thought this would have its own but this may make more sense.

>@@ -5054,19 +5096,33 @@ ExtensionManager.prototype = {
> 
>   /**
>    * Determines whether an item should be disabled by the application.
>    * @param   id
>    *          The ID of the item to check
>    */
>   _isUsableItem: function(id) {
>     var ds = this.datasource;
>-    return ((!gCheckCompatibility || ds.getItemProperty(id, "compatible") == "true") &&
>-            ds.getItemProperty(id, "blocklisted") == "false" &&
>-            ds.getItemProperty(id, "satisfiesDependencies") == "true");
>+    /* If we're compatibility checking then check if the item is compatible.
>+     * Also check if it isn't blocklisted and has all dependencies satisfied. */
>+    if ((!gCheckCompatibility || ds.getItemProperty(id, "compatible") == "true") &&
>+        ds.getItemProperty(id, "blocklisted") == "false" &&
>+        ds.getItemProperty(id, "satisfiesDependencies") == "true") {
>+      // appManaged items aren't updated so no need to check update security.
>+      if (ds.getItemProperty(id, "appManaged") == "true")
>+        return true;
>+      /* We ignore update security if not checking compatibility. Otherwise
>+       * check that we have an updateKey or the update URL is non-existant 
>+       * (means we are using the app defined update URL) or SSL */
>+      return (!gCheckCompatibility ||
>+              ds.getItemProperty(id, "updateKey") ||
>+              !ds.getItemProperty(id, "updateURL") ||
>+              ds.getItemProperty(id, "updateURL").substring(0, 6) == "https:");
This is going to app disable the add-on and not just the update check! I don't think we want to go that far.
This also breaks blocklisting and dependencies.

>@@ -6020,21 +6076,24 @@ RDFItemUpdater.prototype = {
>       this._updater.checkForDone(aItem,
>                                  nsIAddonUpdateCheckListener.STATUS_FAILURE);
>       return;
>     }
> 
>     LOG("RDFItemUpdater:checkForUpdates sending a request to server for: " +
>         uri.spec + ", item = " + aItem.objectSource);
> 
>+    /* If there is an updateKey then we aren't too concerned with the security
>+     * of the channel */
This comment doesn't provide any information that I don't get from the declaration of requireSSL. Either remove it or expand upon it.

>+    var requireSSL = !aItem.updateKey && uri.schemeIs("https");

>@@ -6325,39 +6430,202 @@ RDFItemUpdater.prototype = {
>     var taArc = gRDF.GetResource(EM_NS("targetApplication"));
>     var targetApps = aDataSource.GetTargets(aUpdateResource, taArc, true);
>     while (targetApps.hasMoreElements()) {
>       var targetApp = targetApps.getNext().QueryInterface(Components.interfaces.nsIRDFResource);
>       var appID = this._getPropertyFromResource(aDataSource, targetApp, "id", aLocalItem);
>       if (appID != this._updater._appID && appID != TOOLKIT_ID)
>         continue;
> 
>+      var updateLink = this._getPropertyFromResource(aDataSource, targetApp, "updateLink", aLocalItem);
>+      var updateHash = this._getPropertyFromResource(aDataSource, targetApp, "updateHash", aLocalItem);
>+      if (aUpdateCheckType == nsIExtensionManager.UPDATE_CHECK_NEWVERSION) {
>+        // New version information is useless without a link to get it from
>+        if (!updateLink)
>+          continue;
>+
>+        /* If the update link is non-ssl and we do not have a hash or the hash
>+         * is of an insecure nature then we must ignore this update. */
>+        if ((updateLink.substring(0, 6) != "https:") && 
>+            (!updateHash || updateHash.substring(0, 3) != "sha")) {
>+          LOG("RDFItemUpdater:_parseV20Update: Update for " + aLocalItem.id +
>+              " at " + updateLink + " ignored because it is insecure. updateLink " +
>+              " must be a https url or an updateHash must be specified.");
>+          continue;
>+        }
>+      }
It seems that you disable the requirement for ssl, updateKey, etc. with gCheckCompatibility and then you make it so they can't install the update if one is found without ssl or an updateHash.
Dave, I'm not going to get back to the rest of this until later today.
Comment on attachment 278763 [details] [diff] [review]
patch rev 1

> 
> /**
>+ * A serialisation method for RDF data that produces an identical string
>+ * provided that the RDF assertions match.
>+ * The serialisation is not complete, only assertions stemming from a given
>+ * resource are included, multiple references to the same resource are not
>+ * permitted, and the RDF prolog and epilog are not included.
>+ * RDF Blob and Date literals are not supported.
>+ */
>+function RDFSerializer()
>+{
>+  this.cUtils = Components.classes["@mozilla.org/rdf/container-utils;1"]
>+                          .getService(Components.interfaces.nsIRDFContainerUtils);
>+  this.resources = [];
>+}
>+
>+RDFSerializer.prototype = {
>+  INDENT: "  ",      // The indent used for pretty-printing
>+  resources: null,   // Keeps track of the resources that have been visited
Array of the resources that have been found

>+  
>+  /**
>+   * Escapes characters from a string that should not appear in XML.
>+   */
>+  escapeEntities: function(string)
>+  {
>+    string = string.replace(/&/g, "&amp;");
>+    string = string.replace(/</g, "&lt;");
>+    string = string.replace(/>/g, "&gt;");
>+    string = string.replace(/"/g, "&quot;");
>+    return string;
>+  },
>+  
>+  /**
>+   * Returns a serilization of the elements of an RDF container.
>+   * @param ds         The datasource holding the data
>+   * @param container  The RDF container to output the child elements of
>+   * @param indent     The current level of indent for pretty-printing
>+   */
>+  serializeContainerItems: function(ds, container, indent)
>+  {
>+    var result = "";
>+    var items = container.GetElements();
>+    while (items.hasMoreElements()) {
>+      var item = items.getNext().QueryInterface(Components.interfaces.nsIRDFResource);
>+      result += indent + "<RDF:li>\n"
>+      result += this.serializeResource(ds, item, indent + this.INDENT);
>+      result += indent + "</RDF:li>\n"
>+    }
>+    return result;
>+  },
>+  
>+  /**
>+   * Returns a serilization of the regular properties of an RDF resource.
>+   * This will only output EM_NS properties and will ignore any em:signature
>+   * property.
How about something like:
Serializes all em:* (see EM_NS) properties of an RDF resource except for the em:signature property.

for this one and update other comments similarly.

Also add why it doesn't serialize the em:signature property.

>+   * @param ds         The datasource holding the data
>+   * @param resource   The RDF resource to output the properties of
>+   * @param indent     The current level of indent for pretty-printing
Please add an @returns here and elsewhere as applicable

>Index: toolkit/mozapps/shared/src/badCertHandler.js
>===================================================================
>RCS file: /cvsroot/mozilla/toolkit/mozapps/shared/src/badCertHandler.js,v
>retrieving revision 1.2
>diff -u8 -p -r1.2 badCertHandler.js
>--- toolkit/mozapps/shared/src/badCertHandler.js	15 Mar 2007 21:51:48 -0000	1.2
>+++ toolkit/mozapps/shared/src/badCertHandler.js	29 Aug 2007 12:51:08 -0000	
>@@ -60,19 +60,21 @@ function checkCert(channel) {
>     throw "cert issuer is not built-in";
> }
> 
> /**
>  * This class implements nsIBadCertListener.  It's job is to prevent "bad cert"
>  * security dialogs from being shown to the user.  It is better to simply fail
>  * if the certificate is bad. See bug 304286.
>  */
>-function BadCertHandler() {
>+function BadCertHandler(aRequireSSL) {
>+  this.requireSSL = aRequireSSL;
> }
> BadCertHandler.prototype = {
>+  requireSSL: false,
> 
>   // nsIBadCertListener
>   confirmUnknownIssuer: function(socketInfo, cert, certAddType) {
>     LOG("EM BadCertHandler: Unknown issuer");
>     return false;
>   },
> 
>   confirmMismatchDomain: function(socketInfo, targetURL, cert) {
>@@ -85,16 +87,18 @@ BadCertHandler.prototype = {
>     return false;
>   },
> 
>   notifyCrlNextupdate: function(socketInfo, targetURL, cert) {
>   },
> 
>   // nsIChannelEventSink
>   onChannelRedirect: function(oldChannel, newChannel, flags) {
>+    if (this.requireSSL && !newChannel.URI.schemeIs("https"))
>+      throw "connected to non-ssl server";
throw new Error("connected to a non-ssl server");

It might be helpful to mention that ssl connections are required in the error.
Attachment #278763 - Flags: review?(robert.bugzilla) → review-
Comment on attachment 278410 [details] [diff] [review]
testcases rev 1

I'm fine with this and I'll r+ this after verifying this is all in good order with the main patch.
Attachment #278410 - Flags: review?(robert.bugzilla)
(In reply to comment #56)
> (From update of attachment 278763 [details] [diff] [review])
> >@@ -5054,19 +5096,33 @@ ExtensionManager.prototype = {
> > 
> >   /**
> >    * Determines whether an item should be disabled by the application.
> >    * @param   id
> >    *          The ID of the item to check
> >    */
> >   _isUsableItem: function(id) {
> >     var ds = this.datasource;
> >-    return ((!gCheckCompatibility || ds.getItemProperty(id, "compatible") == "true") &&
> >-            ds.getItemProperty(id, "blocklisted") == "false" &&
> >-            ds.getItemProperty(id, "satisfiesDependencies") == "true");
> >+    /* If we're compatibility checking then check if the item is compatible.
> >+     * Also check if it isn't blocklisted and has all dependencies satisfied. */
> >+    if ((!gCheckCompatibility || ds.getItemProperty(id, "compatible") == "true") &&
> >+        ds.getItemProperty(id, "blocklisted") == "false" &&
> >+        ds.getItemProperty(id, "satisfiesDependencies") == "true") {
> >+      // appManaged items aren't updated so no need to check update security.
> >+      if (ds.getItemProperty(id, "appManaged") == "true")
> >+        return true;
> >+      /* We ignore update security if not checking compatibility. Otherwise
> >+       * check that we have an updateKey or the update URL is non-existant 
> >+       * (means we are using the app defined update URL) or SSL */
> >+      return (!gCheckCompatibility ||
> >+              ds.getItemProperty(id, "updateKey") ||
> >+              !ds.getItemProperty(id, "updateURL") ||
> >+              ds.getItemProperty(id, "updateURL").substring(0, 6) == "https:");
> This is going to app disable the add-on and not just the update check! I don't
> think we want to go that far.
> This also breaks blocklisting and dependencies.

I've spent some time thinking on this and I'm ok (though I am a tad nervous about it) with preventing installs of extensions that don't provide a secure update mechanism.
Heh, it only took me a few hours to come to this conclusion and I found out that it is the same conclusion shaver came to! That's pretty damn good for me. :P
Attached patch patch rev 2 (obsolete) — Splinter Review
This fixes all the review points so far:

(In reply to comment #56)
> >+insecureUpdatesMessage="%S" will not be installed because it allows potentially insecure updates.
> This should be singular and not plural... true?

I've changed the string name and the wording a little; though the wording remains plural, it reads right to me. I think UX should take a quick look and weigh in.

> hmmm... what's the rationale for using the checkCompatibility pref vs. a new
> pref? I thought this would have its own but this may make more sense.

I was thinking it made sense that update security was a kind of compatibility check, but I now think you're right: a separate pref has many benefits. Firstly, as a nightly user I tend to run with compatibility checking disabled, yet I don't want security checking disabled. Secondly, third-party apps may well want to lower the restrictions on add-ons by disabling the security checking; I would hope they wouldn't want to drop the compatibility checking though.

> >   _isUsableItem: function(id) {

> This is going to app disable the add-on and not just the update check! I don't
> think we want to go that far.
> This also breaks blocklisting and dependencies.

I can't see where blocklisting and dependencies are broken here?

> It seems that you disable the requirement for ssl, updateKey, etc. with
> gCheckCompatibility and then you make it so they can't install the update if
> one is found without ssl or an updateHash.

Dropped that. For now I have left in the actual signature check on the update manifest when we have an update key to check against. I'm in two minds over whether disabling that part is good as well; if we have a key, why not use it?

Unfortunately I had to make a bunch of changes to make the UI update correctly. Hopefully interdiff will be feeling friendly, but here's a roundup of what's changed aside from the previous points:

Added status messages and appropriate decoration to items disabled in the UI.
Added a dynamic rdf property secureUpdate to allow the UI to work right.
Changed the migration code to always look for an updateKey if the item does not support a secure update yet.
Added the pref to turn off the security requirement.
Used that pref on existing update testcases we have so they still pass.
Attachment #278763 - Attachment is obsolete: true
Attachment #279213 - Flags: review?(robert.bugzilla)
Attached patch testcases rev 2Splinter Review
These are the same test cases except I had to redo the signatures based on some changes in bug 390615 and I've merged the update manifests into a single rdf file.
Attachment #278410 - Attachment is obsolete: true
(In reply to comment #62)
> Created an attachment (id=279213) [details]
> patch rev 2
> 
> This fixes all the review points so far:
> 
> (In reply to comment #56)
> > >+insecureUpdatesMessage="%S" will not be installed because it allows potentially insecure updates.
> > This should be singular and not plural... true?
> 
> I've changed the string name, and the wording a little, though the words remain
> plural it seems to read right to me. I think UX should take a quick look and
> throw their word in.
That's what I was referring to... thanks!

> > >   _isUsableItem: function(id) {
> 
> > This is going to app disable the add-on and not just the update check! I don't
> > think we want to go that far.
> > This also breaks blocklisting and dependencies.
> 
> I can't see where blocklisting and dependencies are broken here?
You're right.

I'll get to this shortly.
Comment on attachment 279213 [details] [diff] [review]
patch rev 2

A couple of initial comments... I'll finish the rest tomorrow

Check if the default url is checked for https as well

>Index: toolkit/mozapps/extensions/content/extensions.js
>===================================================================
>RCS file: /cvsroot/mozilla/toolkit/mozapps/extensions/content/extensions.js,v
>retrieving revision 1.141
>diff -u8 -p -r1.141 extensions.js
>--- toolkit/mozapps/extensions/content/extensions.js	31 Aug 2007 11:35:18 -0000	1.141
>+++ toolkit/mozapps/extensions/content/extensions.js	1 Sep 2007 03:03:27 -0000	
>...
>@@ -649,16 +653,20 @@ function Startup()
>                           .QueryInterface(Components.interfaces.nsIXULRuntime);
>   gInSafeMode = appInfo.inSafeMode;
>   gAppID = appInfo.ID;
> 
>   try {
>     gCheckCompat = gPref.getBoolPref(PREF_EM_CHECK_COMPATIBILITY);
>   } catch(e) { }
> 
>+  try {
>+    gCheckUpdateSecurity = gPref.getBoolPref(PREF_EM_CHECK_UPDATE_SECURITY);
>+  } catch(e) { }
>+
I think we should display a notification if this pref is set to false as we do when compatibility checking is disabled.

>Index: toolkit/mozapps/extensions/content/extensions.xml
>===================================================================
>RCS file: /cvsroot/mozilla/toolkit/mozapps/extensions/content/extensions.xml,v
>retrieving revision 1.45
>diff -u8 -p -r1.45 extensions.xml
>--- toolkit/mozapps/extensions/content/extensions.xml	25 Aug 2007 23:35:27 -0000	1.45
>+++ toolkit/mozapps/extensions/content/extensions.xml	1 Sep 2007 03:03:06 -0000	
>@@ -124,16 +124,17 @@
>     </resources>
> 
>     <implementation>
>       <field name="eventPrefix">"extension-"</field>
>       <property name="type" onget="return parseInt(this.getAttribute('type'));"/>
>       <property name="isCompatible" onget="return this.getAttribute('compatible') == 'true';"/>
>       <property name="isBlocklisted" onget="return this.getAttribute('blocklisted') == 'true';"/>
>       <property name="isDisabled" onget="return this.getAttribute('isDisabled') == 'true';"/>
>+      <property name="hasSecureUpdates" onget="return this.getAttribute('secureUpdate') == 'true';"/>
hasSecureUpdates makes me think that if hasSecureUpdates is true that this item has secure updates available. I'm not sure what would be better but perhaps hasUpdateSecurity? Same goes for secureUpdate... perhaps updateSecurity?

>Index: toolkit/mozapps/extensions/src/nsExtensionManager.js.in
>===================================================================
>RCS file: /cvsroot/mozilla/toolkit/mozapps/extensions/src/nsExtensionManager.js.in,v
>retrieving revision 1.246
>diff -u8 -p -r1.246 nsExtensionManager.js.in
>--- toolkit/mozapps/extensions/src/nsExtensionManager.js.in	31 Aug 2007 11:35:18 -0000	1.246
>+++ toolkit/mozapps/extensions/src/nsExtensionManager.js.in	1 Sep 2007 03:21:39 -0000	
>@@ -2460,21 +2466,22 @@ ExtensionManager.prototype = {
>       }
>     }
>     catch (ex) {
>     }
>     gLocale = gPref.getCharPref(PREF_SELECTED_LOCALE);
>   },
> 
>   /**
>-   * Enables or disables extensions that are incompatible depending upon the pref
>-   * setting for compatibility checking.
>+   * When a preference that affects whether an item is usable or not is toggled
>+   * we must app-enable or app-disable the item based on the new settings.
>    */
I think this reads better
* When a preference is toggled that affects whether an item is usable or not

>-  _checkCompatToggled: function() {
>+  _usabilityPrefToggled: function() {
Naming functions can be a PITA... I don't care much for the name but I can't think of a better one right now.

>@@ -3470,18 +3481,32 @@ ExtensionManager.prototype = {
>             var metadataDS = getInstallManifest(installRDF);
>             ds.addItemMetadata(id, metadataDS, location);
>             ds.updateProperty(id, "compatible");
>           }
>         }
>       }
>       else if (allAppManaged)
>         allAppManaged = false;
>-      // appDisabled is determined by an item being compatible,
>-      // satisfying its dependencies, and not being blocklisted
>+
>+      if (ds.getItemProperty(id, "secureUpdate") == "false") {
>+        /* It's possible the previous version did not understand updateKeys so
>+         * check if we can import one for this addon from it's manifest. */
>+        var location = this.getInstallLocation(id);
>+        var installRDF = location.getItemFile(id, FILE_INSTALL_MANIFEST);
>+        if (installRDF.exists()) {
>+          var metadataDS = getInstallManifest(installRDF);
>+          var literal = metadataDS.GetTarget(gInstallManifestRoot, EM_R("updateKey"), true);
>+          if (literal && literal instanceof Components.interfaces.nsIRDFLiteral)
>+            ds.setItemProperty(id, EM_R("updateKey"), literal);
>+        }
>+      }
>+
>+      // appDisabled is determined by an item being compatible, using secure
>+      // updates, satisfying its dependencies, and not being blocklisted
>       if (this._isUsableItem(id)) {
>         if (ds.getItemProperty(id, "appDisabled"))
>           ds.setItemProperty(id, EM_R("appDisabled"), null);
>       }
>       else if (!ds.getItemProperty(id, "appDisabled"))
>         ds.setItemProperty(id, EM_R("appDisabled"), EM_L("true"));
> 
>       ds.setItemProperty(id, EM_R("availableUpdateURL"), null);
>@@ -3943,22 +3973,32 @@ ExtensionManager.prototype = {
>       }
>     }
> 
>     // Validate the Item ID
>     if (!gIDTest.test(installData.id)) {
>       installData.error = INSTALLERROR_INVALID_GUID;
>       return installData;
>     }
>-
>-    // Check the target application range specified by the extension metadata.
>+    
>+    // Check the target application range and update security.
Might as well have a comment for each check so move the target application range to be above the applicable check and have it mention compatibility.

>+    if (gCheckUpdateSecurity &&
>+        installData.updateURL &&
>+        installData.updateURL.substring(0, 6) != "https:" &&
>+        !installData.updateKey) {
>+      installData.error = INSTALLERROR_INSECURE_UPDATE;
>+      return installData;
>+    }
>+      
>     if (gCheckCompatibility &&

>@@ -6339,42 +6451,214 @@ RDFItemUpdater.prototype = {
>     // Track the best update we have found so far
>     var newestUpdateItem = null;
>     while (targetApps.hasMoreElements()) {
>       var targetApp = targetApps.getNext().QueryInterface(Components.interfaces.nsIRDFResource);
>       var appID = this._getPropertyFromResource(aDataSource, targetApp, "id", aLocalItem);
>       if (appID != this._updater._appID && appID != TOOLKIT_ID)
>         continue;
> 
>+      var updateLink = this._getPropertyFromResource(aDataSource, targetApp, "updateLink", aLocalItem);
>+      var updateHash = this._getPropertyFromResource(aDataSource, targetApp, "updateHash", aLocalItem);
>+      if (aUpdateCheckType == nsIExtensionManager.UPDATE_CHECK_NEWVERSION) {
>+        // New version information is useless without a link to get it from
>+        if (!updateLink)
>+          continue;
>+
>+        /* If the update link is non-ssl and we do not have a hash or the hash
>+         * is of an insecure nature then we must ignore this update. Bypass
>+         * this if not checking update security */
>+        if (gCheckUpdateSecurity && updateLink.substring(0, 6) != "https:" && 
>+            (!updateHash || updateHash.substring(0, 3) != "sha")) {
>+          LOG("RDFItemUpdater:_parseV20Update: Update for " + aLocalItem.id +
>+              " at " + updateLink + " ignored because it is insecure. updateLink " +
>+              " must be a https url or an updateHash must be specified.");
This should specifically call out that the hash must be sha
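The rule being reviewed here, roughly (a sketch with illustrative names, not the shipping code): an update entry is only acceptable when its updateLink is https, or when a sha-family updateHash accompanies it.

```javascript
// Sketch only: a non-https updateLink requires a sha-family updateHash.
function isUpdateEntrySecure(updateLink, updateHash, checkUpdateSecurity) {
  if (!checkUpdateSecurity)
    return true;  // security checking disabled by pref
  if (updateLink.substring(0, 6) == "https:")
    return true;  // https link is fine without a hash
  return !!updateHash && updateHash.substring(0, 3) == "sha";
}
```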
(In reply to comment #65)
>...
> Check if the default url is checked for https as well
ignore this... it is a comment to myself I forgot to delete.
(In reply to comment #65)
> >+  try {
> >+    gCheckUpdateSecurity = gPref.getBoolPref(PREF_EM_CHECK_UPDATE_SECURITY);
> >+  } catch(e) { }
> >+
> I think we should display a notification if this pref is set to false as we do
> when compatibility checking is disabled.

I'll think on this some more, but my initial thought was that we shouldn't. If Songbird, for example, chooses to disable this, do we want to saddle them with this notification? I guess they can patch it away, but still.
My initial thought was along the same lines but... when we had the app version override for extension compatibility checking, there were reports in the forums and in bug reports of extensions not working because people had set it and then either forgot they had set it or didn't even know that the instructions they had followed caused the problems they were seeing. If the process for adding an updateHash and the signature is simple, I would hope most extension authors will add them and remove the need to set this pref except when testing.
I think I might be able to avoid the changes to badCertHandler.js; it appears that it already checks and throws on any attempt to change from an https to an http url. checkCert looks at originalURI, which is the very first url requested regardless of any redirects. If https, it then attempts to QueryInterface the channel's securityInfo. If the channel is non-https then its securityInfo is null and so that will throw. From bug 340198 comment 48 that appears to be entirely intentional.

I will double check this and resubmit if it appears to be the case.
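The behaviour described above amounts to something like the following (a rough sketch of checkCert's shape, not the real badCertHandler.js code): if the request started out https, a redirect to plain http leaves the channel's securityInfo null, and touching it throws.

```javascript
// Sketch only: illustrates why an https-to-http redirect fails the check.
function checkCert(channel) {
  if (!channel.originalURI.schemeIs("https"))
    return;  // request never claimed to be secure; nothing to verify
  if (!channel.securityInfo)
    throw new Error("channel.securityInfo has no properties");  // redirected to http
  // ...certificate issuer checks would follow here...
}
```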
So, if you'd like to make it so a XULRunner app doesn't show the notification you could check against the default pref branch and if that is false not show the notification.
Comment on attachment 279214 [details] [diff] [review]
testcases rev 2

>Index: toolkit/mozapps/extensions/test/unit/data/test_bug378216.rdf
>===================================================================
>RCS file: /cvsroot/mozilla/toolkit/mozapps/extensions/test/unit/data/test_bug378216.rdf,v
>diff -N /toolkit/mozapps/extensions/test/unit/data/test_bug378216.rdf
>--- /dev/null	1 Jan 1970 00:00:00 -0000
>+++ toolkit/mozapps/extensions/test/unit/data/test_bug378216.rdf	1 Sep 2007 00:27:30 -0000	
>@@ -0,0 +1,197 @@
>+<?xml version="1.0" encoding="UTF8"?>
nit: UTF-8

r=me
Attachment #279214 - Flags: review+
Comment on attachment 279213 [details] [diff] [review]
patch rev 2

one more nit: when you add a throw, use throw new Error.

With nits fixed r=me. I'd like to do a quick once over of the updated patch after the nits have been fixed and you have verified if the bad cert handler needs the changes or not.
Attachment #279213 - Flags: review?(robert.bugzilla) → review+
Just to confirm, the update check does indeed fail if the update url is https and then redirects to http. I get the following:

RDFItemUpdater::onXMLLoad: TypeError: channel.securityInfo has no properties
Datasource: Addon Update Ended: {c7b6314b-991f-4d24-ba49-473c1511c237}, status: 4

I'll pull out the unnecessary changes, add in the notification bar and resubmit this afternoon.
Attached patch patch rev 3 (obsolete) — Splinter Review
(In reply to comment #65)
> >+      <property name="hasSecureUpdates" onget="return this.getAttribute('secureUpdate') == 'true';"/>
> hasSecureUpdates makes me think that if hasSecureUpdates is true that this item
> has secure updates available. I'm not sure what would be better but perhaps
> hasUpdateSecurity? Same goes for secureUpdate... perhaps updateSecurity?

I've gone with providesUpdatesSecurely, a little long maybe but I think it's less ambiguous.

> >-  _checkCompatToggled: function() {
> >+  _usabilityPrefToggled: function() {
> Naming functions can be a PITA... I don't care much for the name but I can't
> think of a better one right now.

Gone for _updateAppDisabledState to describe what the function does rather than when we should call it.

I've also used the default pref branch for checking whether to display the notification about update security.
Attachment #279213 - Attachment is obsolete: true
Attachment #279341 - Flags: review?(robert.bugzilla)
Attached patch patch rev 4Splinter Review
This cleans up the case where an add-on's directory or pointer file appears in the install location. It adds the add-on's metadata to the datasource but app-disables it, which is consistent with how we handle add-ons already installed when the app is upgraded to one requiring secure updates.
Attachment #279341 - Attachment is obsolete: true
Attachment #279386 - Flags: review?(robert.bugzilla)
Attachment #279341 - Flags: review?(robert.bugzilla)
Comment on attachment 279386 [details] [diff] [review]
patch rev 4

r=me and thanks Dave!
Attachment #279386 - Flags: review?(robert.bugzilla) → review+
Blocks: 393650
Checked in implementation and test cases with that minor nit fixed on checkin.
Status: NEW → RESOLVED
Closed: 17 years ago
Resolution: --- → FIXED
Flags: in-testsuite+
Here is some sample code for doing this, assuming the extension ID is
"sample@site":
const updateURL = "http://www.site.com/update.rdf";
const prefs = Components.classes["@mozilla.org/preferences-service;1"]
                        .getService(Components.interfaces.nsIPrefBranch);
prefs.setCharPref("extensions.sample@site.update.url", updateURL);

...where do I have to insert that (or similar) code?
Depends on: 394990
Couldn't an extension include an https:// URL which redirects to http://, since the code only checks for the https:// within the extension? OK, one could ask why an extension author would do this, but should anything be done about it (like checking the final URL after the http request)?
(In reply to comment #79)
> Couldn't an extension include a https:// URL which redirects to http://, as the
> code only checks for the https:// within the extension? Ok, one could ask why
> an extension author should do this, but should anything be done about this
> (like check the final URL after the http request)?

No, a redirect to http from https will fail (see comment 69)
And now?
With the latest SM nightly I'm even unable to *use* my own extensions, which (along with their updates) are hosted on my own server, because I can't afford SSL for that server.
If I just couldn't update them (getting a hint as to why), OK, but not being able to *use* them? I wasn't able to find a pref to disable that behaviour.
Anybody in need of valid digital certificates, you can get certification for free at http://cert.startcom.org/

Hope this helps!
(In reply to comment #81)
> And now?
> With the latest SM nightly I'm even unable to *use* my own extensions which
> (and whose updates) are hosted on my own server because I can't afford SSL for
> that server.
> If I couldn't just update them (getting a hint for the reason why), ok, but not
> *using* them? I wasn't able to find a pref to disable that behaviour.

Relax!
1) Set "extensions.checkUpdateSecurity" to boolean false
2) Use <em:updateKey>anything</em:updateKey>
  - where anything can be your friend's update key.
3) Don't add an updateURL and use your own update service.
4) Host your extension at a.m.o (certificate needed/required).
5) See Eddy Nigg's reply (comment #82)
6) Wait two more weeks for the documentation/tools to become available.
7) Delete the update service (or hack it to pieces like we did).
8) No longer use Mozilla Firefox/SeaMonkey v2.0 nightlies.

(In reply to comment #81)
> Anybody in need of valid digital certificates, you can get certification for
> free at http://cert.startcom.org/
>
> Hope this helps!

What about privacy?
(In reply to comment #83)
> 
> What about privacy?
> 
In which respect? 

For questions  concerning PKI, but not related to this bug, please post to the dev-tech-crypto or dev-security mailing list. Feel free to mail me as well. Thanks!

(In reply to comment #83)
> (In reply to comment #81)
[...]
> > If I couldn't just update them (getting a hint for the reason why), ok, but not
> > *using* them? I wasn't able to find a pref to disable that behaviour.
> 
> Relax!
> 1) Set "extensions.checkUpdateSecurity" to boolean false
> 2) Use <em:updateKey>anything</em:updateKey>
>   - where anything can be your friends update key.
> 3) Don't add a updateURL and use your own update service.
> 4) Host your extension at a.m.o (certificate needed/required).
> 5) See Eddy Nigg's replay (comment #82)
> 6) Wait two more weeks for the documentation/tools to become available.
> 7) Delete the update service (or hack it to pieces like what we did).
> 8) No longer use Mozilla Firefox/SeaMonkey v2.0 nightlies.

...and, for Johannes: the above are alternative possibilities, not a list of steps for a single procedure.
Thanks mv, the pref helps for the moment. It would have been nice to mention it in the discussion (not only in the code, where I eventually found it and learned its name).

That updateKey thing I don't understand at the moment; it's not explained in http://developer.mozilla.org/en/docs/Install_Manifests. And speaking of documentation, is it right that there's none for the (even older) Update Manifests (update.rdf) at all?
The other points are not an option for me I fear.
Ok, so anyone in need of help should subscribe to: news.mozilla.org -> mozilla.dev.extensions and look for the: "Add-on security restrictions" thread and not spam this bug report.  Thank you.
Simple quick hack.
You can use this page to generate the public key info and a signature if you already have a key pair and its certificate (from NSS's certutil or similar), and if you understand the manifest serialization method.
- You must trust the certificate issuer (as an Email User CA), otherwise you will get an Internal Error.
- You must access with Firefox because the page uses crypto.signText() function.

Though the output is in the exact format, I've not yet tested it with trunk at all.
I hope this will help you.
(In reply to comment #89)
> Created an attachment (id=281024) [details]
> my Update Signature Generator - prototype

I've done some quick testing with this and the signatures it produces fail the signature checks. I haven't really dug into why this is. The McCoy tool to handle the signature and key generation for you should be out real soon now.
(In reply to comment #88)
> Added notes to:
> 
> http://developer.mozilla.org/en/docs/Extension_Versioning%2C_Update_and_Compatibility
> http://developer.mozilla.org/en/docs/Install_Manifests
> http://developer.mozilla.org/en/docs/Firefox_3_for_developers
> http://developer.mozilla.org/en/docs/Updating_extensions_for_Firefox_3
> 
From what I see it is not mentioned that self-signed certs or certs signed by a self-installed CA cert won't work. The https cert must be signed by one of those "Builtin Object Token" CA certs.
(In reply to comment #91)
> From what I see it is not mentioned that self-signed certs or certs signed by a
> self-installed CA cert won't work. The https cert must be signed by one of
> those "Builtin Object Token" CA certs.

Yes you're right I should add that, though that became a requirement in Firefox 1.5 I believe.
(In reply to comment #90)
> The McCoy tool to handle the signature and key generation should be out 
> real soon now.

NSS has a utility that does key generation and object signing.  
It's called signtool.  If it's insufficient, please tell me how/why.

(In reply to comment #93)
> NSS has a utility that does key generation and object signing.  
> It's called signtool.  If it's insufficient, please tell me how/why.

signtool signs jar files (not what we are signing here) and requires a code signing certificate (too hard for most authors to obtain).
Dave, where is the open specification for your new signature format?
(In reply to comment #95)
> Dave, where is the open specification for your new signature format?

Please check the bug's url and the links off that page for details of the security verification.
verified fixed using Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9a8) Gecko/2007091216 GranParadiso/3.0a8 and an old version (with a non-https update path).

The extension is disabled by default because it "does not provide secure updates"
Status: RESOLVED → VERIFIED
2 issues:

1. how do you deal with self issued CA?
1a. most users can't verify the certificate
1b. users that can verify the certificate will cause spam to the developer if they do it properly


2. even valid certificates are cheap and virtually anonymous
a single valid malicious certificate and man in the middle attack can compromise a popular extension.
screenshots illustrating some certificates:
http://www.benedelman.org/spyware/images/installers-020305.html
http://www.benedelman.org/spyware/images/installers-020305.html#2
http://www.benedelman.org/spyware/images/installers-020305.html#3
(In reply to comment #99)
> 2 issues:
> 
> 1. how do you deal with self issued CA?
> 1a. most users can't verify the certificate
> 1b. users that can verify the certificate will cause spam to the developer if
> they do it properly

Self signed certificates are rejected by the add-ons update process (as they have been since Firefox 1.5.0.10 and Firefox 2.0.0.2)

> 2. even valid certificates are cheap and virtually anonymous
> a single valid malicious certificate and man in the middle attack can
> compromise a popular extension.
> screenshots illustrating some certificates:
> http://www.benedelman.org/spyware/images/installers-020305.html
> http://www.benedelman.org/spyware/images/installers-020305.html#2
> http://www.benedelman.org/spyware/images/installers-020305.html#3

Your screenshots appear to be related to signing installers so I am not sure how they relate to this bug.

(In reply to comment #100)
> 
> Self signed certificates are rejected by the add-ons update process (as they
> have been since Firefox 1.5.0.10 and Firefox 2.0.0.2)
> 

the corporate documentation is not clear enough on this afaict.

> Your screenshots appear to be related to signing installers so I am not sure
> how they relate to this bug.
> 

the screenshots aren't mine. afaict they were in the wild. please educate yourself to tell an installer from activex(tm)(r)(inc) - the difference is little, but important - the installer is the .EXE you click, the activex(tm)(r)(inc) is the annoying dialog that popups while you browse scarcely dressed ladies.

can some well known crypto expert please comment on installing digitally signed active content?

G30rgi, denying installation of extensions from an unknown source is beyond the scope of this bug. The threat model for this bug is a user installing a trusted extension which uses a http:// update URL, and then connecting to an untrusted network. With the current software update system, an attacker could install evil code on the user's browser via the update mechanism.

This bug prevents this by requiring the original (trusted) extension to either use non-self-signed https:// URLs, or to include a public key in the initial installation and verify it on update. This ensures that the updates are authorized by the entity who made the original extension - but ensuring that these updates are benign is again beyond the scope of this bug.
wanted to point out that the doc http://wiki.mozilla.org/User:Mossop:Fx-Docs:AddonUpdateSecurity
wasn't clear enough about self signed certs when i first read it, but comment #102 made it clear.

the other thing is that any cert signed by a trusted CA and man in the middle compromises this solution (if i understand the solution correctly) - the screenshots just illustrated what certs are issued.

not counting extensions, does any signed cert + MITM compromise vanilla firefox update - i.e. no extensions?

Georgi, either you don't understand the new design, or I don't.

AFAIK, this new scheme uses certs from CAs for SSL (https) servers only.
Initial installation packages received from an https server (with a valid 
cert from a trusted CA), and updates that are received from the https server 
designated in the initial download, are presumed to be authentic by virtue
of having been obtained from an authenticated source.  But after the initial
download, nothing thereafter requires SSL or any certs from any CAs. 
Indeed, an initial https download may be necessary only for the "install
manifest" and for nothing else.

The signatures in the "Update Manifest"s appear to be verified using a 
public key in the "install manifest".  While there are some similarities 
between an "install manifest" and a certificate, an install manifest is 
not a signed document, and I'm not aware that Mozilla is doing any strong 
identity vetting for it.  It's simply the way that an initial public key 
is established at the time of initial download, for verifying subsequent 
updates.  It appears to me that the public key in the install manifest 
provides essentially the same protection as an SSH server key. It provides 
assurances that the updates come from the same source as the original, 
and nothing more.  (Right, Dave?)

Some person or persons at Mozilla (Dave and other person(s) unknown) have 
decided that level of protection is sufficient for securing updates.  
I think it is appropriate for them to decide their threat model (the attacks 
against which they want to guard) and to employ a solution that just covers 
that model, if they wish. 

My concern for this solution was (and is) that it be sufficiently well
specified that others can implement tools for issuing the install and 
updates manifests, with the keys and signatures they contain.  Personally,
I would have preferred if the formats were entirely standard, so that 
existing tools would suffice and new tools would not be needed.  IMO, 
tools that implement the standards are more likely to keep apace with the 
developments in cryptography than special tools that are less frequently
and widely used.  But as long as Mozilla is willing to meet the challenge 
of maintaining and enhancing their new tool, that concern is largely academic.
Nelson is correct. Ultimately this is all about securing updates and nothing about securing the initial install. By presuming that the installed add-on is trusted, we can use information from that install to establish the authenticity of the update information: either a public key to verify a signature in the update information, or an https url to retrieve the information from, which we check has a cert from a trusted CA.
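Put another way, the two trust paths Dave describes can be sketched like this (all names are illustrative; verifySignature stands in for whatever actually checks the em:signature):

```javascript
// Sketch only: an update is trusted via the public key stored at install
// time, or via https with a cert from a trusted CA; the initial install
// itself is presumed trusted.
function isUpdateTrusted(update, installedAddon, verifySignature) {
  if (installedAddon.updateKey)  // path 1: key shipped in the install manifest
    return verifySignature(installedAddon.updateKey,
                           update.manifest, update.signature);
  // Path 2: manifest fetched over https with a cert from a trusted CA.
  return !!(update.fetchedOverHttps && update.certIssuerTrusted);
}
```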
may be wrong but if i read
http://wiki.mozilla.org/User:Mossop:Fx-Docs:AddonUpdateSecurity
correctly one of the ways to securely distribute an extension:
 Securing Update Manifests Through SSL
...
 Securing Update Packages Through Hashes 
If the update package is not being delivered by an SSL connection then an updateHash entry must be present.

is the following attack possible:

legitimate extension is installed and it is ok.

man in the middle performs the following attack at update of above extension:
serves manifest from https url and presents cert signed by trusted CA issued to john doe, homeless person in uganda (or something like this). then manifest has correct hash and .xpi may be served either via the same cert or unencrypted and the hash is ok.

if i understand correctly "if a trusted CA sold a cert to whomever, update security is ok".
(In reply to comment #106)
> man in the middle performs the following attack at update of above extension:
> serves manifest from https url and presents cert signed by trusted CA issued to
> john doe, homeless person in uganda (or something like this). then manifest has
> correct hash and .xpi may be served either via the same cert or unencrypted and
> the hash is ok.

So lets take one of my extensions as an example. The update url is at https://www.oxymoronical.com/firefox/nightly/update.rdf. For what you are suggesting to work you would have to find a CA that is in Firefox's trusted list to issue an ssl certificate for www.oxymoronical.com to a malicious person who does not really have control over that website (i.e. anyone but me).

If we have CAs willing to do that then they should be removed from the trusted list. As I understand it CAs go through a vetting process before being put on the trusted list precisely because we have to trust that they are giving due care and attention when issuing certificates because their word is what is protecting you while you perform any kind of secure transaction online.
(In reply to comment #106)

> man in the middle performs the following attack at update of above extension:
> serves manifest from https url and presents cert signed by trusted CA issued to
> john doe, homeless person in uganda (or something like this). then manifest has
> correct hash and .xpi may be served either via the same cert or unencrypted and
> the hash is ok.
> 
> if i understand correctly "if a trusted CA sold a cert to whomever, update
> security is ok".

As Dave mentions, for this attack to work, John Doe in Uganda would have to demonstrate ownership of the update url's domain (usually by placing a CA-supplied file at a particular location on the domain's web site, though some CAs - e.g. those who are domain registrars - may have other methods).  The CAs are audited on their performance of this check, and passing regular audits is a condition of entry into our root store.

Georgi, I know you have eyes in a lot of places -- if you know of CAs which are routinely issuing certificates for domains without demonstrating ownership, please do let us know - it's absolutely grounds for pulling their certs from our trusted list. 

(In reply to comment #108)
> As Dave mentions, for this attack to work, John Doe in Uganda would have to
> demonstrate ownership of the update url's domain (usually by placing a
> CA-supplied file at a particular location on the domain's web site, though some
> CAs - e.g. those who are domain registrars - may have other methods).  The CAs
> are audited on their performance of this check, and passing regular audits is a
> condition of entry into our root store.
> 
> Georgi, I know you have eyes in a lot of places -- if you know of CAs which are
> routinely issuing certificates for domains without demonstrating ownership,
> please do let us know - it's absolutely grounds for pulling their certs from
> our trusted list. 
> 

oops sorry. due to dumbness i thought the domain name could be controlled by the mitm. so my comments were quite dumb, sorry.

Attachment #281024 - Attachment is obsolete: true
(In reply to comment #90)
> I've done some quick testing with this and the signatures it produces fail the
> signature checks.

Oops, sorry. I misunderstood the crypto.signText()'s return value (PKCS #7).

BTW, I hope nsDataSignatureVerifier (i.e. em:signature) supports PKCS #7's signerInfo.authenticatedAttributes-like attributes. Then you can easily morph PKCS #7 signedData to em:signature.
The optional authenticatedAttributes contains the original text's hash value, and a hash value is calculated from the DER-encoded authenticatedAttributes. Finally, the hash value of authenticatedAttributes is signed.
See the below page for details.
http://docs.sun.com/source/816-6152-10/sgntxt.htm

This may be another bug, but I have little time to file it.

(In reply to comment #104)
> Personally, I would have preferred if the formats were entirely standard, so
> that existing tools would suffice and new tools would not be needed.

Nelson, my above proposal may ease a part of your concern, right?
(Again, you'll be able to easily morph PKCS #7 signedData to em:signature.)
No longer depends on: 394990
For inexperienced users: to switch back to the FF2 behaviour with insecure updates enabled, these are the steps to follow:

1) open about:config
2) create new boolean value extensions.checkUpdateSecurity=false
3) restart
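Equivalently, the same pref can be set once in the profile's user.js (again, this disables a security check, so only do it if you understand the risk):

```javascript
// user.js in the profile directory: disables the update security check
user_pref("extensions.checkUpdateSecurity", false);
```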
Inexperienced users should _not_ be disabling security features.
That being said, it still took me some time to figure out how to make my installed extensions work on FF3 while waiting for the authors to supply updates. From my point of view, a hint in the release notes (or this bug) was missing.
It's fine for it to take some time to figure out, and to require expertise.  How many people do you think are going to forget to turn that back on later?  Making the (permanent) workaround more widely known and accessible to the naive is likely to trade a pretty significant vulnerability for some very short-term convenience; I don't think it's a wise trade, or a helpful one for us to be promoting or easing.

(Maybe we should reset that pref on upgrade, hmm.)
(In reply to comment #114)
> It's fine for it to take some time to figure out, and to require expertise. 
> How many people do you think are going to forget to turn that back on later? 
> Making the (permanent) workaround more widely known and accessible to the naive
> is likely to trade a pretty significant vulnerability for some very short-term
> convenience; I don't think it's a wise trade, or a helpful one for us to be
> promoting or easing.

I agree.  Security is becoming more and more important with this ever growing user base, which is also changing in terms of expertise it seems.  Having this kind of pref setting all over the Internet might hurt someone or a certain group of users one day, which nobody here is waiting for of course.

> (Maybe we should reset that pref on upgrade, hmm.)

Either that or a big fat annoying warning, because even a kid like me (age 15) was able to get extensions going for FF3 (thanks to McCoy!).

Never mind. I didn't come to argue about security but to document the workaround while waiting for the real solution (an extension upgrade). If others are quicker, fine (comment 115).
Btw, regarding comment 114: the pref to disable it was not present, which made it so hard to find.
Flags: wanted1.8.1.x-
Product: Firefox → Toolkit
Hmm, a setting that forces me to forego use of an extension entirely just because it doesn't have a secure update method registered is the first thing that will have to be disabled, and stay so. If you wanted to make things secure you would prevent auto-update of such extensions, not their installation.
Can you please head over to bug 429319 and raise your opinion there? Thanks.
(In reply to comment #119)
> Can you please head over to bug 429319 and raise your opinion there? Thanks.

That bug has nothing to do with disabling the installation of extensions with no secure update mechanism
(In reply to comment #120)
> (In reply to comment #119)
> > Can you please head over to bug 429319 and raise your opinion there? Thanks.
> 
> That bug has nothing to do with disabling the installation of extensions with
> no secure update mechanism

In such a case the initial and following comments on bug 429319 are unclear for people outside of this component.
You need to log in before you can comment on or make changes to this bug.