Closed Bug 471779 Opened 16 years ago Closed 10 years ago

Autoupdater needs to enforce a specific SSL server certificate

Category: Toolkit :: Application Update (defect)
Priority: Not set; Severity: major
Status: RESOLVED WONTFIX
Reporter: BenB; Assignee: Unassigned
Whiteboard: [sg:investigate]

There has been a lot of talk about why the auto-updater, for the application and for all extensions, needs to use SSL. Otherwise, an attacker can insert himself into the connection and feed the victim malicious executables.

Last week, we learned that some CAs are "less diligent than they should be and claim to be", to put it politely. In fact, there were certificates issued by Comodo with *no checks at all*. Even when checks are done, they are done only via http, which makes the resulting https pointless.
EV certificates require more checks, but even there, the security hangs off a phone call and similarly weak checks.
No CA claims to be certain that it makes no errors. CAs are prone to errors, and they admit themselves that errors are inherent to the system.

Therefore, we cannot rely on CAs for highly security-critical stuff like autoupdate.

The solution is rather easy: Just mandate a *specific* certificate.

This is similar to the root CA certs that we ship. In fact, to keep this important key off Internet-accessible servers, mozilla.org could create a root cert and sign the concrete (autoupdate) server cert with it, and the autoupdater would refuse any certs that are not signed by that root cert. The mozilla.org root would be used only for the autoupdate servers, nothing else.

With that, we don't need to prevent CAs from handing out certs for addons.mozilla.org and similar hostnames. The weakness in the CA system, where anybody can get a cert for their private mail server, is thus eliminated for the updater.

For third-party add-ons not distributed and updated via addons.mozilla.org, we can find a similar solution. E.g. the extension ships either the (root) cert or the fingerprint of a key next to its update URL, and the update server is mandated to present that key. This has the additional benefit that the author does not have to pay for a certificate. There need to be tutorials, though. Security is not an additional concern, because write access to the source code and binaries needs to be kept secure anyway.

Concrete actions:
1. Create a private key and cert as the root cert for mozilla.org update servers. Let that key sign another cert for addons.mozilla.org and update.mozdev.org (or whatever the servers are called), to be placed on the servers. The private root key is put on a USB stick in a safe.
(1b. Transition period: Old clients need to accept the new cert as well. Either one of the existing CAs signs the root cert as an intermediate (for free), or we just create a new hostname/IP address and point the new clients at that, while old clients check the old hostname with the old CA-based certs.)
2a. Let the auto-updater intervene at the cert check step (I hope such a hookup callback already exists in PSM), look at the cert, walk up the cert chain, and check whether any of the certs match the expected certificate fingerprint. If so, accept the cert and connection; otherwise abort the connection and log an error.
2b. (Better alternative to 2a.) Add a feature to NSS/PSM/Necko to (only) allow a specific certificate for a certain connection (nsIChannel), regardless of whether it's signed by a cert in the root cert database or not. The autoupdater would set this.
3. Put the certificate fingerprint in the autoupdater config, next to the autoupdate URL. (A sketch of such a fingerprint check follows below.)
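
A minimal sketch of the fingerprint check from steps 2a/3, in Python purely for illustration (the real code would hook into PSM/Necko); the host name and the pinned fingerprint are placeholder values, and this simplest variant checks only the presented server cert rather than walking the whole chain:

# Sketch only: fingerprint pinning as described in steps 2a/3.
# UPDATE_HOST and EXPECTED_FINGERPRINT are placeholders; a real updater
# would perform this check inside the certificate-verification callback.
import hashlib
import socket
import ssl

UPDATE_HOST = "aus3.mozilla.org"        # hypothetical update server host
EXPECTED_FINGERPRINT = "ab:cd:ef:..."   # stored next to the update URL in the config

def cert_fingerprint(der_bytes):
    # SHA-256 fingerprint of a DER-encoded certificate, as colon-separated hex.
    digest = hashlib.sha256(der_bytes).hexdigest()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))

def update_server_cert_matches(host, port=443):
    # Normal CA validation is deliberately disabled: only the pinned
    # fingerprint decides whether the connection is accepted.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der = tls.getpeercert(binary_form=True)
    return cert_fingerprint(der) == EXPECTED_FINGERPRINT

if __name__ == "__main__":
    if not update_server_cert_matches(UPDATE_HOST):
        raise SystemExit("update server cert does not match the pinned fingerprint")

The same pattern applies to extension update URLs: the expected fingerprint simply lives in the config next to the URL.
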
Component: Security → Application Update
Product: Core → Toolkit
QA Contact: toolkit → application.update
Version: 1.9.1 Branch → Trunk
Importance: High.
As it stands, the security of my system is dependent on the CA system.

And the security of the CA system is generally considered to be rather low, by
the Mozilla security-group and m.d.t.crypto. Any mildly dedicated cracker could
probably get a wrongly issued CA cert. As said, CAs make no assurances, either.
This is a dupe of bug 471672
Status: NEW → RESOLVED
Closed: 16 years ago
Resolution: --- → DUPLICATE
No, it's not. I specifically say that EV is not adequate either.
Status: RESOLVED → REOPENED
Resolution: DUPLICATE → ---
Ben, I think Mozilla shouldn't start using (and taking care of) its own CA roots, certainly not saving them on USB sticks. But I reflected your idea about checking specific fingerprint(s) by the updater in bug 471672.
Status: REOPENED → RESOLVED
Closed: 16 years ago
Resolution: --- → DUPLICATE
Sorry, I didn't mean to dup it again. That happened by mid-air collision.
Status: RESOLVED → REOPENED
Resolution: DUPLICATE → ---
I don't trust any roots that the public can get certs from, nor any roots that are operated by third-party organizations. Not when the risk is getting access to 200 million PCs.
Note that Mozilla.org already needs to properly secure the executables and their creation process. The risk from the private key is not higher than that, as binaries can do anything on all the users' systems, including replacing root certs.
Please don't get distracted by the "root cert". That is merely a technicality.
The main idea is that the autoupdater knows and enforces the certificate fingerprint that the update server has.
Status: REOPENED → NEW
The major problem with this is that certs expire. If we're diligent we can note
the cert expiry time and get the new cert enough in advance (shortening its
effective life) to be able to ship an update with knowledge of the new cert.
That update will either have to support two certs (old and new) or updates will
be broken until the actual cert cut-over.

At cert cut-over anyone who hadn't upgraded yet is out of luck and updates will
fail.

There's more than one critical update cert, though. There's aus2.mozilla.org
for application updates and versioncheck.addons.mozilla.org for daily addon
update checks. In addition the "Get Recommended addons" feature uses
services.addons.mozilla.org, though you could argue that's no less secure than
visiting the website itself and is good enough as-is. It uses the same wildcard
cert as versioncheck so currently that's not an additional problem anyway.

The blocklist check still uses plain addons.mozilla.org which seems wrong to
me. I could argue either way that it's a versioncheck or a service, but if
we're trying to build in SSL cert fingerprints I'll call it a versioncheck to
keep the number of hosts down.

Oh, and the Plugin Finder Service uses pfs.mozilla.org

We could build in knowledge of a single cert-check site, and download a file
containing the current expected certs. You still have the expiry problem where
old clients are no longer able to update.

(In reply to comment #8)
> Note that Mozilla.org already needs to properly secure the executables and
> their creation process. The risk of the private key is not higher than that,
> as binaries can do anything on all the users' systems, including replacing
> root certs.

True, but if we have a security problem after we've shipped the binaries you're using, you're still safe. If we detect it we wouldn't ship again until we cleaned it up. If we lose control of the private key, on the other hand, that's a one-time event that immediately compromises all past and future versions (until we detect it and change certs, stranding all old clients with no access to the correct update server but still able to connect to a rogue update server).
> The major problem with this is that certs expire.

Not necessarily. You can create a cert without expiry, or with expiry in 20 years. Most CA root certs are like that.

> We could build-in knowledge of a single certcheck site, and download a file
> containing the current expected certs.

or create a small cert hierarchy, i.e. have the cert that the app knows sign the actual server certs. The server certs would need to be signed by that known cert.
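
To make the "small cert hierarchy" concrete, here is a sketch using the Python `cryptography` package purely as an illustration (NSS's certutil or openssl would be the realistic tools); the names, key sizes, and lifetimes are invented:

# Sketch: an offline root key/cert signs the actual update-server cert.
# All names and lifetimes below are illustrative assumptions.
import datetime
from cryptography import x509
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

now = datetime.datetime.utcnow()

# Root key/cert: created once, then stored offline.
root_key = rsa.generate_private_key(public_exponent=65537, key_size=4096)
root_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Example Update Root")])
root_cert = (
    x509.CertificateBuilder()
    .subject_name(root_name)
    .issuer_name(root_name)                       # self-signed
    .public_key(root_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=365 * 20))   # ~20 years
    .add_extension(x509.BasicConstraints(ca=True, path_length=0), critical=True)
    .sign(root_key, hashes.SHA256())
)

# Server key/cert for the update host, signed by the root key.
server_key = rsa.generate_private_key(public_exponent=65537, key_size=4096)
server_cert = (
    x509.CertificateBuilder()
    .subject_name(x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "aus3.mozilla.org")]))
    .issuer_name(root_cert.subject)
    .public_key(server_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=365))         # 1 year
    .sign(root_key, hashes.SHA256())
)

# PEM for the update server; the root fingerprint is what the app would pin.
open("server.pem", "wb").write(server_cert.public_bytes(serialization.Encoding.PEM))
print("pin this:", root_cert.fingerprint(hashes.SHA256()).hex())
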

> If we lose control of the private key, on the other hand, that's
> a one-time event that immediately compromises all past ... versions

The same is true for any other (arbitrary code execution) security hole in the code that we find.

Personally, I don't think it's that hard to keep one file secure, which you use 3 times a year. The risk of that being compromised is much, much smaller than the current risk inherent in the CA system, where anybody can get certs and the checks are - at best - based on a phone call.
Ben,  You're on record as not generally trusting CAs.  But for Mozilla to 
stop using a signature verified with a CA-issued cert, or to stop verifying
that signature on the strength of the cert being issued by a trusted CA,
sends a signal to the world that Mozilla doesn't really trust the PKI 
system of CAs that its products expect the rest of the world to trust.
So, I think this is a Mozilla policy issue/question.  
The question is: does Mozilla trust PKI & CAs, or not?
OS: Linux → All
Hardware: x86 → All
So:
- Create a key/cert ("A"), with 20 year expiry, and store it offline.
- Create another key/cert ("S1") for aus3.mozilla.org, similar for other
  update servers. This is placed on aus3.mozilla.org, to drive https/SSL
  (including private key). Expiry could be one or two years.
- Key A signs cert S1.
- The autoupdater knows the fingerprint of cert A, in the autoupdate config,
  and mandates and checks that all update server certs (S1) are
  signed with that cert A. (A verification sketch follows after this list.)
- Cert A could be in the NSS root cert store, to allow browsing to it, or not.
  I prefer not putting it there. Either way,
  key A never signs certs for the general public, only mozilla.org servers.
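
A sketch of the client-side check for this A/S1 scheme, again using the Python `cryptography` package as a stand-in for the real PSM code; the pinned fingerprint is a placeholder and RSA keys are assumed, as in the earlier sketch:

# Sketch: accept server cert S1 only if the root cert A matches the pinned
# fingerprint from the autoupdate config and A's key actually signed S1.
from cryptography import x509
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding

PINNED_ROOT_SHA256 = "0123abcd..."  # placeholder: fingerprint of cert A

def accept_server_cert(server_pem, root_pem):
    s1 = x509.load_pem_x509_certificate(server_pem)
    a = x509.load_pem_x509_certificate(root_pem)

    # 1. The root we were handed must be the one the updater pins.
    if a.fingerprint(hashes.SHA256()).hex() != PINNED_ROOT_SHA256:
        return False
    # 2. S1 must name A as its issuer...
    if s1.issuer != a.subject:
        return False
    # 3. ...and A's public key must verify S1's signature (RSA assumed).
    try:
        a.public_key().verify(
            s1.signature,
            s1.tbs_certificate_bytes,
            padding.PKCS1v15(),
            s1.signature_hash_algorithm,
        )
    except InvalidSignature:
        return False
    return True
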

---

> Ben,  You're on record as not generally trusting CAs.

I wonder how you can still question this, given recent developments. A credit-card purchase for $100 is one thing, but the autoupdate is about 200 million PCs. Please see the initial description.

> does Mozilla trust PKI & CAs, or not?

We do *not* trust it with $200 billion. None of the CAs say it's to be trusted with that. (200 million PCs, each with data worth $1000, some desktops with credentials worth $1,000,000 or even life-critical data, e.g. for Chinese dissidents, or nude photos of your wife.)

Nelson, please do not make a political question out of this. It's rather trivial to limit the attack surface, by limiting certificate issuance from the general public to internal use.
What impact would this have on updating or installing extensions from
www.mozdev.org?  How about installing extensions that have been downloaded to a
local hard drive?   

What impact would this have on downloading Mozilla installation files via FTP (e.g., Thunderbird, SeaMonkey, Camino)?  How about reinstalling old installation files (e.g., reinstalling a version over the same version that has become corrupted)?  

Yes, I know this bug report addresses automatic updates.  However, I'm
concerned that implementation might go beyond that (as often happens).
(In reply to comment #13)
> I wonder how you can still question this, given recent developments. A
> credit-card purchase for $100 is one thing, but the autoupdate is about 200
> million PCs. Please see the initial description.

Your comparison is flawed. You are comparing the impact of a single compromised credit card transaction to 200 million compromised application updates.
> What impact would this have on updating or installing extensions from
> www.mozdev.org?

mozdev could use its own root, via the same mechanism as other extensions (see above). Essentially, the valid cert fingerprint is configured right next to the updater URL.

Alternatively, the mozdev.org update server cert could be signed by the mozilla.org cert, pretending that mozdev is internal, because mozdev gets special treatment from Mozilla anyway (whitelisted). I prefer the other approach, though.

Whether to use or enforce this for the browsable websites is another question. This bug is only about the updater (automatic or semi-automatic updates of FF and extensions).

> How about installing extensions [or installers] that have been downloaded
> to a local hard drive?

Not relevant; no SSL is involved. Nothing changes there, it should work either way.
(In reply to comment #16)
> Alternatively, mozdev.org update server could be signed by the mozilla.org
> cert, pretending that mozdev is internal, because mozdev gets special treatment
> from Mozilla anyways (whitelisted). I prefer the other approach, though.

Huh? Afaik, we don't whitelist mozdev.
OK, you used to, for XPI install. Ignore that part, then.
IMO, mozilla.org does not have the know-how or infrastructure to get into the CA business. May I suggest a compromise? The autoupdater should require an EV root, and it should require an EV root from a specific CA.

This way, we trust only a single CA and are immune to a) poor quality DV cert issuing practices anywhere and b) the possible errors of any CA other than the one we pick. It's like having our own CA, but outsourcing it - which is good, as they are going to be better at it than we are.

I think this is a much more flexible solution than requiring a specific cert (dveditz ably outlined the practical problems with that) and yet still addresses the threats outlined. We can revoke compromised certs, and buy new certs as necessary, without this system breaking.

Gerv
Gerv, this is *NOT* a CA. And it's not a business.
The cert does not sign general certs. It only signs Mozilla servers.
It's basically a self-signed cert, with only one more step involved.
Most admins have created self-signed certs, and there are tutorials on how to sign one cert with another. Surely we should know how to sign a cert?
And keeping one file secure is not that hard.

As said above, I do not believe that EV is secure enough.
I pointed out big loopholes during the EV discussion a year or so ago. The whole security of EV can hang off a single phone call (which should be easy to intercept, technically or via social engineering). And the weakest CA, person, and step determines the strength. I wasn't much concerned as long as it was only about credit card transactions. I am extremely concerned when it comes to my system security and that of 200 million other people.
Ben: it's getting into the business of having to hold highly valuable private keys secret. That's (one part of) the CA business. Your point is that a mozilla.com certificate is very valuable. I agree. Without wanting to diss our IT team, I'm sure they can think of better things to do than put in place procedures and hardware to safely and securely store keys capable of creating one (or the equivalent of one).

> The whole security of EV can hang off a single phone call (which should be 
> easy to intercept, technically or via social engineering).

Can you please point me at where you outlined the scenario in which this is the case?

> And the weakest CA and person and step determines the strength.

That isn't true if you adopt my proposal in comment #19.

Gerv
Whiteboard: [sg:want P2]
Gerv, we already have to protect extremely high value binaries, even on *public* servers. If somebody manages to get access to the servers and alter the Firefox updates, all hell will break loose. Keeping a tiny file securely *offline* is a ridiculously easy task in comparison.

Please consider that AOL has a public root cert in NSS, just for their intranet email. Really, I think this "we can't secure this file" argument is silly.

Again, this is *not* CA business. This is just a self-signed cert (which signs 3 other certs) for your *own* servers, nothing more. It's not valid for web browsing, just for mozilla.org update servers, and just in one case (the updater).

---

Another case to consider:
If you mandate EV, you shut out extension authors with their own update URL. Either you leave their users out in the cold (they'd still be vulnerable to DV certs, like now), or you mandate that extension authors buy EV certs, which almost nobody can afford and which some (natural persons) can't even get per the rules, even if they had the money.
With my proposal, they only have to create a self-signed cert and sign another cert with it, put the fingerprint in the config, and store the key away on a CD. (Note that we already rely on the extension author to keep source code and executables secure, so if they can't keep their system secure, their extension is also vulnerable.) This means they get away cheaper than now.
> > The whole security of EV can hang off a single phone call (which should be 
> > easy to intercept, technically or via social engineering).

> Can you please point me at where you outlined the scenario in which
> this is the case?

http://groups.google.com/group/mozilla.dev.security/msg/f053b6a2c07394d8
(In reply to comment #12)
> 
> The question is: does Mozilla trust PKI & CAs, or not?

No, it isn't a black & white issue of that nature. The question is whether we need them for this use case. It doesn't seem like we should. Would TLS-PSK or some other mechanism of bootstrapping the connection be a better fit?

That said, Gerv's suggestion in Comment 19 would be an improvement.
In reply to comment 25, Ben, that assertion has been thoroughly repudiated.
I didn't see anything in here about other XUL apps so I figured I'd mention that the update mechanism is used by non-MoCo apps like Songbird, Komodo, and others.
Yes, they'd work just like mozilla.org or third-party extension owners: They have their own certs and configure the fingerprint next to the update URL. There is nothing special about mozilla.org or the mozilla.org cert - mozilla.org's cert is only for mozilla.org servers, and you have your own.
(In reply to comment #23) 
> http://groups.google.com/group/mozilla.dev.security/msg/f053b6a2c07394d8

And I refuted your assertion at the time. 

If it's so easy to do, I give you permission to attempt to obtain an EV cert for my domain dsmltools.org. If you can do it, or even if you can outline a detailed set of steps by which it might be done (foiling all checks, not just one) then we'd be very interested and in your debt, and will fix the process.

Gerv
> if you can outline a detailed set of steps by which it might be done
> (foiling all checks, not just one)

I think I showed in the email that the chain of checks hangs off the one test I mentioned, and I hinted at a few ways to circumvent it. You consider them to be not severe, or offset by other checks; I don't think that is the case, as I think those checks are also weak (like checking a phone book).

Your challenge requires me to go through *all* checks. I do not have the time to do that at the moment (or ever). I personally think that I have shown that while the EV checks are higher than current standards, they are not impossible to circumvent with some effort.

And I don't think any CA claims that their checks cannot be circumvented by an attacker with some time. This is all I need to know.

The stakes for the application update are extremely high, and therefore I think we must protect it as well as possible. I believe that my proposal is *much* safer than EV certs, and more practical. If you can break into Mozilla's offices (or your bank's office) and the safe there, you can probably gain root access to the build servers one way or another, too.

Plus, EV is cost-prohibitive for most third-party authors.
> I personally think that I have shown that
> while the EV checks are higher than current standards, they are not impossible
> to circumvent with some effort.

Indeed. No checks are impossible to circumvent. The question is rather whether it's financially beneficial to do so. If it costs you $20,000 to fool the checks, and you make $5,000 from phishing before OCSP shuts you down, you aren't going to bother. The checks don't have to be perfect, they have to be good enough. And I think they are, but am always open to assessing evidence to the contrary.

I think updating Firefox itself should mandate an EV certificate from a specific provider. For extensions, you are right, we can't mandate EV. But we could still lock to a specific provider nominated by the extension author. That would segment the attack space.

Gerv
> The question is rather whether it's financially beneficial to do so.
> If it costs you $20,000 to fool the checks, and you make $5,000 from phishing

This is not about phishing, it's about access to 200 million PCs. Even a single PC can be worth a million by itself, or hold information intimate enough to destroy lives. I can guarantee you that there are such PCs. There is no financial calculation which can reflect the damage done when you break into some of these people's computers. Given that CAs work with such financial calculations and assumptions, they are unsuitable for updates of executables.
For the security team: when I implemented bug 544442 I discussed using EV certs instead and it was decided that implementing bug 544442 would be enough. If it isn't enough, would implementing this allow removing what was implemented in bug 544442 and if not, why not? If this is implemented do you foresee something else needing to be implemented and if so, what specifically? Thanks
(Speaking for myself, a member of the security group, but not for the group: )

> would implementing this allow removing what was implemented in bug 544442?

Yes. If you check for a specific cert, there's no point in checking the issuer, too.

(Unless you don't trust your prefs, which would be fairly silly.)

The protection in this bug would be much stronger, because with bug 544442, we're still dependent on a third party and their security measures, cert checks and organizational weaknesses.

With this bug here, we could be reasonably sure that we really speak with *our* server.

> If this is implemented do you foresee something
> else needing to be implemented and if so, what specifically?

You cannot get a stronger assurance that you're really talking to our server / server farm.

However, we'd still be vulnerable to our server being hacked or the upload chain being hacked.

An even stronger protection would be to sign each file, via PGP, S/MIME, or similar. This protects against hacks of the server farm, too. The files would be signed on the same machine where they are built / compiled. That machine could infect the binaries anyway, and I don't know of a way to detect that, so it *must* be secure anyway; it can therefore just as well sign the binaries and binary diffs.
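
As a sketch of that "sign each file on the build machine" idea, here is a detached-signature example using Ed25519 via the Python `cryptography` package; this is an illustrative choice with placeholder file names, not what Mozilla's release tooling actually does:

# Sketch: the build machine signs every artifact it produces with a
# detached signature; the matching public key is shipped inside the client.
from pathlib import Path
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def sign_artifact(signing_key, artifact):
    signature = signing_key.sign(artifact.read_bytes())
    artifact.with_name(artifact.name + ".sig").write_bytes(signature)

if __name__ == "__main__":
    key = Ed25519PrivateKey.generate()   # in reality a long-lived, protected key
    public_pem = key.public_key().public_bytes(
        serialization.Encoding.PEM, serialization.PublicFormat.SubjectPublicKeyInfo
    )
    Path("update_signing_pub.pem").write_bytes(public_pem)   # ships with the app
    sign_artifact(key, Path("firefox-update.mar"))           # placeholder file name
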
> If you check for a specific cert, there's no point in checking the issuer, too.

Sorry, I misspoke. See comment 13 and 22.

But I do think that file-level protection is even better.
My main concern is having to periodically rework what we implement to ensure the security of updates, with myself being the only person working on app update at the moment.

If this is desired, please also file a releng bug to get the MAR signed.
Re comment #16:  

At the end of the comment (replying to the second paragraph of my comment #14), it is asserted that this change would not impact installation from local hard drives.  

I accepted that assertion until I ran into bug #646962.  That later bug report describes the problem where a signed .xpi file on a local drive cannot be installed while the Internet connection is disabled because the Add-ons Manager wants to verify the signature.  

Yes, I realize that the Add-ons Manager is a different component in Toolkit than Application Update.  However, the same problem could be created while implementing this (bug 471779) if care is not taken to recognize the case of requesting installation from a local drive when there is no Internet connection.
I'm pretty sure we don't actually want to pin SSL to a single cert, but I'll move it to sg:investigate so it remains flagged as a security-related issue. Maybe some libpkix tricks to require the SSL cert be cross-signed by a particular cert? In any case I'd much rather put the effort into signing the update snippets themselves (the original intention of bug 544442) than put more effort into SSL-me-harder on the channel.

Note we also are interested in the emerging specifications about locking a site to a particular cert and/or CA. Although those specs are currently immature (and some would require DNSSEC support and deployment first) I'd also rather put effort there and piggyback off that to satisfy this bug request.
Whiteboard: [sg:want P2] → [sg:investigate]
> cross-signed by a particular cert?

Great idea!

> I'd much rather put the effort into signing the update snippets themselves

Shall I file a new, separate bug for that? (Last part of comment 35)
What is the current status of this issue? (After the Dutch CA blow-up.)
The current state is noted in comment #39.

Regarding the "Dutch CA blow up" there is already a mitigation in place that would have prevented a bogus cert issued from this CA for Mozilla from compromising Firefox.
[Sorry for the bug spam; there are MANY Toolkit bugs about this issue, either with respect to application updates or with respect to addon install/updates.]

I strongly suggest we WONTFIX this. See bug 770594 comment 9 and bug 770594 comment 11. This does not actually help secure users because we need to be able to update Firefox through (corporate and/or local anti-virus) MitM proxies, which aren't (hopefully!) even chained to our built-in trusted roots, and which are basically worthless as far as ensuring the integrity of Firefox updates is concerned.

This is basically a solved problem on Windows because on Windows we use signed MARs for updates. Instead of doing what this bug suggests, we need to extend the MAR signing mechanism used on Windows to the other platforms, and then do the *exact opposite* of what this bug suggests: remove the extra checks of the update service SSL certificate that we already have in place--that is, fix bug 770594. (In fact, I would like to avoid using SSL at all as part of updates, to reduce the number of moving parts in the update process; e.g. to avoid bugs in our SSL implementation affecting the ability of users to download Firefox.)

The reason this extra check would hurt security is that it would disallow (or delay) anybody using a corporate MitM proxy from updating Firefox while they are on that network. But, it is better to (securely) give people Firefox updates as fast as possible.
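
For comparison with the certificate pinning this bug asks for, here is a sketch of the client-side half of content signing: verifying a detached signature over a downloaded update against a public key shipped with the application (the counterpart to the signing sketch earlier in this bug). It illustrates the general idea only; the real signed-MAR format and the signmar/libmar tooling differ:

# Sketch: verify the downloaded update against a baked-in public key before
# applying it. With a check like this, the transport (SSL, MitM proxy, plain
# mirror) no longer determines whether the update is trusted.
from pathlib import Path
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.serialization import load_pem_public_key

def update_is_authentic(update_path, sig_path, pubkey_pem):
    public_key = load_pem_public_key(pubkey_pem)   # Ed25519 key shipped with the app
    try:
        public_key.verify(sig_path.read_bytes(), update_path.read_bytes())
    except InvalidSignature:
        return False
    return True
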
> we need to be able to update Firefox through (corporate and/or local anti-virus) MitM proxies

1) They are already broken. We currently check for a specific CA, and these CAs don't give out MITM certs, from what I know (and "hope", as you do).
2) The admin can reconfigure, on the clients, which cert they check for. The admin can just configure their own cert (or even their own update server).

> This is basically a solved problem on Windows because on Windows we use signed MARs for updates.

Where can I read about this? I agree that signing the content, the executables themselves, is the better way, as long as the tools are free and easy to use. And it works on all platforms.
> The admin can just configure their own cert (or update server, even)

FYI: https://developer.mozilla.org/en-US/docs/Mozilla/Setting_up_an_update_server
(In reply to Ben Bucksch (:BenB) from comment #44)
> > we need to be able to update Firefox through (corporate and/or local anti-virus) MitM proxies
> 
> 1) They are already broken. We currently check for a specific CA, and these
> CAs don't give out MITM certs, from what I know (and "hope", as you do).
> 2) The admin can reconfigure the clients which cert they check for. The
> admin can just configure their own cert (or update server, even)

Even if we were "against" MitM proxies as an organization (which I don't want to debate here), it still doesn't make sense to do things that prevent software updates over these kinds of networks if we have alternative security measures (i.e. signed updates) that make those things superfluous.

> > This is basically a solved problem on Windows because on Windows we use signed MARs for updates.
> 
> Where can I read about this? I agree that signing the content, the
> executables themselves, is the better way, as long as the tools are free and
> easy to use. And it works on all platforms.

https://wiki.mozilla.org/Software_Update:MAR. The code for the tools like signmar is in modules/libmar.
For app update we are going with mar signing as our method to prevent exploits.
Status: NEW → RESOLVED
Closed: 10 years ago
Resolution: --- → WONTFIX