Closed Bug 435082 Opened 16 years ago Closed 12 years ago

Make Mozilla products check public keys against Blacklist for Debian/Ubuntu openssl flaw (CVE-2008-0166)

Categories

(Core :: Security: PSM, enhancement)


Tracking


RESOLVED WONTFIX

People

(Reporter: jamie, Unassigned)

References

Details

User-Agent:       Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9b5) Gecko/2008050509 Firefox/3.0b5
Build Identifier: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9b5) Gecko/2008050509 Firefox/3.0b5

Debian and its derivatives like Ubuntu had a flaw in their OpenSSL PRNG which ended up generating very weak SSL certificates[1][2]. Ubuntu developed openssl-vulnkey and openssl-blacklist[3] (which is now in Debian) to discover weak X.509 certificates and RSA private keys. The blacklist consists of sha1sums of moduli that are known to be weak.

Random values as generated by RAND_bytes() are based on PID, architecture and the status of ~/.rnd. On standard Debian and Ubuntu installations, there are 2^15 PIDs. There are three architectures: little-endian 32bit, little-endian 64bit and big-endian 32bit. If ~/.rnd is empty or non-existent, openssl generates one set of keys, and if ~/.rnd is writable, another (~/.rnd changes depending on the PID, but subsequent runs with the same PID result in an unchanged ~/.rnd). Currently, it is known that 2^16*3 moduli for each key size are weak. Ubuntu has blacklists for 2048 and 1024 bits, with the 512 and 4096 bit lists being generated.
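(Spelling out the arithmetic: 2^15 PIDs x 3 architectures x 2 ~/.rnd states = 196,608 = 2^16*3 candidate moduli per key size.)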

Ubuntu has issued several security updates for packages that have been affected by the broken PRNG, but it would be a good idea if Mozilla products could check the openssl-blacklist for bad certificates to protect their users.

I am the author of openssl-vulnkey and openssl-blacklist, and would be happy to help in any way I can. The full sha1sums of the moduli are in the source code for [3], as well as openssl-vulnkey.

[1] http://www.debian.org/security/2008/dsa-1571
[2] http://www.ubuntu.com/usn/usn-612-1
[3] https://launchpad.net/ubuntu/intrepid/+source/openssl-blacklist/0.2


Reproducible: Always

Jamie: thanks for getting in touch, and for your work on openssl-vulnkey.

What exactly are you suggesting? That Firefox should bundle the list of hashes and check each SSL certificate it encounters against it? I think the size of the list would preclude such a thing. There's also the possibility of a web service, but that has performance and privacy implications.

Gerv
(In reply to comment #1)
> Jamie: thanks for getting in touch, and for your work on openssl-vulnkey.
> 
> What exactly are you suggesting? That Firefox should bundle the list of hashes
> and check each SSL certificate it encounters against it? I think the size of
> the list would preclude such a thing. There's also the possibility of a web
> service, but that has performance and privacy implications.
> 

oops ... gerv, I had some initial discussion with kaie (CCed). This most likely should be tackled in NSS; since we are in the Firefox string freeze, we can probably only disallow connecting to any such sites completely. So maybe we should move this bug to the NSS product.

Anyway, ideas are welcome.

Jamie, kaie asked about possible false positives in our detection algorithm. Could you elaborate on this?

Anyway, confirming bug for now.
Status: UNCONFIRMED → NEW
Ever confirmed: true
Product: Firefox → Core
QA Contact: firefox → toolkit
I am not really sure of the best way to handle this, but it's clear that users of Mozilla products should be protected in some way. As Alexander said, it does seem to make sense to have Firefox check the blacklist and deny the connection or notify the user. If you do this in NSS, then you protect all the products that use NSS, which seems like a good idea.

As for false positives, I don't believe there are any. The blacklist contains sums of moduli with known factors (p and q), so even if a site happened to create a certificate on a non-Debian or non-Ubuntu system, it doesn't matter -- the certificate/key is compromised because the modulus can be factored.

Today it was determined that openssl 0.9.8g (and possibly 0.9.8f, which we never shipped) acts differently than 0.9.8e, and there are an additional 2^15*3 vulnerable keys that are not in the openssl-blacklist package currently. As these versions of openssl only appeared in non-stable releases of Debian, and with Ubuntu Hardy being only a month old, it is not believed that many of these keys are actually in the wild. That said, we are generating an updated blacklist with these keys as well.
jamie: what approximate size will a compressed version of the blacklist be when it's complete for all variants?

Gerv
It depends on how you want to check-- we currently use the least significant 80 bytes of the sum in the installed blacklist (we ship the full blacklist in the source package). Others have suggested using only 40 bytes, but the fewer bytes used means more chance of false positives.

This said, the complete, compressed, full sums should be approximately 12MB, with 80 byte sums approximately 6MB. This is for 1024 and 2048 bit sizes. We are generating blacklists for 512 and 4096 also, but we are going to ship these as addons. Basically, approximately 3MB compressed per bit size with 80 byte sums.
Right. 9MB (3MB compressed x 3 bit sizes) is larger than the rest of Firefox and NSS put together. There's absolutely no chance we can ship that. None.

The only other options are a web service, which has privacy implications, or getting Firefox to download the file on the sly after it's been installed. And for some people, 9MB is a lot (metered tariffs, modems, etc.).

And that's without considering how much engineering we'd need to do to set up the checks.

We've already checked all the certs in the NSS database against the existing lists. Can you outline for us what the security risk is if we didn't do anything else?

Gerv
(In reply to comment #7)
> Right. 9MB (3MB compressed x 3 bit sizes) is larger than the rest of Firefox
> and NSS put together. There's absolutely no chance we can ship that. None.
> 

Another idea would be to download that blacklist while browsing. You already pull down a considerable amount of data for safe-browsing, so maybe that's an option.
Sorry, I see that you just suggested this :/ Anyway, IMO downloading the blacklist would be a reasonable thing to do.
But before we even consider that, we need a threat assessment.

Gerv
Basically, the security of SSL/TLS depends on the modulus being hard to factor and private. Certificates/keys/certificate requests generated with the vulnerable openssl result in these factors being known, and therefore messages can be decrypted if the traffic is available (classic man-in-the-middle (MITM)). Client side certificates have traditionally been used to help prevent MITM, but in this case, because the factors of the moduli are known, client side certificates offer no protection. Any certificate with a vulnerable modulus should be considered completely compromised and offering no protection, regardless of what is used in other parts of the certificate (eg Subject, ...).
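(To spell out why known factors are fatal: the public exponent e is in the certificate, so once p and q are known anyone can recompute a working private exponent d = e^-1 mod (p-1)(q-1); possession of the public certificate alone is enough to rebuild the private key.)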
Another option which may or may not be appropriate that was suggested by a colleague would be to create a Firefox extension for the blacklist and checking. This would allow users to opt-in, and distributors of Firefox and related products could ship this by default. Presumably, extensions would then have to be written for each class of products (eg, Firefox, Thunderbird, Sunbird, etc) that uses SSL.
While it is implied in my comments, I want to explicitly say that this affects
all x509 certificates, RSA private keys and certificate requests. Therefore,
any protocols that can use these certificates have to be considered. For
example, https, pop3s, imaps, ldaps, etc are all affected.
Can we break this down a bit more? Tell me where this is wrong.

1) Certificate requests are vulnerable. So if someone has made a bad request for a (good or bad) certificate and an attacker has obtained a copy of the request (they are often sent by unencrypted email) then they can compromise the certificate for which the request was made.

There is nothing Firefox can do about this situation.

2) Certificates are vulnerable. If a bad certificate (generated on a problematic system) is being used on a website or service, that service can be MITMed by anyone with access to the traffic, because they can decrypt the stream, modify it, and reencrypt it before sending it on.

We could fix this by checking all certificates we encounter against the bad certificate database.

3) Certificates are vulnerable. If a bad certificate is a root certificate in Firefox (discussion in bug 434128) then someone can forge certificates which appear to be signed by that certificate, and therefore impersonate any website via DNS poisoning.

We can check this using the mechanisms discussed in bug 434128. This is already in progress.

Are there any other risks?

If we decide that the Mozilla project needs to do something about case 2, we have two possibilities - which match the two options for the anti-phishing filter. We can have a real-time checking service, which may have privacy issues, or we can have a list download. The latter is even more feasible than for anti-phishing because the list won't change. It's easier to scale and doesn't require server-side work. So, particularly at short notice, I'd go for that option. We may need to find a partner to host the file if our infrastructure can't cope.

There are several possible ways of integrating this technically. We could embed the checking into NSS: NSS could use its newfound ability to make HTTP requests via the client (used for OCSP) to grab the list, then return bad certificate errors if it finds a bad one. This has the advantage of being a single code change which could work for all apps. If we ship the database in a DB format NSS understands already, my guess is that the code would be roughly:

At startup:
- Download database if you don't have it
- Open database

At check time:
- Make hash of certificate (if it doesn't already exist); take lowest 80 bytes
- Try and retrieve record from database using that key
- If success, return error code
- If failure, continue
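
As a rough illustration only (the fixed-width sorted-table layout, the 8-byte prefix and the function name are my assumptions, not a worked-out design), the check-time step could be little more than a binary search over a table of truncated hashes:

/* Illustrative sketch only -- the entry width and file layout are assumptions.
 * The blacklist is assumed to be a flat array of fixed-width truncated
 * hashes, sorted ascending, so membership is a plain binary search.        */

#include <string.h>

#define ENTRY_LEN 8   /* bytes of hash prefix kept per blacklisted modulus */

/* table: ENTRY_LEN-byte entries, sorted ascending; count: number of entries.
 * hash:  first ENTRY_LEN bytes of the SHA-1 of the candidate modulus.      */
static int
blacklist_contains(const unsigned char *table, size_t count,
                   const unsigned char *hash)
{
    size_t lo = 0, hi = count;
    while (lo < hi) {
        size_t mid = lo + (hi - lo) / 2;
        int cmp = memcmp(hash, table + mid * ENTRY_LEN, ENTRY_LEN);
        if (cmp == 0)
            return 1;      /* weak key: caller returns a bad-cert error    */
        if (cmp < 0)
            hi = mid;
        else
            lo = mid + 1;
    }
    return 0;              /* not blacklisted: continue normal validation  */
}

Each lookup touches at most about log2(count) entries, so even a list of a few hundred thousand entries needs fewer than 20 probes.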

Alternatively, we could do it at a higher level using an extension. This has the advantage of not changing core crypto code, but we would need a new extension for each app, and separate code for each protocol. And probably some sort of UI treatment as well. This sounds like more effort for patchy coverage; IMO, and as kai says, we should explore the NSS route first.

NSS team: is it feasible to change NSS in this way? Is there a suitable error NSS could return to the application?

Then we need to decide what blacklists to ship, and how much. 

Given that Netcraft found 870,000 valid certificates in May, the chances of there being even a single accidental false positive with 40 bytes seem tiny. (40 bytes is 2x10^96 possibilities, more than there are atoms in the universe.) 8 bytes (2x10^19) sounds like plenty to me.

We don't need 512 bit sizes - very few people use such weak keys anyway, and I think we have a warning about them. (Opera certainly does). So we need 1024, 2048 and 4096. We could also probably get away with shipping the 0.9.8e duff keys only, given what you say about the limited distribution of 0.9.8g. 

From your figures above, I calculate that (8 byte sums, 3 bit sizes, one version) comes to roughly 900k: 3MB compressed per bit size with 80-byte sums scales to about 300k with 8-byte sums, times 3 bit sizes. Better than 12MB :-)

Comments?

Gerv
Your 3 listed risks are correct IMO. FWIW, the NSS approach makes a lot of sense to me, because the code can be fixed in one place and all products are fixed.

I am in the process of regenerating the blacklists so there is more data in the pristine source. This will include bits, type (writable .rnd, unwritable .rnd and zeroed .rnd), PID, architecture and full hash. This should make it easier for Mozilla (and others) to decide what to include or not include. This will take a while as I am generating 512, 1024, 2048 and 4096, but I will let you know when they are ready.
Some comments in reply to comment 14:
1) The list of 3 vulnerabilities doesn't appear to include intermediate CA 
certs.  But the solution for item 2 should also cover them.

2) NSS already uses a callback to ask the application to make the decision 
regarding received certs.  PSM supplies that callback.  NSS supplies functions
that are commonly used by those certificate decision callbacks.  Applications
are free to use the NSS-supplied functions or not, as they wish.  My point
here is that the problem can technically be solved in either PSM or in NSS.  
If we want a plugin to be able to participate, PSM changes will be required,
I think.

3) As far as making this an addition to NSS, I think the way to do this would
be to add another helper function to the set of helper functions that NSS 
now makes available to the application's certificate authorization callback
(a rough sketch of what that might look like follows after this list).
If it is going to be very large, I would recommend putting it into a separate
shared library, built and packaged separately from the rest of NSS, so 
that we do not burden all developers with building it, and do not burden all
NSS-based apps with the code. 

4) Not all key sizes are powers of 2.  IIRC, the US Government's recommendation for key sizes sets a minimum recommended RSA key size at 3k bits for 2010.
So, I'd expect to find some 3k bit keys, too.

5) I'm not yet convinced this is worth doing.

6) see also bug 435261.
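
To make item 3 concrete, here is a rough sketch (not an agreed design: my_auth_certificate and CERT_CheckWeakKey are hypothetical names, and mapping a hit to SEC_ERROR_REVOKED_CERTIFICATE is just one possible choice) of how an application's certificate authorization callback could consult such a helper:

/* Sketch only: CERT_CheckWeakKey() is hypothetical -- the helper described
 * in item 3 does not exist yet.  The error-code choice is illustrative.    */

#include "ssl.h"
#include "cert.h"
#include "keyhi.h"
#include "pk11pub.h"
#include "hasht.h"
#include "secerr.h"
#include "secport.h"

extern PRBool CERT_CheckWeakKey(const unsigned char *sha1,
                                unsigned int len);   /* hypothetical helper */

static SECStatus
my_auth_certificate(void *arg, PRFileDesc *fd, PRBool checkSig, PRBool isServer)
{
    CERTCertificate *cert = SSL_PeerCertificate(fd);
    SECKEYPublicKey *pub  = cert ? CERT_ExtractPublicKey(cert) : NULL;

    (void)arg;  /* unused; the default callback takes the cert DB handle */

    if (pub && pub->keyType == rsaKey) {
        unsigned char sha1[SHA1_LENGTH];
        /* Hash the RSA modulus and ask the (hypothetical) blacklist helper. */
        if (PK11_HashBuf(SEC_OID_SHA1, sha1,
                         pub->u.rsa.modulus.data,
                         pub->u.rsa.modulus.len) == SECSuccess &&
            CERT_CheckWeakKey(sha1, sizeof(sha1))) {
            SECKEY_DestroyPublicKey(pub);
            CERT_DestroyCertificate(cert);
            PORT_SetError(SEC_ERROR_REVOKED_CERTIFICATE);
            return SECFailure;               /* treat the weak key as fatal */
        }
    }
    if (pub)
        SECKEY_DestroyPublicKey(pub);
    if (cert)
        CERT_DestroyCertificate(cert);

    /* Fall through to the normal chain/validity checks. */
    return SSL_AuthCertificate(CERT_GetDefaultCertDB(), fd, checkSig, isServer);
}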
Nelson: I'm also not yet convinced this is worth doing, but I'm open to being so.

What other ways are there of quantifying or solving the problem? If NetCraft  sent us copies of all 3 million keys they've found, we could test them and get a handle on the size of the problem. If people think that would be a good idea, I can contact my contact at NetCraft.

Do CAs keep records of all the keys they sign? Can they test them and contact owners?

Gerv
see bug 435490 for a bot scanning idea.
Actually, now that I read comment 17, I think you've also thought of it.
Assignee: nobody → kaie
Severity: normal → enhancement
Component: Security → Security: PSM
QA Contact: toolkit → psm
Summary: Blacklist available for Debian/Ubuntu openssl flaw (CVE-2008-0166) → Make Mozilla products check public keys against Blacklist for Debian/Ubuntu openssl flaw (CVE-2008-0166)
(In reply to comment #6)
> It depends on how you want to check-- we currently use the least significant 80
> bytes of the sum in the installed blacklist (we ship the full blacklist in the
> source package). Others have suggested using only 40 bytes, but the fewer bytes
> used means more chance of false positives.

80 bytes? Did you mean bits? Could these be stored as hashes (say, 128 bits per), or is the actual value required for detecting the bad keys?

An alternative to shipping the full list might be to trickle it down to clients at run time, as is done with the Safebrowsing (phishing/malware) list in the browser... It's shipped empty, and clients download data after install. Mine (urlclassifier3.sqlite) is currently -- eep! -- 56MB. Doing this would require significant server-side work, though. And bandwidth: 12 MB * 180 million Firefox users is 2,000 TB.
Justin: see comment #14 for some better size figures. Only 150TB to shift :-)

Gerv
Several comments about all this.

If ever there was a good argument in favor of honoring only certs from 
real approved CAs, and disallowing self-issued certs, this is it. 
The very thought that Mozilla programs are going to need to carry around 
8-10 Megabytes of more stuff just to deal with all those bogus certs for
which no OCSP revocation service exists is discouraging.  

This incident does NOT show that the SSH-model is the solution, and that CAs
are worthless (as so many assert).  On the contrary, it shows the value of 
CAs and of OCSP revocation.  It also demonstrates the need for revocation
that scales up to large numbers, which CRLs don't, but OCSP does.

If our choices are:
a) carry around 8MB of extra crud to continue to deal with self-issued certs,
and expect this to grow each time that some other flawed software is deployed,
or 
b) honor only certs from CAs that do revocation, 
it's clear to me which of those makes the most sense for Mozilla.

This incident also demonstrates the value of periodic FIPS evaluation of 
crypto code.  (NSS has been tested and passed FIPS testing 4 times now.)
I wish we could get a little more press mileage out of this.  
(In reply to comment #17)

> What other ways are there of quantifying or solving the problem? 

For an article we contacted all major CAs, including VeriSign (also running Thawte, GeoTrust, and RapidSSL), Comodo and a couple of German ones. 
The results varied a little bit, but most of them found about 3 to 5 percent of weak keys among their customers. 
That matches our quick check of almost 4500 web servers offering valid SSL certificates (i.e. issued by a trusted CA and with the CN matching the hostname). We found roughly 3 percent were *still* using weak keys (1st June).

Taking the cited Netcraft number of "870,000 valid certificates" you get around 26,000 weak but valid certificates out there. 

BTW: We asked if VeriSign and Comodo are going to contact the owners of weak keys -- they won't. They are happy with their press releases that got them "good coverage".

Now for my 2cents:

From a user's perspective SSL is almost unusable right now. In the default configuration most current browsers, including FF 2.x, do no online verification with CRLs or OCSP, so they never see that a cert has been revoked. On the other hand, the user has no way to know if a site owner published a weak certificate at some point in the past that has not yet expired. How should I trust?
 
There is a weak certificate for a bank out there, valid until 2011. Whoever has it could set up a fake banking site. I have a couple of hundred weak certificates; I bet others have thousands of them. They could all be used to set up spoofed sites with correct certificates. Combined with a big pharming attack, this would be a feast for fraudsters.

So action is necessary, because the only place to mitigate this is in the browsers. We definitely need some kind of extension to check for the weak certificates. I am not sure if shipping it with the standard download package is the right approach. Making it an optional (recommended) extension and activating OCSP by default in FF 2 might be a good compromise.


bye, ju


PS:
If you check the relevant CRLs like that one from Verisign:

http://SVRIntl-crl.verisign.com/SVRIntl.crl

you find that they revoked about 900 certs in the relevant period from 13-31 May. I am not sure if this is much, or rather enough -- taking into account that they revoked about 400 in the same time frame last year, it seems too few to me.
(In reply to comment #21)

> On the contrary, it shows the value of CAs and of OCSP revocation.
  
OCSP is fine but only if weak certificates do get revoked. 

I wrote:
> If you check the relevant CRLs like that one from Verisign:
> 
> http://SVRIntl-crl.verisign.com/SVRIntl.crl
> 
> you find that they revoked about 900 certs in the relevant time from 13-31 May.

This is only an average rate and even slightly lower than in March or February. Not exactly what you expect after an incident like this.

bye, ju
(In reply to comment #17)
> Do CAs keep records of all the keys they sign? Can they test them and contact
> owners?

Yes, this is obviously a must. I would like to know about a CA which doesn't have these records.

How about blacklisting all the bad certs instead of the bad keys?
The problem is that any server that had a vulnerable cert remains vulnerable even after the cert is replaced, until the vulnerable cert expires, unless the CA manages to revoke the key in some way that actually works.

Akamai, a very big content distribution provider used by MANY organisations including the German Finanzamt (equivalent to the IRS), had a weak key. If I put
127.0.0.1 a248.e.akamai.net
into my hosts file and run an Apache locally with the broken cert (cert+key got published in some forums), I can use Firefox 3 to connect to https://a248.e.akamai.net (which is my local machine because of the hosts entry!) without any warnings in a default setup of FF3. I could do a MitM on a network and deliver a trojan to anyone who tried to download the official software for tax return reports, even though the certificate has already been changed. (Akamai is used by many others too, see https://www.pentagon.mil/ or the ATI driver download from http://game.amd.com/us-en/drivers_catalyst.aspx?p=xp/mobility-xp and probably many many more, so this single case alone is quite critical.)

I neither know nor care why this works, but I know it does and I know it is not good that it does. Probably the CA did not revoke the cert correctly. It is not enough to blacklist one cert even if you just want to close the Akamai hole, because they might have had multiple vulnerable certs, and a single one is enough!

This shows how necessary it is to include a full blacklist of weak keys; I don't think anything else will help. If such an important key has not yet been revoked in a working way, what about the thousands of less visible keys? The RSA blacklists for 1024 and 2048 are 12.8 MB total in *full uncompressed ASCII (hex)* format, which is not that big if you consider the 8 MB urlclassifier2.sqlite. Using a database structure (or even just an ordered binary file), it should be quite fast to look up whether a cert is included; storing the data in binary form will cut the size in half, and reducing the bit length can help further. By using a binary format and reducing the length from 160 (it's bits, btw) to 80 bits, we would reduce the size to 3 MB, which seems acceptable to mitigate such a big risk. 

My suggestion: Issue a FF3 update ASAP that includes (or downloads) the blacklists, and shows a warning if a server uses a vulnerable cert. When writing the warning, consider that it is possible that stupid admins forgot to change the certs (so the warning MAY be shown on legitimate sites), but it is also possible that a site already using a new cert gets MitM-ed with an old cert.
The issuer of Akamai's certs relies on CRLs, rather than OCSP, for revocation.
FF3 does OCSP but does not do automatic CRL fetching. That's something that
we hope to fix very soon.  
Seriously Nelson? This is a scoop!!! Which bug number?

In any case CRL fetching is certainly useful also for fail-over service.
(In reply to comment #26)
Jan, if you use Windows, can you test IE against your local
Apache server with the broken cert?  You need to make sure
"Check for server certificate revocation" is enabled in
Internet Options > Advanced > Settings > Security.
GTE CyberTrust includes a CDP URI in its EE certs, so I expect the revocation to be recognized in IE.
(In reply to comment #30)
IE did detect it, but only after I manually enabled CRL checking (IE7 updated from IE6 on XP Home SP2 -- maybe I messed up the setting a few years ago or so). I checked on a Vista computer and that one had it enabled (I assume it was the default there).

It's a shame IE is more secure than FF here.

However, AFAIK Akamai needed well more than a week to replace the key and the CA needed weeks to revoke the cert. The cert was revoked 2008-05-28, five days after this blogpost http://blog.fefe.de/?ts=b6c9ec7e and ten days after the compromised private key was publicly posted. The currently used key is valid from May 21. Add in the fact that CRLs can get cached by the browsers, and the problem becomes apparent.

So although using the CRLs is really necessary, the blacklist should still be implemented in addition to it (this key was quite an important one, and it was a big company that should have noticed, and still it took over half a month to change and revoke the key - imagine how the situation will look with other keys). This would also help to make administrators notice the problem.

We could of course also discuss the issue until all the certs have expired, but that does not sound like a useful solution to me ;-)
I now have a working standalone implementation of a perl/shell blacklist converter that creates a small binary blacklist, and the corresponding C++ algorithm that checks whether a hash is in the list, VERY quickly (max. 25 list accesses, so the list might not even need to be loaded into RAM; no other complicated computations).

The false positive rate depends on the size of the blacklist: the 7.2 MB one has 4 false positives per billion certs checked, the 6 MB one has 1091, i.e. 0.0001% of the checked certs. All size values are uncompressed, ready-to-use sizes and contain all hashes from openssl-blacklist_0.4.2.tar.gz.
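(For reference, assuming the stored entries behave like uniformly distributed truncated hashes, the expected accidental-match rate is roughly N / 2^b per certificate checked, where N is the number of blacklisted entries and b is the number of bits kept per entry -- so every extra byte kept per entry cuts the false-positive rate by a factor of 256. That is the size/accuracy trade-off behind the two figures above.)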

According to my tests, the 6 MB binary blacklist can be compressed to 5 MB with gzip, not at all with bzip2, and to 3.5 MB with self-extracting 7zip.

If this is considered useful and someone wants to implement this, I can create a cleaned-up version of the C++ code and submit it. I don't think a 4 MB download is too much considering all the malware/phishing filters.

There is already a non-free extension that does check against uncompressed blacklists. If someone wants to make a free version using these "compressed" blacklists, tell me.
>Ubuntu has blacklists for 2048 and 1024 bits, with ***512*** and 4096
>lists being generated.

512 bits for RSA and discrete logarithm in F_p^* should be considered compromised - see Bug 377548

...despite the blatant lies quoting "universe" here:
http://www.mccune.cc/PGPquotes.htm
Hi, what is the current status of implementing a key revocation system? http://codefromthe70s.org/sslblacklist.asp seems to do an OK job as an extension, although it appears to be copyrighted. I would like to see Firefox get some SSL key checking system going. If I go to foo.com and foo.com has a known revoked key, I should be told that, or a note / icon where the lock is should look different. Upon going to an SSL-protected website for the first time and finding a weak key, Firefox should be able to prompt the user to report the key to Firefox or to the system admin. There are still weak SSL keys being used. So I propose that for MAIN / IMPORTANT websites (such as banks and others that are of high importance) there should be a small list or a small method of checking (prompt the user perhaps? --> you could first check whether the key's generation date falls within the release window of the weak openssl package etc. --> reduces the amount of user pain / other checking required). 

In addition, Firefox comes with CA certs ... but not the CRLs. I think that in a future update Firefox should include the CRLs that go with those CA certs / included cert bodies.
We have no plans to implement such a system.

Gerv
reassign bug owner.
mass-update-kaie-20120918
Assignee: kaie → nobody
(In reply to Gervase Markham [:gerv] from comment #36)
> We have no plans to implement such a system.

I suggest closing this as "WONTFIX": it has not been implemented, there are no plans to ever implement it, and five years after the disaster all affected certificates will have expired or been revoked by CAs, so implementing it now would not make any sense.
Marked the bug "WONTFIX".
Status: NEW → RESOLVED
Closed: 12 years ago
Resolution: --- → WONTFIX