Closed Bug 610162 Opened 14 years ago Closed 14 years ago

SHA-512 and SHA-384 hashes are incorrect for inputs of 512MB or larger when running under Windows and other 32-bit platforms (Fx 3.6.12 and 4.0b6)

Categories

(NSS :: Libraries, defect, P1)

x86
All
defect

Tracking

(status1.9.2 wanted, status1.9.1 wanted)

RESOLVED FIXED
3.12.9

People

(Reporter: dan, Assigned: briansmith)

Details

(Keywords: platform-parity, Whiteboard: [sg:nse][sg:vector-X] for Firefox, possibly worse for other NSS users)

Attachments

(5 files, 1 obsolete file)

User-Agent:       Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/534.11 (KHTML, like Gecko) Chrome/9.0.570.0 Safari/534.11
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-GB; rv:1.9.2.12) Gecko/20101026 Firefox/3.6.12

Using nsICryptoHash.updateFromStream() with the SHA-512 algorithm, the results returned when running under Windows are incorrect for files of exactly 512MB or larger.

I've tested the same XPI under the following hardware / OS / browser combos:

64-bit hardware, Windows 7 64-bit, Fx 3.6.12
64-bit hardware, Windows XP 32-bit (in a VM), Fx 4.0b6
64-bit hardware, Mac OS X 10.6.4 32-bit, Fx 3.6.12
64-bit hardware, Mac OS X 10.6.4 32-bit (in a VM), Fx 3.6.12

Both Windows versions gave correct hashes until the file size hit exactly 512MB (536,870,912 bytes), when the hashes returned became incorrect. Both Mac OS X instances gave the correct hash regardless of the file size.

I've verified the hash using the Mac OS X terminal, and 5 independent Windows applications (dp-hash, KRyLack, DivHash, Febooti Tweak, HashCalc)... they all return the same hash, so it really does appear as if Fx running under Windows returns incorrect hashes when the file is 512MB or larger.

See the attachment "testnsicrypto.xpi" for a very simple add-on which will let you hash files (it's available at the bottom of the "Tools" menu after installation).

Note 1: The add-on runs on the main thread because I wanted to rule out background-thread execution as the cause of the issue, so you have to ignore the "Not responding" message in the title bar until the hashing process has finished (it takes about 15 seconds per 500MB).

Note 2: Most of the files I tested were created from the Windows 7 command line using "fsutil file createnew <filename> <length in bytes>" - which seems to create files filled with NUL characters. However, I have also tested some other large files, such as virtual disk images, which have real data in them, and saw the same issue.

Note 3: I have attached a ZIP file containing two large files - "lf536870911", which is exactly 1 byte less than 512MB, and "lf536870912", which is exactly 512MB. You will need 1GB of disk space to expand these, even though the download is only 1.2MB!

Note 4: This issue also seems to be present when using the SHA-384 algorithm, but not SHA-256 or earlier. However, the bulk of my testing has been with the SHA-512 algorithm.


Reproducible: Always

Steps to Reproduce:
1. Select a file less than 512MB
2. Hash it on Windows and Mac OS X
3. Select a file of exactly 512MB or more
4. Hash it on Windows and Mac OS X

Actual Results:  
For the file less than 512MB, the hashes are the same across OSes. For the larger file, the Windows version of Fx gives an incorrect hash.

Expected Results:  
I would expect the same hash on both Windows and Mac OS X, regardless of the file size.

For the 536,870,911 byte file, I get the following SHA-512 hash from all programs tested:

ca38ed29e4b841a2d666805615ccf741e11e9a7dae3c06ae5d5a055bfe1deec4f03adab6e3f86b5c843e008001570a782f9a1b8cf730bb2a370e371452d71abd

For the 536,870,912 file, I get the following SHA-512 hash from Fx 3.6.12 (under Mac OS X), as well as the 5 Windows applications mentioned previously:

df68d060d2adafc2c4794407118f8116d000715233b2550302115556380d1d5b018ebce1c7fa412a8bc5e01e097b33db64d1e9117b3f7bdd8925f09b6594590a

However, for Fx 3.6.12 and 4.0b6 running under Windows, I get this hash:

db28f910b037d4d73758c790c5753980ed5903120e242649d092cc3466875559d86d64946020e4b3f8760d3c191428df33ed45fedbc6162bda4a4d2e3e116411
Tested against Fx 3.6.12 and Fx 4.0b6
You will need 1GB of space to uncompress these, even though the ZIP file itself is only 1.2MB!
Keywords: pp
Version: unspecified → 1.9.2 Branch
Summary: SHA-512 hashes are incorrect for files of 512MB or larger when running under Windows (and SHA-384 as well?) → SHA-512 hashes are incorrect for files of 512MB or larger when running under Windows
Summary: SHA-512 hashes are incorrect for files of 512MB or larger when running under Windows → SHA-512 hashes are incorrect for files of 512MB or larger when running under Windows (Fx 3.6.12 and 4.0b6)
Hardware: x86_64 → x86
Not sure if this is related to Bug 383390 or not, as that was marked as fixed back in 2007...
Group: core-security
Whiteboard: [sg:nse] for Firefox, possibly worse for other NSS users
It seems this is a bug in FreeBL's SHA-512 implementation.

I wrote a quick simple test and ran it in the MSVS debugger. CAPI calculates the expected SHA-512 hash and FreeBL calculates the different value, exactly as reported by the reporter. Changing the test case to have one less byte of input results in a match, just as described by the reporter.
Bob, I couldn't change the product/component to NSS/libraries without making the bug public. Note that the bug was public when reported.
Assignee: nobody → nobody
Component: Security: PSM → Libraries
Product: Core → NSS
QA Contact: psm → libraries
Version: 1.9.2 Branch → unspecified
So, here we have a report that two different implementations produce 
different results.  To blindly assume that NSS's one is faulty is 
unwarranted.  NSS's is FIPS 140-2 validated.  Maybe the bug is Microsoft's.  
I'd be much more inclined to use some official NIST published test vectors, 
rather than a comparison with some competitor's result.

I know there is an upper bound on the number of bits that may be included 
in a single SHA hash (for any flavor of SHA) but I don't recall the limits.
I wonder if those limits are relevant here.
(In reply to comment #6)
> So, here we have a report that two different implementations produce 
> different results.

The reporter tested Firefox against itself on multiple platforms and got different results. And, the CAPI results match exactly the results the reporter claims for Firefox on MacOSX and Linux.
From SHA512_End:

#if defined(HAVE_LONG_LONG)
    unsigned int inBuf  = (unsigned int)ctx->sizeLo & 0x7f;
    unsigned int padLen = (inBuf < 112) ? (112 - inBuf) : (112 + 128 - inBuf);
    PRUint64 lo, t1;
    lo = (ctx->sizeLo << 3);
#else
    unsigned int inBuf  = (unsigned int)ctx->sizeLo.lo & 0x7f;
    unsigned int padLen = (inBuf < 112) ? (112 - inBuf) : (112 + 128 - inBuf);
    PRUint64 lo = ctx->sizeLo;
    PRUint32 t1;
    lo.lo <<= 3;
#endif

Earlier in the file, there is an #undef HAVE_LONG_LONG, so the second implementation is used. When the total bytes hashed are greater than or equal to 512*1024*1024, (lo.lo <<= 3) overflows. Commenting out the #undef HAVE_LONG_LONG and rebuilding FreeBL causes my test cases to pass (see upcoming attachment).

I will need help determining the actual correct fix, as I don't understand all the conditional logic regarding which implementation to choose.
Severity: major → critical
Status: UNCONFIRMED → NEW
Ever confirmed: true
Priority: -- → P1
Attached file Updated test
Note that this test program shows that two 256MB inputs have the same result as one 512MB input. That's how I narrowed the problem down to SHA512_End.
I am also concerned that ROTR64a and SHR64a have the same definition:

> #define ROTR64a(x,n,lo,hi) (x.lo >> n | x.hi << (32-n))
> #define  SHR64a(x,n,lo,hi) (x.lo >> n | x.hi << (32-n))
Here is a fix that I think is acceptable. I will post an updated testcase that can be integrated into the NSS test suite tomorrow.
Assignee: nobody → bsmith
Attachment #489100 - Flags: review?(rrelyea)
Attachment #489100 - Attachment is patch: true
Attachment #489100 - Attachment mime type: application/octet-stream → text/plain
Would the same fix be applicable for the SHA-384 algorithm as well, given it seems to suffer from the same issue?
Reed, thanks for fixing my sloppy patch attaching.

Dan, SHA-384 is implemented in terms of SHA-512. That's why my patch to SHA512_End corrects the issue for SHA-384 too:

> void 
> SHA384_End(SHA384Context *ctx, unsigned char *digest,
>		 unsigned int *digestLen, unsigned int maxDigestLen)
> {
>    unsigned int maxLen = SHA_MIN(maxDigestLen, SHA384_LENGTH);
>    SHA512_End(ctx, digest, digestLen, maxLen);
> }

Updated the metadata: it isn't Windows-specific, but it doesn't happen on all platforms either.
OS: Windows 7 → All
Summary: SHA-512 hashes are incorrect for files of 512MB or larger when running under Windows (Fx 3.6.12 and 4.0b6) → SHA-512 and SHA-384 hashes are incorrect for inputs of 512MB or larger when running under Windows and other 32-bit platforms (Fx 3.6.12 and 4.0b6)
Comment on attachment 489100 [details] [diff] [review]
Patch to use LL_SHL and remove MSVC warnings in SHA512.c

r- mostly for the attempts to get rid of warnings. The analysis and actual fix looks correct.

1. The temps are needed for some of the macro cases. If you want to get rid of them, I would prefer something tied more closely to the macros themselves, rather than peppering more __MSVC defines through the file. I may be misreading things, but it appears t1 and t2 are necessary in the LONG_LONG case.

2. The actual fix, the LL_SHL macro, looks correct to me. Just to be clear, *BOTH* Windows platforms fail, not just Win64.

3. RE the LONG_LONG undefine: I've looked in the cvs log, and it is apparently a performance optimization. 32-bit Windows defines a 64-bit long long in the compiler, even though there are only 32-bit registers in the processor. Using the 32-bit SHA512 was deemed faster. For Win64, I would expect that it would be better to allow LONG_LONG to stand.
Attachment #489100 - Flags: review?(rrelyea) → review-
Does ANY Mozilla product EVER push 512MB through any single cryptographic 
hash?  EVER?  

I doubt it, sincerely.  

And if not, then this bug isn't critical, isn't P1, isn't required at all
for Mozilla products.  Other companies' products might want it, but not
Mozilla.
There are download manager add-ons for Firefox that compute hashes of files, and I'm sure that some users of these will want to ensure that the large files they download (VOBs, AVIs, ISOs, DMGs, etc) match known hashes. In fact, I found the issue because of a similar situation.

Surely it'd be a bad thing if these people had no confidence in the download capabilities of Firefox and switched to using other browsers instead? After all, how are they to know that their download isn't corrupted rather than just the hash being returned incorrectly?
Thanks, Dan.  That's interesting input.  It suggests to me that add-ons 
have access to these functions.  Are those c add-ons?  or Java-Script?
Does FF make NSS crypto/hash primitives accessible to JS?
> Thanks, Dan.  That's interesting input.  It suggests to me that add-ons 
> have access to these functions.  Are those c add-ons?  or Java-Script?
> Does FF make NSS crypto/hash primitives accessible to JS?

Comment 1 in this bug is an add-on for Firefox written in JS which utilizes the SHA-512 hashing algorithm, so the answer is 'Yes - add-ons do have access to these functions'. In fact, I discovered this issue during the course of writing a new add-on which uses hashing extensively.

A real world example: Looking at its source, you can see that the DownThemAll download manager add-on (which has been downloaded well over 40 million times) uses the hashing functions.
OS: All → Windows 7
In pre-4.0 and 4.0, hash and HMAC functions are exposed through nsICryptoHash and nsICryptoHMAC to all add-ons, JavaScript and native. All NSS functionality is available (AFAICT) to all add-ons with native code.

In FF 4.0, all NSS functionality is exposed to all add-ons through Javascript using JS-ctypes. The Firefox Sync add-on is an example of such a plugin.
OS: Windows 7 → All
Whiteboard: [sg:nse] for Firefox, possibly worse for other NSS users → [sg:nse][sg:vector-X] for Firefox, possibly worse for other NSS users
Warnings can be handled in another bug.

Note that Firefox on Win64 is still 32-bit (using 32-bit NSS) so the Win64 result above isn't surprising.

How should I integrate the test for this into the test suite? I guess it makes sense to test all of the hash algorithms with inputs crossing various size barriers (512MB; 1GB; 2GB; 4GB), but these tests will run for a very long time. Plus, I think we should avoid creating tests that generate massive files. I looked into modifying bltest to include tests for this, but bltest is very complicated. Do we want to make it even more complicated? If not, I can make a new stand-alone program under cmd/ that does something similar to my original test program, but verifying the results against fixed known-good values instead of against CAPI.
Attachment #489100 - Attachment is obsolete: true
Attachment #489745 - Flags: review?(rrelyea)
adding Elio
Comment on attachment 489745 [details] [diff] [review]
Patch to use LL_SHL

r=wtc.

In the original code for the HAVE_LONG_LONG undefined case, we have:

>-    PRUint64 lo = ctx->sizeLo;

PRUint64 is a struct, so this initializes only the first struct member,
which may be lo.lo or lo.hi depending on the endianness:
http://bonsai.mozilla.org/cvsblame.cgi?file=mozilla/nsprpub/pr/include/prtypes.h&rev=3.43&mark=357-366#357

So that's another (rare) bug fixed by this patch.
Attachment #489745 - Flags: review+
Please ignore everything I said in the previous comment except "r=wtc".
ctx->sizeLo is also a PRUint64, so the statement
    PRUint64 lo = ctx->sizeLo;
is a struct assignment.
Dan,

First, thank you for reporting the bug and for providing the detailed information that we needed to find the problem right away.

We are planning to fix this bug in the next release of NSS. However, according to my understanding of the discussion, some platforms like certain RHEL distributions will not get that fix in their system NSS library and so the fix won't be available for every single install, even after we release an updated Firefox. I do not think that your extension will have any way of detecting whether the bug is fixed. Also, we may (or may not) put in some check in nsICryptoHash such that it won't accept inputs >= 512MB for SHA-384 and SHA-512 on platforms where we use the system NSS. Consequently, you will not be able to rely on SHA-384 or SHA-512 for your extension with the current plan. You can avoid the problem completely by using SHA-256 instead. Thoughts?
(In reply to comment #24)
> 
> We are planning to fix this bug in the next release of NSS. However, according
> to my understanding of the discussion, some platforms like certain RHEL
> distributions will not get that fix in their system NSS library and so the fix
> won't be available for every single install, even after we release an updated
> Firefox. 

This simply means it will take a bit longer until this fix can be rolled out to certain RHEL versions, and it won't affect our general task of releasing a fixed version of NSS.


> I do not think that your extension will have any way of detecting
> whether the bug is fixed.

I've long wanted to be able to use the help/about menu in Mozilla products and see the versions of the currently used 3rd party libraries (such as NSS or NSPR). Maybe we should consider making such version information available through an API call.


> Also, we may (or may not) put in some check in
> nsICryptoHash such that it won't accept inputs >= 512MB for SHA-384 and SHA-512
> on platforms where we use the system NSS.

... and have it throw an error, so at least you'd detect at runtime if the feature is unavailable. And we'd still have to figure out how to detect the NSS configuration and/or version reliably. It might not be worth going down that path. It might be simpler to accept that RHEL versions contain this bug until a fix gets rolled out.


> Consequently, you will not be able to
> rely on SHA-384 or SHA-512 for your extension with the current plan.

I think there is no need to make this broad conclusion. As with any bug, the rollout of fixes is not synchronized across the world.

You will be able to rely on the fix whenever the installed or bundled NSS version contains the fix. The common practice is to add a release note to a product, if it requires specific minimum versions of base libraries.
Comment on attachment 489745 [details] [diff] [review]
Patch to use LL_SHL

r+ rrelyea for both 3.13 and 3.12
Attachment #489745 - Flags: review?(rrelyea) → review+
Does PSM expose methods to save and restore hash contexts? 
If so, one could devise a method to test for a "fixed" SHA-512 hash as
follows:
One time (e.g. in development)
1. construct a hash context containing (2**29) - 1 bytes of input.
2. save it.  
Each time:
3. restore that hash context.
4. Hash in one (or a few) more bytes.
5. Check the result against a known good value.
Nelson, I'm not aware of such a feature.

How do you define "save and restore hash context"?
Is there an NSS API that allows doing that?

I assume you're not proposing to make a bitwise copy, as I suspect that's not safe in general, i.e. if the context object contains pointers to external data structures.
Target Milestone: --- → 3.12.9
Group: core-security
Trunk commit:

Checking in lib/freebl/sha512.c;
/cvsroot/mozilla/security/nss/lib/freebl/sha512.c,v  <--  sha512.c
new revision: 1.16; previous revision: 1.15
done
3.12 branch commit

Checking in lib/freebl/sha512.c;
/cvsroot/mozilla/security/nss/lib/freebl/sha512.c,v  <--  sha512.c
new revision: 1.14.6.1; previous revision: 1.14
done


Marking fixed.
Status: NEW → RESOLVED
Closed: 14 years ago
Resolution: --- → FIXED