Bug 985097 (Closed): Opened 10 years ago, Closed 10 years ago

Should Input.Read(uint16_t&) deal with endianness?

Categories

(Core :: Security: PSM, defect)

Platform: x86 macOS
Priority: Not set
Severity: normal

Tracking


RESOLVED INVALID

People

(Reporter: st3fan, Unassigned)


Something I noticed while writing tests: I don't think Input.Read(uint16_t&) correctly deals with big-endian systems. Not sure if this is relevant, but it seems important enough to double-check.

(We do have people who still make builds for big-endian systems, like SPARC and PowerPC.)
(In reply to Stefan Arentz [:st3fan] from comment #0)
> Something I noticed while writing tests: I don't think
> Input.Read(uint16_t&) correctly deals with big-endian systems. Not sure
> if this is relevant, but it seems important enough to double-check.

AFAICT, the code is endian-agnostic:

  Result Read(uint16_t& out)
  {
    // Fail unless at least two bytes remain in the input.
    if (input == end || input + 1 == end) {
      return Fail(SEC_ERROR_BAD_DER);
    }
    // Big-endian decode: the first byte is the high-order byte, so the
    // result ends up in native byte order on any host.
    out = *input++;
    out <<= 8u;
    out |= *input++;
    return Success;
  }

More specifically, DER encoding is *always* big-endian, so this code always converts from the big-endian encoding to native byte order, regardless of what the native byte order is.
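
As a sanity check, here is a minimal standalone sketch of the same shift-and-OR decode (the two input bytes are made up for illustration). Because the first byte is treated as the most significant, the result is 300 on any host, whether little- or big-endian:

  #include <cassert>
  #include <cstdint>

  int main()
  {
    // Hypothetical two-byte big-endian field: 0x01 0x2C encodes 300.
    const uint8_t bytes[] = { 0x01, 0x2C };

    // Same shift-and-OR logic as Input::Read(uint16_t&): the first byte
    // is the most significant, so no byte swapping is ever needed.
    uint16_t out = bytes[0];
    out <<= 8u;
    out |= bytes[1];

    assert(out == 300); // holds on little- and big-endian hosts alike
    return 0;
  }

By contrast, something like memcpy()ing the two raw bytes directly into a uint16_t would produce host-dependent results, which is the kind of endianness bug comment 0 is asking about.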
Please see my comment above. Shouldn't we make this RESOLVED INVALID?
Flags: needinfo?(sarentz)
Yeah, I think you're right, Brian. I'll flip this to RESOLVED INVALID.
Status: NEW → RESOLVED
Closed: 10 years ago
Flags: needinfo?(sarentz)
Resolution: --- → INVALID