Closed Bug 556734 (CVE-2010-3177) Opened 14 years ago Closed 14 years ago

XSS in gopher parser when parsing hrefs

Categories

(Core :: Networking, defect)

defect
Not set
normal

Tracking


RESOLVED FIXED
Tracking Status
blocking2.0 --- final+
blocking1.9.2 --- .11+
status1.9.2 --- .11-fixed
blocking1.9.1 --- .14+
status1.9.1 --- .14-fixed

People

(Reporter: reed, Assigned: mayhemer)

References


Details

(Whiteboard: [sg:high][critsmash-high:investigate])

Attachments

(2 files, 1 obsolete file)

Robert Święcki of Google Switzerland GmbH reported an XSS in the gopher parser via security@.

He also states "not sure what's the FF same origin policy and whether it's possible to steal cookies or perform XMLHttpRequest(), but it's probably worth fixing".

gopher://glog.fosconetwork.org/0/billets/rejet-hadopi-09-04-2009.?&a=http://www.google.com%27%3CZZ%22%3E%3C/a%3E%3Cscript%3Ealert%283%29%3C/script%3E
1. Wireshark missing support for gopher is probably a sign that this should get killed (Actually biesi told me it already was, about a year ago - I'm not really keeping up on this)
2. I'm curious how this was found
3. The bug appears to be in the txt to html converter:

http://mxr.mozilla.org/mozilla-central/source/netwerk/streamconv/converters/nsTXTToHTMLConv.cpp#183

It finds the strings it wants to htmlify and then treats the 'stuff to replace' as everything back to the previous whitespace and forward to the following whitespace. That sounds like it was originally designed for emails, but:

gopher://glog.fosconetwork.org/0/foo@gmail.com%22%3E%3Cscript%3Ealert%283%29%3C/script%3E

also has the same issue. The JS can't have any whitespace in it, but that's not really a huge limitation, I think.
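To make the failure mode concrete, here is a small Python sketch (not the actual nsTXTToHTMLConv code, just an imitation of its approach) of the "expand to the surrounding whitespace, then wrap in an anchor" logic. It shows how a whitespace-free payload ends up, unescaped, in both the href attribute and the element body:

```python
import re

def naive_linkify(text):
    # Mimic the converter's approach: treat everything between the
    # whitespace before and after an "http://" hit as the link, and
    # wrap that whole span in an anchor -- with no escaping at all.
    def repl(match):
        link = match.group(0)
        return '<a href="{0}">{0}</a>'.format(link)
    return re.sub(r'\S*http://\S*', repl, text, count=1)

# A normal URL is linkified harmlessly...
print(naive_linkify('see http://example.com here'))

# ...but a whitespace-free payload is duplicated into the markup:
print(naive_linkify('see http://example.com"><script>alert(3)</script> here'))
```

Because the captured span is emitted twice (once in the attribute, once as the link text), the injected markup appears twice in the output, matching the "alert is displayed two times" observation later in this bug.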

This seems to date back to v1.1 of the file in CVS - anyone got NS4 around to test?

Using nc on localhost, I can set a cookie using document.cookie and then read it again with a different response, and also read a cookie I set on localhost through http (with a cookiepath of /; the gopher path is the whole URI including the leading /0)

It's also not directly an XSS - it's a bug in parsing the response, so a malicious gopher server could send any response it wants, regardless of the query. In this case the gopher server sends the text of the request back in the response (it interprets it as a directory read for an invalid directory, which is where the '3' at the front comes from).
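To illustrate that point: a throwaway malicious "gopher" server only has to accept a connection and write an arbitrary response; the selector it receives is irrelevant. A minimal Python sketch (the payload framing is a rough, made-up imitation of the '3...' error line above, not real gopher framing):

```python
import socket
import threading

# Attacker-chosen response: an error-style line carrying markup.
PAYLOAD = b'3<script>alert(3)</script>\terror.host\t1\r\n.\r\n'

srv = socket.socket()
srv.bind(("127.0.0.1", 0))          # bind to any free port
srv.listen(1)
port = srv.getsockname()[1]

def serve_once():
    conn, _ = srv.accept()
    conn.recv(1024)                 # read (and ignore) the selector
    conn.sendall(PAYLOAD)           # answer with the canned response
    conn.close()

t = threading.Thread(target=serve_once)
t.start()

# Client side: whatever selector we send, we get the payload back.
cli = socket.create_connection(("127.0.0.1", port))
cli.sendall(b"/any/selector\r\n")
data = b""
while True:
    chunk = cli.recv(4096)
    if not chunk:
        break
    data += chunk
cli.close()
t.join()
srv.close()
print(data)
```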

I'm not active in moz development any more, so I'll leave someone else to fix this, or just kill it off completely.
We could consider killing off gopher entirely here, but this really is a bug in nsTXTToHTMLConv, yeah.  Honza, do you have time to look, by any chance?
Ad 2: Don't remember really, manual testing; something like 'let's put a lot of non-alnum characters in the input and see what is going to happen'

Ad 'The JS can't have any whitespace in it, but that's not
really a huge limitation, I think.': Yup, it'd probably be trivial to overcome this, say with eval(String.fromCharCode(x,y,z)) or the like.
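As a sketch of that trick: any JavaScript, whitespace included, can be rewritten as a whitespace-free expression by encoding it into character codes. The helper below is hypothetical, just to show the shape of the resulting payload:

```python
def whitespace_free(js):
    # Encode arbitrary JS as char codes; the resulting eval expression
    # contains no whitespace, so the converter's whitespace delimiters
    # no longer constrain the payload at all.
    codes = ",".join(str(ord(ch)) for ch in js)
    return "eval(String.fromCharCode({}))".format(codes)

print(whitespace_free("alert( 3 )"))
```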

But it reminded me about another issue. It seems that FF has a nasty habit of treating slash characters (/) as delimiters in tags. E.g.

<img/onerror="alert(3)"/src="http://www.google.com/zz">

is perfectly equivalent to:

<img src="http://www.google.com/zz" onerror="alert(3)">

While we're on the subject of whitespace, maybe we could also change this behaviour, as it makes writing a reasonable HTML sanitizer a somewhat harder task... just saying ;).
For the slash thing, the behavior is actually required by the HTML5 spec (and is implemented in browsers other than Firefox).  If you want to write an HTML sanitizer, use the HTML5 parsing algorithm in your sanitizer.
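That slash handling can be observed outside Firefox too. For instance, Python's html.parser follows HTML5-style tolerant tokenization and likewise treats stray slashes inside a tag as attribute separators; a rough check, assuming the stdlib parser's current behaviour:

```python
from html.parser import HTMLParser

class AttrCollector(HTMLParser):
    # Records the tag name and attributes of the first start tag seen.
    def __init__(self):
        super().__init__()
        self.tag = None
        self.attrs = None

    def handle_starttag(self, tag, attrs):
        if self.tag is None:
            self.tag = tag
            self.attrs = dict(attrs)

p = AttrCollector()
p.feed('<img/onerror="alert(3)"/src="http://www.google.com/zz">')
print(p.tag, p.attrs)
```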
(In reply to comment #2)
> Honza, do you have time to look, by any chance?

I took a short look at this.  I hope to be able to patch this in the second half of the week.  I have important doctor appointments these days.
(In reply to comment #5)
> (In reply to comment #2)
> > Honza, do you have time to look, by any chance?
> 
> I took a short look at this.  I hope to be able to patch this in the second
> half of the week.  I have important doctor appointments these days.

poke... been quite a while.
Honza, can you please have a look at this before 1.9.3 final, i.e. sometime this summer?
Assignee: nobody → honzab.moz
blocking2.0: --- → final+
Gopher seems to be the only consumer of nsITXTtoHTMLConv. There's also mozITXTToHTMLConv used by the spellchecker code. Dunno if both versions have this bug nor why we have two copies. Also haven't checked any addons to see if this is used by any of them.

But assuming it's not, we can remove nsTXTToHTMLConv from the tree when we replace gopher with a js version.
(In reply to comment #2)
> We could consider killing off gopher entirely here, but this really is a bug in
> nsTXTToHTMLConv, yeah.  Honza, do you have time to look, by any chance?

(In reply to comment #8)
> Gopher seems to be the only consumer of nsITXTtoHTMLConv.

Can we _please_ just kill Gopher and be done with this mess?
To summarize:

- on trunk, we would like to remove the Gopher support completely, i.e. remove the related protocol classes and all other code used exclusively by this protocol

- on branches, do the same (probably not) or, fix the actual issue with nsTXTToHTMLConv?
(In reply to comment #10)
> - on trunk, we would like to remove the Gopher support completely, i.e. remove
> the related protocol classes and all other code used exclusively by this
> protocol

That's not entirely true. See some of the recent comments in bug 388195 about some of the issues related to removing gopher.

> - on branches, do the same (probably not) or, fix the actual issue with
> nsTXTToHTMLConv?

We're not going to remove gopher support on the branches (see bug 388195, comment #117).

We just need to fix nsTXTToHTMLConv on both trunk and branch, ignoring the outcome of bug 388195, as that bug is still up in the air.
Whiteboard: [sg:high] → [sg:high][critsmash-high:investigate]
Attached patch v1 (obsolete) — Splinter Review
So, the response is (some non-printables are excluded):

3'/billets/rejet-hadopi-09-04-2009.?&a=http://www.google.com'<ZZ"></a><script>alert(3)</script>' does not exist (no handler found)error.host1

- we find http://
- then we lookup backward (before http://) for a delimiter (one of \t \r \n or space)
- then forward
- we get a result, in this case:

3'/billets/rejet-hadopi-09-04-2009.?&a=http://www.google.com'<ZZ"></a><script>alert(3)</script>'

- and take this whole string (s) as an argument for <a href="(s)">(s)</a>, but thanks to the "> after ZZ we close the tag too soon, and the rest of the string leaks into the HTML content twice; the alert is displayed two times

Suggestions:
- remove this horrible link-injecting code altogether
- do NS_EscapeURL with the esc_Minimal flag, as the patch does (this actually works as a simple HTML sanitizer and preserves the functionality of an otherwise correct URL)
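The effect of the escaping suggestion can be sketched in a few lines of Python. The character set below is an assumption standing in for esc_Minimal (the real set lives in Mozilla's escaping code), but it shows why the payload becomes inert while a normal URL survives unchanged:

```python
def escape_link_text(s):
    # Percent-encode the characters that could break out of an href
    # attribute or inject markup; leave everything else untouched.
    # NOTE: this character set is an assumption, not esc_Minimal itself.
    unsafe = ' "\'<>'
    return "".join("%{:02X}".format(ord(c)) if c in unsafe else c
                   for c in s)

payload = 'http://www.google.com\'<ZZ"></a><script>alert(3)</script>'
print('<a href="{0}">{0}</a>'.format(escape_link_text(payload)))
```

After escaping, the span contains no quotes or angle brackets, so it can no longer close the attribute or open a new tag, while a clean URL round-trips unchanged.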
Attachment #472796 - Flags: review?(bzbarsky)
Comment on attachment 472796 [details] [diff] [review]
v1

>+++ b/netwerk/streamconv/converters/nsTXTToHTMLConv.cpp
>+        if (NS_EscapeURL(linkTextUTF8.BeginReading(), back-front, esc_Minimal, escaped)) {

back-front is wrong, since that's the length in UTF-16 codepoints.  You want linkTextUTF8.Length(), no?

>+            linkText = NS_ConvertUTF8toUTF16(escaped);

  CopyUTF8toUTF16(escaped, linkText);

r=bzbarsky with the above fixed.  Can we add a test?
Attachment #472796 - Flags: review?(bzbarsky) → review+
(In reply to comment #13)
> back-front is wrong, since that's the length in UTF-16 codepoints.  You want
> linkTextUTF8.Length(), no?

No, back and front are both filled by RFindCharInSet and FindCharInSet respectively.  The result is a character position in the string, which is compatible with both the UTF-16 and UTF-8 encodings, if I understand correctly.

However I'll change this to:
NS_EscapeURL(linkTextUTF8.Data(), linkTextUTF8.Length(), esc_Minimal, escaped), as one of the NS_EscapeURL overloads allows.
Updated to comments.  Thanks for review.
Attachment #472796 - Attachment is obsolete: true
Attachment #474364 - Flags: review+
(In reply to comment #13)
> Can we add a test?

I'll implement a simple "gopher" server for this case using nsIServerSocket, or do you have any simpler idea to do this?
Changing flags, if I understand correctly this is no longer applicable to mozilla-central.
blocking1.9.1: --- → ?
blocking1.9.2: --- → ?
blocking2.0: final+ → ---
> that is compatible with both UTF-16/8 encoding

No.  It's not.  It's very very very not.  The number of UTF-16 codepoints in a UTF-16 string and the number of bytes in a UTF-8 version of the same string need not be the same.  One could be bigger than the other.  Or vice versa.  And note that if the UTF-8 length is smaller than the UTF-16 one what you had there was an attempt to escape memory you don't own.  So that change to use linkTextUTF8.Length() is _very_ important.
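The mismatch is easy to demonstrate: for any non-ASCII text, the UTF-8 byte length and the UTF-16 code unit count diverge, so passing `back-front` (a UTF-16 count) as the byte length of the UTF-8 buffer is wrong. A quick check:

```python
s = "příliš žluťoučký kůň"  # any non-ASCII text will do

# UTF-16 code units (2 bytes each for BMP characters like these):
utf16_units = len(s.encode("utf-16-le")) // 2
# UTF-8 byte length (accented Latin letters take 2 bytes each):
utf8_bytes = len(s.encode("utf-8"))

print(utf16_units, utf8_bytes)
```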

For testing this, can't you just send a simple bit of text through this stream converter and see what the output looks like?  No need for gopher.
And this can probably be triggered on m-c by things other than gopher (mailnews, for example, and so forth).  Please don't mess with the blocker flags unilaterally?
blocking2.0: --- → final+
I was convinced this component had been removed from mozilla-central as part of bug 388195; therefore I changed the flags.  I should have re-checked that.
Oh, we also have a mozTXTToHTMLConv, huh.  Looks like nsTXTToHTMLConv is still in the tree, but we need to check which one is still used.
(In reply to comment #21)
> Oh, we also have a mozTXTToHTMLConv, huh.  Looks like nsTXTToHTMLConv is still
> in the tree, but we need to check which one is still used.

mozTXTToHTMLConv is used here [1], seems to be part of the mochitest kit only, and we do escaping on links here [2].  Also from the description: "Currently only functions to enhance plain text with HTML tags. Stream conversion is defunct."

[1] http://mxr.mozilla.org/mozilla-central/source/testing/mochitest/browser-harness.xul#195
[2] http://mxr.mozilla.org/mozilla-central/source/netwerk/streamconv/converters/mozTXTToHTMLConv.cpp#495
Boris, do you want to review the test?
Attachment #474364 - Attachment is obsolete: true
Looks ok to me.
Comment on attachment 475134 [details] [diff] [review]
v1.2 (includes test)

Thanks.
Attachment #475134 - Flags: review+
Attachment #474364 - Attachment description: v1.1 → v1.2 (no test)
Attachment #474364 - Attachment is obsolete: false
Comment on attachment 474364 [details] [diff] [review]
v1.2 (no test) [Check-in comment 26] [Check-in 1.9.2 comment 27] [Check-in 1.9.1 comment 28]

http://hg.mozilla.org/mozilla-central/rev/e8631a66c042
Attachment #474364 - Attachment description: v1.2 (no test) → v1.2 (no test) [Check-in comment 26]
Status: NEW → RESOLVED
Closed: 14 years ago
Resolution: --- → FIXED
blocking1.9.1: ? → .14+
blocking1.9.2: ? → .11+
Comment on attachment 474364 [details] [diff] [review]
v1.2 (no test) [Check-in comment 26] [Check-in 1.9.2 comment 27] [Check-in 1.9.1 comment 28]

http://hg.mozilla.org/releases/mozilla-1.9.2/rev/55bad5ea82c9
Attachment #474364 - Attachment description: v1.2 (no test) [Check-in comment 26] → v1.2 (no test) [Check-in comment 26] [Check-in 1.9.2 comment 27]
Comment on attachment 474364 [details] [diff] [review]
v1.2 (no test) [Check-in comment 26] [Check-in 1.9.2 comment 27] [Check-in 1.9.1 comment 28]

http://hg.mozilla.org/releases/mozilla-1.9.1/rev/e68c015fafa0
Attachment #474364 - Attachment description: v1.2 (no test) [Check-in comment 26] [Check-in 1.9.2 comment 27] → v1.2 (no test) [Check-in comment 26] [Check-in 1.9.2 comment 27] [Check-in 1.9.1 comment 28]
Why did this land without approval? *ALL* branch patches require explicit approval. blocking+ does not mean you can land. Please keep that in mind for the future.
Simple mistake, I'm sure. Any harm done (I can't see any)?
Depends on: 600371
Alias: CVE-2010-3177
Group: core-security