Bug 1797231 Comment 0 Edit History


VERSION
Firefox Version: Firefox 106.0.1 (affects all Firefox) stable
Operating System: macOS 12.6, but affects all operating systems

NOTE
I am reporting this bug after being encouraged to do so by Dan Veditz, and have filed a similar bug with the Chromium team. As it affects both browsers and many websites, please do not make this bug public.


REPRODUCTION CASE

I recently ran into an issue where a piece of JavaScript I was working with was setting cookies that broke the cookie parsing library of the programming language I was using. When the library encountered certain characters in a cookie value, it would simply fail to parse any further cookies. Not only did this cause issues with code that interacted with the cookie in question, but it also broke any downstream code that relied on the presence of the other cookies.
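
To make the failure mode concrete, here is a minimal sketch of the kind of strict parsing some server-side libraries appear to do. This is an illustrative assumption, not code from any actual library:

```javascript
// Minimal illustrative parser (an assumption about strict server-side
// behavior, not any real library's code). It accepts only RFC 6265
// cookie-octets in values and aborts at the first violation, which is
// how one "poisoned" cookie can hide every cookie that follows it.
function parseCookieHeader(header) {
  const cookies = {};
  for (const pair of header.split('; ')) {
    const eq = pair.indexOf('=');
    if (eq < 0) continue;
    const name = pair.slice(0, eq);
    const value = pair.slice(eq + 1);
    // cookie-octet = %x21 / %x23-2B / %x2D-3A / %x3C-5B / %x5D-7E
    if (!/^[\x21\x23-\x2B\x2D-\x3A\x3C-\x5B\x5D-\x7E]*$/.test(value)) {
      break; // a strict parser stops here, dropping all later cookies
    }
    cookies[name] = value;
  }
  return cookies;
}

parseCookieHeader('cookieUnicode=🍪; session=abc123'); // → {} (session is lost too)
parseCookieHeader('theme=dark; session=abc123');       // → { theme: 'dark', session: 'abc123' }
```

Note how the poisoned cookie does not just fail on its own: every cookie serialized after it in the header disappears as well, which is exactly the downstream breakage described above.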

This led me down a long and twisted road to understand how all browsers (Firefox, Chromium, and Safari) as well as all language libraries (Python, Go, etc.) behave in the presence of the sometimes confusing and conflicting language in RFC 6265. It was in the process of doing this research (please contact me offline for a copy of the manuscript) that I discovered a series of unusual bugs related to the intersection of how browsers and language libraries parse cookies.

For this particular issue, I discovered that Firefox allows the following characters inside cookie values: htab, space, dquote, comma, backslash, and 0x80-0xFF + Unicode. While allowing these characters is acceptable per RFC 6265bis Section 5, it also enables denial-of-service attacks against numerous websites.

Running this code:

```
document.cookie='cookieUnicode=🍪';
```

Will cause many websites to simply fail to work at all. As of my submitting this bug, that includes both facebook.com (which will forever tell you that "something went wrong") and netflix.com (which will also tell you that "something went wrong", with error code NSES-500). The only way to fix an affected user is to either have them manually clear their cookies or to have the receiving web server / website's JavaScript enumerate the user's cookies and invalidate them.
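
A server-side cleanup along the lines described above might look roughly like this. It is only a sketch: the Domain and Path attributes below are illustrative assumptions, since they would need to match however the poisoned cookie was originally set:

```javascript
// Hypothetical recovery sketch: find cookie values outside the RFC 6265
// cookie-octet range in a raw Cookie header and emit Set-Cookie header
// values that expire them. The Domain/Path attributes are assumptions.
function expirePoisonedCookies(cookieHeader) {
  const expired = [];
  for (const pair of cookieHeader.split('; ')) {
    const eq = pair.indexOf('=');
    if (eq < 0) continue;
    const name = pair.slice(0, eq);
    const value = pair.slice(eq + 1);
    if (!/^[\x21\x23-\x2B\x2D-\x3A\x3C-\x5B\x5D-\x7E]*$/.test(value)) {
      // Expire on the registrable domain too, since the poisoned cookie
      // may have been set with domain=.example.com by a subdomain.
      expired.push(`${name}=; Max-Age=0; Domain=.example.com; Path=/`);
    }
  }
  return expired;
}
```

The catch is that this must be deployed on every affected site before its users recover, which is why browser-side prevention seems preferable.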

In most cases this wouldn't be a huge issue, as it requires code execution to set these "poisoned" cookies in the first place. Unfortunately, cookies are a bit of a special case: due to their cross-origin nature, a subdomain such as poorlysecuredsubdomain.example.com can execute a piece of code such as:

```
document.cookie='cookieUnicode=🍪; domain=.example.com; path=/';
```

Which will permanently break the primary site as well as any of its subdomains. This is a significant issue, as many websites today bind their sensitive cookies to TLD+1 using host-only (__Host) cookies while allowing subdomains to exist with far poorer security practices. As such, an XSS vulnerability or subdomain takeover on a poorly secured subdomain can cause a semi-permanent denial of service on the primary site.

In my discussions with Dan Veditz (as well as his counterpart at Google), it is a bit unclear what the solution to this problem should be. If telemetry shows that not many sites are making use of cookies containing these characters, perhaps the solution is to bar the creation of any cookies containing `0x00-0x08`, `0x0A-0x1F`, and `0x7F-0xFF`, including Unicode, a restriction that would apply both to `Set-Cookie` and to `document.cookie`.
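
As a sketch of what that restriction might check (my reading of the proposed ranges, not actual or planned Firefox behavior):

```javascript
// Sketch of the proposed validation: reject any cookie value containing
// 0x00-0x08, 0x0A-0x1F, or 0x7F and above (which covers 0x7F-0xFF and
// all of Unicode). Note that 0x09 (htab) stays permitted under these ranges.
function valueWouldBeRejected(value) {
  for (const ch of value) {
    const cp = ch.codePointAt(0);
    if (cp <= 0x08 || (cp >= 0x0A && cp <= 0x1F) || cp >= 0x7F) return true;
  }
  return false;
}

valueWouldBeRejected('🍪');     // true: U+1F36A is well above 0x7F
valueWouldBeRejected('abc123'); // false: plain cookie-octets
```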

If the telemetry instead shows that these kinds of cookies commonly exist, then the fix will likely be quite a bit more painful: getting spot fixes across the numerous programming languages, websites, and cookie parsing libraries that make up the web. In only extremely cursory testing, I was able to break both facebook.com and netflix.com, but I suspect the number of affected sites is quite large.


REPLICATION STEPS

> Go to facebook.com or netflix.com (or any subdomain)
> Open up your console
> Run: document.cookie='cookieUnicode=🍪'; 
> Reload the page
