Closed Bug 818340 Opened 11 years ago Closed 11 years ago

Block cookies from sites I haven't visited

Categories

(Core :: Networking: Cookies, defect)

defect
Not set
normal

Tracking

RESOLVED FIXED
mozilla22
Tracking Status
firefox22 --- disabled

People

(Reporter: bugzilla, Assigned: bugzilla)

References

(Depends on 2 open bugs, Blocks 3 open bugs)

Details

(Keywords: dev-doc-needed, privacy, site-compat)

Attachments

(2 files, 13 obsolete files)

86.95 KB, image/png
Details
72.95 KB, patch
Dolske
: review+
Details | Diff | Splinter Review
See Bug tracking-protection for meta discussion of third-party tracking countermeasures.

The Safari cookie blocking policy is quite straightforward.

1) if an origin is first-party, it has ordinary cookie permissions
2) if an origin is third-party
	a) if the origin already has cookies, it has ordinary cookie permissions
	b) otherwise, the origin gets no cookie permissions

Here's a sketch of what might be required to implement Safari-like third-party cookie blocking.

1) add a new value to the network.cookie.cookieBehavior pref schema
2) in nsCookieService.cpp
	a) add the new pref value as a constant
	b) modify PrefChanged to allow the new pref value
	c) modify CheckPrefs to add support for the new pref value, whether loaded from a CookiePermission per-site preference or a global preference (a rough sketch of this logic follows the list)
		i) if a domain is first-party, return STATUS_ACCEPTED
		ii) if a domain is third-party, check whether any cookies are set by calling CountCookiesFromHost
		iii) return STATUS_ACCEPTED if there are cookies, and STATUS_REJECTED if there aren't
3) in nsICookiePermission.h, define a new permission enum
4) in nsCookiePermission.h, add a new private variable for a cookie manager ref
5) in nsCookiePermission.cpp
	a) grab a ref to the cookie manager on initialization
	b) modify CanSetCookie to add support for the new permission enum
		i) check whether a domain is third-party (see the ACCESS_ALLOW_FIRST_PARTY_ONLY case for complete code)
		ii) if it's a third-party domain, check whether it has any cookies set by calling the cookie manager
		iii) return true if it's a first-party domain or it's a third-party domain and cookies exist, otherwise return false
	c) modify CanAccess to add support for the new permission enum (can duplicate the logic from CanSetCookie)
6) in CookieServiceChild.cpp, add support for loading and passing the new pref value (this could be kinda annoying)
7) add UX support for the new pref
8) add Sync support for the new pref
9) add unit tests for the new pref
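
As promised in step 2c, here's a minimal sketch of the decision logic, written as JavaScript for readability even though the real code would be C++ in nsCookieService::CheckPrefs. It assumes the existing nsICookieManager2.countCookiesFromHost API; the function name and string return values are illustrative only, not the actual patch.

Components.utils.import("resource://gre/modules/Services.jsm");

// Sketch only -- not the actual patch.
// aHost: the cookie's host; aIsForeign: the third-party flag computed upstream.
function checkLimitForeign(aHost, aIsForeign) {
  // First-party requests keep ordinary cookie permissions.
  if (!aIsForeign)
    return "STATUS_ACCEPTED";
  // Third-party: only accept if the host already has cookies set.
  let priorCookieCount = Services.cookies.countCookiesFromHost(aHost);
  return priorCookieCount > 0 ? "STATUS_ACCEPTED" : "STATUS_REJECTED";
}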
Keywords: privacy
Attached patch Proposed Patch (obsolete) — — Splinter Review
Here's a first-pass implementation. Seems to work as expected. Still on the todo list: UI and Sync.
Hi! Thanks for the patch! I'm assuming someone's introduced you to Mozilla processes and such?

Bouncing over to the Cookie component in Bugzilla. Not sure who's doing cookie-related reviews these days, perhaps jduell can suggest a reviewer?
Assignee: nobody → jmayer
Component: General → Networking: Cookies
Product: Firefox → Core
I'm somewhat familiar with the release and contribution processes. Pointers very much welcome.
Attached patch Proposed Patch (obsolete) — — Splinter Review
Here's a revision that includes a rough UI implementation. Screenshot to follow.
Attachment #693704 - Attachment is obsolete: true
Attached image Screenshot —
Flags: sec-review?
Status: UNCONFIRMED → NEW
Ever confirmed: true
Status: NEW → ASSIGNED
Flags: sec-review? → sec-review?(dveditz)
Attached patch Proposed Patch (obsolete) — — Splinter Review
Small tweaks for formatting, naming conventions, and comments.
Attachment #693739 - Attachment is obsolete: true
Attached patch Proposed Patch (obsolete) — — Splinter Review
One more formatting fix.
Attachment #694164 - Attachment is obsolete: true
Attached patch Proposed Patch (obsolete) — — Splinter Review
Added unit tests. I think this is about ready for a first review.
Attachment #694172 - Attachment is obsolete: true
Status: ASSIGNED → UNCONFIRMED
Ever confirmed: false
I find the patch to be quite readable, so nice job there. I could go through and do a code review, but I'm uncertain whether it's worth doing that before the security review takes place. If there are significant revisions required, or the team decides that it's not a feature that belongs in Firefox, I wouldn't want to make you do unnecessary work.
Status: UNCONFIRMED → NEW
Ever confirmed: true
Attachment #694217 - Flags: review?(dveditz)
Attached patch Proposed Patch (obsolete) — — Splinter Review
Better code uniformity for handling of UTF-8 domains.
Attachment #694217 - Attachment is obsolete: true
Attachment #694217 - Flags: review?(dveditz)
Attachment #695013 - Flags: sec-approval?
Attachment #695013 - Flags: review?
Comment on attachment 695013 [details] [diff] [review]
Proposed Patch

Thank you for the patch!

The sec-approval flag relates to landing hidden security bugs, so you don't need to set that :)
Attachment #695013 - Flags: sec-approval?
Comment on attachment 695013 [details] [diff] [review]
Proposed Patch

dolske: can you take a look at the firefox bits (or delegate it)?
jdm: can you take a look at the cookie stuff from a peer perspective?  :)  Thanks guys.
Attachment #695013 - Flags: review?(josh)
Attachment #695013 - Flags: review?(dolske)
Attachment #695013 - Flags: review?
I apologize for letting this languish for so long; I'm really busy with the b2g wrap-up right now, but I'll be able to review this next week.
Comment on attachment 695013 [details] [diff] [review]
Proposed Patch

Review of attachment 695013 [details] [diff] [review]:
-----------------------------------------------------------------

The front-end UI changes look mostly OK, just a few comments. I'd mark r+ now, but keeping the request open to remind myself to try this out in a build just to be sure.

Sid: have you talked with various product folk about the default pref changing here? I believe it's very much wanted by the privacy/security teams, but AIUI there's potential for breaking sites (beyond "tracking" sites, where the breakage is intentional ;). Before this ships folks should know what the risks are so no one is surprised. Maybe even a dev.planning heads up when this lands?

Is the intention to let this ride the trains straight to release (ie, Firefox 21 if it landed today), or bake on Nightly/Aurora for an extra cycle or so?

::: browser/components/preferences/in-content/privacy.xul
@@ +167,5 @@
> +            <menulist id="acceptThirdPartyMenu" preference="network.cookie.cookieBehavior">
> +              <menupopup>
> +                <menuitem label="&acceptThirdParty.always.label;" value="0"/>
> +                <menuitem label="&acceptThirdParty.visited.label;" value="3"/>
> +                <menuitem label="&acceptThirdParty.never.label;" value="1"/>

This is almost a nitpick, but it would probably be cleaner to keep the pref-related magic-numbers entirely within writeAcceptCookies() / readAcceptCookies(). And just use values here of, say, always/visited/never. (You'd need to add back the onsyncfrom/onsyncto as well.)

I'm not sure I care.

But it is a little odd to have read/writeAcceptCookies() only become incidentally called by init().
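
Roughly this shape, for illustration (a sketch only; the helper names and the preference-element lookup mirror the existing privacy.js style and may not match the final patch):

// Sketch: keep the 0/1/3 magic numbers in the read/write helpers and let the
// <menuitem>s use symbolic values "always"/"visited"/"never".
function readAcceptThirdPartyCookies() {
  let pref = document.getElementById("network.cookie.cookieBehavior");
  switch (pref.value) {
    case 0:  return "always";
    case 3:  return "visited";
    default: return "never";
  }
}

function writeAcceptThirdPartyCookies() {
  let menu = document.getElementById("acceptThirdPartyMenu");
  return { always: 0, visited: 3, never: 1 }[menu.value];
}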

::: browser/components/preferences/privacy.js
@@ +345,2 @@
>      if (accept.checked)
> +        acceptThirdPartyMenu.value = 3;

Hmm, does that work? I thought normally one needed to use somemenu.selectedIndex to change the currently displayed item.

::: browser/locales/en-US/chrome/browser/preferences/privacy.dtd
@@ +25,5 @@
> +<!ENTITY  acceptThirdParty.pre.label      "Accept third-party cookies:">
> +<!ENTITY  acceptThirdParty.pre.accesskey  "c">
> +<!ENTITY  acceptThirdParty.always.label   "Always">
> +<!ENTITY  acceptThirdParty.never.label    "Never">
> +<!ENTITY  acceptThirdParty.visited.label  "From visited">

I know this UI (once you get to it) is already kind of busy, and I do like me some short UI strings... but... I wonder if "From visited" is a little too short?

"From visited sites"? "Only if I've visited them"?

I don't have a great suggestion at the moment.
(In reply to Justin Dolske [:Dolske] from comment #14)
> Comment on attachment 695013 [details] [diff] [review]
> Proposed Patch
> 
> Review of attachment 695013 [details] [diff] [review]:
> -----------------------------------------------------------------
> 
> The front-end UI changes look mostly OK, just a few comments. I'd mark r+
> now, but keeping the request open to remind myself to try this out in a
> build just to be sure.
> 
> Sid: have you talked with various product folk about the default pref
> changing here? I believe it's very much wanted by the privacy/security
> teams, but AIUI there's potential for breaking sites (beyond "tracking"
> sites, where the breakage is intentional ;). Before this ships folks should
> know what the risks are so no one is surprised. Maybe even a dev.planning
> heads up when this lands?

Yeah, I've socialized a bit (but we're all busy and have short memories right now). I'll start a discussion about it in dev.privacy now, but we have plans to obtain some cookie data from Test Pilot to measure what this might break, and will have to monitor it carefully.

> Is the intention to let this ride the trains straight to release (ie,
> Firefox 21 if it landed today), or bake on Nightly/Aurora for an extra cycle
> or so?

Ideally we can just let it ride the trains, but we will want to watch it carefully.
Attached patch Proposed Patch (obsolete) — — Splinter Review
(In reply to Justin Dolske [:Dolske] from comment #14)
> Comment on attachment 695013 [details] [diff] [review]
> Proposed Patch
> 
> Review of attachment 695013 [details] [diff] [review]:
> -----------------------------------------------------------------
> 
> [snip]
> 
> ::: browser/components/preferences/in-content/privacy.xul
> @@ +167,5 @@
> > +            <menulist id="acceptThirdPartyMenu" preference="network.cookie.cookieBehavior">
> > +              <menupopup>
> > +                <menuitem label="&acceptThirdParty.always.label;" value="0"/>
> > +                <menuitem label="&acceptThirdParty.visited.label;" value="3"/>
> > +                <menuitem label="&acceptThirdParty.never.label;" value="1"/>
> 
> This is almost a nitpick, but it would probably be cleaner to keep the
> pref-related magic-numbers entirely within writeAcceptCookies() /
> readAcceptCookies(). And just use values here of, say, always/visited/never.
> (You'd need to add back the onsyncfrom/onsyncto as well.)
> 
> I'm not sure I care.
> 
> But it is a little odd to have read/writeAcceptCookies() only become
> incidentally called by init().

Done. I agree that this approach is a bit cleaner. There's also the benefit of explicitly handling when all cookies are disabled.

> ::: browser/components/preferences/privacy.js
> @@ +345,2 @@
> >      if (accept.checked)
> > +        acceptThirdPartyMenu.value = 3;
> 
> Hmm, does that work? I thought normally one needed to use
> somemenu.selectedIndex to change the currently displayed item.

Seemed to work. Anyways, switched to selectedIndex to be sure.

> ::: browser/locales/en-US/chrome/browser/preferences/privacy.dtd
> @@ +25,5 @@
> > +<!ENTITY  acceptThirdParty.pre.label      "Accept third-party cookies:">
> > +<!ENTITY  acceptThirdParty.pre.accesskey  "c">
> > +<!ENTITY  acceptThirdParty.always.label   "Always">
> > +<!ENTITY  acceptThirdParty.never.label    "Never">
> > +<!ENTITY  acceptThirdParty.visited.label  "From visited">
> 
> I know this UI (once you get to it) is already kind of busy, and I do like
> me some short UI strings... but... I wonder if "From visited" is a little
> too short?
> 
> "From visited sites"? "Only if I've visited them"?
> 
> I don't have a great suggestion at the moment.

The "From visited" language is borrowed from Safari. I'm totally open to alternatives.
Attachment #695013 - Attachment is obsolete: true
Attachment #695013 - Flags: review?(josh)
Attachment #695013 - Flags: review?(dolske)
Attachment #702113 - Flags: review?(josh)
Attachment #702113 - Flags: review?(dolske)
Comment on attachment 702113 [details] [diff] [review]
Proposed Patch

Review of attachment 702113 [details] [diff] [review]:
-----------------------------------------------------------------

This is a really excellent patch; thanks! The tests are complete and make sense to me, which is a nice bonus.

::: extensions/cookie/nsCookiePermission.cpp
@@ +216,5 @@
>      // If it's third party, we can't set the cookie
>      if (isThirdParty)
>        *aResult = false;
> +    else
> +      *aResult = true;

We initialize aResult to a default policy (of true, in this case), so I don't think this change is necessary.

@@ +227,5 @@
> +      uint32_t priorCookieCount = 0;
> +      nsAutoCString hostFromURI;
> +      aURI->GetHost(hostFromURI);
> +      mCookieManager->CountCookiesFromHost(hostFromURI, &priorCookieCount);
> +      if (priorCookieCount == 0)

This block can be simplified to |*aResult = priorCookieCount != 0;|

@@ +232,5 @@
> +        *aResult = false;
> +      else
> +        *aResult = true;
> +    }
> +    else

nit: Join this with the preceding } and brace the next statement, please.

@@ -285,5 @@
>  
> -        if (NS_SUCCEEDED(rv) && countFromHost > 0)
> -          rv = cookieManager->CookieExists(aCookie, &foundCookie);
> -      }
> -      if (NS_FAILED(rv)) return rv;

This early failure return has been lost.

::: extensions/cookie/test/unit/test_bug526789.js
@@ +8,5 @@
>  
>    cm.removeAll();
>  
> +  // Allow all cookies.
> +  Services.prefs.setIntPref("network.cookie.cookieBehavior", 0);

Please put this in head_cookies.js and only modify tests that explicitly need a different value.

::: netwerk/cookie/nsCookieService.cpp
@@ +3245,5 @@
>          }
>          return STATUS_ACCEPTED;
>  
> +      case nsICookiePermission::ACCESS_LIMIT_THIRD_PARTY:
> +        if (aIsForeign) {

Invert this condition and early return so we avoid the extra indent here.
Attachment #702113 - Flags: review?(josh) → review+
Attached patch Proposed Patch (obsolete) — — Splinter Review
(In reply to Josh Matthews [:jdm] from comment #17)
> Comment on attachment 702113 [details] [diff] [review]
> Proposed Patch
> 
> Review of attachment 702113 [details] [diff] [review]:
> -----------------------------------------------------------------
> 
> This is a really excellent patch; thanks! The tests are complete and make
> sense to me, which is a nice bonus.

Thanks!

> ::: extensions/cookie/nsCookiePermission.cpp
> @@ +216,5 @@
> >      // If it's third party, we can't set the cookie
> >      if (isThirdParty)
> >        *aResult = false;
> > +    else
> > +      *aResult = true;
> 
> We initialize aResult to a default policy (of true, in this case), so I
> don't think this change is necessary.

Done.

> @@ +227,5 @@
> > +      uint32_t priorCookieCount = 0;
> > +      nsAutoCString hostFromURI;
> > +      aURI->GetHost(hostFromURI);
> > +      mCookieManager->CountCookiesFromHost(hostFromURI, &priorCookieCount);
> > +      if (priorCookieCount == 0)
> 
> This block can be simplified to |*aResult = priorCookieCount != 0;|

Done.

> @@ +232,5 @@
> > +        *aResult = false;
> > +      else
> > +        *aResult = true;
> > +    }
> > +    else
> 
> nit: Join this with the preceding } and brace the next statement, please.

Done (I think).

> @@ -285,5 @@
> >  
> > -        if (NS_SUCCEEDED(rv) && countFromHost > 0)
> > -          rv = cookieManager->CookieExists(aCookie, &foundCookie);
> > -      }
> > -      if (NS_FAILED(rv)) return rv;
> 
> This early failure return has been lost.

Since the cookie manager service is now (potentially) used frequently, I moved the code for grabbing it to Init(). Is that reasonable?

> ::: extensions/cookie/test/unit/test_bug526789.js
> @@ +8,5 @@
> >  
> >    cm.removeAll();
> >  
> > +  // Allow all cookies.
> > +  Services.prefs.setIntPref("network.cookie.cookieBehavior", 0);
> 
> Please put this in head_cookies.js and only modify tests that explicitly
> need a different value.

Many (most?) of the cookie tests broke without this change. I could check and modify each test if that's preferable.

> ::: netwerk/cookie/nsCookieService.cpp
> @@ +3245,5 @@
> >          }
> >          return STATUS_ACCEPTED;
> >  
> > +      case nsICookiePermission::ACCESS_LIMIT_THIRD_PARTY:
> > +        if (aIsForeign) {
> 
> Invert this condition and early return so we avoid the extra indent here.

Done.
Attachment #702113 - Attachment is obsolete: true
Attachment #702113 - Flags: review?(dolske)
Attachment #703335 - Flags: review?(josh)
Attachment #703335 - Flags: review?(dolske)
Comment on attachment 703335 [details] [diff] [review]
Proposed Patch

I think jdm r+ed this, so carrying over the r=jdm.

Dolske: can you have a look and either + or - it?  Based on your last comment we're not sure what's next.  :)
Attachment #703335 - Flags: review?(josh) → review+
>> @@ -285,5 @@
>> >  
>> > -        if (NS_SUCCEEDED(rv) && countFromHost > 0)
>> > -          rv = cookieManager->CookieExists(aCookie, &foundCookie);
>> > -      }
>> > -      if (NS_FAILED(rv)) return rv;
>> 
>> This early failure return has been lost.
>
>Since the cookie manager service is now (potentially) used frequently, I moved the code for 
>grabbing it to Init(). Is that reasonable?

That's fine. What's not fine is that there's an |rv = mCookieManager->CookieExists(aCookie, &foundCookie);| that's now unchecked.

>> ::: extensions/cookie/test/unit/test_bug526789.js
>> @@ +8,5 @@
>> >  
>> >    cm.removeAll();
>> >  
>> > +  // Allow all cookies.
>> > +  Services.prefs.setIntPref("network.cookie.cookieBehavior", 0);
>> 
>> Please put this in head_cookies.js and only modify tests that explicitly
>> need a different value.
>
>Many (most?) of the cookie tests broke without this change. I could check and modify each 
>test if that's preferable.

I'm not sure we're clear on the change I wanted. head_cookies.js is executed before each run_test(), so the tests that required this should no longer complain.
Attached patch Proposed Patch (obsolete) — — Splinter Review
(In reply to Josh Matthews [:jdm] from comment #20)
> >> @@ -285,5 @@
> >> >  
> >> > -        if (NS_SUCCEEDED(rv) && countFromHost > 0)
> >> > -          rv = cookieManager->CookieExists(aCookie, &foundCookie);
> >> > -      }
> >> > -      if (NS_FAILED(rv)) return rv;
> >> 
> >> This early failure return has been lost.
> >
> >Since the cookie manager service is now (potentially) used frequently, I moved the code for 
> >grabbing it to Init(). Is that reasonable?
> 
> That's fine. What's not fine is that there's an |rv =
> mCookieManager->CookieExists(aCookie, &foundCookie);| that's now unchecked.

D'oh, fixed.

> >> ::: extensions/cookie/test/unit/test_bug526789.js
> >> @@ +8,5 @@
> >> >  
> >> >    cm.removeAll();
> >> >  
> >> > +  // Allow all cookies.
> >> > +  Services.prefs.setIntPref("network.cookie.cookieBehavior", 0);
> >> 
> >> Please put this in head_cookies.js and only modify tests that explicitly
> >> need a different value.
> >
> >Many (most?) of the cookie tests broke without this change. I could check and modify each 
> >test if that's preferable.
> 
> I'm not sure we're clear on the change I wanted. head_cookies.js is executed
> before each run_test(), so the tests that required this should not longer
> complain.

Ah, I see. I didn't take this approach since I figured future authors of cookie tests might expect a default settings environment. If setting the old default in head_cookies.js is preferable, glad to make the change.
Attachment #703335 - Attachment is obsolete: true
Attachment #703335 - Flags: review?(dolske)
Attachment #703387 - Flags: review?(josh)
Attachment #703387 - Flags: review?(dolske)
That's a fair point actually. You can leave the setIntPref calls in the tests that need them. It's probably better to be explicit in this case. No need to flag me for further reviews; I trust you to make the changes I require.
Comment on attachment 703387 [details] [diff] [review]
Proposed Patch

Marking r+, thanks!
Attachment #703387 - Flags: review?(josh) → review+
Comment on attachment 703387 [details] [diff] [review]
Proposed Patch

I noticed one oddity, but it's basically an existing bug:

When unchecking "Accept Cookies", the 3rd party <menulist> UI always changes to "Never". And when rechecking "Accept Cookies" the 3rd party <menulist> will always be reset to "From visited". Ideally, changes to the checkbox and <menulist> should be independent.

Consider: if someone changes the 3rd party setting to "Never" (while continuing to allow other cookies), then at some point unchecks "Allow cookies" to temporarily test something... Upon rechecking "Allow cookies" we will have lost their original preference to never accept 3rd party cookies. Oops.

Of course, this is all due to using a single pref that is exposed as multiple UI elements. Usually "don't expose implementation details to the user" is a fine principle, but this is one of the times it leads to oddities.

I'm not sure this is even worth fixing (as an edge case), and even if we did it would seem like a big ugly hack... The backend uses 1 pref, the UI would store some other state in another pref, and it would all need to stay synchronized even when the prefwindow isn't around. Bleeeeech.

So, r+, and let's file this oddity as a known bug in case someone wants to take a shot at fixing it.
Attachment #703387 - Flags: review?(dolske) → review+
(And BTW: solid, outstanding work for a first patch. Surprisingly rare to see!)
Um, this was dropped on the floor, wasn't it? Is this something we want to land now, or are we waiting on a security review or something?
(In reply to Josh Matthews [:jdm] from comment #26)
> Um, this was dropped on the floor, wasn't it? Is this something we want to
> land now, or are we waiting on a security review or something?

Looks like Curtis has flagged it to dveditz for a security review.
Flags: needinfo?(dveditz)
jdm: well, I solicited feedback in the usual forums and got crickets.  I will double-check with dveditz about the review and then, if cleared, land it myself (and send out a note to put people's eyes on it).

Dolske: should we file a follow up for the known bug you mentioned in comment 24, or just file it in our mental cabinet?
Status: NEW → ASSIGNED
Flags: needinfo?(dolske)
Eh, probably not worth filing (comment 24). I think this is good to go.
Flags: needinfo?(dolske)
dveditz, do you want me to secreview this (I'm looking at 3rd party cookie stuff later this week anyway)?
Depends on: 835844
Flags: sec-review?(dveditz) → sec-review?(mgoodwin)
Summary: Implement Safari-like Third-Party Cookie Blocking → Block cookies from sites I don't visit directly
Summary: Block cookies from sites I don't visit directly → Block cookies from sites I haven't visited
Keywords: relnote
Does anyone know what the effect of this is on the average user?

Obviously third-party cookie surveillance is a privacy concern -- Mozilla built Collusion to show how many tracking entities there are, and how they relate. But the current commerce patterns on the web depend on cookies, with some user benefits as well as costs and risks.

Yes, I'm saying targeted ads based on cookies that this patch would block are sometimes a new win for some users. Maybe not enough to justify the downside, but who knows? I do not. I see no one on this bug trying to assess.

I personally want us to get to a better, user-centric model that supports ads and commerce without tracking by parties engaged in non-transparent business practices. Users ideally should be able to opt into a self-profiling system, client-based and highly private (after all, we already keep your history, you trust Firefox to keep this information private), that can selectively disclose abstracted or even (depending on to whom) precise/concrete information that helps give the user more value than today's cookie-based world.

Given such a system, I'd even want something like: opt into this better "user profiling" agent and at the same time turn off all third-party cookies.

There are tons of ways to track users, of course. See http://www.businessinsider.com/facebooks-plan-to-kill-the-tracking-cookie-2013-1 and look past the hype that implies cookies won't work. We don't want to start an arms race or play whack-a-mole. But in order to move the debate to a better place, we need innovation.

Without something innovative, such as the better user agent with opt-in disclosure to select second parties, we risk "escalation" -- ultimately to regulators who can't innovate. Regulators under political pressure tend to simply freeze existing structures, locking in incumbents.

Speaking of advantaging the incumbents, this patch won't touch existing cookie stores chock-full of already-set, long-lived third-party cookies, right? That doesn't seem great on its face even if this patch makes life better with respect to third-party cookies the user might face in the future.

We need a newsgroup thread or better to discuss this further, unless I've missed an analysis of what the impact of this patch on Firefox users in the large, especially loss of useful ads (if such exist), might be.

/be
> new win for some users.

"net win", of course.

/be
It's been a while since someone tried to change default cookie behavior, back in 2008

https://bugzilla.mozilla.org/show_bug.cgi?id=324397
https://bugzilla.mozilla.org/show_bug.cgi?id=417800

From skimming through these bugs it seems that the earlier attempt tried to block all 3rd party cookies from being set or read, regardless whether the user had visited the site in a 1st party context. So Jonathan's patch is less stringent, and will hopefully not cause the same issues as 5 years ago.
(In reply to Brendan Eich [:brendan] from comment #32)

> We need a newsgroup thread or better to discuss this further, unless I've
> missed an analysis of what the impact of this patch on Firefox users in the
> large, especially loss of useful ads (if such exist), might be.

This patch is pretty small and self-contained; does this objection also apply to landing it without changing the default behavior? (i.e., allow interested users to opt-in to the behavior this patch adds) That could be helpful with getting some experience with the real-world impact.

[Conversely, if it initially lands that way and we later determine it will never be the default... Do we leave it in as an optional feature or back it out in search of a better replacement?]
I think we should strive to land on by default. We need to have the newsgroup discussion, see whether Jonathan or anyone has data to knock down some concerns, let's say "the easy ones", and then identify the hard or imponderable concerns. Given that list of hard issues, we can probably land in nightly and monitor.

Monitoring should mean new Telemetry, if feasible, to report when behavior diverges from the status quo. This could be open-ended research but if we keep it simple, perhaps we can keep track of the biggest hard issues that have immediate effects (i.e., not whether this new policy tends over time to push server-side architectures toward same-ETLD+1-origin hosting architectures, which go counter to best security practices).

More in a week, after I expect the newsgroup thread has run its course.

/be
Clearing my tracking flags, which were set in error. Marking this fixed with target milestone FF22 will be enough.
Attached patch Proposed Patch (obsolete) — — Splinter Review
Building on Try revealed a number of issues. Huge thanks to Sid, Monica, and Dan for quick-turnaround help with the spotting, fixing, and testing.

A quick breakdown of what's new:
-Removed the circular reference from the cookie preference service to the cookie manager service. The manager's now only grabbed as needed.
-Caught the case where third-party cookies are blocked globally but limited by a per-site permission.
-Added ifdefs in the preferences defaults to keep the old policy in Android and B2G.
-Fixed a menu value inconsistency between the dialog and in-content third-party cookie settings.
-Updated the "Remember History" preference to use the new policy.
-Patched a bunch of privacy UI tests to account for the new feature and preference pane layout.
-Reverted to the old policy in a number of tests that use third-party cookies.

Dolske and jdm, any chance you could do a re-review tomorrow? I'm a bit crunched for time starting next week, and I'd like to make sure I can help if anything goes awry.

Thanks,
Jonathan
Attachment #703387 - Attachment is obsolete: true
Attachment #714274 - Flags: review?(josh)
Attachment #714274 - Flags: review?(dolske)
Comment on attachment 714274 [details] [diff] [review]
Proposed Patch

Review of attachment 714274 [details] [diff] [review]:
-----------------------------------------------------------------

There's a problem with adding the setIntPref lines to the various mochitest/mochitest-browser tests: all such tests execute in the same browser instance, which means that the pref will stay in effect once they finish running. You should either use SpecialPowers.pushPrefEnv, which resets the pref to default once the test finishes, or registerCleanupFunction with a function that calls SpecialPowers.clearUserPref. Also, please update the comment by mCookieBehavior in nsCookieService.h. I'd like to see the result of the test changes, please.
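
For instance, the cleanup-function variant could look roughly like this (sketch only, for a mochitest-browser test where Services and registerCleanupFunction are already in scope):

// Set the pref for this test only, and guarantee it is reset afterwards.
Services.prefs.setIntPref("network.cookie.cookieBehavior", 0);
registerCleanupFunction(function() {
  Services.prefs.clearUserPref("network.cookie.cookieBehavior");
});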

::: browser/components/preferences/tests/browser_bug705422.js
@@ +11,5 @@
>  
> +    // Allow all cookies.
> +    var prefService = Components.classes["@mozilla.org/preferences-service;1"]
> +                                .getService(Components.interfaces.nsIPrefService);
> +    prefService.setIntPref("network.cookie.cookieBehavior", 0);

Just use Services.prefs instead.

::: browser/components/preferences/tests/privacypane_tests_perwindow.js
@@ +235,5 @@
>        is(control.checked, checked,
>          control.getAttribute("id") + " should " + (checked ? "not " : "") + "be checked");
>      });
> +
> +    is(true, true, "menu is " + thirdPartyCookieMenu.value);

This should just be info(...) if you're going to leave it in, but it doesn't look necessary.
Attachment #714274 - Flags: review?(josh)
Attached patch Proposed Patch (obsolete) — — Splinter Review
(In reply to Josh Matthews [:jdm] from comment #42)
> Comment on attachment 714274 [details] [diff] [review]
> Proposed Patch
> 
> Review of attachment 714274 [details] [diff] [review]:
> -----------------------------------------------------------------
> 
> There's a problem with adding the setIntPref lines to the various
> mochitest/mochitest-browser tests: all such tests execute in the same
> browser instance, which means that the pref will stay in effect once they
> finish running. You should either use SpecialPowers.pushPrefEnv, which
> resets the pref to default once the test finishes, or
> registerCleanupFunction with a function that calls
> SpecialPowers.clearUserPref.

Here's what I used instead:

|SpecialPowers.pushPrefEnv({"set": [["network.cookie.cookieBehavior", 0]]}, function() {});|

I *think* that's safe to do, at least where I used it. Yes?

> Also, please update the comment by mCookieBehavior in nsCookieService.h.

Done.

> I'd like to see the result of the test changes, please.

Sure thing. Monica, would you mind pushing to Try and pasting a link?

> ::: browser/components/preferences/tests/browser_bug705422.js
> @@ +11,5 @@
> >  
> > +    // Allow all cookies.
> > +    var prefService = Components.classes["@mozilla.org/preferences-service;1"]
> > +                                .getService(Components.interfaces.nsIPrefService);
> > +    prefService.setIntPref("network.cookie.cookieBehavior", 0);
> 
> Just use Services.prefs instead.

Done.

> ::: browser/components/preferences/tests/privacypane_tests_perwindow.js
> @@ +235,5 @@
> >        is(control.checked, checked,
> >          control.getAttribute("id") + " should " + (checked ? "not " : "") + "be checked");
> >      });
> > +
> > +    is(true, true, "menu is " + thirdPartyCookieMenu.value);
> 
> This should just be info(...) if you're going to leave it in, but it doesn't
> look necessary.

Remains of debugging past. Done.
Attachment #714274 - Attachment is obsolete: true
Attachment #714274 - Flags: review?(dolske)
Attachment #714674 - Flags: review?(josh)
Attachment #714674 - Flags: review?(dolske)
(In reply to Jonathan Mayer from comment #43) 
> Here's what I used instead:
> 
> |SpecialPowers.pushPrefEnv({"set": [["network.cookie.cookieBehavior", 0]]},
> function() {});|
> 
> I *think* that's safe to do, at least where I used it. Yes?

Technically this is safe in certain situations, but it's not a pattern I want to put in the tree and allow people to cargo-cult. If we're going to use pushPrefEnv, let's use it right and move the rest of the test into the callback. The easiest way to do this is probably to rename the existing run_test to something else and make that the callback, and create a new run_test that just kicks off the pushPrefEnv call.
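
Concretely, something with this shape (a sketch; actually_run_test is just an illustrative name):

function run_test() {
  // Run this test with the old "accept all cookies" behavior, then hand off
  // to the real test body once the pref has been applied.
  SpecialPowers.pushPrefEnv({"set": [["network.cookie.cookieBehavior", 0]]},
                            actually_run_test);
}

function actually_run_test() {
  // ... original test body goes here ...
}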
Attached patch Proposed Patch (obsolete) — — Splinter Review
(In reply to Josh Matthews [:jdm] from comment #45)
> (In reply to Jonathan Mayer from comment #43) 
> > Here's what I used instead:
> > 
> > |SpecialPowers.pushPrefEnv({"set": [["network.cookie.cookieBehavior", 0]]},
> > function() {});|
> > 
> > I *think* that's safe to do, at least where I used it. Yes?
> 
> Technically this is safe in certain situations, but it's not a pattern I
> want to put in the tree and allow people to cargo-cult. If we're going to
> use pushPrefEnv, let's use it right and move the rest of the test into the
> callback. The easiest way to do this is probably to rename the existing
> run_test to something else and make that the callback, and create a new
> run_test that just kicks off the pushPrefEnv call.

Done.
Attachment #714674 - Attachment is obsolete: true
Attachment #714674 - Flags: review?(josh)
Attachment #714674 - Flags: review?(dolske)
Attachment #714735 - Flags: review?(josh)
Attachment #714735 - Flags: review?(dolske)
Try is green. Thanks, Monica!
(In reply to Jonathan Mayer from comment #48)
> Try is green. Thanks, Monica!

I assume the xpcshell failure (test_signed_apps.js) is generated in a previous changeset, right?  Not this patch?
Comment on attachment 714735 [details] [diff] [review]
Proposed Patch

Review of attachment 714735 [details] [diff] [review]:
-----------------------------------------------------------------

::: content/base/test/test_CrossSiteXHR.html
@@ +29,3 @@
>  
> +function initTestCallback() {
> +  SimpleTest.waitForExplicitFinish();

Move this into initTest.

::: content/base/test/test_bug338583.html
@@ +594,5 @@
>        setTestHasFinished(test_id);
>      }, parseInt(8000*stress_factor));
>    }
>  
>    function doTest(test_id)

Just get rid of the test_id.

::: extensions/cookie/test/test_app_uninstall_cookies.html
@@ +157,5 @@
>  
>  var gManifestURL = "http://www.example.com/chrome/dom/tests/mochitest/webapps/apps/basic.webapp";
>  
> +// Allow all cookies, then run the test
> +SpecialPowers.pushPrefEnv({"set": [["network.cookie.cookieBehavior", 0]]}, confirmNextInstall);

This should really encompass all the below code too. However, this test is already using setXPref in a bunch of other places, so just do the same thing and clear the pref at the end along with dry_run.
Attachment #714735 - Flags: review?(josh) → review+
Attached patch Proposed Patch (obsolete) — — Splinter Review
(In reply to Josh Matthews [:jdm] from comment #51)
> Comment on attachment 714735 [details] [diff] [review]
> Proposed Patch
> 
> Review of attachment 714735 [details] [diff] [review]:
> -----------------------------------------------------------------
> 
> ::: content/base/test/test_CrossSiteXHR.html
> @@ +29,3 @@
> >  
> > +function initTestCallback() {
> > +  SimpleTest.waitForExplicitFinish();
> 
> Move this into initTest.
> 
> ::: content/base/test/test_bug338583.html
> @@ +594,5 @@
> >        setTestHasFinished(test_id);
> >      }, parseInt(8000*stress_factor));
> >    }
> >  
> >    function doTest(test_id)
> 
> Just get rid of the test_id.
> 
> ::: extensions/cookie/test/test_app_uninstall_cookies.html
> @@ +157,5 @@
> >  
> >  var gManifestURL = "http://www.example.com/chrome/dom/tests/mochitest/webapps/apps/basic.webapp";
> >  
> > +// Allow all cookies, then run the test
> > +SpecialPowers.pushPrefEnv({"set": [["network.cookie.cookieBehavior", 0]]}, confirmNextInstall);
> 
> This should really encompass all the below code too. However, this test is
> already using setXPref in a bunch of other places, so just do the same thing
> and clear the pref at the end along with dry_run.

Done. Ran the changed tests locally, all still pass.
Attachment #714735 - Attachment is obsolete: true
Attachment #714735 - Flags: review?(dolske)
Attachment #716841 - Flags: review?(josh)
Attachment #716841 - Flags: review?(dolske)
Comment on attachment 716841 [details] [diff] [review]
Proposed Patch

Carrying over r=jdm from comment 51.
Attachment #716841 - Flags: review?(josh) → review+
Comment on attachment 716841 [details] [diff] [review]
Proposed Patch

Review of attachment 716841 [details] [diff] [review]:
-----------------------------------------------------------------

r- for the network_requests_iframe.html issue. Everything else looks good-to-go, though!

::: browser/components/preferences/tests/browser_permissions.js
@@ +163,5 @@
>    },
>  
>    function test_all_sites_permission() {
> +    // apply the old default of allowing all cookies
> +    Services.prefs.setIntPref("network.cookie.cookieBehavior", 0);

....ah, this test is already setting prefs and calling clearUserPref() at the end.

::: modules/libpref/src/init/all.js
@@ +1286,5 @@
> +pref("network.cookie.cookieBehavior",       3); // 0-Accept, 1-dontAcceptForeign, 2-dontUse, 3-limitForeign
> +#ifdef ANDROID
> +pref("network.cookie.cookieBehavior",       0); // Keep the old default of accepting all cookies
> +#endif
> +#ifdef MOZ_WIDGET_GONK

Why not Android and FFOS too? Are we just being conservative to avoid surprise breakage / test issues, or is there a reason/desire not to change them?

I've no objection, just want to understand the reason.

[Should probably have followup bugs to track enabling this on Android/FFOS.]

::: toolkit/devtools/webconsole/test/network_requests_iframe.html
@@ +36,3 @@
>  
> +      // Allow all cookies before attempting to set cookies
> +      SpecialPowers.pushPrefEnv({"set": [["network.cookie.cookieBehavior", 0]]}, setCookies);

Hmm, this might not work so well. I didn't look in detail, but I assume this test (which loads in an <iframe> of a _real_ test) relies on cookies being set during page load, but pushPrefEnv() uses setTimeout() to fire the callback...

So it's possible this becomes a randomly failing test, if the |load| event (or however the rest of this test goes) happens to run before the setTimeout.

You could just move the pref-setting code into the 3 tests that make use of this subtest... test_network_get.html / test_network_post.html / test_network_longstring.html. Although I didn't look to see whether they all actually check cookies.
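
i.e., in each of those parent tests, something roughly like this before the subtest frame is loaded (a sketch only; the "testframe" id is hypothetical -- adapt it to however each test actually loads network_requests_iframe.html):

// Push the pref first, then start loading the iframe, so its inline script
// can set cookies during page load with the old "accept all" behavior.
SpecialPowers.pushPrefEnv(
  {"set": [["network.cookie.cookieBehavior", 0]]},
  function startSubtest() {
    document.getElementById("testframe").src = "network_requests_iframe.html";
  });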
Attachment #716841 - Flags: review?(dolske) → review-
Attached patch Proposed Patch — — Splinter Review
(In reply to Justin Dolske [:Dolske] from comment #55)
> Comment on attachment 716841 [details] [diff] [review]
> Proposed Patch
> 
> Review of attachment 716841 [details] [diff] [review]:
> -----------------------------------------------------------------
> 
> r- for the network_requests_iframe.html issue. Everything else looks
> good-to-go, though!
> 
> ::: browser/components/preferences/tests/browser_permissions.js
> @@ +163,5 @@
> >    },
> >  
> >    function test_all_sites_permission() {
> > +    // apply the old default of allowing all cookies
> > +    Services.prefs.setIntPref("network.cookie.cookieBehavior", 0);
> 
> ....ah, this test is already setting prefs and calling clearUserPref() at
> the end.
> 
> ::: modules/libpref/src/init/all.js
> @@ +1286,5 @@
> > +pref("network.cookie.cookieBehavior",       3); // 0-Accept, 1-dontAcceptForeign, 2-dontUse, 3-limitForeign
> > +#ifdef ANDROID
> > +pref("network.cookie.cookieBehavior",       0); // Keep the old default of accepting all cookies
> > +#endif
> > +#ifdef MOZ_WIDGET_GONK
> 
> Why not Android and FFOS too? Are we just being conservative to avoid
> surprise breakage / test issues, or is a reason/desire to not change them?
> 
> I've no objection, just want to understand the reason.

My reasoning is simply that we don't have a UI yet :)

I would certainly be in favor of adding support for Android and Firefox OS.

> [Should probably have followup bugs to track enabling this on Android/FFOS.]

Sounds good.

> ::: toolkit/devtools/webconsole/test/network_requests_iframe.html
> @@ +36,3 @@
> >  
> > +      // Allow all cookies before attempting to set cookies
> > +      SpecialPowers.pushPrefEnv({"set": [["network.cookie.cookieBehavior", 0]]}, setCookies);
> 
> Hmm, this might not work so well. I didn't look in detail, but I assume this
> test (which is loading in an <iframe> of a _real_ test) is relying on
> cookies being set during page load, but pushPrefEnv() uses setTimeout() to
> fire the callback...
> 
> So it's possible this becomes a randomly failing test, if the |load| event
> (or however the rest of this test goes) happens to run before the setTimeout.
> 
> You could just move the pref-setting code into the 3 tests that make use of
> this subtest... test_network_get.html  / test_network_post.html  /
> test_network_longstring.html. Although I didn't look to see if they actually
> all actually check cookies.

Fixed and re-ran the tests locally. Thanks for the help, Dolske!
Attachment #716841 - Attachment is obsolete: true
Attachment #716917 - Flags: review?(josh)
Attachment #716917 - Flags: review?(dolske)
Comment on attachment 716917 [details] [diff] [review]
Proposed Patch

\o/
Attachment #716917 - Flags: review?(josh)
Attachment #716917 - Flags: review?(dolske)
Attachment #716917 - Flags: review+
I think this may (finally!) be ready to land.
Pushed to inbound.
https://hg.mozilla.org/integration/mozilla-inbound/rev/8679f0f1c215

Thanks for your hard work and patience, Jonathan.
Flags: in-testsuite+
Target Milestone: --- → mozilla22
Thanks, Sid, and huge thanks to you and all the other contributing Mozillians for coaching me through this first patch.

I've sketched a brief FAQ about the new cookie policy. Hopefully helpful for the community.

http://webpolicy.org/2013/02/22/the-new-firefox-cookie-policy/
https://hg.mozilla.org/mozilla-central/rev/8679f0f1c215
Status: ASSIGNED → RESOLVED
Closed: 11 years ago
Resolution: --- → FIXED
My bank and one of my credit unions both require that I accept third-party cookies in order to pay bills over the Web and to access monthly statements on the Web.  How will this implementation affect such transactions?
As David E Ross points out in comment 62, some websites will break completely if third-party cookies are blocked.

Perhaps instead of blocking them, they should default to session-only. This would address the concern about tracking while allowing existing websites to work. Firefox currently supports what I described above via the setting named network.cookie.thirdparty.sessionOnly.

I set mine to true and it doesn't break existing websites. When I close Firefox, the tracking data is destroyed. This lets you avoid tracking even if you need one third-party cookie but don't want to permanently allow them to be set.
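
For anyone who wants to try it, that's a single pref, set via about:config or, equivalently, user.js:

user_pref("network.cookie.thirdparty.sessionOnly", true);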
I've added this bug to the compatibility doc. Please correct the info if I'm wrong.
https://developer.mozilla.org/en-US/docs/Site_Compatibility_for_Firefox_22
I think this change is MEANINGLESS for protecting users from tracking.
(I apologize that this comment overlaps with my comments in bug 818337.)

The 3rd-party cookie is not the only way to track users. Trackers (governments, SNS companies, or ad networks) can track users with other fingerprint information such as UA strings, IP addresses, or the timezone of their access to the content.

If trackers rely on beacon IDs embedded in 3rd-party cookies, users can exert at least a little control by removing those cookies. But if trackers use the other fingerprint information mentioned above, it becomes difficult for users to control their tracking IDs, because UA strings and IP addresses are much harder for users to modify than 3rd-party cookies. That is not a certain future, but it is a possible one. In such a world, someone may say "all internet users should use Tor to protect their privacy!" That makes no sense!

So blocking 3rd-party cookies does nothing to restrict tracking, and I think it may even make the situation more difficult.
3rd-party cookies do have a harmful side. Some SNS companies or search engines may track users' browsing history via their embedded widgets. But the root problem there is that they use the same domain as their normal services.

For internet democracy and user freedom, I think the "Do Not Track" model is better than changing the 3rd-party cookie policy. Of course, I agree that the DNT model is incomplete as tracking protection. However, DNT respects user decisions, and that is the important thing. The block-3rd-party-cookies model provides only a fake protection against tracking.

Interest-matched advertising built on tracking information has real merits for some users, and we should preserve that benefit for those users. I think that is the proper approach for internet democracy and user freedom.
Not all users want advertising based on tracking, but not all users want tracking blocked for very strict privacy either. The important thing is user decisions.

Therefore, I propose that we implement a per-domain DNT signal preference and back out this policy change.
(In reply to Tetsuharu OHZEKI [:saneyuki_s] from comment #65)
> The 3rd-party cookie is not the only way to track users. Trackers
> (governments, SNS companies, or ad networks) can track users with other
> fingerprint information such as UA strings, IP addresses, or the timezone
> of their access to the content.

That's bug 572650.
 
> Therefore, I propose that we implement a per-domain DNT signal preference
> and back out this policy change.

I'm not sure why we have to backout this bug, but please file a new bug about per-site DNT settings.
(In reply to Tetsuharu OHZEKI [:saneyuki_s] from comment #65)
> Interest-matched advertising built on tracking information has real merits
> for some users, and we should preserve that benefit for those users. I
> think that is the proper approach for internet democracy and user freedom.
> Not all users want advertising based on tracking, but not all users want
> tracking blocked for very strict privacy either. The important thing is
> user decisions.

Users who want interest-matched advertising can re-enable sending third-party cookies.
(In reply to Masatoshi Kimura [:emk] from comment #66)
> I'm not sure why we have to backout this bug, but please file a new bug
> about per-site DNT settings.

(In reply to Masatoshi Kimura [:emk] from comment #67)
> Users who want interest-matched advertising can re-enable sending
> third-party cookies.

I think this change will not be effective at blocking 3rd-party cookies for tracking protection, because:
* If a user clicks a 3rd-party link that sets a tracking-ID cookie even once, that 3rd-party cookie will be set and sent to the tracker's server. So this change cannot help a user who doesn't want tracking but does want to see, and sometimes click, advertisements.
* Some big players (e.g. SNS providers) that may track users serve their social/ad widgets from the same domain as their main SNS site. (They sometimes share a domain across their service and their social/ad widgets.) This policy change cannot block their cookies from being sent if the user uses those services.
* This has another implication: only big players with many users get to track user activity, which is unfair to small players. From the viewpoint of internet freedom, that unfairness may be harmful.
(These points partly overlap with https://bugzilla.mozilla.org/show_bug.cgi?id=818337#c5.)

Users who want interest-matched advertising don't even need to re-enable 3rd-party cookies; the browser will continue to send 3rd-party cookies that have already been set. Therefore I think this change is meaningless.

And 3rd-party cookies are not the only way to track. Blocking them will not resolve anything; this change would only make the relationship between tracking and privacy more complicated.

So I think we should back out this change and implement a per-domain DNT signal.
Filed the new bug about per-site DNT settings: bug 844600
If the change is meaningless, why do we have to backout the change in the first place?
From Ryan Day ‏@soldair on twitter:

> http://webpolicy.org/2013/02/22/the-new-firefox-cookie-policy/ … @BrendanEich does this
> policy allow cookies from third parties whom you make allowed cors requests too?

A good question, from the patch I would say "no", which seems like a bug. Thoughts?

Also the point I made in comment 32 penultimate paragraph about incumbents in the cookie store (I do not believe most users clear cookies frequently, sorry -- I'm skeptical without real data) stands. Dolski on dev.privacy suggested "Add a timestamp for when the
last first-party visit was for the site, and if you've not made a first-party visit within X weeks, drop the cookie." Recording here without splitting discussion, since it seems actionable in a code patch (followup bug fine).

/be
Target Milestone: mozilla22 → ---
> Dolski

Argh, Dolske (sorry!). Jetlag biting me pre-MWC.

/be
Target Milestone: --- → mozilla22
Many implementations contain a workaround for the mentioned Safari behaviour. A while back they allowed form posting to the 3rd-party domain, but that was removed in later Safari versions. The workaround changed to requiring a user interaction, like pressing a button, to continue on the 3rd-party domain. Will this workaround also be applicable to Firefox when this is released?
(In reply to Masatoshi Kimura [:emk] from comment #70)
> If the change is meaningless, why do we have to backout the change in the
> first place?

Because this change might break the current web-tracking status quo and balance.

The mainstream model of today's tracking systems is based on 3rd-party cookies. (That mainstream is also the point at issue in this bug.) This means users have a point of control over whether they are tracked, and they can tell whether a tracking player is tracking them, because the model relies on a "unique" tracking ID stored locally in 3rd-party cookies.

If users remove the cookies holding the tracking ID, their tracking history may be reset. By technical design, this defeats tracking that relies only on the tracking ID. And it is different from DNT, because DNT is a legal- and trust-based model. This is very important for user control.

But the merit of this technical model rests on the fact that most tracking players use 3rd-party cookies to track users, so it is just a balance.
It is a de-facto balance built on 3rd-party cookies. A few tracking players may use other fingerprint information, but fortunately that is not mainstream.

If Firefox changes to the new policy introduced by this bug, the current tracking model might break. (The number of Firefox users has a big impact; it's different from Safari.) If that happens, tracking players would move to other fingerprint information: e.g. IP address, timezone, location, and/or UA strings.

(Of course, they may use other local storage, a.k.a. DOM storage. But DOM storage may be restricted along with the 3rd-party cookie policy in the near future, so I don't think it is a useful way for trackers to update their systems after this bug's new policy. It seems to me they will prefer other fingerprint information.)

That fingerprint information is far less controllable than cookies or other local storage. Users cannot easily change their IP address, location, or timezone, so their only way to fend off tracking would be to use Tor. That would be the dark ages.
Users would have no way to know whether a tracking player is tracking them or not.

So I think we should not break the current delicate balance. If this change had been made ten years ago, it would have been no problem. But now, if we change this balance, we need to introduce a new design model that ensures an alternative.

Therefore, I propose that we back out this change as the first step.
Hey everyone: it sounds like there's some follow-up discussion to have, but this bug is not the best place to have discussions about what should change.  Could we take the policy discussions over to dev-privacy and leave the technical work in bugzilla?  The dev-privacy forum has a wider audience and is a much better place for discussion.

https://lists.mozilla.org/listinfo/dev-privacy
https://groups.google.com/forum/?fromgroups#!forum/mozilla.dev.privacy
(In reply to Brendan Eich [:brendan] from comment #71)
> From Ryan Day ‏@soldair on twitter:
> 
> > http://webpolicy.org/2013/02/22/the-new-firefox-cookie-policy/ … @BrendanEich does this
> > policy allow cookies from third parties whom you make allowed cors requests too?
> 
> A good question, from the patch I would say "no", which seems like a bug.
> Thoughts?

Filed bug 844622.

> Dolske on dev.privacy suggested "Add a timestamp for when the
> last first-party visit was for the site, and if you've not made a
> first-party visit within X weeks, drop the cookie." Recording here without
> splitting discussion, since it seems actionable in a code patch (followup
> bug fine).

Filed bug 844623.
It's come to my attention that some people think this removes the old default.  It does not: this introduces a *new* state (accept third party cookies "From Visited") and picks that new state for the default.  

You can still go into the preferences pane and change your third-party cookie pref back to always accepting third-party cookies (see screenshot in attachment 693740 [details]).
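
Equivalently, the old behavior can be restored from about:config or user.js (0 is the "accept all cookies" value documented in all.js):

user_pref("network.cookie.cookieBehavior", 0);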
Could they also make an option available to accept third-party cookies as session-only (while regular cookies keep their normal expiration)? Right now you have to turn it on in about:config, as I described in comment 63.

In the GUI config, you can presently only set all cookies to session-only or set them all to their normal expiration. I like that option more than the block option, since the block option will require manual intervention every time you run into a site that breaks with that setting.
(In reply to deckerj from comment #78)
> Could they also make the accept third party cookies as session only (but
> regular cookies keep their normal expiration) option available?

That sounds like a separate change, so it should probably be in its own bug.
One financial service that I use online is Western Federal Credit Union.  As I use that site, it requires cookies that are NOT session-only for the following domains:  www.western.org, .western.org, and ebankingwestern.org.  I would expect the first to be part of the second.  However, would ebankingwestern.org be considered a third-party cookie?
It's a different domain in the context of western.org, thus my expectation is that it would be treated as a third-party cookie.
Is an iframe treated as its own separate page with its own definition of "third party"?

The reason I ask is that in testing how this all works, a page containing an iframe for a different domain creates cookies that belong to a different domain from the domain of the top level page, and for a domain that has never been visited before.  They belong to the domain of the iframe.  Isn't this just the Safari loophole all over again?

I should note that I'm creating the iframe from an addon, in case that makes any difference.
This doesn't work with third-party cookies blocked:

http://www.aboutads.info/choices/
Depends on: 846101
When I upgraded a profile from FF21 to FF22, cookie setting permissions were "downgraded".  A behaviour of "never allow 3rd party cookies" was changed to "allow third party for visited".  Possibly non-default cookie permissions shouldn't be touched at all.
Depends on: 846350
Ian: thanks for the report. I filed bug 846350 for this.  Please test with a brand new/clean profile and mention over in that bug whether it's easily reproduced or if we need to investigate your case more closely.
No longer blocks: 844622
Depends on: 844622
Depends on: 848437
Depends on: 851606
See Also: → 845353
See Also: 845353
Depends on: 849948
QA Contact: bogdan.maris
This bug has been listed as part of the Aurora 22 release notes in either [1], [2], or both. If you find any of the information or wording incorrect in the notes, please email release-mgmt@mozilla.com asap. Thanks!

[1] https://www.mozilla.org/en-US/firefox/22.0a2/auroranotes/
[2] https://www.mozilla.org/en-US/mobile/22.0a2/auroranotes/
(In reply to Ian Nartowicz from comment #82)
> Is an iframe treated as its own separate page with its own definition of
> "third party"?
> 
> The reason I ask is that in testing how this all works, a page containing an
> iframe for a different domain creates cookies that belong to a different
> domain from the domain of the top level page, and for a domain that has
> never been visited before.  They belong to the domain of the iframe.  Isn't
> this just the Safari loophole all over again?

I have the same issue: an iframe on a different domain than the parent page, trying to create cookies scoped to the iframe's own domain from within the iframe: it fails.
example at http://www.football365.fr/ and http://www.sport365.fr/ga/iframe-365.php
The iframe is used to count unique visitors across a network of sites from the same publisher, which is a real business requirement for this type of site.
Flags: needinfo?(dveditz)
Without telemetry or equivalent monitoring (comment 15, comment 37, tracked by bug 837326), we should not go to Beta. jmayer agrees. Asking for approval for the one-line patch to keep the Beta pref unchanged in bug 851606.

I'll have a blog post about the big picture beyond Telemetry up pretty soon.

/be
Brendan: We've been a little slow to implement telemetry, but there's progress (see bug 837326).  It's not as simple as we originally thought to "measure impact": we have to count the number of impacted sites without sending domain names to our servers, so there's some data-structure work to do and potentially perf concerns too.
I agree that we should take an extra release cycle to catch up on measurement. In particular, I'd like to improve our understanding of false positives (i.e. trusted third parties) and false negatives (e.g. untrusted first parties that are grandfathered in or that the user is temporarily redirected through).
(In reply to Jonathan Mayer from comment #90)
> or that the user is temporarily redirected through

I don't think we can eliminate this case easily.

Testcase
http://torisugari.site90.net/dev/adhost/adhost_index.html
http://torisugari.hostei.com/development/test/adguest/adguest_index.html
Is this patch really getting held back again? If so, why?

From bug reports and Safari's experience, we know there's almost no breakage. From measurements of Safari, we know it's effective in providing privacy.
(In reply to Jonathan Mayer from comment #94)
> From bug reports and Safari's experience, we know there's almost no
> breakage.

We haven't shipped this on beta, and even beta doesn't always get us enough coverage for large web-impacting changes. I don't think you can draw that conclusion.

See also: https://brendaneich.com/2013/06/the-cookie-clearinghouse/
I think it's important to stay focused on the goals of improving user privacy and aligning cookie behavior with user expectations, while minimally impacting user experience. The goals of other players in this space are tangential to that, and I think it is unwise to worry about the ecosystem as a whole given the number of unknowns and the fact that Mozilla is charged with helping users, not other players. One clarification: examples where the domain model doesn't reflect the organization model (e.g. foo.com uses foocdn.com) are only relevant in cases where foocdn.com uses cookies in a way that negatively impacts user experience. The mere existence of this pattern in the wild does not imply real breakage.

If there is demonstrable breakage for users, that is useful and should give us pause. But I don't think merely pointing out sites like foocdn.com is enough. Examples like those of David E Ross (https://bugzilla.mozilla.org/show_bug.cgi?id=818340#c80) are great, but they must be reproduced before being accepted. I'm all for gathering more data if there is evidence of breakage, but I think the natural burden of proof rests with those saying that things will break to demonstrate it. I will follow up and try to reproduce the concrete breakages reported by David and others in this thread, but I've seen scant evidence of breakage in general.

If the needs of the broader ecosystem are being considered beyond the needs of the user, my question is: can we have a clear enumeration of the goals? It's frustrating to have a patch sitting in limbo without clear criteria of what it would take to accept or reject it. It'd be great to understand the cookie clearinghouse in this context -- is it PURELY about maximizing privacy gains while minimizing user breakage from Mozilla's perspective? If so, what empirical evidence is there for its necessity? If it's about more than just user breakage, what are the metrics of success, and when can we declare victory and push forward cookie blocking to improve user privacy?
See comment #75 and don't comment here.
(In reply to :Gavin Sharp (use gavin@gavinsharp.com for email) from comment #95)
> We haven't shipped this on beta, and even beta doesn't always get us enough
> coverage for large web-impacting changes. I don't think you can draw that
> conclusion.

Let's set aside the adequacy of current non-breakage data for the moment. It's a conversation I'm glad to have—but maybe we need not agree on it. Would you support moving the default cookie policy into Beta to get more data? I would note that the upcoming Telemetry on cookie blocking does not appear to address breakage.

> See also: https://brendaneich.com/2013/06/the-cookie-clearinghouse/

I'm a member of the Cookie Clearinghouse Advisory Board. It's an exciting initiative! I don't, however, see why this patch should wait even longer while that project spins up. We shouldn't make the perfect the enemy of the good. Moreover, in effect, shipping this patch as-is would be functionally identical to supporting Cookie Clearinghouse now, in that the lists are presently blank.
(In reply to Masatoshi Kimura [:emk] from comment #97)
> See comment #75 and don't comment here.

I think Dan is asking exactly the right engineering questions about this patch.

1) What are the release criteria? They were previously a) improved conformance to user privacy expectations and b) non-breakage. We have substantial data supporting both points. Is that data insufficient to advance to Beta? If so, what more data do we need? Or have the release criteria changed? If so, what are the new criteria, and why the change?

2) Why do domain name false positives matter if a) the third-party origin doesn't use cookies, or b) blocking cookies from the third-party origin doesn't cause breakage? Put differently: we seem to be conflating two different senses of false positives and false negatives. The Facebook example is illustrative. When visiting Facebook, content is served primarily from facebook.com, but also from fbcdn.net. The new policy blocks cookies from fbcdn.net when using Facebook. In the domain name sense, fbcdn.net is a false positive—we're labeling it as distinct from facebook.com, but they're actually the same company. In the feature design sense, however, fbcdn.net is *not* a false positive: Facebook works just fine without fbcdn.net cookies. Why is it a problem to have domain name false positives of this sort?
(In reply to Jonathan Mayer from comment #98)
> Let's set aside the adequacy of current non-breakage data for the moment.
> It's a conversation I'm glad to have—but maybe we need not agree on it.
> Would you support moving the default cookie policy into Beta to get more
> data?

In general, yes. But it's not just my decision to make, and there's a complicated set of tradeoffs; such a decision would need consensus from a broader group. It's probably best discussed in a newsgroup/mailing list rather than in this bug.
(In reply to :Gavin Sharp (use gavin@gavinsharp.com for email) from comment #100)
> In general, yes. But it's not just my decision to make, and there's a
> complicated set of tradeoffs; such a decision would need consensus from a
> broader group. It's probably best discussed in a newsgroup/mailing list
> rather than in this bug.

There's already been lengthy conversation in Bugzilla, mozilla.dev.privacy, and several other fora. I thought we were on the same page with release criteria: http://webpolicy.org/2013/05/21/next-steps-for-the-firefox-cookie-policy/

Let me try a different approach. Who is holding up this patch and why?
I don't have any extra insight into the state of things than what's visible in the relevant bugs. Bug 851606 comment 10 suggests that we're waiting for telemetry (bug 837326) to go to beta. Bug 837326 is actively progressing and should be on trunk soon. Bug 851606 is tracking-firefox23+ and we'll need to make a decision by Monday as to how to proceed.

Members of release drivers (https://mail.mozilla.org/listinfo/release-drivers) and ultimately Brendan Eich are the decision makers, if you want to reach out for further clarification.
(In reply to :Gavin Sharp (use gavin@gavinsharp.com for email) from comment #102)
> I don't have any extra insight into the state of things than what's visible
> in the relevant bugs. Bug 851606 comment 10 suggests that we're waiting for
> telemetry (bug 837326) to go to beta. Bug 837326 is actively progressing and
> should be on trunk soon. Bug 851606 is tracking-firefox23+ and we'll need to
> make a decision by Monday as to how to proceed.

As I explained in my writeup, I was comfortable with a brief delay to validate that we had met our release conditions. I think we've amply done that. Bug 837326 is largely duplicative of (and much less precise than) what we've already measured on the efficacy release condition with crawl and Safari data. Moreover, bug 837326 doesn't address the breakage release condition. I'm all in favor of gathering additional data, but this Telemetry work hardly seems to justify withholding the default from Beta.

> Members of release drivers
> (https://mail.mozilla.org/listinfo/release-drivers) and ultimately Brendan
> Eich are the decision makers, if you want reach out for further
> clarification.

I've pinged the intersection of the release-drivers list and the bug CC list with a pointer. I hope we can get some clarity on why this patch is trapped in limbo. It sure seems odd that a much-demanded feature is getting held up, and nobody can quite explain by whom or why.
Mozilla folks, how much time is left to make a decision on 23 Beta? It sure would be a shame to kick the can down the road. Thanks.
See https://brendaneich.com/2013/06/the-cookie-clearinghouse/.

We are not delaying only for missing telemetry -- we know that naive visited-based cookie blocking has false results (in particular empowering first parties), and we want the CCH set up and integrated into Firefox nightly and aurora before shipping in beta or final.

/be
Brendan,

I'm a little confused. What "false results" do you have in mind? Our data supports the release criteria: there's a substantial privacy gain and almost no breakage. We know that there's room for privacy improvement (i.e. underblocking), but why should that delay release?

Also, I'm a newbie around here and unfamiliar with Mozilla's decision making process. Who are the "we" that are deciding to delay?

Thanks!
Jonathan
Jonathan: you and I have discussed in person FB social widgets broken by false positives, and first parties empowered by false negatives. My blog posts talk about these in plain language.

It won't do to feign ignorance based on limited bugzilla-reported problems with the patch. The false results are real problems as you and I have discussed in person; they led to us then discussing exception mechanisms, where you proposed UI to put the user in the loop (I have a whiteboard picture from this meeting).

As for decision-making, Mozilla delegates to module owners in a pretty-flat structure, per http://www.mozilla.org/about/owners.html, with escalation up to me on technical disputes and Mitchell on process/governance. Mitchell and I are at least two of the people in "we" but I believe Sid and Monica are too.

/be
Brendan,

If I understand correctly, you have two objections to landing this patch.

First, that it could break Facebook social widgets. But it doesn't! I don't follow what the concern is.

Second, that this patch could (even further?) tip the balance of advertising economics towards certain websites. Setting aside the merits of the concern for the moment—I'd be glad to discuss later—when did advertising economics become a component of the release criteria? And why should Mozilla prize speculation about advertising economics over long-demanded user privacy?

As to the decision making process—I'm earnestly trying to understand what's going on. I have no particular insight into what you, Mitchell, Sid, Monica, or anyone else is thinking.[1] I imagine that the broader Mozilla community has even less understanding of where this important issue stands. The Mozilla mission statement calls for "[t]ransparent community-based processes" that "promote participation, accountability, and trust." That's all I ask.

Best,
Jonathan

[1] We, for example, have only met about this patch twice, most recently a month and a half ago. Both occasions were down-to-the-release-deadline and inconclusive. I arranged a subsequent followup in Mountain View... and you no-showed. By contrast, if I recall correctly, you indicated you took nearly a week in New York to hear from advertising companies prior to deciding to hold in Aurora.
(In reply to Jonathan Mayer from comment #108)
> First, that it could break Facebook social widgets. But it doesn't! I don't
> follow what the concern is.

That's great to hear.

However the false positive problem remains, and people do turn off the default cookie policy in Safari -- Apple's user help docs tell them to! -- when they run into sites that it breaks.

> Second, that this patch could (even further?) tip the balance of advertising
> economics towards certain websites. Setting aside the merits of the concern
> for the moment—I'd be glad to discuss later—when did advertising economics
> become a component of the release criteria?

This isn't just about "economics", as if that was compartmentalized and isolated from privacy effects, including privacy degradation by further-empowered first parties.

This is about the patch having flaws due to a simplistic model of "visited = trusted".

That Apple users in significant numbers disable the default is further evidence that the patch isn't good enough.

> And why should Mozilla prize
> speculation about advertising economics over long-demanded user privacy?

Who says the patch improves user privacy in the long run? You are speculating that way, but no one knows.

Everything we do has effects in the future that we can only partially foresee. This cuts all ways and doesn't favor a simplistic approach with known flaws. We will attempt to fix the known flaws first.

Users who want really hardcore privacy have add-ons, but those users are few. The paternalistic idea that we know best -- and that the Safari approach is best -- won't fly with hardcore privacy folks anyway. It's not enough, in part due to the false positives and negatives.

> As to the decision making process—I'm earnestly trying to understand what's
> going on. I have no particular insight into what you, Mitchell, Sid, Monica,
> or anyone else is thinking.[1]

We talked plainly about the need for exception management. Want me to post the whiteboard picture as a reminder?

We talk to ad-tech companies among others, but when I was in New York we also spoke with publishers, one of whom wanted us to roll the dice and push the simplistic patch.

Again, we talk to all affected parties, not just to you. No one has undue influence with us. As evidence of this, behold how both you and the ad-tech folks (IAB) are mad at us right now.

/be
Hi,
This is getting very interesting.

Jonathan, 
Of course the Mozilla organization has a responsibility towards the ecosystem that makes the web exist today and tomorrow, and not only towards web browser users.
Users also understand that what they get for free today relies on their acceptance of certain trade-offs. It's a question of balance. Like the NSA accessing Facebook's data: there is an opportunity for security and a risk of abuse.

Publishers, the IAB, and also web analysts are concerned by this patch, which proposes to mimic an arbitrary default behavior of Safari.

What's needed is a long-term vision of what kind of data sharing is acceptable for end users and publishers towards advertisers (as providers of value-added services and revenue), for a sustainable ecosystem where Google isn't the sole actor and where new services can emerge while adhering to common rules.

For example, cookie syncing across content providers, ad providers, and search providers looks to me like a real concern, since every aspect of life (what you read, watch, where you are, what you buy, what you like, what you search for...) gets collected together.

Every single publisher today offers its audience to dozens of different ad networks collecting data on that audience without any control over what's being collected. It's probably another valid concern.

Rules of acceptable behavior need to be defined, agreed to by all parties, and enforced by authorities.
A web browser doesn't rule the internet; it just influences how the internet is used and experienced. It's part of the ecosystem.

That's probably close to what Brendan says.
> That's probably close to what Brendan says.

Yes. Thanks.

/be
Brendan,

This patch assumes that visiting a website is a signal of trust. I agree, it is naive. Users frequently visit untrusted websites. The status quo, however, is far more naive: Merely existing on the web is a signal of trust. Merely existing on the web is sufficient to set cookies and track a user. If the measure of success is conformance to user privacy preferences, this patch is a vast improvement over the status quo.

The overwhelming majority of Firefox users are opposed to third-party web tracking. I understand that you have concerns about introducing structural risks into the advertising-supported web ecosystem. I disagree. But our views are irrelevant: who are we to paternalistically second-guess user privacy preferences? The Mozilla community has spoken. Our responsibility is to listen. That's what it means to put users first.

Jonathan
The status quo depends on both Firefox and websites. If Mozilla changes the implementation, they may change their websites. That's why it's hard to predict the future. When Safari made this change, most websites didn't change theirs. But what about Firefox + Safari together? All we know is that they do want to track users.

One of the worst results of this patch, I assume, would be that every website begins redirecting through a brief, intermediate advertising page simply so as to set cookies. That's a waste of internet resources and will make nobody (web developers, UA developers, or users) happier, but I guess they will do it anyway, because they strongly want to track users. I'm quite sure they will never give up.

By the way, in my opinion, being a paternalist is not bad. Some users don't know what a "cookie" is, and I don't think all of them should have to learn.
I admit I did not read this bug in full yet, but what happens if a user often cleans up temporary files with a cleaner utility like CCleaner?

I assume *all* pages are in the state of having never been visited afterwards?

Won't this then basically prevent *all* pages from saving cookies on first visit and potentially produce massive issues?
Given that "previously visited" is determined from existing cookies for a given website, clearing the cookies for a domain should also reset the "visited" status.

Your last assumption doesn't seem to apply, though, as this restriction doesn't affect /direct/ (intended) visits to a website, which would still be allowed to set cookies; it only affects that site's inclusion as a third party on /other/ websites you visit.
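To make that concrete, here's a rough sketch of the decision logic in JavaScript pseudocode -- illustrative only, not the actual C++ in the cookie service, and the function names are made up:

  // "Visited" is approximated by "already has cookies", as explained above.
  function shouldAcceptCookie(host, isThirdParty, countCookiesFromHost) {
    if (!isThirdParty) {
      return true;                         // direct (intended) visits are unaffected
    }
    // Third-party context: accept only if the host already has cookies,
    // i.e. it was previously visited or previously allowed to set cookies.
    // Clearing cookies for the host therefore also resets its "visited" status.
    return countCookiesFromHost(host) > 0;
  }

  // A never-before-seen ad domain embedded on a page you visit:
  shouldAcceptCookie("ads.example", true,  function () { return 0; }); // false -> rejected
  // The same domain visited directly:
  shouldAcceptCookie("ads.example", false, function () { return 0; }); // true  -> accepted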
This patch needs an affirmative strategic decision. It's been in the works for over nine months. The code has been final and the bug has been RESOLVED FIXED for over six months. We initially intended to roll out with the Firefox 22 release cycle; Firefox 24 is around the corner. The status quo is an indefinite limbo that is a disservice to users and web developers.

The options, as I see it, are:

1) Land the patch as-is.
2) Yank the patch.
3) Wait on the Cookie Clearinghouse.

#3 doesn't make sense as a matter of product design (see http://webpolicy.org/2013/05/21/next-steps-for-the-firefox-cookie-policy/). Mozilla hasn't hesitated to iteratively improve on other features, and evidence of breakage is almost nil. Privacy is getting singled out for special treatment.

Setting aside those previous concerns, there's a new consideration. The Cookie Clearinghouse has proven slow-moving. I mean that solely descriptively, not as a criticism—there are myriad reasons for the snail's pace. If we wait on the Clearinghouse we are, most likely, delaying this feature by at least another six months. If that's the strategic decision, so be it. But we shouldn't make this a multiyear effort by default, and if we delay so egregiously, we certainly owe Firefox users an explanation.

Jonathan
I think option #3 makes the most sense from what I've seen.  The patch causes plenty of sites not to work.  Breaking the internet is not iterative improvement.  AFAICT the cookie clearinghouse idea *is* the iterative improvement here.

I agree with you that we should resolve this one way or another soon.
(In reply to Jason Duell (:jduell) from comment #117)
> I think option #3 makes the most sense from what I've seen.  The patch
> causes plenty of sites not to work.  Breaking the internet is not iterative
> improvement.  AFAICT the cookie clearinghouse idea *is* the iterative
> improvement here.

I completely agree: if this patch breaks the web experience for Firefox users, it shouldn't go to Release. Last I checked, though, there were no more than a handful of reported issues. Do you have some data that suggests otherwise?

> I agree with you that we should resolve this one way or another soon.

What's the best way to do that?
I just got my hands on some user study data that I can use to estimate the number of third party cookies that would be affected by this change. Analyzing this data and writing up the results is my highest priority. I should have something for internal review by Monday and hopefully be able to release it shortly after that.
(In reply to Monica Chew [:mmc] (please use needinfo) from comment #119)
> I just got my hands on some user study data that I can use to estimate the
> number of third party cookies that would be affected by this change.
> Analyzing this data and writing up the results is my highest priority. I
> should have something for internal review by Monday and hopefully be able to
> release it shortly after that.

Awesome. Does the data address breakage?
Yes, the data includes existing cookie count from host. I need to do my due diligence getting it vetted to make sure that the aggregated data is ok to release from a user data point of view, even though this was a paid study with explicit study consent agreements.
(In reply to Monica Chew [:mmc] (please use needinfo) from comment #121)
> Yes, the data includes existing cookie count from host. I need to do my due
> diligence getting it vetted to make sure that the aggregated data is ok to
> release from a user data point of view, even though this was a paid study
> with explicit study consent agreements.

I'm not following... how would that data address breakage?
I think I might have misunderstood your question. From the data we can find the number of third party cookies per domain that are currently accepted, that would be rejected under this patch, based on the number of cookies that already exist for that domain. We can also examine which domains are affected most.
(In reply to Monica Chew [:mmc] (please use needinfo) from comment #123)
> I think I might have misunderstood your question. From the data we can find
> the number of third party cookies per domain that are currently accepted,
> that would be rejected under this patch, based on the number of cookies that
> already exist for that domain. We can also examine which domains are
> affected most.

Sounds super-helpful. Looking forward to the results. To measure breakage, though, we'd have to also have a means of determining cookies that are necessary to the web experience.

So, Jason, what's got you thinking breakage is so excessive that the patch can't go to Release?
(In reply to Jonathan Mayer from comment #124)
> (In reply to Monica Chew [:mmc] (please use needinfo) from comment #123)
> > I think I might have misunderstood your question. From the data we can find
> > the number of third party cookies per domain that are currently accepted,
> > that would be rejected under this patch, based on the number of cookies that
> > already exist for that domain. We can also examine which domains are
> > affected most.
> 
> Sounds super-helpful. Looking forward to the results. To measure breakage,
> though, we'd have to also have a means of determining cookies that are
> necessary to the web experience.

As far as I know, that's an unsolved problem. We could extend the cookie RFC to include an isNecessary bit (kidding).

The other alternative is user feedback. Because of the nature of this kind of blocking, it is very hard for non-technical people to discover the root cause of breakage. See threads on Click-to-Play discoverability on firefox-dev for example (different technical problem and solution, similar user experience issues). Same thing with Mixed Content Blocker.

Cookie blocking is far behind on both of these other features in terms of discoverability and user experience.
Personally, I hit bug 860120, and I also ran into Mozilla's Persona login being busted for me 

   http://www.mozilla.org/en-US/persona/

At that point I confess I got lazy and just re-enabled 3rd party cookies.

I seem to remember a number of other bugs going by about 3rd party bustage.  I don't have the bug numbers handy though--does anyone else?

Also comment 109 makes it sound like there's a significant percentage of Safari users that are enabling 3rd party cookies.  If true that's more likely a result of bustage workaround than philosophical reasons (most users don't even understand what 3rd party cookies are).

I'm not going to pretend this is a
Whoops :)

I'm not going to pretend this is a precise set of metrics--more data would be great.  But without good data I'm hesitant to land this code--I've gotten tired of breaking the web for fixes that seemed "safe enough" without data to back that up.
Maybe a silly question, but why don't you just throw away third party cookies at the end of a session? That way you won't break sessions or logins or other site dependencies while actively browsing, but you're still preventing tracking cross-session with permanent profiling data (which would be the biggest privacy concern for users).
(In reply to Mark Straver from comment #128)
So, simply change this to "quickly discard" cookies from sites users haven't visited, instead of blocking them? That actually makes a lot of sense, though LSOs and other junk would allow some things to work their way around it. If it reduced breakage, and those workarounds themselves (the so-called "super cookie" routes) were addressed, then it might be a good idea to investigate.

(In reply to Jason Duell (:jduell) from comment #117)
> Breaking the internet is not iterative improvement.

Sometimes it is. You can't keep backwards compatibility with everything forever and expect to continue to make improvements. Sometimes the part of the Internet that would break probably needs to be broken. We just need to find the best way to make that transition.
(In reply to Jason Duell (:jduell) from comment #126)
> Personally, I hit bug 860120, and I also ran into Mozilla's Persona login
> being busted for me 
> 
>    http://www.mozilla.org/en-US/persona/
> 
> At that point I confess I got lazy and just re-enabled 3rd party cookies.

I just tested Persona with the new cookie policy. It works fine. In fact, the Persona FAQ tells iPhone Safari users to choose the "From Visited" policy: https://support.mozilla.org/en-US/kb/how-enable-cookies-iphone

> I seem to remember a number of other bugs going by about 3rd party bustage. 
> I don't have the bug numbers handy though--does anyone else?

Here's what I have:

-Bug 849948 (Day One Center - scheduling)
-Bug 818340 Comment 80 (Western Federal Credit Union - ?)
-Bug 860120 (Boeing Employees Credit Union - document signing)

Seems quite short, especially since there's been over half a year to report compatibility issues.

> Also comment 109 makes it sound like there's a significant percentage of
> Safari users that are enabling 3rd party cookies.  If true that's more
> likely a result of bustage workaround than philosophical reasons (most users
> don't even understand what 3rd party cookies are).

When I last measured Safari browsers, about 80-90% had third-party cookie blocking enabled. I agree that those likely reflect workarounds. I wouldn't think that proportion was sufficiently significant to hold this patch, though. There's also going to be some amount of workaround disabling with CCH too, owing to breakage that doesn't make the list, delays populating the list, and off-list intranet sites.

> I'm not going to pretend this is a

(In reply to Jason Duell (:jduell) from comment #127)
> Whoops :)
> 
> I'm not going to pretend this is a precise set of metrics--more data would
> be great.  But without good data I'm hesitant to land this code--I've gotten
> tired of breaking the web for fixes that seemed "safe enough" without data
> to back that up.

The good news is, we do have data to back up our assumptions. The bug reports are very promising. Safari sure seems to work fine. And what's more, if we migrate the patch to Beta, we'll have even more data before going to Release. As Monica noted, user feedback is the way to evaluate the feature.

Mozilla needs to make a decision. This patch is not perfect. There is uncertainty. But I think we're close enough, and know enough, to responsibly advance to Beta. If it turns out that breakage is unacceptable, we can then decide whether to wait for half a year on the Cookie Clearinghouse.
(In reply to Dave Garrett from comment #129)
> So, simply change this to "quickly discard" cookies from sites users haven't
> visited instead of block? That actually makes a lot of sense, though
> exploiting LSOs and other junk would allow some things to exploit their way
> around it. If it reduced breakage and those workarounds themselves are
> addressed (the so called "super cookie" routes) then that might be a good
> idea to investigate.

If I recall correctly, sessionifying third-party cookies is presently broken. I agree that the approach is certainly worth investigating if it would allay bustage concerns. At present, though, I'm not touching a line of code until there's unambiguous Mozilla buy-in. This patch was enough of a bait and switch.
(In reply to Dave Garrett from comment #129)
> (In reply to Jason Duell (:jduell) from comment #117)
> > Breaking the internet is not iterative improvement.
> 
> Sometimes it is. You can't keep backwards compatibility with everything
> forever and expect to continue to make improvements.
That's not true. I encourage you to read https://github.com/DavidBruant/ECMAScript-regrets#foreword especially the "...but there is a way forward" section. Unfortunately, it doesn't apply to cookies.

> Sometimes the part of the Internet that would break probably needs to be broken. We just need to
> find the best way to make that transition.
History shows that this has only been possible for corner cases so far. I wish it was otherwise too. There are plenty of things we wish were fixed (typeof null shouldn't be "object" for instance) but that won't happen.
Another release date is nearly upon us. Is Monica's new study sufficient to unstick the patch?

For those waiting on Cookie Clearinghouse, FYI, the team hasn't even talked for a month and a half.
My interpretation of the data from http://monica-at-mozilla.blogspot.com/2013/10/cookie-counting.html is that this patch does not make a difference, either way -- it affects 9% of third-party cookies, and many of the organizations using third-party cookies have workarounds that allow them to set third-party cookies in a first-party context. It seems clear that we need to do something and equally clear that this patch is not the answer. I think this patch was useful in understanding challenges in doing privacy-related work, but has reached the end of its usefulness.

Beyond the scope of this bug, it doesn't seem possible to do experimentation in this area without a coherent strategy. Small technical changes incur a lot of non-engineering overhead resulting in stop energy. We need to be smarter about how we approach the problem, not just throw together a bunch of piecemeal changes.
I'm confused, Monica. The data appears to confirm our prior assumptions:

1) Third parties track *lots* of users. (See bug 818337.)

2) For the new policy to be effective, something will have to be done about old cookies. This patch would become most effective once users clear their cookies. (See http://webpolicy.org/2013/02/22/the-new-firefox-cookie-policy/.) Prior measurements suggest a substantial share of users clear their cookies over a multi-month period. (See, e.g., http://www.comscore.com/layout/set/popup/content/download/1445/16059/file/Cookie_deletion_white_paper.pdf and http://atlassolutions.com/wwdocs/user/atlassolutions/en-us/insights/AIDMIOnCookieDeletion.pdf.) Mozilla could, of course, conduct its own study of cookie clearing. Y'all could also add some logic to account for old third-party cookies during the transition to the new policy. (E.g. bug 844623.)

3) The new policy underblocks first-parties-as-third-parties and temporary redirects through third parties. (See http://webpolicy.org/2013/05/21/next-steps-for-the-firefox-cookie-policy/.) In the longer term, Cookie Clearinghouse provides a path forward on these issues.

The results suggest that the patch is on the right track, not that it "does not make a difference."
Hi Monica,

Your study says "Set-Cookie headers are not the only method for setting cookies, but they are sufficiently prevalent to be representative" but this is inaccurate.

You did not measure JavaScript writes to document.cookie, so you must have missed the majority of cookies. For example, the google-analytics script creates __utmX cookies (one with a 2-year expiry) via writes to document.cookie, and it is on over 70% of the top 100,000 websites, making these the most ubiquitous cookies out there -- and your study missed them. These (JavaScript-generated) cookies can be first- or third-party (they can be created by script inside third-party frames).

The only way that organisations relying on third-party cookies have for co-opting first-party cookies is to get the first-party site to install a script library that writes to document.cookie, and then communicate the value in URL query parameters or postMessage events. That puts some organisational road blocks in the way, which helps to reduce its prevalence.
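
As a concrete (hypothetical) illustration of that co-option pattern -- a script the first-party site installs writes a first-party cookie via document.cookie, then reports the identifier to a foreign domain in a URL query parameter:

  // Hypothetical analytics-style snippet included by the first-party page itself.
  (function () {
    var match = document.cookie.match(/(?:^|; )visitor_id=([^;]*)/);
    var id = match ? match[1] : Math.random().toString(36).slice(2);
    if (!match) {
      // Long-lived first-party cookie written via document.cookie, so it never
      // appears in a Set-Cookie header.
      document.cookie = "visitor_id=" + id + "; max-age=" + 60 * 60 * 24 * 730;
    }
    // The identifier (plus the containing host) leaves the first-party context
    // as query parameters on a request to a foreign domain.
    new Image().src = "https://stats.example/collect?site=" +
                      encodeURIComponent(location.hostname) + "&uid=" + id;
  })();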

You are correct that bug 818340 does not handle this third-party co-option of first-party cookies, but it does give users a way to block many third-party cookies, whether set via document.cookie writes or via HTTP response headers.  If your study has not uncovered a downside from doing that, there is no reason why it should not be deployed. Even in the unlikely event a problem (with the default) did appear, it could be alleviated by only enabling the block when the DNT signal was set.

The first-party co-option case could then be addressed in a follow on.


Mike
(In reply to Jonathan Mayer from comment #135)
> I'm confused, Monica. The data appears to confirm our prior assumptions:
>
> 1) Third parties track *lots* of users. (See bug 818337.)

No arguments here.

> 2) For the new policy to be effective, something will have to be done about
> old cookies. This patch would become most effective once users clear their
> cookies.
<snip>
> 3) The new policy underblocks first-parties-as-third-parties and temporary
> redirects through third parties. (See
> http://webpolicy.org/2013/05/21/next-steps-for-the-firefox-cookie-policy/.)
> In the longer term, Cookie Clearinghouse provides a path forward on these
> issues.

The problem with clearing all third party cookies, is that they do have

My point is, why is this the right approach when it is clear that third parties are managing to get in the location bar? Even with cookie clearing, over 25% of set-cookie attempts from adnxs.com came from users with adnxs.com in their history. Since the distribution of cookies is concentrated so heavily among relatively few domains, why isn't a better approach to simply disallow all cookies, regardless of party, for users who don't wish to be tracked by those domains?

This allows us to completely avoid false positives in comment 130, as well as the ones which I am sure would occur (with no UI to show for it) for less savvy users.
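A rough sketch of that per-domain alternative, using the permission manager from browser chrome (the domain is hypothetical, and this is not part of the patch under discussion):

  // Deny cookies outright for a specific tracking domain, regardless of whether
  // it appears in a first-party or third-party context.
  Components.utils.import("resource://gre/modules/Services.jsm");

  var uri = Services.io.newURI("http://tracker.example/", null, null);
  Services.perms.add(uri, "cookie",
                     Components.interfaces.nsIPermissionManager.DENY_ACTION);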
Sorry about the cut-and-paste error. I also wanted to say: I think that cookies are the wrong layer entirely for managing tracking. The engineering story for concentrating on cookies is very weak considering all of the ways websites can compute statistical identifiers. I think it would make much more sense to use an nsIContentPolicy-like approach.
Getting into the location bar is not that easy. If it were easy, there would not have been the outrage that was expressed. I do not think many people visit third-party OBA domains other than to get an opt-out cookie; did you check for that?

Statistical identifiers used by third parties must be calculated by JS in a third-party frame and are usually threaded back using a cookie in the return XHR. This fix would help to stop some fingerprinting like that. A further fix could also block XHRs from third-party frames.

You have to start somewhere and this is a good place.
(In reply to Monica Chew [:mmc] (please use needinfo) from comment #137)
. . .
> My point is, why is this the right approach when it is clear that third
> parties are managing to get in the location bar? Even with cookie clearing,
> over 25% of set-cookie attempts from adnxs.com came from users with
> adnxs.com in their history.

If I'm reading the results correctly, users rarely visit pure third-party (e.g. advertising) origins. This aligns with my previous results on Safari users.

It appears that AppNexus is an outlier. I don't see how that invalidates the approach.

> Since the distribution of cookies is
> concentrated so heavily among relatively few domains, why isn't a better
> approach to simply disallow all cookies, regardless of party, for users who
> don't wish to be tracked by those domains?
> 
> This allows us to completely avoid false positives in comment 130, as well
> as the ones which I am sure would occur (with no UI to show for it) for less
> savvy users.
. . .
> I think that cookies are the wrong layer entirely for managing tracking. The engineering
> story for concentrating on cookies is very weak considering all of ways
> websites can compute statistical identifiers. I think it would make much
> more sense to use an nsIContentPolicy-like approach.

If you'd like to design a new response to third-party tracking, fire away. A curated blacklist for HTTP requests would sure be effective.

In the interim, why not deliver a feature that provides some privacy? This patch has been percolating for nearly *a year.*
(In reply to Monica Chew [:mmc] (please use needinfo) from comment #138)
> The engineering
> story for concentrating on cookies is very weak considering all of ways
> websites can compute statistical identifiers.

This doesn't make any sense. Cookies are the primary mechanism pages were given for storing data, and the most straightforward and commonly used way of tracking people. Let's not worry about new, complicated tracking methods until we deal with the simple, frequently abused ones first.

The point here is just to make sure that cookies are only stored for domains the user has requested in some form, rather than letting sites track people using their own computers' cookies without even having a prior association with them beyond a script they didn't even notice. Fingerprinting is a whole other problem, to deal with another time.
(In reply to michael.oneill from comment #136)
> Hi Monica,
> 
> Your study says "Set-Cookie headers are not the only method for setting
> cookies, but they are sufficiently prevalent to be representative" but this
> is inaccurate.
> 
> You did not measure javascript writes to document.cookies so must have
> missed the majority of cookies. For example google-analytics script creates
> __utmX cookies (one with a 2 year expiry) via writes to document.cookies and
> is on over 70% of the top 100,000 websites making them the most ubiquitous
> cookies out there, and your study missed them. 

Google Analytics cookies are first-party cookies, and thus out of the scope of this discussion, accumulating data only within the scope of each site's domain.
I said JS-generated cookies are both first-party and third-party. GA cookies are often first-party in a narrow technical sense, but very often they are created by JS inside third-party frames, so they are then third-party (in the technical sense). In fact the bulk of them are delivered that way.

This feature addresses these cookies in the latter case.

Even when they are in the "location bar" first-party domain, the UID value encoded in these cookies is communicated, along with the containing host domain, in the query parameters of the URL of an external resource request (to the foreign domain google-analytics.com), so they are the "co-opted" cookies you referred to. They are being used to communicate a tracking event to a third-party domain other than the one in the location bar.

This feature does not handle the last case but a future one "built on its shoulders" might.
I forgot to point out that a domain-specific UID concatenated with a host domain results in a universally unique identifier.
(In reply to Jonathan Mayer from comment #98)
> > See also: https://brendaneich.com/2013/06/the-cookie-clearinghouse/
> 
> I'm a member of the Cookie Clearinghouse Advisory Board. It's an exciting
> initiative! I don't, however, see why this patch should wait even longer
> while that project spins up. We shouldn't make the perfect the enemy of the
> good. Moreover, in effect, shipping this patch as-is would be functionally
> identical to supporting Cookie Clearinghouse now, in that the lists are
> presently blank.

So, bug 885136 on the Cookie Clearinghouse has been won't-fixed for now, given that those efforts have apparently stalled. For the last six months now, bug 851606 has been in effect, disabling the feature in the releases by default while keeping it for the nightly and beta builds (having inconsistent behavior between testing and release builds is probably undesirable in itself).

Thus, isn't it time to revisit this and consider backing out mozilla-central changeset 076b8758ecb0?

This is a useful feature; it would be nice to have users benefit from it by default in the releases too. I'll be happy to file a new bug for the backout if you don't want to consider this discussion here, but it wouldn't be worth the effort if it's just going to get won't-fixed within 30 minutes with reference to the discussion here.
(In reply to rsx11m from comment #145)
> Thus, isn't it time to revisit this and consider backing out mozilla-central
> changeset 076b8758ecb0?

I would certainly support undoing bug 851606. About time.
Depends on: 999170
Ok, so let's continue the discussion in bug 999170 which I've just opened.
Depends on: 1026790
Keywords: relnote
Blocks: 1161646
See Also: → 854007
Flags: sec-review?(mgoodwin)