Open Bug 1907659 Opened 4 months ago Updated 1 month ago

Disable "ad measurement" by default

Categories

(Core :: DOM: Core & HTML, enhancement)

Firefox 128
enhancement


ASSIGNED

People

(Reporter: andi.m.mcclure, Assigned: a_bit_camera_shy)

References

(Blocks 1 open bug)

Details

Attachments

(2 files)

User Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:128.0) Gecko/20100101 Firefox/128.0

Steps to reproduce:

Today I learned that Firefox 128 has secretly added, and enabled by default, a feature named "privacy-preserving ad measurement" (https://support.mozilla.org/en-US/kb/privacy-preserving-attribution?as=u&utm_source=inproduct). I am familiar with this technology and I do not believe the "privacy-preserving" aspects of it are real. This is exactly the sort of pro-advertiser, anti-user feature I am currently moving from Google Chrome to Firefox to avoid.

I discovered this morning that this feature had been enabled without my consent in my personal copy of Firefox, and I feel violated. It is disabled now, but I have no way of knowing how much tracking Firefox performed and transmitted to ad firms before I got the chance to turn it off. This should never have happened.
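
For anyone else who wants to check their own install: the checkbox lives under Settings > Privacy & Security > Website Advertising Preferences ("Allow websites to perform privacy-preserving ad measurement"), and on my copy it appears to correspond to this about:config pref (pref name copied from my 128 install; verify on yours before relying on it):

    dom.private-attribution.submission.enabled    (set to false to opt out)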

Actual results:

I understand that technologies such as fingerprinting or cookies can be used to track me. But those features are not specifically designed for tracking in advertising; I accept those features because I get some benefit from them. My browser should not be voluntarily gathering and sending information to advertisers for the intended benefit of advertisers and at no benefit to me. That is not preserving my privacy. That is violating it.

I was previously upset to see this sort of feature added in my previous browser, Google Chrome. However, your behavior in this regard is actually less privacy-conscious and more user-hostile than Google Chrome's. Google Chrome has ad-snitching features similar to Firefox's (labeled in a similarly misleading manner; they call their ad reporting feature "Ad Privacy"), but it does not enable them without first giving the user a chance to turn them off. When Chrome is ready to turn on the ad-snitching features for a profile, it pops up a box and informs the user that it is turning on the tracking features. Chrome's box does not, in my opinion, adequately explain what is happening, but this is still a massive improvement over Firefox's behavior, because the change does not happen secretly and the user is given a chance to provide or revoke consent before the browser starts sending tracking data.

Expected results:

"Expected behavior": I do not believe you should have implemented this feature at all. However if you must implement it it needs to either be disabled-by-default, or otherwise obtain active user consent for the feature through a method such as, for example, a popup box before enabling like Chrome uses (although nobody will be happy to see such a nagware box— because nobody would want this feature if they knew it existed).

Notes: I have not marked this as a security bug because it concerns entirely public information and does not need to be hidden. However, I consider this a security bug, because (1) with the feature enabled, personal browsing information is leaked, despite your claims to be obfuscating it, and (2) by implementing anti-user, anti-privacy features in a software update and giving people no means of finding out other than word of mouth, you are introducing a disincentive to enable automatic software updates. For a web browser, damaging user trust in automatic software updates creates a very, very serious security issue. Incidentally, I am using the Ubuntu Snap version of Firefox on Ubuntu Linux, but I believe my configuration is not relevant to this issue.

The Firefox download page currently includes:

Get the browser that protects what’s important
No shady privacy policies or back doors for advertisers. Just a lightning fast browser that doesn’t sell you out.

... and I don't understand how the state of this feature doesn't make those statements false.

Attached image Firefox.png

The Bugbug bot thinks this bug should belong to the 'Core::Widget: Gtk' component, and is moving the bug to that component. Please correct in case you think the bot is wrong.

Component: Untriaged → Widget: Gtk
Product: Firefox → Core
Component: Widget: Gtk → Privacy: Anti-Tracking
Component: Privacy: Anti-Tracking → DOM: Core & HTML
See Also: → 1907678

Fwiw the feature is very intentionally an experiment, see the explainer in https://github.com/mozilla/explainers/tree/main/ppa-experiment#opt-out, gated by origin trials. Only some MDN ads are part of the experiment, and it's not accessible to the wider web other than that.
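
For concreteness, the surface sites get during the experiment looks roughly like this, paraphrasing the explainer linked above (option names and values are copied from the explainer's examples, not from a finished spec, so treat them as illustrative):

    // Publisher page records that an ad was shown (per the explainer).
    navigator.privateAttribution.saveImpression({
      type: "view",                 // "view" or "click"
      index: 3,                     // histogram bucket for this impression
      ad: "sample-ad-id",           // opaque identifier for the ad
      target: "advertiser.example", // site where a conversion may happen
    });

    // Advertiser page later requests an aggregate-only conversion report.
    navigator.privateAttribution.measureConversion({
      task: "example-dap-task-id",    // DAP task that receives the report
      histogramSize: 20,              // number of buckets
      lookbackDays: 30,               // only match recent impressions
      impression: "view",             // impression type to match
      ads: ["sample-ad-id"],          // only these ads
      sources: ["publisher.example"], // only these publisher sites
    });

As I understand the explainer, nothing is returned to the page; the report is encrypted and submitted to a DAP task for noisy aggregation.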

I do not like being experimented on. I do not like it when Google does it, I do not like it when Facebook and Twitter do it, and I definitely don't like it when Firefox, which makes specific promises those other companies don't about protecting the user, does it.

Anyway, Firefox appears to have a specific checkbox for "sure, go ahead and experiment on me": the "studies" feature. I actually do have that feature enabled (because up to this point Firefox seems to have used it responsibly), but I do not find this ad-measurement experiment listed in the running studies, and I have yet to find a desktop 128 user who did not have "Ad measurement" turned on by default. So if this is an experiment, it appears even people who have explicitly opted out of being experimented on (to the extent that is what unchecking "studies" means) are being experimented on here.

Also, as far as the justification for default-on/opt-out in the anchor at the GitHub link above goes: this is just facially ludicrous. You need to violate the privacy of as many people as possible so that it will be harder to pick out one single person in the crowd of people who had their privacy violated? What? I cannot take this seriously.

If anything, the GitHub PPA document there seems to be admitting the possible existence of a statistical attack by which the behavior of one single user within the DAP aggregate can be reconstructed. Let's take the argument made in the document seriously: that there is some critical number of users for the feature, and above that number the feature is safe, but below it the feature is unsafe and users can be deanonymized to some degree. If that's true, how can you be sure you got it right? What if it turns out the critical number for safety is larger than the number of people who use Firefox, period? What if the product ships opt-out, but people like me run public education campaigns and so many Firefox users opt out that the population using the feature falls below the critical number? If the feature is too dangerous to be opt-in, then the feature is too dangerous to exist.
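
To make the mechanics concrete, here is a toy sketch of the kind of noisy aggregate being argued about, assuming a simple Laplace mechanism (my own illustration; the real pipeline uses DAP/Prio and is more involved). The noise added to hide any one user is a fixed absolute size, so how comfortably one person disappears into the report depends entirely on how many other people are in it:

    // Toy sketch of a noise-protected conversion count (my illustration,
    // not Mozilla's actual DAP/Prio pipeline).
    function sampleLaplace(scale) {
      // Inverse-CDF sampling of the Laplace distribution.
      const u = Math.random() - 0.5;
      return -scale * Math.sign(u) * Math.log(1 - 2 * Math.abs(u));
    }

    function noisyConversionRate(conversions, epsilon) {
      // True count plus noise calibrated to sensitivity 1: any single
      // user can change the sum by at most 1.
      const count = conversions.filter(Boolean).length;
      return (count + sampleLaplace(1 / epsilon)) / conversions.length;
    }

    // One user is 2% of a 50-person cohort but 0.001% of a 100,000-person
    // one; the added noise is the same absolute size in both cases.
    for (const n of [50, 1000, 100000]) {
      const cohort = Array.from({ length: n }, () => Math.random() < 0.05);
      console.log(`n=${n}: noisy rate=${noisyConversionRate(cohort, 1.0).toFixed(5)}, one user's share=${(1 / n).toFixed(5)}`);
    }

If the safety of that arrangement depends on n staying large, then the safety is a property of the user population, not of the mechanism, which is exactly my objection.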

If this "feature" is really an experiment then it should at least respect the existing setting that enables or disables Studies. I had Studies disabled and I had to opt-out of this, I see this issue as a lack of respect towards users preferences so I think it should be reverted in the next update if Mozilla still wants to develop a browser that really respects the privacy and preferences of it's users.

If Mozilla still wants to leave this setting as-is, there should at least be a notice on launch that explains the feature. I'm also wondering whether the absence of such a notice contradicts European regulation, but don't take my word on it; I'm not a lawyer.

Duplicate of this bug: 1907763

Whether this feature is an "experiment" or not, the idea behind it is fundamentally flawed. Building "less abusive" tools for ad network use will not help browser users, as we have an overwhelming amount of historical evidence that the ad networks will still use every single surveillance tool that exists, no matter how invasive it may look. Instead of trying to meet them halfway and then being surprised that they still want more, we should realize that they are the adversary and treat them as such.

Type: defect → enhancement
Duplicate of this bug: 1907791

(In reply to JORGETECH from comment #7)

If this "feature" is really an experiment then it should at least respect the existing setting that enables or disables Studies. I had Studies disabled and I had to opt-out of this, I see this issue as a lack of respect towards users preferences so I think it should be reverted in the next update if Mozilla still wants to develop a browser that really respects the privacy and preferences of it's users.

> If Mozilla still wants to leave this setting as-is, there should at least be a notice on launch that explains the feature. I'm also wondering whether the absence of such a notice contradicts European regulation, but don't take my word on it; I'm not a lawyer.

I was quite surprised to find myself being included in this experiment after having opted out of all studies. Ironically, the settings to opt out of experiments and studies are placed right above the section where this experiment is enabled by default. It's like reading comprehension isn't a thing.

(In reality, it's pretty obvious that Mozilla only respects user privacy whenever it isn't an inconvenience to them; see Pocket, Looking Glass, etc.)

No longer duplicate of this bug: 1907791
Assignee: nobody → zuqy2h2ctx8d
Status: UNCONFIRMED → ASSIGNED
Ever confirmed: true
Assignee: zuqy2h2ctx8d → nobody
Status: ASSIGNED → UNCONFIRMED
Ever confirmed: false
Assignee: nobody → zuqy2h2ctx8d
Status: UNCONFIRMED → ASSIGNED
Ever confirmed: true
Duplicate of this bug: 1908975

It's still relevant today, it seems to me?

Please don't set these flags for this kind of issue.

