Closed Bug 1308340 Opened 8 years ago Closed 6 years ago

checkbox in about:preferences#privacy for privacy.resistFingerprinting (Tor 20244.1)

Categories

(Firefox :: Settings UI, defect)


Tracking


RESOLVED WONTFIX

People

(Reporter: arthur, Assigned: arthur)

References

(Blocks 1 open bug)

Details

(Keywords: stale-bug, Whiteboard: [tor][fingerprinting][fp-backlog][fp-triaged])

Attachments

(1 file)

For Tor Browser, we have created a UI patch to add a checkbox that exposes the "privacy.resistFingerprinting" pref in about:preferences#privacy. When this pref is enabled, it causes a number of system properties to be hidden. For specifics, see
https://dxr.mozilla.org/mozilla-central/search?q=regexp%3A%5BrR%5DesistFingerprinting&redirect=false
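
(As a rough sketch of what the proposed checkbox would control under the hood - chrome-privileged JavaScript; the helper name is illustrative and not taken from the attached patch:)

  // Chrome-privileged (browser) code, not web content.
  // The proposed checkbox is just UI over this boolean pref.
  Components.utils.import("resource://gre/modules/Services.jsm");

  const PREF = "privacy.resistFingerprinting";

  function isResistFingerprintingEnabled() {
    // The pref defaults to false; guard in case it is absent entirely.
    try {
      return Services.prefs.getBoolPref(PREF);
    } catch (e) {
      return false;
    }
  }

  // Checking the box would amount to:
  Services.prefs.setBoolPref(PREF, true);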

We would like to propose uplifting the UI patch to standard Firefox.
Here's a patch adapted from the one for Tor Browser.
Attachment #8798671 - Flags: review?(dolske)
I'd suggest renaming this item in the UI to:

"Expose user preferences from the operating system to web content" and make it opt-out (for Firefox).
From a purely front-end perspective, I don't have any significant issue with the patch here (aside from possible wordsmithing on the string :).

But I'd have questions about this from more of a feature/product perspective...

If we offer fingerprinting protection, why wouldn't we turn it on by default?

Is it complete enough to offer as a supported/exposed pref? (i.e., there's a lot of potential fingerprinting surface; if people file bugs saying it's not blocking [insert possibly long list here?], are we planning on fixing those, or is there a risk of this being perceived as an impotent feature?)

I assume it's possible that the various things this disables in DOM have the potential to break web sites. (And I mean those that are using things for legitimate reasons, not just for tracking ;). There's no notification UI (as there is for tracking protection) -- how much breakage is expected, and how severe? Is there data on impact?

overholt: Do you have an opinion here (or know who would, from a DOM perspective)?
Flags: needinfo?(overholt)
> I assume it's possible that the various things this disables in DOM have the potential to break web sites.

I can only speak to how I plan to use it in the context of bug 1308329 - the set of options that we'd like to expose to web content (either explicitly or implicitly - both ways allow for fingerprinting) is not going to break sites, and as a result the fact of blocking will not be detectable.

But I would argue that when the user launches a web app they expect the app experience to match the native app experience. If we make this option opt-in, then we basically lose its value, because it's not significant enough to make any user go and turn it on, the app will not want to bother the user by requesting permission for it, and the web experience will remain inferior to native apps on that front.

On the other hand, if we turn it on, the user experience for most users will improve, and the users who do care about fingerprinting and are worried about it are usually the ones most likely to look at their Preferences and see that they can turn the fingerprinting protection on.

Once they do, if they notice a degraded experience, they are more likely to link it to their own action and decide what they want their browser to do.

I can't speak for any other features that this feature is supposed to block.
(In reply to Justin Dolske [:Dolske] from comment #3)
> From a purely front-end perspective, I don't have any significant issue with
> the patch here (aside from possible wordsmithing on the string :).
> 
> But I'd questions about this from more of a feature/product perspective...
> 
> If we offer fingerprinting protection, why wouldn't we turn it on by default?

Some anti-fingerprinting protections behind this pref are completely harmless, and probably should be turned on by default. Other protections have a (usually minor) usability tradeoff, so I would suggest reserving the "privacy.resistFingerprinting" pref for those and leaving it off by default. Mozilla can always migrate more protection features out from behind the pref in the future.

> Is it complete enough to offer as a supported/exposed pref? (i.e., there's a
> lot of potential fingerprinting surface, if people file bugs saying it's not
> blocking [insert possibly long list here?], are we planning on fixing those
> or is there risk of this being perceived as a impotent feature?)

I think the text will need to call it "resistance" rather than "blocking" to avoid suggesting it provides perfect protection. :) It does offer significant protection already (including hiding system colors, screen orientation, screen dimensions, device pixel ratio, window position on screen, screen click coordinates, supported MIME types and installed plugins), and at the same time I hope having this checkbox will help motivate further anti-fingerprinting efforts by people who are interested. Fingerprinting resistance obviously works best when protections are comprehensive and turned on all together, so I think having a single checkbox is a good way to emphasize that for users and developers.
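
(As a rough content-side illustration of some of the surfaces listed above - the exact values returned with the pref on are an assumption here and have varied across releases; the point is that pages see fixed or empty values instead of real system properties:)

  // Runs in ordinary web content, no privileges needed.
  var probe = {
    screenSize: [screen.width, screen.height],      // spoofed toward the window size
    devicePixelRatio: window.devicePixelRatio,      // reported as 1
    orientation: screen.orientation && screen.orientation.type,
    pluginCount: navigator.plugins.length,          // hidden -> 0
    mimeTypeCount: navigator.mimeTypes.length       // hidden -> 0
  };
  console.log(JSON.stringify(probe, null, 2));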

Tor Browser currently ships a number of anti-fingerprinting features that we hope will be uplifted to Firefox over the coming months, including keyboard layout spoofing, per-platform font whitelists, canvas image extraction protection, JS clock jitter, timezone spoofing, rounded window dimensions, etc. I envision these will be behind this pref as well.
 
> I assume it's possible that the various things this disables in DOM have the
> potential to break web sites. (And I mean those that are using things for
> legitimate reasons, not just for tracking ;). There's no notification UI (as
> there is for tracking protection) -- how much breakage is expected, and how
> severe? Is there data on impact?

I'm not aware of any systematic data collection. In my personal experience using Tor Browser, noticeable breakage due to fingerprinting protections (including those yet to be uplifted) is rare. Probably the most annoying fingerprinting protection in Tor Browser is the user prompt requesting canvas image extraction, which happens (for good reason) depressingly often!
This sounds like a feature we would want enabled in private browsing mode by default, just like tracking protection is. Is there a downside to that? We are currently running a Test Pilot experiment to understand the breakage TP causes in normal mode and this sounds like something we could test there as well.

Furthermore, from a user's point of view, this is at the same level of intuitiveness as the DNT signal, which is almost zero. Most users will not know or care what either of them does, but they will want the maximum privacy protection we can give them. For this reason, Firefox automatically sends the DNT signal when in a private browsing session or when tracking protection is explicitly turned on. I think we should do the same thing for this setting and relegate the checkbox to an advanced dialog, next to the DNT checkbox.
(In reply to Justin Dolske [:Dolske] from comment #3)
> I assume it's possible that the various things this disables in DOM have the
> potential to break web sites. (And I mean those that are using things for
> legitimate reasons, not just for tracking ;). There's no notification UI (as
> there is for tracking protection) -- how much breakage is expected, and how
> severe? Is there data on impact?

At least some of the things should really be feature-detected, so they will hopefully not break anything. Ehsan will know better.
Flags: needinfo?(overholt) → needinfo?(ehsan)
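
(A sketch of the feature-detection pattern overholt describes above; the plugin check is purely an illustrative example, not something taken from this patch:)

  // A page that checks what is actually available, rather than assuming it,
  // keeps working when values are hidden or spoofed.
  function hasFlashPlugin() {
    if (!navigator.plugins || navigator.plugins.length === 0) {
      return false;  // hidden under resistFingerprinting -> graceful fallback
    }
    return Array.prototype.some.call(navigator.plugins, function (p) {
      return /flash/i.test(p.name);
    });
  }

  if (hasFlashPlugin()) {
    // embed the plugin-based player
  } else {
    // fall back to an HTML5 player
  }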
(In reply to Andrew Overholt [:overholt] from comment #7)
> (In reply to Justin Dolske [:Dolske] from comment #3)
> > I assume it's possible that the various things this disables in DOM have the
> > potential to break web sites. (And I mean those that are using things for
> > legitimate reasons, not just for tracking ;). There's no notification UI (as
> > there is for tracking protection) -- how much breakage is expected, and how
> > severe? Is there data on impact?
> 
> At least some of the things should really be feature-detected so will
> hopefully not break anything. Ehsan will know better.

This pref for the most part causes us to lie to the web page about some of the system information which may be fingerprinting vectors.  For example, instead of giving any screen coordinates to the page (for example about where the browser window is), we always pretend it's at the top-left corner.  These types of lies are pretty benign, and I seriously doubt that they can break webpages.
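
(A small content-side illustration of the screen-coordinate lie described above; which exact properties are covered is an assumption here - the DXR search in comment 0 has the authoritative list:)

  // With the pref on, window-position properties behave as if the window
  // sat at the screen origin.
  console.log(window.screenX, window.screenY);  // expected: 0 0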

But on the flip side, for example this pref causes us to lie about the installed plugins to the web page.  While this is probably OK for most websites, it breaks https://www.mozilla.org/en-US/plugincheck/ for good reason.  :-)  (But in the specific case of plugins, as we're phasing them out this isn't an issue.)

Therefore as far as the risk of breaking web pages as a result of checking this pref goes, I think the breakages are minimal.

About the idea of turning this behavior on for private browsing mode, we _could_ do that, but I don't think doing that (or making a decision on that) needs to block this bug.
Flags: needinfo?(ehsan)
Note that I also think it's important how we describe this feature in the UI.  We should avoid using terminology that suggests that checking this checkbox somehow magically makes fingerprinting impossible.  :-)
(In reply to :Ehsan Akhgari (Away Oct 25 - Nov 9) from comment #8)
> Therefore as far as the risk of breaking web pages as a result of checking
> this pref goes, I think the breakages are minimal.

I'd want to see evidence that the breakage is minimal before we decide on shipping anything on by default, including in Private Browsing mode.
When would we see the underlying pref in release?  We could potentially extend the Test Pilot experiment for this pref as well.
Flags: needinfo?(arthuredelstein)
(In reply to Peter Dolanjski [:pdol] from comment #10)
> (In reply to :Ehsan Akhgari (Away Oct 25 - Nov 9) from comment #8)
> > Therefore as far as the risk of breaking web pages as a result of checking
> > this pref goes, I think the breakages are minimal.
> 
> I'd want to see evidence that the breakage is minimal before we decided on
> shipping anything on by default, including in Private Browsing mode.  

That makes sense to me.

> When would we see the underlying pref in release?  We could potentially
> extend the Test Pilot experiment for this pref as well.

Most of the code behind the pref is already released. (The pref is off by default.) The latest two features, (1) hiding the contents of navigator.mimeTypes and navigator.plugins and (2) spoofing screen.orientation, are currently in beta.
Flags: needinfo?(arthuredelstein)
Thanks.

jgruen, what are your thoughts on adding this resisting fingerprinting capability to the existing Test Pilot experiment as a follow-on?
Flags: needinfo?(jgruen)
FYI: for the reasons I listed in comment 4, if this setting is going to be on by default in non-private mode, I'll be advocating for a separate option for exposing OS preferences to content (one that would be off by default).
Summary: checkbox in about:preferences#privacy for privacy.resistFingerprinting (Tor 20244) → checkbox in about:preferences#privacy for privacy.resistFingerprinting (Tor 20244.1)
See Also: → 1312655
Whiteboard: [tor] → [tor][fingerprinting]
pdol: seems possible, though of course we'd have to make the changes to the add-on and figure out changes to the telemetry ping in the experiment. 

Another thought: I'm not totally up on all of the different privacy patches being proposed for Firefox right now, but there seem to be quite a few. I wonder if we could put together a bundled privacy test for Test Pilot that randomly assigns users to different cohorts, each with one or another privacy feature, to see how each performs.
Flags: needinfo?(jgruen) → needinfo?(pdolanjski)
(In reply to [:jgruen] from comment #14)
> Another thought: I'm not totally up on all of the different privacy patches
> being proposed for Firefox RN, but there seem to be quite a few. I wonder if
> we could put together a bundled privacy test for Test Pilot that randomly
> assigns users to different cohorts with one or another privacy feature to
> see how each performs.

That's a good suggestion.  Let's discuss this after next week, as there are a bunch of features proposed.

Now, with respect to this bug, I looped in Philipp from the UX team.  He and I both feel that exposing this feature in settings as a separate preference likely doesn't make sense, since its value will be very difficult to convey and it'll only get enabled by power users (who can do it from a pref if they really want).  Instead, we favor bundling it with tracking protection so that more users will benefit - which needs some validation before it happens.
Flags: needinfo?(pdolanjski)
(In reply to Peter Dolanjski [:pdol] from comment #15)
> really want).  Instead, we favor bundling it with tracking protection so
> that more users will benefit - which needs some validation before it happens.

Hrm. Could you describe in detail what this would look like? Like, would the option not be visible separately, so users would need to activate Tracking Protection in order to get the fingerprinting resistance as well? Or would it be possible to get the latter but not the former?
Flags: needinfo?(pdolanjski)
I haven't spoken with Peter, but I assume he's essentially talking about it being hard to explain both "tracking" and "fingerprinting" to users, and why there would be separate preferences for both. And, ultimately, for this setting to really matter for users, it would be great to have something that's either enabled by default or integrated with an existing feature with visibility (like tracking protection). Otherwise it risks languishing as a setting buried down in prefs that users either don't know about or never enable.

But the context for that is as a high-level product feature... I certainly assume there will continue to be an about:config setting (as there already is today).
Comment on attachment 8798671 [details] [diff] [review]
0001-Bug-1308340-Add-privacy.resistFingerprinting-checkbo.patch

Clearing review since it seems there's now some product and UX thinking about how to expose this in Firefox.

But...

I'm assuming this doesn't impact Tor, since the backend pref (privacy.resistFingerprinting) is already there, and pref UI is trivial for an addon (?) to implement? Or should we think more about how to get something landed somehow for Tor? (e.g. put Tor UI behind a pref, have a special Tor pref page, or something...)
Attachment #8798671 - Flags: review?(dolske)
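
(For what it's worth, a minimal sketch of the kind of add-on pref UI being referred to. The WebExtension API shown here postdates this bug and is an assumption about how one would do it today; a legacy add-on at the time would simply have flipped the pref via the preferences service:)

  // WebExtension background script; manifest.json needs the "privacy" permission.
  async function enableResistFingerprinting() {
    var setting = browser.privacy.websites.resistFingerprinting;
    var current = await setting.get({});
    if (current.levelOfControl === "controllable_by_this_extension" ||
        current.levelOfControl === "controlled_by_this_extension") {
      await setting.set({ value: true });
    }
  }
  enableResistFingerprinting();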
(In reply to Justin Dolske [:Dolske] from comment #17)
> I haven't spoken with Peter, but I assume he's essentially talking about it
> being hard to explain both "tracking" and "fingerprinting" to users, and why
> there would be separate preferences for both. And, ultimately, for this
> setting to really matter for users, it would be great to have something
> that's either enabled by default or integrated with an existing feature with
> visibility (like tracking protection). Otherwise it risks languishing as a
> setting buried down in prefs that users don't either know about or ever
> enable.

That's exactly what we were thinking.  We'd want UX to come up with a proposal, but I'd imagine that when a user turns on tracking protection, fingerprinting resistance gets enabled too (so long as we validate that it doesn't make the experience materially worse).
Flags: needinfo?(pdolanjski)
See Also: → 1320801
Please do not ENFORCE tracking protection in order for privacy.resistFingerprinting to be enabled (or vice versa). Putting it on about:preferences#privacy seems ideal, but as a separate, distinct item (just like "Use Tracking Protection" and "History" are). Perhaps call it "Use Fingerprinting Resistance", with a Learn More link. This Learn More link could educate users about the subtle differences. Yes, there is blurring/overlap between tracking and fingerprinting, but fingerprinting is a field in its own right. Fingerprints do not have to leak private info, but may lead to unmasking someone. Yes, you can track someone without fingerprinting, and yes, you can fingerprint someone but not be able to track them. PLEASE do NOT tie these two (tracking and fingerprinting) together. Just picture a Venn diagram with an intersection of two sets.

Yes, the aim of fingerprinting is "to track" and "tracking protection" as a name fits the bill and both concern "privacy" (hence it should go on about:preferences#privacy), but these really are two quite different beasts. The current "Tracking protection" is more about blocking, whereas "Fingerprinting" is more about spoofing. We all leak a locale and timezone, we all leak screen res and so on. We don't all have to connect to some domains or allow cookies etc. One is blocking, one is spoofing (mainly).

Comment 17: "being hard to explain both "tracking" and "fingerprinting" to users, and why there would be separate preferences for both"
^^ Just because most users do not understand the differences does not mean the distinctions should be blurred. Making them separate entities and separate processes (i.e. having one on or off does not preclude the other) actually helps educate the masses. See my comment above about the Learn More link. Please do not perpetuate or start this "blurring". Cookies can be used to track, so by that argument cookie settings should be under "Tracking Protection" - right? Do you see where I am going with this?

Comment 17: "Otherwise it risks languishing as a setting buried down in prefs that users don't either know about or ever enable"
^^ Not when it's in the Options and visible to all users. Not when it can be blogged about by numerous sites, including ghacks.net and arstechnica. Not when it is included in numerous user.js files, and so on. It really is a power-user preference, but you are in a prime position to make it more prominent through education, not through enforcement. Same-page, same-prominence placement alongside Tracking Protection is hardly hiding it.

From my perspective, and with no disrespect to Tracking Protection, one of the many benefits of Firefox is that end users have choices. Many people choose not to use Tracking Protection and instead use extensions such as uBlock Origin. Tracking Protection is shipped on by default, and this is great. But when you have ALREADY given users the means to turn it off and substitute it with something else, e.g. through extensions (which can actually enhance the end result), then PLEASE do NOT ENFORCE it on them if they wish to use other features you produce, especially features like this.

==
Comment 5: "Some anti-fingerprinting protections behind this pref are completely harmless, and probably should be turned on by default. Other protections have a (usually minor) usability tradeoff, so I would suggest reserving the "privacy.resistFingerprinting" pref for those and leaving it off by default. Mozilla can always migrate more protection features out from behind the pref in the future."
^^ Seconded. Leave the pref off by default. This allows more scope for future fingerprinting resistance without upsetting users or triggering a flurry of bug reports. This makes more sense for power users, and let's face it: not only is anti-fingerprinting rather complex and currently a "niche", specialized field that requires knowledge to get right, but a lot of potential additions to the pref (in future) WILL break some website functionality.

Currently, as of FF50, the only effects are screen/window related, plugins, and MIME types. But upcoming changes proposed to be tied to this pref include:
- https://bugzilla.mozilla.org/show_bug.cgi?id=863246 : resource://URIs
- https://bugzilla.mozilla.org/show_bug.cgi?id=1217290 : fingerprinting resistance for WebGL
- https://bugzilla.mozilla.org/show_bug.cgi?id=1222285 : keyboard fingerprinting
- yet to see a ticket issued, but time zone/locale

Some of these, such as keyboard and time zone/locale, can have profound changes and repercussions. By turning the pref on by default, you may limit what can be tied behind the pref in the future.

==
Whether it is on or off by default is up to you guys, my main point was to not enforce tracking protection in order to use resist fingerprinting.
Priority: -- → P2
Priority: P2 → P1
Whiteboard: [tor][fingerprinting] → [tor][fingerprinting][fp:m1]
Whiteboard: [tor][fingerprinting][fp:m1] → [tor][fingerprinting][fp:m3]
Removing this bug from the MVP of anti-fingerprinting.
(https://wiki.mozilla.org/Security/Fingerprinting)

Adding UI checkboxes for First Party Isolation and fingerprinting resistance will be the next steps
after we finish all the Tor Uplift work.
Whiteboard: [tor][fingerprinting][fp:m3] → [tor][fingerprinting][fp-backlog]
We will add this checkbox in Private Browsing mode.
Status: NEW → RESOLVED
Closed: 6 years ago
Priority: P1 → --
Resolution: --- → WONTFIX
Whiteboard: [tor][fingerprinting][fp-backlog] → [tor][fingerprinting][fp-backlog][fp-triaged]
(In reply to Ethan Tseng [:ethan] - 57 Regression Engineering Support from comment #22)
> We will add this checkbox in Private Browsing mode.

Ethan, I assume by this you will handle the UX under Bug 1404017 ?

You may also want to close Bug 1312655 in favor of handling it under Bug 1397624 << exact same thing as RFP but for FPI
(In reply to Simon Mainey from comment #23)
> Ethan, I assume by this you will handle the UX under Bug 1404017 ?
Bug 1404017 is only for adding the preference.
We will file a new bug for the UX/UI.

> You may also want to close Bug 1312655 in favor of handling it under Bug
> 1397624 << exact same thing as RFP but for FPI
Yes, we can close it.  Thanks!  :D