Open Bug 1401440 Opened 4 years ago Updated 3 months ago
Split privacy.resistFingerprinting into multiple options
User Agent: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:55.0) Gecko/20100101 Firefox/55.0
Build ID: 20170919185010

Steps to reproduce:
1. Open Firefox.
2. Maximize the window.
3. Close Firefox.
4. Open Firefox.

Actual results:
The window isn't maximized.

Expected results:
The window should be maximized.

Additional information: This worked in version 54.0 but broke after upgrading to version 55.0.2.
WFM in 55.0.3 on Win10. Try https://mozilla.github.io/mozregression/.
Has Regression Range: --- → no
Has STR: --- → yes
OS: Unspecified → Linux
Hardware: Unspecified → x86_64
Running a bisect or something similar sounds like a lot of work, and I'm commonly out of time, so I have to decline this for now. Maybe this issue is Linux-specific; it would be better if somebody could confirm that first. As additional information: I remember seeing the window unmaximize after it had been maximized and the previous session was restored (menu bar -> History -> Restore Previous Session), but this behavior does not always seem to reproduce.
I can't reproduce this issue. Please create a new profile and test in Firefox's safe mode; the steps are here:
https://support.mozilla.org/en-US/kb/troubleshoot-firefox-issues-using-safe-mode
https://support.mozilla.org/en-US/kb/profile-manager-create-and-remove-firefox-profiles?redirectlocale=en-US&redirectslug=Managing-profiles#w_starting-the-profile-manager
Testing with a temporary new profile, the issue does not appear there. I'm not sure which setting causes this behavior in my old profile. It also looks like changes to the size of the unmaximized window are not stored at all, and I noticed that restoring the session with an unmaximized window moved it a bit down and to the right.
After investigating a bit more, I figured out that this issue only happens if privacy.resistFingerprinting is set to true. Testing Firefox in a Windows 10 VM, I'm seeing this issue there too.
OS: Linux → All
I tried to get the requested regression window with mozregression, but it did not work, so I bisected manually. The result: on Nightly 55.0a1 from 2017-03-29 (build ID 20170329071901) the bug is not reproducible, but starting with Nightly 55.0a1 from 2017-03-30 (build ID 20170330030213) the bug reproduces. Updated the flag accordingly.
Tim, is there a way to get this window information using APIs that always return a truthful result when privacy.resistFingerprinting = true?
Priority: -- → P3
Pretty sure this is just a dupe of bug 1402557 and by design. There's extensive discussion about this in that bug.
(Or bug 1402557 is a duplicate of this bug.) Users who still maximize their window after enabling privacy.resistFingerprinting probably want it to stay maximized (even if the planned warning lands), while still taking advantage of all the other anti-fingerprinting techniques the setting provides. But instead of a solution like simply not affecting maximized windows, I think this needs another angle. The main issue seems to be that privacy.resistFingerprinting bundles all of its fingerprinting techniques into one setting: as soon as a user doesn't want a specific one applied, they can either enable them all and live with the issues they encounter, or disable them all and take a negative impact on their privacy. For this reason, I think the easiest way to solve this is to split privacy.resistFingerprinting into multiple options (privacy.resistFingerprinting.* or similar) to give the user detailed control over it. :arthuredelstein seems to be familiar with such things, so what is your opinion here? Is this idea practicable, or could something better be done? At least I believe something needs to be done here, because with the current approach privacy.resistFingerprinting does not provide as much real-world user privacy as it could: users can (and several will) easily lean towards disabling it completely as soon as a single technique annoys them somehow.
Severity: normal → enhancement
Summary: Maximized state for the window isn't saved anymore if privacy.resistFingerprinting = true → Split privacy.resistFingerprinting into multiple options
(In reply to Mike de Boer [:mikedeboer] from comment #7)
> Tim, is there a way to get this window information using APIs that always
> return a truthful result when privacy.resistFingerprinting = true?

Actually, the session store will still store the state of the maximized window, but we don't maximize the window during restore when 'privacy.resistFingerprinting' is true. And you can get the real screen resolution when it is called from chrome:
http://searchfox.org/mozilla-central/source/browser/components/sessionstore/SessionStore.jsm#4105
http://searchfox.org/mozilla-central/source/dom/base/nsScreen.cpp#338
Hi, sworddragon2 -- thank you for your question. I greatly appreciate this kind of user feedback. We purposely don't split privacy.resistFingerprinting into multiple options, because fingerprinting protections are only effective when they are all used together. (Similarly, if you want to secure a car, you lock all the doors.) The dimensions of a maximized window, in particular, are a big source of distinguishing information, so making window dimension protections optional would defeat our intention of offering good fingerprinting protection. That said, I understand these window constraints can be annoying. There are two tickets I think might help reduce the annoyance:

* Bug 1404017 proposes an option that would allow users to apply fingerprinting resistance to Private Browsing windows only. That means PB windows would have their dimensions constrained, but normal windows would not. So the user can choose: if they care more about privacy for a given website, they can use a Private Browsing window; if instead they care more about usability for that website, they can use a normal window.

* Bug 1407366 proposes dynamic window constraints. Part of that would include a special maximizing behavior which would add margins outside the content page to ensure scripts or CSS on the page cannot detect the available screen size. While the margins might still be annoying, at least it could allow the user's chosen window size to be remembered at the next session.

I hope this helps to clarify our approach!
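The margin-based maximizing behavior described for Bug 1407366 can be sketched roughly like this. This is a minimal illustration only: the helper name, the 200/100 step sizes, and the idea of filling the difference with chrome margins are assumptions drawn from this thread, not the actual patch.

```javascript
// Illustrative model of "letterboxed" maximizing: the OS window stays
// maximized, but the content viewport is shrunk to the nearest rounded
// size, and the leftover space becomes margins outside the content page.
function letterboxMargins(screenW, screenH, stepW = 200, stepH = 100) {
  const contentW = Math.floor(screenW / stepW) * stepW;
  const contentH = Math.floor(screenH / stepH) * stepH;
  return {
    contentW,
    contentH,
    marginX: screenW - contentW, // horizontal slack hidden as chrome margin
    marginY: screenH - contentH, // vertical slack hidden as chrome margin
  };
}

console.log(letterboxMargins(1920, 1080));
// { contentW: 1800, contentH: 1000, marginX: 120, marginY: 80 }
```

A page script reading window.innerWidth/innerHeight would then only ever see the rounded content size, not the real screen size.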
Data point: I enabled the preference as soon as I became aware of it, was extremely annoyed by the non-maximized windows, finally figured out the cause, and disabled the preference. I want all the other protection it offers, but not at the cost of this one big usability regression. IIRC, usability research indicates there are two types of users: those who use many maximized windows, and those who don't. If you're someone who doesn't personally use maximized windows, you might not fully grok that a maximized window is vastly different from a non-maximized window of any size, because most of its controls are easy to reach at the screen's corners and edges (Fitts's law). I understand that splitting up the preference into many options would be counterproductive, but I do strongly believe there should be an override/exception for this one use case, because the alternative for me (and, I suspect, a very large percentage of users) would be to not have its benefits at all.
Thanks for your reply, Arthur.

(In reply to Arthur Edelstein (Tor Browser dev) [:arthuredelstein] from comment #11)
> We purposely don't split privacy.resistFingerprinting into multiple options,
> because fingerprinting protections are only effective when they are all used
> together.

I think there is a tiny flaw in your logic here.
- Sites do not always use all possible tracking techniques, for various reasons. It also requires a significant amount of effort to implement them and to keep them updated as the web evolves. Partial tracking protection is already significantly useful.
- Even privacy.resistFingerprinting in its current state does not prevent fingerprinting to its full extent. There are still huge entropy sources that websites can utilize, and probably many more than current fingerprinting tests even exercise. Given your logic, we would not currently need privacy.resistFingerprinting at all - but this is false.

(In reply to Arthur Edelstein (Tor Browser dev) [:arthuredelstein] from comment #11)
> * Bug 1407366 proposes dynamic window constraints. Part of that would
> include a special maximizing behavior which would add margins outside the
> content page to ensure scripts or CSS on the page cannot detect the
> available screen size. While the margins might still be annoying, at least
> it could allow the user's chosen window size to be remembered at the next
> session.
>
> I hope this helps to clarify our approach!

(Personally, for me) that sounds useful. I know that resolution sizes are a good source of entropy; as far as I remember, I already appear unique to fingerprinting tests with just this data, and I would prefer to change that at some point. But the issues with bundling all techniques into privacy.resistFingerprinting still exist: other users might not be happy with them, and future changes might introduce new annoyances. If privacy.resistFingerprinting were split into multiple options, this would not be an issue at all.
If users want to disable it partially, why not? Users can still utilize all of it if they want. If the idea of splitting privacy.resistFingerprinting is not fully dead yet, one approach could be privacy.resistFingerprinting.enabled as a master switch defaulting to false, and privacy.resistFingerprinting.use* for each of its techniques, defaulting to true. Similar to the current state, enabling privacy.resistFingerprinting.enabled would by default apply all anti-fingerprinting techniques, but users would be able to opt out of specific ones if they have an issue with them.
(In reply to Sander from comment #12)
> I understand that splitting up the preference into many options would be
> counterproductive, but I do strongly believe there should be an
> override/exception for this one use case, because the alternative for me
> (and, I suspect, a very large percentage of users) would be to not have its
> benefits at all.

Hi, Sander -- thanks for your comment. I understand what you're saying, but we need to weigh the privacy loss against the usability benefits of making this change. I think the benefits of fingerprinting protection would be hugely reduced for users who disable the window size constraints, because unconstrained window sizes provide a lot of fingerprinting entropy. On top of this, some users already have other fingerprinting entropy that we cannot mask, so the damage can be quite serious for those users. So I don't think it's a good idea to allow this protection to be disabled while still giving users the impression we are resisting fingerprinting.

The difficulty is that, below a threshold unique to each user, fingerprinting protection is totally ineffective. To demonstrate with a contrived example: say I'm in a set of 1000 users and, unprotected, the browser exposes 20 bits of entropy. To make all users look the same, we need to hide all 20 bits. But if I decide to mask only 10 bits, then I am still unique in the set; those 10 bits of protection aren't enough to provide any remaining privacy benefit. Above the threshold, each bit of protection we provide doubles the size of the set of indistinguishable users, and thus doubles the cost of, say, physically searching every user's house.

I do think that Bug 1407366, or something like it, could make this protection less of a usability regression. At least then the controls (tabs, menus, title bars, etc.) would be at the screen's edges or corners as usual.
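The threshold argument above can be checked with a few lines of JavaScript, using the contrived numbers from the example (the helper name is mine, and the even-split assumption is a simplification):

```javascript
// Expected number of users sharing a fingerprint, assuming N users and
// `exposedBits` bits of still-exposed entropy that split the population
// evenly. A set size at or below 1 means the user is effectively unique.
function anonymitySetSize(totalUsers, exposedBits) {
  return totalUsers / Math.pow(2, exposedBits);
}

// 1000 users, all 20 bits exposed: everyone is unique.
console.log(anonymitySetSize(1000, 20)); // ~0.00095, still unique
// Masking only 10 of the 20 bits: still effectively unique.
console.log(anonymitySetSize(1000, 10)); // ~0.98, still effectively unique
// Only past the threshold does each masked bit double the set size:
console.log(anonymitySetSize(1000, 5)); // 31.25 users per fingerprint
```

This is the nonlinearity: the first 10 masked bits bought essentially nothing, while each bit after the threshold doubles the anonymity set.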
So I think your observation underlines the need for us to work hard on improving the implementation of this protection so it's as usable as possible, especially for maximized windows. At least I think we should make an attempt before we take the drastic step of making a big retreat in our efforts to provide effective fingerprinting resistance.
(In reply to sworddragon2 from comment #13)
> I think there is a tiny flaw in your logic here.
> - Sites do not always use all possible tracking techniques for various
> reasons. It also requires a significant amount of effort to implement them
> and also to update them as the web evolves.

There's no need to implement them yourself -- just use a fingerprinting library. Plus, window dimensions are well known and trivial to measure: [window.innerWidth, window.innerHeight].

> Partial tracking protection is already significantly useful.

Yes, but only above a threshold, which is different for each user, depending on their other fingerprintable characteristics. (Please see my previous comment about the nonlinearity of fingerprinting protections.) My point is, window dimensions are a pretty large source of entropy. If we make that optional, I fear we're going to cause total deprotection of a lot of users.

> - Even privacy.resistFingerprinting in its current state does not prevent
> fingerprinting to its full extent. There are still huge entropy sources that
> websites can utilize, and probably many more than current fingerprinting
> tests even exercise.

What huge entropy sources do you have in mind? We are attempting to protect against everything we can, so please let us know if we're missing something.

> Given your logic we would currently not need
> privacy.resistFingerprinting - but this is false.

Well, to correct my wording slightly, fingerprinting protections are "much more" effective when they are used together. So, yes, if we are able to discover more fingerprinting vectors, we should protect against those also. And we certainly shouldn't ignore well-known, trivially observed, large sources of entropy such as window dimensions.

> If users want to disable it partially, why not? Users can still utilize all of it if they want.

Because it introduces complexity for users and developers.
And because the ramifications for privacy are extremely difficult to analyze, but are probably worse for privacy than most people expect, because they don't anticipate the nonlinear effects. Many people assume, "if I disable 10% of my bits of protection, then I still have 90% of the privacy" which isn't true at all.
(In reply to Arthur Edelstein (Tor Browser dev) [:arthuredelstein] from comment #15)
> What huge entropy sources do you have in mind? We are attempting to protect
> against everything we can, so please let us know if we're missing something.

Checking https://browserprint.info/test (I did this a few months ago, and I assume the test adds 1 to the number of occurrences; keep this in mind for the information below), the highest threats are:
- Screen size (either measurement) with 2 occurrences.
- Character sizes with 2 occurrences.
- Canvas with 2 occurrences.
- System fonts (not Flash) with 3 occurrences.
- Date/Time format with 8 occurrences (I'm on a German-localized Linux, but I would not have expected the rendering of the start of the Unix epoch, "1.1.1970, 00:00:00", to give that much entropy. I wonder if bug 1401696 actually influences this negatively).
- Audio with 11 occurrences.
- WebGL Renderer with 12 occurrences (my GeForce GTX 650 is not the newest graphics card, but also not the oldest. The WebGL Vendor has 1857 occurrences and is more harmless. I wonder what these values are mainly used for; they might be useful for detecting specific bugs and applying workarounds, but for generic purposes I don't think they should be that important. I also wonder whether these values could be masked, as they seem to have the potential for high entropy, but I'm not familiar enough with WebGL to say precisely. Maybe this question applies to the audio part above as well).

However, during my last anti-tracking check a few months ago, I remember seeing some Firefox settings about Canvas, WebGL, and Fonts. I did not fully apply all of them, as I expected major breakage, and I'm not sure how much they would have contributed to anti-fingerprinting. But there are some anti-fingerprinting settings in Firefox that privacy.resistFingerprinting does not cover, for example possibly the simplest one: Cookies.
It is more or less obvious why privacy.resistFingerprinting does not cover those, but this also speaks a bit against the concept of a unified anti-fingerprinting setting under your own reasoning. And it points out why splitting these settings actually makes sense: we would have many more complaints here if privacy.resistFingerprinting disabled cookies completely and did similar things. I also have a theoretical construct in mind based on probing specific behaviors: browsers have countless very specific bugs that could be probed to get detailed information about the version, the toolkit used, etc. Such behavior often also depends on specific settings, and timing techniques could eventually make these binary probes very reliable. While I have been developing software for a long while (including for the web) and have encountered several bugs over the years that could be utilized this way, I can't tell how practicable a real-world attack would be. I also notice that privacy.resistFingerprinting normalizes the user agent to Firefox 50 on a 64-bit Windows 7 system.
- I have seen a hint somewhere that faking the user agent may actually increase entropy, because sites can more or less easily detect the fake by probing browser behavior. That sounds plausible, but I never had a strong opinion on it. That the user agent is normalized here, instead of the user applying a custom one, might mitigate this a bit.
- If I remember correctly, the browser statistics at NetMarketShare show that most users of alternative browsers switch to the current version pretty fast. Wouldn't it be better for the spoofed user agent to always point to the current or previous version? Or maybe pointing to an ESR version could make sense.
Edit: Forgot to mention something in the last post: For the record, the test tells me that 42061 fingerprints have been tested so far and that I'm unique, with 15.36 bits of identifying information.
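As a sanity check on that number: a fingerprint that is unique among N samples carries log2(N) bits of identifying information, and log2(42061) is indeed about 15.36. This is just the standard self-information formula, not browserprint.info's own methodology (the helper name below is mine):

```javascript
// Self-information of a fingerprint seen `occurrences` times among
// `totalFingerprints` samples: bits = log2(total / occurrences).
// A unique fingerprint (occurrences = 1) gives log2(total).
function bitsOfIdentifyingInfo(totalFingerprints, occurrences) {
  return Math.log2(totalFingerprints / occurrences);
}

console.log(bitsOfIdentifyingInfo(42061, 1).toFixed(2)); // "15.36"
// A value shared by 12 of 42061 samples (the WebGL Renderer above):
console.log(bitsOfIdentifyingInfo(42061, 12).toFixed(2)); // "11.78"
```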
Hi, sworddragon2 -- all the APIs you mention generally already have protections under privacy.resistFingerprinting, or will in a forthcoming release. Cookies and other stateful things are handled in Tor Browser under a separate mechanism, first-party isolation, which is enabled by the "privacy.firstparty.isolate" pref.
@sworddragon Online tests for uniqueness are heavily skewed 1) by old data, 2) by the nature of the people who visit them, 3) by repeated visits from those people, 4) because they are not real-time or real-world, and 5) because the sets are not large enough. Science and math are your friends. Arthur is right on the money when he says there is no point locking **some** of the car doors. The resistFingerprinting pref is a work in progress, and there is a lot more to come, including a forward-facing UI pref and information for users.

With regard to maximized windows, did you know that there are two hidden prefs which you can use to control the size of your rounded windows? These are:
- user_pref("privacy.window.maxInnerWidth", 1600); // (hidden pref)
- user_pref("privacy.window.maxInnerHeight", 900); // (hidden pref)

Width will STILL round DOWN to multiples of 200's and height to 100's - so you can use any values. You could use these to set values as close as possible to maximized for your setup. When you open a new instance of FF, it shouldn't be such a hassle to just click the maximize button if you need it (I get that it's annoying, but if you want privacy, then adapt is my motto).

Note: Bug 1403747 re warnings on maximizing => may lead to warning fatigue (@arthur: I hope this comes with an option to disable that warning - we should talk about that in the other ticket).

@arthur Users have mentioned to me that the positioning of the window when opening (from a remembered maximized state) causes the window to be off-screen. Can the code be reviewed to make sure that, when resized to multiples of 200/100, the x,y top-left coordinates stay positive?
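How the hidden prefs interact with the rounding can be sketched as follows. This is a simplified model of the behavior described in this comment (the real implementation is native code and also accounts for chrome sizing), so treat the helper and its arithmetic as illustrative only:

```javascript
// Illustrative model: the inner size is first capped by the hidden
// privacy.window.maxInnerWidth/maxInnerHeight prefs, then rounded DOWN
// to multiples of 200 (width) and 100 (height), per the comment above.
function roundedInnerSize(width, height, maxInnerWidth, maxInnerHeight) {
  const w = Math.floor(Math.min(width, maxInnerWidth) / 200) * 200;
  const h = Math.floor(Math.min(height, maxInnerHeight) / 100) * 100;
  return [w, h];
}

// With the example prefs (1600x900), a 1920x1080 screen caps cleanly:
console.log(roundedInnerSize(1920, 1080, 1600, 900)); // [ 1600, 900 ]
// A smaller 1366x768 laptop screen still rounds down:
console.log(roundedInnerSize(1366, 768, 1600, 900)); // [ 1200, 700 ]
```

This is why choosing max values that are themselves multiples of 200/100, near your screen size, gets you as close to maximized as the rounding allows.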
When flipping the fingerprinting prefs, do we lie in the result values for screen.height/width? If not, what's the point in disabling maximization specifically? Seems like an adversary can get those sizes anyway...

Also, if we're going to do rounding, why not round to multiples of 128, which would work much better for widths that match the screen width (like maximized windows)? Right now, for maximized windows with (screen) widths of 1024/1280/1440/1920 pixels, the rounding doesn't actually reduce entropy (or therefore exposed fingerprintable bits) at all.

(In reply to Simon Mainey from comment #19)
> Width will STILL round DOWN to multiples of 200's and height to 100's - so
> you can use any values. You could use these to set values as close as
> possible to maximized for your setup. When you open a new instance of FF, it
> shouldn't be such a hassle to just click the maximize button if you need it
> (I get that it's annoying, but if you want privacy, then adapt is my motto)

I'm a bit confused by this. As far as I can tell, the anti-fingerprinting argument is thus: 1) we shouldn't maximize the window because it reveals data, 2) we won't make an option to allow maximizing the window on startup, 3) we still allow you to maximize the window immediately after startup by clicking the 'maximize' button. This doesn't seem logical.

Right now what is happening is that the people using this pref get no explanation of what's going on, so the result is that they are frustrated and either a) turn fingerprinting protection back off, or b) click the 'maximize' button immediately after startup, both of which defeat the protection offered anyway.

Either we have to be consistent and actually disable maximization (and probably add some kind of messaging to explain to users why this is), or we shouldn't disable restoring the browser window as maximized (and probably still have some kind of messaging the first time the user maximizes or restores as maximized, if we genuinely believe this is important). The current implementation is just roadblocking users for no practical benefit, because there's no education and it doesn't actually fully close off fingerprinting concerns; it just makes them jump through more hoops. That doesn't seem productive. The worst part is that the feature doesn't seem to be working as designed, per the last few comments in bug 1330882, so we're causing hoop-jumping without any benefit at all... From a quick look at the patch (I could be wrong!), it also looks like it doesn't take sidebar or devtools usage into account.

It seems to me that having a (temporary) pref while the "sub-feature" of masking inner window sizes from content is worked on (via e.g. bug 1407366) would be useful -- much like we had a pref to turn off bits of Photon while they were being worked on, which we've since removed. I don't think the current state of the "sub-feature" offers value to most users right now, and I think making it work well, providing both value and education to users, as well as dealing with the numerous factors that influence inner window size, is going to be tricky, and not something we should attempt while impacting core usability of the released browser for everyone in the way this is doing right now.
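The entropy point about maximized windows can be checked with the screen widths cited in comment 20 (an illustration of the argument, not code from any patch):

```javascript
// Common screen widths cited in the discussion.
const screenWidths = [1024, 1280, 1440, 1920];

// Rounding DOWN to multiples of 200 maps each screen width to a distinct
// bucket, so a fingerprinter can still tell these users apart: for
// maximized windows no entropy is removed.
const roundedTo200 = screenWidths.map(w => Math.floor(w / 200) * 200);
console.log(roundedTo200); // [ 1000, 1200, 1400, 1800 ] - all distinct

// Multiples of 128, by contrast, coincide exactly with several common
// screen widths, so a maximized window would already sit on a rounded
// value there and need no resizing at all.
const roundedTo128 = screenWidths.map(w => Math.floor(w / 128) * 128);
console.log(roundedTo128); // [ 1024, 1280, 1408, 1920 ]
```

Note that neither rounding merges these four widths into one bucket; the argument for 128 is about letting maximized widths land on a rounded value, not about collapsing them together.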
(In reply to :Gijs (slow, PTO recovery mode) from comment #20)

Hi, Gijs -- thanks for your comments.

> When flipping the fingerprinting prefs, do we lie in the result values for
> screen.height/width? If not, what's the point in disabling maximization
> specifically? Seems like an adversary can get those sizes anyway...

We spoof screen.width/.height to be the same as content window.innerWidth/innerHeight.

> Also, if we're going to do rounding, why not round to multiples of 128,
> which would work much better for widths that match the screen width (like
> maximized windows)? Right now, for maximized windows with (screen) widths of
> 1024/1280/1440/1920 pixels, the rounding doesn't actually reduce entropy (or
> therefore exposed fingerprintable bits) at all.

We're rounding the viewport width and height, not the chrome window width and height. So while you're right that maximized windows aren't rounded correctly, using multiples of 128 won't help, unfortunately.

> Either we have to be consistent and actually disable maximization (and
> probably add some kind of messaging to explain to users why this is), or we
> shouldn't disable restoring the browser window as maximized (and probably
> still have some kind of messaging the first time the user maximizes or
> restores as maximized, if we genuinely believe this is important).

I share your concern about this, and I think we should make two bugs a high priority:
1. Bug 1403747 will show a warning when the user first maximizes a window. I think this should be easy to implement.
2. Bug 1407366 will allow maximizing windows but constrain the viewport dimensions to protect users. More challenging, but the best approach I am aware of.

> It seems to me that having a (temporary) pref while the "sub-feature" of
> masking inner window sizes from content is worked on (via e.g.
> bug 1407366) would be useful

That's possible, but I would prefer to fix Bug 1403747 as soon as possible and retain window size rounding on startup until Bug 1407366 is implemented, so we continue to get early adopter feedback. (Of course, all fingerprinting resistance remains behind a pref.)
(In reply to Arthur Edelstein (Tor Browser dev) [:arthuredelstein] from comment #21) > (In reply to :Gijs (slow, PTO recovery mode) from comment #20) > > Also, if we're going to do rounding, why not round to multiples of 128, > > which would work much better for widths that match the screen width (like > > maximized windows)? Right now, for maximized windows with (screen) widths of > > 1024/1280/1440/1920 pixels, the rounding doesn't actually reduce entropy (or > > therefore exposed fingerprintable bits) at all. > > We're rounding the viewport width and height, not chrome window width and > height. So while you're right that maximized windows aren't rounded > correctly, using multiples of 128 won't help, unfortunately. Why not? On Windows, at least, there are no horizontal borders on maximized windows (so, for instance, on my machine a maximized Firefox window (fingerprinting not disabled) has content.window.innerWidth == screen.width == 2176). Same on OSX, I believe. (For both of these, with the Dock/taskbar in the default (bottom) position, of course.) I expect on Linux it depends on the distro, window manager and theme, but it would be a shame if the majority of our users got strictly worse behaviour just because improving the behaviour wouldn't help on some Linux distros...
(In reply to :Gijs (slow, PTO recovery mode) from comment #22) > Why not? On Windows, at least, there are no horizontal borders on maximized > windows (so, for instance, on my machine a maximized Firefox window > (fingerprinting not disabled) has content.window.innerWidth == screen.width > == 2176). Same on OSX, I believe. (For both of these, with the Dock/taskbar > in the default (bottom) position, of course.) I expect on Linux it depends > on the distro, window manager and theme, but it would be a shame if the > majority of our users got strictly worse behaviour just because improving > the behaviour wouldn't help on some Linux distros... Ah, sorry, I think you're right about that. My mistake. It's a good suggestion and I would definitely consider it. Height is more problematic, but one option is just to extend the vertical chrome area such that the viewport height is a multiple of 128 pixels (or whatever we choose).
Duplicate of this bug: 1431909
Tom, you could probably close Bug 1364261 as a duplicate of 1431909 as well, then.
Whiteboard: [fingerprinting] → [fingerprinting] [fp-triaged]
Priority: P3 → P5
Component: Session Restore → General
Hardware: x86_64 → All
Version: 55 Branch → Trunk