No user notification when Safe Browsing is essentially disabled if Google dependency services are unavailable

Status: RESOLVED WONTFIX
Product: Toolkit
Component: Safe Browsing
Version: Trunk
Reporter: Sam Hall (Unassigned)
Keywords: uiwanted
Reported: 3 years ago
Last updated: 2 years ago

(Reporter)

Description

3 years ago
User Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.9; rv:33.0) Gecko/20100101 Firefox/33.0
Build ID: 20140902214533

Steps to reproduce:

Block access to Google's Safe Browsing services via hosts file...
127.0.0.1 safebrowsing.google.com
127.0.0.1 safebrowsing.clients.google.com
127.0.0.1 sb-ssl.google.com

Visit a site that matches a hash in the local databases (these will change all the time, but I believe there is a local test database).
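For reference, a minimal shell sketch of the hosts-file block described above, assuming a Unix-like system. It stages the entries in a temporary file rather than touching /etc/hosts directly, so it is safe to run as-is:

```shell
# Stage the three Safe Browsing endpoints in a temp file first.
HOSTS=$(mktemp)
cat >> "$HOSTS" <<'EOF'
127.0.0.1 safebrowsing.google.com
127.0.0.1 safebrowsing.clients.google.com
127.0.0.1 sb-ssl.google.com
EOF

# Sanity-check that all three override entries are present.
grep -c '^127\.0\.0\.1 ' "$HOSTS"

# To actually apply the block (requires root):
#   sudo tee -a /etc/hosts < "$HOSTS"
```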


Actual results:

In versions prior to 33, see bug 891289.

In version 33+, Firefox allowed the page to load without warning. This is despite the fact that Firefox could not get a response from Google's Safe Browsing service to confirm or deny the possibility that this site was reported as a malware or phishing site.


Expected results:

Due to the Safe Browsing API terms of service, probably nothing much different at that particular point. However, you could warn the user upfront.

Can I recommend that Firefox perform a Safe Browsing service connectivity check on startup? When Firefox is not in "Offline Mode" and it cannot reach the Safe Browsing service, show a banner alerting the user that Safe Browsing will be disabled for this session unless connectivity to the service is restored.

Suggested wording: "Firefox cannot access the Safe Browsing service. Safe Browsing will be disabled for this session unless connectivity to the service is restored." (and any translations of that).
(Reporter)

Comment 1

3 years ago
(In reply to Sam Hall from comment #0)

Oops, I referred to the wrong bug. Should have been bug 832056.
(Reporter)

Updated

3 years ago
OS: Mac OS X → All
Hardware: x86 → All

Updated

3 years ago
Component: Untriaged → Phishing Protection
Product: Firefox → Toolkit
Version: 33 Branch → Trunk

Updated

3 years ago
Blocks: 832056, 1023767
Comment 2

3 years ago
> Can I recommend that Firefox perform a Safe Browsing service connectivity check on startup?

If the connection goes down during the browsing session the problem is the same but the user would get no warning. This doesn't look like a good idea. I don't think there's a need either. We'll notice the server is down when we fail to connect at any point we need the connection (updates/verifying positives).

I'll put this as uiwanted to get some input from UX. 

Personally, I don't see the point in warning: there's nothing the user can do about it[1], there's nothing we can do about it, and recommending the user to "be more careful now" also makes little sense as SafeBrowsing's coverage is far from total and due diligence is always required.

[1] Unless he blocked it intentionally, in which case the warning is just annoying.
Keywords: uiwanted
(Reporter)

Comment 3

3 years ago
> Personally, I don't see the point in warning: there's nothing the user can
> do about it[1], there's nothing we can do about it, and recommending the
> user to "be more careful now" also makes little sense as SafeBrowsing's
> coverage is far from total and due diligence is always required.
> 
> [1] Unless he blocked it intentionally, in which case the warning is just
> annoying.

Sorry, the bug report was entered in a bit of a rush. I should have mentioned, it's easy enough to disable Safe Browsing in Firefox and doing so would imply any warning wouldn't be needed/possible.

I can't think of all the reasons that the service may be unavailable, but some could surely be nefarious. If you are connected to wifi at an internet cafe or something, then it may be handy to have some kind of notification like this. Personally, I'd like to know in that case.

The issue in earlier versions - where Firefox would simply fail to load the page when it couldn't be verified - impacted users within our organisation, because we have many people who haven't got internet access but still use Firefox to access internal sites, which managed to trigger a Safe Browsing request. I'm confident that in that case you are right: these users don't need to know. It would probably cause confusion and annoyance. But only because they are on a trusted network.

It might be nice for corporate environments to learn that the Safe Browsing service is cool and they should let that traffic flow freely, or otherwise set up a Safe Browsing proxy service or something. We control the majority of the Firefox installations here and could configure them to use our own Safe Browsing proxy/cache instead. I'm sure Google would be happier with any reduction of load. However, I do totally understand that Firefox is geared towards the home user.

In the extreme case, apparently many users in China were blocked from a lot of high-profile sites due to bug 832056. I'm assuming that's because the government blocked access to the Safe Browsing service? If so, it could be a can of worms. Perhaps it's not the responsibility of your web browser to report a networking issue like this.

Too hard basket? I won't be offended if this one gets shelved; my gut feeling was that something still wasn't quite right with how this works now that the previous behaviour has been resolved. But it's clearly never going to be a perfect science. Some level of Safe Browsing coverage is certainly better than nothing.
Comment 4

3 years ago
I mostly agree with what you say. When I said "intentionally disabled", I was thinking of cases where the user or a corporate (or government...) firewall just blocks Google outright, rather than people toggling the setting. You also have to deal with the case where the user is connected to the "internet" but is actually still being intercepted and redirected to a login portal (very common on "free" wifi).

I believe such cases are common enough that it's hard to strike a good compromise between a security warning being prominent enough that the user won't just click it away, and subtle enough not to be annoying. If the warning is too subtle (Browser Console log, greying out the options in preferences, ...), you may not see it until it is too late, so it doesn't help you much either.
Comment 5

3 years ago
(In reply to Gian-Carlo Pascutto [:gcp] from comment #2)

> If the connection goes down during the browsing session the problem is the
> same but the user would get no warning. This doesn't look like a good idea.
> I don't think there's a need either. We'll notice the server is down when we
> fail to connect at any point we need the connection (updates/verifying
> positives).

What's the current experience if Google's servers are unavailable due to, say, a bad/slow network connection to them? Sounds like Firefox doesn't say or do anything? (And so browsing to a malware site known to Google but not local SB data would not be blocked?)

I'd agree that the "I blocked Google in my hosts file" case, specifically, isn't interesting to support. It's relatively uncommon, and these kinds of problems are exactly why some people argue against doing site blocking that way.

I'm more sympathetic to maybe doing something about transient failures, if they're long enough. But I think the first step would be to add a telemetry probe to see how many users / how often such failures occur. For example, a probe that measures the interval between first failed attempt and first successful retry (or browser shutdown if there wasn't one).
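As an illustration, the interval probe proposed above could be sketched like this. This is hypothetical code, not Firefox's telemetry implementation, and it omits the shutdown-without-recovery case for brevity:

```python
# Hypothetical sketch of the proposed probe: measure the interval between
# the first failed Safe Browsing request and the first subsequent success.

class OutageProbe:
    def __init__(self):
        self.first_failure = None   # timestamp of the first failure, if any
        self.intervals = []         # recorded outage durations, in seconds

    def record(self, ok, now):
        if not ok and self.first_failure is None:
            self.first_failure = now                    # outage starts
        elif ok and self.first_failure is not None:
            self.intervals.append(now - self.first_failure)
            self.first_failure = None                   # outage over

probe = OutageProbe()
for ok, t in [(True, 0), (False, 10), (False, 20), (True, 45)]:
    probe.record(ok, t)
print(probe.intervals)  # [35]
```

A real probe would also need to report `first_failure` at browser shutdown when no successful retry happened, as suggested in the comment.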
(Reporter)

Comment 6

3 years ago
Worth considering also that this December, the version of the Safe Browsing API used by all versions of Firefox thus far is going to be switched off. I assume by then the latest version will be using v3 of the API. But at that point all the old versions of Firefox will no longer have any Safe Browsing protection, and their users will be unaware.
Comment 7

3 years ago
(In reply to Justin Dolske [:Dolske] from comment #5)

> What's the current experience if Google's servers are unavailable due to,
> say, a bad/slow network connection to them?

If the connection is slow, browsing anything that gets a hit (which could be a false positive or a real attack site) will be slow. If it's bad and cutting out, updates will be delayed.

> Sounds like Firefox doesn't say or do anything? (And so browsing to a malware site
> known to Google but not local SB data would not be blocked?)

The latter case never warns: the local databases are always consulted first, and then a hit is verified remotely. If the remote verification fails, there will be no warning. If the updates fail, we might be behind on getting sites into the local databases that are known remotely. But there's no situation where we find nothing locally and then do a remote lookup. (The new download protection with remote lookups in Firefox ~34 is another story...)
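The local-first, fail-open flow described above can be sketched as follows. This is an illustrative simplification with hypothetical names, not Firefox's actual url-classifier code:

```python
# Sketch of the lookup order described above: the local prefix database is
# always consulted first, and a remote "completion" check happens only on a
# local hit. If the remote check fails, the page loads with no warning
# (fail open); a remote-only lookup never happens.

def should_warn(url, local_prefixes, remote_lookup):
    if url not in local_prefixes:
        return False            # no local hit: we never go remote
    try:
        return remote_lookup(url)   # verify the local hit remotely
    except ConnectionError:
        return False            # remote verification failed: fail open

# Example: the remote service is unreachable (e.g. blocked in /etc/hosts),
# so even a local hit produces no warning.
def unreachable(url):
    raise ConnectionError("safebrowsing endpoint blocked")

print(should_warn("evil.example", {"evil.example"}, unreachable))  # False
```

Under this model, blocking the service degrades silently to "no warnings", which is exactly the behaviour the bug report describes.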

> I'd agree that the "I blocked Google in my hosts file" case, specifically,
> isn't interesting to support. It's relatively uncommon, and these kinds of
> problems are exactly why some people argue against doing site blocking that
> way.

Please reread comment 4 carefully, and consider whether you really want to deprecate...
China. (https://bugzilla.mozilla.org/show_bug.cgi?id=832056#c11, which led to an emergency
https://bugzilla.mozilla.org/show_bug.cgi?id=1023767)

> I'm more sympathetic to maybe doing something about transient failures, if
> they're long enough. But I think the first step would be to add a telemetry
> probe to see how many users / how often such failures occur. For example, a
> probe that measures the interval between first failed attempt and first
> successful retry (or browser shutdown if there wasn't one).

We can file bugs on that and start collecting.
(In reply to Gian-Carlo Pascutto [:gcp] from comment #7)
> (In reply to Justin Dolske [:Dolske] from comment #5)
> 
> > What's the current experience if Google's servers are unavailable due to,
> > say, a bad/slow network connection to them?
> 
> If the connection is slow, browsing anything that gets a hit (which could be
> a false positive or a real attack site) will be slow. If it's bad and
> cutting out, updates will be delayed.

I think the site may not load until the gethash request times out.
 
> Please reread comment 4 carefully, and consider whether you really want to
> deprecate...
> China. (https://bugzilla.mozilla.org/show_bug.cgi?id=832056#c11, which led to an emergency
> https://bugzilla.mozilla.org/show_bug.cgi?id=1023767)

Note that bug 1023767 fixed one bug but not the underlying issue, which is https://bugzilla.mozilla.org/show_bug.cgi?id=1024555.
(In reply to Sam Hall from comment #6)
> But at that point all the old versions of Firefox will no longer have any 
> Safe Browsing protection and the users will be unaware.

Those old versions will also have known and published security holes, so not having SafeBrowsing is really the least of your worries.
(In reply to Justin Dolske [:Dolske] from comment #5)
> I'm more sympathetic to maybe doing something about transient failures, if
> they're long enough. But I think the first step would be to add a telemetry
> probe to see how many users / how often such failures occur. For example, a
> probe that measures the interval between first failed attempt and first
> successful retry (or browser shutdown if there wasn't one).

We now have a probe that collects the HTTP response code for both endpoints:

https://telemetry.mozilla.org/new-pipeline/dist.html#!cumulative=0&end_date=2016-04-25&keys=__none__!__none__!__none__&max_channel_version=nightly%252F48&measure=URLCLASSIFIER_UPDATE_REMOTE_STATUS&min_channel_version=null&product=Firefox&sanitize=1&sort_keys=submissions&start_date=2016-03-07&table=1&trim=1&use_submission_date=0
https://telemetry.mozilla.org/new-pipeline/dist.html#!cumulative=0&end_date=2016-04-25&keys=__none__!__none__!__none__&max_channel_version=nightly%252F48&measure=URLCLASSIFIER_COMPLETE_REMOTE_STATUS&min_channel_version=null&product=Firefox&sanitize=1&sort_keys=submissions&start_date=2016-03-07&table=1&trim=1&use_submission_date=0

(In reply to Sam Hall from comment #6)
> Worth considering also that this December, the version of the Safe Browsing
> API used by all versions of Firefox thusfar is going to be switched off.

Google has told us they won't turn it off until we've switched to V4 of the protocol, which we have started working on.
While I can appreciate the desire to do something when users aren't protected, I think that this is likely to end up like the old "Firefox couldn't reach the Sync servers" warnings. There's nothing that the average user can do if their network is blocking safebrowsing.google.com.

I'm going to go ahead and mark this as WONTFIX and if UX disagrees, then feel free to reopen the bug.
Status: UNCONFIRMED → RESOLVED
Last Resolved: 2 years ago
Resolution: --- → WONTFIX