With ETP enabled, jameshardie.com shows a warning "Google believes that a bot is making this request, and so your submission cannot be processed" (due to a low "recaptchaScore" in the response to its POST to /api/RecaptchaData/ValidateRequest)
Categories: Core :: Privacy: Anti-Tracking, defect, P3
People: Reporter: dholbert, Unassigned
References: Blocks 1 open bug
Attachments: 1 file (image/png, 788.66 KB)
STR:
- Be sure you have Enhanced Tracking Protection enabled
- Visit this page in Fenix (Firefox Beta or Nightly on Android), or in Firefox with RDM (390px wide viewport or less):
https://jameshardie.com/forms/request-for-inspiration
(I've been able to repro in Firefox Nightly in RDM mode with a 390px-wide viewport to trigger the mobile UI, too.)
- Dismiss any first-load notifications/popups (e.g. there's a location one which I reject, and a cookie notification from the website that I just hit "x" on)
ACTUAL RESULTS:
Within a few seconds of me just looking at the form-to-fill-out, red text appears that says:
Google believes that a bot is making this request,
and so your submission cannot be processed.
Please call [phone number] for assistance.
EXPECTED RESULTS:
No such notification.
I managed to catch the JS that's adding this label. It's just an inline script in the page:
<script type="text/javascript">
  var onloadCallback = function() {
    grecaptcha.execute('6LcAoNQUAAAAAO2ys6suz1uAo-nn3h360e7Sr5ZN', { action: 'rfinsp' }).then(function (token) {
      recaptchadata = {
        'Token': token
      };
      $.ajax({
        url: '/api/RecaptchaData/ValidateRequest',
        type: 'post',
        data: JSON.stringify(recaptchadata),
        dataType: 'json',
        contentType: 'application/json',
        success: function (datastring) {
          data = $.parseJSON(datastring);
          $('#recaptchaSuccess').val(data.success);
          $('#recaptchaScore').val(data.score);
          recaptchaSuccess = $('#recaptchaSuccess').val();
          recaptchaScore = parseFloat($('#recaptchaScore').val());
          if (recaptchaSuccess == 'false' || recaptchaScore < 0.5) {
            $('#recaptchaWarning').append('<span class="errorDesc">Google believes that a bot is making this request, and so your submission cannot be processed. Please call 1-888-J-HARDIE (1-888-542-7343) for assistance.</span>');
          }
        },
        error: function (xhr) {
          console.log(xhr);
          $('#recaptchaWarning').append('<span class="errorDesc">Google believes that a bot is making this request, and so your submission cannot be processed. Please call 1-888-J-HARDIE (1-888-542-7343) for assistance.</span>');
        }
      });
    });
  };
</script>
In my case (of one particular load on desktop where this happened), my recaptchaScore
was "0.3" (which is lower than the hardcoded 0.5 threshold in the above script, so it showed the warning).
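The warning decision boils down to a simple threshold check. Here is a minimal sketch of that logic pulled out into a standalone function for clarity (the function name is mine, not the site's; both inputs are strings because the page round-trips them through hidden inputs):

```javascript
// Sketch (my extraction, not the site's code): the warning in the inline
// script above fires when verification failed or the score is below 0.5.
// Both arguments arrive as strings via the hidden-input round-trip.
function shouldShowBotWarning(successStr, scoreStr) {
  return successStr === 'false' || parseFloat(scoreStr) < 0.5;
}

console.log(shouldShowBotWarning('true', '0.3'));  // true  (score below threshold)
console.log(shouldShowBotWarning('true', '0.7'));  // false (score high enough)
console.log(shouldShowBotWarning('false', '0.9')); // true  (failed verification always warns)
```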
My first impression is that this happens more often when I have ETP enabled (as it is, by default), so filing this under Privacy:Anti-Tracking for now. Though it might end up being a WebCompat bug and/or something to do outreach to Google/Recaptcha folks about, if this is a script snippet that came directly from them (not sure).
Comment 1 (Reporter) • 5 years ago
Comment 2 (Reporter) • 5 years ago
It seems like the warning from the comment 0 script is triggered by the response to its POST to /api/RecaptchaData/ValidateRequest.
In network devtools, I do indeed see an entry for a POST to https://www.jameshardie.com/api/RecaptchaData/ValidateRequest , whose "Response" panel shows the following:
"{\n "success": true,\n "challenge_ts": "2020-07-14T18:32:29Z",\n "hostname": "www.jameshardie.com",\n "score": 0.3,\n "action": "rfinsp"\n}"
Note the score of 0.3, which is the score I'm getting that triggers this error.
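Incidentally, the quoted body above is double-encoded: the endpoint returns a JSON string whose contents are themselves JSON text, which explains the extra `$.parseJSON(datastring)` call in the script's success handler (jQuery's `dataType: 'json'` does the first decode). A sketch of the two-step parse, using an abbreviated sample body (my own, modeled on the response quoted above):

```javascript
// Sketch of the double parse implied by the quoted response body.
// The body is a JSON *string* containing JSON text, so it must be
// decoded twice before the score is usable.
const rawBody = '"{\\n \\"success\\": true,\\n \\"score\\": 0.3,\\n \\"action\\": \\"rfinsp\\"\\n}"';

const datastring = JSON.parse(rawBody); // first parse: unwrap the outer string
const data = JSON.parse(datastring);    // second parse: the actual payload

console.log(data.score);   // 0.3
console.log(data.success); // true
```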
Comment 3 (Reporter) • 5 years ago
If I disable ETP, then that POST has a score of 0.7 instead, and I don't get the warning message (because 0.7 is high enough).
And in Chrome, that POST has a score of 0.9 (even if I use Chrome with the Firefox UA string "Mozilla/5.0 (Android 10; Mobile; rv:79.0) Gecko/79.0 Firefox/79.0" -- so it's not simple UA sniffing).
Note: for the investigation in comment 2 and this comment, I'm using RDM in Desktop versions of Firefox & Chrome, emulating a mobile browser with a viewport of 390px by 700px.
Updated (Reporter) • 5 years ago
Comment 4 (Reporter) • 5 years ago
FWIW I hit this in three separate not-entirely-brand-new profiles (and the first time was basically the only time I had loaded jameshardie.com today, so it happened without any user-initiated mass-reloading). Here's where I've seen this:
(1) Nightly on my phone
(2) Beta on my phone
(3) Nightly on my Desktop
In a fresh Firefox profile on Desktop (with RDM mode 390x700px), I got a score of 0.9, just like in Chrome. But then, after reloading a few times and setting a Firefox-for-Android UA (from comment 3), my score went down to 0.7. I'm not sure if the reduction was from the custom UA or from the reloading. And after a few more tweaks & loads, my score went down to 0.3 again.
Now I'm wondering if this is partly just a byproduct of mass-reloading and noise, and perhaps GeckoView's thumbnail-updating service is reloading the site in the background to get an updated thumbnail in a way that looks like a bot? (That's my only explanation for how I might've hit this the first time today -- I do have several open background tabs on various jameshardie.com pages in my mobile browser, which might have all refreshed their thumbnails at the same time in a way that made me look like a bot, I'm guessing.)
Comment 5 • 5 years ago
Thanks for the detailed STR and investigation. I tried reproducing on Nightly Desktop in RDM in a fresh profile, a fairly fresh profile (a few hours old), and my personal profile and wasn't able to. I didn't check my score for all of them, but the ones I checked were 0.9. I also tried my personal profile for Firefox Beta on my phone (Fenix-based), and did not receive the warning message.
It's worth noting that Fenix and Nightly have different ETP settings, so the fact that you can reproduce on both makes me think it was your network. In Fenix we block cookies from trackers on the level 2 blocklist (which includes *.google.com). But we do have an exemption for google.com/recaptcha/, meaning cookies will not be blocked for requests that include that path. On Nightly we block all of the same things, and then partition the remaining cookies by the first-party site. This means the cookies that are sent with requests to google.com/recaptcha/ are site-scoped. If anything, I'd expect this to only reproduce in Nightly desktop for that reason.
I'll block the dFPI breakage bug to make sure we keep track of this.
Daniel: Can I ask you to retest now that it's been a few days? I wonder if this was something transient based on your network.
Comment 6 (Reporter) • 5 years ago
I'm getting 0.9 today (after loading the site a few times), and haven't seen the error message (all using Firefox Nightly [Fenix] on Android).
So: could easily have been something transient. I'll resolve as WFM for now, but will reopen if it recurs.