Open Bug 1206443 Opened 10 years ago Updated 3 years ago

We immediately warn about reload prevention even for reloads scheduled for a long time from now (e.g. 20 minutes) with no indication to the user

Categories

(Firefox :: Disability Access, defect, P5)

Version: 47 Branch
Hardware: Other / Linux
Type: defect

Tracking


UNCONFIRMED

People

(Reporter: Nick_Levinson, Unassigned)

References

(Blocks 1 open bug)

Details

I have Edit > Preferences > Advanced > General > Accessibility > Warn Me When Websites Try to Redirect or Reload the Page checkmarked (turned on). This can produce an information bar that says, "Firefox prevented this page from automatically redirecting to another page." It's apparently common, at least with the websites I visit, that if this happens once for a website then it will likely happen many times for that website (probably because of that site's design philosophy). I like the protection, but I'd rather the page did not jump up and down like a rabbit while I'm trying to click something or just trying to read, especially when many pages are jumpy anyway because they're assembling pieces from many files and servers.

Perhaps, instead of an info bar, an icon could show up in an existing menubar, either as an option or altogether instead of the bar. Or, if you're keeping the bar, once the bar appears in a tab, it should stay present even when blank (cropping the viewport at the top) so the page doesn't jump because of the bar. Bug 680565 is likely related. I plan to add this to the tracker at bug 685496.
Can you provide a screencast of what you mean by the page jumping up and down? I'm confused why the infobar repeatedly appearing would cause that. What sites are you seeing this on?
Flags: needinfo?(Nick_Levinson)
I'll have to wait until I see the effect again to identify a specific website. The effect occurs because the infobar occupies a certain height and the page is rendered below it: if I select the infobar's Allow button, the infobar goes away; if the website tries to automatically redirect or reload again, the infobar comes back; if I select Allow again, it goes away again; and so on, many times over.

I suspect the jumpiness is due to new page components (such as ads) appearing at or near the top of the page, pushing everything below down, then the Allow button (selectable by alt-a) causing the infobar to go away, so the page jumps up by the amount of the infobar's height. If I'm reading below all that, what I'm reading jumps down and up. If the infobar appears 20 times (and it has), that can be as many as 2 x 20 jumps. If I'm not just reading but trying to click something, the jumpiness means I may accidentally click something else instead, perhaps irreversibly.

Ads may be the most suspect, because they usually come from different servers, some of which can be busy and a bit slow. It's possible in HTML/CSS to reserve rectangles for ads and other replacement content (sketched below), but probably not all websites do that, so the lack of space followed by content arriving means things get pushed down. I don't want to turn the feature off; it's usually useful. It's just on some websites that this problem shows up. Sometimes I stop Allowing and just use the page as partly drawn, or close the tab, but that's not always a good idea.
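Here's a minimal sketch of that reservation technique, with a made-up class name and an assumed banner height; I don't know what the affected sites actually do:

    <!-- Reserve the slot's full height up front so a late-arriving ad
         can't push the text below it down when it finally renders. -->
    <style>
      .ad-slot { min-height: 250px; /* assumed banner height */ }
    </style>
    <div class="ad-slot"></div>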
Flags: needinfo?(Nick_Levinson)
A testcase would be helpful, even a case where this only happens once or twice, and not for the main page (i.e., the main page content loads completely and it's frames that break). Do you know if this is a regression, i.e., did it used to work? I somewhat suspect that the code that was added in bug 1055464 might have made this happen for subframes, when the previous code might have only applied to toplevel navigations. Not sure off-hand if that's true, though.
Component: General → Disability Access
Flags: needinfo?(Nick_Levinson)
I don't think I can write a page that would satisfy that request, although I write HTML, since I think frames are deprecated and I never write them, and I'd also have to write broken frames, which has too many implications. The effect doesn't happen on my own websites, and, if the problem is with ads, I don't want to set up a fake website and put Google ad scripts on it; Google might penalize me for creating a bad website. Bug 1055464 is about 2 years old. I might have experienced this for 2 years or more, or maybe not; I don't remember. The bug report itself is too opaque for me to understand, including an attachment. It's also possible I experienced it for more than 2 years but for different reasons no longer applicable.
Flags: needinfo?(Nick_Levinson)
Priority: -- → P5
philly.com is a fairly good example. That's what I typed into the address bar; at some point, it forwarded to <http://www.philly.com/beta?c=r>. I clicked Allow at least 20 times on Sunday, when I stopped counting, and probably 40 by the time I stopped clicking. It wasn't as rabbit-jumpy as some sites, but it did move up and down while redrawing the page. I couldn't figure out what part of the page was changing after the first 20; page elements seemed to be the same, but I may have missed something. While I saw multiple places for ads, I'm not sure I saw any ads, and the word "Advertisement" is rendered 9 times but cannot be found with ctrl-f in caseless mode on the rendered page or in the source code. The source code is over 2200 lines and may have 55 scripts; the page is not in my style of coding, but it may be competent or good. The site serves a large city and probably has relatively high traffic. The browser version is now 47.0, and I don't think it was updated since I visited <philly.com>, so it was probably 47.0 then.
Version: 40 Branch → 47 Branch
AFAICT this isn't a regression (e.g. I see the same behaviour on that site in Firefox 34), but the gist of it seems to be that this website has an autorefresh (<meta refresh>) built in with a delay of 20 minutes. Rather than showing that delay, or not showing the notification until the refresh actually happens, we show it immediately and give no indication of the delay, and clicking "Allow" also refreshes the page immediately. That's a bit dumb. The sensible thing to do would be one or more of:

- show the notification after 20 minutes instead of immediately
- make the notification say it will only do something after 20 minutes
- make the notification permanently allow refreshes from the same place to the same place if that's what the user allowed (so you don't get prompted for the same refresh/redirect more than once)

All of these are not trivial to implement. TBH, I don't know why we ship this feature accessible through the prefs, because it has loads and loads of problems, and I would sooner remove it completely than prolong its suffering. Still keeping this as P5 because of inertia. I'd take a not-too-invasive patch, but it's unlikely I'll work on this myself.
Summary: preventing automatic redirection makes many pages jumpy → We immediately warn about reload prevention even for reloads scheduled for a long time from now (e.g. 20 minutes) with no indication to the user
I'm not sure about that. I revisited the site tonight. The meta refresh tag varies.

After clicking Allow just once:

    <meta http-equiv="refresh" content="0;url=http://searchguide.windstream.net/search/?q=http://www.philly.com/beta&t=0"/>

After clicking Allow just one more time:

    <meta http-equiv="Refresh" content="1200;url=/beta?c=r" />

I don't know enough to diagnose it myself.
(In reply to Nick Levinson from comment #7)
> I'm not sure about that. I revisited the site tonight. The meta refresh tag
> varies.
>
> After clicking Allow just once:
>
> <meta http-equiv="refresh" content="0;url=http://searchguide.windstream.net/search/?q=http://www.philly.com/beta&t=0"/>

At first glance, this looks like your ISP or something else MITM'ing your request, if it's sending you via windstream.net, which seems unrelated to philly.com. :-\

> After clicking Allow just one more time:
>
> <meta http-equiv="Refresh" content="1200;url=/beta?c=r" />

This is the one I was talking about. The 1200 there is a delay in seconds, so 1200 / 60 = 20 minutes. You get prompted immediately, which, as comment #6 said, is "a bit dumb". :-) If you turn off the setting, you'll just see that page, and if you then have the patience to wait 20 minutes, it'll refresh after the 20 minutes have passed, so that the news site always updates!
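To annotate the two tags you pasted (the comments are mine; the tags are verbatim from comment #7):

    <!-- delay of 0 seconds: the redirect fires immediately -->
    <meta http-equiv="refresh" content="0;url=http://searchguide.windstream.net/search/?q=http://www.philly.com/beta&t=0"/>

    <!-- delay of 1200 seconds = 20 minutes: the refresh fires long after our warning appears -->
    <meta http-equiv="Refresh" content="1200;url=/beta?c=r" />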
I'm not sure which WiFi network I was using the night of comment 7 above, but I'm using another one now, and this time at philly.com I don't get the info bar at all, even though my Firefox setting described in the opening post is still in effect and I've cycled power since comment 7 so FF with its privacy settings wouldn't have a cache from then. I agree that the presence of windstream.net doesn't make much sense and maybe a "man in the middle" was playing a game, but I hadn't seen any windstream page until today when I explicitly visited that site, so that suggests that MITM either failed or didn't happen. Since at philly.com today I couldn't get the info bar, I couldn't see what the source code would say after clicking Allow just once. My guess is that philly.com did something to get rid of the refresh on that page in the last few days, in which case we may have to await finding another afflicted website.
(In reply to Nick Levinson from comment #9)
> My guess is that philly.com did something to get rid of the refresh on that
> page in the last few days, in which case we may have to await finding
> another afflicted website.

Yes, it seems to be gone. But that's OK; it's pretty easy to write another such page if we wanted to. At this stage the problem is well understood; the question is simply about time/impetus/willingness to work on this feature (see comment 6).
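For the record, a minimal testcase would look something like this (the 1200-second delay mirrors the philly.com tag; refreshed.html is just a placeholder target):

    <!doctype html>
    <html>
      <head>
        <title>Delayed meta refresh testcase</title>
        <!-- Schedules a navigation to refreshed.html 1200 seconds (20 minutes)
             after load. With the "warn me" accessibility pref enabled, the
             infobar appears as soon as this page loads, not when the refresh
             actually fires. -->
        <meta http-equiv="refresh" content="1200;url=refreshed.html">
      </head>
      <body>
        <p>This page schedules a refresh 20 minutes from now.</p>
      </body>
    </html>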

For whatever reason this bug seems to have become a spam magnet, so locking.

Restrict Comments: true
Severity: normal → S3