Bug 175104 (Closed): disable Link Prefetching by default
Categories: Core :: Networking, defect
Status: VERIFIED WONTFIX
Opened 22 years ago; closed 22 years ago
Reporter: grigorig; Assignee: darin.moz
I think Link Prefetching should be disabled by default (yes, I know it can be disabled in prefs.js). There should be a GUI-accessible preference so it can be re-enabled if wanted. The reasons are as follows:
- it can be abused, so that big images (or similar objects) are prefetched, eating bandwidth on the client side
- especially traffic/hit-oriented ad companies can abuse this
- many people (me included) don't like the fact that their browser does network activity in the background
- content will be preloaded that is never shown/visited
- because of this, if lots of users use preloading, bandwidth use will increase on the server side
If you don't agree, just close this bug.
Comment 1•22 years ago
> it can be abused
So can <iframe> and <img style="position:absolute; visibility: hidden">
Your other arguments are a lot more sane...
Comment 2•22 years ago
See bug 166648 for the GUI pref. Darin, was there any discussion of a default for this setting?
Status: UNCONFIRMED → ASSIGNED
Ever confirmed: true
QA Contact: benc → tever
Comment 4•22 years ago (Assignee)
See the comments I posted to the netscape.public.mozilla.netlib newsgroup:
> Yeah, a number of folks have commented about the volume-billed connection
> issue. Basically, there are two ways of looking at this issue.
>
> 1- websites can already cause things to be silently downloaded, so what good
> does disabling this particular "silent download" mechanism buy you?
>
> 2- prefetching is a browser feature; users should be able to disable it
> easily.
>
> I think it is important that websites adopt <link> tag based prefetching
> instead of trying to roll in silent downloading using various DOM/JS hacks.
> The <link> tag gives the useragent the ability to know what sites are up to,
> and we can use this information to better prioritize the prefetching. I
> suppose my half-concern is that a preference would encourage websites to stick
> with DOM/JS hacks.
>
> Ultimately, I think we will probably end up with a visible user preference
> just because that seems to make people feel comfortable. But, in the end it
> only partially solves the problem people think it solves :-/
>
> I suppose it matters to what extent <link> prefetching takes off.
Bottom line: websites want to do prefetching, and some are already doing it using JS/DOM tricks. Disabling link prefetching in the browser will only encourage websites to use more of these tricks, which gives the browser (the "user agent") less control over prefetching. Is that really what you want?

This browser feature is about rationalizing prefetching, so websites don't have to resort to hacky, bandwidth-abusive techniques. It gives the browser control, allowing it the option to do smart things like limiting the bandwidth allocated to prefetching.
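For reference, the `<link>`-based mechanism described above looks like the following (a minimal sketch; the URL is a placeholder, not one from this bug):

```html
<!-- Hint to the user agent that this page is likely to be visited next.
     The href is a placeholder. The browser fetches it at low priority
     and remains free to skip or cancel the prefetch entirely. -->
<link rel="prefetch" href="https://example.com/next-page.html">
```

Because the hint is declarative, the browser (rather than page script) decides when, whether, and how fast to fetch it, which is exactly the control the comment above argues for.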
For these reasons, I'm marking this bug WONTFIX.
Status: NEW → RESOLVED
Closed: 22 years ago
Resolution: --- → WONTFIX
Comment 5•22 years ago
An RFE bug for something that will address parts of this bug: http://bugzilla.mozilla.org/show_bug.cgi?id=175403
Comment 7•19 years ago
*** Bug 295550 has been marked as a duplicate of this bug. ***
Prefetch should be disabled by default, allowing the user, not a third party, to decide if they want prefetching. Prefetching can and will be abused by webmasters aiming to increase ad revenue. A simple variation that would solve all these problems is to only allow prefetching of pages within the same domain.

The iframe argument is also a valid one. Iframes should only be allowed to fetch from a third-party domain if the iframe is ultimately visible, i.e. all parent nodes are visible. Iframes fetching from within the same domain can operate as normal. Of course a clever webmaster could position an image above an iframe and thereby block the view of the iframe, but that can also be checked.
Comment 9•19 years ago (Assignee)
> The iframe argument is also a valid one
>
> I frames should only be allowed to fetch from a 3rd party domain if the iframe
> is ultimately visible. ie all parent nodes are visible.
>
> iframes fetching from within the same domain can operate as normal
>
> of course a clever webmaster could position an image above an iframe and
> therefore block the view of the iframe. but that can aslo be checked
We cannot change the behavior of iframes, or we would break the web. Therefore, your argument doesn't hold.
Comment 10•19 years ago
Break the web, how? If an iframe only loads when it is made visible, why does that break the web?

You have added cookie blocking, which "breaks the web", and correctly so, as it is a perceived (not real) security issue. Is this any different? If an iframe is being used as a web bug then it is a security issue, and I don't see the prefetching of web pages as any different. If the prefetching of pages and iframes is limited to the host domain, then that does not break the web; it may make a few spyware operators unhappy, but I thought we were against them.

In the speech regarding submitting a bug, a lot is said about providing explicit details. Perhaps the same could be asked of responses to bug reports; "breaking the web" is an inadequate response.

And I do apologise for submitting a duplicate. I did do the required search but failed to extend the search from "prefetch" to "prefetching". Perhaps a feature request is required for Bugzilla such that it automatically extends words in its search.
Comment 11•19 years ago (Assignee)
> break the web, how?
> if an iframe only loads when it is made visible why does that break the web?

There are web apps that require the ability to load hidden iframes in order to do stuff.

> you have added cookie blocking, that "breaks the web". and correctly so as it
> is a percieved (not real) security issue. is this any different?

Yes, it is very different. Cookie blocking is something web browsers have always exposed to users. Web apps generally behave gracefully when cookies are disabled because it is reasonable behavior for a browser. However, if iframes stopped working, then Mozilla would be blamed for not being compatible with the web (a.k.a. Internet Explorer).

Anyway, consider the following scenario: load a hidden iframe from the same origin, and inside that iframe run some script that submits an HTML form, which results in the iframe's contents being replaced by some other domain's content. Would you deny that form submission from running? Or would you refuse to let the response load? Neither approach is very good. Hidden iframes are an essential component of the web, used to implement ad-hoc forms of RPC for ages.
Comment 12•19 years ago
> Would you deny that form submission from running? Or, would you refuse to let
> the response load? Neither approach is very good. Hidden iframes are an
> essential component of the web, used to implement ad-hoc forms of RPC for ages.
Yes, I would deny that response from loading if it was from a third party, at least until the iframe became visible.

But you are correct in saying that it is very easy for the same process to occur by other means; turning off cookies is really little more than a panacea for a nervous world. In reality it does nothing. I myself am about to implement a simple session-state system that does not rely on cookies or IP tracking and only uses the bare minimum of URL rewriting. It only lives for the duration of the session, but that's all I need.

Still, that leaves the original argument: prefetch should be off by default. It is an unnecessary load on the browser, the network and the user's bandwidth, and it is not something that would break the web by turning it off. If it does break things, then it should be broken now and not when it becomes standard practice.

I don't like the idea of a webmaster or Mozilla telling me what I should load or not load. Prefetch is unnecessary. And I certainly didn't think much of having a cookie dialogue pop up in the middle of a Google session. My VERY FIRST THOUGHT was that Firefox had been compromised, and that sent a chill down my back. It wasn't until I viewed the source code that I realised otherwise. How many users view the source? Not many; lots would simply worry about it and perhaps go back to IE.
Comment 13•19 years ago (Assignee)
Not many users enable the cookie dialogs. Users who do are more tech-savvy.

You still haven't convinced me that supporting <link rel="prefetch"> is somehow worse than not supporting it. Without this mechanism, people will just resort to more ad-hoc approaches. Granted, Google search would probably not try to prefetch top results without a clean mechanism like this in the browser, but Google isn't trying to do you any harm in that case. If you're worried about people abusing this, then you ought to worry about much more than just link prefetching.
Comment 14•19 years ago
There is no advantage in having it enabled. Webmasters and designers will use it if it is there; the more that become aware of it, the more will do it. (I only became aware today.) If the option to enable it is visible to the user, then they can enable it with the promise of possibly a better surfing experience.

Unless there is only one link on the page, what is the purpose of it? If the user chooses to click another link, go back, or close the page, then they have wasted bandwidth for no apparent reason. In essence it's technology that serves very little purpose and may do more damage than good.

Google is already using it, which means a huge increase in network activity. What if all browsers used this technology? Google serves millions of pages a day; automatically you are DOUBLING the number of pages served. Now imagine if every webmaster did it. I mean, it's good enough for Google! Mozilla reverses years of hard work improving bandwidth by doubling the network activity. And for what? So the user gets a slightly better experience.

It is near meaningless to broadband users; dial-up users may benefit more, so they can enable it, but if their costs increase by doing so, they can leave it disabled. Disabled by default is the only sensible solution. And then add it to the GUI as is outlined in another feature request.
Comment 15•19 years ago
One more comment. Saying that if you don't support it webmasters will simply work around it is like saying we may as well turn off cross-domain cookie protection or cross-domain JavaScript security because webmasters will work around them. Well, they do, but is that a reason not to implement them? No, it's not. Most webmasters won't do it if it requires a hack, especially if it's the user's choice not to accept prefetching.

What you are setting in motion is something that may be looked upon in the future as a mistake. Think of all the mistakes made in the past that end up giving us grief now; just look at SMTP for a good example. If it had been done right the first time we wouldn't have so much spam to contend with now.
Comment 16•19 years ago
>Google is already using it. which means a huge increase in network activity.
>What about if all browsers used this technology. google serves millions of pages
>a day. automatically you are DOUBLING the number of pages served.
Huh? Why "double"? Most people probably do visit the first result of their Google search. For those cases, there was zero change in the number of served pages.

As for your pay-per-click concern, servers can find out whether a request is a prefetch (check for an X-Moz header containing "prefetch").
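The header check mentioned above can be sketched as follows. This is an illustrative Python snippet, not code from any real server; it assumes request headers are available as a dict with the exact key "X-Moz" (a real server would normalize header case first):

```python
def is_prefetch(headers):
    """Return True if the request carries Mozilla's prefetch marker.

    headers: dict mapping HTTP header names to values. We assume the
    key "X-Moz" appears exactly as written; the value comparison is
    made case-insensitive to be safe.
    """
    value = headers.get("X-Moz", "")
    return "prefetch" in value.lower()

# A hit counter or billing handler could then skip prefetch requests:
request_headers = {"X-Moz": "prefetch", "User-Agent": "Mozilla/5.0"}
if is_prefetch(request_headers):
    pass  # don't count this hit / don't serve expensive content
```

This addresses the pay-per-click concern directly: prefetch traffic is distinguishable from real visits, so a site can exclude it from its statistics.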
Comment 17•19 years ago
> huh? why "double"? most people probably do visit the first result of their
> google search. for those cases, there was zero change in the number of served
> pages.
Do they? Is that what you do?

I don't. I read down the page and pick the item that seems most relevant. Sometimes that's number one, sometimes it's on page 3, sometimes I perform another search, sometimes I close the page and make a coffee.

And that's only Google. What if every webmaster does it on every page, putting a prefetch on the link they think you will follow next? Sometimes they will be right, sometimes wrong. For every page with a prefetch there is a minimum of three options:
1/ follow the prefetched link
2/ close the page
3/ go back
By my reckoning that's a 1 in 3 chance the webmaster will get it right; add more links to the page and those odds go down.

I don't get it. Aren't we for a better web? Or are you guys hooked on gimmicks? Do you think Firefox is sold on gimmicks? No, it's sold on security. Oh, people seem to like tabs, and I love some of the add-ons, but if IE wasn't so BAD, Firefox would be nowhere to be seen, and that means the Mozilla core would still be an educational tool only.

Prefetch is a gimmick with no *real* advantage. Apart from trying to undo the arguments, no one has yet put forward a valid reason for enabling prefetch by default. The only argument put forward is that webmasters will do it anyway, by other means; that's hardly a valid reason, as I have already pointed out.
Comment 18•19 years ago
Yes, most people use the first hit. Indeed, personally I almost exclusively use the first hit, to the extent that I mostly just use "I'm feeling lucky" these days, and don't bother looking at the search results at all.

You can disable prefetching if you are paranoid about something bad happening (what, exactly, could happen, I have no idea, but there you are).
Comment 19•19 years ago
Again, no technical reason is given for having prefetch enabled.

If you use the first item exclusively, and even more so the "I'm feeling lucky" button, then your use of the search engine is just for blind surfing, not for serious use of any sort, so your comments are irrelevant and out of place. Most people, I believe, use a search engine because they are looking for something, not because they are bored and want Google to find something interesting for them to do.

Explain in technical terms why prefetch is an important feature and not just a gimmick. My opinion is it's a gimmick, one that increases bandwidth unnecessarily. If people want to prefetch a site there is dedicated software out there to do that; it doesn't rely on what the webmaster thinks should be prefetched, and it doesn't force other people's idea of what path you should follow on you.
Comment 20•19 years ago (Assignee)
Link prefetching is a tool for websites to leverage; how they leverage it is up to them. Not enabling link prefetching by default effectively eliminates the feature, as web sites will not be able to utilize it.

There are plenty of sites that prefetch mouse-over images, etc., and this mechanism gives them a technically better way to achieve that result. Informing the browser of a desire to prefetch items allows the browser to optimize the process (e.g., only prefetch one item at a time, kill prefetches immediately if the user does something in another window, etc.).
Comment 21•19 years ago
> Again no technical reason is given for having prefetch enabled.

I'm sorry, I thought this was obvious. It makes the browsing experience faster.

> If you use the first item exclusivally and even more so the I feel lucky
> button then your use of the search engine is just for blind surfing. not for
> serious use of any sort, so your comments are irrelavant and out of place.

If we're going to insult each other then this conversation is over.
Comment 22•19 years ago
> link prefetching is a tool for websites to leverage. how they leverage it is
> up to them. not enabling link prefetching by default effectively eliminates
> the feature as web sites will not be able to utilize it. there are plenty of
> sites that prefetch mouse over images,

Thank you for providing the relevant answer. Although I strongly disagree that the advantages outweigh the disadvantages, at least I now understand better the thinking behind it.

Perhaps as a "middle road" users can be made aware when a prefetch is occurring; that way they can choose to abandon it if they require, and users who were unaware of the existence of this 'feature' become aware of it. A simple flag in the status bar would possibly help, although many may not see it. But that's more of a GUI feature, I guess, than a core feature.

>> serious use of any sort, so your comments are irrelavant and out of place.
> If we're going to insult each other then this conversation is over.

Your comments were irrelevant. Unless you have access to Google's statistics, saying most people use the first returned result is just a wild guess designed to counter my comments. Perhaps Google has done the studies, and perhaps they found X percent do take the first link, and that's why they added prefetch. But I don't know that and you don't know that. And must I say, if that were truly the case, why do webmasters strive to get to the first page if only first position counts? Also, the percentage of first-position hits must decline on repeated searches as people try to narrow down to the most relevant result. Not even Google will tell you they get it right first time, every time.
Comment 23•19 years ago
(In reply to comment #22)
> Your comments were irrelevant. unless you have access to googles statistics,
> saying most people use the first returned result is just a wild guess

We may not have access to the Google statistics, but we *do* have access to http://www.google.com/webmasters/faq.html#prefetching where they say "This tag is only inserted when it is likely that the user will click on the first link." Do some searches -- I had to bring up several before I found one with a prefetch.
Comment 24•19 years ago
I had started to wonder if this was the case, after my comments about the declining need for prefetch based on the number of times a user has searched for a particular phrase.

On an interesting side note, there is an Apache module designed to prevent Google and others from prefetching, mod_noprefetch: http://modules.apache.org/search.php?id=858

It seems even webmasters are being adversely affected by it.
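A server does not strictly need a dedicated module for this. The following Apache 2.x fragment is a sketch of the same idea using the stock mod_setenvif and mod_authz_host directives (this is not mod_noprefetch's own configuration syntax, just a generic equivalent): tag any request carrying the X-Moz prefetch header and refuse to serve it.

```apache
# Mark requests that Mozilla flags as link prefetches.
SetEnvIf X-moz prefetch is_prefetch

# Serve everyone except requests carrying the prefetch marker.
Order Allow,Deny
Allow from all
Deny from env=is_prefetch
```

The prefetch then fails quietly in the background; a real click on the same link arrives without the X-Moz header and is served normally.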
Comment 25•19 years ago (Assignee)
Yes, link prefetching is designed to be something web sites may easily block if desired. It's not surprising to see that some sites would choose to block it.