Closed
Bug 109750
Opened 23 years ago
Closed 23 years ago
[RFE] prepare the pages for displaying before you actually click on them
Categories
(SeaMonkey :: UI Design, enhancement)
SeaMonkey
UI Design
Tracking
(Not tracked)
VERIFIED
WONTFIX
People
(Reporter: hhielscher, Assigned: trudelle)
Details
(Whiteboard: dupeme)
Couldn't we gain a large improvement in visible browser performance by preparing pages for display before the user actually clicks on them? A simple example: browsing through an image gallery with "next" and "back" buttons. When I open a page, I spend a few seconds viewing the picture. During this time Mozilla is doing nothing, so why don't we load the next page into the history in the meantime? Then, when I actually click, it is already there. Like in the fairy tale of the hare and the hedgehog. [1]

I know that preloading/prerendering becomes much more difficult and complex when there are several links on one page. In that case I suggest ordering the pages by the following criteria:
- the link(s) under or near the mouse cursor
- the link(s) most visited during previous visits (URL, mouse position?)
- how often the same link is used on the same page
- the importance of the link (e.g. load <H1><A href ...> before <H3><A href ...>)

There might also be problems with dynamic pages? We would also need an option to limit this caching behaviour to specific addresses (e.g. only the intranet).

--
[1] A German version is available at http://gutenberg.aol.de/grimm/maerchen/haseigel.htm
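The ordering criteria listed above could be sketched as a simple scoring function. This is purely illustrative: all names here are hypothetical, and nothing in Mozilla implements this.

```javascript
// Hypothetical sketch of the prefetch-priority criteria from the report.
// A "link" record here is an assumed shape, not a real DOM or Mozilla type.
function prefetchScore(link) {
  let score = 0;
  if (link.nearCursor) score += 4;            // link under or near the mouse cursor
  score += Math.min(link.pastVisits, 3);      // most visited during previous visits (capped)
  score += Math.min(link.occurrences - 1, 2); // repeated occurrences on the same page
  if (link.headingLevel === 1) score += 2;    // <H1><A href ...> before <H3><A href ...>
  else if (link.headingLevel <= 3) score += 1;
  return score;
}

// Candidate links would then be prefetched in descending score order.
function prefetchOrder(links) {
  return [...links].sort((a, b) => prefetchScore(b) - prefetchScore(a));
}
```

A real implementation would also need the intranet-only restriction mentioned above, e.g. a whitelist check before a link is even scored.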
This is definitely a duplicate, although I can't seem to find the bug in question. As far as I remember, it was decided that this feature would be a very bad idea for a number of good reasons. There is no guarantee that the user will want to view the pages being downloaded in the background. This wastes bandwidth, spams servers, messes up server logs, and causes other badness. In addition, there are umpteen "download accelerators" out there that do this sort of thing. Mozilla should try to do its designated job well, and not replace every other browsing aid out there. If I can't find the dupe, I'm tempted to mark this WONTFIX...
Whiteboard: dupeme
Reporter
Comment 2•23 years ago
> This wastes bandwidth, spams servers, messes up server logs and other badness.

This is no argument against it. It is always useful when browsing local files, and you can choose whether to use it on your intranet. And if you use it for browsing the internet: you pay for the gigabytes, so it is your choice (or your provider's).

> In addition, there are umpteen "download accelerators" out there that do this sort of thing.
> Mozilla should try to do its designated job well, and not replace every other browsing aid out there.

Can you give me some URLs for tools that work with Mozilla and Linux, please? Do they prepare the pages for viewing, or do they just download the data? If they only do the latter, they are useless on a fast local network.
If you mean pre-process the data rather than pre-cache, then you still need to download the pages yet to be viewed, and all the above problems remain. In any case, it seems like a bad hack when an actual reduction in page load time is what we should be aiming for.

I don't understand your arguments about browsing over a local intranet. For most people, the bottleneck in page rendering is the downloading of the actual information and not the rendering. Over an intranet, this problem is greatly reduced, and current page load time is more than acceptable in this case.

The probably minor benefits of the proposed feature do not warrant the time that would be spent implementing it. Our efforts should be directed toward an actual reduction in page load and rendering time, rather than on creating the illusion of speed.
Reporter
Comment 4•23 years ago
First: there is no "illusion of speed". What counts is the user and the time he or she has to wait. The page load time shouldn't be "acceptable"; it should be close to zero. My RFE is to pre-process and pre-cache the pages. I understand that the work should go into reducing the normal page load and rendering time now. But once most of those improvements are done, it's time to think about other ways to reduce page load and rendering times.
The method suggested is a 'brute force' way of tackling the problem: everything that might be needed is downloaded in case the user needs it. This approach is not a good solution, and this is the sense in which I meant it was an 'illusion' of speed.

Unfortunately, the user is not the only consideration when designing a browser. If we were to implement this, expect sysadmins to look less favorably on a browser that constantly spams their servers, messing up their log file analysis and hogging their bandwidth. As an example, the favicon feature in IE5+ famously fills server logs with requests for (possibly non-existent) favorite icons. We could be guilty of the same behaviour, on a much larger scale, if this feature were implemented.

In any case, this discussion is probably best kept to the newsgroups. We've already cluttered this bug report enough! I'm going to leave this unconfirmed for the moment; I expect it to be a duplicate. In the meantime, you could start a discussion on this in the performance newsgroup, if you like, where others could probably help you out. Thanks for using Bugzilla and reporting bugs!
Comment 6•23 years ago
Of course this has been requested previously, one example of such a request is bug 12274.
Comment 7•23 years ago
If you think you're going to look at all the images in an image gallery, you should just use the linked images bookmarklet <http://www.squarefree.com/bookmarklets/pagelinks.html#linked_images>. Pre-caching every link that you *might* click on would be a huge waste of bandwidth.
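In the spirit of that bookmarklet (a sketch of my own, not its actual source), the core of "linked images" is just filtering a page's link targets down to those that point at image files:

```javascript
// Sketch only: collect hrefs that point at image files, so they could be
// displayed inline on one page instead of clicked through one by one.
const IMAGE_RE = /\.(gif|jpe?g|png)$/i;

function linkedImageUrls(hrefs) {
  // Strip any query string or fragment before testing the extension.
  return hrefs.filter((h) => IMAGE_RE.test(h.split(/[?#]/)[0]));
}
```

In a real bookmarklet the `hrefs` array would come from the page's anchors, e.g. `[...document.links].map((a) => a.href)`.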
Comment 8•23 years ago
This would additionally require, I think, a way for Mozilla to prioritise its downloads, so that the opportunistic download-ahead didn't starve the page downloads that the user _explicitly_ asked for.
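That prioritisation requirement can be sketched as a two-level queue, where explicit requests always drain first and speculative prefetches only use idle connection slots. This is a hypothetical illustration, not anything from Mozilla's networking code.

```javascript
// Hypothetical sketch: a download scheduler where user-requested fetches
// always take precedence over speculative prefetch guesses.
class DownloadScheduler {
  constructor(maxConcurrent = 2) {
    this.maxConcurrent = maxConcurrent;
    this.active = 0;
    this.explicit = [];    // clicks, typed URLs
    this.speculative = []; // prefetch guesses
  }

  enqueue(task, userRequested) {
    (userRequested ? this.explicit : this.speculative).push(task);
  }

  // Explicit work always drains first; prefetch only fills idle slots,
  // so download-ahead can never starve what the user actually asked for.
  next() {
    if (this.active >= this.maxConcurrent) return null;
    return this.explicit.shift() ?? this.speculative.shift() ?? null;
  }
}
```

A production scheduler would also cancel queued speculative tasks when the user navigates, since the guesses become stale.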
Comment 9•23 years ago
I have my commercial server software set for 40 concurrent connections. Does this mean I would have to reset it to 80 now to accommodate this? I don't think so!
Comment 10•23 years ago
->cache
Assignee: asa → gordon
Status: UNCONFIRMED → NEW
Component: Browser-General → Networking: Cache
Ever confirmed: true
QA Contact: doronr → tever
Comment 11•23 years ago
See also bug 107160, "Mouseover link information".
Comment 12•23 years ago
This is not a cache feature. This feature would need to be implemented by a component that can parse HTML and knows what to do with links. That is outside the networking library.
Assignee: gordon → asa
Component: Networking: Cache → Browser-General
QA Contact: tever → doronr
Comment 13•23 years ago
who's driving this thing?
Assignee: asa → pchen
Component: Browser-General → XP Apps
QA Contact: doronr → sairuh
Assignee
Comment 15•23 years ago
Resolving as WONTFIX. IMO, the browser has no business using bandwidth, processor time, and disk space for data the user did not request. If every client behaved this way, the net result would likely be worse performance for all.
Status: NEW → RESOLVED
Closed: 23 years ago
Resolution: --- → WONTFIX
Reporter
Comment 16•23 years ago
Why is this WONTFIX? I don't request this to be turned on for internet access. I would love to have instant page view on the intranet. To address your points, Peter:

- bandwidth: does this really matter in times of 10 Gb Ethernet?
- processor time: this is irrelevant, because on most desktop systems the CPU is idle while the user is viewing pages
- disk space: I don't think the requested pages should be disk-cached by default, but offline users may think differently (there are tools like wwwoffle, http://www.gedanken.demon.co.uk/wwwoffle/, which cache pages in advance for later offline use)

How the (intra)net behaves under the higher load should be left to the network administrator, who can decide whether to turn precaching/prerendering on or off.
Assignee
Comment 17•23 years ago
Helge: you may have infinite capacity in these areas; most people do not. Most of our target users still have a Pentium II/400 with 128MB RAM or less, a modest disk drive, and a 56K modem or 10Mb Ethernet. As Jay, Cormac and others have already commented, this would increase loads on servers and other infrastructure by an unacceptable factor, and offers only the illusion of better performance overall. This is a research topic, not something appropriate for a production browser. However, if someone wants to attach a patch that actually achieves what you claim, I'm sure we'd all love to see it.
Updated•20 years ago
Product: Core → Mozilla Application Suite