Closed
Bug 1518476
Opened 6 years ago
Closed 5 years ago
Average time spent downloading a page for Googlebot is too high
Categories
(developer.mozilla.org Graveyard :: Performance, enhancement, P2)
Tracking
(Not tracked)
RESOLVED
WONTFIX
People
(Reporter: atopal, Assigned: rjohnson)
Details
(Keywords: in-triage)
Currently the average time spent downloading a page stands at 430ms, with a high of 1200ms and a low of 267ms over the last 3 months. The recommended average is 300ms; values above that can reduce crawling frequency and lead to lower rankings.
We should identify ways to reduce the download time.
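As a starting point, a quick way to spot-check how a given page compares against the ~300ms target is to time full-page downloads from a client. This is only a rough sketch, not part of the Search Console crawl stats referenced above; the URL and sample count below are placeholders.

```python
import time
import requests

URL = "https://developer.mozilla.org/en-US/docs/Web/HTML"  # placeholder page
SAMPLES = 5

timings_ms = []
for _ in range(SAMPLES):
    start = time.monotonic()
    response = requests.get(URL, timeout=30)  # full body is downloaded here
    response.raise_for_status()
    timings_ms.append((time.monotonic() - start) * 1000)

avg = sum(timings_ms) / len(timings_ms)
print(f"average download time: {avg:.0f} ms "
      f"(min {min(timings_ms):.0f} ms, max {max(timings_ms):.0f} ms)")
```

Note that Googlebot's reported figures are averaged across the whole crawl, so a single-page check like this only helps identify unusually slow pages, not the overall trend.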
Comment 1•6 years ago
I would like to work on this. Is there anything I need before I can start working on it?
Updated•6 years ago
Comment 2•5 years ago
MDN Web Docs' bug reporting has now moved to GitHub. From now on, please file content bugs at https://github.com/mdn/sprints/issues/ and platform bugs at https://github.com/mdn/kuma/issues/.
Status: NEW → RESOLVED
Closed: 5 years ago
Resolution: --- → WONTFIX
Updated•5 years ago
Product: developer.mozilla.org → developer.mozilla.org Graveyard