Add crawl-delay directive to robots.txt

Status

RESOLVED FIXED

People

(Reporter: dylan, Assigned: umohm12)

Tracking

Production

Attachments

(1 attachment)

PR (45 bytes, text/x-github-pull-request)
dylan: review+
I think there's a directive that can be added to robots.txt that causes well-behaved search engines to crawl more slowly. Can you look this up and add it?

thanks!
(Reporter) updated a year ago:
Status: NEW → RESOLVED
Resolution: --- → FIXED
(Reporter) updated a year ago:
Summary: See if we can add a directive/rule to robots.txt to get search engines to crawl more slowly → Add crawl-delay directive to robots.txt