We should serve a robots.txt file to control which parts of the dashboard search engines are allowed to crawl. Until the site is more stable, with performance fixes and a permanent URL in place, I'd suggest disallowing everything.
Found this on http://news.e-scribe.com/431, which should be good enough for now:

    (r'^robots\.txt$', lambda r: HttpResponse("User-agent: *\nDisallow: /*", mimetype="text/plain"))
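For reference, a minimal sketch of the same idea on a recent Django (the view name is mine; note that the mimetype argument has since been renamed to content_type, and plain "Disallow: /" is the standard way to block everything, since the original robots.txt spec has no "*" wildcard in paths):

    # urls.py -- minimal sketch, assuming Django 2.0+
    from django.http import HttpResponse
    from django.urls import path

    def robots_txt(request):
        # "User-agent: *" plus "Disallow: /" asks every well-behaved crawler to stay out
        return HttpResponse("User-agent: *\nDisallow: /\n", content_type="text/plain")

    urlpatterns = [
        path("robots.txt", robots_txt),
    ]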
Yup, that matches my experience with robots.txt. Additionally, we may want to use .htaccess to block spiders that ignore robots.txt. I'll work on a magical recipe for that; a rough starting point is sketched below.
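Something along these lines could serve as a starting point (Apache 2.2-style syntax; the user-agent strings below are placeholders, not a vetted blocklist):

    # .htaccess sketch: deny requests from user agents known to ignore robots.txt
    # (replace the placeholder names with the actual offenders)
    SetEnvIfNoCase User-Agent "BadBot" bad_bot
    SetEnvIfNoCase User-Agent "EvilCrawler" bad_bot
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot

On Apache 2.4 the last three lines would be expressed with the newer Require directives instead.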
Status: NEW → RESOLVED
Last Resolved: 8 years ago
Resolution: --- → FIXED