What problem would this feature solve?
======================================
Naive scrapers that ignore robots.txt eventually find views that require login. Each hit creates a new OAuth session, and the OAuth view quickly becomes one of the top requests. Normal users log in once per session, so this view is a good candidate for rate limiting.

Who has this problem?
=====================
Staff contributors to MDN.

How do you know that the users identified above have this problem?
==================================================================
The OAuth view generally takes 0.2% of total request time on the site. During a recent period with an active scraper, it rose to 18% of the total time, a 90x increase.

How are the users identified above solving this problem now?
============================================================
Scrapers degrade performance, can cause downtime, and trigger staff alerts. Each scraper is then identified by IP and blocked.

Do you have any suggestions for solving the problem? Please explain in detail.
==============================================================================
Add a rate limit, such as 3 requests per minute, for initializing an OAuth login.

Is there anything else we should know?
======================================
Login is provided by django-allauth, so adding this rate limit will require customization beyond the usual ratelimit view decorator.
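The proposed limit (3 login initializations per minute, per client) could be sketched as a sliding-window counter keyed by IP. This is a plain-Python illustration only, not django-allauth's actual API; in practice the limit would be wired into the allauth login view (e.g. via a customized view wrapped with django-ratelimit), and the class and method names here are hypothetical.

```python
import time
from collections import defaultdict, deque

class LoginRateLimiter:
    """Sliding-window limiter: allow at most `limit` OAuth login
    initializations per `window` seconds for each client IP."""

    def __init__(self, limit=3, window=60.0):
        self.limit = limit
        self.window = window
        self._hits = defaultdict(deque)  # ip -> timestamps of recent requests

    def allow(self, ip, now=None):
        """Return True if this request is within the limit, else False."""
        now = time.monotonic() if now is None else now
        hits = self._hits[ip]
        # Evict timestamps that have fallen out of the window.
        while hits and now - hits[0] >= self.window:
            hits.popleft()
        if len(hits) >= self.limit:
            return False  # caller should respond 429 Too Many Requests
        hits.append(now)
        return True
```

With `limit=3, window=60`, three attempts from one IP succeed, the fourth within the same minute is rejected, and other IPs are unaffected. In a real deployment the counters would live in a shared cache (as django-ratelimit does) rather than process memory, so all web workers see the same counts.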