Open Bug 1377985 Opened 8 years ago Updated 3 years ago

Put some reasonable limits on urlclassifier.gethashnoise

Categories: Toolkit :: Safe Browsing, enhancement, P3


People: Reporter: francois, Unassigned

Details

We should put some reasonable limits on the number of noise entries that can be sent in fullhash requests, both to prevent performance problems in the client code and to prevent server-side problems: https://groups.google.com/d/topic/google-safe-browsing-api/WnsEQ9mEvrc/discussion

The default is 4. I propose:
- minimum of 0
- maximum of 32
Keywords: good-first-bug
I'm happy to take this one, Francois. Can you give me some more details about this issue?
Hi Toms, Thanks for helping! The entry point for adding a noise entry can be found in nsUrlClassifierDBServiceWorker::DoLookup [1]. We should think about where the best place to check the limit is, and what to do if the value exceeds it. [1] https://searchfox.org/mozilla-central/rev/dc3585e58b427c3a8a7c01712fe54ebe118a3ce2/toolkit/components/url-classifier/nsUrlClassifierDBService.cpp#257
Thanks Dimi. I didn't realize this was a C++ file. I don't know this language, so maybe we should let someone else handle this one.
Keywords: good-first-bug
Severity: normal → S3