Open
Bug 1377985
Opened 8 years ago
Updated 3 years ago
Put some reasonable limits on urlclassifier.gethashnoise
Categories: Toolkit :: Safe Browsing (enhancement, P3)
Status: NEW
People: francois (reporter), unassigned
We should put some reasonable limits on the number of noise entries that can be sent in fullhash requests, both to prevent performance problems in the client code and to prevent server-side problems:
https://groups.google.com/d/topic/google-safe-browsing-api/WnsEQ9mEvrc/discussion
The default is 4; I propose:
- minimum of 0
- maximum of 32
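A minimal sketch of what such a limit could look like, using the default and bounds proposed above. The helper name is hypothetical and not part of the actual patch; `std::clamp` requires C++17:

```cpp
#include <algorithm>
#include <cstdint>

// Proposed bounds for the urlclassifier.gethashnoise pref (default is 4).
constexpr uint32_t kGethashNoiseMin = 0;
constexpr uint32_t kGethashNoiseMax = 32;

// Hypothetical helper: clamp the pref value so a misconfigured profile
// cannot request an unbounded number of noise entries per fullhash request.
uint32_t ClampGethashNoise(uint32_t aPrefValue) {
  return std::clamp(aPrefValue, kGethashNoiseMin, kGethashNoiseMax);
}
```

With these bounds, the default of 4 passes through unchanged, while an absurdly large value is capped at 32 before any noise entries are generated.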
Reporter
Updated 8 years ago
Keywords: good-first-bug
Comment 1 • 6 years ago
I'm happy to take this one, Francois. Can you give me some more details about this issue?
Comment 2 • 6 years ago
Hi Toms,
Thanks for helping!
The entry point for adding a noise entry is in nsUrlClassifierDBServiceWorker::DoLookup[1].
We should think about where the best place is to check the limit, and what to do if the value exceeds it.
[1] https://searchfox.org/mozilla-central/rev/dc3585e58b427c3a8a7c01712fe54ebe118a3ce2/toolkit/components/url-classifier/nsUrlClassifierDBService.cpp#257
Comment 3 • 6 years ago
Thanks Dimi. I didn't realize this was a C++ file. I don't know this language, so maybe we should let someone else handle this one.
Reporter
Updated 6 years ago
Keywords: good-first-bug
Updated 3 years ago
Severity: normal → S3