Closed
Bug 428545
Opened 17 years ago
Closed 12 years ago
add phishing/malware protection to Firefox test plan
Categories
(Toolkit :: Safe Browsing, defect)
RESOLVED
INVALID
People
(Reporter: marria, Unassigned)
I think we should add some testing for phishing/malware protection to the Firefox test plan. Looking here, it doesn't seem like there is a test plan for this yet:
http://wiki.mozilla.org/MozillaQualityAssurance:Home_Page:Firefox_3.0_TestPlan
Specifically, it would be nice to take a set of phishing urls that we know should be protected and test a sample to ensure that there is a warning. Alternatively, this could be automated to test all urls that should be protected.
Comment 1•17 years ago
I'm pretty sure that the test pages of:
http://www.mozilla.com/firefox/its-an-attack.html
and
http://www.mozilla.com/firefox/its-a-trap.html
are already included in our FFTs or even the BFTs.
Marria: can you provide these URLs that you're referring to in comment 0? Are they the ones listed at
http://sb.google.com/safebrowsing/update?client=firefox-testing&version=goog-black-url:1:-1
Tony: can you confirm that we've got some testing in our normal (automated or manual) test suites? Marria (in an email) suggested the following tests:
- Check a sample of urls to ensure that the warning shows
- After 24 hrs on a new profile, all of the test urls should show warnings
^^ these sound like litmus tests, or perhaps chrome tests?
- Test that after visiting a test url and seeing a warning, a return
visit should not send a new gethash request to the server since the
result is cached from before.
^^ dcamp: any way to test this?
- After pointing the download url to a dummy server that always
returns something in the 4xx-5xx range, the client backs off requests
properly.
^^ dcamp: any way to test this?
OS: Linux → All
Reporter
Comment 2•17 years ago
Correct, you can find test urls here:
http://sb.google.com/safebrowsing/update?client=firefox-testing&version=goog-black-url:1:-1
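For pulling a sample by hand, something like the following could fetch that list and print a few URLs to walk through. This is only a quick Python sketch, not anything we ship: it assumes the response comes back one entry per line with blacklisted URLs prefixed by "+"; adjust the parsing if the wire format differs.

    # Hypothetical helper: fetch the testing blacklist and pick a handful
    # of URLs to load manually and confirm the warning page appears.
    import random
    import urllib.request

    LIST_URL = ("http://sb.google.com/safebrowsing/update"
                "?client=firefox-testing&version=goog-black-url:1:-1")

    def sample_test_urls(n=5):
        with urllib.request.urlopen(LIST_URL) as resp:
            body = resp.read().decode("utf-8", errors="replace")
        # Assumption: blacklisted URLs are the lines starting with "+".
        urls = [line[1:] for line in body.splitlines() if line.startswith("+")]
        return random.sample(urls, min(n, len(urls)))

    if __name__ == "__main__":
        for url in sample_test_urls():
            print(url)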
Comment 3•17 years ago
Yes, we have manual FFTs that cover these test pages, along with a handful of other malware protection testcases.
https://litmus.mozilla.org/show_test.cgi?searchType=by_category&product_id=1&branch_id=15&testgroup_id=56&subgroup_id=876
As far as chrome tests, I don't know how they can be automated since we only use test data. Carsten or Dcamp can tell you better.
Reporter
Comment 4•17 years ago
Can you give me some info on what the schedule is to run these tests? I think it's important to make sure that the safebrowsing tests are part of the regular release cycle.
Also, I might have missed it but I only see a test for checking 1 url. Could we add one that is more comprehensive, testing at least a few urls?
Testing that all urls show a warning after 24 hours seems like it would be better automated, since there are a fair number of urls.
Comment 5•17 years ago
(In reply to comment #1)
> - Test that after visiting a test url and seeing a warning, a return
> visit should not send a new gethash request to the server since the
> result is cached from before.
>
> ^^ dcamp: any way to test this?
Not a particularly easy way. You can either just sniff http traffic and see if we're asking, or run with NSPR_LOG_MODULES=UrlClassifierHashCompleter:5 and NSPR_LOG_FILE=gethash.log. You will see a "nsUrlClassifierHashCompleter::OnStartRequest" in the log every time we do a hash completion request.
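If grepping the log is acceptable, a small helper along these lines could count completion requests between the first visit and the revisit (a Python sketch; it only looks for the marker string above in the gethash.log named by NSPR_LOG_FILE, nothing more):

    # Count hash-completion requests recorded in the NSPR log. Run Firefox
    # with NSPR_LOG_MODULES=UrlClassifierHashCompleter:5 and
    # NSPR_LOG_FILE=gethash.log, visit a test URL, note the count, revisit,
    # and confirm the count does not increase.
    MARKER = "nsUrlClassifierHashCompleter::OnStartRequest"

    def gethash_requests(log_path="gethash.log"):
        try:
            with open(log_path, encoding="utf-8", errors="replace") as log:
                return sum(MARKER in line for line in log)
        except FileNotFoundError:
            return 0

    if __name__ == "__main__":
        print("hash completion requests so far:", gethash_requests())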
> - After pointing the download url to a dummy server that always
> returns something in the 4xx-5xx range, the client backs off requests
> properly.
>
> ^^ dcamp: any way to test this?
Another tough one to test. We can set browser.safebrowsing.provider.0.updateURL to a server that returns the 4xx/5xx errors. With NSPR_LOG_MODULES="UrlClassifierStreamUpdater:5", the log will have "Fetching update from"... every time we try to fetch an update (though it might just be easier to check server logs).
Comment 6•17 years ago
For both of these it seems like having a test server that logs the requests and sends a canned response would be the most reliable way to test, since it would verify the actual traffic and doesn't depend on the specific logging format in the safebrowsing client code. Maybe you could model something based on netwerk/test/TestServ.cpp?
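As a rough stand-in for that idea (not the TestServ.cpp approach itself, just a Python sketch, on the assumption that a constant 503 is enough to trigger the client's back-off; the address and port are arbitrary):

    # Minimal dummy update server: always answers 503 and logs when each
    # request arrived, so growing gaps between requests indicate the client
    # is backing off properly.
    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class AlwaysFailHandler(BaseHTTPRequestHandler):
        def _fail(self):
            print(f"{time.strftime('%H:%M:%S')}  {self.command} {self.path}")
            self.send_response(503)       # any 4xx/5xx should do
            self.end_headers()

        do_GET = _fail
        do_POST = _fail

        def log_message(self, *args):     # silence the default request logging
            pass

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), AlwaysFailHandler).serve_forever()

Pointing browser.safebrowsing.provider.0.updateURL at http://127.0.0.1:8080/ (the pref dcamp mentions in comment 5) and watching the printed timestamps would then show whether the retry intervals grow as expected.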
Comment 7•17 years ago
(In reply to comment #4)
> Can you give me some info on what the schedule is to run these tests? I think
> it's important to make sure that the safebrowsing tests are part of the regular
> release cycle.
Hi Marria,
we run these Basic Functional Tests for every Firefox 2.x release and also for the Firefox 3 releases; it's part of the release process :-)
For the Firefox 3 RC1 release testing (scheduled for next week), we plan a Full Functional Test run (which covers more testcases, including SafeBrowsing) on every platform.
We can also add testcases to Litmus (our test suite) at any time, for example a more comprehensive one. A problem with "real" URLs might be that a lot of providers take such phishing/malware sites/domains offline relatively fast.
For the Firefox 3 release I will also run random tests with "real" phishing/scam mails from my email accounts to test the response of the SafeBrowsing system/server.
Reporter
Comment 8•17 years ago
Carsten - thanks for the info. Hopefully the link I gave above helps with giving you a large set of "real" urls to work with.
Comment 9•12 years ago
Doesn't seem needed at this point.
Status: NEW → RESOLVED
Closed: 12 years ago
Resolution: --- → INVALID
Assignee
Updated•11 years ago
Product: Firefox → Toolkit