User-Agent: Opera/9.10 (X11; Linux i686; U; en) Build Identifier: 188.8.131.52 for Linux and 184.108.40.206 for Windows XP

It is possible to bypass Phishing Protection by adding some characters to the URL. The URL is still valid and works properly, but no phishing warning is shown. When we add a "/" character at the end of the domain in the URL field, Phishing Protection treats it as a different site than the original, and the check fails.

Example, when my URL is on the phishing list:
http://kaneda.bohater.net/phish.html - warning will be displayed
http://kaneda.bohater.net//phish.html - warning will NOT be displayed

Of course we can add more "/" characters. As a live example shows [Firefox HexEncoding Anti-Phishing bypass URL: http://sla.ckers.org/forum/read.php?13,2253 ], phishers can use this technique in the near future for abusive actions.

Reproducible: Always

Steps to Reproduce:
1. Open a URL from the phishing blacklist with extra /// added to the address.

Actual Results:
Firefox doesn't display the anti-phishing warning.

Expected Results:
It should display the anti-phishing warning.
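The failure mode the report describes boils down to an exact-string lookup. A minimal sketch, assuming a byte-for-byte blacklist comparison, of why slash-padded URLs slip through (the blacklist entry and helper name are illustrative, not Firefox's actual code):

```javascript
// Hypothetical exact-match blacklist, as a sketch of the reported behavior.
const blacklist = new Set(["http://kaneda.bohater.net/phish.html"]);

function isPhishingExactMatch(url) {
  // Exact string comparison: any byte-level difference defeats the lookup,
  // even a redundant "/" that the web server would happily ignore.
  return blacklist.has(url);
}

console.log(isPhishingExactMatch("http://kaneda.bohater.net/phish.html"));  // true
console.log(isPhishingExactMatch("http://kaneda.bohater.net//phish.html")); // false
```

The doubled slash changes the string without (usually) changing the page the server returns, which is exactly the mismatch the reporter is exploiting.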
I'm not sure this is really a problem. How many people add extra text to the URL they follow from a phishing email or such? Sure, phishers can add extra text to get past the filter, but they could just as easily change the URL completely.
As the URL http://sla.ckers.org/forum/read.php?13,2253 shows, phishers are already using this kind of technique. When they craft mail with a phishing address, they can add a few "/" characters and be sure that Firefox users [and Opera users, by the way, but not IE] will not be protected by the anti-phishing check. They can add a "/" character every day and keep using a single URL, which means this protection is buggy, IMHO.
I can't actually see anything there that mentions that, though I admit I haven't taken the time to read it in full. However, what it does point out is that the phishing list uses regexes, so this is really something Google can counteract in the list they provide.
The IP normalization was fixed in bug 356355 and live in Firefox 220.127.116.11. The other cases (like adding a slash or query params) are handled by the enchash table which contains regular expressions. Using the goog-black-url list as an example doesn't reflect all the checks that take place. More information is on the wiki: http://wiki.mozilla.org/Phishing_Protection:_Server_Spec#Encrypted_Hash_Format
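To illustrate the difference between the two tables: the enchash lookup described in the wiki keys on the host name and then applies regular expressions to the full URL. A simplified, hypothetical sketch of that two-step lookup (the real goog-black-enchash table stores encrypted hashes; every name and pattern here is made up):

```javascript
// Sketch of a host-keyed regex table in the style of goog-black-enchash:
// first key on the domain, then run that domain's regexes on the full URL.
const enchash = new Map([
  ["phish.example.org", [/\/login\/+index\.html/]],
]);

function checkEnchash(url) {
  const host = new URL(url).hostname;       // key the lookup on the host
  const patterns = enchash.get(host) || []; // fetch that host's regexes
  return patterns.some((re) => re.test(url));
}

console.log(checkEnchash("http://phish.example.org/login//index.html")); // true
console.log(checkEnchash("http://clean.example.org/login/index.html"));  // false
```

Because the per-host patterns are regular expressions, a `/+` in the pattern absorbs any number of repeated slashes, which is why the checks that use this table are not fooled the way a plain string compare is.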
Well, maybe it could be resolved using the Google service, but what if you don't want to use it? The problem persists. For example, try these:
Normal URL: http://www.mozilla.com/en-US/firefox/its-a-trap.html
Modified URL: http://www.mozilla.com/en-US//firefox//its-a-trap.html
A "real" example:
http://18.104.22.168/www.paypal.com//cgi-bin/webscr_cmd=_login-run2652/
Notice the double // in the URL: apparently no phishing alert. But:
http://22.214.171.124/www.paypal.com//cgi-bin/webscr_cmd=_login-run2652/
The same URL without the intended double // in it gives a phishing alert. You can check the same behavior with any phishing site:
http://sb.google.com/safebrowsing/update?version=goog-black-url:1:7753
IMO the bug must be reopened.
Sorry, the correct phishing URL without the double / is: http://126.96.36.199/www.paypal.com/cgi-bin/webscr_cmd=_login-run2652/ As you can see, the phishing alert pops up.
Reopening. Just tested with Firefox 188.8.131.52 with several real examples from my spam box:
http://184.108.40.206/.bankofamerica.com/sas/profile/step1.htm triggers an alert.
http://220.127.116.11/.bankofamerica.com//sas/profile/step1.htm does not trigger an alert.
Comment #4 says that this is solved when the Google service is activated. It is not: with Google's automatic URL checking activated, the double slash *does* bypass the anti-phishing protection. Furthermore, even if it were solved with better server-side sanitization of the URL, that wouldn't solve the problem for most of our users, since the Google URL check is not the default behaviour. The URL should be sanitized, with successive slashes collapsed, *before* comparing it with the local list or Google's online service.
(In reply to comment #8)
> Comment #4 says that this is solved while having the google service activated,
> it is not, with google automatics url checking activated, the double-slash
> *does* bypass google antiphishing protection.

Sorry, let me try to explain better. URLs are checked against two different lists. The first list (goog-black-url, which you're getting the example from) is basically a simple string compare. Adding slashes will get around that list. The second list contains regular expressions keyed on site domain names. This is the goog-black-enchash table that you're not including; it looks like this:
http://sb.google.com/safebrowsing/update?version=goog-black-enchash:1:17971
A detailed explanation is provided in the wiki link above, but suffice it to say, since it uses regular expressions, it could be used to catch double slashes.

> The URL should be sanitize with the removal of successive slashes *before*
> comparing it with the local list or google's online service.

I believe that how double slashes are handled is a feature of the web server. I don't think it's always true that foo.com/bar/betz.html and foo.com/bar//betz.html will resolve to the same page. Because of that, I think it's dangerous to do such canonicalizations.
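The point that a regex-based entry could absorb extra slashes can be sketched directly: use `/+` wherever a path separator appears. Host and path below are hypothetical, not taken from any real list:

```javascript
// Sketch of a slash-tolerant blacklist pattern: "/+" matches one or more
// slashes at each separator, so padded variants of the URL still match.
const pattern = /^http:\/\/foo\.example\.com\/+bar\/+betz\.html$/;

console.log(pattern.test("http://foo.example.com/bar/betz.html"));    // true
console.log(pattern.test("http://foo.example.com//bar///betz.html")); // true
console.log(pattern.test("http://other.example.com/bar/betz.html"));  // false
```

This catches the bypass without the client canonicalizing anything, which sidesteps the concern that collapsing slashes might change which page a server actually serves.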
>I believe that how double slashes are handled is a feature of the web server.
>I don't think it's always true that foo.com/bar/betz.html and
>foo.com/bar//betz.html will always resolve to the same page. Because of that,
>I think it's dangerous to do such canonicalizations.

Indeed, when the Apache mod_rewrite module is used, extra slashes can map to different pages, but that does not invalidate my point: we are not directing users to a different page, only checking a sanitized URI against the locally/remotely stored list of phishing sites, to provide the best information available to our users.

The reporter is demonstrating a documented way to easily bypass Firefox's anti-phishing; you are saying that this method indeed works, but that it is not a bug since it could *potentially* be fixed server-side. Either it is not a bug, and then no improvement is required in the remote regexes at all, or it is a bug, and then it should not be resolved as INVALID. At worst, the bug should be resolved as WONTFIX, as in "this is a known limitation of the anti-phishing system and we do not intend to protect users from phishing sites using this kind of URI-masking technique".

Basically, my point is that if the user visits a page that we *know* is a phishing page, the phishing alarm should ring, whatever the syntax of the URI he clicked on to get there. If we know a page is a phishing site and we don't alert the user because it has an unusual request-URI, IMO we are doing the wrong thing, and there is room for enhancement in this area.
Tony said,
> A detailed explanation is provided in the wiki link above, but suffice it to say, since it uses regular expressions, it could be used to catch double slashes.

Could it be used, or will it be used, to catch them? I think this problem is more serious than we realize: phishing attackers use everything at their disposal to bypass phishing filters, and URL obfuscation and double slashes are two examples. Firefox is the only browser that fails at this; Opera's latest build has corrected the issue, and IE is immune. So, what are we going to do?
(In reply to comment #11) > It could be used or it will be used to catch them? > > I think this problem is more serious than we think, phishing attackers use all > in their hands to bypass phishing filters, url obfuscation or the double > slashes are two examples. I think the plan going forward is to move everything from goog-black-url to goog-black-enchash and have regular expressions for everything. This will catch more small variations in URLs. This can be done on the server and doesn't require changes to the client code.
(In reply to comment #12)
> I think the plan going forward is to move everything from goog-black-url to
> goog-black-enchash and have regular expressions for everything. This will
> catch more small variations in URLs. This can be done on the server and
> doesn't require changes to the client code.

And will this solve the problem for users who don't use, or don't want to use, the anti-phishing feature via the Google service?
(In reply to comment #13) > > And this will solve the problem to users who don't use/don't want to use the > antiphising feature using google service? Yes, this will work for anyone with phishing protection enabled. The "check using a downloaded list of suspected sites" option uses the goog-black-url and goog-black-enchash tables.
This bug is definitely not fixed! It is still possible to add additional slashes to circumvent the phishing filter. Tested with Mozilla/5.0 (X11; U; Linux i686; en-US; rv:18.104.22.168pre) Gecko/20070221 BonEcho/22.214.171.124pre ID:2007022104. I opened a URL from the blacklist (the one mentioned in comment 6) and got the phishing alert message:
http://www.deutsche-bank.de.pbcank_id06997591.allroe.biz/kunden/
After adding a slash, no warning is shown any more:
http://www.deutsche-bank.de.pbcank_id06997591.allroe.biz//kunden/
This should be fixed immediately. And please don't close this bug again until it's really fixed.
Mike and Benjamin, who is responsible for fixing this issue? There has been no response from Google for a long time, and the issue is still present!
Hi, I'm using iceweasel 126.96.36.199-1 in Debian unstable. I've checked it with a phishing URI, http://carovoip.oriontel.com.ar/a2billing/Public/wells/mn1_da2_on/cgi-bin/session.php , and it warns. Even after I've added a slash to that URI, it still warns me: http://carovoip.oriontel.com.ar//a2billing/Public/wells/mn1_da2_on/cgi-bin/session.php But when I visited that URI with the domain name changed to the IP address, http://188.8.131.52/a2billing/Public/wells/mn1_da2_on/cgi-bin/session.php , there was no warning any more.
It now seems to depend on something else. For example:
doesn't warn: http://alumnisec.com.ve//circulares/new.egg.com/security/customer/
warns: http://alumnisec.com.ve/circulares/new.egg.com/security/customer/
warns: http://184.108.40.206:81/cgi-bin-sk/webscr.php?dispatch=5885d80a13c0db1fa28a1d10438a80ebc99745074d
warns: http://220.127.116.11:81//cgi-bin-sk/webscr.php?dispatch=5885d80a13c0db1fa28a1d10438a80ebc99745074d
Lookups in goog-black-enchash (the regular expression list) now handle double slashes. I'll be making a patch to have lookups in goog-black-url also handle double slashes.
Created attachment 259472 [details] [diff] [review] check url and url with multiple slashes collapsed This patch also removes the extra lookup that was needed for looking up non-normalized IP addresses (see bug 356355). It's no longer needed because the server normalizes all numeric addresses.
This still seems to be a problem in nightlies (for instance, try http://www.mozilla.com/firefox//its-a-trap.html), should this make Firefox 3?
IMO this should be critical. Users don't get warned when they open such a modified URL and can lose identity data. It should be fixed ASAP; this bug has been open too long. There is no telling whether phishing sites already use this trick to circumvent the built-in filter.
The majority of our users who can be affected by this bug are using Firefox 2. Asking approval for 18.104.22.168 to get this hole closed ASAP.
Comment on attachment 259472 [details] [diff] [review] check url and url with multiple slashes collapsed This patch is obsolete. We no longer use multiple table types (url, enchash, domain); it's all in a single hashed table now. See bug 387196.
Comment on attachment 259472 [details] [diff] [review] check url and url with multiple slashes collapsed But it wouldn't be obsolete for fixing this issue on the 1.8 branch
Tony: should we land this patch on the 1.8 branch?
Comment on attachment 259472 [details] [diff] [review] check url and url with multiple slashes collapsed I think it's pretty low risk, so sending to dave for review.
Comment on attachment 259472 [details] [diff] [review] check url and url with multiple slashes collapsed looks good to me
Comment on attachment 259472 [details] [diff] [review] check url and url with multiple slashes collapsed approved for 22.214.171.124, a=dveditz for release-drivers
Checking in content/enchash-decrypter.js; /cvsroot/mozilla/toolkit/components/url-classifier/content/enchash-decrypter.js,v <-- enchash-decrypter.js new revision: 126.96.36.199; previous revision: 188.8.131.52 done Checking in content/trtable.js; /cvsroot/mozilla/toolkit/components/url-classifier/content/trtable.js,v <-- trtable.js new revision: 184.108.40.206; previous revision: 220.127.116.11 done
I see no difference of behavior between nightly branch builds of 18.104.22.168 and released 22.214.171.124 with http://www.mozilla.com/firefox//its-a-trap.html.
The test URLs are hardcoded and go through a different code path. They were originally put in place to test the UI, not the url classifier. Try picking one of the URLs from http://sb.google.com/safebrowsing/update?version=goog-black-url:1:-1
Thanks Tony. Phishing protection works with Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:126.96.36.199pre) Gecko/20080122 BonEcho/188.8.131.52pre. But why don't I get the warning with the latest Firefox 3 nightly builds at all? I assume the same list of websites is used for the different browser versions? Firefox 3 never warns the user when opening a website listed on the blacklist you mentioned. => REOPEN
Firefox 3 uses a completely different protocol for sending the phishing/malware data. The list I pointed out above isn't used in Firefox 3. If there are bugs in FF3, they should be in a separate bug.
Tony, where can I find the list of phishing sites which should be used for Fx3? If I can reproduce it that Fx3 doesn't show a warning, I'll file a new bug.
When I go to: http://inferplus.com/cache/www3.netbank.commbank.com.au/www3.netbank.commbank.com.au/www3.netbank.commbank.com.au/www3.netbank.commbank.com.au/www3.netbank.commbank.com.au/www3.netbank.commbank.com.au/www3.netbank.commbank.com.au/update/logon.htm in Firefox 184.108.40.206, I get told it is a forgery. If I go to it in the 220.127.116.11 nightly from last night, I do not. I just did this side by side with clean profiles.
Ok, the profile must still have been downloading the list. If I use http://classifiedsphilippines.com/userimgs/dert.htm, I get warnings in both 18.104.22.168 and the 22.214.171.124 nightly. I can't make the bug happen by adding '/' in 126.96.36.199, though.
As mentioned before this was a Firefox 2 issue only. Adjusting flags to reflect reality.