Need "Hall of Shame" feature to encourage uncooperative sites

Reported: 16 years ago
Modified: 14 years ago

(Reporter: Keith Briscoe, Assigned: Ben Goodger (use ben at mozilla dot org for email))

Firefox Tracking Flags: (Not tracked)

Comment 1

16 years ago
This is a bit weird, but here goes:

Let's say we have some sites out there that sniff for "Mozilla" and declare that it doesn't support X or Y (think banking sites).  We could create a pref that says:

[ ] Allow Mozilla to work around bugs at Bank of America's web site

If checked, this pref causes Mozilla to send a spoofed user-agent string, but ONLY to the site in question.  "Bank of America" is just an example, by the way.

This could be used for more than just spoofing.  It could turn on/off nonstandard DOM elements, etc, but only for the site in question.  Anyway, this would accomplish two things: 1) Allow Moz to work on buggy sites WITHOUT a global user-agent switch or weird DOMness, not damaging Mozilla's marketshare or standards, and 2) It serves as a "Hall of Shame" for sites that are just plain broken (MSNBC anyone?), and makes it VERY clear that it's the site's fault, not Mozilla's.  Obviously this should only be used on sites that are particularly unwilling to correct their pages.
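The per-site spoofing idea above can be sketched roughly as a host-keyed lookup. Everything here (the names `SPOOF_SITES`, `user_agent_for`, the UA strings) is invented for illustration and is not actual Mozilla code:

```python
# Hypothetical sketch: send a spoofed UA only to hosts the user has
# explicitly opted in to work around; use the real UA everywhere else.
from urllib.parse import urlparse

DEFAULT_UA = "Mozilla/5.0 (X11; Linux i686; rv:1.0)"

# Opted-in hosts, mapped to the UA string to send there.
SPOOF_SITES = {
    "www.bankofamerica.com": "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 5.0)",
}

def user_agent_for(url: str) -> str:
    """Return the spoofed UA for opted-in hosts, the real UA otherwise."""
    host = urlparse(url).hostname
    return SPOOF_SITES.get(host, DEFAULT_UA)
```

The point of keying on the host is exactly the "only to the site in question" property: statistics everywhere else are unaffected.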
confirming rfe....
Ever confirmed: true

Comment 2

16 years ago
I am not sure about that.  What if Mozilla were to just give an error when it
encountered something like the <layer> tag?

Right now when Mozilla users try to view a website utilizing <layer> tags, they
generally just see that the page does not render properly.  Most users,
including many web developers and programmers, will automatically blame Mozilla
for the problem.  However, if Mozilla were to just pop up a message saying "This
website uses outdated, non-standards compliant, <layer> tags which may cause
problems when viewing this page." then more people would be inclined to realize
that the web page needs updating.  They may still blame Mozilla for not being
more backwards compatible, but at least they will know that the reason is that
the website uses non-standard code.

Perhaps this error message idea should be filed as its own separate bug, but
I think that it would also be beneficial in conjunction with this one.  Rather
than having to search out a lot of sites and create a lot of site specific prefs
which would hopefully become obsolete after a while, the error message could
alert the user to the non-standard issues and then give them the choice of
trying to work around the non-standard code.  If they choose to try then Mozilla
can enable nonstandard support and/or spoof the user-agent string; this
approach may not enable Mozilla to work with all non-standard sites, but it
would work with some and regardless of whether or not it worked, the user would
be more aware of the situation.

Comment 3

16 years ago
Okay, I've thought about this for a while now.  I DON'T think a generic error message for nonstandard elements is the way I'd want to go, because 1) Mozilla could never actually know the difference between a broken DOM and a DOM element introduced just recently without a similar "Hall of Shame" of bad elements (document.all, document.layers) and 2) document.layers is used quite a bit just to sniff for NS 4.x, and we shouldn't pop up an error message when a web site just tries to measure browser capabilities (we'll get a lot of false positives).

As far as the "bad site list" though, it's got some bonuses that aren't covered elsewhere.  For example, just by looking through the prefs you see a Top 10 List of buggy web sites--without ever having to visit them!  Think (in particular) of how this might play in reviews of the product!  Also, my primary intent in filing this bug was to offer a "middle ground" in the user-agent spoofing discussion.  Right now, we can globally spoof user-agent strings, but not through an easy-to-use GUI.  We don't do this because it skews web statistics if too many people do it.  However, if we have per-site spoofing, the statistics are the same everywhere except the spoofed site, which is only spoofed because the site breaks if it isn't spoofed.

The list falling out of sync is a big concern, however.  Mostly, how do we easily reward sites who fix their bugs when they're called "buggy" in our programs long after they've cleaned up their act?  My suggestion is to keep the list in a totally separate prefs file.  Hitting a button on the browser automatically downloads the most current form of the list.

The list contains:
A time/datestamp
An entry for every site, which contains the URL(s), the sort of workaround needed (spoofing, nonstandard DOM), and additional data (such as the actual user-agent string needing to be sent)
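One guess at how such a list file might be serialized and parsed; the pipe-delimited layout and the names `SiteEntry`/`parse_site_list` are purely assumptions, since no format was ever specified here:

```python
# Illustrative parser for the list sketched above: a timestamp line,
# then one "url|workaround|data" entry per site.  The field layout is
# an invented example, not a real Mozilla file format.
from dataclasses import dataclass

@dataclass
class SiteEntry:
    url: str
    workaround: str   # e.g. "spoof" or "dom"
    data: str         # e.g. the UA string to send

def parse_site_list(text: str):
    lines = [ln for ln in text.splitlines() if ln.strip()]
    timestamp = lines[0]
    entries = []
    for ln in lines[1:]:
        url, workaround, data = ln.split("|", 2)
        entries.append(SiteEntry(url, workaround, data))
    return timestamp, entries
```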

Comment 4

16 years ago
<OT> Why don't your comments wrap?  </OT>

1) Managing a "Hall of Shame" for non-standard elements would be fairly easy to
do.  It is a finite list and it is unlikely to change very often; old
non-standard elements are not going to someday become part of a new standard; if
someone introduces new non-standard elements and the list is not updated right
away, there  is virtually no negative fallout. It is also extremely unlikely
that the W3C would adopt a new standard element that looks just like an old
non-standard element, so I do not see where Mozilla would get confused.

2) I realize that document.layers and document.all are frequently used for
browser sniffing, but it should be easy to distinguish between their usage
inside an if condition statement and when they are not.  The error message would
not be triggered if Mozilla only encountered them inside a condition.
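The condition-versus-usage distinction could be approximated with a crude heuristic like the following; a real implementation would hook the JS parser rather than use regexes, so treat this as an illustrative sketch only:

```python
# Rough heuristic for the idea above: warn about document.layers or
# document.all only when they appear somewhere other than inside an
# if (...) condition (i.e. actual usage, not capability sniffing).
import re

SNIFF_PROPS = ("document.layers", "document.all")

def flags_warning(js_source: str) -> bool:
    # Collect the text of every if (...) condition (naive, no nesting).
    conditions = " ".join(re.findall(r"if\s*\(([^)]*)\)", js_source))
    for prop in SNIFF_PROPS:
        uses = js_source.count(prop)
        sniffs = conditions.count(prop)
        if uses > sniffs:   # used outside a condition somewhere
            return True
    return False
```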
As for the "bonuses" that you mention:

Why do you want to advertise the buggy sites to people who do not go to them? 
The owners of those sites definitely would not be appreciative and the users who
do not go there really will not care.  

I also tend to doubt that this "Hall of Shame" for websites would benefit
Mozilla in regards to how it gets reviewed by the media.  A reviewer may decide
to present the list as the top ten sites that Mozilla fails to work with
(reviewers do not necessarily understand or agree with Mozilla's position on
standards).  If there are many major sites on the list then the reviewer may
recommend that people not use Mozilla based browsers until Mozilla becomes more
compatible with the Internet.

Regarding maintenance of the "Hall of Shame" for websites:

Sometimes Mozilla browsers are used on intranets and kiosks where there is no
Internet access or Internet access is limited to specific websites or where the
user does not have the necessary access privileges to modify browser prefs.  In
these cases the "Hall of Shame" could not be updated as you described.  

Also websites can be redesigned on a very frequent basis; thus the top ten list
could require daily maintenance.  Who is going to volunteer to try to keep up
with all the changes?  If the root list falls behind for any length of time,
Mozilla could face some major negative press and potentially a lawsuit from
sites on the list.

Many users will also have issues with Mozilla downloading the "Hall of Shame"
whenever they hit a button on the browser.  Even if it only downloads the list
upon initialization, people will have problems with it.  

If it ever were to get implemented, the feature would probably default to being
off because of these various issues.  It is also not a feature that most users
would be likely to seek out and turn on.  Perhaps the "Hall of Shame" would be
more successful as an independent add-on project?

Comment 5

16 years ago
<ot>Blame Lynx for the wrapping problem.  That's what I do ;-)</ot>

That's a good point about non-standard DOM elements being unlikely to be recycled as new standards, and that context can separate sniffing from usage.  I guess that's another bug, though.  What remains is just the user-agent part then.

I don't see the maintenance of this list as a difficult project.  Bugzilla can produce a pretty decent list by searching for unfixed Tech Evangelism bugs.  If we have a keyword for user-agent-based denials (is that [DENY]?), we're in the home stretch.  Anyway, I think it is worth mentioning that although this is a list of sites that reject Mozilla, it is only a list of sites Mozilla doesn't work with if we don't implement this pref!  The phrasing of the pref would be something like "[ ] Work around bugs at site X?" Without the spoofed user-agent string, Mozilla would NOT work around the bug and would fail.  While a reviewer might think it to be an odd pref (and it is), I doubt anyone would object to a pref that allows sites to work that otherwise wouldn't!  Also, if we do limit this to only user-agent spoofing, we don't need to worry about site redesigns.

Admittedly this bug would be totally unnecessary if we could actually get people to update their code.  This would be a temporary feature that would allow users to access sites that they can't right now, and put a little teeth into tech evangelism so that this pref can eventually make itself obsolete.

The key concept of this is that Mozilla currently DOESN'T work on many sites with the only reason being its user-agent string.  This is an awfully stupid reason not to work (not our fault, but still stupid).  Add-on project or otherwise, I think we need it.  Another add-on could be to employ some people to write dynamic translations of nonstandard pages, but that's a much harder job.  Both would cause the site in question some PR problems.

Comment 6

16 years ago
My prior comments look like ****, sorry.  I'll refrain from using Lynx for this
stuff in the future.  Bleah.

Anyway, here's a modification:

The pref is the client-side portion of a client-server app.  By default, Mozilla
does NOT try to work around site bugs.  However, if the user checks the pref to
work around bugs at site X, attempting to go to site X does the following:

- Retrieves a rule from
- Implements that "rule" when accessing the page

A rule can be "set useragent='MSIE spoof string for Mozilla'".

It can also be "redirect site through dynamic translator at"

Or finally "remove site from buggy site list"

That way, the list of buggy sites can be stored on the browser and updated from
the client, but the actual behavior used to work around the buggy site is stored
in a single always-up-to-date location on a central server.
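A minimal sketch of that client-side rule dispatch, with the rule grammar and the helper name `apply_rule` invented for illustration (the comment above leaves the actual rule syntax and server unspecified):

```python
# Hypothetical dispatcher for the three rule kinds described above:
# spoof the UA, route through a translator, or drop the site from the
# local buggy-site list.  Rule strings are assumed, not a real format.
def apply_rule(rule: str, request: dict, buggy_sites: set) -> dict:
    if rule.startswith("set useragent="):
        request["user_agent"] = rule.split("=", 1)[1].strip("'")
    elif rule.startswith("redirect site through "):
        request["via"] = rule.split("redirect site through ", 1)[1]
    elif rule == "remove site from buggy site list":
        buggy_sites.discard(request["host"])
    return request
```

The design point the comment is making survives even in this toy form: the client only stores *which* sites need help, while the *how* lives on the server and can change without a client update.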

There is a privacy hit from doing things this way, to be sure.  The server could
potentially see who is looking at what page (the buggy ones anyway).  This
would, however, be immediately fixed the second the page was fixed and it was
removed from the database.

For kiosk machines, we could simply put in a pref for "automatically update
buggy site list every week", and a checkbox for "work around all known buggy web
sites".

I agree that this pref will be rarely used and will also put itself out of
existence as sites become standards-compliant.  I guess the point is that a user
trying to get some banking done can call AOL and say "Hey!  My bank says my
browser doesn't support 128-bit encryption" and the AOL guy can say "Oh, you use
bank X?  Their site is buggy.  Just go into your prefs and check this box and it
will now work fine." and the user goes away knowing his bank's website, not his
browser, is buggy.

As a tech evangelism tool, Moz can also say to bad sites "We notice that your
web site is still using JavaScript which incorrectly identifies our browser's
capabilities.  We first notified you about this two years ago, and then we
notified you again last year.  We plan to add your web site to a list of buggy
web sites that will be made available to our five million subscribers sometime
within the next two weeks.  If you have plans to immediately change your site to
comply with web standards, please feel free to contact us at 555-1212 and we
will delay this decision."

Comment 7

16 years ago
To keep the prefs simple, you could make it just say something like 
 [ ] Attempt to work around sites that utilize improper browser sniffing
 [ ] Attempt to work around sites that utilize outdated DOM elements

Then you could still have your server side "Hall of Shame" list, but it would
not necessarily have to be visible to the user.  This way website developers are
less likely to make a fuss about being on the list.  However, if we implement a
popup warning for when the user actually visits the site, then we are still able
to point the finger of blame at the web developer.

Also if someone was to implement the auto-sniffing of offending sites as
described in previous comments then these same prefs could turn on both the
"Hall of Shame" list and the auto-sniffing.  

I am not exactly sure when/how best to update the client side list; everything I
think of has significant negatives.  If it is a small simple file (as I expect
it would be) then the download time for the whole thing would be fairly minimal.
You could download the list when the pref is checked and at browser
initialization if the pref is checked.  If you download the whole list each time
then there is no way for the server to know which of the sites, if any, you are
actually visiting.

When the user actually visits one of the offending sites, I still like the idea
of a popup warning stating that Mozilla is aware that the site uses buggy
browser sniffing (or that it uses outdated proprietary tags) and is going to
attempt to make adjustments for it.  That way the user knows there is a
potential problem with the site and will be less likely to blame Mozilla.  I
think website developers would be more accepting of this than they would be
towards a public Hall of Shame list.


Comment 8

16 years ago
Well, okay, your way certainly IS less antagonistic ;-)

If the database is kept, and popups are given only when the site is visited,
then the database doesn't have to be user-visible.

I'd say the pref would be like this:

[ ] Attempt to work around bugs in sites that utilize improper or outdated features

NOTE: Mozilla will contact a Mozilla server whenever it contacts a site
using improper or outdated code in order to receive information on the correct
way to work around the bug.

Mozilla's current list of buggy sites is dated 5/3/2002.

[ ] Update list of sites with improper/outdated code every week from a Mozilla server
  | Update list now |

That way, we get no formal "Hall of Shame" listing, but the AOL tech support
still has a pref they can lead people to, and the database is still kept, albeit
a little less publicly.  We also get popups that these sites will want to avoid
their users seeing, so it's less coercive, but still strong encouragement.  The
button for updating the list is in the pref itself.  There is no auto-update
unless they choose the weekly update checkbox.  And as far as I see it, there's
no reason to separate DOM bugs from sniffing as long as we do it this way, so I
combined the pref checkbox.

Comment 9

16 years ago
(a) Why would you ever want this pref turned off?
(b) Who is going to actually implement the code to work around these bugs?
(c) Why not just put that code in quirks mode?

(regarding the wrapping problem: that's not a bug in Lynx, it's a bug in the
HTML spec and in Bugzilla. Lynx is correct.)

Comment 10

16 years ago
Just had another thought; although this could potentially raise privacy issues,
it might be worth considering.

If a user visits a site that gets sniffed out as being non-standard, then its URL
could be logged, and when a "Hall of Shame" database update is performed,
the log could be uploaded to the server.  The log data could then be
periodically cross-referenced with other users' log data to determine
frequently visited offending sites, which could be potential contenders
for your "Hall of Shame" database.

Hmmm, if the feature were popular, this might require a bit of server space, CPU
time, and other server resources.

Oh well, I am rather sleep deprived and I may not be very realistic at the
moment.  It was just an idea..

Comment 11

16 years ago
Hey Hixie, I did not see your comment until after I committed my last one.

>(a) Why would you ever want this pref turned off?

I think the idea was that some people may be opposed to the periodic downloading
of the "Hall of Shame" database.  Depending on the implementation, it may not be
necessary to have the prefs UI.  I suppose there would need to be a means of
disabling this in prefs.js or somewhere like that though.

>(b) Who is going to actually implement the code to work around these bugs?

Don't know yet, at this point I think we were just discussing how it
theoretically should be done.  We will get around to who is going to do the real
work later. :) 

Part of what Keith has been talking about are those sites that would work if
Mozilla's UA string were spoofed as something else; the workaround there is
fairly simple: we pop up the warning and spoof the UA for that URL (specific
URL, domain, subdomain, whatever) and continue to use the correct Mozilla UA string.

As for working around outdated stuff like <layer>, someone would probably
have to actually implement some support for <layer>.  I am not the man for that
job and I do not presume to know how much work it would entail, but I seem to
recall someone (who exactly escapes me at the moment, but for some reason I am
pretty confident that he knew what he was talking about) saying somewhere in a
bug report or in a forum that it would not be particularly hard to implement
support for layers because they could be translated somehow into <div> tags or
something like that fairly easily, but he just did not think it should be done.
Maybe he would be more supportive of it if we popped up the warning message
before doing it.

>(c) Why not just put that code in quirks mode?

I am probably in favor of putting it in the quirks mode; I reserve the right to
reconsider it though when my thought processes are more functional.  It seems
like there is some reason why it should be separate from quirks mode, but at the
moment I can not think of what it is; so maybe there is no reason not to.

Comment 12

16 years ago
Inventing UI for an undefined feature is not the way to do things. IMHO, this
bug should be closed INVALID until we actually have something to have a pref for.

Comment 13

16 years ago
The assignment of this to the Preferences component is an incorrect bug
designation; it does not mean that the bug is INVALID.

If you read the comments here by Keith and me, then I think it should be obvious
that we are defining a feature and not just talking about adding a pref for an
undefined feature.  Of course, as part of how the feature will manifest itself,
we have discussed what pref(s) might be desirable, but the majority of the
discussion has been about how the feature should work.

I am not really sure which component this bug belongs to.  Possibly XP Apps,
since that is where bug 80658 and bug 46029 are?  Perhaps "DOM Other" for the
implementation of support for non-standard DOM elements?  Maybe it should be
broken up into more than one bug with dependencies?

Keith, you should probably take the word pref out of the summary too.

Comment 14

16 years ago
Changed "pref" to "feature".  Left component unchanged 'cause I don't know any
better.  Hixie--quirks mode may work for the DOM stuff, agreed.  However, the
DOM stuff was always a secondary concern for me--we definitely don't want
Mozilla to send an MSIE user-agent every time it encounters a quirks-mode site.

This is primarily a user-agent sniffing workaround bug.  The DOM stuff sneaked
in there and may be welcome to stay here if it seems to fit.
Summary: Need "Hall of Shame" pref to encourage uncooperative sites → Need "Hall of Shame" feature to encourage uncooperative sites

Comment 15

16 years ago
Okay, this is my last spam for a while.  We've been talking about databases,
client/server synchronization, dynamic translation, etc.  How about a super-easy
thing that could be coded in a day or two by someone who really knew their way
around the code?

Here's the pref:
User-Agent Spoofing

Spoof Mozilla's user-agent string for the following sites:

|*           ^|  (Properties)
|* v|  (Add new site)

In the listbox, you've got the sites.  If you click the "Add new site" button,
you add a new site & user-agent string.  Clicking "Properties" on the site
allows you to change the user-agent string for that site.

We'd need to do something to prevent people from doing *.* or *.com, but that's
pretty easy I'd think.
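Rejecting over-broad patterns could be as simple as requiring a couple of concrete (wildcard-free) labels in the pattern. This is one hypothetical heuristic, not a vetted rule; it would still admit things like `*.co.uk`:

```python
# Crude guard against over-broad spoofing patterns like "*.*" or
# "*.com": require at least two labels with no wildcard in them.
def is_too_broad(pattern: str) -> bool:
    labels = [l for l in pattern.split(".") if l]
    concrete = [l for l in labels if "*" not in l]
    # "*.*" has zero concrete labels; "*.com" has only the TLD.
    return len(concrete) < 2
```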

Now people have a pref to go to when they can't access a site.  People on tech
support (AOL, CompuServe, RedHat, etc) simply tell them to type the site into
this pref and type "MSIE" in user-agent.

Thus we would have a UI to change the user-agent string, but this wouldn't alter
site statistics for most web sites.  The best of both worlds.  And the code for
this has gotta be pretty easy (I say in my cocky English major voice).

Comment 16

16 years ago
Let's see if I understand what you are suggesting, then.

Someone would host a file somewhere that contains a list of URI patterns and a
list of User-Agent strings.

Mozilla would download this file on a periodic basis.

Mozilla, before going to any page, would look through this file to see if the
URI of the page being opened matches any of the patterns in the file.

If it does, it would change the User-Agent string on the fly to match the User-
Agent string given in the file, otherwise it would use the default User-Agent.

Is that a complete description of the proposal?

(I wrote that before reading your last comment. If your last comment is indeed
the suggestion here, then this is a duplicate of another bug.)

Comment 17

16 years ago
Yes, that's exactly it (except my last comment made it entirely client-side for
easier coding).  Dupe away as you see fit.  Do you have suggestions for Mark,
who seemed interested in doing something similar with DOM?
Reporter's latest description is a duplicate of bug 80658.

*** This bug has been marked as a duplicate of 80658 ***
Last Resolved: 16 years ago
Resolution: --- → DUPLICATE
marking verified as a duplicate.

if you decide to reopen this bug, please clarify why.

search string for bugspam removal: SalviaGuaranitica
Product: Browser → Seamonkey