Closed Bug 301375 (xss) Opened 15 years ago Closed 4 years ago
[meta] Ideas for mitigating XSS holes in web sites
*** Bug 312964 has been marked as a duplicate of this bug. ***
Depends on: 324253
Ok, here's an idea. The problems of XSS, imho, stem from HTML's lack of separation between metadata (including scripts) and data. The idea I'll present here requires some support from the server side to help separate metadata from data; however, the change is small enough, and the problem important enough, that I think this is reasonable. I also believe the method can be extended to provide (limited) client-only defense, but I won't cover that here to keep this note's length bearable.

Specifically, I suggest sites use special markup to define permitted and forbidden areas for different kinds of markup. This could take multiple forms, and careful evaluation should determine the best ones, but let me give two examples to make the idea concrete:

<NoScript id=xxx>
  here goes HTML without any scripts, either in <script>(an ignored script)</script>
  or in attributes (e.g. <a href=xx onsubmit="ignored">)
</NoScript id=xxx>
<!-- notice the use of a random id attribute, matched between the opening and
closing NoScript tags, to prevent malicious markup from faking the end tag -->

<MarkupValidationOn id=xxx>
  rest of the HTML document, where _all_ tags are ignored unless they carry the
  validating identifier, e.g. <img src='webbugger.com'> is ignored while
  <img src='cow' id=xxx> is applied.

I am thinking of prototyping something along these lines, so comments are most appreciated...
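The server-side half of the proposal can be sketched quickly. This is a hypothetical illustration, not an implementation: the <NoScript id=...> tag is the commenter's proposed markup (no browser supports it), and the function name `wrap_untrusted` is mine. The key point it demonstrates is that the id must be a fresh, unguessable random token per response, so injected content cannot emit a matching closing tag to escape the sandbox:

```python
import secrets

def wrap_untrusted(user_html: str) -> str:
    """Wrap untrusted markup in the proposed <NoScript> sandbox.

    The random id must be unguessable, so attacker-supplied content
    cannot forge a matching </NoScript id=...> to break out early.
    A real filter would also reject any literal "</NoScript" sequence
    inside user_html; this sketch only illustrates the nonce idea.
    """
    nonce = secrets.token_hex(16)  # fresh token for every wrap
    return f'<NoScript id={nonce}>{user_html}</NoScript id={nonce}>'

page = "<p>Hello</p><script>alert(1)</script>"
print(wrap_untrusted(page))
```

Note the resemblance to what later shipped as Content-Security-Policy script nonces: there too, a per-response random token distinguishes markup the server intended from markup an attacker injected.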
Marking all tracking bugs which haven't been updated since 2014 as INCOMPLETE. If this bug is still relevant, please reopen it and move it into a bugzilla component related to the work being tracked. The Core: Tracking component will no longer be used.
Status: NEW → RESOLVED
Closed: 4 years ago
Resolution: --- → INCOMPLETE