Closed Bug 547437 Opened 14 years ago Closed 5 years ago

Suggestion: Enable User To Allow Discrete Cross-Site-Scripting

Categories

(Firefox :: General, enhancement)

Version: 3.6 Branch
Platform: x86_64 Windows 7
Severity: normal

Status: RESOLVED WORKSFORME

Reporter: brille1 (Unassigned)

Attachments: 8 files

User-Agent:       Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.2) Gecko/20100115 Firefox/3.6
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 6.1; de; rv:1.9.2) Gecko/20100115 Firefox/3.6


In the course of enforcing Same Origin policy, Firefox (like other browsers) blocks attempts to access content from other websites through, e.g., iframe elements or XMLHttpRequest calls.

Because this in particular stops Firefox from making use of web services through the XMLHttpRequest object, I'd like to suggest enabling the user to create a white list of websites (or URL paths) that are allowed to access a list of foreign websites (or URL paths).


Here are the details:

(I've created a couple of sample dialogs and added them as attachments. I'm running the German version of Firefox so they are all in German. Most content is taken from the current pop-up configuration dialog.)


*  As with the pop-up blocker, Firefox should provide a dialog where the user can edit a white list [see CSS1.png].

*  This white list should allow the user to enter websites (or URL paths; I can't tell which is more appropriate).

*  For each of these websites (or URL paths) the user should be able to enter a number of websites (or URL paths) that the website may address through an <iframe> element or the XMLHttpRequest object (or any similar means) [see CSS2.gif, which is animated]. In the following, the former are called "source websites", the latter "destination websites".

*  [CSS2a.png] shows the dialog when the user is to enter a new source website. [CSS2b.png] shows the dialog when the user is to enter a new destination website for the selected source website ("mozilla.org" in this example).

*  The user should be able to grant access to ANY foreign destination content for a source website (or URL path, I can't tell what's more appropriate). The asterisk ought to be used to denote that a source website (or URL path) may access any foreign destination content [see CSS2d.png].

*  The user might want to grant access to certain web services to ANY source website without restriction (e.g. package tracking services). So entering an asterisk into the list of source websites (or URL paths) would allow the destination websites (or URL paths) listed in the destination list to be accessed by any arbitrary source [see CSS2e.png].

*  To inform the user of a blocked foreign request attempt, Firefox should display a yellow bar above a document when such a request (or requests) has been blocked. The yellow bar should let the user enter the currently blocked request(s) into the white list and re-attempt those requests [see CSS3.png].



Reproducible: Always

If using URL paths instead of domain names, some valid values might be:


file:        http:
(= local files can access any http: destination)


file:        *
(= local files can access any destination)


*            http://www.ups.com/WebTracking/
(= files from any source can access any resource at or below this http: path)


*            https://www.ups.com/WebTracking/
(= files from any source can access any resource at or below this https: path)
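The entries above could be matched with logic along these lines. This is a hypothetical sketch of the proposed (never implemented) feature; the function names and the prefix-matching rule are invented for illustration:

```javascript
// Hypothetical sketch of how a source/destination whitelist entry
// might be matched. An entry pairs a source pattern with a destination
// pattern; "*" matches anything, and any other pattern matches URLs
// that start with it (a scheme like "file:" or a path prefix).
function patternMatches(pattern, url) {
  if (pattern === "*") return true;   // wildcard: any source/destination
  return url.startsWith(pattern);     // prefix match
}

function requestAllowed(whitelist, sourceUrl, destUrl) {
  return whitelist.some(
    (entry) =>
      patternMatches(entry.source, sourceUrl) &&
      patternMatches(entry.destination, destUrl)
  );
}

// Two of the example entries from above:
const whitelist = [
  { source: "file:", destination: "http:" },
  { source: "*", destination: "https://www.ups.com/WebTracking/" },
];
```

With these entries, a local file may call any http: destination, and any page at all may call the UPS tracking path, matching the semantics described above.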

Is this related to the work on Content Security Policy (in which case it would be the website itself that determines whether a remote script is allowed or not)?

http://blog.mozilla.com/security/2009/09/30/a-glimpse-into-the-future-of-browser-security/
Excellent article! Thanks for providing me with this Hyperlink.

Yep, CSP is a far better solution than mine. It spares the user from maintaining huge white lists.
Unfortunately the Comments section is closed...

I'm very curious to know how CSP is going to work for local files (file:).

Will CSP consider the HTML <meta http-equiv="X-Content-Security-Policy"> tag?
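For reference, the standardized CSP that later emerged does accept delivery via a meta element (the pre-standard X-Content-Security-Policy was header-only, and even in standard CSP some directives such as frame-ancestors and report-uri are ignored when set via meta). A policy permitting cross-site XHR to one extra host might look like the following sketch:

```html
<!-- Standardized CSP delivered in markup; connect-src controls where
     XMLHttpRequest (and similar APIs) may connect. Host is an example. -->
<meta http-equiv="Content-Security-Policy"
      content="default-src 'self'; connect-src 'self' https://www.ups.com">
```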
...one more question:

How about auxiliary files like images and JavaScript files? How can I prevent them from being loaded into foreign web pages?

I couldn't find a directive I could add to a JavaScript file so that it could only be executed from web pages originating from the same domain.
CSP is about protecting websites against scripts coming from an external source, like when someone has managed to insert them into the comment section of a weblog, for instance (the normal XSS attack vector).

It does not protect the scripts or images themselves. If you need that, you can check the Referer HTTP header. But that is easy to spoof anyway.
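A minimal sketch of the Referer-based approach mentioned here, as a hypothetical server-side check (as noted, the header is trivial to forge, so this is a deterrent rather than real protection):

```javascript
// Hypothetical server-side check: serve a script or image only when the
// Referer header names an allowed host. Easily defeated, since clients
// can send any Referer they like, or none at all.
function refererAllowed(headers, allowedHosts) {
  const referer = headers["referer"];   // note the historical misspelling
  if (!referer) return false;           // many clients omit the header
  try {
    const host = new URL(referer).hostname;
    return allowedHosts.includes(host);
  } catch (e) {
    return false;                       // malformed Referer value
  }
}
```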
I see... So it might be an interesting extension to the standard to protect these files as well.

Could you perhaps enlighten me on the usage of the <meta> tag and how CSP works for local files?
Have you looked at CORS (http://dev.w3.org/2006/waf/access-control/) and/or postMessage (http://www.w3.org/TR/html5/comms.html#crossDocumentMessages)?  Are there reasons those couldn't work for your use cases?

Providing UI for users to override fundamental web security boundaries seems generally unappealing as a core product feature, because it's risky and something most people really should not mess with anyway.

If there are scenarios where doing so actually improves user security, I'd suggest writing an add-on that allows that very limited set of users to have that functionality.
(In reply to comment #16)
Thanks for pointing me to these interesting recommendations! I guess I'll take some time to read these specs. It may be a few days before I can respond...

I've had a chance to read through the Mozilla CSP recommendation (see comment #10), and it's very promising, although in its current state it doesn't allow <meta> tags, which would provide the above-mentioned functionality for purely client-side cross-domain web-service access.
Thanks Jo. Yes, I know, that's actually my thread there ;)

Currently I'm discussing this @ http://groups.google.com/group/mozilla.dev.security/browse_thread/thread/bce3ee90653f3564/028751419dc6d6a4
Who wants to do the cross-site scripting? There are at least three parties involved here and depending on who wants to do the sharing there are different solutions.

Let's assume VisitedSite and DataSite aren't directly cooperating, because if they were, things would be easy -- either their servers talk to each other on the back end, or they do what ad networks do and include each other's scripts.

If DataSite has publicly available data and cooperatively shares it then VisitedSite can use CORS/cross-site XMLHttpRequest. The user doesn't need to do anything.
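The cooperative CORS case works roughly like this on DataSite's side (a sketch; the allowlist, function name, and origins are invented for illustration): the server inspects the request's Origin header and, if it trusts that origin, echoes it back in Access-Control-Allow-Origin, at which point the browser lets VisitedSite's XMLHttpRequest read the response.

```javascript
// Sketch of the server-side half of CORS: decide which (if any)
// Access-Control-Allow-Origin header to attach to a response.
const trustedOrigins = ["https://visitedsite.example"];  // invented example

function corsHeadersFor(requestOrigin) {
  if (trustedOrigins.includes(requestOrigin)) {
    // Echo back the specific origin rather than "*", so credentialed
    // requests could also be permitted if needed.
    return { "Access-Control-Allow-Origin": requestOrigin };
  }
  return {};  // no CORS header: the browser blocks cross-origin reads
}
```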

If the User wants to mash them up and the two sites know nothing about each other then the user could install a Greasemonkey userscript or a JetPack. Otherwise merely saying the sites _can_ do cross-site scripting isn't going to make them do it.

The only case I can think of where your suggestion comes into play is if DataSite is not cooperative (hostile or uninterested) and the VisitedSite is coded specifically for some future browser that will have this feature, and expects to have a mass of savvy users capable of doing this kind of management. ("please whitelist DataSite so this site can function properly"). I'm not sure that's a plausible scenario.

Much more likely in that case VisitedSite will publish a user script and ask users to install it (supported by GreaseMonkey in Firefox, and something like that in Opera and Chrome at least). That can be done today and seems quite a bit easier to get users to do than manage a list of sites.

This is a wontfix, imho.
Well, I see a plausible use case though.

First of all: one of the buzzwords regarding XML, SOAP, and web services is that these standards are versatile and may be used between sites/programs/anything *without* the parties knowing each other or providing "hints" like tailored scripts for each other. If that weren't the case, there would be no reason to use them, because they carry a huge overhead that wouldn't otherwise pay off.

So, it should be possible to use a company's web services by client script *without* providing crutches like Greasemonkey, JetPack, server-side script or anything.

Next, a use case: imagine you were a small business, shipping packages via some postal service, and you'd like to offer tracking to your customers as a value-added feature of your site. So you'd like to add JavaScript to your web pages that accesses the postal service's tracking feature, providing your customers with that information through *your* business channel -- just as you do with the postal service's RSS feed. You don't use Greasemonkey for reading RSS feeds, do you? Accessing a web service is quite the same; it's just more personal...

Creating a technical bypass channel like a server-side script is not desirable, because one already exists: the web service itself.
Version: unspecified → 3.6 Branch
Status: UNCONFIRMED → RESOLVED
Closed: 5 years ago
Resolution: --- → WORKSFORME
