Closed Bug 667312 Opened 13 years ago Closed 13 years ago

XMLHttpRequest send function throws 0x80004005 exception when called from local html file as of Gecko 5 even with UniversalBrowserRead requested

Categories: Core :: DOM: Core & HTML, defect
Platform: x86 Windows XP
Priority: Not set
Severity: major

Tracking

Status: RESOLVED WONTFIX

People

(Reporter: morac, Unassigned)

References

Details

(Keywords: regression)

Attachments

(3 files)

I have a local HTML file that contains XMLHttpRequest calls that I use to pull in and parse data from web sites.  Prior to upgrading from Firefox 4.0.1 to Firefox 5, this script ran fine.  After upgrading to Firefox 5, it always generates an exception.  It appears that as of Firefox 5, local files cannot make XMLHttpRequest calls.  This means I can't use Firefox 5 to load my local file.

I'm attaching an HTML file which is a simple implementation of the XMLHttpRequest example from MDC.  To test with it, simply save the file locally and then open it in Firefox and click "Allow" when prompted (this allows cross-site and local XMLHttpRequest access).

Under Firefox 4.0.1 and earlier, it loads the contents of http://www.mozilla.org.

Under Firefox 5 and up the error event is triggered by the following exception (which displays in the error console):

Error: uncaught exception: [Exception... "Component returned failure code: 0x80004005 (NS_ERROR_FAILURE)"  nsresult: "0x80004005 (NS_ERROR_FAILURE)"  location: "JS frame :: file:///.....test.html :: <TOP_LEVEL> :: line 26"  data: no]
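For reference, the attached test page follows the MDC example, so it looks roughly like the sketch below (a reconstruction, not the exact attachment; the target URL comes from the report, the rest is illustrative, and line numbers will not match the "line 26" in the exception):

```html
<!DOCTYPE html>
<html>
<head>
<script>
function load() {
  // Ask for expanded permissions; the user sees an "Allow" prompt.
  // As of Gecko 5 this no longer grants cross-site XHR to file:// pages.
  netscape.security.PrivilegeManager.enablePrivilege("UniversalBrowserRead");

  var req = new XMLHttpRequest();
  req.open("GET", "http://www.mozilla.org/", true);
  req.onload = function () {
    document.getElementById("out").textContent = req.responseText;
  };
  req.onerror = function () {
    // In Firefox 5+ the error event fires and send() reports
    // NS_ERROR_FAILURE (0x80004005) instead of returning the page.
  };
  req.send(null);
}
</script>
</head>
<body onload="load()">
<pre id="out"></pre>
</body>
</html>
```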


It appears the request is actually sent, since looking at the request and response data I see:

Request
----------------
Host=www.mozilla.org
User-Agent=Mozilla/5.0 (Windows NT 5.1; rv:5.0) Gecko/20100101 Firefox/5.0
Accept=text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language=en-us,en;q=0.5
Accept-Encoding=gzip, deflate
Accept-Charset=ISO-8859-1,utf-8;q=0.7,*;q=0.7
Connection=keep-alive
Origin=null
DNT=1

Response
---------
Status=OK - 200
Server=Apache
X-Backend-Server=pm-web04
Content-Type=text/html; charset=UTF-8
Date=Sun, 26 Jun 2011 18:01:18 GMT
Keep-Alive=timeout=20, max=992
Expires=Sun, 26 Jun 2011 04:11:18 GMT
Transfer-Encoding=chunked
Connection=Keep-Alive
X-Powered-By=PHP/5.2.9
X-Cache-Info=not cacheable; response has already expired


But an exception is thrown nonetheless before the results can be returned.  This problem affects both synchronous and asynchronous connections.
Note: you can also run the file directly from this web site by setting the "signed.applets.codebase_principal_support" preference to "true" in about:config.

The same error occurs.
Summary: XMLHttpRequest send function throws 0x80004005 exception when called from local html file as of Gecko 5 → XMLHttpRequest send function throws 0x80004005 exception when called from html file as of Gecko 5
I'll mention that the same error occurs in Firefox 4.0.1, if I remove the netscape.security.PrivilegeManager.enablePrivilege("UniversalBrowserRead") line, so my guess is that this line no longer works as of Firefox 5.

Any reason why the functionality was removed?
Summary: XMLHttpRequest send function throws 0x80004005 exception when called from html file as of Gecko 5 → XMLHttpRequest send function throws 0x80004005 exception when called from local html file as of Gecko 5
The behavior here changed in this range: http://hg.mozilla.org/mozilla-central/pushloghtml?fromchange=e11c2f95f781&tochange=bf68a4a3ef6a

Looks like the changes in bug 641706 took out the IsCallerTrustedForRead special-case, quite on purpose.

> Any reason why the functionality was removed?

Well, apart from the fact that enablePrivilege in general is being removed?
Blocks: 641706
Summary: XMLHttpRequest send function throws 0x80004005 exception when called from local html file as of Gecko 5 → XMLHttpRequest send function throws 0x80004005 exception when called from local html file as of Gecko 5 even with UniversalBrowserRead requested
It looks like they replaced it with something else.  Is the new method callable when not using Mochitest?  If not, why change things?
> It looks like they replaced it with something else.

"they" just removed the special-case: the only XHRs allowed cross-site are ones happening with the system principal or ones the target site allows via CORS.

> If not, why change things?

Because enablePrivilege is being removed.  I thought I made that clear in comment 3... and it's been the plan of record for over 2 years now.
The enablePrivilege API is going away completely. Generally, if you want trusted code, you should create a Firefox add-on or a full XULRunner application or something like that. It's hard to say what the exact replacement is without knowing exactly what type of thing you're trying to build.
It's a simple script which searches a few pages on a web site for text and displays links to the pages that contain the text.  Writing an add-on or a full application for something so simple is overkill.  I guess I can just use IE, an older copy of FF or any other browser which still works.  Seems a shame though.
Same problem and same line of reasoning here. And, yes, it is a shame. Firefox was once the browser of web specialists and I was proud to use it. Then, around version 3.6, the developers more or less stopped listening to the community. Since then it is just another stupid mainstream browser. It is always the same: if people are too successful, they start to screw up.

For the application I have here, I will tell my users to go back to Firefox 3, since FF 4 and 5 are broken. Not a big deal for me, since I never made the upgrade anyhow.
Note that Firefox 3.x is about to stop getting security updates... and that it was announced that enablePrivilege is being removed before 3.6 even shipped.  I realize it's only been about two years since then, but at some point you really do need to update your code to use an extension, which is the supported way to do what you're doing.
Status: NEW → RESOLVED
Closed: 13 years ago
Resolution: --- → WONTFIX
For anyone who finds this bug in the future, I was told a workaround is to use the Greasemonkey add-on: set the "greasemonkey.fileIsGreaseable" preference to true and then use the GM_xmlhttpRequest function instead of XMLHttpRequest.
(In reply to comment #9)
> Note that Firefox 3.x is about to stop getting security updates... and that
> it was announced that enablePrivilege is being removed before 3.6 even
> shipped.  I realize it's only been about two years since then, but at some
> point you really do need to update your code to use an extension, which is
> the supported way to do what you're doing.

As if that were easy information to find, even assuming you were looking for it. 99% of developers discovered it a few days ago when FF5 came out. So now I have to write a plugin to call and parse HTML pages (and lose time) OR ask my clients to use IE (sic).

In my opinion, it doesn't matter why you removed it: a FF upgrade should not completely break a website without a warning to developers. FF should have displayed a message in FF5 and then removed the feature in FF6. I'm stuck now and have to do emergency work on old websites and forget about my current work.

Let me say it hurts my trust in FF.
Hmm.  I thought we had a warning on enablePrivilege use, but apparently that never made it into the tree.  I'll make sure that happens, and my apologies for that...
+1 for me.
I have the same issue. I have written plenty of HTML pages, hosted on my company's wiki server, that fetch data from different web services and aggregate the results in a nice web page. After the FF5 upgrade, my pages stopped working. I'm stuck with FF4 and can't upgrade until we redesign our web services to implement a JavaScript callback feature and use <script> elements instead of XMLHttpRequest. I only discovered the privilege removal a couple of weeks ago.
> until we redesign our webservices to implement a javascript callback feature

Or just change the web services to use CORS, which is designed exactly for your use case?  That's assuming that you control those services.  It just requires sending headers saying your wiki server can grab data cross-site from them, not any redesigning.
Thanks for the hint. That could be one option, but CORS introduces a dependency between the client and the server, which is counterproductive. The server should not have to know all possible clients.
Is it possible to use file:// as a protocol instead of http:// ?
Is it possible to use a wildcard in the URL when we define the allowed client?
> Is it possible to use file:// as a protocol instead of http:// ?

Not sure what you're asking.

> Is it possible to use wildcard in the URL when we define the allowed client ?

Other than "*" (which you probably don't want), I don't think so.
But note that "client" in this case is a hostname, not a url.  So if all your stuff is on the wiki server, you just whitelist that hostname.
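The server-side decision described above can be sketched as a small function (the helper name and the whitelist are hypothetical, and any server-side language works the same way): the browser sends an Origin header, and the server echoes it back in Access-Control-Allow-Origin only if its hostname is on the list.

```javascript
// Sketch of a CORS origin check, assuming a per-server whitelist of
// hostnames. Returns the value to send in Access-Control-Allow-Origin,
// or null to deny the cross-site request.
function corsAllowOriginHeader(requestOrigin, allowedHosts) {
  if (!requestOrigin || requestOrigin === "null") {
    // file:// pages send "Origin: null" (as in the request dump above),
    // so there is no meaningful hostname for a server to whitelist.
    return null;
  }
  // "Client" here is a hostname, not a full URL.
  var host = new URL(requestOrigin).hostname;
  return allowedHosts.indexOf(host) !== -1 ? requestOrigin : null;
}
```

So if everything lives on the wiki server, whitelisting that one hostname is enough; no wildcard beyond "*" is needed.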
(In reply to comment #12)
> So now I have to write a plugin to call and parse html pages (and lose time) 
> OR ask my clients to use IE (sic). 

> I'm stuck now and I have to work in emergency on old websites and forget about
> my current work.
 
> Let me say it hurts my trust in FF

+1

Not breaking back-compatibility is a way of showing respect to your customer. Microsoft knows how to do that.

Not playing net-nanny is another way to show this respect. FF has grown into such a net-nanny. Changing config? I have to "promise I will be careful" [sic]. Checking out a forgery site? No. Only if Google allows it. Looking at a web page with old cert? I have to click on a dozillion of disclaimers that I "know what I am doing". Wanting to give my scripts more permission? NO. Can't do it since FF5. Need to consider all kinds of workarounds, extensions, greasemonkey add-ons.

FEATURE REQUEST: I want a single global "turn off net-nanny" switch or mode with the semantics of "do exactly what you were ordered to, without whining, complaining, requiring promises of being careful, making me install extensions, having me research for workarounds, asking Google whether I am allowed to, throwing exceptions and refusing orders for the mere reason that someone thought what I wanted was not a good idea". 

It isn't exactly helpful to only think of John Doe users and **** off developers making them develop workarounds, extensions and stuff, just because Mozilla thought they should not be doing what they want to do.  

So this bug here is "WONTFIX". As someone wrote above: The fix may be to use IE. 

I filed the feature request as bug 668190.
(In reply to comment #15)
> Or just change the ...

> It just requires ...

It's not about technical feasibility. It's about developer time and about breaking systems which worked.
I suggest people go and read the discussion that led to the decision to drop enablePrivilege support.  In brief, it's not really compatible with content processes, its existence is hurting optimization and development effort across the DOM and JS engine, and it leads to security holes.  So yes, the tradeoff was made to drop a proprietary technology that couldn't be used on the web anyway (intranets are another story) and provide a different means to accomplish the same goals (in this instance, extensions).
(In reply to comment #22)
> So yes, the tradeoff was made

That is an important point, Boris.

Why does Mozilla not use a more GitHub-like way of social coding? It would make life easier for people to do their own fork in case they had different preferences. In the current process, a (skilled) minority decides for the rest of the world. That is not "open" but "control under the disguise of openness".
> Why does Mozilla not use a more GitHub-like way of social coding?

Uh...  Anyone can pull the source.  Anyone can suggest changes.  Anyone can build the source and use it or give it to anyone else.

Is the question seriously why some pull requests (in github-speak) are not accepted by a particular repository while others are?
(In reply to comment #24)
> Is the question seriously why some pull requests (in github-speak) are not
> accepted by a particular repository while others are?

Certainly not. It's rather about why you close the source rather than opening it up. GitHub makes it easy to roll your own fork.

I rechecked and realized I was too critical. Mozilla moved to Mercurial, a fact I did not know and certainly a step in the right direction. I stopped updating and watching the Firefox source near version 3.4 due to similar frustrations with the then-current development process, so I did not see this restructuring.

In my understanding, open source means making it EASY to compete with your own approach by being as open as you can. With this fundamentalist understanding, hardly any project really is open source (it's usually only about getting your own market share; which is fine, but it's just not open).
The status has been set to RESOLVED WONTFIX and my status for FF is FAILED WONTUPGRADE.
I tried to create a testcase with createSystemXHR(), using the specialpowers extension, but it doesn't work. What am I doing wrong here?
Also, Firefox seems to be getting stuck as a process afterwards.

Btw, why does XMLHttpRequest throw this weird exception? Shouldn't it throw some kind of security error instead?
I also tried something like this from specialpowers.js:

let channel = NetUtil.ioService.newChannel(aURL, null, null);

// Open our channel asynchronously.
NetUtil.asyncFetch(channel, function(aInputStream, aResult) {
  // Check that we got the right data.
  //do_check_eq(aInputStream.available(), TEST_DATA.length);
  let is = Cc["@mozilla.org/scriptableinputstream;1"].
           createInstance(Ci.nsIScriptableInputStream);
  is.init(aInputStream);
  // read(available()) only returns the bytes buffered so far,
  // which may be just the first chunk of the response.
  let result = is.read(is.available());
  aWindow.alert(result);
});

That seems to work at least, but it only gives part of the website, it seems.
Ok, the xmlhttprequest stuff from Greasemonkey, as indicated in comment 11, seems to work.
The code for that is here: https://greasemonkey.googlecode.com/svn/trunk/chrome/chromeFiles/content/xmlhttprequester.js
And that is called from greasemonkey.js by calling:
      xmlhttpRequester = new GM_xmlhttpRequester(unsafeContentWin, 
                                                 appSvc.hiddenDOMWindow);

You have to set the "greasemonkey.fileIsGreaseable" pref to true to be able to use it for local files.
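From inside a user script, the Greasemonkey workaround mentioned in comment 11 looks roughly like this (a sketch; the URL is illustrative, and this only runs in the Greasemonkey sandbox, not as plain page script):

```javascript
// GM_xmlhttpRequest is proxied through chrome code, so it is not
// blocked by the same-origin restriction that breaks plain
// XMLHttpRequest for file:// pages in Firefox 5+.
GM_xmlhttpRequest({
  method: "GET",
  url: "http://www.mozilla.org/",
  onload: function (response) {
    // response.responseText holds the cross-site page contents
    console.log(response.status, response.responseText.length);
  },
  onerror: function (response) {
    console.log("request failed: " + response.status);
  }
});
```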

However, this is called from the chrome process and can't be called directly from content, afaict, so it's not really an easy alternative.

I wonder why it works for Greasemonkey, though, since I wasn't able to use xmlhttprequest from inside specialpowers.js.
I guess I should ask in dev-platform about this or something?
Oh, sorry, never mind, it actually seems to work now. Even in Firefox 6, so I guess I might have done something wrong.

So people, you can use the SpecialPowers extension and SpecialPowers.createSystemXHR():
https://developer.mozilla.org/en/SpecialPowers#SpecialPowers.createSystemXHR
Well, this worked in Firefox 12, but not in 16. Does anyone care? I doubt it.

var xmlhttp = SpecialPowers.createSystemXHR();
xmlhttp.open("GET", url,false);     //synchronous

Error: Permission denied to access property 'open'
Please file a separate bug on that.
Component: DOM → DOM: Core & HTML