Open Bug 1757836 Opened 2 years ago Updated 5 months ago

Creating a ReadableStream in an Addon fails with Error: Permission denied to access property "autoAllocateChunkSize"

Categories: Core :: DOM: Streams, defect, P2
People: Reporter: mgaudet; Assignee: Unassigned
References: Depends on 1 open bug; Blocks 2 open bugs; Regression
Keywords: regression
Attachments: 5 files, 1 obsolete file

(I really need to figure out how to convert bespoke add-on code to a mochitest)

Create a content script for an add-on with

    var controller;
    var rs = new ReadableStream({ start(c) { controller = c } })

and you'll discover the content script fails with

Error: Permission denied to access property "autoAllocateChunkSize"

Has Regression Range: --- → yes
Attachment #9266335 - Attachment description: WIP: Bug 1757836 - Test case for bug, and broken solution → WIP: Bug 1757836 - Test case for bug, and broken *partial* solution

The proximate cause of the error is that when we try to create the IDL-specified UnderlyingSource dictionary from the object passed to the ReadableStream constructor, we are denied access when looking for the autoAllocateChunkSize property on that object.

What's happening is that the ReadableStream constructor seen by the content script is an Xray to the page's ReadableStream. When invoking an Xray constructor, we explicitly switch realms into the target function's realm, i.e. the page realm. This means that when we try to access the UnderlyingSource object, which comes from the content script's sandbox realm, it is a CCW, and access is denied because the security model says a less privileged scope can't access things from a more privileged one.

Attached to this bug is a partial fix, but it is very unsafe, as it currently does an unchecked unwrap on all CCWs.

Even that doesn't fully fix the issue; we then run into a different problem: Permission denied to access property "then" - false == true when trying to await the promise. I think this is a slightly different bug; see Bug 1750290.
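
Roughly, the follow-on failure looks like the sketch below (a hypothetical shape for illustration, not the attached test case):

    // Presumably the promise handed back by the page-realm stream machinery
    // is itself a page object, so awaiting it from the sandbox trips the same
    // permission check, this time on its "then" property.
    const reader = rs.getReader();
    const { value, done } = await reader.read();  // Permission denied to access property "then"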


So: What happened in JS streams that allowed this to work?

This is a Pernosco trace of the JS streams running the same test case. It appears that in this case we’re invoking the ReadableStream function in the sandbox global. We change realm into the sandbox on invocation.

So I think this bug is a direct result of the switch to WebIDL; and furthermore, I suspect this bug would exist for other specifications that try to create IDL Dictionaries outside of the binding code, using [GenerateInit] — Olli points to WebCrypto as an example of something that might also be vulnerable to this.

peterv: It seems like there's a good chance the correct place to fix this is not one-off on a case-by-case basis, but rather inside the [GenerateInit]-generated methods.

The root cause seems to be the use of the "converted to an IDL value" verbiage in the Streams spec (and similar verbiage does appear a few times in WebCrypto's spec). If an IDL value is passed as an argument, it's correctly converted before we switch realms into the Xray'd function's realm.

Flags: needinfo?(peterv)

This is far from complete, but it is more complete than the previously posted patch, and it has the advantage of being runnable in --disable-dom-streams builds, which wasn't true of the other test case.

An update: thanks to :lgreco we have a workaround in case this does turn out to be an issue for anyone:

If a ReadableStream is needed, first cloneInto the source argument and use the window.wrappedJSObject.ReadableStream constructor, and things seem to work out.

See test_clone_into in the attached testcase diff
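
For illustration, here is a minimal sketch of that workaround as it might look in a content script (a hedged example; the chunk and names are made up, and this is not the actual test_clone_into code):

    // Clone the underlying source into the page compartment, then use the
    // page's own ReadableStream constructor instead of the Xray'd one.
    const source = cloneInto(
        {
            start(controller) {
                controller.enqueue("hello from the content script");
                controller.close();
            },
        },
        window,
        { cloneFunctions: true }  // the source's methods must be cloned too
    );
    const rs = new window.wrappedJSObject.ReadableStream(source);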

(In reply to Matthew Gaudet (he/him) [:mgaudet] from comment #2)

So I think this bug is a direct result of the switch to WebIDL;

Yes, see the examples in https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/Sharing_objects_with_page_scripts#constructors_from_the_page_context for the difference between JavaScript built-ins and DOM APIs. Moving Streams to WebIDL switched it from the first class of objects to the second.
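
As a rough, hypothetical content-script illustration of that distinction (not taken from the MDN page):

    // JavaScript built-ins come from the content script's own sandbox realm,
    // so constructing them never crosses the Xray boundary:
    const bytes = new Uint8Array([1, 2, 3]);            // works

    // WebIDL-backed constructors like ReadableStream are Xrays to the page's
    // constructor, so the sandbox-created argument ends up being read from
    // the page realm -- which is exactly the failure in this bug:
    const rs = new ReadableStream({ start(c) {} });     // throws Permission denied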

and furthermore, I suspect this bug would exist for other specifications that try to create IDL Dictionaries outside of the binding code, using [GenerateInit]

This isn't really related to GenerateInit, but more about how Xrayed DOM APIs (and thus content scripts) work. I'm pretty sure that APIs taking dictionary arguments with object or any in them run into similar issues, even though they're converted inside the binding layer.

So I don't think it's such a good idea to add something to the generated Init methods; callers outside of generated binding code need to be careful when they get handed a JSObject as an argument from a binding method, and I'd rather not add magic that hides that. The WebCrypto case, while probably similarly broken, looks different, since it's conceptually trying to use a union type with multiple dictionary member types. I don't think it needs the JS object beyond that conversion, but Streams do. I was thinking we could maybe pass the original object along with the converted dictionary from the binding code, but if Streams is the only case that'll use this then that seems a bit overkill :-/.

(In reply to Matthew Gaudet (he/him) [:mgaudet] from comment #5)

If a ReadableStream is needed, first cloneInto the source argument and use the window.wrappedJSObject.ReadableStream constructor, and things seem to work out.

The cloneInto is the only thing that's needed to fix this bug though, and window.wrappedJSObject.ReadableStream is for bug 1750290?

(In reply to Peter Van der Beken [:peterv] from comment #6)

(In reply to Matthew Gaudet (he/him) [:mgaudet] from comment #2)

So I think this bug is a direct result of the switch to WebIDL;

Yes, see the examples in https://developer.mozilla.org/en-US/docs/Mozilla/Add-ons/WebExtensions/Sharing_objects_with_page_scripts#constructors_from_the_page_context for the difference between JavaScript built-ins and DOM APIs. Moving Streams to WebIDL switched it from the first class of objects to the second.

Ah; I hadn't quite processed that properly. So this is partially an expected behaviour change. That's good to know.

So I don't think it's such a good idea to add something to the generated Init methods; callers outside of generated binding code need to be careful when they get handed a JSObject as an argument from a binding method, and I'd rather not add magic that hides that. The WebCrypto case, while probably similarly broken, looks different, since it's conceptually trying to use a union type with multiple dictionary member types. I don't think it needs the JS object beyond that conversion, but Streams do. I was thinking we could maybe pass the original object along with the converted dictionary from the binding code, but if Streams is the only case that'll use this then that seems a bit overkill :-/.

(In reply to Matthew Gaudet (he/him) [:mgaudet] from comment #5)

If a ReadableStream is needed, first cloneInto the source argument and use the window.wrappedJSObject.ReadableStream constructor, and things seem to work out.

The cloneInto is the only thing that's needed to fix this bug though, and window.wrappedJSObject.ReadableStream is for bug 1750290?

Yes: I think that's a fair summary.

So I think really, both of these bugs can be resolved as 'works as expected' (where the expectation is just peculiar because of the nature of add-ons)? We could land the test case from this to make sure things keep working.

Does that sound right Peter?

(So far, we're 1 week into the beta cycle with no reports of broken addons yet)

Depends on: 1763240

I'm currently planning on repurposing this bug: We will land a test case ensuring that the behaviour we expect is tested.

However, this is currently blocked on Bug 1763240, which has revealed some bizarre path-dependent behaviour that needs to be resolved first.

Severity: -- → S3
Priority: -- → P2

As I said in https://bugzilla.mozilla.org/show_bug.cgi?id=1772710, this works fine in Firefox 91.10.0esr.

Moreover, autoAllocateChunkSize is a property for BYOB byte streams ("type": "bytes") only (https://streams.spec.whatwg.org/), which are not supported by Firefox.
Why does Firefox handle this property at all when I create a default ReadableStream object?

A couple of things.

  1. Byte Streams will be supported in Firefox 102, when it hits release.
  2. The issue this bug is tracking, and that you're running into, would apply to any property of the underlying source object; it just so happens that algorithmically autoAllocateChunkSize is the first one checked, but even if we didn't check that, we'd fail on a later property (see the sketch after this list).
  3. The reason this no longer works, but used to work in 91, is that we replaced the entire Streams implementation in Firefox 99. As a result, we have subsequently been able to ship WritableStream and TransformStream; however, this has had some awkward impacts on add-ons.
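
To illustrate point 2, a hypothetical content-script snippet (the exact lookup order is an implementation detail of the dictionary conversion):

    // The denial happens on the cross-compartment property lookup itself, so
    // it doesn't matter whether autoAllocateChunkSize (or any other member)
    // is actually present on the underlying source object.
    new ReadableStream({});                          // fails just the same
    new ReadableStream({ pull(c) { c.close(); } });  // also fails; pull is never reached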

I would be interested to see if you could use the proposed workaround: In your script, use window.wrappedJSObject.ReadableStream as the constructor, and cloneInto your underlying source object.

Here is a simple demo showing the bug:

import {extensionName} from "./util.js";

console.log("...1", extensionName);
try {
    const readableStream = new ReadableStream({
        async start(controller) {
            controller.enqueue(new Uint8Array([84, 101, 120, 116, 33]));
            controller.close();
        }
    });
    console.log("readableStream", readableStream);
    new Response(readableStream).text().then(text => console.log("Response.text():", text));
} catch (e) {
    console.error(e);
}
console.log("...2", extensionName);

However, this code works:

import {extensionName} from "./util.js";

void main();

async function main() {
    let resp = await fetch("https://example.com/");
        resp = responseOnProgressProxy(resp, console.log);
    console.log(await resp.blob());
}

function responseOnProgressProxy(response, onProgress) {
    const total = parseInt(response.headers.get("Content-Length")) || -1;
    let loaded = 0;
    const reader = response.body.getReader();
    console.log("Create ReadableStream", extensionName);
    const readableStream = new ReadableStream({
        async start(controller) {
            while (true) {
                const {done, value} = await reader.read();
                if (done) {
                    break;
                }
                loaded += value.length;
                try {
                    onProgress({loaded, total});
                } catch (e) {
                    console.error(e);
                }
                controller.enqueue(value);
            }
            controller.close();
            reader.releaseLock();
        },
        cancel() {
            void reader.cancel();
        }
    });
    console.log("ReadableStream is created", extensionName, readableStream);
    return new Response(readableStream);
}

Find the difference!


Even if you replace const readableStream = new ReadableStream({...}); with the same code from the first example, it will still work.

Why?

Because of this magical fix:

(await fetch("https://example.com/")).body.getReader();

You can add this magic line to the first example before const readableStream = ..., and then the extension will work.
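
For clarity, here is a hedged sketch of what the commenter describes, spliced into the first demo (illustrative only; see the later comments for why this should not be relied upon):

    // The commenter's "magic line": doing a fetch() and grabbing a reader
    // first apparently changes which ReadableStream machinery the content
    // script ends up using (the path-dependent behaviour tracked in
    // Bug 1763240). It is an incidental side effect, not a supported fix.
    (await fetch("https://example.com/")).body.getReader();

    const readableStream = new ReadableStream({
        async start(controller) {
            controller.enqueue(new Uint8Array([84, 101, 120, 116, 33]));
            controller.close();
        }
    });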

Attached file The simple extension with the bug (obsolete) —
Attachment #9279988 - Attachment is obsolete: true

Yeah,

(await fetch(chrome.runtime.getURL(""))).body.getReader();

is a real fix.
I just checked with another extension that "sends" a ReadableStream from a background script to a content script.


Just use it before the ReadableStream constructor.

I hope you will not "fix" (break) it.

Anyway, it should work even without that magic.

(In reply to Akoki from comment #16)

I hope you will not "fix" (break) it.

Anyway, it should work even without that magic.

Yeah, I would strongly recommend not relying on this behaviour until a proper conclusion is reached in Bug 1763240.

It's my current belief that the most future-proof workaround should be "use window.wrappedJSObject.ReadableStream as the constructor, and cloneInto your underlying source object" -- if that's not working, I would like to know so that we can try to figure out a solution.

I'm back in webextension hell.

This code used to work in Firefox ESR 91:

        function stringReader(str) {
                var offset = 0;
                const chunkSize = 10;
                return new ReadableStream({
                        pull(controller) {
                                var nextOffset = offset + chunkSize;
                                if (nextOffset >= str.length) {
                                        nextOffset = str.length;
                                        controller.enqueue(str.substring(offset, nextOffset));
                                        controller.close();
                                } else {
                                        controller.enqueue(str.substring(offset, nextOffset));
                                }
                                offset = nextOffset;
                        }
                });
        }

Now constructing the ReadableStream breaks with Permission denied to access property "autoAllocateChunkSize".

This seems to not crash:

        function stringReader(str) {
                var offset = 0;
                const chunkSize = 10;

                const source = new window.wrappedJSObject.Object();

                source.pull = exportFunction(
                        function(controller) {
                                var nextOffset = offset + chunkSize;
                                if (nextOffset >= str.length) {
                                        nextOffset = str.length;
                                        controller.enqueue(str.substring(offset, nextOffset));
                                        controller.close();
                                } else {
                                        controller.enqueue(str.substring(offset, nextOffset));
                                }
                                offset = nextOffset;
                        }, window.wrappedJSObject
                );

                return new window.wrappedJSObject.ReadableStream(source);
        }

Is this likely to break in future? What are the security implications of this?

Is there any progress in fixing it?

(In reply to Matthew Gaudet (he/him) [:mgaudet] from comment #8)

I'm currently planning on repurposing this bug: We will land a test case ensuring that the behaviour we expect is tested.

However, this is currently blocked on Bug 1763240, which has revealed some bizarre path dependent behaviour that needs to be resolved first.

I explained the observed behavior in https://bugzilla.mozilla.org/show_bug.cgi?id=1763240#c16, and a fix for that issue in the section starting with "There are two ways to get consistent behavior". The first way would break the workaround mentioned in comment 16. The second way relies on bug 1746653 and would fix this bug, at the cost of introducing new potential issues.

The current behavior is already pretty much broken, so I think the easiest way to solve this is probably to just export ReadableStream to the Sandbox (for the relevant places where we would have to fix this, see https://bugzilla.mozilla.org/show_bug.cgi?id=1810573#c3).
That would not fix Blob (which also has a stream() method), so a similar fix might be needed there. I'm concerned that it could impact extensions that use Blobs with other DOM APIs if we were to replace the Xray-wrapped Blob constructor with a Sandbox-specific one, though... At some point we'd have so many Sandbox-specific DOM APIs that it would become very unclear when an API is Sandbox-specific and when it's Xray-wrapped from the page.

Depends on: 1746653
See Also: → 1810573
Duplicate of this bug: 1816579

Can you try to understand if this is still an issue?

Flags: needinfo?(peterv) → needinfo?(krosylight)

It is still a problem, yes.

Flags: needinfo?(krosylight)
