ReadableStreamDefaultReader.read() promise never resolves when reading a sufficiently large file in a web worker.
Categories
(Core :: DOM: Streams, defect, P2)
People
(Reporter: ikreymer, Unassigned)
References
(Depends on 1 open bug)
Details
User Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.97 Safari/537.36
Steps to reproduce:
When attempting to read a File of roughly 50 MB or more with the default reader, the ReadableStream appears to hang: read() never resolves and no errors are thrown.
The ReadableStream is obtained via the relatively new Blob.stream() method.
This only happens when reading larger files (around 50 MB and up), and only when reading in a web worker. When the same function is called on the page itself, the read completes.
The index.html and worker.js below demonstrate the issue. (They should be served from a web server to avoid security errors.)
The readFile() function is included in both to demonstrate that everything works on the main thread but not in the worker (uncomment the direct readFile() call instead of postMessage to try that option).
- Click "Choose File" and select any file over 50 MB.
- Observe the console output, which prints the total bytes read so far. It gets stuck somewhere around ~50 MB and never finishes or prints 'done'.
This has been tested on macOS with Firefox Nightly 79 as well as Firefox 77.
index.html
<script>
let worker = new Worker("./worker.js");

function onSelected(event) {
  // read via web worker -- this never finishes
  worker.postMessage({"file": event.currentTarget.files[0]});

  // read directly -- this works
  // readFile(event.currentTarget.files[0]);
}
</script>

Choose File
<input onchange="onSelected(event)" type="file">

<script>
async function readFile(file) {
  const stream = file.stream();
  const reader = stream.getReader();
  let res = null;

  reader.closed
    .then(() => console.log("stream closed"))
    .catch((err) => console.log("stream error?", err));

  let total = 0;
  try {
    res = await reader.read();
    while (!res.done) {
      total += res.value.length;
      console.log("Total Read " + total);
      res = await reader.read();
    }
  } catch (err) {
    console.log("errored", err);
  }

  // 'done' is never reached with large files in a web worker
  console.log('done');
}
</script>
worker.js
console.log("worker loaded");

async function readFile(file) {
  const stream = file.stream();
  const reader = stream.getReader();
  let res = null;

  reader.closed
    .then(() => console.log("stream closed"))
    .catch((err) => console.log("stream error?", err));

  let total = 0;
  try {
    res = await reader.read();
    while (!res.done) {
      total += res.value.length;
      console.log("Total Read " + total);
      res = await reader.read();
    }
  } catch (err) {
    console.log("errored", err);
  }

  // 'done' is never reached with large files in a web worker
  console.log('done');
}

self.addEventListener("message", (event) => { readFile(event.data.file); });
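A possible workaround, offered here only as an untested sketch rather than something verified against this bug: reading the File in fixed-size slices via Blob.slice() and arrayBuffer() avoids Blob.stream() entirely and runs the same way in window and worker contexts. The function name and the 1 MiB chunk size are arbitrary choices.

```javascript
// Hypothetical workaround sketch: read a Blob/File in fixed-size slices via
// slice() + arrayBuffer(), avoiding Blob.stream() entirely. The 1 MiB chunk
// size is an arbitrary choice, not something prescribed by the report.
async function readFileSliced(file, chunkSize = 1024 * 1024) {
  let total = 0;
  for (let offset = 0; offset < file.size; offset += chunkSize) {
    // slice() is cheap; arrayBuffer() actually reads the bytes of that slice
    const buf = await file.slice(offset, offset + chunkSize).arrayBuffer();
    total += buf.byteLength;
  }
  return total; // equals file.size once every slice has been read
}
```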
Actual results:
Running the example shows that large files are not read fully when reading them in a web worker. The ReadableStreamDefaultReader.closed promise is also never resolved or rejected.
When running in a web worker, the last line printed is a 'Total Read' with the bytes read so far.
When running on the main thread, the last lines printed are 'stream closed' and 'done'.
Expected results:
The file should be read fully: ReadableStreamDefaultReader.read() should resolve with chunks until the file is fully read, the same as when reading on the main thread.
Or, if an error occurs, the read() promise should reject; it should not hang forever without resolving.
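The expected read-loop behavior can be exercised outside the browser: a minimal sketch running the same loop as the report against an in-memory ReadableStream instead of Blob.stream(). It assumes Node.js 18+, which exposes ReadableStream as a global; the chunk count and size are arbitrary illustrative values.

```javascript
// Minimal sketch of the expected behavior: the same read loop as in the
// report, run against an in-memory ReadableStream instead of Blob.stream().
async function readAll(stream) {
  const reader = stream.getReader();
  let total = 0;
  let res = await reader.read();
  while (!res.done) {
    total += res.value.length; // each chunk is a Uint8Array
    res = await reader.read();
  }
  return total; // resolves only once the stream reports done
}

// A stream of three 1 KiB chunks that closes cleanly.
function makeStream() {
  return new ReadableStream({
    start(controller) {
      for (let i = 0; i < 3; i++) controller.enqueue(new Uint8Array(1024));
      controller.close();
    },
  });
}

readAll(makeStream()).then((total) => console.log("done, total:", total));
// Expected: "done, total: 3072"
```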
Comment 2•4 years ago
Bugbug thinks this bug should belong to this component, but please revert this change in case of error.
Updated•4 years ago
Comment 3•3 years ago
Moving to DOM: Streams. We still need to check whether this is still a problem in the DOM Streams implementation, and close it if not.
Reporter
Comment 4•3 years ago
I just tested again in Firefox 97 on Mac, and the issue still occurs: it did not finish streaming a 53 MB file and got stuck around 50 MB.
Comment 5•3 years ago
I can confirm it's broken in Firefox 97 and fixed in Firefox Nightly 99 (with the new DOM Streams implementation 👍)
Could you check it also works on your machine in Firefox Nightly 99?
Comment 6•2 years ago
Redirect a needinfo that is pending on an inactive user to the triage owner.
:hsinyi, since the bug has high priority, could you have a look please?
For more information, please visit auto_nag documentation.
Comment 7•2 years ago
I think we can consider this fixed.