Closed
Bug 1950934
Opened 22 days ago
Closed 9 hours ago
Intermittent toolkit/components/ml/tests/browser/browser_ml_smart_tab_perf.js | Uncaught exception in test bound test_ml_smart_tab_embedding_peak_mem - at chrome://global/content/ml/ort.webgpu-dev.mjs:120 - Error: Error: no available backend found. ERR: [
Categories
(Core :: Machine Learning, defect, P5)
RESOLVED INCOMPLETE
People
(Reporter: intermittent-bug-filer, Unassigned)
Details
(Keywords: intermittent-failure)
Filed by: sstanca [at] mozilla.com
Parsed log: https://treeherder.mozilla.org/logviewer?job_id=496884114&repo=autoland
Full log: https://firefox-ci-tc.services.mozilla.com/api/queue/v1/task/YbhtgBWmQQKDbG_zl6xXaQ/runs/0/artifacts/public/logs/live_backing.log
[task 2025-02-27T16:56:40.767Z] TEST-PASS | toolkit/components/ml/tests/browser/browser_ml_smart_tab_perf.js | true == true -
[task 2025-02-27T16:56:40.767Z] GECKO(816) | Results (ms)
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-pipeline-ready-latency: 584 549 556 552 557 574 551 558 543 562 median 556.5
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-initialization-latency: 592 556 565 560 565 582 559 565 550 570 median 565
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-model-run-latency: 53 57 56 52 64 57 51 40 52 39 median 52.5
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-total-memory-usage: 249 250 252 248 249 249 252 251 249 250 median 249.5
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-e2e-init-latency: 731.6 697 699.8 702.1 702.3 728.9 698.7 705.9 693.4 711.5 median 702.2
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-1st-token-latency: 56.2 63.64 58.85 55.89 67.19 60.44 54.46 44.17 55.01 42.71 median 56.04
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-decoding-latency: 0.3932 0.21 0.1412 0.6404 0.309 0.2382 0.7402 2.991 0.3621 0.2197 median 0.3355
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-decoding-charactersSpeed: 0 0 0 0 0 0 0 0 0 0 median 0
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-decoding-tokenSpeed: 0 0 0 0 0 0 0 0 0 0 median 0
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-prompt-charactersSpeed: 5,890 5,201 5,624 5,922 4,927 5,477 6,078 7,495 6,017 7,750 median 5,906
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-prompt-tokenSpeed: 0 0 0 0 0 0 0 0 0 0 median 0
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-cold-start-pipeline-ready-latency: 635 median 635
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-cold-start-initialization-latency: 643 median 643
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-cold-start-model-run-latency: 58 median 58
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-cold-start-total-memory-usage: 251 median 251
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-cold-start-e2e-init-latency: 781.9 median 781.9
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-cold-start-1st-token-latency: 64.03 median 64.03
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-cold-start-decoding-latency: 0.1691 median 0.1691
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-cold-start-decoding-charactersSpeed: 0 median 0
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-cold-start-decoding-tokenSpeed: 0 median 0
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-cold-start-prompt-charactersSpeed: 5,169 median 5,169
[task 2025-02-27T16:56:40.767Z] GECKO(816) | SMART-TAB-EMBEDDING-cold-start-prompt-tokenSpeed: 0 median 0
[task 2025-02-27T16:56:40.767Z] perfMetrics | {"SMART-TAB-EMBEDDING-pipeline-ready-latency":556.5,"SMART-TAB-EMBEDDING-initialization-latency":565,"SMART-TAB-EMBEDDING-model-run-latency":52.5,"SMART-TAB-EMBEDDING-total-memory-usage":249.5,"SMART-TAB-EMBEDDING-e2e-init-latency":702.2058609999985,"SMART-TAB-EMBEDDING-1st-token-latency":56.04408350000085,"SMART-TAB-EMBEDDING-decoding-latency":0.33554650000223774,"SMART-TAB-EMBEDDING-decoding-charactersSpeed":0,"SMART-TAB-EMBEDDING-decoding-tokenSpeed":0,"SMART-TAB-EMBEDDING-prompt-charactersSpeed":5906.109309644089,"SMART-TAB-EMBEDDING-prompt-tokenSpeed":0,"SMART-TAB-EMBEDDING-cold-start-pipeline-ready-latency":635,"SMART-TAB-EMBEDDING-cold-start-initialization-latency":643,"SMART-TAB-EMBEDDING-cold-start-model-run-latency":58,"SMART-TAB-EMBEDDING-cold-start-total-memory-usage":251,"SMART-TAB-EMBEDDING-cold-start-e2e-init-latency":781.9439410000014,"SMART-TAB-EMBEDDING-cold-start-1st-token-latency":64.03447199999937,"SMART-TAB-EMBEDDING-cold-start-decoding-latency":0.1690710000002582,"SMART-TAB-EMBEDDING-cold-start-decoding-charactersSpeed":0,"SMART-TAB-EMBEDDING-cold-start-decoding-tokenSpeed":0,"SMART-TAB-EMBEDDING-cold-start-prompt-charactersSpeed":5169.090798468726,"SMART-TAB-EMBEDDING-cold-start-prompt-tokenSpeed":0}
[task 2025-02-27T16:56:40.767Z] Leaving test bound test_ml_smart_tab_embedding
[task 2025-02-27T16:56:40.767Z] Entering test bound test_ml_smart_tab_embedding_peak_mem
[task 2025-02-27T16:56:40.767Z] is request null | false
[task 2025-02-27T16:56:40.767Z] runInference is request null | false
[task 2025-02-27T16:56:40.767Z] Model Directory: /opt/worker/tasks/task_174067524985513/fetches/onnx-models
[task 2025-02-27T16:56:40.767Z] ModelHubRootUrl: http://localhost:49211
[task 2025-02-27T16:56:40.767Z] Get the engine process
[task 2025-02-27T16:56:40.767Z] Get Pipeline Options
[task 2025-02-27T16:56:40.767Z] Run the inference
[task 2025-02-27T16:56:40.767Z] runInference is request null | false
[task 2025-02-27T16:56:40.767Z] Model Directory: /opt/worker/tasks/task_174067524985513/fetches/onnx-models
[task 2025-02-27T16:56:40.767Z] ModelHubRootUrl: http://localhost:49211
[task 2025-02-27T16:56:40.767Z] Get the engine process
[task 2025-02-27T16:56:40.767Z] Get Pipeline Options
[task 2025-02-27T16:56:40.767Z] Run the inference
[task 2025-02-27T16:56:40.767Z] runInference is request null | false
[task 2025-02-27T16:56:40.767Z] Model Directory: /opt/worker/tasks/task_174067524985513/fetches/onnx-models
[task 2025-02-27T16:56:40.767Z] ModelHubRootUrl: http://localhost:49211
[task 2025-02-27T16:56:40.767Z] Get the engine process
[task 2025-02-27T16:56:40.767Z] Get Pipeline Options
[task 2025-02-27T16:56:40.767Z] Run the inference
[task 2025-02-27T16:56:40.767Z] runInference is request null | false
[task 2025-02-27T16:56:40.767Z] Model Directory: /opt/worker/tasks/task_174067524985513/fetches/onnx-models
[task 2025-02-27T16:56:40.767Z] ModelHubRootUrl: http://localhost:49211
[task 2025-02-27T16:56:40.767Z] Get the engine process
[task 2025-02-27T16:56:40.767Z] Get Pipeline Options
[task 2025-02-27T16:56:40.767Z] Run the inference
[task 2025-02-27T16:56:40.767Z] runInference is request null | false
[task 2025-02-27T16:56:40.767Z] Model Directory: /opt/worker/tasks/task_174067524985513/fetches/onnx-models
[task 2025-02-27T16:56:40.767Z] ModelHubRootUrl: http://localhost:49211
[task 2025-02-27T16:56:40.767Z] Get the engine process
[task 2025-02-27T16:56:40.767Z] Get Pipeline Options
[task 2025-02-27T16:56:40.767Z] Run the inference
[task 2025-02-27T16:56:40.767Z] runInference is request null | false
[task 2025-02-27T16:56:40.767Z] Model Directory: /opt/worker/tasks/task_174067524985513/fetches/onnx-models
[task 2025-02-27T16:56:40.767Z] ModelHubRootUrl: http://localhost:49211
[task 2025-02-27T16:56:40.767Z] Get the engine process
[task 2025-02-27T16:56:40.767Z] Get Pipeline Options
[task 2025-02-27T16:56:40.767Z] Run the inference
[task 2025-02-27T16:56:40.767Z] runInference is request null | false
[task 2025-02-27T16:56:40.767Z] Model Directory: /opt/worker/tasks/task_174067524985513/fetches/onnx-models
[task 2025-02-27T16:56:40.767Z] ModelHubRootUrl: http://localhost:49211
[task 2025-02-27T16:56:40.767Z] Get the engine process
[task 2025-02-27T16:56:40.767Z] Get Pipeline Options
[task 2025-02-27T16:56:40.767Z] Run the inference
[task 2025-02-27T16:56:40.767Z] runInference is request null | false
[task 2025-02-27T16:56:40.767Z] Model Directory: /opt/worker/tasks/task_174067524985513/fetches/onnx-models
[task 2025-02-27T16:56:40.767Z] ModelHubRootUrl: http://localhost:49211
[task 2025-02-27T16:56:40.767Z] Get the engine process
[task 2025-02-27T16:56:40.767Z] Get Pipeline Options
[task 2025-02-27T16:56:40.767Z] Run the inference
[task 2025-02-27T16:56:40.767Z] runInference is request null | false
[task 2025-02-27T16:56:40.767Z] Model Directory: /opt/worker/tasks/task_174067524985513/fetches/onnx-models
[task 2025-02-27T16:56:40.767Z] ModelHubRootUrl: http://localhost:49211
[task 2025-02-27T16:56:40.767Z] Get the engine process
[task 2025-02-27T16:56:40.767Z] Get Pipeline Options
[task 2025-02-27T16:56:40.767Z] Run the inference
[task 2025-02-27T16:56:40.767Z] runInference is request null | false
[task 2025-02-27T16:56:40.767Z] Model Directory: /opt/worker/tasks/task_174067524985513/fetches/onnx-models
[task 2025-02-27T16:56:40.767Z] ModelHubRootUrl: http://localhost:49211
[task 2025-02-27T16:56:40.767Z] Get the engine process
[task 2025-02-27T16:56:40.767Z] GECKO(816) | console.error: "failed to asynchronously prepare wasm: CompileError: wasm validation error: at offset 5375336: table index out of range for table.get"
[task 2025-02-27T16:56:40.767Z] GECKO(816) | console.error: "Aborted(CompileError: wasm validation error: at offset 5375336: table index out of range for table.get)"
[task 2025-02-27T16:56:40.767Z] GECKO(816) | JavaScript error: chrome://global/content/ml/ort.webgpu-dev.mjs, line 120: Error: no available backend found. ERR: [wasm] RuntimeError: Aborted(CompileError: wasm validation error: at offset 5375336: table index out of range for table.get). Build with -sASSERTIONS for more info.
[task 2025-02-27T16:56:40.767Z] GECKO(816) | console.error: ML:EngineChild: "Could not initalize the engine" (new Error("Error: no available backend found. ERR: [wasm] RuntimeError: Aborted(CompileError: wasm validation error: at offset 5375336: table index out of range for table.get). Build with -sASSERTIONS for more info.", "chrome://global/content/ml/ort.webgpu-dev.mjs", 120))
[task 2025-02-27T16:56:40.767Z] TEST-UNEXPECTED-FAIL | toolkit/components/ml/tests/browser/browser_ml_smart_tab_perf.js | Uncaught exception in test bound test_ml_smart_tab_embedding_peak_mem - at chrome://global/content/ml/ort.webgpu-dev.mjs:120 - Error: Error: no available backend found. ERR: [wasm] RuntimeError: Aborted(CompileError: wasm validation error: at offset 5375336: table index out of range for table.get). Build with -sASSERTIONS for more info.
[task 2025-02-27T16:56:40.767Z] Stack trace:
[task 2025-02-27T16:56:40.767Z] postMessage@resource://gre/modules/PromiseWorker.sys.mjs:386:17
[task 2025-02-27T16:56:40.767Z] async*post@resource://gre/modules/PromiseWorker.sys.mjs:425:17
[task 2025-02-27T16:56:40.767Z] create@resource://gre/actors/MLEngineChild.sys.mjs:779:18
[task 2025-02-27T16:56:40.767Z] initializeInferenceEngine@resource://gre/actors/MLEngineChild.sys.mjs:362:28
[task 2025-02-27T16:56:40.767Z] async*EngineDispatcher@resource://gre/actors/MLEngineChild.sys.mjs:387:25
[task 2025-02-27T16:56:40.767Z] #onNewPortCreated@resource://gre/actors/MLEngineChild.sys.mjs:185:26
[task 2025-02-27T16:56:40.767Z] receiveMessage@resource://gre/actors/MLEngineChild.sys.mjs:116:37
[task 2025-02-27T16:56:40.767Z] JSActor query*setupPortCommunication@resource://gre/actors/MLEngineParent.sys.mjs:839:25
[task 2025-02-27T16:56:40.767Z] initialize@resource://gre/actors/MLEngineParent.sys.mjs:768:20
[task 2025-02-27T16:56:40.767Z] getEngine@resource://gre/actors/MLEngineParent.sys.mjs:183:37
[task 2025-02-27T16:56:40.767Z] initializeEngine@chrome://mochitests/content/browser/toolkit/components/ml/tests/browser/head.js:335:39
[task 2025-02-27T16:56:40.767Z] async*runInference@chrome://mochitests/content/browser/toolkit/components/ml/tests/browser/head.js:462:50
[task 2025-02-27T16:56:40.767Z] perfTest@chrome://mochitests/content/browser/toolkit/components/ml/tests/browser/head.js:650:25
[task 2025-02-27T16:56:40.767Z] async*testEmbedding@chrome://mochitests/content/browser/toolkit/components/ml/tests/browser/browser_ml_smart_tab_perf.js:126:9
[task 2025-02-27T16:56:40.767Z] test_ml_smart_tab_embedding_peak_mem@chrome://mochitests/content/browser/toolkit/components/ml/tests/browser/browser_ml_smart_tab_perf.js:142:9
[task 2025-02-27T16:56:40.767Z] handleTask@chrome://mochikit/content/browser-test.js:1170:26
[task 2025-02-27T16:56:40.767Z] _runTaskBasedTest@chrome://mochikit/content/browser-test.js:1242:18
[task 2025-02-27T16:56:40.767Z] async*Tester_execTest@chrome://mochikit/content/browser-test.js:1383:14
[task 2025-02-27T16:56:40.767Z] nextTest/<@chrome://mochikit/content/browser-test.js:1159:14
[task 2025-02-27T16:56:40.767Z] SimpleTest.waitForFocus/<@chrome://mochikit/content/tests/SimpleTest/SimpleTest.js:1058:13
[task 2025-02-27T16:56:40.767Z] Leaving test bound test_ml_smart_tab_embedding_peak_mem
[task 2025-02-27T16:56:40.767Z] Console message: [JavaScript Error: "Error: no available backend found. ERR: [wasm] RuntimeError: Aborted(CompileError: wasm validation error: at offset 5375336: table index out of range for table.get). Build with -sASSERTIONS for more info." {file: "chrome://global/content/ml/ort.webgpu-dev.mjs" line: 120}]
[task 2025-02-27T16:56:40.767Z] GECKO(816) | MEMORY STAT vsizeMaxContiguous not supported in this build configuration.
[task 2025-02-27T16:56:40.767Z] GECKO(816) | MEMORY STAT | vsize 17185MB | residentFast 383MB | heapAllocated 174MB
[task 2025-02-27T16:56:40.767Z] TEST-OK | toolkit/components/ml/tests/browser/browser_ml_smart_tab_perf.js | took 36177ms
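For context on where this error string originates: ort.webgpu-dev.mjs is the in-tree build of onnxruntime-web, and "no available backend found" is what InferenceSession.create() rejects with when every requested execution provider fails to initialize; here the only candidate is the wasm provider, which aborts with the CompileError shown above. The sketch below is illustrative only, using the public onnxruntime-web API with a placeholder model path and settings, and is not the test's actual setup.

// Illustrative sketch only (placeholder model path and configuration), showing
// how ORT surfaces "no available backend found" when its sole requested
// execution provider (wasm) fails to initialize, as in the log above.
import * as ort from "onnxruntime-web";

ort.env.wasm.numThreads = 1; // placeholder configuration

try {
  // create() resolves a backend from the executionProviders list; if every
  // candidate fails (here: the wasm binary aborts with a CompileError), the
  // returned promise rejects with "no available backend found. ERR: [wasm] ...".
  await ort.InferenceSession.create("model.onnx", {
    executionProviders: ["wasm"],
  });
} catch (e) {
  // This is the rejection MLEngineChild reports as "Could not initalize the engine".
  console.error(e);
}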
Comment 1•9 hours ago
https://wiki.mozilla.org/Bug_Triage#Intermittent_Test_Failure_Cleanup
For more information, please visit BugBot documentation.
Status: NEW → RESOLVED
Closed: 9 hours ago
Resolution: --- → INCOMPLETE