Closed Bug 1976704
Opened 4 months ago
Closed 3 months ago

Enable wllama for extensions

Categories: Core :: Machine Learning, enhancement
Status: RESOLVED FIXED
Target Milestone: 142 Branch
People: (Reporter: atossou, Assigned: atossou)
Whiteboard: [ai-runtime]
Attachments: (1 file)
Description
Currently we can't use wllama with extensions because the run request is not compatible. This should fix it.
Updated•4 months ago
Comment 1•4 months ago
Pushed by atossou@mozilla.com:
https://github.com/mozilla-firefox/firefox/commit/c2c0fbe71f1c
https://hg.mozilla.org/integration/autoland/rev/7671714f3e9e
Enable wllama for extensions - r=ngrato,extension-reviewers,rpl
Comment 3•3 months ago (bugherder)
Status: ASSIGNED → RESOLVED
Closed: 3 months ago
status-firefox142: --- → fixed
Resolution: --- → FIXED
Target Milestone: --- → 142 Branch
Comment 4•3 months ago
:atossou, is this something you want to mention in the Fx142 release notes? Please nominate if so.
Flags: needinfo?(atossou)
Comment 5•3 months ago
Does this sound okay?
Firefox now supports the wllama API for extensions, enabling developers to integrate local language model (LLM) capabilities directly into their add-ons.
relnote-firefox: --- → ?
Updated•3 months ago
QA Whiteboard: [qa-triage-done-c143/b142]
Updated•3 months ago
Comment 6•3 months ago
(In reply to Dianna Smith [:diannaS] from comment #5)
> Does this sound okay?
> Firefox now supports the wllama API for extensions, enabling developers to integrate local language model (LLM) capabilities directly into their add-ons.
This sounds perfect. Thank you.
Flags: needinfo?(atossou)
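
For context on what the release note describes, here is a minimal sketch of how an extension might drive the wllama backend through the browser.trial.ml WebExtension API once this fix is in place. The "trialML" permission requirement, the option names (taskName, backend, modelId), the "wllama" backend identifier, and the model id shown are assumptions for illustration and may not match the shipped API.

```js
// Sketch only, not the authoritative API: assumes the browser.trial.ml
// WebExtension namespace (gated behind the "trialML" permission in
// manifest.json). Option names and values below are illustrative.

async function runLocalLlm(prompt) {
  // Log model download / setup progress events while the engine is prepared.
  browser.trial.ml.onProgress.addListener((progressData) => {
    console.log("ML runtime progress:", progressData);
  });

  // Create an inference engine; "wllama" (llama.cpp compiled to WebAssembly)
  // is assumed here to be selectable as the backend for text generation.
  await browser.trial.ml.createEngine({
    taskName: "text-generation",          // assumed task name
    backend: "wllama",                    // assumed backend identifier
    modelId: "Mozilla/example-llm-GGUF",  // hypothetical model id
  });

  // Issue the run request; before this bug, a request like this from an
  // extension was not compatible with the wllama backend.
  return browser.trial.ml.runEngine({ args: [prompt] });
}

// Example use from an extension background script:
// runLocalLlm("Summarize this page in one sentence.").then(console.log);
```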