Closed Bug 1976704 Opened 4 months ago Closed 3 months ago

Enable wllama for extensions

Categories

(Core :: Machine Learning, enhancement)


Tracking


Status: RESOLVED FIXED
Target Milestone: 142 Branch

Tracking Status:
relnote-firefox: 142+
firefox142: fixed

People

(Reporter: atossou, Assigned: atossou)

References

Details

(Whiteboard: [ai-runtime])

Attachments

(1 file)

Currently we can't use wllama with extensions because the run request is not compatible. This patch should fix it.
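
For illustration, a rough sketch of how an extension could drive a wllama-backed engine once this lands. The createEngine/runEngine/onProgress calls and the "trialML" manifest permission are the documented browser.trial.ml extension surface; the backend, taskName, and model values below are assumptions for illustration, not necessarily the exact request shape this patch enables.

// manifest.json (excerpt): the "trialML" permission gates browser.trial.ml
//   { "permissions": ["trialML"] }

// Background script (JavaScript).
async function runLocalLlm(prompt) {
  // Log model download progress while the engine is being prepared.
  browser.trial.ml.onProgress.addListener((data) => {
    console.log("ml progress", data);
  });

  // Ask the runtime for a wllama-backed text-generation engine.
  // The "backend", "taskName", and "modelId" values here are assumptions.
  await browser.trial.ml.createEngine({
    backend: "wllama",
    taskName: "text-generation",
    modelHub: "huggingface",
    modelId: "Mozilla/Example-LLM-GGUF", // hypothetical model id
  });

  // Send the run request (the call whose request format this bug addresses).
  const result = await browser.trial.ml.runEngine({
    args: [prompt],
  });
  return result;
}

runLocalLlm("Summarize this page in one sentence.").then(console.log);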

Status: ASSIGNED → RESOLVED
Closed: 3 months ago
Resolution: --- → FIXED
Target Milestone: --- → 142 Branch

:atossou, is this something you want to mention in the Fx142 release notes? Please nominate if so.

Flags: needinfo?(atossou)

Does this sound okay?

Firefox now supports the wllama API for extensions, enabling developers to integrate local language model (LLM) capabilities directly into their add-ons.

relnote-firefox: --- → ?
QA Whiteboard: [qa-triage-done-c143/b142]

(In reply to Dianna Smith [:diannaS] from comment #5)
> Does this sound okay?
>
> Firefox now supports the wllama API for extensions, enabling developers to integrate local language model (LLM) capabilities directly into their add-ons.

This sounds perfect. Thank you.

Flags: needinfo?(atossou)