Closed
Bug 1940479
Opened 1 month ago
Closed 1 month ago
[larch] Prompt is not passed to the model for simplifier model inference
Categories
(Core :: Machine Learning, defect)
Tracking
()
RESOLVED
FIXED
People
(Reporter: txia, Assigned: txia)
References
(Depends on 1 open bug, Blocks 3 open bugs)
Details
(Whiteboard: [genai])
Attachments
(1 file)
When running with taskName simplifier and the following config, the model input does not include the prompt:

{
  "label": "Simplify with Control Token Joint",
  "targeting": "true",
  "value": "",
  "key": "browser.ml.chat.prompts.simplifier",
  "dtype": "fp32",
  "max_new_tokens": 200,
  "modelHubRootUrl": "https://huggingface.co",
  "modelHubUrlTemplate": "{model}/resolve/{revision}",
  "modelId": "tarekziade/Control_Token_Joint",
  "tokenizerId": "tarekziade/Control_Token_Joint",
  "tokenizerRevision": "main",
  "modelRevision": "main",
  "taskName": "simplifier",
  "prompt": "<DEPENDENCYTREEDEPTHRATIO_0.35>,<WORDRANKRATIO_0.85>,<REPLACEONLYLEVENSHTEIN_0.8>, <LENGTHRATIO_0.8>",
  "counter": 3,
  "id": null
}
Updated•1 month ago
- Change the function name to runSimplifierWithAsyncGenerator
- If a prompt is defined, use it as a prefix to the sentence chunk
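The fix described above can be sketched roughly as follows. This is a hypothetical illustration only: the function name comes from the commit notes, but the signature, the options object, and the prompt/chunk handling are assumptions, not the actual Firefox source.

```javascript
// Hypothetical sketch of the simplifier fix: prefix each sentence chunk
// with the configured prompt (the control tokens from the config above)
// before it is handed to the model.
async function* runSimplifierWithAsyncGenerator(chunks, options = {}) {
  for await (const chunk of chunks) {
    // If a prompt is defined, the model should see
    // "<CONTROL_TOKENS> sentence" rather than just "sentence".
    yield options.prompt ? `${options.prompt} ${chunk}` : chunk;
  }
}
```

With the bug present, the prompt field of the config was ignored, so the control tokens never reached the model; after the change, every chunk carries the prefix when one is configured.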
Updated•1 month ago
Assignee: nobody → txia
Status: NEW → ASSIGNED
Summary: Prompt is not passed to the model for simplifier model inference → [larch] Prompt is not passed to the model for simplifier model inference
Comment 2•1 month ago
Status: ASSIGNED → RESOLVED
Closed: 1 month ago
Resolution: --- → FIXED