Closed
Bug 1278614
Opened 8 years ago
Closed 8 years ago
emoji-style keycap sequences <digit, VS16, U+20e3> fail to render properly on Linux
Categories
(Core :: Graphics: Text, defect)
Tracking
Status: RESOLVED FIXED
Target Milestone: mozilla50

Tracking | Status
---|---
firefox50 | fixed
People
(Reporter: jfkthame, Assigned: jfkthame)
References
Details
(Whiteboard: [gfx-noted])
Attachments
(1 file)
Using the EmojiOne Mozilla font (see bug 1231701), sequences such as <U+0030, U+FE0F, U+20E3> are expected to render an "emoji-style" (colored) keycap glyph. This works on OS X, but fails on Linux (with the patch from bug 1276594, required to support the EmojiOne Mozilla font at all). This can be seen on https://people.mozilla.org/~jkew/tests/emoji-list.html (search for "keycap" to find examples). AFAICT, this can be worked around by adding a further VS16 after the combining keycap symbol: <U+0030, U+FE0F, U+20E3, U+FE0F> but this should not be needed.
Updated•8 years ago
Whiteboard: [gfx-noted]
Assignee
Comment 1•8 years ago
Hmm, not entirely sure what's going on here. On re-testing just now, I can't seem to reproduce the earlier behavior whereby adding a trailing VS16 caused it to render the keycap properly; it now remains stubbornly broken. (I.e. the emoji-style digit is rendered, but not ligated to form a keycap glyph; the U+20e3 is simply invisible).
Assignee
Comment 2•8 years ago
The problem here is that gfxFT2FontBase doesn't handle variation selectors in GetGlyph in the way HarfBuzz expects. The proper behavior is to return 0 if the VS is not specifically supported, and then let HB call GetGlyph again for the base character alone, rather than doing the fallback entirely within the single GetGlyph call. With this, the keycaps in EmojiOne Mozilla should render properly. Try build: https://treeherder.mozilla.org/#/jobs?repo=try&revision=2e3a774da6879be27899f26370aa3f39a80d5b95.
Attachment #8760944 - Flags: review?(karlt)
Assignee
Updated•8 years ago
Assignee: nobody → jfkthame
Status: NEW → ASSIGNED
Comment 3•8 years ago
This makes the equivalent change to https://hg.mozilla.org/mozilla-central/diff/e087b5e21077/gfx/thebes/gfxHarfBuzzShaper.cpp
Blocks: 910506
Comment 4•8 years ago
Comment on attachment 8760944 [details] [diff] [review]
The two-argument GetGlyph method in gfxFT2FontBase should return 0 if the VS is unsupported, not the default glyph from the cmap; Harfbuzz will then call it again with no variation selector to handle the fallback rendering

(In reply to Jonathan Kew (:jfkthame) from comment #2)
> Created attachment 8760944 [details] [diff] [review]
> The two-argument GetGlyph method in gfxFT2FontBase should return 0 if the VS
> is unsupported, not the default glyph from the cmap; Harfbuzz will then call
> it again with no variation selector to handle the fallback rendering
>
> The problem here is that gfxFT2FontBase doesn't handle variation selectors
> in GetGlyph in the way HarfBuzz expects. The proper behavior is to return 0
> if the VS is not specifically supported, and then let HB call GetGlyph again
> for the base character alone, rather than doing the fallback entirely within
> the single GetGlyph call.

Harfbuzz is not simply calling it again with no variation selector, or there would be no change in behaviour here. https://bugzilla.mozilla.org/show_bug.cgi?id=910506#c2 and https://bugs.freedesktop.org/show_bug.cgi?id=65258#c10 explain a likely reason for the change in behaviour.

Please update the commit message to indicate this. Usually the first line is in imperative mood rather than subjunctive, to describe what actually changed rather than what should change. If there is not enough room to describe that and why in the first line of the message, then further lines can explain in more detail.
>   if (variation_selector) {
>     uint32_t id =
>       gfxFT2LockedFace(this).GetUVSGlyph(unicode, variation_selector);
>-    if (id)
>+    if (id) {
>       return id;
>+    }
>     id = gfxFontUtils::GetUVSFallback(unicode, variation_selector);
>     if (id) {
>-      unicode = id;
>+      return GetGlyph(id);
>     }
>+    return 0;
>   }
>
>   return GetGlyph(unicode);

Storing a unicode character reference as a glyph id is an existing issue making this code hard to follow, but now there is no need to avoid overwriting |unicode|. Would you mind assigning the rv from GetUVSFallback() to |unicode| please and passing that to GetGlyph? Initializing a new variable |compat| would be an alternative.
Attachment #8760944 - Flags: review?(karlt) → review+
Assignee
Comment 5•8 years ago
(In reply to Karl Tomlinson (ni?:karlt) from comment #4)
> Comment on attachment 8760944 [details] [diff] [review]
> The two-argument GetGlyph method in gfxFT2FontBase should return 0 if the VS
> is unsupported, not the default glyph from the cmap; Harfbuzz will then call
> it again with no variation selector to handle the fallback rendering
>
> (In reply to Jonathan Kew (:jfkthame) from comment #2)
> > Created attachment 8760944 [details] [diff] [review]
> > The two-argument GetGlyph method in gfxFT2FontBase should return 0 if the VS
> > is unsupported, not the default glyph from the cmap; Harfbuzz will then call
> > it again with no variation selector to handle the fallback rendering
> >
> > The problem here is that gfxFT2FontBase doesn't handle variation selectors
> > in GetGlyph in the way HarfBuzz expects. The proper behavior is to return 0
> > if the VS is not specifically supported, and then let HB call GetGlyph again
> > for the base character alone, rather than doing the fallback entirely within
> > the single GetGlyph call.
>
> Harfbuzz is not simply calling it again with no variation selector,
> or there would be no change in behaviour here.

To clarify, the change in behavior happens because when we return 0 for the <char, vs> pair here, harfbuzz will call GetGlyph again for the base character alone, and then for the variation selector alone. So it ends up with two glyph codes in its buffer, and the font may then use GSUB ligation to process these further. So that's different from the old behavior where the initial GetGlyph silently "consumes" the variation selector if it wasn't supported directly by the cmap.
Assignee
Comment 6•8 years ago
https://hg.mozilla.org/integration/mozilla-inbound/rev/c841cbb3bee8b7c7fb1221e5d726616085b704cc
Bug 1278614 - Update the freetype-based GetGlyph method to conform to the expectations of the HarfBuzz callbacks, returning 0 for unsupported <Base, Variation-Selector> sequences instead of automatically returning the default mapping and ignoring the VS. r=karlt
Comment 7•8 years ago
(In reply to Jonathan Kew (:jfkthame) from comment #5)
> To clarify, the change in behavior happens because when we return 0 for the
> <char, vs> pair here, harfbuzz will call GetGlyph again for the base
> character alone, and then for the variation selector alone. So it ends up
> with two glyph codes in its buffer, and the font may then use GSUB ligation
> to process these further. So that's different from the old behavior where
> the initial GetGlyph silently "consumes" the variation selector if it wasn't
> supported directly by the cmap.

Ah. That's helpful, thanks!
Updated•8 years ago
Flags: in-testsuite?
Comment 8•8 years ago
bugherder
https://hg.mozilla.org/mozilla-central/rev/c841cbb3bee8
Status: ASSIGNED → RESOLVED
Closed: 8 years ago
status-firefox50: --- → fixed
Resolution: --- → FIXED
Target Milestone: --- → mozilla50