Closed Bug 1307158 Opened 8 years ago Closed 2 years ago

Subpixel AA turns off on some pages only in Firefox 51

Categories

(Core :: Graphics: Text, defect, P3)

51 Branch
x86_64
Linux
defect

Tracking


RESOLVED WORKSFORME
Tracking Status
firefox49 --- unaffected
firefox-esr45 --- unaffected
firefox50 --- unaffected
firefox51 --- fix-optional
firefox52 --- fix-optional

People

(Reporter: eugpavl, Unassigned)

References

Details

(Keywords: regression, Whiteboard: [gfx-noted], QA-not-reproducible)

Attachments

(2 files)

Attached image firefox51.png
User Agent: Mozilla/5.0 (X11; Linux x86_64; rv:49.0) Gecko/20100101 Firefox/49.0
Build ID: 20160923225245

Steps to reproduce:

Open this medium page for example: https://medium.com/@evnbr/coding-in-color-3a6db2743a1e#.5vj52da72


Actual results:

subpixel AA has been turned off


Expected results:

subpixel AA should have stayed turned on
Attached image firefox49.png
the same page in Firefox 49
OS: Unspecified → Linux
Hardware: Unspecified → x86_64
What about Firefox 52 (Nightly)?
Is it also affected?
Component: Untriaged → Graphics: Text
Flags: needinfo?(evgeniy.pavlov)
Product: Firefox → Core
nightly/52 unaffected
beta/50 unaffected
Hmm, odd that it only affects Firefox 51.
Severity: normal → major
Status: UNCONFIRMED → NEW
Has Regression Range: --- → no
Has STR: --- → yes
Ever confirmed: true
Summary: Subpixel AA turns off on some pages → Subpixel AA turns off on some pages only in Firefox 51
Priority: -- → P3
Whiteboard: [gfx-noted]
Eugene, can you use mozregression to find out either what caused this regression or what fixed it?
Didn't know about mozregression, looks simple enough to try. Will post the results next week.
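(For anyone following along: the bisections reported below were driven by command lines roughly like the one here; --good and --bad accept dates or changesets, and adding --find-fix makes the tool search for the fix instead of the regression. The dates are illustrative, not necessarily the ones actually used:

mozregression --good 2015-09-01 --bad 2016-10-01
)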
According to mozregression it was introduced as far back as a year ago, and indeed it reproduces both on a fresh profile and on my own profile when used with mozregression.
28:50.34 INFO: Last good revision: 891ee0d0ba3ec42b6484cf0205b3c95e21c58f74 (2015-09-30)
28:50.34 INFO: First bad revision: 096c0f407f8ba3ef7cfe4e0b831761993cac38b1 (2015-10-01)
28:50.34 INFO: Pushlog:
https://hg.mozilla.org/mozilla-central/pushloghtml?fromchange=891ee0d0ba3ec42b6484cf0205b3c95e21c58f74&tochange=096c0f407f8ba3ef7cfe4e0b831761993cac38b1

I don't know if this is normal, but after
28:50.34 INFO: Switching bisection method to taskcluster

it crashed with
Exception in thread Thread-64:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/usr/lib/python2.7/site-packages/mozregression/build_range.py", line 94, in __getitem__
    return self._future_build_infos[i].build_info
  File "/usr/lib/python2.7/site-packages/mozregression/build_range.py", line 41, in build_info
    self._build_info = self._fetch()
  File "/usr/lib/python2.7/site-packages/mozregression/build_range.py", line 35, in _fetch
    return self.build_info_fetcher.find_build_info(self.data)
  File "/usr/lib/python2.7/site-packages/mozregression/fetch_build_info.py", line 143, in find_build_info
    status = self.queue.status(task_id)['status']
  File "/usr/lib/python2.7/site-packages/taskcluster/sync/Queue.py", line 88, in status
    return self.makeHttpRequest('get', route)
  File "/usr/lib/python2.7/site-packages/taskcluster/sync/syncclient.py", line 60, in makeHttpRequest
    return self._makeHttpRequest(method, url, payload, headers)
  File "/usr/lib/python2.7/site-packages/taskcluster/sync/syncclient.py", line 92, in _makeHttpRequest
    self._raiseHttpError(status, data, rerr)
  File "/usr/lib/python2.7/site-packages/taskcluster/baseclient.py", line 337, in _raiseHttpError
    superExc=rerr
TaskclusterRestFailure: R97zjoyzTsyIcevKo0UMXQ does not correspond to a task that exists.
Are you sure this task exists?
----
errorCode:  ResourceNotFound
statusCode: 404
requestInfo:
  method:   status
  params:   {"taskId":"R97zjoyzTsyIcevKo0UMXQ"}
  payload:  {}
  time:     2016-10-17T10:11:33.444Z
details:
{
  "taskId": "R97zjoyzTsyIcevKo0UMXQ"
} - {
    "requestInfo": {
        "time": "2016-10-17T10:11:33.444Z",
        "params": {
            "taskId": "R97zjoyzTsyIcevKo0UMXQ"
        },
        "method": "status",
        "payload": {}
    },
    "message": "R97zjoyzTsyIcevKo0UMXQ does not correspond to a task that exists.\nAre you sure this task exists?\n----\nerrorCode:  ResourceNotFound\nstatusCode: 404\nrequestInfo:\n  method:   status\n  params:   {\"taskId\":\"R97zjoyzTsyIcevKo0UMXQ\"}\n  payload:  {}\n  time:     2016-10-17T10:11:33.444Z\ndetails:\n{\n  \"taskId\": \"R97zjoyzTsyIcevKo0UMXQ\"\n}",
    "code": "ResourceNotFound",
    "details": {
        "taskId": "R97zjoyzTsyIcevKo0UMXQ"
    }
}
mozregression --find-fix report:

14:20.12 INFO: Narrowed inbound regression window from [23c2ec55, 30fa88c8] (3 revisions) to [8ef9629d, 30fa88c8] (2 revisions) (~1 steps left)
14:20.12 INFO: Oh noes, no (more) inbound revisions :(
14:20.12 INFO: First good revision: 30fa88c82366a358f412ca2cc3268142c8991f0d
14:20.12 INFO: Last bad revision: 8ef9629d8f90d6507b1bad01146b14101de79174
14:20.12 INFO: Pushlog:
https://hg.mozilla.org/mozilla-central/pushloghtml?fromchange=8ef9629d8f90d6507b1bad01146b14101de79174&tochange=30fa88c82366a358f412ca2cc3268142c8991f0d

14:23.19 INFO: ************* Switching to autoland by process of elimination (no branch detected in commit message)
14:23.19 INFO: Getting autoland builds between 60f05a3d215f726f21ce60c8e7f7f7dc66265857 and 30fa88c82366a358f412ca2cc3268142c8991f0d
14:23.74 ERROR: The url u'https://hg.mozilla.org/integration/autoland/json-pushes?fromchange=30fa88c82366a358f412ca2cc3268142c8991f0d&tochange=60f05a3d215f726f21ce60c8e7f7f7dc66265857' contains no pushlog. Maybe use another range ?
My guess is APZ on Linux (bug 1143856). Can you go to about:config, set the preference "layers.async-pan-zoom.enabled" to false, and restart? Does it still happen?
Setting "layers.async-pan-zoom.enabled" to false fixes it in Firefox 51, toggling it fixes the bug in earlier versions.
Ok, so this regressed with APZ (bug 1143856) but was probably fixed by bug 594876. However, the fix is #ifdef NIGHTLY only, which is why 52 is currently unaffected. It will be affected once it moves up to Aurora.
Andrew, is there a bug tracking bug 594876 riding the trains?
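(To spell out the "#ifdef NIGHTLY only" part: the fix is compiled in only when the Nightly channel define is set, so it disappears once the same code ships on Aurora/Beta/Release. A purely illustrative sketch with a made-up function name, not the actual Gecko source:

// Illustrative only: a compile-time channel gate of this kind.
// NIGHTLY_BUILD is defined only for Nightly builds.
static bool ShouldEnableTheFix()
{
#ifdef NIGHTLY_BUILD
  return true;   // fix active on Nightly, so 52 looks unaffected right now
#else
  return false;  // compiled out on Aurora/Beta, so 52 regresses when it moves up
#endif
}
)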
Flags: needinfo?(evgeniy.pavlov) → needinfo?(andrew)
We don't have a "Linux acceleration rides the trains" bug at this point.
Flags: needinfo?(andrew)

Hello! I have tried to reproduce the issue with Firefox 96.0a1 (2021), 95.0b7 and 94.0.1 on Ubuntu 20, but unfortunately I wasn't able to reproduce it. Adding the QA-not-reproducible tag and marking this issue as RESOLVED-WORKSFORME.

If this issue is still valid please feel free to reopen this bug or file another one.

Thank you!

Status: NEW → RESOLVED
Closed: 2 years ago
Resolution: --- → WORKSFORME
Whiteboard: [gfx-noted] → [gfx-noted], QA-not-reproducible