
Higher memory consumption on Nightly/Linux 64bit since July 24

Status: RESOLVED INVALID
Product: Mozilla QA
Component: Mozmill Tests
Reported: 5 years ago
Last modified: 5 years ago

People

(Reporter: mihaelav, Assigned: davehunt)

Tracking

Firefox Tracking Flags

(Not tracked)

Details

(Whiteboard: [mozmill-endurance])

(Reporter)

Description

5 years ago
Higher memory consumption has been reported by the endurance tests since July 24 on the Nightly (17.0) branch for Linux x64.

Explicit memory (MB) July 19: Minimum: 61 / Maximum: 120 / Average: 76
Explicit memory (MB) July 24: Minimum: 70 / Maximum: 156 / Average: 104

Resident memory (MB) July 19: Minimum: 102 / Maximum: 146 / Average: 117
Resident memory (MB) July 24: Minimum: 117 / Maximum: 188 / Average: 146

Charts: http://mozmill-ci.blargon7.com/#/endurance/charts?branch=17.0&platform=Linux&from=2012-07-17&to=2012-07-27

Reports:
July 19: http://mozmill-ci.blargon7.com/#/endurance/report/89726f6b98208a209e7ce2df1029b0b7
July 24: http://mozmill-ci.blargon7.com/#/endurance/report/27072cb61461b83e1447b1979b0e6fd3

Notes: 
1. Tests were not run between July 19 and 24
2. Reproducible locally:
July 19: http://mozmill-crowd.blargon7.com/#/endurance/report/a6839eac670d27cd915461a1da013411
July 24: http://mozmill-crowd.blargon7.com/#/endurance/report/a6839eac670d27cd915461a1da01e993
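As background, the Minimum/Maximum/Average figures in these reports are simple aggregates over per-checkpoint memory samples. A minimal, hypothetical sketch of that aggregation (the sample values are illustrative and not taken from the actual mozmill harness):

```python
# Summarize endurance-test memory checkpoints into the min/max/average
# figures quoted in the reports. Sample values are illustrative only.

def summarize(samples_bytes):
    """Return (min, max, average) in whole MB for a list of byte counts."""
    mb = [s / (1024 * 1024) for s in samples_bytes]
    return round(min(mb)), round(max(mb)), round(sum(mb) / len(mb))

# Hypothetical explicit-memory checkpoints from one test run (bytes):
explicit = [64_000_000, 82_000_000, 126_000_000, 75_000_000]
print(summarize(explicit))
```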
(Assignee)

Comment 1

5 years ago
Please narrow this regression by running the endurance tests locally against the nightly or tinderbox builds. Thanks.
Whiteboard: [mozmill-endurance]

Comment 2

5 years ago
I've been trying to find a regression range for this bug, and it seems that it is much older than July 19. On my Ubuntu 12.04 64-bit machine, this issue reproduces at least as far back as the Firefox 16 Nightly from 07/10: http://mozmill-crowd.blargon7.com/#/endurance/report/210e02ed6e44c1e1119402056900eefd

I will go further back as soon as possible and post what I find here. If anyone has any hints on how far back this issue could've regressed, that would be very helpful.
(Assignee)

Comment 3

5 years ago
The original report shows that the issue was not in the July 19th build, so if you are unable to see an increase between the July 19 nightly and the July 24 nightly then you are unable to replicate the regression. I have kicked off tests for each nightly between July 19 and 24 in order to attempt to replicate the issue, and narrow it down to a single day. I will post results here when they're available.
(Assignee)

Comment 4

5 years ago
So the initial results show that I have replicated the regression. I will hopefully soon be able to narrow this down to a specific day.

Build ID: 20120719030543
Average Explicit: 87MB
Average Resident: 128MB
Report: http://mozmill-crowd.blargon7.com/#/endurance/report/1e83db11eb3022313051f256db0085c9

Build ID: 20120724071408
Average Explicit: 99MB
Average Resident: 141MB
Report: http://mozmill-crowd.blargon7.com/#/endurance/report/1e83db11eb3022313051f256db009384

Comment 5

5 years ago
(In reply to Dave Hunt (:davehunt) from comment #3)
> The original report shows that the issue was not in the July 19th build, so
> if you are unable to see an increase between the July 19 nightly and the
> July 24 nightly then you are unable to replicate the regression. I have
> kicked off tests for each nightly between July 19 and 24 in order to attempt
> to replicate the issue, and narrow it down to a single day. I will post
> results here when they're available.

Dave, the difference I get between the 07/19 build and the 07/24 build is only around 4 MB. My concern in my previous comment was that I get high memory consumption on all the builds I've run the tests on (07/10, 07/16-07/24).
The results are always somewhere along the lines of the July 24 results in comment 0 (I haven't gotten lower memory usage once, on any of the builds):

> Explicit memory (MB) July 19:	Minimum: 61 / Maximum: 120 / Average: 76 
> Explicit memory (MB) July 24:   Minimum: 70 / Maximum: 156 / Average: 104
> 
> Resident memory (MB) July 19:	Minimum: 102 / Maximum: 146 / Average: 117
> Resident memory (MB) July 24:	Minimum: 117 / Maximum: 188 / Average: 146
(Assignee)

Comment 6

5 years ago
So the remaining results are in:

Build ID: 20120720030549
Average Explicit: 91MB
Average Resident: 132MB
Report: http://mozmill-crowd.blargon7.com/#/endurance/report/1e83db11eb3022313051f256db00c56a

Build ID: 20120721030555
Average Explicit: 93MB
Average Resident: 135MB
Report: http://mozmill-crowd.blargon7.com/#/endurance/report/c77cfb8878087b850538b93cf2000c51

Build ID: 20120722030555
Average Explicit: 94MB
Average Resident: 136MB
Report: http://mozmill-crowd.blargon7.com/#/endurance/report/c77cfb8878087b850538b93cf20036ba

Build ID: 20120723030606
Average Explicit: 95MB
Average Resident: 137MB
Report: http://mozmill-crowd.blargon7.com/#/endurance/report/c77cfb8878087b850538b93cf20038ef

So this shows that there was an initial jump between the July 19 and 20 builds, and then a gradual increase in following builds. There then appears to be another jump between July 23 and 24.
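The day-over-day pattern described above can also be spotted mechanically. A minimal sketch in Python, using the averages from the comments above; the threshold and data layout are illustrative choices of mine, not part of the mozmill tooling:

```python
# Flag day-over-day jumps in average explicit memory between nightly
# builds. Averages (MB) are the figures reported in this bug; the
# 3 MB threshold is an arbitrary illustrative cutoff.
averages = {
    "20120719": 87,
    "20120720": 91,
    "20120721": 93,
    "20120722": 94,
    "20120723": 95,
    "20120724": 99,
}

def find_jumps(avgs, threshold_mb=3):
    """Return (prev_build, build, delta) triples where the average rose
    by more than threshold_mb between consecutive builds."""
    builds = sorted(avgs)
    return [
        (a, b, avgs[b] - avgs[a])
        for a, b in zip(builds, builds[1:])
        if avgs[b] - avgs[a] > threshold_mb
    ]

print(find_jumps(averages))
```

With these numbers the two flagged deltas are the July 19→20 and July 23→24 jumps, matching the reading in the comment above.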
(Assignee)

Comment 7

5 years ago
Ioana: High memory usage is relative to a baseline. Unless you have a lower baseline then you may find that your 'high memory' is in fact your baseline. Are you running on the same OS/version?

Comment 8

5 years ago
(In reply to Dave Hunt (:davehunt) from comment #7)
> Ioana: High memory usage is relative to a baseline. Unless you have a lower
> baseline then you may find that your 'high memory' is in fact your baseline.
> Are you running on the same OS/version?

I can't really agree with that. High memory usage shouldn't become more acceptable just because it has been present from the beginning; it should still be improved.

There should be memory usage metrics/limits that are considered acceptable. When usage exceeds them, it should be treated as a bug and fixed, even if it has been happening for a long while. Are such metrics for Firefox posted anywhere?

Either way, I have been running the tests on the same Ubuntu 12.04 64-bit machine, and it does seem to be a bug. After investigating some more, I found that the issue appeared in the May 5 Nightly:

BuildID: 20120504030509
Explicit memory (MB):	Minimum: 65 / Maximum: 120 / Average: 86
Resident memory (MB):	Minimum: 97 / Maximum: 151 / Average: 118
Report: http://mozmill-crowd.blargon7.com/#/endurance/report/c77cfb8878087b850538b93cf20b1e12

BuildID: 20120505030510
Explicit memory (MB):	Minimum: 75 / Maximum: 151 / Average: 104
Resident memory (MB):	Minimum: 106 / Maximum: 175 / Average: 132
http://mozmill-crowd.blargon7.com/#/endurance/report/c77cfb8878087b850538b93cf20a008d

The pushlog shows only one bug fix, so that must be the culprit: http://hg.mozilla.org/mozilla-central/rev/0a48e6561534
(Assignee)

Comment 9

5 years ago
Ioana: This isn't really the best place for this discussion. Endurance tests are for measuring memory regressions and not general memory usage issues. If you have found such a regression on a different build please file a separate bug. This bug is related to an already identified regression, which needs further investigation.

Comment 10

5 years ago
(In reply to Dave Hunt (:davehunt) from comment #9)
> Ioana: This isn't really the best place for this discussion. Endurance tests
> are for measuring memory regressions and not general memory usage issues. If
> you have found such a regression on a different build please file a separate
> bug. This bug is related to an already identified regression, which needs
> further investigation.

It seems the regression I found was bug 754267. And you are right, this is not the place for that discussion. Sorry for bringing it up here.
What are the next steps on this bug to gain further details about the memory increase?
(Assignee)

Comment 12

5 years ago
I would suggest rerunning the builds mentioned in comment 6 with a larger number of entities/iterations, which should make the regression more obvious. We should then isolate the causes of these regressions using tinderbox and local builds as necessary.
Then we should find an owner for this bug. Can we put Ioana down as the assignee?
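The narrowing discussed here is essentially a bisection over date-ordered builds. A hypothetical sketch, where `shows_regression` stands in for actually running the endurance tests against a build (here it is a stub for illustration):

```python
# Bisect an ordered list of builds to find the first one that shows the
# regression, assuming good builds precede bad ones.

def first_bad_build(builds, shows_regression):
    """Return the first build for which shows_regression(build) is True."""
    lo, hi = 0, len(builds) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if shows_regression(builds[mid]):
            hi = mid          # regression already present: look earlier
        else:
            lo = mid + 1      # still good: look later
    return builds[lo]

builds = ["20120719", "20120720", "20120721", "20120722", "20120723", "20120724"]
# Stub predicate pretending the regression landed in the 07/20 build:
print(first_bad_build(builds, lambda b: b >= "20120720"))  # → 20120720
```

Each endurance run against a build is expensive, so bisection keeps the number of runs logarithmic in the size of the date range.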
(Assignee)

Comment 14

5 years ago
That works for me. I believe Mihaela is on PTO for another week.

Comment 15

5 years ago
@Henrik, @Dave - I've tried to reproduce this issue on my machine but it doesn't reproduce at all, even with 50 iterations. Here are some examples of my test runs:

BuildID: 20120719030543
http://mozmill-crowd.blargon7.com/#/endurance/report/29fc09ba0c8360d637617903a02b86bc
Explicit memory (MB):	Minimum: 77 / Maximum: 259 / Average: 113
Resident memory (MB):	Minimum: 113 / Maximum: 259 / Average: 147

http://mozmill-crowd.blargon7.com/#/endurance/report/29fc09ba0c8360d637617903a02d567a
Explicit memory (MB):	Minimum: 76 / Maximum: 273 / Average: 112
Resident memory (MB):	Minimum: 113 / Maximum: 260 / Average: 145

BuildID: 20120720030549
http://mozmill-crowd.blargon7.com/#/endurance/report/29fc09ba0c8360d637617903a02cb3b3
Explicit memory (MB):	Minimum: 77 / Maximum: 311 / Average: 112
Resident memory (MB):	Minimum: 114 / Maximum: 287 / Average: 145

http://mozmill-crowd.blargon7.com/#/endurance/report/3491a2617d5af3ec9bb5c88aee00007a
Explicit memory (MB):	Minimum: 77 / Maximum: 256 / Average: 112
Resident memory (MB):	Minimum: 113 / Maximum: 266 / Average: 145

BuildID: 20120723030606
http://mozmill-crowd.blargon7.com/#/endurance/report/3491a2617d5af3ec9bb5c88aee009943
Explicit memory (MB):	Minimum: 79 / Maximum: 267 / Average: 113
Resident memory (MB):	Minimum: 117 / Maximum: 258 / Average: 148

BuildID: 20120724030551
http://mozmill-crowd.blargon7.com/#/endurance/report/3491a2617d5af3ec9bb5c88aee004ea2
Explicit memory (MB):	Minimum: 78 / Maximum: 289 / Average: 114
Resident memory (MB):	Minimum: 117 / Maximum: 277 / Average: 149

I will try to reproduce it on other machines, but if someone else can reproduce it more easily, perhaps it would be more efficient to let them help with this.
(Assignee)

Comment 16

5 years ago
Thanks Ioana for your attempts to replicate this. I'm planning on running this through Mozmill CI soon with a higher number of iterations/entities to see if we can finally narrow this down. I'm also hoping to have tinderbox support in Mozmill CI soon, so reducing the regression range should be easier. I'll update here.
Assignee: nobody → dave.hunt
Dave, is there any value in keeping this bug open?
(Assignee)

Comment 18

5 years ago
I must have missed this when I closed similar issues. A few things have changed with the endurance tests that would make investigating this difficult, but investigating future regressions easier.
Status: NEW → RESOLVED
Last Resolved: 5 years ago
Resolution: --- → INVALID