Closed Bug 634855
Opened 14 years ago, Closed 14 years ago
Memory leak with NoScript 2.0.9.8 installed
Categories
(Core :: General, defect)
Tracking
Status: VERIFIED FIXED
Target Milestone: mozilla2.0b12
blocking2.0: final+
People
(Reporter: danne.da, Assigned: peterv)
Details
(Keywords: memory-leak, regression, Whiteboard: fixed-in-tracemonkey [mozmill-endurance])
Attachments
(1 file, 1 obsolete file)
Patch, 2.83 KB (peterv: review+, sicking: approval2.0+)
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:2.0b12pre) Gecko/20110216 Firefox/4.0b12pre
Build Identifier:
It appears that with NoScript installed, Firefox leaks some memory on some sites.
First get NoScript here: https://addons.mozilla.org/en-US/firefox/addon/noscript/
Reproducible: Always
Steps to Reproduce:
1. Go to http://www.askmen.com/specials/top_99_women/ (slightly NSFW)
2. Allow scripts from askmen.com to run
3. Click on "Start with No.99"
4. Click Next until you reach number 80
5. Wait a few seconds to allow GC to run, and note the memory usage
6. Go back to the link in step 1 *OR* continue clicking Next
Actual Results:
After 5 runs of the above, my memory usage got to about 1 GB, starting from 260 MB.
Expected Results:
Memory usage shouldn't be significantly higher after a few rounds.
After running at least 2 rounds, closing the tab releases about 100 MB of memory shortly afterwards, but no significant amount of memory appears to be released after that.
Comment 1•14 years ago
Is this a regression?
Reporter
Comment 2•14 years ago
I'll see if I can find a regression range now.
I can confirm that the build of January 29 isn't affected in any case.
Updated•14 years ago
Whiteboard: [mlk]
Comment 3•14 years ago
I can't reproduce this using the current trunk.
When I start FF, memory usage is 71 MB; when I load
http://www.askmen.com/specials/top_99_women/ it is 79 MB;
after loading the first 20 pages it is 90 MB, and it stays there
even after the 5th run.
Severity: major → normal
Version: Trunk → unspecified
Comment 4•14 years ago
Argh, my mistake. I forgot step 2.
Re-testing.
I believe I may have grabbed an earlier build when doing some testing for bug 631494, as I do get a memory leak with the build of the 29th.
January 30th: Massive leak, same size as 631494.
January 29th: Slightly larger leak than the current nightly
January 28th: Same as above.
January 17th: No leak, memory goes up to about 400MB and stays there.
Trying to narrow it down a little more.
The leak started happening between the 23rd and 28th; I'm downloading the 26th now (quite a slow download).
I believe I've found the regression range:
http://hg.mozilla.org/mozilla-central/pushloghtml?fromchange=80266029824b&tochange=6a097e294828
Jan 24th: 80266029824b - no leak
Jan 25th: 6a097e294828 - leak
Can anyone confirm?
Comment 9•14 years ago
If the regression is confirmed, change http://hg.mozilla.org/mozilla-central/rev/5a6e9e7e487a looks the most suspicious to me.
Comment 10•14 years ago
The tracemonkey merge in that range certainly regressed something, but
I wonder if Bug 628599 fixed that problem.
Comment 11•14 years ago
Bug 628599 does seem to have fixed the problem the tm merge caused, so the problem is somewhere else.
Reporter
Comment 12•14 years ago
Testing with the tracemonkey branch I got this regression range:
http://hg.mozilla.org/tracemonkey/pushloghtml?fromchange=5cc0da184040&tochange=aa618e93942e
Jan 20th: 5cc0da184040 - no leak
Jan 21st: aa618e93942e - leaks
Hopefully this will help figure out which bug caused it.
Comment 13•14 years ago
This is btw a real leak, not just something where we keep
things alive too long.
Reporter
Comment 14•14 years ago
Is there any hourly build archive that goes back to the 20th? tm was merged with m-c between the 20th and 21st; it would be nice to know whether that merge caused the leak or not.
Comment 15•14 years ago
I manually built tm 5f815fe7434d and tm 64274de90e2d, and neither of those builds had the problem.
Updated•14 years ago
blocking2.0: --- → ?
Comment 16•14 years ago
I can't reproduce with tm aa618e93942e either, so is that tm regression range really right?
Reporter
Comment 17•14 years ago
I'm not so sure myself after looking at the pushlogs for tm and m-c. I know for sure that the leak showed up on the 25th on m-c, and on the 21st there was a tm merge. So it has to have happened on tm between those two dates. I'm pretty sure I saw it on the Jan 22nd build of tm.
So I'd start there.
Updated•14 years ago
Assignee: nobody → gal
Reporter
Comment 18•14 years ago
Seems like I was caught out by a false positive.
None of the builds prior to the 21st peaked above 370 MB (stable at 360 MB), while the 21st and 22nd both peaked at 410 MB (stable at 395 MB after about 10 seconds of idle) and got there much earlier than the older builds. I wonder what happened in the m-c merge that caused that 40 MB increase in peak memory usage.
Comment 19•14 years ago
Gal, you took this. Did you actually figure out what regressed?
Reporter
Comment 20•14 years ago
This should be the correct regression range:
http://hg.mozilla.org/tracemonkey/pushloghtml?fromchange=5a6e9e7e487a&tochange=bbcc51fa912b
Jan 22nd - 5a6e9e7e487a - no leak
Jan 23rd - bbcc51fa912b - leak
The Jan 21st and Jan 23rd builds behave almost identically for the first 10 or so pages, while the Jan 20th and 21st builds differ by the 40 MB; that is what caught me out.
Comment 21•14 years ago
The shutdown leak regression range is
http://hg.mozilla.org/mozilla-central/pushloghtml?fromchange=9a6de1e28d4b&tochange=c93381b53df3
Comment 22•14 years ago
And yes, that tm range is within that m-c range.
Assignee
Comment 23•14 years ago
I'm seeing RegExpStatics and a sandbox keeping a window alive. Still hunting for the roots.
Comment 24•14 years ago
Uh, right, there is 5a6e9e7e487a in the m-c range, and that was fixed later.
Comment 25•14 years ago
2011-02-03-03-mozilla-central-debug does not leak.
Comment 26•14 years ago
Ah, OK, there is something very recent:
http://hg.mozilla.org/mozilla-central/pushloghtml?fromchange=ab0dc35174fb&tochange=2042b8f9756d
Comment 27•14 years ago
Andreas: do you think that your GC scheduling might have helped?
Reporter: can you try again with latest nightly?
Comment 28•14 years ago
Nope, CC/GC scheduling didn't affect this.
There is a real leak in the range http://hg.mozilla.org/mozilla-central/pushloghtml?fromchange=ab0dc35174fb&tochange=2042b8f9756d
And I've verified that the leak is from before the GC/CC changes.
Comment 29•14 years ago
I took it to investigate. More than happy to hand this off if anyone volunteers.
Comment 30•14 years ago
peterv, I recently added code to clear regexp statics from JS_ClearScope. It might have been not enough.
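For readers unfamiliar with the term: regexp statics are per-global state such as the last regexp input and its captures. The following is a rough, self-contained sketch of the retention pattern being described; all names here are hypothetical stand-ins, not the actual SpiderMonkey/XPConnect code.

// Illustrative sketch only; every name is a hypothetical stand-in.
#include <string>
#include <vector>

struct FakeGlobal {
    // Regexp statics conceptually hold the last-matched input string and
    // capture groups, which come from whatever page last ran a regexp.
    std::string lastRegExpInput;
    std::vector<std::string> lastRegExpParens;
    std::vector<int> scopeProperties;  // stand-in for the scope's own slots
};

// Clearing only the scope is not enough: the regexp statics still reference
// strings from the old document, keeping it reachable (i.e. leaked).
void ClearScopeOnly(FakeGlobal& global) {
    global.scopeProperties.clear();
}

// The direction this comment describes: when the scope is cleared, clear the
// regexp statics as well so nothing from the old page stays rooted.
void ClearScopeAndRegExpStatics(FakeGlobal& global) {
    global.scopeProperties.clear();
    global.lastRegExpInput.clear();
    global.lastRegExpParens.clear();
}

The attached patch (see comment 32) ends up dealing with gray-object unmarking rather than regexp statics; the sketch above only illustrates the retention pattern being suspected here.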
Comment 31•14 years ago
http://hg.mozilla.org/mozilla-central/rev/fcfd3cf35d29 doesn't leak on shutdown
http://hg.mozilla.org/mozilla-central/rev/52246c1b1799 leaks
I don't know if there are other non-shutdown leaks.
Blocks: 614347
Assignee
Comment 32•14 years ago
Comment on attachment 513273 [details] [diff] [review]
v1
>+ if(obj)
>+ xpc_UnmarkGrayObject(obj);
> return obj;
Nit: no need to null-check here.
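To make the nit concrete, here is a minimal compilable sketch of the pattern it suggests. The type and helper names below are illustrative stand-ins, not the actual XPConnect code, and the sketch assumes (as the nit implies about xpc_UnmarkGrayObject) that the unmark helper already tolerates a null pointer.

// Minimal sketch with stand-in names; not the actual XPConnect code.
struct JSObject;  // opaque here, used only through a pointer

// Assumption drawn from the nit: the unmark helper is null-safe, so callers
// do not need their own `if (obj)` guard.
inline void UnmarkGraySketch(JSObject* obj) {
    if (!obj) {
        return;  // handle null once, inside the helper
    }
    // ... clear the object's gray mark here ...
}

JSObject* ReturnUnmarked(JSObject* obj) {
    UnmarkGraySketch(obj);  // no null check needed at the call site
    return obj;
}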
Attachment #513273 - Flags: review?(bent.mozilla) → review+
Assignee
Comment 34•14 years ago
Fix for a recent leak regression.
Attachment #513273 - Attachment is obsolete: true
Attachment #513276 - Flags: review+
Attachment #513276 - Flags: approval2.0?
Attachment #513276 - Flags: approval2.0? → approval2.0+
Sounds like a bad leak, so blocking.
blocking2.0: ? → final+
http://hg.mozilla.org/mozilla-central/rev/9763667dfc4a
http://hg.mozilla.org/tracemonkey/rev/5f0a5b42ecc0
Status: ASSIGNED → RESOLVED
Closed: 14 years ago
Resolution: --- → FIXED
Whiteboard: fixed-in-tracemonkey
Reporter
Comment 37•14 years ago
With the latest hourly available (based on changeset: http://hg.mozilla.org/mozilla-central/rev/911ef52fbc49) I can no longer reproduce this bug.
With all my other extensions enabled (a baseline of about 270 MB after opening the browser), memory usage peaked at 417 MB midway through the second run as described in comment #1, then dropped to 393 MB and stayed in the high 390s MB for the rest of the test.
Memory usage now is 334 MB and I'll see if that drops further.
Updated•14 years ago
Keywords: regressionwindow-wanted
Target Milestone: --- → mozilla2.0b12
Updated•14 years ago
Whiteboard: fixed-in-tracemonkey → fixed-in-tracemonkey[mozmill-test-needed][mozmill-endurance]
Comment 38•14 years ago
I created a simple Mozmill endurance test for this issue, which demonstrates the memory leak and also shows that it is not occurring in a recent nightly.
Firefox 4.0b11pre (2.0b11pre, en-US, 20110129030338) with NoScript 2.0.9.8
http://mozmill.blargon7.com/#/endurance/report/66e0fb6bd5645ddae0ee8e21d0092fbe
Firefox 4.0b13pre (2.0b13pre, en-US, 20110303122430) with NoScript 2.0.9.8
http://mozmill.blargon7.com/#/endurance/report/66e0fb6bd5645ddae0ee8e21d0096e91
Whiteboard: fixed-in-tracemonkey[mozmill-test-needed][mozmill-endurance] → fixed-in-tracemonkey [mozmill-endurance]
Updated•14 years ago
Status: RESOLVED → VERIFIED
Updated•13 years ago
Blocks: mlk-fx4-beta