Closed Bug 634855 Opened 14 years ago Closed 14 years ago

Memory leak with NoScript 2.0.9.8 installed

Categories

(Core :: General, defect)

Type: defect
Priority: Not set
Severity: normal

Tracking

Status: VERIFIED FIXED
Target Milestone: mozilla2.0b12

Tracking Status
blocking2.0 --- final+

People

(Reporter: danne.da, Assigned: peterv)

Details

(Keywords: memory-leak, regression, Whiteboard: fixed-in-tracemonkey [mozmill-endurance])

Attachments

(1 file, 1 obsolete file)

User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.6; rv:2.0b12pre) Gecko/20110216 Firefox/4.0b12pre
Build Identifier:

It appears that when using NoScript, Firefox will leak some memory on some sites. First get NoScript here: https://addons.mozilla.org/en-US/firefox/addon/noscript/

Reproducible: Always

Steps to Reproduce:
1. Go to http://www.askmen.com/specials/top_99_women/ (slightly NSFW)
2. Allow scripts from askmen.com to run
3. Click on "Start with No.99"
4. Click Next until you get to number 80
5. Wait a few seconds to allow GC to run, and note the memory usage
6. Go back to the link provided in step 1 *OR* continue clicking Next

Actual Results:
After 5 runs of the above, my memory usage reached about 1 GB, from a starting point of 260 MB.

Expected Results:
Memory usage shouldn't be significantly higher after a few rounds.

After running at least 2 rounds, closing the tab releases about 100 MB of memory shortly afterwards; no significant amount of memory appears to be released after that.
Is this a regression?
I'll see if I can find a regression range now. I can confirm that the build of January 29 isn't affected in any case.
Severity: normal → major
Version: unspecified → Trunk
Whiteboard: [mlk]
I can't reproduce this using the current trunk. When I start Firefox, memory usage is 71 MB; when I load http://www.askmen.com/specials/top_99_women/ it is 79 MB; after loading the first 20 pages it is 90 MB, and it stays there even after the 5th run.
Severity: major → normal
Version: Trunk → unspecified
Argh, my mistake. I forgot step 2. Re-testing.
I believe I may have grabbed an earlier build when doing some testing for bug 631494, as I get a memory leak with the build of the 29th.
January 30th: Massive leak, same size as bug 631494.
January 29th: Slightly larger leak than the current nightly.
January 28th: Same as above.
January 17th: No leak, memory goes up to about 400 MB and stays there.
Trying to narrow it down a little more.
The leak started happening between the 23rd and the 28th; I'm downloading the 26th now (quite a slow download).
Ok, I can reproduce.
Status: UNCONFIRMED → NEW
Ever confirmed: true
I believe I've found the regression range: http://hg.mozilla.org/mozilla-central/pushloghtml?fromchange=80266029824b&tochange=6a097e294828
Jan 24th: 80266029824b - no leak
Jan 25th: 6a097e294828 - leak
Can anyone confirm?
If the regression is confirmed, change http://hg.mozilla.org/mozilla-central/rev/5a6e9e7e487a looks the most suspicious to me.
The tracemonkey merge in that range certainly regressed something, but I wonder if Bug 628599 fixed that problem.
Keywords: mlk
Whiteboard: [mlk]
Version: unspecified → Trunk
Bug 628599 does seem to have fixed the problem the tm merge caused, so the problem is somewhere else.
Testing with the tracemonkey branch I got this regression range: http://hg.mozilla.org/tracemonkey/pushloghtml?fromchange=5cc0da184040&tochange=aa618e93942e
Jan 20th: 5cc0da184040 - no leak
Jan 21st: aa618e93942e - leaks
Hopefully this will help figure out which bug caused it.
This is btw a real leak, not just something where we keep things alive too long.
Is there any hourly build archive that goes back to the 20th? tm was merged with m-c between the 20th and 21st; it would be nice to know whether that merge caused the leak or not.
I built manually tm 5f815fe7434d and tm 64274de90e2d and neither of those builds had the problem.
blocking2.0: --- → ?
I can't reproduce with tm aa618e93942e either, so is that tm regression range really right?
I'm not so sure myself after looking in the pushlogs for tm and m-c. I know for sure that the leak showed up on the 25th on m-c, and on the 21st there was a tm merge. So it has to have happened on tm between those two dates. I'm pretty sure I saw it on the Jan 22nd build of tm, so I'd start there.
Assignee: nobody → gal
Seems like I was caught out by a false positive. None of the builds prior to the 21st peaked above 370 MB (stable at 360 MB), while the 21st and 22nd both peaked at 410 MB (stable at 395 MB after 10 seconds or so of idle) and got there much earlier than the earlier builds. I wonder what happened in the m-c merge that caused that 40 MB increase in peak memory usage.
Gal, you took this. Did you actually figure out what regressed?
This should be the correct regression range: http://hg.mozilla.org/tracemonkey/pushloghtml?fromchange=5a6e9e7e487a&tochange=bbcc51fa912b
Jan 22nd - 5a6e9e7e487a - no leak
Jan 23rd - bbcc51fa912b - leak
The Jan 21st and Jan 23rd builds behave almost identically for the first 10 or so pages, while the Jan 20th and 21st builds differ by that 40 MB; that is what caught me out.
And yes, that tm range is within that m-c range.
I'm seeing RegExpStatics and a sandbox keeping a window alive. Still hunting for the roots.
Uh, right, there is 5a6e9e7e487a in the m-c range, and that was fixed later.
2011-02-03-03-mozilla-central-debug does not leak.
Andreas: do you think that your GC scheduling might have helped? Reporter: can you try again with latest nightly?
Nope, CC/GC scheduling didn't affect this. There is a real leak in the range http://hg.mozilla.org/mozilla-central/pushloghtml?fromchange=ab0dc35174fb&tochange=2042b8f9756d and I've verified the leak is from before the GC/CC changes.
I took it to investigate. More than happy to hand this off if anyone volunteers.
peterv, I recently added code to clear regexp statics from JS_ClearScope. It might not have been enough.
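(Editorial note: for readers unfamiliar with the API mentioned above, here is a minimal sketch, assuming the SpiderMonkey 1.8.5-era JSAPI, of how JS_ClearScope is invoked on a window's global at teardown. The ClearWindowGlobal function and its call site are hypothetical; per the comment above, the regexp-statics clearing itself was added inside JS_ClearScope, not in the caller.)

// Hypothetical call site, assuming the SpiderMonkey 1.8.5-era JSAPI.
#include "jsapi.h"

void ClearWindowGlobal(JSContext *cx, JSObject *windowGlobal)
{
    JSAutoRequest ar(cx);  // enter a JS request on this context

    // Strips the global's properties; per the comment above, the
    // RegExpStatics clearing was hooked in here as well.
    JS_ClearScope(cx, windowGlobal);
}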
http://hg.mozilla.org/mozilla-central/rev/fcfd3cf35d29 doesn't leak on shutdown
http://hg.mozilla.org/mozilla-central/rev/52246c1b1799 leaks
I don't know if there are other non-shutdown leaks.
Blocks: 614347
Attached patch v1 (obsolete) — Splinter Review
Assignee: gal → peterv
Status: NEW → ASSIGNED
Attachment #513273 - Flags: review?(bent.mozilla)
Comment on attachment 513273 [details] [diff] [review]
v1

>+  if(obj)
>+    xpc_UnmarkGrayObject(obj);
>   return obj;

Nit: no need to null-check here.
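(Editorial note: a minimal self-contained sketch of the pattern under review, with the reviewer's nit applied. Only the quoted "+" lines come from the actual patch; the WrapperCacheSketch class, the GetWrapper name, and the declaration of xpc_UnmarkGrayObject, inferred from its use above, are assumptions for illustration.)

#include "jsapi.h"

// Assumed declaration of the existing XPConnect helper used by the patch,
// inferred from the quoted statement-style call. The reviewer's nit implies
// it is safe to call with a null JSObject*.
void xpc_UnmarkGrayObject(JSObject *obj);

// Hypothetical wrapper-cache-style holder that hands out its cached JSObject.
class WrapperCacheSketch
{
public:
    explicit WrapperCacheSketch(JSObject *wrapper) : mWrapper(wrapper) {}

    JSObject* GetWrapper() const
    {
        JSObject *obj = mWrapper;
        // Clear the cycle collector's "gray" mark before handing the object
        // out, so callers never see an object the collector could miss.
        xpc_UnmarkGrayObject(obj);  // null check omitted per the review nit
        return obj;
    }

private:
    JSObject *mWrapper;
};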
Attachment #513273 - Flags: review?(bent.mozilla) → review+
Attached patch v1.1Splinter Review
Fix for a recent leak regression.
Attachment #513273 - Attachment is obsolete: true
Attachment #513276 - Flags: review+
Attachment #513276 - Flags: approval2.0?
Sounds like a bad leak, so blocking.
blocking2.0: ? → final+
Status: ASSIGNED → RESOLVED
Closed: 14 years ago
Resolution: --- → FIXED
Whiteboard: fixed-in-tracemonkey
With the latest hourly available (based on changeset http://hg.mozilla.org/mozilla-central/rev/911ef52fbc49) I can no longer reproduce this bug. With all my other extensions enabled (baseline of about 270 MB after opening the browser), memory usage peaked at 417 MB midway through the second run as described in comment #1, then dropped to 393 MB and stayed in the high 390s for the rest of the test. Memory usage is now 334 MB and I'll see if that drops further.
Target Milestone: --- → mozilla2.0b12
Whiteboard: fixed-in-tracemonkey → fixed-in-tracemonkey[mozmill-test-needed][mozmill-endurance]
I created a simple Mozmill endurance test for this issue, which demonstrates the memory leak and also shows that it is not occurring in a recent nightly.

Firefox 4.0b11pre (2.0b11pre, en-US, 20110129030338) with NoScript 2.0.9.8:
http://mozmill.blargon7.com/#/endurance/report/66e0fb6bd5645ddae0ee8e21d0092fbe

Firefox 4.0b13pre (2.0b13pre, en-US, 20110303122430) with NoScript 2.0.9.8:
http://mozmill.blargon7.com/#/endurance/report/66e0fb6bd5645ddae0ee8e21d0096e91
Whiteboard: fixed-in-tracemonkey[mozmill-test-needed][mozmill-endurance] → fixed-in-tracemonkey [mozmill-endurance]
Status: RESOLVED → VERIFIED
No longer blocks: mlk2.0