Bug 1532886 (Open), Opened 5 years ago, Updated 2 years ago

Allocations from extension API calls take a long time to be cleaned up

Categories: Core :: JavaScript: GC, defect, P3

Version: 65 Branch
Type: defect
Status: UNCONFIRMED

People: Reporter: svyataleseev88; Assignee: Unassigned

References: Blocks 1 open bug

User Agent: Mozilla/5.0 (Windows NT 6.3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.119 Safari/537.36

Steps to reproduce:

During testing of the extension API, I discovered a memory leak in a completely basic case:
background.js:

var timID; // timer id used to reschedule the poll

function PARS_IT() {
  clearTimeout(timID);
  chrome.windows.getAll({ populate: true }, function (windows) {
    // Reschedule itself 5 ms after the previous getAll() call resolves.
    timID = setTimeout(function () { PARS_IT(); }, 5);
  });
}
PARS_IT();

Actual results:

Waiting on the asynchronous callback while recursively calling setTimeout causes a substantial memory leak.

Expected results:

The garbage collector should track the life cycle of the asynchronous callback and reclaim its memory.

* var timID; // needs to be declared at the beginning of the code; I forgot to include it above

Product: Firefox → WebExtensions
Component: Untriaged → Request Handling

Can you elaborate on what you observe? That code is calling windows.getAll() very rapidly, which is going to use a bunch of memory, but it should be eventually reclaimed. In which process do you think memory is being leaked?

Component: Request Handling → General
Flags: needinfo?(svyataleseev88)

Yes, this is just the result of calling getAll 200 times per second. The result data for that call will be quite large when a lot of tabs are open, and it's not being cleaned up very quickly.

This is probably just the same issue as bug 1476032. StructuredCloneHolder objects take a while to be cleaned up. They're probably only ever released after a full GC. The fact that we generally neuter them after deserializing now should help.
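
For illustration only, a minimal sketch of how the extension-side memory pressure could be reduced while the GC issue is investigated; the 1 s interval and the populate: false variant are choices made for this example, not something from the report:

// Hypothetical variant of the reporter's loop: poll less often and skip the
// per-tab data, so each call clones a much smaller result object and far less
// garbage accumulates between collections.
var pollTimer;

function pollWindows() {
  clearTimeout(pollTimer);
  // populate: false omits the tab objects, which are the bulk of the cloned
  // data when many tabs are open.
  chrome.windows.getAll({ populate: false }, function (windows) {
    // ... inspect window ids / focus state here ...
    pollTimer = setTimeout(pollWindows, 1000); // 1 s instead of 5 ms
  });
}
pollWindows();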

Component: General → JavaScript: GC
Product: WebExtensions → Core
See Also: → 1476032
Summary: setTimeout + chrome.windows.getAll = Memory Leak → Allocations from extension API calls take a long time to be cleaned up

(In reply to Andrew Swan [:aswan] from comment #2)

Can you elaborate on what you observe? That code is calling windows.getAll() very rapidly, which is going to use a bunch of memory, but it should be eventually reclaimed. In which process do you think memory is being leaked?

The 5 ms delay is only there to speed up the process; in practice I use at least 35 ms. Memory is cleaned up in a saw-tooth pattern, but under some combination of factors that I don't understand, the teeth start to grow over time or cleanup cycles are skipped, which leads to the 32-bit process collapsing. I tried to recreate this peak for you; here is an example below (only the extensions tab was open, I didn't touch anything). I got a peak (tooth) of +280 MB compared to Chrome, but not always; I don't see the logic in it and I don't know what role my additions play. In the task manager you can see that the first cleanup stays within reasonable limits, and then something goes wrong. In any case, with my real extension I end up with a crash at about +600 MB compared with Blink-based browsers.

var timID; // timer id used to reschedule the loop
var M = [];
var i = 0;

function PARS_IT() {
  M[++i] = i; // grow the retained array on every cycle
  clearTimeout(timID);
  chrome.windows.getAll({ populate: true }, function (windows) {
    chrome.windows.getAll({ populate: true }, function (windows) {
      timID = setTimeout(function () { M[++i] = i; PARS_IT(); }, 12);
    });
  });
}
PARS_IT();

Flags: needinfo?(svyataleseev88)

P.S. My extension is a parser. On Chrome / Opera / Yandex Browser I have chrome.system.memory to guard against similar failures: I can save the arrays to an SQL database and reload the script, which resolves any problem, but with Firefox the only outcome is a crash.
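
As an illustration of the kind of workaround described in the P.S. (a sketch only; the key names and the 100000-entry threshold are invented for the example, and browser.storage.local stands in for an SQL database):

// Hypothetical sketch: flush the accumulated results to extension storage and
// drop the in-memory copy, so the large array does not have to stay alive
// until the next full GC.
var results = [];
var chunkIndex = 0;

function flushIfLarge() {
  if (results.length < 100000) {
    return Promise.resolve();
  }
  var payload = {};
  payload["chunk-" + chunkIndex++] = results;
  results = []; // release the large in-memory array
  // browser.storage.local.set returns a Promise in Firefox.
  return browser.storage.local.set(payload);
}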

I'm concerned that this is a crash, but I'll come back to that later.

I want to explain some things about GC because it might help.

A garbage collector doesn't have to free memory quickly. Consuming memory is not a problem unless memory is exhausted or about to be exhausted and there is (or is about to be) a demand for more. So if you have some memory allocated but plenty of free memory remaining, it doesn't matter if the GC doesn't clean up that allocated memory for a while; usually it's better to wait, because that is more CPU-efficient.

The saw-tooth behavior of memory consumption is quite normal. Memory usage builds up and, when it hits a limit, the GC runs and frees memory. Then memory usage builds up again until the GC runs again. The GC may take a few seconds and frees the memory in one of its later phases.
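
A rough illustration of the distinction being drawn here (the sizes and intervals are arbitrary): memory that becomes unreachable just waits for the next collection, while memory that is still reachable can never be freed, no matter how the GC is tuned:

// Unreachable garbage: each tick replaces the previous array, so the old one
// can be collected whenever the GC decides to run (the saw-tooth pattern).
var scratch;
setInterval(function () {
  scratch = new Array(100000).fill(0);
}, 50);

// Retained data: this keeps growing and no GC strategy can free it,
// because the program can still reach and use it.
var retained = [];
setInterval(function () {
  retained.push(new Array(100000).fill(0));
}, 50);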

I am concerned about the crash you're getting because that shouldn't happen. If it's crashing due to lack of memory we'd like to know because maybe there's a bug or maybe the GC is failing to free memory quickly enough.

Can you open the "about:crashes" URL and find one of your recent crashes due to this problem, click view and paste the URL as a comment here? Thanks.

Flags: needinfo?(svyataleseev88)

(In reply to Paul Bone [:pbone] from comment #6)

Can you open the "about:crashes" URL and find one of your recent crashes due to this problem, click view and paste the URL as a comment here? Thanks.

I ran two tests in the real extension, with small (a few ms) and large (1000 ms) timers. With the larger timers it managed to do 5-10% more work. Interestingly, in the second test with the large timers the crash took a completely different form: at the peak, Firefox started writing to the SSD, and the script did not crash but simply stopped, leaving 670 MB of the 1100 MB peak allocated and no error, so there is only the first crash report:
https://crash-stats.mozilla.com/report/index/e1e64035-4a78-48c5-bec1-a9a7d0190313

Flags: needinfo?(svyataleseev88)

An about:memory dump from shortly before the crash would probably be helpful at this point.

(In reply to Kris Maglione [:kmag] from comment #8)

An about:memory dump from shortly before the crash would probably be helpful at this point.

I found another possible cause of the collapse; your comments helped point me in the right direction. With each cycle my array grows, and that is according to plan, but at a certain array size the saw tooth breaks off. I modeled the growth of the array below; it causes a crash in less than a minute on a 32-bit system. The graph in the task manager clearly shows the saw-tooth behavior at the beginning and its breakdown at the end. This happens with any asynchronous recursion. I really hope that this is a GC problem and not just the large objects.

<script>
var timID;
var m = [];
var mm = [];
for (var i = 0; i < 5500000; ++i) { m[i] = { ii: i }; }
console.log(i);

function PARS_IT() {
  clearTimeout(timID);
  //mm['fix'] = null;
  mm['fix'] = [];
  // Grow the retained array a little more on every cycle...
  for (var i2 = 0; i2 < 10000; ++i2) {
    m[++i] = { ii: i };
  }
  console.clear();
  console.log(i);
  // ...and keep a second full copy of it alive between cycles.
  mm['fix'] = m.slice(0);
  timID = setTimeout(function () { PARS_IT(); }, 200);
}
PARS_IT();
</script>

Screenshot of the memory graph in the task manager:
http://prntscr.com/mxapmq

NI'ing Kannan, who probably knows the JS engine more generally.

What can/does a closure capture? The closure on the second-to-last line of the script in comment 9 references PARS_IT but nothing else. Is more than this captured? Is the frame for the current execution of PARS_IT captured?

Flags: needinfo?(kvijayan)

Yes, your intuition is correct. Not much will be captured at all by that anonymous function. Even PARS_IT will likely not need to be captured, since it is defined on the global window object (it's a top-level var). Such names don't need to be captured at all, because the function will implicitly look them up on the global object if it fails to find them in one of the intervening lexical scopes.
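
A small illustration of that point (the names are made up for the example): only variables the inner function actually references from an enclosing function scope need to be captured; names that resolve on the global object are simply looked up when the function runs.

var globalName = "global"; // not captured: resolved through the global object

function outer() {
  var captured = "kept alive";      // captured: the inner function uses it
  var notCaptured = new Array(1e6); // eligible for collection after outer() returns

  return function inner() {
    // `captured` comes from outer's scope; `globalName` is a global lookup.
    return captured + " / " + globalName;
  };
}

var f = outer(); // only `captured` needs to survive in the closure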

Flags: needinfo?(kvijayan)

We've got a few bugs like this on file at the moment. Hopefully bug 1395509 will address this.

Blocks: 1533449
Priority: -- → P3
Severity: normal → S3