Bug 1136046 - uncaught exception: out of memory
Opened 10 years ago · Closed 10 years ago
Categories
(Core :: JavaScript Engine, defect)
Tracking
Status: RESOLVED FIXED
Target Milestone: mozilla39

 | Tracking | Status
---|---|---
firefox39 | --- | fixed
People
(Reporter: kiswono, Assigned: jonco)
Details
Attachments
(1 file, 1 obsolete file)
4.22 KB, patch · luke: review+
User Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.115 Safari/537.36
Steps to reproduce:
Run this benchmark code:
// Shims so the script runs in both JS shells and browsers.
if (typeof print == 'undefined') print = function(){};
if (typeof console == 'undefined') console = {log: print};

var MAX_DATA = 12000000;
var i2ch = ['0','1','2','3','4','5','6','7','8','9','a','B','c','D','e','F'];

// Reduce a number to (roughly) its leading decimal digit.
function get_first_digit(d) {
  while (d > 10) d /= 10;
  return d | 0;
}

// Convert an integer to a reversed hex string using the mixed-case digit table above.
function to_rhex(v) {
  var hex = '';
  v = v | 0;
  while (v > 0) {
    hex += i2ch[v % 16];
    v = (v / 16) | 0;
  }
  return hex;
}

// Insert key with value `set`, or increment an existing entry by `inc`;
// returns true if the key was already present.
function add_or_inc(m, key, set, inc) {
  if (m[key] === undefined) {
    m[key] = set;
    return false;
  }
  m[key] += inc;
  return true;
}

(function() {
  var m = {};
  var dup1 = 0, dup2 = 0, dup3 = 0;
  for (var z = MAX_DATA; z > 0; --z) {
    var val2 = MAX_DATA - z;
    var val3 = MAX_DATA * 2 - z;
    var key1 = "" + z;
    var key2 = "" + val2;
    var key3 = to_rhex(val3);
    if (add_or_inc(m, key1, z, val2)) ++dup1;
    if (add_or_inc(m, key2, val2, val3)) ++dup2;
    if (add_or_inc(m, key3, val3, z)) ++dup3;
  }
  console.log(dup1, dup2, dup3);
  var total = 0, verify = 0, count = 0;
  for (var key in m) {
    total += get_first_digit(m[key]);
    verify += key.length;
    count += 1;
  }
  console.log(total, verify, count);
})()
Actual results:
$ time js24 object.js
uncaught exception: out of memory
Command exited with non-zero status 3
CPU: 31.20s Real: 31.76s RAM: 1818936KB
On another computer with the exact same RAM (16 GB, 11 GB free), it gives the same exception after allocating about 1.2 GB.
Shell version: JavaScript-C24.2.0
Expected results:
6009354 6009348 611297
36186112 159701682 23370001
See http://kokizzu.blogspot.com/2015/02/string-associative-array-benchmark.html for a comparison to another engine.
Assignee
Comment 3 • 10 years ago
I've reproduced this: we are hitting our maximum hashtable capacity for the atoms table.
Currently this is 2^24 entries, and the benchmark creates 23 million atoms for object property names.
I think we can safely increase this by juggling the HashTable members around and reducing the size of |gen| so that we can have a full 32 bits for |removedCount|. I couldn't make CAP_BITS greater than 30 as the overflow calculations need some headroom.
The benchmark completes with this change.
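For context, here is a rough sketch of the repacking idea, under the assumption that the counters can share a 64-bit word; the member names and widths are illustrative guesses based on the comment above, not the actual patch:

// Hypothetical layout: shrink |gen| and pack it with the hash shift so that
// |removedCount| keeps a full 32 bits while capacity can grow to 2^30 entries.
#include <cstdint>

struct HashTableSketch {
    // CAP_BITS stays at 30 so the size/overflow calculations retain headroom.
    static const uint32_t CAP_BITS = 30;
    static const uint32_t sMaxCapacity = uint32_t(1) << CAP_BITS;

    void* table;              // entry storage
    uint64_t gen : 56;        // entry-storage generation number, reduced in width
    uint64_t hashShift : 8;   // multiplicative hash shift
    uint32_t entryCount;      // live entries
    uint32_t removedCount;    // removed-entry sentinels, now a full 32 bits
};

With a 2^30 limit, the roughly 23 million property-name atoms the benchmark interns fit comfortably.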
Assignee: nobody → jcoppeard
Status: UNCONFIRMED → NEW
Ever confirmed: true
Attachment #8568480 - Flags: review?(luke)
Comment 4 • 10 years ago
Comment on attachment 8568480 [details] [diff] [review]
expand-max-hash-table-size
Nice re-packing.
Attachment #8568480 - Flags: review?(luke) → review+
Assignee
Comment 5 • 10 years ago
I should have pushed this to try first. There were a couple of issues:
1. The static assert on sMaxCapacity used to check for overflow when calculating the size to allocate, but that calculation and overflow check are now handled in js_pod_calloc() (a generic sketch of such a check follows after this comment), so the assert can be removed; it caused problems on 32-bit builds for hash tables with a large entry size.
2. The testHashInit jsapi-tests now fail on try because they can't allocate enough memory now that we've increased the maximum table size. I don't see anything else to do but remove these tests. I hope that's OK!
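As a point of reference, this is a minimal sketch of the kind of checked allocation a pod-calloc style helper performs; it is an illustrative assumption, not SpiderMonkey's actual js_pod_calloc() implementation:

// Hypothetical helper: refuse element counts whose total byte size would
// overflow size_t, then delegate to calloc for zeroed storage.
#include <cstdlib>
#include <cstdint>

template <typename T>
T* checked_pod_calloc(std::size_t numElems) {
    if (numElems > SIZE_MAX / sizeof(T))   // runtime overflow check, in place of the compile-time assert
        return nullptr;                    // oversized requests report allocation failure
    return static_cast<T*>(std::calloc(numElems, sizeof(T)));
}

Centralizing the check in the allocator is presumably what lets the compile-time assertion tied to sMaxCapacity go away.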
Attachment #8568480 - Attachment is obsolete: true
Attachment #8569165 - Flags: review?(luke)
Updated • 10 years ago
Attachment #8569165 - Flags: review?(luke) → review+
Assignee
Comment 6 • 10 years ago
Status: NEW → RESOLVED
Closed: 10 years ago
status-firefox39: --- → fixed
Resolution: --- → FIXED
Target Milestone: --- → mozilla39