Bug 1136046 Opened 7 years ago Closed 7 years ago

uncaught exception: out of memory


(Core :: JavaScript Engine, defect)




Tracking Status
firefox39 --- fixed


(Reporter: kiswono, Assigned: jonco)



(1 file, 1 obsolete file)

User Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/40.0.2214.115 Safari/537.36

Steps to reproduce:

Run this benchmark code:

if(typeof print == 'undefined') print = function(){};
if(typeof console == 'undefined') console = {log:print};
var MAX_DATA = 12000000;
var i2ch = ['0','1','2','3','4','5','6','7','8','9','a','B','c','D','e','F'];
function get_first_digit(d) {
        while(d > 10) d /= 10;
        return d|0;
}
function to_rhex(v) {
        var hex = '';
        v = v|0;
        while(v>0) {
                hex += i2ch[v%16];
                v = (v/16)|0;
        }
        return hex;
}
function add_or_inc(m,key,set,inc) {
        if(m[key] === undefined) {
                m[key] = set;
                return false;
        }
        m[key] += inc;
        return true;
}
(function() {
        var m = {};
        var dup1 = 0, dup2 = 0, dup3 = 0;
        for(var z=MAX_DATA;z>0;--z) {
                var val2 = MAX_DATA-z;
                var val3 = MAX_DATA*2-z;
                var key1 = "" + z;
                var key2 = "" + val2;
                var key3 = to_rhex(val3);
                if(add_or_inc(m,key1,z,val2)) ++dup1;
                if(add_or_inc(m,key2,val2,val3)) ++dup2;
                if(add_or_inc(m,key3,val3,z)) ++dup3;
        }
        console.log(dup1, dup2, dup3);
        var total = 0, verify = 0, count = 0;
        for (var key in m) {
                total += get_first_digit(m[key]);
                verify += key.length;
                count += 1;
        }
        console.log(total, verify, count);
})();

Actual results:

$ time js24 object.js 
uncaught exception: out of memory
Command exited with non-zero status 3

CPU: 31.20s     Real: 31.76s    RAM: 1818936KB

On another computer with the exact same RAM (16GB, 11GB free), it gives the same exception after allocating about 1.2GB.

Expected results:

6009354 6009348 611297
36186112 159701682 23370001
This is the output of another engine, shown for comparison.
Attached patch expand-max-hash-table-size (obsolete) — Splinter Review
I've reproduced this: we are hitting our maximum hashtable capacity for the atoms table.

Currently this is 2^24 entries, and the benchmark creates 23 million atoms for object property names.

I think we can safely increase this by juggling the HashTable members around and reducing the size of |gen| so that we can have a full 32 bits for |removedCount|.  I couldn't make CAP_BITS greater than 30 as the overflow calculations need some headroom.

The benchmark completes with this change.
Assignee: nobody → jcoppeard
Ever confirmed: true
Attachment #8568480 - Flags: review?(luke)
Comment on attachment 8568480 [details] [diff] [review]

Nice re-packing.
Attachment #8568480 - Flags: review?(luke) → review+
I should have pushed this to try first.  There were a couple of issues:

1. The static assert on sMaxCapacity used to check for overflow when calculating the allocation size, but the calculation and overflow check are now handled in js_pod_calloc(), so the assert can be removed.  It was causing problems on 32-bit builds for hash tables with a large entry size.

2. The testHashInit jsapi-tests now fail on try because they can no longer allocate enough memory now that we've increased the maximum table size.  I don't see anything to do but remove these tests.  I hope that's ok!
Attachment #8568480 - Attachment is obsolete: true
Attachment #8569165 - Flags: review?(luke)
Attachment #8569165 - Flags: review?(luke) → review+
Closed: 7 years ago
Resolution: --- → FIXED
Target Milestone: --- → mozilla39