Opened 3 years ago
Closed a year ago


(Reporter: ekyle, Unassigned)


Firefox Tracking Flags

(Not tracked)




3 years ago
The ElasticSearch response can be huge, up to a gigabyte; Python probably cannot handle that in memory. The query response translator has an `aggs_iterator()` [1], which implies the code already treats the ES response as a stream, but we know the low-level JSON decoder requires the whole JSON document to be in memory, so `aggs_iterator()` has plenty of unused potential.

Find a JSON streaming library, or make one, for use by the query response translator.

Since existing JSON streaming libraries appear to be event-listener-driven madness [2],
I suggest creating a JSON streaming library that accepts a query and operates on the stream like any other data set.


> for prefix, event, value in parser:
>     if (prefix, event) == ('earth', 'map_key'):
>         stream.write('<%s>' % value)
>         continent = value
>     elif prefix.endswith('.name'):
>         stream.write('<object name="%s"/>' % value)
>     elif (prefix, event) == ('earth.%s' % continent, 'end_map'):
>         stream.write('</%s>' % continent)
> stream.write('</geo>')
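In contrast to the event-driven style quoted above, the suggested iterator style can be sketched with the standard library alone. The sketch below is hypothetical, not the translator's actual code: it uses `json.JSONDecoder.raw_decode` to pull elements out of a top-level JSON array one at a time, reading the stream in fixed-size chunks so the whole response never has to fit in memory (a truncated bare number at a chunk boundary is one known corner this sketch glosses over).

```python
import io
import json

def iter_json_array(stream, chunk_size=65536):
    """Yield elements of a top-level JSON array one at a time,
    buffering only as much of the stream as one element needs.
    """
    decoder = json.JSONDecoder()
    buf = stream.read(chunk_size).lstrip()
    if not buf.startswith('['):
        raise ValueError("expected a top-level JSON array")
    buf = buf[1:]
    while True:
        # skip separators and whitespace between elements
        buf = buf.lstrip().lstrip(',').lstrip()
        if buf.startswith(']'):
            return
        try:
            value, end = decoder.raw_decode(buf)
        except ValueError:
            # element is split across the chunk boundary; read more
            chunk = stream.read(chunk_size)
            if not chunk:
                raise
            buf += chunk
            continue
        yield value
        buf = buf[end:]

# usage: records stream out like any other iterable data set
records = iter_json_array(io.StringIO('[{"a": 1}, {"a": 2}, {"a": 3}]'))
```

The caller sees an ordinary generator, so the downstream code can filter, map, and aggregate without ever holding the full gigabyte-scale response.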


3 years ago
Assignee: klahnakoski → nobody

Comment 1

3 years ago
Around the week of Sept 21, test informant was failing due to this problem. Either the data was too big for the ActiveData service to digest (which caused a crash and restart), or the service delivered enough JSON to push 32-bit Firefox over its memory limit, crashing the browser.

Comment 2

3 years ago
Also, note that buildbot JSON files [1] can reach 250 MB (uncompressed) or more, which is also a little too big to be processed en masse with 32-bit Python.


Comment 3

a year ago
done long ago
Last Resolved: a year ago
Resolution: --- → FIXED