Closed Bug 1229960 Opened 9 years ago Closed 9 years ago

Add support for computing and traversing dominator trees to HeapAnalysisWorker

Categories: DevTools :: Memory (defect, priority not set, severity normal)
Tracking: firefox45 fixed
Status: RESOLVED FIXED
Target Milestone: Firefox 45
People: (Reporter: fitzgen, Assigned: fitzgen)
Attachments: (1 file)

This commit defines `DominatorTreeNode`, a JS class representing a node in a
heap snapshot's dominator tree. Three heap analysis client/worker
request/response pairs request and create these `DominatorTreeNode`s. Unlike
censuses, dominator trees are too big to mirror practically in memory as JS
object structures. Instead, we have one request to get a partial/shallow
representation of the tree starting from the root, and another to get
subsequent children and siblings in the tree. This allows for incremental,
lazy, and bounded mirroring of the dominator tree as `DominatorTreeNode`s.
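
Roughly, a consumer would drive this incrementally, as in the sketch below. The client method names and option shapes here are assumptions based on this patch and its tests, not the final API:

    // Hypothetical sketch -- method names and options are assumed.
    function mirrorTree(client, dominatorTreeId) {
      // Get a shallow mirror of the tree, bounded by depth and sibling count.
      return client.getDominatorTree({
        dominatorTreeId,
        maxDepth: 3,
        maxSiblings: 50
      }).then(partialTree => {
        // Later, when a node whose children were not loaded yet is expanded,
        // fetch just that node's immediately dominated children.
        return client.getImmediatelyDominated({
          dominatorTreeId,
          nodeId: partialTree.nodeId,
          startIndex: 0,
          maxCount: 50
        });
      });
    }
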
Assignee: nobody → nfitzgerald
Blocks: 1229229
Status: NEW → ASSIGNED
Has STR: --- → irrelevant
Comment on attachment 8694995 [details] [diff] [review]
Add support for computing and traversing dominator trees to HeapAnalysisWorker

Review of attachment 8694995 [details] [diff] [review]:
-----------------------------------------------------------------

::: devtools/shared/heapsnapshot/DominatorTreeNode.js
@@ +19,5 @@
> +  // The id of this node's parent or undefined if this node is the root.
> +  this.parentId = undefined;
> +
> +  // An array of immediately dominated child `DominatorTreeNode`s, or undefined.
> +  this.children = undefined;

Would it be better for these properties to be defined on the prototype, like `children: []`? Granted, it looks like `undefined` children has special meaning distinct from an empty array. Still, creating instances that have a null prototype, but static methods, with the constructor just assigning properties seems strange. What are the advantages over plain objects? Maybe these instances could have some assertions on their data in that case? Just some thoughts.

@@ +32,5 @@
> +  // * If children is undefined and this property is true: there exist
> +  //   immediately dominated children for this node, but they have not been
> +  //   loaded yet.
> +  // * If children is undefined and this property is false: this node does not
> +  // * dominate any others and therefore has no children.

nit: extra '*'

@@ +55,5 @@
> +  parent.children.push(child);
> +  child.parentId = parent.nodeId;
> +};
> +
> +const DEFAULT_MAX_DEPTH = 3;

nit: put constants at top of the file

@@ +87,5 @@
> +      const endIdx = Math.min(childNodeIds.length, maxSiblings);
> +      for (let i = 0; i < endIdx; i++) {
> +        DominatorTreeNode.addChild(node, dfs(childNodeIds[i], newDepth));
> +      }
> +      node.moreChildrenAvailable = (childNodeIds.length < endIdx);

nit: no () needed

::: devtools/shared/heapsnapshot/HeapAnalysesClient.js
@@ +61,5 @@
> + *        The unix timestamp of the creation time of the snapshot, or null if
> + *        snapshot does not exist.
> + */
> +HeapAnalysesClient.prototype.getCreationTime = function (snapshotFilePath) {
> +  return this._worker.performTask("getCreationTime", snapshotFilePath);

This is already landed, this probably needs a rebase

::: devtools/shared/heapsnapshot/HeapAnalysesWorker.js
@@ +123,5 @@
> +    maxDepth,
> +    maxSiblings
> +  } = request;
> +
> +  if (0 <= dominatorTreeId && dominatorTreeId < dominatorTrees.length) {

nit: I don't really like the "early return, error otherwise" approach here and in `getImmediatelyDominated` -- it's hard to read, and I'd prefer a quick conditional (the inverse of what's here) that throws immediately. Less indentation, and the "correct" path then follows at the top level rather than being nested in conditionals.
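
For example, something along these lines (just a sketch; the error message is illustrative):

    if (dominatorTreeId < 0 || dominatorTreeId >= dominatorTrees.length) {
      throw new Error(`There is no dominator tree with id ${dominatorTreeId}`);
    }

    // The rest of the handler continues at the top level, un-nested.
    const dominatorTree = dominatorTrees[dominatorTreeId];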

@@ +148,5 @@
> +  if (0 <= dominatorTreeId && dominatorTreeId < dominatorTrees.length) {
> +    const dominatorTree = dominatorTrees[dominatorTreeId];
> +    const childIds = dominatorTree.getImmediatelyDominated(nodeId);
> +    if (!childIds) {
> +      throw new Error(`${nodeId} is not a node id in the dominator tree`);

all the assertions are appreciated!

@@ +152,5 @@
> +      throw new Error(`${nodeId} is not a node id in the dominator tree`);
> +    }
> +
> +    const start = startIndex || 0;
> +    const count = maxCount || 50;

50 should be a const
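
i.e. something like this (constant name is just a suggestion):

    const DEFAULT_MAX_COUNT = 50;
    // ...
    const count = maxCount || DEFAULT_MAX_COUNT;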

@@ +157,5 @@
> +    const end = start + count;
> +
> +    const nodes = childIds
> +      .slice(start, end)
> +      .map(id => {

Wonder if it'd be more performant if, instead of a slice + map, we just did a reduce (or even just a map), skipping over indices > `end`
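
Something like this single pass, for example (untested sketch; the `moreChildrenAvailable` handling from the patch is elided):

    const nodes = [];
    const length = Math.min(end, childIds.length);
    for (let i = start; i < length; i++) {
      const id = childIds[i];
      const size = dominatorTree.getRetainedSize(id);
      const node = new DominatorTreeNode(id, size);
      node.parentId = nodeId;
      // ... set moreChildrenAvailable as in the patch ...
      nodes.push(node);
    }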

@@ +161,5 @@
> +      .map(id => {
> +        const size = dominatorTree.getRetainedSize(id);
> +        const node = new DominatorTreeNode(id, size);
> +        node.parentId = nodeId;
> +        node.moreChildrenAvailable = dominatorTree.getImmediatelyDominated(id).length > 0;

Earlier in this function there's an assertion that this call returns children -- should we do it here too, for consistency?
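
e.g., mirroring the earlier check (sketch only):

    const grandchildIds = dominatorTree.getImmediatelyDominated(id);
    if (!grandchildIds) {
      throw new Error(`${id} is not a node id in the dominator tree`);
    }
    node.moreChildrenAvailable = grandchildIds.length > 0;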

::: devtools/shared/heapsnapshot/tests/unit/test_HeapAnalyses_getDominatorTree_01.js
@@ +6,5 @@
> +function run_test() {
> +  run_next_test();
> +}
> +
> +function isArray(obj) {

Can just use `Array.isArray`, unless there's another reason for this (in which case, comments)
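
i.e. just call it directly in the assertions, something like (variable name and message are illustrative):

    ok(Array.isArray(partialTree.children), "the root's children should be an array");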

::: devtools/shared/heapsnapshot/tests/unit/test_HeapAnalyses_getImmediatelyDominated_01.js
@@ +7,5 @@
> +  run_next_test();
> +}
> +
> +function isArray(obj) {
> +  return Object.prototype.toString.call(obj) == "[object Array]";

ditto Array.isArray

@@ +42,5 @@
> +    maxCount: Infinity
> +  });
> +
> +  ok(isArray(secondResponse.nodes));
> +  ok(secondResponse.nodes.every(node => node.parentId === partialTree.nodeId));

nit: Some second arguments describing the assertions would be helpful. Actually, if thrown, these should give rather clear failures for sheriffs/bugzilla, yeah?
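
For example (message text is just illustrative):

    ok(isArray(secondResponse.nodes),
       "secondResponse.nodes should be an array");
    ok(secondResponse.nodes.every(node => node.parentId === partialTree.nodeId),
       "every returned node should have the expanded node as its parent");
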
Attachment #8694995 - Flags: review?(jsantell) → review+
Thanks for the review!

(In reply to Jordan Santell [:jsantell] [@jsantell] (Please needinfo) from comment #4)
> Comment on attachment 8694995 [details] [diff] [review]
> Add support for computing and traversing dominator trees to
> HeapAnalysisWorker
> 
> Review of attachment 8694995 [details] [diff] [review]:
> -----------------------------------------------------------------
> 
> ::: devtools/shared/heapsnapshot/DominatorTreeNode.js
> @@ +19,5 @@
> > +  // The id of this node's parent or undefined if this node is the root.
> > +  this.parentId = undefined;
> > +
> > +  // An array of immediately dominated child `DominatorTreeNode`s, or undefined.
> > +  this.children = undefined;
> 
> Would it be better for these properties to be defined on the prototype, like
> `children: []`? Granted, it looks like `undefined` children has special
> meaning distinct from an empty array. Still, creating instances that have a
> null prototype, but static methods, with the constructor just assigning
> properties seems strange. What are the advantages over plain objects? Maybe
> these instances could have some assertions on their data in that case? Just
> some thoughts.

Putting them all in the constructor helps SpiderMonkey create a uniform shape for the instances. Putting the properties on the prototype actually hurts, because then SM has to generate JIT code for both the case where the property doesn't exist on `this` and the prototype chain must be traversed, and the case where it does exist on the instance as normal. With the current approach, the only case that needs to be handled is the property being on the instance itself.
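
In other words, the constructor assigns every property unconditionally and in a fixed order, so all instances share one shape. A simplified sketch of the pattern (property names beyond those shown in the patch, and the default values, are illustrative):

    function DominatorTreeNode(nodeId, retainedSize) {
      // Assigning all properties here, in the same order for every instance,
      // gives them all the same hidden class/shape in SpiderMonkey.
      this.nodeId = nodeId;
      this.retainedSize = retainedSize;

      // The id of this node's parent, or undefined if this node is the root.
      this.parentId = undefined;

      // An array of immediately dominated child DominatorTreeNodes, or undefined.
      this.children = undefined;

      // Whether children exist that have not been loaded yet.
      this.moreChildrenAvailable = true;
    }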

> ::: devtools/shared/heapsnapshot/HeapAnalysesClient.js
> @@ +61,5 @@
> > + *        The unix timestamp of the creation time of the snapshot, or null if
> > + *        snapshot does not exist.
> > + */
> > +HeapAnalysesClient.prototype.getCreationTime = function (snapshotFilePath) {
> > +  return this._worker.performTask("getCreationTime", snapshotFilePath);
> 
> This is already landed, this probably needs a rebase

I just moved it so that related functions are grouped together.

> @@ +157,5 @@
> > +    const end = start + count;
> > +
> > +    const nodes = childIds
> > +      .slice(start, end)
> > +      .map(id => {
> 
> Wonder if it'd be more performant if, instead of a slice + map, we just did
> a reduce (or even just a map), skipping over indices > `end`

I'm not too worried about it because we've bounded the number of nodes with the incremental loading, so this won't ever be very big.

> :::
> devtools/shared/heapsnapshot/tests/unit/
> test_HeapAnalyses_getDominatorTree_01.js
> @@ +6,5 @@
> > +function run_test() {
> > +  run_next_test();
> > +}
> > +
> > +function isArray(obj) {
> 
> Can just use `Array.isArray`, unless there's another reason for this (in
> which case, comments)

FYI, Array.isArray is nonstandard and slated for removal from SpiderMonkey.
> FYI, Array.isArray is nonstandard and slated for removal from SpiderMonkey.

Woops, nvm: was thinking of Array.reduce et al
https://hg.mozilla.org/mozilla-central/rev/6e27e28a8dfa
Status: ASSIGNED → RESOLVED
Closed: 9 years ago
Resolution: --- → FIXED
Target Milestone: --- → Firefox 45
Product: Firefox → DevTools