Closed Bug 1316347 Opened 3 years ago Closed 3 years ago

Why are HTMLContentElement and HTMLShadowElement unconditionally exposed?

Categories

(Core :: DOM: Core & HTML, defect)


Tracking


RESOLVED FIXED
mozilla52
Tracking Status
firefox52 --- fixed

People

(Reporter: bzbarsky, Assigned: wchen)

References

Details

Attachments

(1 file)

Shouldn't they be conditioned on nsDocument::IsWebComponentsEnabled?
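For context, the exposure is observable from page script, which is why it matters: polyfills and feature-detection code can branch on the presence of these globals. A hypothetical sketch (the variable name is made up; the interface names are the real DOM globals):

```javascript
// Illustrative feature detection a polyfill might run:
var nativeShadowDOMv0 =
    typeof HTMLContentElement !== 'undefined' &&
    typeof HTMLShadowElement !== 'undefined';
// With the interfaces exposed unconditionally, this is true even when
// dom.webcomponents.enabled is false, which can steer a polyfill down
// the "native support" path against a non-functional implementation.
```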
Flags: needinfo?(wchen)
Comment on attachment 8809196 [details] [diff] [review]
Only expose the HTMLShadowElement and HTMLContentElement interfaces when web components are enabled.

You need to fix also at least dom/tests/mochitest/general/test_interfaces.html
Attachment #8809196 - Flags: review?(bugs) → review-
Comment on attachment 8809196 [details] [diff] [review]
Only expose the HTMLShadowElement and HTMLContentElement interfaces when web components are enabled.

Per IRC with smaug, this is r+, but the tests should be changed so that they do not require web components to be enabled by default (bug 1159768).
Attachment #8809196 - Flags: review- → review+
Pushed by wchen@mozilla.com:
https://hg.mozilla.org/integration/mozilla-inbound/rev/53c441f68990
Only expose the HTMLShadowElement and HTMLContentElement interfaces when web components are enabled. r=smaug
https://hg.mozilla.org/mozilla-central/rev/53c441f68990
Status: NEW → RESOLVED
Closed: 3 years ago
Resolution: --- → FIXED
Target Milestone: --- → mozilla52
Just a heads-up, I believe this fix may be causing Google's Experiments page to get stuck in an infinite loop, during a custom animation that calls getDistributedNodes on an element, which now returns an empty array where it previously returned elements. A build from the 11th works; a build from the 12th does not.

I'm not quite sure if this is something to be concerned about beyond Google's experiments page, but I thought I'd raise the concern here.
Flags: needinfo?(bzbarsky)
This change shouldn't have affected the behavior of getDistributedNodes in any way.  All it changed is whether window.HTMLContentElement and window.HTMLShadowElement are defined.

Is the experiment doing "instanceof HTMLContentElement" or something?
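Worth noting: an unguarded instanceof against a global that is no longer defined doesn't just return false, it throws a ReferenceError on the bare identifier lookup. A defensive pattern a page could use (illustrative sketch, not the site's actual code):

```javascript
// Guarded instanceof check: returns false instead of throwing when the
// interface object is not exposed (e.g. the pref is off).
function isContentElement(node) {
  if (typeof HTMLContentElement === 'undefined') {
    return false; // interface not exposed in this build/configuration
  }
  return node instanceof HTMLContentElement;
}
```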
Flags: needinfo?(bzbarsky)
I haven't had a chance to confirm what the precise issue is, but I do know that it appears between the builds mentioned, and the furthest I got was confirming that getDistributedNodes no longer returns the two elements the scripts expect. Is there perhaps another fix that landed that day which might have affected this?
And note that dom.webcomponents.enabled is false, so getDistributedNodes() shouldn't even be reachable to start with by default.

Do you have that preference toggled to a non-default value?  Can you link to a pushlog for your regression range?
Sure, here's a pushlog link: https://hg.mozilla.org/mozilla-central/pushloghtml?fromchange=d38d06f85ef59c5dbb5d4a1a8d895957a78714de&tochange=fc104971a4db41e38808e6412bc32e1900172f14

I did specifically check that the webcomponents about:config prefs were false (indeed, setting them to true breaks the page even further).

You're right, it does seem strange to me as well that this would be an issue, so maybe this is a red herring. But I thought I'd raise the issue here, as I might not be able to investigate further for a few days.
Link to the failing page would be useful.  ;)
I guess https://www.chromeexperiments.com/

I'm not seeing an infinite loop, but I'm seeing the page not render...

It's certainly possible that this is related, because it tries to do some webcomponents polyfill stuff. What's weird is that it works in Safari, which does not expose HTMLShadowElement and HTMLContentElement.

If that's the right URI, please let me know and I'll file a bug to investigate?  Or you file one as needed?
Oh, and if that is the right URI, then we do care about breaking it, in general.  The question is why it's broken at all and whether the right fix is on our side or Google's.
Yes, the page does look like nothing's going on; it's just black, with no obvious messages in the console. But when I turn on the console's CSS message filter, I see a never-ending flood of these:

>Error in parsing value for 'transform'. Declaration dropped.

When I hit the pause button in the debugger (which can show the slow script dialog, and may take a couple of tries), it reveals that this function is being called over and over (I'm showing the un-minified version from https://www.chromeexperiments.com/elements/elements.vulcanized.js in the same server directory):
>        // rather than tween the target directly, we set up a state machine and tween
>        // its values, polling it and updating when values change.
>        function pollState() {
>            if (state.x !== prevState.x) {
>                imageStrip.style.transform = 'translateX(' + state.x.toFixed( 3 ) + 'px)';
>                imageStrip.style.webkitTransform = 'translateX(' + state.x.toFixed( 3 ) + 'px)';
>                prevState.x = state.x;
>            }
> 
>            if (state.index !== prevState.index) {
>                animateToIndex( state.index, easeOutQuad );
>                prevState.index = state.index;
>                self.selectedIndex = state.index;
>            }
> 
>            if (state.needsUpdate) {
>                animateToIndex( state.index, easeOutQuad );
>                state.needsUpdate = false;
>            }
> 
>            return requestAnimationFrame( pollState );
>        }

Using mitmproxy to tinker with the script on the live page, I confirmed that it's being called over and over, and that state.x is NaN after the first call (so state.x.toFixed(3) yields "NaN"). Looking at function animateToIndex:
>        function animateToIndex(i, ease) {
>            if (state.anim) {
>                state.anim.stop();
>            }
>            state.anim = tween( state, 'x', -self.lefts[i], DURATION, ease, nullAnim );
>        }

The self.lefts array is empty, so -self.lefts[i] is -undefined, which evaluates to NaN. And it turns out that self.lefts is defined here:
>Polymer( 'cx-carousel', {
>    /* snip */
>    updatePositions: function() {
>        this.slides = [].slice.call( this.$.kontent.getDistributedNodes() );
>        this.lefts = this.slides.map( function(s) {
>            return s.offsetLeft;
>        } );
>        this.state.needsUpdate = false;
>    },
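Putting the pieces together, the NaN chain is easy to reproduce in isolation: indexing an empty array yields undefined, negating that yields NaN, and NaN.toFixed(3) produces the string "NaN", which makes the transform value invalid. A minimal sketch (illustrative, not the site's code):

```javascript
// Minimal reproduction of the NaN propagation described above:
var lefts = [];                  // getDistributedNodes() returned no elements
var target = -lefts[0];          // -undefined evaluates to NaN
var css = 'translateX(' + target.toFixed(3) + 'px)';
// css is "translateX(NaNpx)": an invalid value, so the
// "Error in parsing value for 'transform'" warning is logged, the
// declaration is dropped, and requestAnimationFrame keeps polling forever.
```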

In working builds, this.slides ends up with two elements. In non-working builds, getDistributedNodes returns no elements. This is as far as I've had time to investigate so far. The definition of getDistributedNodes is in webcomponents.js, v0.6.1 of Polymer's polyfill (https://www.chromeexperiments.com/components/webcomponentsjs/webcomponents.js), and I recalled seeing this bug in the works, so I figured it might be a lead on why the page suddenly broke (especially since it landed when the breakage began).

As I said though, I don't know why it would affect things; it's just my first lead. After all, Nightly builds before the given pushlog worked fine, just as Safari does. Or maybe I'm just going crazy and there's something weird on my end, but I'm testing with a fresh profile, so it would be good to know :)
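As a side note, one page-side mitigation (purely illustrative, a made-up helper, not a suggested Gecko fix) would be to refuse to feed NaN into the tween when the distributed-nodes list comes back empty:

```javascript
// Hypothetical guard for updatePositions-style code: returns null when
// there is no valid offset to animate to, so the caller can skip the tween.
function targetLeft(lefts, i) {
  if (!lefts.length || typeof lefts[i] !== 'number') {
    return null; // nothing distributed yet; avoid -undefined -> NaN
  }
  return -lefts[i];
}
```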
Depends on: 1319255
I filed bug 1319255 on the chromeexperiments issue, like we should have up front.

Might be worth copying whatever useful information we have after comment 6 here.  It's definitely a regression from this change, but it's not clear why still.
Component: DOM → DOM: Core & HTML