
Custom Element definition ordering can cause difficult-to-find bugs. #668

Closed
trusktr opened this issue Sep 19, 2017 · 19 comments

Comments

@trusktr
Contributor

trusktr commented Sep 19, 2017

What are the recommended ways of dealing with load order issues? For example, take these two fiddles:

One works (outputs true) and one doesn't (outputs false) for reasons that might not be very obvious to people learning how to use web technology. New programmers shouldn't have to deal with these sorts of problems if they can be avoided by the internal implementation.

A simple deferral of the construction of Custom Elements to a future microtask would likely solve the problem without requiring Custom Element authors to write defensive deferral code themselves.

Note that Custom Element authors cannot determine in which order users will load their scripts, or whether elements will already be in the DOM before those scripts are loaded, and this forces unnecessary, increasingly complex defensive code to be written.

Here's the same example, in the other two possible permutations:

And in those last two permutations, the ordering doesn't help at all! It's false both ways!

Here's a defensive code example that always works, in all four permutations:

And, unfortunately, it only works in two of the four permutations when using microtasks! See:

The best solution in userland probably involves customElements.whenDefined, but it's still defensive coding.
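
For example, a minimal sketch of that kind of defensive coding (the element and class names here are placeholders, not taken from the fiddles) might look like:

connectedCallback() {
	// Defensive: wait until the child's definition has been registered
	// before inspecting the child.
	customElements.whenDefined('some-custom-element').then(() => {
		console.log(this.firstChild instanceof SomeCustomElement) // true
	})
}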


The HTML engine could, at least, guarantee that if these elements are defined synchronously within the same macrotask, then the result will be true in all permutations. This would make things easier for people, especially new people who are writing code and letting other mechanisms place their code into a page in an arbitrary order that they aren't in control of.

Would it be possible to fix this on the browser engine side by not upgrading elements until a later microtask, similar to how MutationObserver callbacks run in a microtask?

@trusktr trusktr changed the title Custom Element definition order issues. Custom Element definition ordering can cause difficult-to-find bugs. Sep 19, 2017
@rniwa
Collaborator

rniwa commented Sep 19, 2017

I think it would be too risky to make customElements.define delay upgrading its elements until the end of the current microtask at this point.

@trusktr
Contributor Author

trusktr commented Oct 8, 2017

I'm still battling with this. This one gives true:

connectedCallback() {
	setTimeout(() => {
		this.firstChild instanceof SomeCustomElement // TRUE
	}, 0)
}

but this one gives false:

connectedCallback() {
	Promise.resolve().then(() => {
		this.firstChild instanceof SomeCustomElement // FALSE
	})
}

The timing of element upgrades is too tricky to work with (e.g. #671). I can't reliably detect upgraded child elements in a microtask, which means I can't reliably do some rendering in the next frame, because setTimeout(..., 0) doesn't guarantee the logic will fire before the next frame. This results in janky initial rendering, because some elements are upgraded but other upgraded elements are not caught until after the next frame.

IMO such logic-deferral hacking should never be needed by a CE author for detecting children. In React, for example, this.props.children is always guaranteed to show you the component's children without fail.

@trusktr
Contributor Author

trusktr commented Oct 8, 2017

The problem in my last comment occurs because elements are upgraded during DOM parsing, so an element's connectedCallback executes before the parser reaches the next element. When the parser is on an element and calls its connectedCallback, this.childNodes.length of the currently-being-parsed element is 0 (and innerHTML is "") even though it will not be 0 as soon as the parser continues.
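
A minimal sketch of that parse-order behavior (assuming the definition script runs before the parser reaches the element; the element names are placeholders):

// Assume this markup is part of the initial HTML document:
//   <parent-el>
//       <child-el></child-el>
//   </parent-el>
customElements.define('parent-el', class extends HTMLElement {
	connectedCallback() {
		// Fires as soon as the parser inserts <parent-el>, before the
		// parser has reached <child-el>:
		console.log(this.childNodes.length) // 0 during initial parsing
		console.log(this.innerHTML)         // "" during initial parsing
	}
})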

Making shareable reusable components currently involves tricky and difficult timing issues. Ideally it would be as easy as with React or Vue, so that anyone (especially newcomers to the web) can start making components without friction.

@trusktr
Contributor Author

trusktr commented Oct 8, 2017

Hmmm, but I thought parsing was synchronous? If it were, then it shouldn't make a difference whether I use setTimeout or Promise.resolve, because both would fire after parsing. Is that right?

@trusktr
Contributor Author

trusktr commented Oct 8, 2017

Actually, with native CE v1 in Chrome, innerHTML is treated atomically (even if innerHTML is used during initial DOM parsing to create elements, rather than the elements being part of the initial HTML!), so that childNodes.length is not 0 and anyElement.innerHTML is not empty by the time the CE lifecycle methods are called! Basically, CE lifecycle callbacks are not triggered until the DOM created from innerHTML is ready. This is inconsistent with initial DOM parsing.
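
A minimal sketch of that innerHTML behavior (based on the observation above; the element name is a placeholder):

customElements.define('outer-el', class extends HTMLElement {
	connectedCallback() {
		// When this subtree is created via innerHTML, the children already
		// exist by the time this callback runs (per the observation above).
		console.log(this.childNodes.length) // not 0
	}
})

const container = document.createElement('div')
document.body.appendChild(container)
container.innerHTML = '<outer-el><span>child</span></outer-el>'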

@trusktr
Contributor Author

trusktr commented Oct 8, 2017

Alright, so if I wait for the document to be ready, it seems I can make more of my particular cases work:

connectedCallback() {
	documentReady().then(() => {
		this.firstChild instanceof SomeCustomElement // TRUE
	})
}

but now I've introduced the ability for users of my elements to add/remove elements to/from the DOM before the document is ready, and who knows what bugs this could cause in my elements. There are just too many unknown cases. ¯\_(ツ)_/¯
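
For reference, documentReady above is not a built-in; a minimal sketch of such a helper, assuming it simply waits for DOMContentLoaded, might be:

// Hypothetical documentReady() helper (not shown in the original comments):
// resolves once DOMContentLoaded has fired, or immediately if parsing is done.
function documentReady() {
	return new Promise(resolve => {
		if (document.readyState !== 'loading') resolve()
		else document.addEventListener('DOMContentLoaded', () => resolve(), { once: true })
	})
}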

@trusktr
Contributor Author

trusktr commented Oct 8, 2017

I really think that making any parsing atomic as far as CE is concerned, plus a general guarantee on CE lifecycle ordering (across whole HTML trees, not just for individual elements), would lead to programs that are more reliable, let alone easier to write.

@trusktr
Contributor Author

trusktr commented Oct 8, 2017

To depict the situation, the following example shows all the actual test cases that I'm testing (I comment them all out except for one at a time, for now).

https://gist.github.com/trusktr/725acea2e839330400aaa791aa9fbbcc

They're labeled as WORKS or BROKEN. I've got just a few broken ones left to fix. The ones that work show visual output like

[screenshot taken 2017-10-08 showing the visual output of the working test cases]

@trusktr
Contributor Author

trusktr commented Oct 8, 2017

Many more of my cases were BROKEN before.

This shows the nature of CE timing complexity when elements need to be aware of their (upgraded) children or parents.

@trusktr
Contributor Author

trusktr commented Oct 8, 2017

Lastly, for reference, my actual connectedCallback looks something like the following, in order to do something with the children of the element whose connectedCallback fired:

connectedCallback() {

    //setTimeout(() => {
    //Promise.resolve().then(() => {
    documentReady().then(() => { // seems to work best

        const children = this.childNodes

        for (let l=children.length, i=0; i<l; i+=1) {
            // do something with child of certain (upgraded) type
        }

    })
    //}, 0)
}

@rniwa
Collaborator

rniwa commented Oct 8, 2017

You really can't rely on your children being upgraded like that. The HTML parser can add more children as it parses, and there's no way to know when that happens. It could be before or after setTimeout(..., 0). FWIW, it could be being blocked by the network!

See #619

@rniwa rniwa added the question label Oct 8, 2017
@rniwa
Collaborator

rniwa commented Oct 8, 2017

I don't think there's anything to be done in the spec here.

@rniwa rniwa closed this as completed Oct 8, 2017
@wiredearp

According to https://bugs.chromium.org/p/chromium/issues/detail?id=658119#c4, there was a proposal for adding an "end tag parsed" callback to the spec. It seems to me that such a thing would clear this up. Let's call it readyCallback to stick with the jQuery terminology.

  /**
   * Fires when the local DOM subtree has been fully parsed
   * and, if we are lucky, all nested CEs are fully upgraded :)
   */
  readyCallback() {
    if(this.querySelector('form')) {
      this.classList.add('interactive');
    }
  }

Note that the example use case doesn't involve nested custom elements, just plain elements, because I think those are important to keep in mind.

I haven't been far enough down this rabbit hole to tell whether it would solve everything, because I abandoned my plan of basing our custom hacked framework on Custom Elements as soon as this issue became apparent. I think it is a real problem. It could also be fixed in another spec if the DOMContentLoaded event were rigged to fire on all nodes on which this particular listener was added.

  connectedCallback() {
    this.addEventListener('DOMContentLoaded', () => {
      if(this.querySelector('form')) {
        this.classList.add('interactive');
      }
    });
  }

In any case, the spec does need to come up with a way for custom elements to synchronously evaluate the descendant subtree before switching to Mutation Observers to monitor future updates, or at least it needs to formalize the use of setTimeout as a workaround for anyone who plans to build something moderately complicated. Perhaps I have overlooked something, but issues such as this make me convinced that it is really not possible.
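
A minimal sketch of that pattern (evaluate the existing subtree once, then switch to a MutationObserver for future updates; the element name and the use of DOMContentLoaded as the "subtree ready" signal are assumptions, not part of any proposal):

customElements.define('my-panel', class extends HTMLElement {
  connectedCallback() {
    const evaluate = () => {
      if (this.querySelector('form')) this.classList.add('interactive');
    };
    const start = () => {
      evaluate(); // handle the descendants that exist by now...
      // ...then monitor future updates with a MutationObserver.
      new MutationObserver(evaluate)
        .observe(this, { childList: true, subtree: true });
    };
    if (document.readyState !== 'loading') start();
    else document.addEventListener('DOMContentLoaded', start, { once: true });
  }
})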

@wiredearp

wiredearp commented Oct 8, 2017

Right. Upon reading #668 (comment) again, it sounds to me like the issue is actually fixed in the native implementation and that perhaps this appears worse than it really is because the polyfill uses Mutation Observers for some kind of asynchronous implementation?

In that case, the solution is easy: just wait for DOMContentLoaded before you do anything, including the registration of Custom Elements. I personally believe that all scripting should be suspended until this moment in time, because anything that goes before is just confusing and will only lead to the event being fired later, which in turn sustains the need to perform scripting before the event happens. It is, in other words, an arms race. Anyway, during the initial parse the situation (as I imagine it) is equivalent to this:

  <ul>
    <script>alert(document.querySelector('ul').innerHTML)</script>
    <li>One</li>
    <li>Two</li>
    <li>Three</li>
  </ul>

This will alert the <script> tag without the <li> elements, simply because the parser hasn't found them yet, so it is not a problem that is unique to Custom Elements; it is just unfortunate that the upgrade process can happen during this stage of parsing. Perhaps the suggestion should be to postpone the upgrade to DOMContentLoaded? My own findings were based on the very first native implementation (v0) in Firefox and I am not in a position to verify whether it is indeed fixed nowadays; it does sound like a problem with the polyfill, but upgrades during initial parsing do seem bound to be permanently confusing.

@trusktr
Contributor Author

trusktr commented Oct 8, 2017

@rniwa If we don't add anything to the spec, then as @wiredearp suggested, many authors will need to similarly wrap their callback logic in document ready handlers, which is exactly what I've done in my above examples.

Many people are going to do something like this, using jQuery:

connectedCallback() {
  $(document).ready(() => {
    doSomethingWithUpgradedChildren(this.children)
  })
}

FWIW, it could be being blocked by the network!

I'd rather wait for the closing tag of an element, even if that means waiting for the network, and therefore for inner element constructors and connectedCallbacks to ALWAYS fire in the same order.

This will simply make programs predictable and reliable.

Just imagine the trouble React and Vue would cause if they ran life cycle methods in arbitrary ordering depending on this or that. It'd be a tsunami of issues flooding GitHub.

Most people these days are not blocked by the network when fetching DOM. At least, it's not a significant problem on any sites I visit, and the time it takes is always fast enough that I don't notice any problems.

@trusktr
Contributor Author

trusktr commented Oct 8, 2017

@rniwa My overall sentiment: consistency (even if slower due to the network) would be much more appreciated by many devs (even if the appreciation is unknown to them, because consistency won't introduce problems for them) than inconsistent behavior that will introduce bugs into many devs' custom elements before they realize it and have already shipped things to production. The current behavior is just unexpected land mines.

Are you sure that you are okay with this?

@rniwa
Collaborator

rniwa commented Oct 9, 2017

Either this is a duplicate of #619 or it's just a question. I don't think there's anything new we haven't thought of.

I agree what you're trying to do is tricky but that's what #619 is for. If you'd like to see that API being spec'ed then we need to keep the discussion in that issue instead of creating new issues to discuss very similar use cases.

trusktr added a commit to lume/lume that referenced this issue Oct 9, 2017
…rocess, we fixed many cases of script load order issues, so now we can define elements in any order agnostic to when/how the elements are put into the DOM. This was painful, see WICG/webcomponents#668 for details on the type of stuff that needed to be dealt with.

We also broke IE 11, and some demos in Edge, but we'll first refactor everything and get that working in Chrome then fix it up again in a following commit.
@trusktr
Contributor Author

trusktr commented Oct 9, 2017

@wiredearp

it sounds to me like the issue is actually fixed in the native implementation and that perhaps this appears worse than it really is because the polyfill uses Mutation Observers for some kind of asynchronous implementation?

All my observations were based on Chrome's native implementation: the initial DOM parse is not atomic (connectedCallbacks fire from parent to child, down the tree, and children are only visible after a deferral), while innerHTML is atomic (connectedCallbacks fire from child to parent, up the tree, and children are visible without a deferral).

@trusktr
Contributor Author

trusktr commented Oct 9, 2017

@rniwa

duplicate of #619

Might be. If #619's childrenChangedCallback fires even for children that already exist when the element is upgraded (and those children have already been upgraded), then that will cover the problem here; otherwise it won't.
