
css - Choosing efficient selectors based on computational complexity

Given the specifics of CSS processing, most notably right-to-left (RTL) selector matching, how should selectors be written purely from the perspective of rendering-engine performance?

This should cover general aspects as well as the use or avoidance of pseudo-classes, pseudo-elements and relationship selectors.



1 Answer


At runtime an HTML document is parsed into a DOM tree containing N elements with an average depth D, and a total of S CSS rules are applied from the stylesheets.

  1. Elements' styles are applied individually, meaning there is a direct relationship between N and overall complexity. It is worth noting that this can be somewhat offset by browser optimizations such as reference caching and recycling of computed styles from identical elements. For instance, the following list items will have the same CSS properties applied (assuming no pseudo-classes such as :nth-child are applied):

    <ul class="sample">
      <li>one</li>
      <li>two</li>
      <li>three</li>
    </ul>
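
    For illustration, a rule like the sketch below (reusing the .sample class from the markup above; the declarations are arbitrary) can have its computed style resolved once and shared by all three items, whereas a positional pseudo-class forces per-element evaluation:

    /* shareable: every li in .sample resolves to the identical computed style */
    .sample li {
      color: #333;
      padding: 4px 0;
    }

    /* not shareable: the match depends on each element's position */
    .sample li:nth-child(2) {
      font-weight: bold;
    }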
    
  2. Selectors are matched right to left to determine rule eligibility - i.e. if the right-most key does not match a particular element, there is no need to process the selector further and it is discarded. This means the right-most key should match as few elements as possible. In the example below, the p key will match far more elements, including paragraphs outside of the target container (which, of course, will not have the rule applied but will still cause extra eligibility checks for that particular selector):

    .custom-container p {}          /* right-most key "p" matches every paragraph on the page */
    .container .custom-paragraph {} /* right-most key matches only elements carrying that class */
    
  3. Relationship selectors: the descendant selector requires up to D elements to be traversed. For instance, successfully matching .container .content may take only one step if the elements are in a parent-child relationship, but the DOM tree may need to be walked all the way up to html before an element can be confirmed a mismatch and the rule safely discarded. This applies to chained descendant selectors as well, with some allowances.

    On the other hand, a > child selector, an + adjacent-sibling selector or :first-child still requires one additional element to be evaluated, but only has an implied depth of one and will never require further tree traversal.
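
    To make the difference concrete, here is a sketch reusing the selector from above, with a > variant added for contrast:

    /* descendant: a mismatch may require walking ancestors all the way up to html */
    .container .content {}

    /* child: only the immediate parent is ever inspected */
    .container > .content {}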

  4. The way pseudo-elements such as :before and :after are defined implies they are not part of the RTL paradigm. The logic behind the assumption is that there is no pseudo-element per se until a rule instructs it to be inserted before or after an element's content (which in turn requires extra DOM work, but no additional computation is required to match the selector itself).
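
    A minimal sketch (the .note class is made up): the rule generates a box in front of the element's content, which costs extra DOM and render-tree work, but matching the selector is no more expensive than matching .note on its own:

    .note:before {
      content: "Note: ";
      font-weight: bold;
    }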

  5. I couldn't find any information on pseudo-classes such as :nth-child() or :disabled. Verifying an element's state would require additional computation, but from the rule-parsing perspective it would only make sense for them to be excluded from RTL processing.
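
    As an illustrative sketch (class names made up), each rule below still has to verify the element's position or state on top of matching the right-most key:

    .menu li:nth-child(odd) {}   /* position within the parent must be computed */
    form input:disabled {}       /* the disabled state must be checked */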

Given these relationships, the overall computational complexity O(N*D*S) should be lowered primarily by minimizing the depth of CSS selectors and by addressing point 2 above. This results in quantifiably stronger improvements than minimizing the number of CSS rules or HTML elements alone.^

Shallow, preferably one-level, specific selectors are processed faster. Google takes this to a whole new level (programmatically, not by hand!): for instance, there is rarely a three-key selector, and most of the rules in its search results look like:

#gb {}
#gbz, #gbg {}
#gbz {}
#gbg {}
#gbs {}
.gbto #gbs {}
#gbx3, #gbx4 {}
#gbx3 {}
#gbx4 {}
/*...*/
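
A rough illustration of the same principle (class names invented for this sketch): rather than describing the markup structure in the selector, a dedicated class keeps the rule a single key deep:

/* deep and structure-dependent: several candidate keys to check */
.header .nav ul li a {}

/* shallow: a single specific key */
.nav-link {}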

^ - While this is true from a rendering-engine performance standpoint, there are always additional factors such as traffic overhead, DOM parsing, etc.

Sources: 1 2 3 4 5

