Lighthouse’s Limited Use of Interaction to Next Paint (INP) Metric Explained by Google Chrome Developer Advocate Barry Pollard

Google’s Lighthouse does not utilize the Interaction to Next Paint (INP) metric in its standard tests, even though INP is one of the Core Web Vitals.

Barry Pollard, a Web Performance Developer Advocate on Google Chrome, clarified the reasons for this omission and provided insights into measuring INP.

Lighthouse primarily measures page loads rather than interactions. It loads a page once, captures metrics during that load, and can estimate Largest Contentful Paint (LCP) and Cumulative Layout Shift (CLS) under those loading conditions, identifying issues and offering advice on improving them. INP is different, however, because it depends on user interactions.
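
For context, a plain Lighthouse run through the Node API looks roughly like the sketch below (a minimal example assuming the lighthouse and chrome-launcher npm packages; the URL is a placeholder). Because the tool only navigates and never clicks or types, the report includes load-time audits such as LCP and CLS but no INP measurement.

```typescript
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

// Standard navigation audit: Lighthouse loads the page once and never interacts with it.
const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
const result = await lighthouse('https://example.com', {
  port: chrome.port,
  onlyCategories: ['performance'],
});

// Load-time metrics are reported, but there is no INP because nothing was clicked or typed.
console.log(result?.lhr.audits['largest-contentful-paint'].displayValue);
console.log(result?.lhr.audits['cumulative-layout-shift'].displayValue);

await chrome.kill();
```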

Pollard explained:

“The problem is that Lighthouse, like many web performance tools, typically just loads the page and does not interact with it. No interactions = No INP to measure!”

Custom User Flows Enable INP Measurement

While Lighthouse cannot measure INP directly, site owners who know their common user journeys can script them as Lighthouse “user flows” and measure INP that way. Pollard added:

“If you, as a site owner, know your common user journeys, then you can measure these in Lighthouse using ‘user flows,’ which will then measure INP.”
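
Concretely, a user flow wraps each interaction in a “timespan” step, and INP is collected during those timespans. The sketch below assumes the Lighthouse Node API with Puppeteer; the URL, selectors, and step names are hypothetical.

```typescript
import fs from 'node:fs';
import puppeteer from 'puppeteer';
import { startFlow } from 'lighthouse';

const browser = await puppeteer.launch();
const page = await browser.newPage();
const flow = await startFlow(page, { name: 'Search journey' });

// Step 1: a normal navigation, which covers load metrics such as LCP and CLS.
await flow.navigate('https://example.com');

// Step 2: a timespan wrapping the interaction; this is where INP gets measured.
await flow.startTimespan({ name: 'Open search results' });
await page.click('#search-button');            // hypothetical selector
await page.waitForSelector('#search-results'); // hypothetical selector
await flow.endTimespan();

// Keep both a human-readable report and the raw result for later checks.
fs.writeFileSync('flow-report.html', await flow.generateReport());
fs.writeFileSync('flow-result.json', JSON.stringify(await flow.createFlowResult(), null, 2));

await browser.close();
```

The resulting report contains one entry per step, so the INP recorded during “Open search results” sits alongside the load metrics from the navigation step.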

These common user journeys can be automated in a continuous integration environment, enabling developers to test INP on each commit and identify potential regressions.
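
One way to turn that into a CI gate is to save the flow result as JSON (as in the sketch above) and fail the build whenever a step’s INP exceeds a budget. This is only a rough sketch: the file name and the 200 ms budget are assumptions, and the interaction-to-next-paint audit id can differ between Lighthouse versions.

```typescript
import { readFileSync } from 'node:fs';

// Read the flow result saved by the user-flow script above (assumed file name).
const flowResult = JSON.parse(readFileSync('flow-result.json', 'utf8'));

// 200 ms is the commonly cited "good" threshold for INP; adjust the budget as needed.
const INP_BUDGET_MS = 200;

for (const step of flowResult.steps) {
  // Audit id assumed for recent Lighthouse releases; older ones used an "experimental-" prefix.
  const audit = step.lhr.audits['interaction-to-next-paint'];
  if (audit?.numericValue !== undefined && audit.numericValue > INP_BUDGET_MS) {
    console.error(`INP regression in step "${step.name}": ${Math.round(audit.numericValue)} ms`);
    process.exitCode = 1; // fail the CI job
  }
}
```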

Total Blocking Time As An INP Proxy

Although Lighthouse cannot measure INP without interactions, it can evaluate potential causes, particularly long, blocking JavaScript tasks. The Total Blocking Time (TBT) metric is useful here. According to Pollard:

“TBT (Total Blocking Time) measures the sum time of all tasks over 50ms. The theory is:
– Lots of long, blocking tasks = high risk of INP!
– Few long, blocking tasks = low risk of INP!”
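
To make the arithmetic concrete: only the portion of each long task beyond 50 ms counts toward TBT, which Lighthouse sums between First Contentful Paint and Time to Interactive. The toy function below illustrates that per-task sum; it is not how Lighthouse itself computes the metric.

```typescript
// Only the portion of each main-thread task beyond 50 ms counts toward Total Blocking Time.
const BLOCKING_THRESHOLD_MS = 50;

function totalBlockingTime(taskDurationsMs: number[]): number {
  return taskDurationsMs.reduce(
    (total, duration) => total + Math.max(0, duration - BLOCKING_THRESHOLD_MS),
    0,
  );
}

// Tasks of 120 ms, 80 ms and 55 ms contribute 70 + 30 + 5 = 105 ms; a 40 ms task adds nothing.
console.log(totalBlockingTime([120, 80, 55, 40])); // 105
```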

Limitations Of TBT As An INP Substitute

TBT has limitations as a substitute for INP. Pollard noted:

“If you don’t interact during long tasks, then you might not have any INP issues. Also, interactions might load MORE JavaScript that is not measured by Lighthouse.”

He added:

“So it’s a clue, but not a substitute for actually measuring INP.”

Optimizing For Lighthouse Scores vs. User Experience

Some developers focus on optimizing for Lighthouse scores without considering the impact on users. Pollard cautioned against this, stating:

“A common pattern I see is to delay ALL JS until the user interacts with a page: Great for Lighthouse scores! Often terrible for users 😢:
– Sometimes nothing loads until you move the mouse.
– Often your first interaction experiences a bigger delay.”
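
For illustration only, that pattern often looks something like the sketch below (the bundle path and event list are made up). It keeps JavaScript out of the initial load that Lighthouse measures, but the cost shifts to the user’s first interaction.

```typescript
// Sketch of the "delay all JavaScript until interaction" pattern (not a recommendation).
let bundleLoaded = false;

function loadDeferredBundle(): void {
  if (bundleLoaded) return;
  bundleLoaded = true;
  const script = document.createElement('script');
  script.src = '/assets/app.bundle.js'; // placeholder bundle path
  document.head.appendChild(script);
}

// Nothing beyond this stub ships with the initial load, so Lighthouse sees a near-empty page...
['mousemove', 'touchstart', 'keydown', 'scroll'].forEach((eventName) => {
  window.addEventListener(eventName, loadDeferredBundle, { once: true, passive: true });
});

// ...but the first real interaction now waits for the bundle to download, parse, and execute.
```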

Why This Matters

Understanding the relationship between Lighthouse, INP, and TBT is crucial for optimizing user experience. Recognizing the limits of lab-based INP measurement helps avoid ineffective optimizations. Pollard advises focusing on real user interactions to ensure performance improvements genuinely help users. Since INP is a Core Web Vital, understanding its nuances is essential for keeping it within an acceptable threshold.

Practical Applications

To monitor site performance and INP:
1. Use Lighthouse’s “user flows” for INP measurement in common journeys.
2. Automate user flows in CI to monitor INP and catch regressions.
3. Use TBT as an INP proxy, but be aware of its limitations.
4. Prioritize field measurements for accurate INP data (see the sketch after this list).
5. Balance performance optimizations with UX considerations.
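
For the field-measurement step, one common option is Google’s web-vitals JavaScript library, which exposes an onINP callback. A minimal sketch, assuming the web-vitals npm package and a placeholder “/rum” analytics endpoint:

```typescript
import { onINP } from 'web-vitals';

// Report INP from real users; '/rum' is a placeholder analytics endpoint.
onINP((metric) => {
  const body = JSON.stringify({
    name: metric.name,     // 'INP'
    value: metric.value,   // milliseconds
    rating: metric.rating, // 'good' | 'needs-improvement' | 'poor'
  });
  // sendBeacon keeps working while the page unloads, which a plain fetch may not.
  navigator.sendBeacon('/rum', body);
});
```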
