Inconsistent “Performance” Metrics (FCP/LCP/SI) Reported by PageSpeed Insights

Understanding the Inconsistencies in PageSpeed Insights Performance Metrics for Your WordPress Site

Achieving top-tier website performance is a continuous journey, especially when aiming for a perfect 100 in all four PageSpeed Insights categories (a "4×100") on both desktop and mobile. If you optimize WordPress sites, you know how challenging it is to keep PageSpeed Insights (PSI) results consistent, particularly when scores fluctuate significantly within short timeframes.

Recognizing and Addressing Fluctuating PSI Performance Scores

Recently, I embarked on a mission to perfect my custom frontend setup, striving for a 100/100 score in PSI's Performance category. After weeks of incremental improvements (refining CSS inlining, enabling HTTP response compression, adding preload hints for critical assets like hero images and fonts, and implementing strategic JavaScript preload sequences), I managed to hit a consistent 4×100 on desktop and nearly so on mobile. Most mobile runs landed at 99/100, which was good enough, but sporadically PSI scores plummeted to 79/100 or below, even when testing the same page source minutes apart.
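For context, here is a minimal sketch of what that kind of head markup can look like. The file names and paths are purely illustrative, not the actual assets from my setup:

```html
<head>
  <!-- Critical CSS inlined at build time, so first paint doesn't wait on a stylesheet request -->
  <style>/* above-the-fold rules go here */</style>

  <!-- Preload hints for the LCP hero image and the primary webfont (illustrative paths) -->
  <link rel="preload" as="image" href="/assets/hero.webp" fetchpriority="high">
  <link rel="preload" as="font" href="/assets/body-font.woff2" type="font/woff2" crossorigin>

  <!-- Defer non-critical JavaScript so it doesn't block HTML parsing -->
  <script src="/assets/app.js" defer></script>
</head>
```

Note that font preloads require the crossorigin attribute even for same-origin fonts; omitting it causes the preloaded response to be discarded and fetched again.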

This variability was perplexing. I meticulously examined server configuration, caching mechanisms, and network throttling settings; everything aligned with best practices. I even added LogRocket to monitor runtime behavior during PSI tests, then removed it after noticing that console errors seemed to influence PSI's evaluation. In its place, I adopted comprehensive frontend logging, emitting every event of interest as a console error so it would surface in PSI reports for scrutiny. Despite all these measures, the inconsistency persisted, which was deeply frustrating.
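A minimal sketch of that style of instrumentation follows; the specific events captured here are assumptions, since the post doesn't show the actual code. The idea is simply to route everything through console.error so it shows up in the console-message data PSI collects:

```html
<script>
  // Hypothetical logging shim: report lifecycle milestones as console errors
  // so they surface in the console output Lighthouse/PSI records.
  const log = (label) =>
    console.error(`[perf-log] ${label} at ${Math.round(performance.now())} ms`);

  document.addEventListener('DOMContentLoaded', () => log('DOMContentLoaded'));
  window.addEventListener('load', () => log('window.onload'));
  window.addEventListener('error', (e) => console.error('[perf-log] runtime error:', e.message));
</script>
```

One caveat, consistent with the LogRocket episode above: console errors are themselves flagged by Lighthouse's Best Practices audit, so this kind of logging is for temporary diagnosis only.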

Why Do PSI Scores Vary So Drastically?

One key observation: Lighthouse audits run locally under controlled conditions consistently produce perfect scores, while PSI's performance metrics oscillate unpredictably. My local tests used "Slow 4G" network throttling and "Low-tier mobile" CPU throttling, matching the official calibration, so they should provide a stable baseline for comparison.

My working hypothesis points to the user-consent script (CookieYes) and analytics (GA4). These scripts, although essential for compliance and tracking, might interfere with the measured metrics, possibly delaying or shifting key events such as First Contentful Paint (FCP) and Largest Contentful Paint (LCP). Yet my logs show window.onload firing within roughly half a second and lazy-loaded assets finishing early, which makes the delayed metrics all the more puzzling.
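One way to check what the page itself reports for those paint events, independently of PSI, is the standard PerformanceObserver API. This is a generic sketch, not code from my setup:

```html
<script>
  // Observe paint entries (first-paint, first-contentful-paint);
  // 'buffered' replays entries that fired before the observer registered.
  new PerformanceObserver((list) => {
    for (const entry of list.getEntries()) {
      console.error(`[perf-log] ${entry.name}: ${Math.round(entry.startTime)} ms`);
    }
  }).observe({ type: 'paint', buffered: true });

  // LCP arrives as a series of candidates; the last entry observed
  // before user input is the final LCP value.
  new PerformanceObserver((list) => {
    const entries = list.getEntries();
    const last = entries[entries.length - 1];
    console.error(`[perf-log] largest-contentful-paint: ${Math.round(last.startTime)} ms`);
  }).observe({ type: 'largest-contentful-paint', buffered: true });
</script>
```

If the values logged here stay low while PSI reports high FCP/LCP, that points at the lab environment (or third-party script timing within it) rather than the page's own rendering path.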

Examining the Evidence

  • Performance metrics variability: two PSI tests run five minutes apart, against identical source code and with caches pre-warmed, returned drastically different Performance scores (for example, 99/100 vs. 79/100 on mobile).
