What this article covers: The three current Core Web Vitals metrics and their 2026 thresholds; the most common failure patterns on London business websites; the fixes that move the needle; and what CWV scores mean in the context of a broader Technical SEO Audit.
Core Web Vitals became a Google ranking signal in June 2021. The metrics have since been updated (First Input Delay was replaced by Interaction to Next Paint in March 2024), and Google's Search Relations team has repeatedly confirmed that page experience remains part of Google's ranking systems. For London business websites competing in local and national verticals, CWV scores are a measurable, improvable ranking factor.
The practical question for most London SMEs is not whether CWV matters — it does — but which specific issues on their website are causing poor scores, and which fixes will have the greatest impact on rankings relative to the effort required. The answer varies by site, but the failure patterns are consistent enough that a structured audit can identify the highest-priority fixes within a few hours.
The Three Core Web Vitals: Thresholds and What They Measure
LCP: Largest Contentful Paint
Measures how long it takes for the largest visible element on the page to render. For most business websites, this is the hero image or the main heading block. The 'good' threshold is 2.5 seconds or less.
INP: Interaction to Next Paint
Measures the responsiveness of the page to user interactions throughout the session; it replaced FID in March 2024. Poor INP is almost always caused by long JavaScript tasks blocking the main thread. The 'good' threshold is 200 milliseconds or less.
CLS: Cumulative Layout Shift
Measures visual stability: how much the page layout shifts unexpectedly as it loads. A high CLS score means elements are moving around as the page renders, which is disorienting for users and counted directly in Google's field data. The 'good' threshold is 0.1 or less.
The Most Common CWV Failures on London Business Websites
Across technical audits of London business websites, four failure patterns account for the majority of poor Core Web Vitals scores. Understanding these patterns makes the audit process faster and the prioritisation clearer.
Unoptimised hero images are the single most common cause of poor LCP scores. A hero image that is 2–4MB in JPEG format, served without a CDN, and loaded without a preload hint will produce an LCP of 4–8 seconds on a mobile connection. Converting the image to WebP, compressing it to under 200KB, and adding a preload hint will typically bring LCP under the 2.5-second 'good' threshold without any other changes.
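A minimal sketch of what those fixes look like in markup (the file paths and dimensions are illustrative, not taken from a specific site):

```html
<head>
  <!-- Preload hint: tells the browser to fetch the hero image early,
       before the parser discovers it in the body -->
  <link rel="preload" as="image" href="/images/hero.webp" fetchpriority="high">
</head>
<body>
  <!-- WebP with a JPEG fallback for older browsers; hero.webp is the
       compressed (under-200KB) version of the original JPEG -->
  <picture>
    <source srcset="/images/hero.webp" type="image/webp">
    <img src="/images/hero.jpg" alt="Hero" width="1200" height="600">
  </picture>
</body>
```

The explicit width and height on the fallback img also protect CLS, since the browser can reserve the space before the image arrives.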
Third-party script accumulation is the most common cause of poor INP scores. Every chat widget, marketing pixel, analytics tag, and social sharing button added to a website increases the JavaScript load on the main thread. A website with 8–12 third-party scripts will typically have an INP score in the 300–600ms range. Auditing and removing scripts that are not actively used — a task that takes 30–60 minutes — will often move the INP score into the 'good' range without any development work.
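For scripts that survive the audit but are not needed at page load (a chat widget is the classic case), one common pattern is to defer loading until the first user interaction so the main thread is free during the initial render. A hedged sketch, with a placeholder widget URL:

```html
<!-- Load the chat widget on first interaction rather than on page load.
     The widget URL below is a placeholder, not a real endpoint. -->
<script>
  var chatLoaded = false;
  function loadChat() {
    if (chatLoaded) return; // guard: scroll and click may both fire
    chatLoaded = true;
    var s = document.createElement('script');
    s.src = 'https://chat.example.com/widget.js'; // placeholder URL
    s.async = true;
    document.head.appendChild(s);
  }
  ['scroll', 'click', 'keydown'].forEach(function (ev) {
    window.addEventListener(ev, loadChat, { once: true, passive: true });
  });
</script>
```

The trade-off is a short delay before the widget appears, which is usually acceptable for tools the visitor only reaches for mid-session.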
Missing image dimensions are the most common cause of poor CLS scores. When an image loads without explicit width and height attributes, the browser doesn't know how much space to reserve for it. As the image loads, the surrounding content shifts to accommodate it — producing a layout shift that is visible to users and measured by Google. Adding width and height attributes to all images is a straightforward fix that eliminates this class of CLS issue entirely.
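The fix is one attribute pair per image (the file name here is illustrative):

```html
<!-- Without width/height the browser reserves no space and the content
     below shifts when the image arrives. With them, it reserves a
     correctly proportioned box up front. -->
<img src="/images/team-photo.jpg" alt="The team" width="800" height="600">
```

Modern browsers derive the image's aspect ratio from these attributes, so responsive CSS such as `width: 100%; height: auto;` still works: the attributes set the ratio of the reserved box, not a fixed display size.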
WordPress plugin bloat is a compounding factor across all three metrics. Many London business websites are built on WordPress with 20–40 active plugins, each adding CSS and JavaScript to every page load regardless of whether those assets are needed on that page. A plugin audit — identifying and deactivating plugins that are not actively used, and replacing multi-function plugins with leaner alternatives — is one of the highest-ROI technical SEO interventions for WordPress sites.
The 48-Hour CWV Fix: A London Ecommerce Case Pattern
A recurring pattern in technical audits of London ecommerce websites: the site has a PageSpeed Insights score of 28–35 on mobile, a Google Search Console CWV report showing 'Poor URLs' for the majority of product pages, and a ranking plateau at positions 8–15 for the primary commercial terms. The client has been told by their web developer that improving the score would require a full site rebuild.
The audit consistently finds that 60–70% of the score improvement is achievable without a rebuild. The highest-impact interventions are: converting product images from JPEG to WebP (typically a 40–60% file size reduction), adding lazy loading to below-the-fold images, removing 3–5 third-party scripts that are either unused or duplicated, and setting explicit dimensions on all product images. These changes take 4–8 hours to implement and typically move the PageSpeed score from 28–35 to 55–70 on mobile.
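The lazy-loading step needs no JavaScript library; the native loading attribute is enough. A sketch with placeholder product images:

```html
<!-- Above the fold: load eagerly, at high priority -->
<img src="/images/product-1.webp" alt="Product 1"
     width="600" height="600" fetchpriority="high">

<!-- Below the fold: native lazy loading defers the fetch until the
     image approaches the viewport -->
<img src="/images/product-24.webp" alt="Product 24"
     width="600" height="600" loading="lazy">
```

One caution: never apply loading="lazy" to the LCP element itself. Lazy-loading the hero or first product image delays the very paint that LCP measures, and it is one of the most common self-inflicted regressions.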
The ranking impact is not immediate. Google's field data is aggregated over a 28-day rolling window, so it can take up to 28 days after changes are deployed for the improvement to be fully reflected. But the pattern across multiple London ecommerce sites is consistent: a CWV improvement from 'Poor' to 'Good' on the primary product category pages produces a 2–4 position improvement for the target commercial terms within 6–10 weeks. The Technical SEO Audit includes a full CWV assessment with prioritised fix recommendations and implementation guidance.
What CWV Scores Don't Tell You
Core Web Vitals are a technical signal, not a content signal. A website with perfect CWV scores and thin, poorly structured content will not outrank a website with average CWV scores and comprehensive, well-structured content. The relationship between technical performance and rankings is real but bounded — CWV is a tiebreaker, not a primary ranking factor.
The practical implication is that CWV improvements should be pursued alongside, not instead of, content and topical authority work. A website that fixes its CWV scores but doesn't address its content gaps will see modest ranking improvements. A website that builds topical authority through a structured Semantic SEO approach and maintains good CWV scores will see compounding improvements as both signals reinforce each other.
The Technical SEO Audit covers Core Web Vitals as part of a broader technical health assessment — crawlability, indexation, structured data, internal linking, and mobile usability — so the CWV findings are contextualised within the full technical picture rather than treated in isolation.
How to Measure Your CWV Scores
Two tools provide the most reliable CWV data for London business websites. Google Search Console's Core Web Vitals report (under the Experience section) provides field data — measurements from real Chrome users visiting your site — aggregated over a 28-day rolling window. This is the data Google uses for ranking decisions. PageSpeed Insights provides both field data (from the Chrome User Experience Report, where available) and lab data (from Lighthouse, a controlled test environment).
Field data is more reliable for ranking purposes because it reflects actual user conditions — varying device capabilities, network speeds, and geographic locations. Lab data is more useful for debugging because it provides a consistent, reproducible test environment. The correct approach is to use field data to identify which pages have CWV issues, and lab data to diagnose the specific causes and test the impact of fixes before deploying them.
Get a Technical SEO Audit
The Technical SEO Audit covers Core Web Vitals, crawlability, indexation, structured data, and internal linking — with prioritised fix recommendations and implementation guidance. Fixed price at £1,499.
