Technical SEO ensures search engines can efficiently crawl, index, and understand your website. No amount of great content will rank if the technical foundation is broken.
Technical SEO refers to the optimisations made to a website's infrastructure — the code, server configuration, and architecture — to help search engines efficiently crawl, index, and render your content. While on-page SEO focuses on what's visible, technical SEO addresses the underlying mechanics that determine whether search engines can even find and properly process your pages.
Technical SEO is foundational. If Googlebot can't crawl your pages, or your site loads in 8 seconds, or you have duplicate content issues, even the best written content won't achieve its ranking potential. Think of technical SEO as building a solid foundation before decorating the house.
Core Web Vitals are a set of specific metrics Google uses to measure real-world user experience on web pages. They became an official ranking factor in May 2021 and have grown in importance since. There are three primary Core Web Vitals:
- **Largest Contentful Paint (LCP)** measures loading performance — specifically how quickly the largest visible element renders. Target: under 2.5 seconds. Poor: over 4 seconds. Optimise by improving server response times, compressing images, and using a CDN.
- **Interaction to Next Paint (INP)** measures interactivity — how quickly a page responds to user interactions. Target: under 200ms. Poor: over 500ms. Optimise by reducing JavaScript execution time and minimising main-thread work.
- **Cumulative Layout Shift (CLS)** measures visual stability — how much the page layout shifts unexpectedly during loading. Target: under 0.1. Poor: over 0.25. Avoid shifts by specifying image and video dimensions and avoiding dynamically injected content above the fold.

While not an official Core Web Vital, **Time to First Byte (TTFB)** — how quickly the server responds with the first byte — is foundational to all the other metrics. Target: under 800ms. Improve it via better hosting, caching, and CDN implementation.
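The image-related fixes above — fetching the largest element sooner and reserving its space to prevent layout shift — can be sketched in HTML. The file path and dimensions are placeholders:

```html
<!-- Preload the hero image so the browser discovers the LCP element early
     (/images/hero.webp is an illustrative path). -->
<link rel="preload" as="image" href="/images/hero.webp">

<!-- Explicit width/height let the browser reserve space before the image
     loads, preventing layout shift (CLS). -->
<img src="/images/hero.webp" width="1200" height="630" alt="Hero banner">
```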
Before any page can rank, it must first be crawled by Googlebot and then included in Google's index. Crawlability refers to the ease with which search engine bots can access and navigate your website. Indexability refers to whether crawled pages can actually be added to Google's index.
The robots.txt file is a text file at your website's root (e.g., example.com/robots.txt) that provides instructions to web crawlers about which pages they should or shouldn't access. Use it to prevent crawling of admin pages, search result pages, and duplicate content sections. However, robots.txt only prevents crawling — it doesn't prevent indexing of already-known pages.
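A minimal robots.txt along these lines blocks the low-value sections mentioned above; the disallowed paths are placeholders and should match your own site structure:

```text
# robots.txt at example.com/robots.txt — paths are illustrative.
User-agent: *
Disallow: /admin/
Disallow: /search

# Point crawlers at the XML sitemap.
Sitemap: https://example.com/sitemap.xml
```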
An XML sitemap is a file listing all the important URLs on your website, helping search engines discover and understand your content structure. Every website should have a sitemap, submitted via Google Search Console. Include only canonical, indexable URLs in your sitemap. Keep it updated whenever you add or remove pages.
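A minimal XML sitemap following the sitemaps.org protocol looks like this; the URLs and dates are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2026-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/services/</loc>
    <lastmod>2026-01-10</lastmod>
  </url>
</urlset>
```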
Crawl budget is the number of pages Googlebot crawls on your site within a given time frame. For large sites, optimising crawl budget is critical. Block low-value pages (admin, search results, filter combinations) from crawling, improve internal link structure so important pages are discovered more efficiently, and fix crawl errors promptly.
In Google Search Console, open the Coverage (now "Page indexing") report to identify 4xx errors, server errors, and pages excluded from the index. Fix broken links, update redirects, and address any crawl anomalies.
When you move or delete pages, always use 301 redirects (permanent) to forward traffic and link equity to the new location. Avoid redirect chains (A→B→C) — they dilute link equity and slow down crawling.
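Flattening a chain means pointing every legacy URL straight at the final destination rather than hopping through intermediates. A sketch in nginx configuration, with placeholder paths:

```nginx
# Instead of /old-page → /interim-page → /new-page,
# send both legacy URLs directly to the final location in one hop.
location = /old-page     { return 301 /new-page; }
location = /interim-page { return 301 /new-page; }
```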
Duplicate content confuses search engines about which version to rank and dilutes authority. Use canonical tags (rel="canonical") to specify the preferred version of pages. Handle www vs non-www, HTTP vs HTTPS, and trailing slashes consistently with server-level redirects.
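The canonical tag itself is a single line in the page's head; the URL below is a placeholder for whichever version you want to rank:

```html
<!-- In the <head> of every variant (www, trailing slash, tracking
     parameters), point at one preferred URL. Pages should also
     self-reference their own canonical. -->
<link rel="canonical" href="https://example.com/services/">
```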
Google primarily uses the mobile version of your site for indexing and ranking. Ensure your mobile site contains all the same content, structured data, and metadata as your desktop version. Use responsive design rather than separate mobile URLs where possible.
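Responsive design starts with the viewport meta tag, without which mobile browsers render pages at desktop width:

```html
<!-- Size the layout to the device viewport instead of a fixed desktop width. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```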
HTTPS (Hypertext Transfer Protocol Secure) is a confirmed ranking signal. Every website in 2026 must use HTTPS — not just because Google prefers it, but because users trust and expect secure connections. If your site still uses HTTP, migrate to HTTPS immediately using an SSL/TLS certificate (Let's Encrypt offers free certificates).
After migrating to HTTPS, ensure all internal links, sitemap URLs, and canonical tags use the HTTPS version. Set up 301 redirects from HTTP to HTTPS at the server level. Update your Google Search Console property to the HTTPS version.
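The server-level HTTP-to-HTTPS redirect can be sketched as an nginx server block; the domain is a placeholder:

```nginx
# Catch all plain-HTTP requests and 301 them to the HTTPS origin,
# preserving the requested path and query string.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://example.com$request_uri;
}
```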
Page speed is both a ranking factor and a user experience factor. Slow sites lose visitors, conversions, and rankings simultaneously. Key technical optimisations for speed include improving server response times, caching, compressing images, reducing JavaScript execution, and serving assets through a CDN.
Structured data (Schema.org markup) is code added to your HTML that explicitly tells search engines what your content means, not just what it says. Properly implemented structured data can trigger rich results in SERPs, such as star ratings, FAQ expansions, event cards, and recipe information — significantly improving CTR.
Implement structured data using the JSON-LD format (Google's preferred method). Validate your markup using Google's Rich Results Test tool. Key schema types to implement: Organization, WebSite (with SearchAction), Article, BreadcrumbList, FAQPage, HowTo, Product, and LocalBusiness.
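A JSON-LD block sits in the page's head as a script tag; the headline, organisation name, and date below are placeholder values:

```html
<!-- JSON-LD structured data; validate with the Rich Results Test. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "A Guide to Technical SEO",
  "author": { "@type": "Organization", "name": "Example Agency" },
  "datePublished": "2026-01-15"
}
</script>
```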
| Area | Check | Tool |
|---|---|---|
| Indexability | All important pages indexed in Google | Google Search Console |
| Speed | LCP under 2.5s, INP under 200ms, CLS under 0.1 | PageSpeed Insights |
| Mobile | Usable and complete on all devices | Lighthouse / Chrome DevTools |
| Security | HTTPS implemented site-wide | Browser / GSC |
| Crawling | No crawl errors or soft 404s | Google Search Console |
| Sitemap | XML sitemap submitted and valid | GSC / Screaming Frog |
| Robots.txt | Not blocking important pages | Google Search Console |
| Canonicals | All pages have canonical tags | Screaming Frog |
| Redirects | No broken or chained redirects | Screaming Frog |
| Structured Data | Valid schema markup on key pages | Rich Results Test |