
Site Speed: Impact on SEO and AI Citations

A slow site does not just lose visitors. In 2026, it loses Google rankings, crawl budget, and above all AI citations. Discover how your site speed determines your visibility on all fronts.

Lucie Bernaerts
GEO Expert
27 February 2026
11 min read

Last month, we audited a legal firm's website in Brussels. Clean design, solid copy, decent backlinks. But when we ran the speed test, the homepage took 4.2 seconds to load. The culprit was not what you would expect -- it was not uncompressed images or bloated JavaScript. It was a shared hosting plan at 7 euros a month with a server in Montreal. For a Belgian firm targeting Belgian clients.

Their Google rankings had been slipping for six months. They blamed the algorithm.

It was the server.

[Image: isometric illustration of site speed impact on SEO and AI]
Site speed affects three layers of visibility: Google ranking, crawl frequency, and AI citations

The numbers are blunt

A Portent meta-analysis across 2 million European sessions (2025) found that each additional second of load time cuts conversion by 12%. A 1-second site converts 2.5x more than a 5-second site. Bounce rates jump 32% when load time goes from 1 to 3 seconds.

For rankings specifically, Searchmetrics (Berlin, 2025) found that top-10 Google results in Europe have an average TTFB of 320ms. Page-2 results average 1.2 seconds. That is not a subtle difference.

But here is what most speed guides miss entirely.

Your slow server is making you invisible -- not just slow

Speed is not just about user experience. It directly controls how many of your pages Google even bothers to look at. If your server responds slowly, Googlebot crawls fewer pages per day. Google has confirmed this repeatedly. A slow TTFB does not just make you rank lower -- it makes entire sections of your site disappear from the index.

| Average TTFB | Googlebot crawl rate | AI bot crawl | What happens |
|---|---|---|---|
| < 200ms | Full crawl | High | Fast indexation, frequent AI visits |
| 200-500ms | 80-100% | Medium | Normal -- acceptable for most sites |
| 500ms-1s | 50-80% | Low | Delays start showing up |
| 1-2s | 30-50% | Very low | Pages missing from index |
| > 2s | < 30% | Near zero | Serious visibility loss |

We see this pattern constantly. A client has 500 pages. Google crawls 80 per day because the TTFB is 1.5 seconds. They add 30 new blog posts. Google never indexes them. The client blames content quality. It is infrastructure.
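You do not have to take our word for it: your own access logs show exactly how often Googlebot visits. Here is a minimal sketch in Node.js that counts Googlebot requests per day from a combined-format access log; the file path is a placeholder, so point it at wherever your server actually writes its logs.

```typescript
// Count Googlebot requests per day from a combined-format access log.
// The log path is a placeholder -- adjust it to your own server's log file.
import { readFileSync } from "node:fs";

const log = readFileSync("/var/log/nginx/access.log", "utf8");
const perDay = new Map<string, number>();

for (const line of log.split("\n")) {
  if (!line.includes("Googlebot")) continue;
  // Combined log format wraps the timestamp in brackets: [27/Feb/2026:08:15:42 +0100]
  const match = line.match(/\[([^:\]]+):/);
  if (!match) continue;
  const day = match[1]; // e.g. "27/Feb/2026"
  perDay.set(day, (perDay.get(day) ?? 0) + 1);
}

for (const [day, hits] of perDay) {
  console.log(`${day}: ${hits} Googlebot requests`);
}
```

If the daily count is a small fraction of your total page count, you are looking at exactly the crawl-budget problem described above.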

AI bots are even less patient

GPTBot, ClaudeBot, PerplexityBot -- they all have shorter timeouts than Googlebot. PerplexityBot crawls in real time while users wait for answers, so its patience is measured in single-digit seconds. If your server takes 2 seconds to respond, these bots move on to the next source. You lose a citation not because your content is worse, but because your infrastructure is slower.

This is the new blind spot of digital visibility. We have audited over 200 European sites and roughly a third of them are what we call "AI-slow" -- content is good, SEO is decent, but AI bots cannot load their pages fast enough to cite them.
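You can simulate that impatience yourself. The sketch below fetches a page with a hard timeout and then checks whether a key phrase is present in the raw HTML -- no JavaScript execution, which is also how most AI crawlers read pages. The 3-second budget, the URL, and the phrase are illustrative assumptions, not documented behaviour of any particular bot.

```typescript
// Fetch a page with a hard timeout and check that key content is present in the
// raw HTML -- no JavaScript execution, which is how most AI crawlers see a page.
// The 3000 ms budget is an assumption for illustration, not a documented limit.
async function checkAiSlow(url: string, keyPhrase: string): Promise<void> {
  const start = Date.now();
  try {
    const res = await fetch(url, { signal: AbortSignal.timeout(3000) });
    const html = await res.text();
    console.log(`Loaded in ${Date.now() - start} ms (status ${res.status})`);
    console.log(
      html.includes(keyPhrase)
        ? "Key content found in the raw HTML."
        : "Key content missing from the raw HTML -- likely client-side rendered."
    );
  } catch {
    console.log("Gave up after 3000 ms -- an impatient crawler would do the same.");
  }
}

checkAiSlow("https://example.com/services", "intellectual property law");
```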

What actually moves the needle

Six things, in order of impact for most European SMEs:

  1. Move to a European server or edge network. Cloudflare (free plan works), Vercel Edge, or a VPS in Frankfurt/Amsterdam. This alone can cut TTFB from 1.5s to 200ms. It is the single biggest win for most sites.
  2. Set aggressive HTTP cache headers. Static assets should have Cache-Control: max-age=31536000. HTML pages should use stale-while-revalidate. This eliminates redundant server hits. A sketch of these headers follows the list.
  3. Convert images to WebP. 25-35% smaller than JPEG at the same quality. Use loading="lazy" for anything below the fold. AVIF compresses even better and is now supported by every major browser, though encoding is slower. A conversion sketch also follows below.
  4. Minify CSS and JavaScript. Remove whitespace, comments, dead code. If you are on WordPress, a plugin like Autoptimize handles this in minutes.
  5. Use server-side rendering (SSR) or static generation (SSG). AI bots do not execute JavaScript, so client-side rendered sites are invisible to them.
  6. Check your database. Missing indexes and unoptimised queries are the hidden TTFB killers on dynamic sites. Connection pooling helps too.
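To make item 2 concrete, here is roughly what those cache headers look like on a minimal Node.js server. It assumes your static assets live under /static/ with fingerprinted filenames, which is what makes a one-year lifetime safe; the paths and durations are illustrative, and the same rules translate directly to Nginx, Apache, or your CDN's cache settings.

```typescript
// Minimal sketch: aggressive caching for static assets, stale-while-revalidate for HTML.
// Assumes fingerprinted filenames under /static/, which makes a one-year lifetime safe.
import { createServer } from "node:http";

const server = createServer((req, res) => {
  if (req.url?.startsWith("/static/")) {
    // Immutable, hashed assets: cache for a full year
    res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
    res.end("/* asset body */");
  } else {
    // HTML: serve a cached copy for 60 s, revalidate in the background for up to a day
    res.setHeader("Cache-Control", "public, max-age=60, stale-while-revalidate=86400");
    res.setHeader("Content-Type", "text/html; charset=utf-8");
    res.end("<!doctype html><html><body>Page body</body></html>");
  }
});

server.listen(3000);
```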
[Image: before/after speed optimisation with metrics]
Case study: TTFB before and after CDN + cache configuration
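The image conversion in item 3 does not need to be manual either. Here is a minimal sketch using the sharp library -- an assumption about your stack, since any pipeline with WebP output does the job -- that converts a folder of JPEGs in one pass.

```typescript
// Convert every JPEG in ./images to WebP using sharp (npm install sharp).
// Quality 80 is a sensible starting point; tune it per image type.
import { readdirSync } from "node:fs";
import { join } from "node:path";
import sharp from "sharp";

const dir = "./images";

for (const file of readdirSync(dir)) {
  if (!/\.jpe?g$/i.test(file)) continue;
  const output = join(dir, file.replace(/\.jpe?g$/i, ".webp"));
  sharp(join(dir, file))
    .webp({ quality: 80 })
    .toFile(output)
    .then((info) => console.log(`${file} -> ${info.size} bytes as WebP`));
}
```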

How to measure it

Google PageSpeed Insights gives you the full picture with recommendations. GTmetrix lets you test from European servers (London, Frankfurt, Paris) -- use this instead of US-based tools. WebPageTest shows a detailed waterfall of every request. And for ongoing monitoring, Google CrUX (Chrome User Experience Report) gives you real field data from actual visitors.

Do not rely on just one tool. The Lighthouse scores in PageSpeed Insights are lab data; CrUX is field data from real Chrome users. They tell different stories. You need both.
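For a quick number from your own terminal, you can also time the first byte directly. This minimal Node.js sketch measures the delay between sending the request and receiving the response headers -- run it from a machine close to your audience, because TTFB depends heavily on where you test from.

```typescript
// Rough TTFB check: time from sending the request to receiving the response headers.
// Run it from a machine near your audience -- TTFB depends on where you test from.
import { request } from "node:https";

function ttfb(url: string): Promise<number> {
  return new Promise((resolve, reject) => {
    const start = Date.now();
    const req = request(url, (res) => {
      resolve(Date.now() - start); // headers received = the first bytes are in
      res.resume(); // drain the body so the connection closes cleanly
    });
    req.on("error", reject);
    req.end();
  });
}

ttfb("https://example.com/").then((ms) => console.log(`TTFB: ${ms} ms`));
```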

The quick version

If your TTFB is under 500ms and your pages load in under 2 seconds, you are fine. Focus on content. If your TTFB is over 1 second, stop writing blog posts and fix your server first. No amount of content will compensate for pages that Google and AI bots cannot load.

That Brussels legal firm? They moved to a Belgian VPS for 25 euros a month. TTFB dropped from 1.8 seconds to 180ms. Within six weeks, their crawl rate tripled and two pages that had been stuck on page 3 moved to page 1.

Twenty-five euros. Six weeks.

We measure your TTFB in 48 hours. Free for the first 5 requests.

We will tell you exactly what is slowing your site down and what to fix first.

Check my site speed
Lucie Bernaerts
GEO Expert

Co-founder and CEO of AISOS. A GEO expert, she helps companies build their combined Google + AI visibility strategy.