Last month, we audited a legal firm's website in Brussels. Clean design, solid copy, decent backlinks. But when we ran the speed test, the homepage took 4.2 seconds to load. The culprit was not what you would expect -- it was not uncompressed images or bloated JavaScript. It was a shared hosting plan at 7 euros a month with a server in Montreal. For a Belgian firm targeting Belgian clients.
Their Google rankings had been slipping for six months. They blamed the algorithm.
It was the server.

The numbers are blunt
A Portent meta-analysis across 2 million European sessions (2025) found that each additional second of load time cuts conversion by 12%. A 1-second site converts 2.5x more than a 5-second site. Bounce rates jump 32% when load time goes from 1 to 3 seconds.
For rankings specifically, Searchmetrics (Berlin, 2025) found that top-10 Google results in Europe have an average TTFB of 320ms. Page-2 results average 1.2 seconds. That is not a subtle difference.
But here is what most speed guides miss entirely.
Your slow server is making you invisible -- not just slow
Speed is not just about user experience. It directly controls how many of your pages Google even bothers to look at. If your server responds slowly, Googlebot crawls fewer pages per day. Google has confirmed this repeatedly. A slow TTFB does not just make you rank lower -- it makes entire sections of your site disappear from the index.
| Average TTFB | Googlebot crawl (share of site crawled) | AI bot crawl | What happens |
|---|---|---|---|
| < 200ms | Full crawl | High | Fast indexation, frequent AI visits |
| 200-500ms | 80-100% | Medium | Normal -- acceptable for most sites |
| 500ms-1s | 50-80% | Low | Delays start showing up |
| 1-2s | 30-50% | Very low | Pages missing from index |
| > 2s | < 30% | Near zero | Serious visibility loss |
We see this pattern constantly. A client has 500 pages. Google crawls 80 per day because the TTFB is 1.5 seconds. They add 30 new blog posts. Google never indexes them. The client blames content quality. It is infrastructure.
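To make the arithmetic concrete, here is a back-of-the-envelope sketch. The per-day crawl budgets are illustrative assumptions loosely matching the table above, not published Googlebot constants:

```python
# Illustrative crawl-budget arithmetic. The daily budgets below are
# assumptions for this example, not published Googlebot figures.
total_pages = 500

fast_server_crawl = 500   # hypothetical budget at ~200ms TTFB
slow_server_crawl = 80    # hypothetical budget at ~1.5s TTFB

def days_per_full_pass(pages: int, crawl_per_day: int) -> float:
    """Days Googlebot needs to fetch every page once, with no recrawls."""
    return pages / crawl_per_day

print(days_per_full_pass(total_pages, fast_server_crawl))  # 1.0
print(days_per_full_pass(total_pages, slow_server_crawl))  # 6.25
```

In practice the gap is worse than the raw division suggests: most of a daily budget goes to recrawling already-known URLs, so on a slow server new posts can sit at the back of the queue far longer than six days.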
AI bots are even less patient
GPTBot, ClaudeBot, PerplexityBot -- they all have shorter timeouts than Googlebot. PerplexityBot crawls in real time while users wait for answers, so its patience is measured in single-digit seconds. If your server takes 2 seconds to respond, these bots move on to the next source. You lose a citation not because your content is worse, but because your infrastructure is slower.
This is the new blind spot of digital visibility. We have audited over 200 European sites and roughly a third of them are what we call "AI-slow" -- content is good, SEO is decent, but AI bots cannot load their pages fast enough to cite them.
What actually moves the needle
Six things, in order of impact for most European SMEs:
- Move to a European server or edge network. Cloudflare (free plan works), Vercel Edge, or a VPS in Frankfurt/Amsterdam. This alone can cut TTFB from 1.5s to 200ms. It is the single biggest win for most sites.
- Set aggressive HTTP cache headers. Static assets should have `Cache-Control: max-age=31536000`. HTML pages should use `stale-while-revalidate`. This eliminates redundant server hits.
- Convert images to WebP. They are 25-35% smaller than JPEG at the same quality. Use `loading="lazy"` for anything below the fold. AVIF is even better, but browser support is patchy.
- Minify CSS and JavaScript. Remove whitespace, comments, dead code. If you are on WordPress, a plugin like Autoptimize handles this in minutes.
- Use server-side rendering (SSR) or static generation (SSG). Most AI bots do not execute JavaScript, so purely client-side rendered sites are effectively invisible to them.
- Check your database. Missing indexes and unoptimised queries are the hidden TTFB killers on dynamic sites. Connection pooling helps too.
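As a minimal sketch of the cache-header policy from the list above, here is a toy handler built on Python's standard-library server. In production these headers would normally come from your CDN or web server config, not application code; the handler and paths here are purely illustrative:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class CachingHandler(BaseHTTPRequestHandler):
    """Toy handler applying the caching policy described above."""

    def do_GET(self):
        self.send_response(200)
        if self.path.startswith("/static/"):
            # Fingerprinted static assets: cache for a year, never revalidate
            self.send_header("Cache-Control",
                             "public, max-age=31536000, immutable")
            self.send_header("Content-Type", "text/css")
            body = b"/* asset */"
        else:
            # HTML: serve a stale copy while revalidating in the background
            self.send_header("Cache-Control",
                             "public, max-age=60, stale-while-revalidate=3600")
            self.send_header("Content-Type", "text/html")
            body = b"<h1>Hello</h1>"
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the toy server quiet

# Usage: HTTPServer(("127.0.0.1", 8000), CachingHandler).serve_forever()
```

The split matters: a one-year `max-age` is only safe for assets whose filenames change when their content does, while `stale-while-revalidate` lets HTML stay fresh without making every visitor wait on the origin.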
How to measure it
Google PageSpeed Insights gives you the full picture with recommendations. GTmetrix lets you test from European servers (London, Frankfurt, Paris) -- use this instead of US-based tools. WebPageTest shows a detailed waterfall of every request. And for ongoing monitoring, Google CrUX (Chrome User Experience Report) gives you real field data from actual visitors.
Do not rely on just one tool. PageSpeed Insights uses lab data. CrUX uses field data. They tell different stories. You need both.
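For a quick spot check between full tool runs, you can measure TTFB yourself. From the command line, `curl -s -o /dev/null -w '%{time_starttransfer}\n' https://example.com` does it; the sketch below does the same thing in plain Python. It is a rough approximation of what the tools above report: it covers TCP connect plus server response time for plain HTTP, but skips TLS, so treat the number as indicative:

```python
import socket
import time

def measure_ttfb(host: str, port: int = 80, path: str = "/") -> float:
    """Seconds from opening the connection to the first response byte.

    Rough approximation of TTFB over plain HTTP: includes DNS and the
    TCP handshake, excludes TLS. Indicative only; use the tools above
    for authoritative numbers.
    """
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=10) as sock:
        request = (f"GET {path} HTTP/1.1\r\n"
                   f"Host: {host}\r\n"
                   "Connection: close\r\n\r\n").encode()
        sock.sendall(request)
        sock.recv(1)  # blocks until the first byte of the response arrives
    return time.perf_counter() - start
```

Run it a handful of times and take the median; a single sample can be skewed by DNS caching or a cold server-side cache.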
The quick version
If your TTFB is under 500ms and your pages load in under 2 seconds, you are fine. Focus on content. If your TTFB is over 1 second, stop writing blog posts and fix your server first. No amount of content will compensate for pages that Google and AI bots cannot load.
That Brussels legal firm? They moved to a Belgian VPS for 25 euros a month. TTFB dropped from 1.8 seconds to 180ms. Within six weeks, their crawl rate tripled and two pages that had been stuck on page 3 moved to page 1.
Twenty-five euros. Six weeks.
We measure your TTFB in 48 hours. Free for the first 5 requests.
We will tell you exactly what is slowing your site down and what to fix first.
Check my site speed

