Architecture: the problem nobody wants to solve

Ask any marketer about their content strategy and they will answer in 30 seconds. Ask about their site architecture, and you will get an awkward silence. This is the SEO paradox of 2026: everyone invests in content, nobody invests in the structure that supports it.
Yet a study by Searchmetrics (Berlin, 2025) shows that sites with a clearly defined silo architecture get on average 41% more organic traffic than sites with a flat or chaotic structure. The results are even more pronounced for AI citations: +67% mentions in Perplexity responses.
Marcus Tober, founder of Searchmetrics (Berlin): "Architecture is the invisible multiplier. Two sites with the same content but different architectures will have radically different SEO performance."
The 5 principles of SEO + AI architecture
Here are the five fundamental rules we apply at AISOS for every site architecture:
- The 3-click rule — every page must be accessible within 3 clicks from the homepage
- Thematic silos — group pages by topic, with a hub (pillar page) for each silo
- Logical URLs — the URL structure must reflect the silo hierarchy (e.g. /blog/seo-technique/core-web-vitals)
- Strategic internal linking — internal links follow silo logic, with controlled bridges between silos
- Predictable navigation — a user (or bot) must be able to predict where they will arrive when clicking a link
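The 3-click rule is easy to verify programmatically. Here is a minimal Python sketch (the site graph and URLs are hypothetical) that computes each page's click depth from the homepage with a breadth-first search over the internal-link graph:

```python
from collections import deque

def click_depths(links, home):
    """Breadth-first search over the internal-link graph.

    links: dict mapping each URL to the URLs it links to.
    Returns {url: minimum number of clicks from the homepage}.
    """
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical mini-site: each hub links down into its silo.
site = {
    "/": ["/blog/", "/services/"],
    "/blog/": ["/blog/seo-technique/"],
    "/blog/seo-technique/": ["/blog/seo-technique/core-web-vitals"],
}
depths = click_depths(site, "/")
# Pages deeper than 3 clicks violate the rule.
too_deep = [url for url, depth in depths.items() if depth > 3]
```

On a real site, the `site` dict would be built from a crawl export rather than written by hand.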
3 architecture models compared
| Model | Description | Ideal for | SEO | AI | Complexity |
|---|---|---|---|---|---|
| Flat | All pages at the same level | Small sites (< 20 pages) | Medium | Good | Low |
| Strict silo | Sealed silos, no cross-silo links | Thematic sites (50-200 pages) | Very good | Good | Medium |
| Silo with bridges | Silos + strategic cross-silo links | Complex sites (200+ pages) | Excellent | Excellent | High |
At AISOS, we recommend the "silo with bridges" model as a rule. It combines the strength of the silo (concentrated topical authority) with the flexibility of bridges (authority sharing between complementary topics).
What AI crawlers expect from your architecture
AI crawlers (GPTBot, ClaudeBot, PerplexityBot) have specific behaviours that your architecture must anticipate:
- No JavaScript rendering — your navigation must work in pure HTML. Client-side generated React/Vue menus are invisible to them
- Reduced page weight — AI bots prefer lightweight pages. An architecture that avoids mega-pages makes each page easier to parse
- Clear hierarchical context — breadcrumbs, BreadcrumbList schema, and logical URLs help AI systems understand a page's position in your hierarchy
- llms.txt file — a "site map" specifically designed for LLMs, that summarises your architecture and points to your most important pages (see our llms.txt guide)
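The hierarchical context mentioned above can be made explicit with BreadcrumbList markup generated from the same silo hierarchy as the URL. A minimal Python sketch, using hypothetical page names and example.com URLs:

```python
import json

def breadcrumb_jsonld(trail):
    """Build schema.org BreadcrumbList JSON-LD for a page.

    trail: ordered (name, url) pairs, homepage first, current page last.
    """
    return json.dumps(
        {
            "@context": "https://schema.org",
            "@type": "BreadcrumbList",
            "itemListElement": [
                {"@type": "ListItem", "position": i, "name": name, "item": url}
                for i, (name, url) in enumerate(trail, start=1)
            ],
        },
        indent=2,
    )

# Hypothetical trail mirroring the silo hierarchy of the URL.
markup = breadcrumb_jsonld([
    ("Home", "https://example.com/"),
    ("Blog", "https://example.com/blog/"),
    ("Technical SEO", "https://example.com/blog/seo-technique/"),
    ("Core Web Vitals", "https://example.com/blog/seo-technique/core-web-vitals"),
])
```

The resulting JSON-LD goes in a `<script type="application/ld+json">` tag in the page head, alongside the visible breadcrumb.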
Bartosz Goralewicz, CEO of Onely (Poland): "AI crawlers do not crawl like Googlebot. They do not follow all links, they sample. A clear architecture increases the probability that your important pages are in that sample."
The AISOS method to build your architecture
Here is our 5-step process, applicable whether you are starting from scratch or redesigning an existing site:
- Semantic audit — identify all your topics, sub-topics and associated search intents
- Clustering — group pages by thematic silo, define hubs
- URL hierarchy — define a coherent URL structure that reflects the silos
- Linking plan — map intra-silo internal links and cross-silo bridges (see our internal linking guide)
- Technical validation — crawl the site with Screaming Frog to check depth, orphan pages, redirect loops
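Part of step 5 can be automated outside a crawler. A minimal sketch, assuming you already have the list of URLs the site declares (e.g. from the XML sitemap) and the internal-link graph from a crawl, that flags orphan pages:

```python
def find_orphans(declared_urls, links, home="/"):
    """Pages declared by the site but receiving no internal links.

    declared_urls: every URL the site claims to have (e.g. XML sitemap).
    links: dict mapping each crawled URL to the URLs it links to.
    """
    linked = {target for targets in links.values() for target in targets}
    return sorted(url for url in declared_urls
                  if url not in linked and url != home)

# Hypothetical data: /blog/post-b is in the sitemap but never linked.
orphans = find_orphans(
    ["/", "/blog/", "/blog/post-a", "/blog/post-b"],
    {"/": ["/blog/"], "/blog/": ["/blog/post-a"]},
)
```

Each orphan then needs either an internal link from its silo hub or removal from the sitemap.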
To learn more about the internal linking that brings your architecture to life, read our dedicated article: Internal Linking: The Strategy That Boosts Your SEO. And for the complete technical context, see our technical SEO guide 2026.
FAQ — Site Architecture SEO + AI
What is the maximum depth for an SEO site?
3 levels maximum from the homepage. Beyond that, pages are crawled less frequently by Googlebot and almost never by AI bots. If you have pages at 4+ levels, shorten the path through internal linking.
Should you include categories in URLs?
It is recommended for semantic clarity. /blog/seo-technique/core-web-vitals is more explicit than /blog/core-web-vitals. However, if your categories are likely to change, flat URLs are more flexible.
How do you detect orphan pages?
Use Screaming Frog or Sitebulb to crawl your site and identify pages with no inbound internal links. These pages are invisible to bots and must be connected to your architecture or deleted.
Can you change architecture without losing existing SEO?
Yes, provided you implement 301 redirects for every modified URL, update the XML sitemap, and monitor Search Console for 2-3 months after the migration.
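Before shipping the 301s, it is worth checking the redirect map for chains and loops, since both waste crawl budget. A minimal Python sketch with hypothetical URLs:

```python
def check_redirects(redirects):
    """Follow each old URL through the 301 map.

    redirects: {old_url: new_url}.
    Returns (chains, loops): starting URLs of multi-hop chains and of loops.
    """
    chains, loops = [], []
    for start in redirects:
        seen = [start]
        url = redirects[start]
        while url in redirects:
            if url in seen:          # came back to a URL already visited
                loops.append(start)
                break
            seen.append(url)
            url = redirects[url]
        else:
            if len(seen) > 1:        # more than one hop: collapse to one 301
                chains.append(start)
    return chains, loops

# Hypothetical migration map.
chains, loops = check_redirects({
    "/old-a": "/tmp-a",   # chain: /old-a -> /tmp-a -> /new-a
    "/tmp-a": "/new-a",
    "/old-x": "/old-y",   # loop: /old-x <-> /old-y
    "/old-y": "/old-x",
})
```

Chains should be collapsed so every old URL redirects to its final destination in a single hop; loops must be broken before going live.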
What is the difference between a hub and a pillar page?
They are synonyms in practice. A hub (or pillar page) is the central page of a thematic silo. It covers the topic exhaustively and links to all the detailed articles in the silo.
Does architecture impact AI citations?
Yes, significantly. A clear architecture helps AI crawlers understand your thematic expertise (topical authority). Well-structured sites are cited more often because LLMs can easily identify the most relevant pages for a given topic.
Is your architecture holding back your visibility?
Our experts analyse your site structure and propose an optimised architecture for Google and AI search engines.
Audit my architecture

