Technical SEO: the complete guide.

Lesson 05 of 08 · 20 min read · Practice: 3 quizzes inside

Technical SEO sounds intimidating because of the name. It isn't. It's the work that makes sure search engines can read your site at all. If on-page SEO is what you write on the page, technical SEO is making sure Google can find and understand the page in the first place. Without it, even brilliant content goes nowhere.

The good news: 80% of technical SEO is checking boxes that, once set up correctly, stay correct for years. This lesson walks through the eight pillars of technical SEO, the free tools to audit each, and the priority order to fix problems.

What is technical SEO, exactly?

Technical SEO covers everything related to your site's infrastructure — the parts users don't see but search engines do. The eight pillars:

The 8 pillars of technical SEO

Foundation: Crawlability · Indexability · HTTPS

Performance: Site speed · Mobile-friendliness

Structure: URL structure · Site architecture · Schema markup

Each pillar is something Google explicitly checks. Failing any of them limits your maximum ranking potential, regardless of how good your content is.

Pillar 1: Crawlability

Crawlability is whether search engines can find your pages. If Googlebot can't crawl a page, it can't be indexed. If it can't be indexed, it can't rank. Crawlability is a prerequisite for everything else.

The two files that control crawlability

robots.txt — a plain-text file at yoursite.com/robots.txt that tells crawlers which parts of your site they can access. The simplest correct robots.txt:

User-agent: *
Allow: /

Sitemap: https://yoursite.com/sitemap.xml

XML sitemap — a file at yoursite.com/sitemap.xml that lists every page you want indexed, with metadata about each. Submit this to Google Search Console; Google uses it as a discovery aid.
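A minimal sitemap is just a list of `<url>` entries. A sketch (the URLs and dates below are placeholders, not real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2026-01-15</lastmod>
  </url>
  <url>
    <loc>https://yoursite.com/blog/on-page-seo</loc>
    <lastmod>2026-01-10</lastmod>
  </url>
</urlset>
```

Most CMSs and static site generators produce this file automatically; you usually only write it by hand on fully custom sites.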

Common crawlability problems

  • Disallow: / in robots.txt — accidentally blocks the entire site. Most catastrophic technical SEO mistake.
  • Blocking CSS or JS files — Google needs these to render your page. Don't disallow them.
  • Pages with no internal links — orphaned pages that Google has no path to discover.
  • Server timeouts — if Googlebot gets errors on too many pages, it slows or stops crawling.
  • Excessive redirects — redirect chains waste crawl budget.

The most expensive technical mistake

"Disallow: /" in robots.txt blocks every search engine from your entire site. We've seen million-dollar businesses accidentally ship this in production and lose 90% of their organic traffic in a week. Always check robots.txt after any deployment.

Pillar 2: Indexability

Indexing is whether Google decides to store your page in its database after crawling it. Crawled but not indexed = invisible.

Tools that control indexability

Meta robots tag — placed in your HTML <head>:

<meta name="robots" content="index, follow">

The defaults are "index" (allow indexing) and "follow" (pass authority through links). If you want a page not indexed (like internal search results, admin pages, thank-you pages), use:

<meta name="robots" content="noindex, nofollow">

Canonical tag — tells Google which version of a page is the "official" one when multiple URLs show the same content:

<link rel="canonical" href="https://yoursite.com/article" />

Canonicals fix duplicate content issues — for example, when the same article is accessible via multiple URLs (with/without query parameters, with/without trailing slash, HTTP vs HTTPS).

Quick indexability check

To see if a specific page is indexed, search Google for: site:yourdomain.com/page-url. If the page shows up, it's indexed. If not, either Google hasn't crawled it yet, or it's blocked from indexing.

For comprehensive checks, use Google Search Console's URL Inspection tool — it shows exactly how Google sees any URL on your site.

Pillar 3: HTTPS — non-negotiable

HTTPS (encrypted connection) has been a Google ranking signal since 2014. In 2026, sites without HTTPS get flagged by browsers, lose visitor trust, and face ranking penalties.

Free options for HTTPS:

  • Let's Encrypt — free SSL certificates, automated renewal
  • Cloudflare — free SSL via their CDN service
  • Hosting-included SSL — most modern hosts (Netlify, Vercel, Cloudflare Pages) include HTTPS by default

If your site is still on HTTP in 2026, fix this today. There is no acceptable reason to be on HTTP for a site that wants organic traffic.

Check yourself 01 / 03
A site has the line "Disallow: /" in its robots.txt. What does this do?
Right. "Disallow: /" tells crawlers they can't access anything starting from the root — which is the entire site. This is the most catastrophic robots.txt mistake possible.
Reconsider. "Disallow: /" blocks crawlers from the root path, which means everything. The correct way to allow all is "Disallow:" (empty) or "Allow: /".

Pillar 4: Site speed and Core Web Vitals

Page speed has been a ranking factor for over a decade. In 2021, Google formalized it as Core Web Vitals — three specific metrics that directly affect rankings.

Core Web Vitals — what each metric measures

  • LCP (Largest Contentful Paint) — target: < 2.5s. Loads main content fast. Fix: optimize images, use a CDN, lazy load media.
  • INP (Interaction to Next Paint) — target: < 200ms. Responds to clicks fast. Fix: reduce JS, defer non-critical scripts.
  • CLS (Cumulative Layout Shift) — target: < 0.1. Visual stability. Fix: set image dimensions, reserve ad slots.

Figure 01 — Core Web Vitals at a glance. Three metrics. Three thresholds. Failing any one significantly hurts mobile rankings. Test with Google PageSpeed Insights.

How to test and fix Core Web Vitals

Free tools:

  • PageSpeed Insights — paste any URL, get Core Web Vitals scores plus specific recommendations
  • Google Search Console → Core Web Vitals report — site-wide view of which URLs need attention
  • Chrome DevTools → Lighthouse — local audit of any page

Common fixes:

  • Compress images with TinyPNG or use WebP format
  • Add width and height attributes to all images
  • Defer non-critical JavaScript with async or defer
  • Use a CDN (Cloudflare's free tier is excellent)
  • Enable caching headers on your server
  • Remove unused CSS and JavaScript
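Two of these fixes take only a few lines of HTML. A sketch (the file paths are placeholders):

```html
<!-- CLS fix: explicit width and height let the browser reserve layout
     space before the image downloads, so content doesn't jump. -->
<img src="/images/hero.webp" width="1200" height="630" alt="Hero image">

<!-- INP/LCP fix: defer downloads the script in parallel but runs it
     only after the HTML is parsed, keeping the main thread free. -->
<script defer src="/js/analytics.js"></script>

<!-- async runs the script as soon as it downloads; use it only for
     scripts with no dependencies on the page or on other scripts. -->
<script async src="/js/widget.js"></script>
```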

Pillar 5: Mobile-friendliness

Google uses mobile-first indexing (a rollout that began in 2018 and is now complete for all sites) — meaning Google primarily uses your mobile site to determine rankings, even for desktop searches. If your mobile experience is bad, your desktop rankings suffer too.

What mobile-friendliness requires:

  • Responsive design — content adapts to screen size automatically
  • Readable text — at least 16px font size on body text
  • Tap targets large enough — buttons and links at least 48x48 pixels
  • No horizontal scrolling — content fits within the viewport
  • Same content as desktop — don't hide content on mobile
  • Fast on slow connections — assume 3G mobile networks

Test with the URL Inspection tool in Search Console (it shows the mobile rendering Google sees). Or use Chrome DevTools → Toggle Device Toolbar.
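Responsive design starts with one line in your <head>. Without the viewport meta tag, mobile browsers render the page at desktop width and shrink it down, breaking every item on the list above:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```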

Pillar 6: URL structure

Clean URLs help both search engines and users understand what a page is about. We covered URL slugs in Lesson 04: On-Page SEO; here's the technical side.

URL structure rules

  • Use HTTPS — covered above, but worth repeating
  • Lowercase only — URLs are case-sensitive on most servers
  • Hyphens, not underscores — on-page-seo beats on_page_seo
  • Logical hierarchy — /blog/category/article structure helps both users and Google
  • Avoid query parameters in canonical URLs — keep them clean
  • Trailing slash consistency — pick one (with or without) and redirect the other
  • 301 redirects for old URLs — never break links; redirect them
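Trailing-slash consistency is typically enforced at the server level. As an illustration only, a sketch for nginx, assuming you standardize on no trailing slash:

```nginx
# 301-redirect /path/ to /path (the root "/" doesn't match this pattern)
rewrite ^/(.*)/$ /$1 permanent;
```

Apache (.htaccess), Netlify, Vercel, and most hosts have an equivalent redirect rule; check your platform's docs for the exact syntax.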

Pillar 7: Site architecture

Site architecture is how your pages connect to each other through internal links. Google uses this structure to understand topical authority and pass link equity through your site.

The flat architecture principle

Every page should be reachable in 3 clicks or fewer from the homepage. Pages buried 5+ clicks deep get crawled less often and accumulate less authority.

The hub-and-spoke model from Lesson 04 works at site level too: pillar pages → category pages → individual articles, all interconnected.

Figure 02 — Flat architecture in action. Pillar pages connect to categories, which connect to articles; three tiers cover everything. Homepage → Category → Article = 2 clicks, so every page is 2-3 clicks from the homepage — a structure both Google and users navigate easily.

Pillar 8: Schema markup

We covered schema in detail in Lesson 04. Technically, schema markup is JSON-LD code added to your <head> that tells search engines what type of content the page contains.

Why schema is technical SEO: it's invisible to users, parsed by search engines, and significantly boosts both rich result eligibility and AI citation rates.

Use our free Schema Generator to create valid JSON-LD for any page in seconds. Test the result with Google's Rich Results Test after publishing.
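For reference, here is what a minimal Article schema block looks like. All values below are placeholders you would replace with your page's real data:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Technical SEO: the complete guide",
  "datePublished": "2026-01-15",
  "author": { "@type": "Person", "name": "Your Name" }
}
</script>
```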

Check yourself 02 / 03
Your site has duplicate content because the same article appears at /article and /article/ (with trailing slash). What's the right fix?
Right. Trailing slash duplicates are best fixed with 301 redirects. Pick one canonical version, redirect the other. This consolidates link equity and prevents both URLs from competing.
Reconsider. Noindex would hide the content but not consolidate authority. Robots.txt would block crawling but not indexing. The correct fix is to 301 redirect one URL to the other.

Free tools to audit technical SEO

You don't need expensive tools. These free ones cover 95% of what working SEOs use:

  • Google Search Console — the most important free tool. Shows crawl errors, indexing status, Core Web Vitals, mobile usability, and more.
  • PageSpeed Insights — Core Web Vitals + speed audit for any URL.
  • Rich Results Test — schema markup validation.
  • Chrome DevTools → Lighthouse — comprehensive page audit in your browser.
  • Screaming Frog SEO Spider — desktop crawler. Free up to 500 URLs (enough for most sites).
  • Sitebulb Lite — visual site auditor, free trial.

The priority order to fix technical SEO

Like on-page SEO, technical issues should be fixed in priority order. Don't try to perfect everything at once.

Priority order for technical SEO fixes

Phase 1 — Critical (fix today):

HTTPS enabled · robots.txt not blocking the site · sitemap exists · pages are indexable · duplicate content resolved with canonical tags · 301 redirects for moved pages

Phase 2 — Important (fix this week):

Mobile-friendly responsive design · Core Web Vitals passing · clean URL structure · proper meta robots tags · clean internal linking

Phase 3 — Polish (fix this month):

Schema markup on key pages · image optimization · CDN setup · advanced caching · structured navigation

Most sites with technical issues fail Phase 1 — and the fixes are usually fast. Address Phase 1 issues for any site that's underperforming and you'll often see ranking lifts within weeks.

Check yourself 03 / 03
A new site has these issues: HTTPS not enabled, slow LCP (4.5s), and no schema markup. Which to fix FIRST?
Right. HTTPS is Phase 1 — without it, browsers warn users and Google ranks the site lower across the board. The other issues are Phase 2 and Phase 3 fixes. Always work the priority order.
Reconsider. HTTPS is critical Phase 1 — sites without it get visible browser warnings ("Not Secure") that destroy user trust before any ranking question matters. Schema and Core Web Vitals are important but later in the order.

The big ideas to keep

From this lesson
  1. Technical SEO has 8 pillars: Crawlability, Indexability, HTTPS, Speed, Mobile, URL structure, Architecture, Schema.
  2. Without crawlability + indexability, nothing else matters — these are the foundation.
  3. HTTPS is non-negotiable in 2026.
  4. Core Web Vitals: LCP < 2.5s, INP < 200ms, CLS < 0.1.
  5. Free tools cover 95% of audits: Google Search Console, PageSpeed Insights, Rich Results Test.
  6. Fix in Phase order — Phase 1 issues (critical) first, polish later.