Your content might be excellent, your backlinks growing, and your keyword targeting on point. But if search engines can’t properly crawl, render, and index your pages, none of that matters. A technical SEO audit identifies the hidden infrastructure problems that silently kill rankings.
This 12-step checklist covers everything you need to check, in the order that matters most. Work through it top to bottom, and you’ll catch the infrastructure issues that hold most sites back from their ranking potential.
1. Run a Full Site Crawl
Before you fix anything, you need a complete picture. Run your site through Screaming Frog, Sitebulb, or a similar crawler. Set it to respect robots.txt and follow redirects. For most sites under 10,000 pages, a full crawl takes under 15 minutes.
Pay attention to the crawl depth report. Every page you want indexed should be reachable within 3 clicks from the homepage. Pages buried 5 or 6 levels deep rarely get crawled frequently enough to rank.
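If your crawler doesn’t surface a depth report directly, you can compute click depth yourself from an exported link graph with a breadth-first search. This is a minimal sketch; the `links` dict is hypothetical sample data standing in for a real crawl export (e.g. Screaming Frog’s Bulk Export > All Inlinks).

```python
# Compute click depth from the homepage over an internal-link graph.
# `links` maps each page to the internal URLs it links to (sample data).
from collections import deque

def click_depths(links, home="/"):
    """BFS from the homepage; returns {url: depth in clicks}."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

links = {
    "/": ["/blog/", "/products/"],
    "/blog/": ["/blog/post-a/"],
    "/blog/post-a/": ["/blog/post-b/"],
    "/blog/post-b/": ["/deep-page/"],
}
depths = click_depths(links)
too_deep = [url for url, d in depths.items() if d > 3]
print(too_deep)  # pages more than 3 clicks from the homepage
```

Pages that never appear in `depths` at all are unreachable from the homepage entirely, which is an even bigger problem than depth.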
2. Check Your Robots.txt File
Navigate to yourdomain.com/robots.txt and read it line by line. You’re looking for two things: accidental blocks on pages that should be indexed, and missing blocks on pages that shouldn’t be (admin panels, staging URLs, duplicate parameter pages).
A common mistake is blocking CSS and JavaScript files. Google needs to render your pages to understand them. If your robots.txt blocks /wp-content/themes/ or similar asset directories, Googlebot sees a blank page instead of your content.
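You can test a rule set against specific asset URLs with Python’s standard-library robots.txt parser. A quick sketch, using illustrative rules and URLs:

```python
# Check whether Googlebot can fetch CSS/JS assets under a robots.txt rule set.
# The rules and asset URLs below are illustrative sample data.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-content/themes/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

assets = [
    "https://example.com/wp-content/themes/site/style.css",
    "https://example.com/wp-content/uploads/hero.jpg",
]
for url in assets:
    allowed = parser.can_fetch("Googlebot", url)
    print(url, "->", "OK" if allowed else "BLOCKED (may break rendering)")
```

Run this against the asset URLs your pages actually load (visible in the page source) to confirm nothing render-critical is blocked.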
3. Audit Your XML Sitemap
Your sitemap should include every URL you want indexed and exclude everything you don’t. Check these 4 things:
Accuracy: Compare your sitemap URLs against your crawl data. If your sitemap lists 500 URLs but your crawler found 800 indexable pages, at least 300 indexable pages are missing from the sitemap.
Clean URLs only: Your sitemap shouldn’t contain redirected URLs, 404s, or noindexed pages. Each URL should return a 200 status code.
Size limits: Keep each sitemap file under 50,000 URLs and 50MB uncompressed. Larger sites need a sitemap index file pointing to multiple sitemaps.
Freshness: The lastmod dates should reflect actual content changes, not auto-generated timestamps. Google has confirmed they use lastmod to prioritize crawling when the dates are accurate.
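The accuracy and cleanliness checks above boil down to a set comparison between sitemap URLs and crawled indexable URLs. A minimal sketch, where the XML string and crawl set are stand-ins for a real sitemap download and crawler export:

```python
# Parse a sitemap and diff it against the indexable URLs a crawler found.
# The sitemap XML and crawl set below are illustrative sample data.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about/</loc></url>
</urlset>"""

sitemap_urls = {
    el.text.strip()
    for el in ET.fromstring(sitemap_xml).iter(f"{SITEMAP_NS}loc")
}
crawled_indexable = {
    "https://example.com/",
    "https://example.com/about/",
    "https://example.com/blog/",   # indexable but absent from the sitemap
}

missing_from_sitemap = crawled_indexable - sitemap_urls
stale_in_sitemap = sitemap_urls - crawled_indexable  # redirects/404s to remove
print(sorted(missing_from_sitemap), sorted(stale_in_sitemap))
```

Anything in `stale_in_sitemap` should then be checked for its actual status code, since those are the redirected, 404, or noindexed URLs the cleanliness rule warns about.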
4. Review Indexing Status in Google Search Console
Open Search Console and go to Pages > Indexing. This report shows you exactly how Google sees your site. Focus on the “Not indexed” section first.
The most actionable statuses are “Discovered – currently not indexed” (Google found the page but didn’t bother crawling it, which signals low perceived value) and “Crawled – currently not indexed” (Google crawled it but decided not to index it, which signals a quality or duplicate content problem).
If more than 30% of your submitted URLs aren’t indexed, you likely have a site-wide quality or crawl budget issue rather than page-level problems.
5. Fix Duplicate Content and Canonicalization
Duplicate content confuses search engines and splits ranking signals across multiple URLs. Check for these common sources:
WWW vs. non-WWW: Only one version should resolve as a live page. The other should 301 redirect to it.
HTTP vs. HTTPS: All HTTP URLs should redirect to HTTPS. No exceptions.
Trailing slashes: Pick a format (with or without trailing slash) and redirect the other version. WordPress sites often serve both /page/ and /page as separate URLs.
Parameter URLs: URLs like ?sort=price&color=red create thousands of duplicate pages. Point them at the clean URL with canonical tags; Google retired Search Console’s URL Parameters tool in 2022, so canonicals (plus robots.txt rules where appropriate) are now the way to handle them.
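The canonical for a parameter URL is usually just the same URL with the sort, filter, and tracking parameters stripped. A sketch of that normalization with the standard library; the drop list is an assumption you would tune to your own site’s parameters:

```python
# Normalize parameter URLs to a canonical form by dropping sort/filter/
# tracking parameters. DROP_PARAMS is an assumed list; adjust per site.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

DROP_PARAMS = {"sort", "color", "utm_source", "utm_medium", "utm_campaign"}

def canonical_url(url):
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in DROP_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))  # also drops any fragment

print(canonical_url("https://example.com/shoes/?sort=price&color=red"))
# -> https://example.com/shoes/
```

Parameters that actually change the content users see (like a pagination `page` parameter) should be kept out of the drop list, since those URLs are not true duplicates.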
6. Test Page Speed Across Device Types
Run your 5 highest-traffic pages through Google PageSpeed Insights. Don’t chase a perfect 100 score. Instead, focus on the 3 Core Web Vitals metrics that Google actually uses for ranking:
Largest Contentful Paint (LCP): Should load in under 2.5 seconds. If it doesn’t, the usual culprits are unoptimized hero images, slow server response times, or render-blocking JavaScript.
Interaction to Next Paint (INP): Should respond in under 200 milliseconds. Heavy JavaScript frameworks and third-party scripts are the most common offenders.
Cumulative Layout Shift (CLS): Should stay below 0.1. Add explicit width and height attributes to all images and embeds. Reserve space for ad slots before they load.
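The thresholds above follow Google’s published “good” / “needs improvement” / “poor” bands, which you can encode directly when reporting on field data. A small sketch; the page’s metric values are example inputs:

```python
# Classify field metrics against Core Web Vitals thresholds.
# Thresholds follow Google's published good/poor boundaries.
THRESHOLDS = {            # (good_max, poor_min)
    "LCP": (2.5, 4.0),    # seconds
    "INP": (200, 500),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless
}

def rate(metric, value):
    good_max, poor_min = THRESHOLDS[metric]
    if value <= good_max:
        return "good"
    if value < poor_min:
        return "needs improvement"
    return "poor"

page = {"LCP": 3.1, "INP": 180, "CLS": 0.05}  # example field data
report = {m: rate(m, v) for m, v in page.items()}
print(report)  # {'LCP': 'needs improvement', 'INP': 'good', 'CLS': 'good'}
```

A page needs to be in the “good” band on all three metrics (at the 75th percentile of real-user data) to pass the Core Web Vitals assessment.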
7. Verify Mobile Usability
Google uses mobile-first indexing for all sites. If your mobile experience is broken, your desktop rankings suffer too. Check these specifics:
Text should be readable without zooming (16px is a sensible minimum font size). Tap targets should be at least 48px in size, with adequate spacing between them. No horizontal scrolling should be required on any page. Content parity matters too: if your mobile version hides content that exists on desktop, Google may not index that hidden content.
8. Audit Your Internal Link Structure
Internal links distribute PageRank and help Google discover content. Export your internal link data from your crawl tool and look for these problems:
Orphan pages: Pages with zero internal links pointing to them. If you don’t link to a page, Google assumes it isn’t worth finding.
Link depth issues: Your most valuable pages should sit within 2 clicks of the homepage. Category pages, pillar content, and high-converting pages deserve prominent internal links from your navigation or homepage.
Broken internal links: Every 404 from an internal link wastes crawl budget and leaks PageRank. Fix these by updating the link target or implementing a redirect.
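Orphan pages and broken internal links both fall out of the same crawl export: the full page list with status codes, plus the internal link graph. A sketch over hypothetical sample data:

```python
# Find orphan pages and broken internal links from a crawl export.
# `all_pages`, `status`, and `links` are hypothetical sample data.
all_pages = {"/", "/blog/", "/blog/post-a/", "/old-guide/", "/lonely-page/"}
status = {"/": 200, "/blog/": 200, "/blog/post-a/": 200,
          "/old-guide/": 404, "/lonely-page/": 200}
links = {            # source -> internal link targets
    "/": ["/blog/"],
    "/blog/": ["/blog/post-a/", "/old-guide/"],
}

linked_to = {t for targets in links.values() for t in targets}
orphans = all_pages - linked_to - {"/"}          # homepage needs no inlinks
broken = [(src, t) for src, ts in links.items()
          for t in ts if status.get(t) == 404]

print(sorted(orphans))  # ['/lonely-page/']
print(broken)           # [('/blog/', '/old-guide/')]
```

Note that an orphan page can only appear in `all_pages` if it came from somewhere other than the link crawl, such as your sitemap or analytics export, which is why combining those sources matters.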
9. Check HTTPS Implementation
HTTPS is a confirmed ranking signal, and a broken implementation causes more problems than no HTTPS at all. Verify these 3 things:
Your SSL certificate is valid and not expiring within 30 days. All internal resources (images, scripts, stylesheets) load over HTTPS with no mixed content warnings. Every HTTP URL redirects to its HTTPS equivalent with a single 301 redirect, not a chain.
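Mixed content is the easiest of the three to catch programmatically: scan each page’s HTML for subresources loaded over plain http://. A minimal sketch using the standard-library HTML parser; the HTML snippet is illustrative:

```python
# Flag mixed content: subresources loaded over http:// on an HTTPS page.
# The HTML snippet below is illustrative sample data.
from html.parser import HTMLParser

class MixedContentScanner(HTMLParser):
    RESOURCE_ATTRS = {("img", "src"), ("script", "src"), ("link", "href")}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if ((tag, name) in self.RESOURCE_ATTRS
                    and value and value.startswith("http://")):
                self.insecure.append(value)

html = """<html><head>
<link rel="stylesheet" href="http://example.com/style.css">
<script src="https://example.com/app.js"></script>
</head><body><img src="http://example.com/logo.png"></body></html>"""

scanner = MixedContentScanner()
scanner.feed(html)
print(scanner.insecure)
```

Your browser’s developer console reports the same warnings page by page; a script like this just lets you sweep an entire crawl at once.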
10. Validate Structured Data Markup
Structured data won’t directly boost rankings, but it earns rich results that can meaningfully lift click-through rates. Run your key pages through Google’s Rich Results Test.
Check for errors first, then warnings. Common issues include missing required fields (like author for Article schema or aggregateRating for Product schema) and mismatched data between your structured data and visible page content.
Don’t mark up content that isn’t visible on the page. Google considers this spammy and may remove your rich results entirely.
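A quick pre-check before running pages through the Rich Results Test is to parse each JSON-LD block and look for missing fields. The required-field lists here are simplified assumptions; Google’s Rich Results Test remains the authoritative check:

```python
# Check a JSON-LD block for fields a rich result needs.
# REQUIRED is a simplified assumption, not Google's full requirement list.
import json

REQUIRED = {
    "Article": ["headline", "author"],
    "Product": ["name"],
}

jsonld = json.loads("""{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "12-Step Technical SEO Audit"
}""")

schema_type = jsonld.get("@type")
missing = [f for f in REQUIRED.get(schema_type, []) if f not in jsonld]
print(missing)  # ['author']
```

Extend `REQUIRED` with the schema types your site actually uses, and remember the second class of problem (markup that contradicts the visible page) still needs a manual spot-check.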
11. Analyze Redirect Chains and 404 Errors
Redirect chains slow down crawling and dilute link equity with every hop. Your crawl report will show these. The fix is straightforward: update each redirect in the chain to point directly to the final destination URL.
For 404 errors, prioritize pages that have external backlinks (check in Ahrefs or Search Console) or pages that still receive organic traffic. Redirect those to the closest relevant live page. Low-value 404s with no links or traffic can stay as 404s. Redirecting everything to your homepage does more harm than good.
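Flattening a chain means following each hop to its end and rewriting the first redirect to point there directly. A sketch, where `redirects` is a hypothetical export of 301 mappings from a crawl:

```python
# Collapse redirect chains so every source points straight at its final
# destination. `redirects` is hypothetical sample data from a crawl export.
def final_destination(url, redirects, max_hops=10):
    seen = set()
    while url in redirects and url not in seen and len(seen) < max_hops:
        seen.add(url)
        url = redirects[url]
    return url

redirects = {
    "/old-page": "/new-page",
    "/new-page": "/newer-page",
    "/newer-page": "/final-page",
}
flattened = {src: final_destination(src, redirects) for src in redirects}
print(flattened)
# every source now maps directly to /final-page
```

The `seen` set and hop cap also guard against redirect loops, which are worth flagging separately since they make a URL completely uncrawlable.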
12. Set Up Monitoring So Problems Don’t Come Back
A one-time audit fixes today’s problems. Monitoring prevents tomorrow’s. Set up these 3 automated checks:
Weekly crawl: Schedule Screaming Frog or Sitebulb to crawl your site weekly and email you a change report. New 404s, new redirect chains, and new orphan pages show up immediately.
Search Console alerts: Google Search Console sends email notifications for manual actions, security issues, and major indexing problems. Make sure the email address on the account is one you actually check.
Uptime monitoring: Use a free tool like UptimeRobot to ping your site every 5 minutes. Downtime that you don’t notice can tank your rankings if it lasts more than a few hours.
Run this full audit once per quarter. Between audits, your weekly crawl and monitoring alerts will catch most new issues before they compound into ranking drops. The sites that rank consistently aren’t the ones with the best content alone. They’re the ones where the technical foundation actually lets search engines do their job.
For more information, see Google Search Console.