How to Do an SEO Website Audit: Step-by-Step Guide
Learn how to perform a complete SEO website audit in 10 steps. Covers crawling, indexation, on-page SEO, technical checks, content quality, backlinks, site speed, mobile experience, and building an action plan.
Running an SEO website audit is the single most valuable exercise you can perform to understand why your site ranks where it does and what you need to change. Whether you are an in-house marketer, a freelance consultant, or a business owner doing it yourself, this guide walks you through every step from setting up your tools to producing a prioritised action plan.
A thorough SEO audit typically takes between two and six hours depending on site size. Small sites under 100 pages can be audited in an afternoon. Enterprise sites with tens of thousands of URLs may need a week. The process below works for sites of any scale.
Step 1: Set Up Your Audit Tools
Before you touch a single page, you need the right toolkit in place. Trying to audit a website without proper tools is like trying to diagnose a car engine by listening to it from across the street. Here is what you need at minimum:
- Google Search Console — free, essential. Gives you real indexation data, search queries, click-through rates, and Core Web Vitals from Google's perspective.
- A site crawler — Screaming Frog (free up to 500 URLs), Sitebulb, or Lumar. Crawlers discover every page, flag broken links, missing tags, and redirect chains.
- Google Analytics or equivalent — for traffic trends, bounce rates, and conversion data that add context to ranking data.
- A backlink tool — Ahrefs, Semrush, or Moz. You need one to analyse your link profile and compare against competitors.
- PageSpeed Insights — free from Google. Tests Core Web Vitals with both lab and field data.
Create a dedicated folder or spreadsheet to collect your findings as you move through each step. Label columns for the issue, affected URL, severity (critical, high, medium, low), and recommended fix. This becomes your action plan at the end.
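If you prefer to script the setup, here is a minimal sketch that creates the findings sheet as a CSV you can open in Excel or Google Sheets. The filename, column names, and example row are illustrative, so adapt them to your own workflow.

```python
# Minimal sketch: create the findings sheet as a CSV.
# Filename, column names, and the example row are illustrative.
import csv

COLUMNS = ["issue", "affected_url", "severity", "recommended_fix"]

with open("audit_findings.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(COLUMNS)
    # One example row showing the format each finding should follow
    writer.writerow([
        "Missing title tag",
        "https://example.com/pricing",
        "high",
        "Write a unique title under 60 characters that includes the primary keyword",
    ])
```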
Step 2: Crawl Your Website
Your first active step is running a complete crawl of your website. Configure your crawler to follow internal links, respect robots.txt (or ignore it if you want to see what is blocked), and store response codes for every URL.
Pay attention to these crawler outputs:
- HTTP status codes — 200 is healthy. Look for 301/302 redirect chains (more than two hops waste crawl budget), 404 errors (broken pages), and 500 server errors.
- Redirect chains and loops — a common problem after site migrations. Every redirect should resolve in a single hop; collapse longer chains so each old URL points directly at the final destination.
- Orphan pages — URLs in your sitemap or Google index that have zero internal links pointing to them. These pages are almost invisible to search engines.
- Duplicate content — pages with identical or near-identical content, often caused by URL parameters, trailing slashes, or HTTP/HTTPS variants.
- Crawl depth — important pages should be reachable within three clicks from the homepage. Anything deeper gets crawled less frequently.
Export the full crawl data. You will reference it repeatedly in later steps. Most crawlers let you export to CSV or Excel, which makes filtering straightforward.
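If you want a quick second opinion on status codes and redirect chains outside your crawler, a short script can re-check an exported URL list. The sketch below assumes a plain-text file called urls.txt with one URL per line and the requests package installed; adjust both to match your setup.

```python
# Minimal sketch: re-check status codes and redirect chains for URLs exported
# from your crawler. Assumes urls.txt (one URL per line) and the requests package.
import requests

with open("urls.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        resp = requests.get(url, allow_redirects=True, timeout=10)
    except requests.RequestException as exc:
        print(f"ERROR  {url}  {exc}")
        continue

    hops = len(resp.history)  # redirects followed before the final response
    if resp.status_code >= 400:
        print(f"{resp.status_code}  {url}")
    elif hops > 1:
        chain = " -> ".join(r.url for r in resp.history) + f" -> {resp.url}"
        print(f"CHAIN ({hops} hops)  {chain}")
```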
Step 3: Check Indexation Status
Open Google Search Console and navigate to the Pages report (formerly Index Coverage). This tells you exactly how many of your pages Google has indexed, and more importantly, why certain pages have been excluded.
Compare the number of indexed pages in GSC against the total URLs found by your crawler. A large gap means Google is choosing not to index a significant portion of your site. Common reasons include:
- Crawled, currently not indexed — Google found the page but decided it was not worth indexing. Usually a content quality signal.
- Discovered, not currently indexed — Google knows the URL exists but has not bothered to crawl it yet. Often a crawl budget issue.
- Blocked by robots.txt — you are accidentally preventing Google from accessing important pages.
- Noindex tag — a meta robots noindex directive is present. Sometimes added intentionally, sometimes left over from a staging environment.
- Duplicate, Google chose a different canonical — Google thinks another URL is the primary version and has ignored your stated canonical.
For every category, investigate the affected URLs. Cross-reference with your crawl data. If Google is not indexing pages you consider important, you need to understand why and fix the root cause before worrying about rankings.
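A simple way to quantify the gap between the two sources is to diff the URL lists. The sketch below assumes you have exported your crawl and your GSC indexed pages to plain-text files with one URL per line; the filenames are placeholders.

```python
# Minimal sketch: diff the crawler's URL list against Google's indexed URLs.
# Assumes two plain-text exports, one URL per line; filenames are placeholders.
def load_urls(path):
    with open(path, encoding="utf-8") as f:
        return {line.strip().rstrip("/") for line in f if line.strip()}

crawled = load_urls("crawl_export.txt")
indexed = load_urls("gsc_indexed.txt")

not_indexed = sorted(crawled - indexed)        # found in the crawl, skipped by Google
possible_orphans = sorted(indexed - crawled)   # indexed, but no internal path found

print(f"Crawled but not indexed: {len(not_indexed)}")
for url in not_indexed[:20]:
    print("  ", url)

print(f"Indexed but missing from the crawl (possible orphans): {len(possible_orphans)}")
for url in possible_orphans[:20]:
    print("  ", url)
```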
Step 4: Analyse On-Page SEO
On-page SEO is where most ranking improvements come from. Review every page that targets a keyword you care about. Check the following elements systematically:
- Title tags — should include the primary keyword, be under 60 characters, and be unique across the site. Duplicate titles usually signal competing pages targeting the same query.
- Meta descriptions — not a ranking factor, but directly affect click-through rate. Keep under 155 characters. Include a call to action.
- H1 tags — every page needs exactly one H1 that matches the search intent. Missing or duplicate H1s are a common finding.
- Heading hierarchy — H2s, H3s, and H4s should follow a logical outline. Skipping levels (H1 straight to H3) signals poor content structure.
- Keyword placement — the primary keyword should appear in the first 100 words, in at least one subheading, and naturally throughout the body. Do not force it.
- Internal links — every important page should receive internal links from related content. Check anchor text distribution. Generic anchors like "click here" waste link equity.
- Image alt text — descriptive, keyword-relevant alt attributes help with image search and accessibility. Missing alt text is one of the most common audit findings.
Create a spreadsheet row for each page with the current title, current H1, target keyword, and notes on what needs changing. This becomes the on-page section of your action plan.
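To speed up the data collection, a small script can pull the title, meta description, H1s, and missing alt text for each page. The sketch below is a minimal example using requests and BeautifulSoup; the URL is a placeholder and the thresholds mirror the guidelines above.

```python
# Minimal sketch: pull on-page elements for one URL so they can be pasted into
# the audit spreadsheet. Assumes requests and beautifulsoup4 are installed;
# the URL is a placeholder and thresholds mirror the guidelines above.
import requests
from bs4 import BeautifulSoup

def onpage_summary(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    meta = soup.find("meta", attrs={"name": "description"})
    description = (meta.get("content") or "").strip() if meta else ""
    h1s = [h.get_text(strip=True) for h in soup.find_all("h1")]
    missing_alt = [img.get("src", "") for img in soup.find_all("img") if not img.get("alt")]

    return {
        "url": url,
        "title": title,
        "title_over_60_chars": len(title) > 60,
        "description": description,
        "description_over_155_chars": len(description) > 155,
        "h1_count": len(h1s),
        "h1s": h1s,
        "images_missing_alt": len(missing_alt),
    }

print(onpage_summary("https://example.com/"))
```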
Step 5: Review Technical SEO
Technical SEO ensures search engines can efficiently crawl, render, and index your content. Many technical issues are invisible to users but devastating to rankings. Check these areas:
Robots.txt: Open your robots.txt file (yourdomain.com/robots.txt) and verify it is not blocking important directories. A surprisingly common mistake is blocking CSS and JavaScript files, which prevents Google from rendering your pages correctly.
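You can also verify robots.txt programmatically. The sketch below uses Python's standard-library robots.txt parser to test whether Googlebot may fetch a handful of important URLs; the domain and URL list are placeholders.

```python
# Minimal sketch: test whether robots.txt blocks URLs you care about, using
# only the standard library. The domain and URL list are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

important_urls = [
    "https://example.com/",
    "https://example.com/blog/",
    "https://example.com/assets/main.css",  # CSS/JS must stay crawlable for rendering
]

for url in important_urls:
    allowed = rp.can_fetch("Googlebot", url)
    print(("OK      " if allowed else "BLOCKED ") + url)
```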
XML Sitemaps: Your sitemap should list every indexable page and exclude pages with noindex tags, redirects, or 404 errors. Check that the sitemap is referenced in robots.txt and submitted in Google Search Console. The last-modified dates should be accurate, not set to today's date on every page.
Canonical tags: Every page should have a self-referencing canonical tag unless it is a duplicate that points to a primary version. Inspect your canonical tags for common errors: relative URLs instead of absolute, HTTP instead of HTTPS, or pointing to a non-existent page.
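Here is a minimal sketch that flags those canonical errors for a list of pages. It assumes requests and beautifulsoup4 are installed, and the URLs are placeholders.

```python
# Minimal sketch: flag the canonical errors described above for a list of pages.
# Assumes requests and beautifulsoup4; the URLs are placeholders.
import requests
from bs4 import BeautifulSoup

def check_canonical(url):
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")

    canonical = None
    for link in soup.find_all("link"):
        rel = link.get("rel") or []
        rel = rel.split() if isinstance(rel, str) else rel
        if "canonical" in [r.lower() for r in rel]:
            canonical = link
            break

    if canonical is None or not canonical.get("href"):
        return "missing canonical"
    href = canonical["href"].strip()
    if not href.startswith("http"):
        return f"relative canonical: {href}"
    if href.startswith("http://"):
        return f"canonical points to HTTP: {href}"
    if href.rstrip("/") != resp.url.rstrip("/"):
        return f"canonicalised to a different URL: {href}"
    return "self-referencing canonical (OK)"

for url in ["https://example.com/", "https://example.com/blog/"]:
    print(url, "->", check_canonical(url))
```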
Structured data: Test your pages with Google's Rich Results Test. Common schema types include Article, FAQ, HowTo, Product, and LocalBusiness. Invalid or incomplete structured data can cost you rich snippets in search results.
HTTPS: Every page should load over HTTPS. Mixed content warnings (loading HTTP resources on an HTTPS page) can break trust signals and display browser warnings to users.
Hreflang (if multilingual): Check that hreflang annotations are reciprocal, use correct language-region codes, and include a self-referencing tag. Hreflang errors are among the most difficult to debug.
Step 6: Evaluate Content Quality
Google's helpful content system rewards pages that demonstrate first-hand experience, expertise, and genuine usefulness. Evaluating content quality means looking beyond keywords at the substance of what you have published.
Start by identifying thin content — pages with fewer than 300 words that attempt to rank for competitive queries. These pages rarely satisfy search intent and can drag down your entire site's quality signals. Either expand them with genuinely useful information or consolidate them into a stronger page.
Next, look for keyword cannibalisation. This happens when multiple pages target the same keyword, forcing Google to choose between them. Pull a list of your top keywords from Search Console and check if more than one URL ranks for the same query. If so, decide which page should be the canonical target and either redirect or merge the others.
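If your keyword list is long, a short script can do the grouping for you. The sketch below assumes a CSV export with query and page columns (for example via the Search Console API or a connector tool); the filename and column names are assumptions, so adjust them to match your export.

```python
# Minimal sketch: flag keyword cannibalisation from a Search Console export.
# Assumes a CSV with `query` and `page` columns; filename and column names
# are assumptions, so adjust them to your export.
import csv
from collections import defaultdict

pages_per_query = defaultdict(set)

with open("gsc_queries.csv", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        pages_per_query[row["query"]].add(row["page"])

for query, pages in sorted(pages_per_query.items()):
    if len(pages) > 1:  # more than one URL competing for the same query
        print(f"{query}: {len(pages)} competing URLs")
        for page in sorted(pages):
            print("   ", page)
```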
Check for content freshness. Pages with outdated statistics, broken outbound links, or references to past years (like "best tools for 2023") signal neglect. Update them or add a visible "last updated" date.
Finally, assess topical coverage. Map your content against the topics your audience cares about. If competitors cover subtopics that you have ignored entirely, you have a content gap that limits your topical authority.
Step 7: Audit Your Backlink Profile
Backlinks remain one of the strongest ranking factors. A backlink audit reveals the health of your off-page SEO and identifies risks before they become penalties.
Export your full backlink profile from Ahrefs, Semrush, or Moz. Analyse the following:
- Total referring domains — more important than total backlinks. One link each from 100 different domains is stronger than 100 links from a single domain.
- Domain authority distribution — a natural profile has links from sites of varying authority. An unnatural spike of links from very high-authority domains can trigger scrutiny.
- Anchor text distribution — should be diverse. If 70% of your anchors are exact-match keywords, it looks manipulative. Natural profiles mix branded anchors, URL anchors, and generic phrases (a scripted check appears after this list).
- Toxic links — links from spammy directories, link farms, foreign-language gambling sites, or PBNs. Use your tool's toxicity score as a starting point, then manually review flagged domains.
- Lost links — valuable backlinks that have disappeared. If a high-authority site removed your link, investigate whether you can reclaim it through outreach.
- Competitor comparison — compare your link profile against the top three competitors for your primary keywords. Identify domains that link to them but not to you. These become outreach targets.
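Here is a minimal sketch of the anchor text check mentioned above. It assumes your backlink export is a CSV with an anchor column; Ahrefs, Semrush, and Moz each name their columns differently, so rename the filename and key accordingly.

```python
# Minimal sketch: summarise anchor text distribution from a backlink export.
# Assumes a CSV with an `anchor` column; adjust the filename and column name
# to match your tool's export format.
import csv
from collections import Counter

anchors = Counter()
with open("backlinks.csv", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        anchors[row["anchor"].strip().lower() or "(empty)"] += 1

total = sum(anchors.values())
for anchor, count in anchors.most_common(20):
    print(f"{count / total:6.1%}  {count:5d}  {anchor}")
```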
If you find genuinely toxic links, consider using Google's Disavow Tool. However, use it sparingly. Most "toxic" links are simply low-quality and do not require disavowal unless you have a manual penalty.
Step 8: Check Site Speed
Site speed directly affects both rankings and user experience. Google uses Core Web Vitals as a ranking signal, and slow pages have measurably higher bounce rates.
Run your homepage and five to ten key landing pages through PageSpeed Insights. Record these metrics:
- Largest Contentful Paint (LCP) — should be under 2.5 seconds. This measures when the largest visible element, usually a hero image or headline, finishes rendering. Common causes of poor LCP: unoptimised hero images, slow server response times, render-blocking CSS or JavaScript.
- Interaction to Next Paint (INP) — should be under 200 milliseconds. Measures responsiveness when users interact with the page. Heavy JavaScript frameworks and long-running tasks are the usual culprits.
- Cumulative Layout Shift (CLS) — should be under 0.1. Measures visual stability. Ads, images without dimensions, and dynamically injected content cause layout shifts.
- Total Blocking Time (TBT) — a lab metric that correlates with INP. Aim for under 200ms. Reduce by deferring non-critical JavaScript and breaking up long tasks.
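Testing pages one at a time in the browser gets tedious, so you may want to script it. The sketch below queries the public PageSpeed Insights v5 API for lab metrics; the audit names are standard Lighthouse IDs, but verify the response structure against your own output, and add an API key if you run it at volume.

```python
# Minimal sketch: query the public PageSpeed Insights v5 API for lab metrics.
# The audit names are standard Lighthouse IDs, but verify the response shape
# against your own output; add a `key` parameter if you run this at volume.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
AUDITS = ["largest-contentful-paint", "cumulative-layout-shift", "total-blocking-time"]

def lab_metrics(url, strategy="mobile"):
    resp = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=120)
    resp.raise_for_status()
    audits = resp.json().get("lighthouseResult", {}).get("audits", {})
    return {name: audits.get(name, {}).get("displayValue", "n/a") for name in AUDITS}

for page in ["https://example.com/", "https://example.com/pricing"]:
    print(page, lab_metrics(page))
```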
Use WebPageTest for deeper diagnostics. It shows a waterfall chart of every resource request, making it easy to identify the specific files slowing your page. Common fixes include enabling compression, setting cache headers, lazy-loading below-the-fold images, and removing unused CSS and JavaScript.
Step 9: Assess Mobile Experience
Google uses mobile-first indexing, meaning it evaluates the mobile version of your site for ranking purposes. A site that looks perfect on desktop but breaks on mobile will underperform in search.
Test your site on real devices, not just Chrome DevTools. Check for:
- Responsive layout — does the design adapt properly at 375px, 414px, and 768px widths? Content should not overflow horizontally.
- Tap target sizes — buttons and links should be at least 48x48 pixels with adequate spacing. Small, closely-packed links frustrate mobile users and Google flags them.
- Font readability — body text should be at least 16px on mobile. Users should not need to pinch-zoom to read your content.
- Intrusive interstitials — full-screen popups that block content on mobile can result in a ranking penalty. Cookie consent banners are exempt, but newsletter modals and app install prompts are not.
- Viewport configuration — ensure your pages include a proper viewport meta tag. Without it, mobile browsers render the page at desktop width and shrink it.
- Mobile content parity — the mobile version should contain the same content as desktop. Hidden tabs, accordions that collapse content, or "read more" truncation can cause Google to miss important text.
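As a rough automated check for the last two points, the sketch below fetches a page with desktop and mobile User-Agent strings, confirms a viewport tag is present, and compares visible word counts. The UA strings and the 90% threshold are illustrative, and a headless browser is far more reliable for JavaScript-heavy sites.

```python
# Minimal sketch: a rough viewport and content-parity check. Fetches the page
# with desktop and mobile User-Agent strings and compares visible word counts.
# The UA strings and threshold are illustrative.
import requests
from bs4 import BeautifulSoup

DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
MOBILE_UA = "Mozilla/5.0 (Linux; Android 13; Pixel 7) AppleWebKit/537.36 Mobile"

def fetch(url, user_agent):
    resp = requests.get(url, headers={"User-Agent": user_agent}, timeout=10)
    return BeautifulSoup(resp.text, "html.parser")

url = "https://example.com/"
desktop, mobile = fetch(url, DESKTOP_UA), fetch(url, MOBILE_UA)

viewport = mobile.find("meta", attrs={"name": "viewport"})
print("Viewport tag:", viewport.get("content") if viewport else "MISSING")

desktop_words = len(desktop.get_text(" ", strip=True).split())
mobile_words = len(mobile.get_text(" ", strip=True).split())
print(f"Desktop words: {desktop_words}, mobile words: {mobile_words}")
if mobile_words < desktop_words * 0.9:
    print("Possible content parity gap on the mobile version")
```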
Document every mobile issue you find along with the affected page and a screenshot. Mobile fixes often require CSS changes that development teams can batch together.
Step 10: Create Your Action Plan
An audit without an action plan is just a list of problems. The final step is organising your findings into a prioritised roadmap that your team can execute.
Sort every finding into one of four priority levels:
- Critical — issues that actively prevent indexing or cause security vulnerabilities. Fix within one week. Examples: entire sections blocked by robots.txt, sitewide noindex tag, expired SSL certificate, 500 errors on key pages.
- High — issues that significantly harm rankings or user experience. Fix within two weeks. Examples: slow LCP on landing pages, missing canonical tags causing duplicate content, keyword cannibalisation on your top five pages.
- Medium — issues that represent missed opportunities. Fix within 30 days. Examples: missing structured data, thin content on secondary pages, suboptimal title tags, images without alt text.
- Low — minor optimisations and best-practice improvements. Fix within 60 days. Examples: redirect chains with three hops, minor CLS on blog posts, meta descriptions over 155 characters.
For each issue, document the affected URLs, the specific fix required, and the expected impact. Assign an owner if you are working in a team. Set realistic deadlines and schedule a follow-up audit in 90 days to measure progress.
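If you kept your findings in the CSV from Step 1, a short script can turn it into a prioritised plan. The sketch below groups issues by severity and assigns target dates matching the schedule above; the column names follow the earlier example and should be adjusted to your own sheet.

```python
# Minimal sketch: group the findings CSV from Step 1 into a prioritised plan.
# Deadlines follow the schedule above, counted from today; column names match
# the earlier example.
import csv
from collections import defaultdict
from datetime import date, timedelta

DEADLINE_DAYS = {"critical": 7, "high": 14, "medium": 30, "low": 60}

grouped = defaultdict(list)
with open("audit_findings.csv", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        grouped[row["severity"].strip().lower()].append(row)

for severity in ["critical", "high", "medium", "low"]:
    issues = grouped.get(severity, [])
    due = date.today() + timedelta(days=DEADLINE_DAYS[severity])
    print(f"\n{severity.upper()} — {len(issues)} issue(s), target date {due}")
    for issue in issues:
        print(f"  {issue['issue']}  ({issue['affected_url']})")
```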
The best SEO audits are not one-off events. They are recurring checkpoints that keep your site aligned with search engine expectations and user needs. Aim to run a full audit quarterly and a lighter technical check monthly. Your rankings, traffic, and revenue will reflect the discipline.