Ahrefs Site Audit: Complete Guide to Using It Effectively
Learn how to use Ahrefs Site Audit to find and fix technical SEO issues. Covers setup, crawl settings, reports, and interpreting results.
Ahrefs Site Audit is one of the most capable technical SEO crawlers built into a broader SEO platform. Unlike standalone crawlers such as Screaming Frog, it runs entirely in the cloud, stores historical crawl data, and presents issues in a way that makes prioritisation straightforward even for teams without deep technical SEO experience.
This guide draws on experience using Ahrefs Site Audit across hundreds of projects, ranging from five-page local business sites to enterprise domains with millions of URLs. It covers everything you need to know to get meaningful results from the tool: not just which buttons to press, but how to interpret what it tells you and what to fix first.
What Is Ahrefs Site Audit
Ahrefs Site Audit is a cloud-based website crawler that scans your site for over 170 predefined technical and on-page SEO issues. It is part of the broader Ahrefs platform, which means audit findings can be cross-referenced with backlink data, keyword rankings, and organic traffic estimates without switching tools.
The crawler works by following links from your specified start URL, much like Googlebot would. It fetches each page, parses the HTML (and optionally renders JavaScript), then checks for a comprehensive list of issues grouped by category. Results are stored in your Ahrefs dashboard, where you can track changes over time.
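The discovery step described above can be sketched in a few lines. This is a toy illustration of how any link-following crawler (not Ahrefs' actual implementation) finds new internal URLs in a fetched page: extract hrefs, resolve them against the page URL, and keep only same-host links for the crawl queue.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags, the raw material for crawl discovery."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_internal_links(html, base_url):
    """Resolve hrefs against the page URL and keep only same-host links,
    mirroring how a crawler queues newly discovered internal URLs."""
    parser = LinkExtractor()
    parser.feed(html)
    host = urlparse(base_url).netloc
    internal = set()
    for href in parser.links:
        absolute = urljoin(base_url, href)
        if urlparse(absolute).netloc == host:
            internal.add(absolute.split("#")[0])  # drop fragments
    return internal

page = '<a href="/about">About</a> <a href="https://other.example/x">Ext</a>'
print(extract_internal_links(page, "https://example.com/"))
```

A real crawler repeats this for every queued URL until no new internal links appear, which is why a single start URL is enough to cover a well-linked site.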
The key differentiator from competitors like Semrush is the depth of crawl data available. Ahrefs provides raw crawl data alongside the issue reports, so you can export full URL lists with status codes, canonical tags, word counts, and dozens of other data points. This makes it useful both as an automated issue checker and as a data source for custom analysis.
Ahrefs also offers a free version called Ahrefs Webmaster Tools, which gives verified site owners access to Site Audit with limited crawl credits. It is one of the better free options available for small site owners who want a proper technical audit without committing to a paid subscription.
Setting Up Your First Crawl
To run your first audit, you need to add your site as a project in Ahrefs. Navigate to Site Audit from the main menu, click "New project", and enter your domain. Ahrefs will ask you to verify ownership through DNS, HTML file upload, or by connecting Google Search Console. Verification is required to crawl your site.
Once verified, you configure the crawl settings. The key decisions at this stage are:
- Crawl scope: Choose whether to crawl the entire domain, a specific subdomain, or a subfolder. For most sites, crawling the entire domain is correct. If you have a large site with distinct sections (such as a blog on a subdomain), you may want to audit them separately.
- Crawl speed: Ahrefs lets you set the maximum number of pages per second. For small to medium sites, leave this at the default. For large or resource-constrained servers, reduce the speed to avoid overloading your hosting. A crawl rate of 2-3 pages per second is safe for most shared hosting environments.
- JavaScript rendering: Enable this if your site relies on client-side rendering (React, Vue, Angular, or heavy JavaScript content loading). JavaScript rendering uses more crawl credits and takes longer, but without it, Ahrefs will only see the initial HTML response, missing dynamically loaded content.
- Crawl limit: Set a maximum number of pages if your site is very large and you want to stay within your plan's crawl credit allowance.
- User agent: Choose between Ahrefs' own bot user agent or a Googlebot user agent. Using Googlebot can be useful if your server returns different content based on the user agent, but be aware this may trigger different caching behaviour.
After configuring these settings, start the crawl. Small sites (under 1,000 pages) typically complete within minutes. Larger sites can take several hours. Ahrefs sends an email notification when the crawl finishes.
Set up a recurring crawl schedule from the project settings. Weekly crawls are ideal for active sites where content and code change frequently. Monthly is sufficient for smaller sites with infrequent changes. Scheduled crawls let you track your Health Score and issue counts over time, which is essential for demonstrating progress to clients or stakeholders.
Understanding the Dashboard
The Site Audit dashboard opens with the Health Score, a number from 0 to 100 that represents the overall technical health of your site. This score is calculated as the ratio of issue-free URLs to total URLs crawled, weighted by issue severity. A score above 80 is generally considered good, but the absolute number matters less than the trend. Watch for drops between crawls, which indicate new problems.
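To build intuition for how the score moves, here is a minimal sketch of the ratio described above. The exact weighting Ahrefs applies per issue severity is proprietary, so this simple error-free share is an approximation, not the real formula:

```python
def health_score(total_urls, urls_with_errors):
    """Approximate a Health Score as the share of crawled URLs that are
    error-free. Assumption: Ahrefs' severity weighting is not public, so
    this plain ratio only illustrates how the number responds to fixes."""
    if total_urls == 0:
        return 0.0
    return round(100 * (total_urls - urls_with_errors) / total_urls, 1)

# 1,200 URLs crawled, 96 of them carrying at least one error
print(health_score(1200, 96))  # 92.0
```

The useful takeaway is that the score is relative: fixing 50 errors on a 500-page site moves the number far more than fixing them on a 50,000-page site.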
Below the Health Score, you see the issue distribution chart showing how many errors, warnings, and notices were found. Ahrefs uses a three-tier severity system:
- Errors (red): Critical issues that are likely harming your SEO right now. These include broken pages (4xx/5xx status codes), missing or duplicate title tags, noindex on pages that should be indexed, and redirect chains. Fix these first.
- Warnings (yellow): Issues that may be affecting performance or could become errors if left unaddressed. Examples include slow pages, missing meta descriptions, images without alt text, and pages with thin content. Address these after clearing errors.
- Notices (blue): Informational flags that may or may not require action. These include things like pages blocked by robots.txt, external links with nofollow, or pages not in your sitemap. Review these but do not treat them as urgent.
The dashboard also shows a crawl overview with total pages crawled, HTTP status code distribution, and a visual breakdown of page types. The "Crawled pages" section shows how many pages were HTML, how many were redirects, and how many returned error codes. This gives you an immediate sense of your site's crawl health.
Key Reports Explained
Ahrefs Site Audit organises findings into several thematic reports. Understanding what each report covers helps you navigate the tool efficiently.
All Issues: This is the master list of every issue found, sorted by severity and frequency. Each issue type shows the number of affected URLs and can be expanded to see the full list. This is your primary working view. Start here, sort by errors, and work through them in order of impact.
Internal Pages: A full dataset of every internal URL crawled, with columns for status code, title, meta description, word count, incoming internal links, outgoing links, canonical URL, indexability, and more. You can filter, sort, and export this data. This report is invaluable for finding patterns that the automated issue checks might miss.
Links: Covers both internal and external link analysis. Shows broken links (404s), redirected links (301/302), orphan pages (pages with no incoming internal links), and link distribution. The orphan pages check is particularly useful because pages without internal links are effectively invisible to search engines.
Performance: Focuses on page speed metrics. Shows load time, page size, and the number of requests for each URL. While not as detailed as a Lighthouse audit, it highlights the slowest pages and the most common performance bottlenecks across your site.
HTML Tags: Analyses title tags, meta descriptions, H1 tags, and Open Graph tags across the site. Identifies duplicates, missing tags, tags that are too long or too short, and pages with multiple H1 elements. This report catches the most common on-page SEO issues.
Social Tags: Checks for Open Graph and Twitter Card meta tags. Missing social tags mean your pages will not display properly when shared on social media platforms, which can reduce click-through rates from social sharing.
Content Quality: Flags pages with thin content (low word count), duplicate content, and pages with a low text-to-HTML ratio. The thin content check is configurable, so you can set your own minimum word count threshold based on your content strategy.
Localisation: Checks hreflang implementation for multilingual or multi-regional sites. Hreflang is notoriously difficult to implement correctly, and this report catches common mistakes like missing return tags, incorrect language codes, and missing self-referencing tags.
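The return-tag rule is the one that trips sites up most often: if page A lists page B as an alternate, B must list A back. A small sketch of that reciprocity check (hypothetical URLs, and a simplified data shape rather than Ahrefs' internal model):

```python
def missing_return_tags(hreflang_map):
    """hreflang_map: {page_url: {lang_code: target_url}}.
    Valid hreflang requires reciprocity: if A lists B as an alternate,
    B must list A back. Returns (source, target) pairs missing the return tag."""
    problems = []
    for page, alternates in hreflang_map.items():
        for lang, target in alternates.items():
            if target == page:
                continue  # self-reference, nothing to check
            back = hreflang_map.get(target, {})
            if page not in back.values():
                problems.append((page, target))
    return problems

tags = {
    "https://example.com/en/": {"en": "https://example.com/en/",
                                "de": "https://example.com/de/"},
    "https://example.com/de/": {"de": "https://example.com/de/"},  # no en return tag
}
print(missing_return_tags(tags))
```

When a return tag is missing, search engines may ignore the entire hreflang relationship for that pair, which is why the report treats it as a priority fix.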
Incoming Links: Shows the internal link profile for each page, helping you identify pages that receive too few or too many internal links. A flat internal link distribution often means your most important pages are not receiving enough link equity.
Common Issues It Finds
Across hundreds of audits with Ahrefs, certain issues appear repeatedly. Understanding these common findings helps you anticipate what your audit will uncover.
Broken internal links (404 errors): The most universally common issue. Pages get deleted or URLs change without redirects, leaving broken links scattered throughout the site. Ahrefs shows you both the broken page and the pages linking to it, making fixes straightforward.
Redirect chains and loops: When one redirect points to another redirect, you have a chain. Three or more hops in a chain waste crawl budget and dilute link equity. Redirect loops (A redirects to B, B redirects to A) are even worse, creating pages that can never load.
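Chains and loops are easy to classify once you have a map of redirects, which the Internal Pages export gives you. A sketch of the logic (operating on a plain from-to dictionary, not live HTTP requests):

```python
def classify_redirect(start, redirect_map, max_hops=10):
    """redirect_map: {from_url: to_url}, e.g. built from exported crawl data.
    Follows hops from `start` and reports 'ok' (0-1 hops), 'chain' (2+ hops),
    or 'loop' (a URL repeats)."""
    seen = [start]
    current = start
    while current in redirect_map:
        current = redirect_map[current]
        if current in seen:
            return "loop", seen
        seen.append(current)
        if len(seen) > max_hops:
            break  # give up, mirroring how crawlers cap redirect depth
    hops = len(seen) - 1
    return ("chain" if hops >= 2 else "ok"), seen

redirects = {"/old": "/interim", "/interim": "/new",   # two-hop chain
             "/a": "/b", "/b": "/a"}                   # loop
print(classify_redirect("/old", redirects))
print(classify_redirect("/a", redirects))
```

The fix in both cases is the same: point every redirecting URL directly at its final destination.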
Missing or duplicate title tags: Pages without title tags lose a critical ranking signal. Duplicate title tags across multiple pages create ambiguity for search engines about which page to rank for a given query. Both are surprisingly common on larger sites.
Orphan pages: Pages that exist but have no internal links pointing to them. Search engines can only find these pages through the sitemap or external links, making them less likely to be crawled and indexed. Common causes include old blog posts that fell out of category archives, or landing pages created for paid campaigns.
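Conceptually, orphan detection is a set difference between URLs you know exist (the sitemap) and URLs anything links to (the crawl's link graph). A sketch with hypothetical paths:

```python
def find_orphans(sitemap_urls, link_graph):
    """link_graph: {page: set of internal URLs that page links to}.
    A sitemap URL is an orphan if no crawled page links to it."""
    linked_to = set()
    for targets in link_graph.values():
        linked_to |= targets
    return sorted(set(sitemap_urls) - linked_to)

sitemap = ["/", "/blog/old-post", "/landing/promo", "/services"]
graph = {"/": {"/services"}, "/services": {"/"}}
print(find_orphans(sitemap, graph))
```

This is exactly why combining the sitemap and link following as crawl sources matters: without the sitemap, the crawler would never see these pages at all.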
Missing alt text on images: An accessibility issue that also affects image search visibility. Ahrefs flags every image without alt text, along with images that have empty alt attributes.
Slow loading pages: Pages that take more than three seconds to load are flagged. Ahrefs identifies the specific pages and shows their load time, size, and request count so you can diagnose whether the issue is large images, too many scripts, or server response time.
Mixed content warnings: Pages served over HTTPS that load resources (images, scripts, stylesheets) over HTTP. This triggers browser security warnings and can prevent the page from being fully secure in the eyes of both users and search engines.
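Mixed content is mechanical to detect: scan the HTML of an HTTPS page for resource attributes that start with `http://`. A minimal scanner sketch (covering only the most common tag/attribute pairs, not every way a page can load a resource):

```python
from html.parser import HTMLParser

class MixedContentScanner(HTMLParser):
    """Flags http:// resource URLs on a page meant to be served over HTTPS."""
    RESOURCE_ATTRS = {("img", "src"), ("script", "src"),
                      ("link", "href"), ("iframe", "src")}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if (tag, name) in self.RESOURCE_ATTRS and value \
                    and value.startswith("http://"):
                self.insecure.append(value)

scanner = MixedContentScanner()
scanner.feed('<img src="http://cdn.example.com/logo.png">'
             '<script src="https://cdn.example.com/app.js"></script>')
print(scanner.insecure)
```

The common fix is a find-and-replace to protocol-relative or explicit `https://` URLs, assuming the resources are actually available over HTTPS.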
Advanced Settings
Beyond the basic crawl configuration, Ahrefs offers several advanced settings that can significantly improve the quality of your audit results.
URL rewrite rules: If your site uses URL parameters for tracking, sorting, or filtering, you can create rules to strip these parameters from URLs before crawling. This prevents the crawler from treating the same page with different parameters as separate URLs, which would inflate your page count and create false duplicate content issues.
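The underlying idea, stripping known parameters so URL variants collapse to one canonical form, looks like this (the parameter list is illustrative, not Ahrefs' default rule set):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical set of parameters you consider pure tracking noise.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "gclid", "fbclid"}

def normalise_url(url, drop=TRACKING_PARAMS):
    """Remove tracking parameters so URL variants collapse to one form,
    the same idea behind Site Audit's URL rewrite rules."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in drop]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

print(normalise_url("https://example.com/shoes?utm_source=x&colour=red"))
```

Note that meaningful parameters (like the `colour` filter here) survive; only the ones you explicitly list are stripped, which is how you avoid merging genuinely different pages.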
Crawl source: By default, Ahrefs discovers URLs by following links from your start URL. You can also upload a URL list or use your sitemap as the crawl source. Using both link following and sitemap as sources gives you the most complete picture, as it catches pages that exist in your sitemap but have no internal links (and vice versa).
Exclusion rules: Exclude specific URL patterns from the crawl using regex or prefix matching. Useful for excluding admin areas, staging environments behind the same domain, or sections of the site you do not want to audit (such as a forum or user-generated content area).
Custom robots.txt: Override your live robots.txt with custom directives for the audit crawl. This is useful if your robots.txt blocks certain sections that you want to audit, or if you want to test the impact of robots.txt changes before deploying them.
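You can sanity-check draft robots.txt directives locally before handing them to any crawler. Python's standard library ships a parser for exactly this (the draft rules below are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical draft robots.txt you want to validate before deploying.
draft = """\
User-agent: *
Disallow: /admin/
Disallow: /search
"""

parser = RobotFileParser()
parser.parse(draft.splitlines())

for url in ("https://example.com/admin/login",
            "https://example.com/search?q=shoes",
            "https://example.com/blog/post"):
    verdict = "allowed" if parser.can_fetch("*", url) else "blocked"
    print(url, "->", verdict)
```

One caveat: `urllib.robotparser` applies rules in file order, while Google uses longest-match precedence, so results can differ for overlapping Allow/Disallow rules. For simple directives like these, both agree.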
HTTP authentication: If your site (or a staging version of it) is behind HTTP authentication, you can provide credentials so the crawler can access it. This is essential for auditing staging environments before launch.
Crawl budget management: On paid plans, you have a monthly crawl credit allowance. Large sites with frequent crawl schedules can consume credits quickly. Monitor your usage in the account settings and adjust crawl frequency or scope if you are running low.
Ahrefs vs Alternatives
Ahrefs Site Audit competes directly with Semrush Site Audit and Screaming Frog. Each has distinct strengths.
Compared to Semrush, Ahrefs provides more granular raw crawl data and better data export capabilities. Semrush has a slight edge in user interface clarity and its thematic reports are easier for beginners to understand. Both tools check for a similar range of issues (170+ for Ahrefs, 140+ for Semrush), and both integrate with their respective keyword and backlink databases.
Compared to Screaming Frog, the key difference is deployment model. Screaming Frog runs locally on your machine, giving you unlimited control over crawl settings and custom extraction. Ahrefs runs in the cloud, which means no software installation, automatic scheduling, and historical data storage. Screaming Frog is more powerful for custom audits and large-scale data extraction, while Ahrefs is better for ongoing monitoring and team collaboration.
For agencies managing multiple client sites, Ahrefs offers a clean multi-project interface and the ability to share reports via links. Screaming Frog requires manual report generation and file sharing. Semrush offers similar cloud-based project management to Ahrefs, with the addition of white-label PDF reports on higher-tier plans.
If you already subscribe to Ahrefs for backlink analysis and keyword research, using its Site Audit is a no-brainer since you are already paying for it. If you need the most flexible and powerful desktop crawler, Screaming Frog is the better choice. For a detailed side-by-side breakdown, see our tool comparison guide.
Pricing
Ahrefs pricing is structured in four tiers, all of which include access to Site Audit:
- Lite ($99/month): 5 projects, 10,000 crawl credits per month. Sufficient for freelancers or small agencies managing a handful of sites. The crawl credit limit means you can audit sites with up to 10,000 pages monthly, or run smaller audits more frequently.
- Standard ($199/month): 20 projects, 500,000 crawl credits. The most popular tier for growing agencies. The substantial crawl credit increase supports larger sites and more frequent auditing schedules.
- Advanced ($399/month): 50 projects, 1.25 million crawl credits. Designed for agencies managing large client rosters or enterprise sites with hundreds of thousands of pages.
- Enterprise ($999/month): 100 projects, 5 million crawl credits. For large agencies or in-house teams at enterprise organisations with massive sites.
Ahrefs Webmaster Tools remains available as a free option for verified site owners. You get access to Site Audit for your own sites with 5,000 crawl credits per month. The data is slightly less comprehensive than paid plans, but it is more than enough for small site owners who want to monitor their technical health.
All paid plans include a 7-day trial period. Crawl credits reset monthly and do not roll over. If you run out of credits mid-month, crawls are paused until the next billing cycle. Consider your total page count across all projects when choosing a plan to ensure your credits can support your desired crawl frequency.
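The budgeting arithmetic is simple enough to sanity-check before picking a tier. Assuming one crawled URL consumes one credit (a simplification; rendered pages and other features may cost more), the example page counts below are hypothetical:

```python
def max_crawls_per_month(monthly_credits, pages_per_project):
    """Assumption: one crawled URL costs one credit. Given each project's
    page count, how many full crawls of every project fit in the allowance?"""
    total_pages_per_crawl = sum(pages_per_project)
    return monthly_credits // total_pages_per_crawl

# e.g. a 500,000-credit allowance spread across three sites
print(max_crawls_per_month(500_000, [12_000, 45_000, 3_500]))  # 8
```

If the answer comes out below your desired crawl frequency (say, 4 for weekly crawls), either reduce crawl scope per project or move up a tier.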
Compared to Semrush (starting at $129.95/month) and Screaming Frog (£199/year for the paid version), Ahrefs sits at the higher end for entry-level pricing. However, you are getting a complete SEO toolkit, not just an audit tool. If you would otherwise subscribe to separate tools for backlink analysis, rank tracking, and site auditing, Ahrefs can consolidate those costs.