Enterprise SEO Audit: What to Demand From Any Agency

Three years ago a retailer came to Seahawk after paying £28,000 to a "top-tier" agency for an enterprise SEO audit. They handed me a 94-page PDF. Gorgeous formatting. Executive summary on page two, colour-coded priority matrices, the lot. And buried on page 61, almost as an afterthought, was the actual crawl data — a single Screaming Frog screenshot with no annotation, no fix recommendations, no tie to revenue. Twenty-eight thousand pounds.

That experience crystallised something I'd already suspected after building over 12,000 sites: most agencies sell the appearance of an audit, not the audit itself. So if you're about to commission one — or already have one sitting in your inbox and you're wondering whether it's worth the paper it's printed on — this is what you should actually be demanding.

---

The Brief Has to Come Before the Audit

Here's the thing most clients don't push back on: the agency starts crawling before they've asked the right questions.

A legitimate enterprise audit begins with a scoping call where you talk more than they do. What's the site architecture? Monolith CMS, headless, hybrid? How many indexable URLs — 10,000 or 10 million? Are there subdomains, ccTLDs, hreflang implementations across markets? What does "success" mean to your CFO, not your marketing director?

If an agency skips this and goes straight to "we'll run the crawl and be back in two weeks," that's a red flag. A big one.

I always insist on a written scope document before any technical work starts. One page is fine. It needs to list the exact domains being audited, the tools being used, what's in scope and — critically — what's explicitly out of scope. Without that, you'll end up with a report that audits the wrong thing beautifully.

---

Technical Crawl: The Foundation, Not the Whole Building

A crawl report from Screaming Frog or Sitebulb is table stakes. Every agency does this. What separates a real audit from a template job is what happens after the crawl data comes in.

What the crawl must cover

  • Crawl budget analysis — not just "here are your 4xx errors" but an actual breakdown of how Googlebot is spending its crawl allocation across your site
  • Redirect chains (more than two hops at enterprise scale is genuinely costly; a quick hop-counting sketch follows this list)
  • Canonical conflicts, especially nasty when you've got faceted navigation or a Shopify Plus setup with collection and product URL overlaps
  • Core Web Vitals per template type, not just site-wide averages — a homepage passing LCP means nothing if your 50,000 PDP pages are failing
  • JavaScript rendering issues, confirmed via a render comparison (raw HTML vs. rendered DOM), not guessed at
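
The redirect-chain point is easy to sanity-check before an agency ever touches the site. A minimal sketch, assuming the Python requests library; the URL and the hop cap are placeholders, and at scale you'd feed this from a crawl export rather than a hand-typed list:

```python
# Hedged sketch: count redirect hops with the requests library.
# The URL list and the hop cap are placeholders.
from urllib.parse import urljoin

import requests

MAX_HOPS = 10  # safety cap so a redirect loop can't spin forever

def redirect_chain(url: str) -> list[str]:
    """Follow redirects manually and return the full chain of URLs."""
    chain = [url]
    for _ in range(MAX_HOPS):
        resp = requests.get(chain[-1], allow_redirects=False, timeout=10)
        if resp.status_code not in (301, 302, 303, 307, 308):
            break
        # Location headers may be relative, so resolve against the current URL
        chain.append(urljoin(chain[-1], resp.headers["Location"]))
    return chain

for url in ["https://example.com/old-category/"]:  # placeholder input
    chain = redirect_chain(url)
    hops = len(chain) - 1
    if hops > 2:  # the "more than two hops" threshold from the list above
        print(f"{hops} hops: {' -> '.join(chain)}")
```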

Seahawk had a B2B SaaS client last year — 180,000 URLs, Angular front-end — where the crawl looked clean on the surface. 200s everywhere. But when we ran a render comparison using Google Search Console's URL Inspection tool alongside a manual Puppeteer check, about 40% of their body copy was invisible to Googlebot. The crawl report alone would have completely missed it. That's the kind of thing an enterprise audit needs to catch.
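
If you want to replicate that check, the mechanics are simple: fetch the raw HTML, render the same URL in a headless browser, and compare how much copy exists in each. We used Puppeteer on that engagement; the sketch below uses Playwright's Python API to the same effect, and the URL and the 0.6 threshold are illustrative rather than what we shipped:

```python
# Rough render-comparison sketch: raw HTML vs. rendered DOM.
# Assumes requests and Playwright (run `playwright install chromium` once);
# the URL and the threshold are illustrative.
import re

import requests
from playwright.sync_api import sync_playwright

def visible_text_length(html: str) -> int:
    """Crude proxy for body copy: drop scripts/styles, strip tags, count."""
    html = re.sub(r"<(script|style)[\s\S]*?</\1>", " ", html, flags=re.I)
    return len(re.sub(r"<[^>]+>", " ", html))

def pre_render_ratio(url: str) -> float:
    raw_html = requests.get(url, timeout=10).text
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")
        rendered_html = page.content()
        browser.close()
    # Close to 1.0: most copy ships in the raw HTML.
    # Well below 1.0: the copy only exists after JavaScript runs.
    return visible_text_length(raw_html) / visible_text_length(rendered_html)

ratio = pre_render_ratio("https://example.com/some-page")  # placeholder URL
if ratio < 0.6:  # illustrative threshold
    print(f"Only {ratio:.0%} of the rendered copy exists pre-render")
```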

Log file analysis

Honestly, this is where I separate agencies who know their craft from those who don't. Ask directly: "Will you analyse our server logs?" If they hesitate, or say "we'll use GSC data as a proxy," that's not good enough at scale.

Log file analysis tells you what Googlebot is actually doing, not what you think it's doing. Which URLs is it visiting daily? Which are being ignored? Is it wasting crawls on pagination nobody links to? You can't answer those questions from GSC alone.
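
A minimal version of that analysis is genuinely not much code. The sketch below assumes access logs in the standard combined format and buckets Googlebot hits by top-level path; the log path is a placeholder, and a production version should verify hits via reverse DNS, because anyone can spoof a user agent string:

```python
# Minimal log-file sketch: where is Googlebot actually spending its crawls?
# Assumes combined log format; the path is a placeholder. User agents can
# be spoofed, so verify with reverse DNS before trusting the counts.
import re
from collections import Counter

LOG_PATH = "access.log"  # placeholder
# Combined format: IP, identd, user, [timestamp], "METHOD path HTTP/x",
# status, bytes, "referer", "user agent"
LINE_RE = re.compile(
    r'^\S+ \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (\S+) [^"]*" \d{3} \S+ "[^"]*" "([^"]*)"'
)

hits_per_section = Counter()
with open(LOG_PATH) as fh:
    for line in fh:
        m = LINE_RE.match(line)
        if not m:
            continue
        path, user_agent = m.groups()
        if "Googlebot" not in user_agent:
            continue
        # Bucket by first path segment: /products/..., /blog/..., etc.
        section = "/" + path.lstrip("/").split("/", 1)[0]
        hits_per_section[section] += 1

for section, hits in hits_per_section.most_common(20):
    print(f"{hits:>8}  {section}")
```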

---

On-Page and Content Signals

Technical SEO without content analysis is half a job.

The audit needs to assess topical authority — are you actually covering the subjects you're trying to rank for, or are you thin in key areas? I use a combination of Semrush's topic research and a manual review of your top 50 landing pages by organic traffic. The manual bit matters. Automated content scores (looking at you, Surfer SEO) are useful directionally but they're not a substitute for someone actually reading your pages and asking "does this answer the query better than the competitor?"

Cannibalisation mapping

This is the one thing I see skipped more than anything else in audits for large sites. Keyword cannibalisation — where multiple URLs compete for the same intent — is endemic at enterprise scale. Any site with a blog, a resources section, and a product catalogue is almost certainly cannibalising itself somewhere.

The audit deliverable here should be an actual spreadsheet: query cluster, competing URLs, traffic split in GSC, and a recommendation for which URL to consolidate to. Not a paragraph saying "cannibalisation was found." A spreadsheet you can hand to a developer.
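
You can get most of the way to that spreadsheet with pandas and a query-by-page export (the Search Console API returns both dimensions together). A sketch, with the filename and column names as assumptions about your export:

```python
# Cannibalisation sketch: queries where more than one URL splits the clicks.
# Assumes an export with query, page and clicks columns; the filename and
# column names are assumptions about what your export looks like.
import pandas as pd

df = pd.read_csv("gsc_performance.csv")  # placeholder export

per_pair = df.groupby(["query", "page"], as_index=False)["clicks"].sum()
pages_per_query = per_pair.groupby("query")["page"].nunique()

# Any query served by more than one URL is a consolidation candidate
candidates = pages_per_query[pages_per_query > 1].index
report = per_pair[per_pair["query"].isin(candidates)].sort_values(
    ["query", "clicks"], ascending=[True, False]
)

report.to_csv("cannibalisation_map.csv", index=False)  # the dev-ready sheet
print(report.head(20))
```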

---

Backlink Profile Analysis

I've seen agencies present a backlink report that opens with "Your site has 2.4 million backlinks!" like that's inherently good news. A raw count, of links or of referring domains, doesn't mean anything without context.

What you need is a structured analysis that covers:

  1. Link velocity — is the profile growing organically or did you spike in 2021 and flatline since?
  2. Anchor text distribution — over-optimised exact-match anchors are still a manual action risk in 2024
  3. Toxic link assessment — using Ahrefs or Majestic, not just a traffic-light "toxicity score" from a tool that can't actually tell you why a link is problematic
  4. Competitor gap analysis — which domains are linking to your three closest competitors but not to you, and are those links realistically acquirable? (A set-difference sketch follows this list.)
  5. Lost links in the last 90 days — often the first signal that something went wrong after a site migration
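
Number four on that list reduces to set arithmetic once you have referring-domain exports in hand. A sketch assuming one CSV per site with a referring_domain column; the filenames and the column name are placeholders for whatever your backlink tool actually exports:

```python
# Link-gap sketch: referring domains that link to every close competitor
# but not to you. The referring_domain column and the filenames are
# placeholders for your tool's real export format.
import pandas as pd

def ref_domains(path: str) -> set[str]:
    return set(pd.read_csv(path)["referring_domain"].str.lower())

ours = ref_domains("our_site_refdomains.csv")
competitors = [
    ref_domains("competitor_a_refdomains.csv"),
    ref_domains("competitor_b_refdomains.csv"),
    ref_domains("competitor_c_refdomains.csv"),
]

# Domains linking to all three competitors but not to us tend to be
# the most realistically acquirable targets
gap = set.intersection(*competitors) - ours
for domain in sorted(gap):
    print(domain)
```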

The disavow file question is its own conversation. I'm cautious about recommending disavows unless there's a clear unnatural link pattern and a manual action history. Too many agencies disavow aggressively as a show of activity. Most of the time it does nothing. Sometimes it makes things worse.

---

Site Architecture and Internal Linking

At enterprise scale, internal linking isn't an afterthought. It's how PageRank flows. And it's almost always broken in ways nobody's looked at for three years.

Back in 2019 a client handed me a brief for a news and media site — around 2 million indexed pages. Their newest articles were getting crawled and indexed fine, but content older than 18 months had almost no internal links pointing to it, and their organic traffic on evergreen content had been declining steadily for two years. Nobody had connected those two facts. One internal linking audit and a silo restructure later, that evergreen traffic recovered 34% in four months.

The audit deliverable for architecture should include:

  • A visual site map (not a sitemap.xml, an actual diagram) showing the depth of each major section
  • Orphan page identification — URLs with no internal links pointing to them whatsoever (a set-arithmetic sketch follows this list)
  • Link equity flow analysis: which pages have high authority but aren't passing it anywhere useful?
  • Navigation audit — are your primary nav links going to the highest-value conversion pages, or to an "About Us" section nobody clicks?
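
The orphan check in particular is straightforward set arithmetic: anything in your sitemaps that the crawl never found an internal link to. A sketch assuming a plain-text sitemap URL list and a crawl export with one row per internal link; filenames and the target column are placeholders:

```python
# Orphan-page sketch: sitemap URLs that no internal link points to.
# Filenames and the `target` column name are placeholders for whatever
# your sitemap dump and crawl export actually contain.
import pandas as pd

with open("sitemap_urls.txt") as fh:
    sitemap_urls = {line.strip() for line in fh if line.strip()}

crawl_links = pd.read_csv("internal_links.csv")  # one row per internal link
linked_urls = set(crawl_links["target"].str.strip())

orphans = sitemap_urls - linked_urls
print(f"{len(orphans)} orphan URLs")
for url in sorted(orphans)[:50]:
    print(url)
```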

---

What the Deliverable Should Actually Look Like

Let's be specific. A proper enterprise SEO audit deliverable is not one PDF.

It's a package:

  • Executive summary (2–3 pages max) written for a non-technical stakeholder — no jargon, tied to revenue and traffic opportunity estimates
  • Technical audit doc — full crawl findings, log file analysis, rendering issues, speed metrics — with severity ratings and specific fix instructions
  • Content and on-page doc — cannibalisation map, thin content flags, topical gap analysis
  • Backlink report — the five components I listed above, in a format someone can act on
  • Architecture and internal linking doc — orphan pages, silo recommendations, nav review
  • Prioritised roadmap — a numbered list, not a matrix, that tells your dev team what to fix in what order and why

That last bit matters more than anything. Prioritisation. I've read audits that flagged 400 issues with no indication of which three to fix first. That's not an audit. That's a data dump.
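
For the avoidance of doubt about what "a numbered list, not a matrix" means in practice, here's a deliberately simple sketch: findings as records with severity and effort, sorted into the order a dev team should work through. The fields and the scoring scales are illustrative, not a standard:

```python
# Roadmap sketch: findings as records, sorted into a numbered fix list.
# The fields and the severity/effort scales are illustrative.
from dataclasses import dataclass

@dataclass
class Finding:
    title: str
    severity: int  # 1 (cosmetic) to 5 (revenue-affecting)
    effort: int    # 1 (config change) to 5 (re-platform)
    why: str

findings = [
    Finding("Fix hreflang pointing at 404s", 5, 2, "international rankings at risk"),
    Finding("Consolidate cannibalising blog URLs", 4, 3, "clicks split across six posts"),
    Finding("Compress hero images on the PDP template", 3, 1, "LCP failing at scale"),
]

# Highest severity first; cheapest fixes break ties
for i, f in enumerate(sorted(findings, key=lambda f: (-f.severity, f.effort)), 1):
    print(f"{i}. {f.title} (severity {f.severity}, effort {f.effort}): {f.why}")
```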

---

How to Vet the Agency Before You Commission

A few things I ask before I'd trust anyone with an enterprise audit.

First: show me a redacted sample deliverable. Not a template, a real one. If they won't, walk away.

Second: who specifically is doing the work? At a lot of agencies, the senior person sells the audit and a junior analyst delivers it. That's fine if the senior is reviewing and signing off. But you should know who's writing the crawl analysis. Ask by name.

Third: what's your process when you find something unexpected mid-audit? Enterprise sites are full of surprises — sudden traffic drops, staging environments accidentally indexed, hreflang pointing to 404s. A good agency has a protocol for escalating these mid-audit rather than burying them in appendix F.

And honestly? Ask them what the biggest mistake they've ever made on an enterprise audit was. The answer tells you everything. An agency that says "we've never really had a major issue" either hasn't done many enterprise audits or isn't being straight with you. Every audit at scale uncovers something uncomfortable. That's the whole point.

---

FAQ

How long should an enterprise SEO audit take?

For a site between 50,000 and 500,000 URLs, four to six weeks is reasonable if done properly. I'm suspicious of agencies promising a two-week turnaround at that scale — either they're skipping log file analysis, or the "audit" is an automated report with light commentary. Anything over ten weeks and you should ask what's actually taking so long.

What should an enterprise SEO audit cost?

Honestly, there's a wide range. For a serious audit — proper technical crawl, log file analysis, content review, backlink assessment, architecture review — you're looking at £8,000–£25,000 depending on site complexity, market scope, and whether hreflang or multi-domain setups are involved. Anything under £4,000 for a genuinely large site is almost certainly template work.

Should the agency fixing issues also run the audit?

There's an obvious conflict of interest there, but I don't think it automatically disqualifies them. What matters is that the audit findings are documented clearly enough that a different agency could implement them. If the audit is so vague that only the audit agency can interpret it, that's a problem regardless of whether it's intentional.

Do I need an enterprise audit if we already had one 18 months ago?

Yes, if the site has gone through a migration, a major CMS change, or significant URL restructuring since then. Also yes if organic traffic has shifted more than 20% in either direction — something changed and you need to know what. Audits aren't a one-time thing at enterprise scale. Annual is the minimum for large, active sites.

What's the single most overlooked part of an enterprise SEO audit?

Log file analysis. Every time. It's time-consuming, requires server access that clients are sometimes nervous to grant, and the findings aren't as visually flashy as a Core Web Vitals graph. But it's the closest thing you'll get to reading Googlebot's mind. No serious enterprise audit should skip it.

---

The 94-page PDF from that retailer is still the benchmark I measure everything against — in reverse. If your audit doesn't have a prioritised fix list, real crawl budget data, and a log file analysis, you haven't received an audit. You've received a report. There's a difference, and it's worth the argument to demand the right one.
