What this guide is for
Accessibility audits vary wildly. Some give you a practical fix list. Others give you noise, screenshots, and a score.
This guide sets expectations. It helps you buy the right thing. It also helps you compare suppliers fairly.
What a proper accessibility audit includes
1) Scope and journeys
A good audit starts by agreeing what will be tested. It focuses on key journeys, not a random list of pages.
- A clear target standard, usually WCAG 2.2 AA.
- A list of key journeys, chosen with you. For example, donate, enquire, apply, book, buy.
- A set of representative templates, not every URL on the site.
- Known constraints called out early, such as embedded third-party tools you cannot control.
2) Manual testing
Automated scanning finds some issues. Manual testing finds the issues that block real people.
- Keyboard-only testing across key journeys, including navigation, menus, modals, and forms.
- Focus checks: order, visibility, trapping, and returning focus after overlays.
- Form checks: labels, instructions, required fields, validation, and error messaging.
- Structure checks: headings, landmarks, lists, tables, and link text.
- Text resize checks at 200% zoom and reflow checks at 400% (a 320px-wide viewport), plus small-screen checks.
- Motion checks for carousels, animation, video, and reduced motion preferences where relevant.
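Some of these checks can be partly tooled. As one illustrative sketch in JavaScript (the phrase list and function name are assumptions for the example, not part of any specific audit toolchain), a link-text check might flag generic labels that fail screen reader users:

```javascript
// Phrases that say nothing out of context; flagged by the link-text check.
// This list is illustrative, not exhaustive.
const GENERIC_PHRASES = ["click here", "read more", "more", "here", "learn more"];

// Given the visible text of each link on a template, return the ones
// a manual tester should look at more closely.
function flagGenericLinks(linkTexts) {
  return linkTexts.filter((text) =>
    GENERIC_PHRASES.includes(text.trim().toLowerCase())
  );
}
```

A tool like this narrows the search; a human still judges whether each flagged link blocks a journey.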
3) Assistive technology checks
You do not need testing across every screen reader on earth. You do need coverage that reflects common user setups.
- Screen reader spot checks on key templates and key interactions.
- Checks on dynamic components such as menus, accordions, tabs, modals, and notifications.
- Notes that describe what happens, not vague statements like “works fine”.
4) Clear findings and priorities
The output should help you fix issues quickly. It should also help non-technical stakeholders understand risk.
- A prioritised issue list, sorted by user impact and journey impact.
- For each issue: severity, affected templates, steps to reproduce, and expected behaviour.
- Suggested fixes, with examples where helpful.
- Clear mapping back to WCAG criteria where relevant, without turning the report into a legal document.
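The prioritisation above can be sketched as a simple sort. This is a minimal JavaScript example; the severity ranks and field names are assumptions for illustration, not a standard scheme:

```javascript
// Assumed severity scheme: lower rank means fix first.
const SEVERITY_RANK = { blocker: 0, major: 1, minor: 2 };

// Sort by severity first, then by how many key journeys each issue affects.
function prioritiseIssues(issues) {
  return [...issues].sort(
    (a, b) =>
      SEVERITY_RANK[a.severity] - SEVERITY_RANK[b.severity] ||
      b.journeysAffected - a.journeysAffected
  );
}
```

Whatever scheme a supplier uses, the point is the same: the ordering should be explicit and defensible, not implied by report layout.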
5) Retesting and closure
Fixes need verification. Without retesting, an audit becomes a to-do list you never trust.
- A retest pass after fixes land.
- Closure notes per issue, including evidence of the fix working in the journey.
- A short regression checklist for future releases.
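Closure is a binary state worth tracking explicitly. A minimal JavaScript sketch, assuming hypothetical `retested` and `evidence` fields on each issue record:

```javascript
// An issue only counts as closed once a retest pass has recorded evidence.
// Field names here are assumptions for the example.
function closureSummary(issues) {
  const closed = issues.filter((i) => i.retested && Boolean(i.evidence));
  return {
    closed: closed.map((i) => i.id),
    stillOpen: issues.filter((i) => !closed.includes(i)).map((i) => i.id),
  };
}
```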
What a proper accessibility audit never includes
If an audit leans on these, it is not a useful audit. It is a sales document.
- A single automated scan presented as an audit.
- A score with no issue detail, no reproduction steps, and no fix guidance.
- A recommendation to add an accessibility overlay instead of fixing the site.
- A list of WCAG criteria with no explanation of how your pages fail, or where users get blocked.
- A promise of full compliance without scope, caveats, and retest evidence.
What to ask for before you buy
- A sample report from a recent project, redacted if needed.
- A clear statement of what pages and journeys are in scope.
- The level of manual testing included.
- Whether fixing support is available, and how it is priced.
- Whether retesting is included, and what “done” means.
A quick way to compare two audits
If you have two suppliers, compare on output quality, not on how confident they sound.
- Does the report tell you what to fix, and how?
- Does it prioritise issues by user impact?
- Does it cover key journeys, not only random pages?
- Does it include retesting?
Next step
If you want confidence fast, start with your top journeys and your main templates. A good audit gives you a prioritised fix plan you can ship, then retest.