Why Audit Quality Matters
A "proper" accessibility audit is far more than running an automated tool and counting violations. Courts increasingly question the methodology of accessibility audits and dismiss weak ones. When you claim "we had it audited," that claim is only as strong as your audit methodology. A poorly conducted audit can hurt your legal defense more than having no audit at all.
What is an Accessibility Audit?
An accessibility audit is a systematic evaluation of your website against accessibility standards (typically WCAG 2.1). Many companies claim to do "audits" when they only run automated tools; that is not sufficient. A proper audit includes:
Automated scanning for technical violations
Manual testing with real assistive technology
User pathway testing (forms, checkout, etc.)
Documentation of all findings
Prioritization of issues by severity
Remediation recommendations
Quick Scan: Automated tool only, identifies obvious violations, limited scope
Proper Audit: Automated + manual testing, comprehensive scope, real assistive technology, professional findings
Components of a Proper Accessibility Audit
1. Defined Scope
A proper audit clearly defines what will be tested:
Which pages will be tested (and why)
Which standards (WCAG 2.1 Level AA, Section 508, etc.)
Testing timeframe and dates
Target audience (all users, or specific personas)
Browsers and devices tested
Example scope statement:
Audit Scope - Acme Corp Website
Standard: WCAG 2.1 Level AA
Pages Tested: 25 representative pages
- Homepage
- Main content pages (5 samples)
- Contact form
- Product pages (3 samples)
- User account pages
- Checkout flow
- Footer content
Testing Methodology:
- Automated scanning + manual testing
- Screen reader testing (NVDA, JAWS)
- Keyboard navigation
- Browsers: Chrome, Firefox, Safari
- Devices: Desktop, Tablet, Mobile
Testing Dates: October 1-15, 2025
2. Automated Testing
Use multiple automated tools to catch obvious violations:
axe DevTools: Most comprehensive, good for developers
WAVE: Visual feedback, good for designers
Lighthouse: Built-in to Chrome, quick overview
Deque's axe-core: Underlying engine for many tools
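To make concrete what automated scanning does, here is a minimal sketch of one such check, missing alt text on images, using only Python's standard library. The class name and sample markup are illustrative, not part of any real tool:

```python
from html.parser import HTMLParser

class AltTextScanner(HTMLParser):
    """Illustrative scanner that flags <img> tags lacking an alt attribute."""
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the opening tag
        if tag == "img" and "alt" not in dict(attrs):
            self.violations.append(f"img missing alt at {self.getpos()}")

scanner = AltTextScanner()
scanner.feed('<img src="logo.png"><img src="hero.jpg" alt="Team photo">')
print(scanner.violations)  # flags only the first img, which has no alt
```

Real scanners such as axe-core run hundreds of rules like this against the live DOM, which is why running more than one tool catches more of the machine-detectable issues.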
Automated tools only catch coding errors
They can't evaluate user experience
They miss context and logic errors
Manual testing is essential
What automated tools catch:
Missing alt text on images
Color contrast violations
Missing form labels
Heading structure errors
Missing ARIA attributes
Keyboard trap detection (limited)
What automated tools miss:
Bad alt text (technically present, but meaningless)
Logical tab order issues
Screen reader usability problems
User experience with screen readers
Complex interaction patterns
Context-dependent violations
3. Manual Testing: The Critical Component
A proper audit includes extensive manual testing with real assistive technology.
Keyboard Testing
Test every interactive element with keyboard only:
Tab through entire page(s)
Verify focus order is logical
Check for focus traps
Verify focus indicators are visible
Test form submission via keyboard
Test modal dialogs
Screen Reader Testing
Test with at least 2-3 screen readers:
NVDA (Windows)
Download at nvaccess.org
Free and open-source
Most common for initial testing
Growing market share
JAWS (Windows)
Industry standard for professionals with disabilities
$90/year or $1,200+ one-time
Should always be included in audit testing
Different behavior than NVDA
VoiceOver (macOS/iOS)
Built into Mac computers and iOS devices
Free and widely used
Different interaction model than Windows readers
With each screen reader, test:
Page structure (headings, landmarks, sections)
Navigation menus and links
Form fields and error messages
Image descriptions (alt text quality)
Table structure and headers
Dynamic content and updates
Modal dialogs and focus traps
Skip links functionality
Voice Control Software
Dragon NaturallySpeaking (Windows)
Voice Control (Mac)
Mobile Testing
Built-in accessibility on mobile devices
VoiceOver on iPhone/iPad
TalkBack on Android devices
Responsive design verification
Touch target size validation
4. Systematic Testing Process
Run automated tools (axe, WAVE, Lighthouse)
Manually inspect page structure
Test with keyboard (Tab through entire page)
Test with NVDA screen reader
Test with JAWS screen reader
Test with mobile screen reader (VoiceOver)
Test color contrast for all text
Test mobile responsiveness
Document all findings
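The color contrast step above can be automated exactly, because WCAG 2.1 defines contrast as a formula over relative luminance. A minimal sketch (the function names are illustrative; the math follows the WCAG definition):

```python
def relative_luminance(hex_color: str) -> float:
    """WCAG 2.1 relative luminance of an sRGB color like '#666666'."""
    hex_color = hex_color.lstrip("#")
    channels = [int(hex_color[i:i + 2], 16) / 255 for i in (0, 2, 4)]
    # Linearize each sRGB channel per the WCAG definition
    linear = [c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
              for c in channels]
    r, g, b = linear
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio("#000000", "#ffffff"), 1))  # 21.0, the maximum
print(contrast_ratio("#666666", "#ffffff") >= 4.5)     # True: passes AA for normal text
```

Level AA requires a ratio of at least 4.5:1 for normal text and 3:1 for large text, which is why tools can flag contrast failures reliably.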
5. Issue Severity Classification
A proper audit prioritizes issues by severity:
Critical
Makes entire page unusable for people with disabilities
Prevents form submission
Prevents access to critical content
Focus traps or keyboard traps
Missing form labels
Serious
Severely impacts usability
Affects multiple pages
Screen reader compatibility issues
Heading structure errors
Color contrast failures
Moderate
Impacts specific user actions
Works but is confusing
Alt text could be better
Minor keyboard accessibility issues
Minor
Cosmetic or minor issues
Affects edge cases
Polish improvements
Advisory
Best practices recommendations
Performance optimizations
Future enhancements
6. WCAG Criterion Coverage
A proper audit systematically checks all relevant WCAG 2.1 criteria and documents the compliance status of each. Commonly tested criteria include:
1.1.1 Non-text Content
1.3.1 Info and Relationships
2.1.1 Keyboard
2.1.2 No Keyboard Trap
2.4.3 Focus Order
3.2.1 On Focus
4.1.2 Name, Role, Value
1.4.3 Contrast (Minimum)
2.4.7 Focus Visible
3.3.1 Error Identification
3.3.3 Error Suggestion
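Documenting per-criterion status can be as simple as a lookup table from which coverage and failures are derived. A sketch with hypothetical audit results:

```python
# Hypothetical audit log: criterion -> "pass", "fail", or "not tested"
results = {
    "1.1.1 Non-text Content": "fail",
    "1.3.1 Info and Relationships": "pass",
    "2.1.1 Keyboard": "pass",
    "2.1.2 No Keyboard Trap": "pass",
    "2.4.3 Focus Order": "fail",
    "4.1.2 Name, Role, Value": "not tested",
}

tested = [c for c, status in results.items() if status != "not tested"]
failed = [c for c, status in results.items() if status == "fail"]
coverage = len(tested) / len(results)
print(f"Coverage: {coverage:.0%}, failures: {failed}")
```

A report built this way makes it obvious which criteria were never evaluated, which is exactly the gap courts probe when an audit's methodology is challenged.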
7. Representative Sample Size
Testing page samples should be representative of your entire site:
Small sites
Test ALL pages
Or minimum 10 pages
Medium sites
Test minimum 25-50 pages
Representative of all content types
All major workflows
Large sites
Test minimum 50-100 pages
All templates tested
Statistical sample if templates are consistent
E-commerce sites should always include:
Homepage
Product listing pages
Product detail pages
Shopping cart
Checkout flow (critical)
Search results
Filters and facets
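The sampling guidance above can be sketched as a rule of thumb. Note that the 50- and 500-page cutoffs between "small", "medium", and "large" sites are assumptions for illustration, not part of any standard:

```python
def minimum_sample(total_pages: int) -> int:
    """Suggested minimum audit sample size.
    The 50/500 site-size cutoffs are illustrative assumptions."""
    if total_pages <= 50:        # small site: test all pages
        return total_pages
    if total_pages <= 500:       # medium site: 25-50 pages
        return min(max(25, total_pages // 10), 50)
    return min(max(50, total_pages // 10), 100)  # large site: 50-100 pages

print(minimum_sample(30), minimum_sample(300), minimum_sample(5000))
```

Whatever the count, the sample must still cover every template and every major workflow; a large random sample that skips the checkout flow is not representative.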
8. Comprehensive Documentation
A proper audit includes detailed documentation:
Audit scope and objectives
Audit dates and timeframe
Testing methodology
Tools used (versions)
Standards targeted (WCAG 2.1 Level AA)
Pages tested (with URLs)
Screen readers and browsers tested
Tester names and credentials
Each WCAG criterion tested and status
All violations found with:
Location (page URL, element)
Severity level
WCAG criterion violated
Description of the problem
How to fix it
Steps to reproduce
Screenshots or examples (when applicable)
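A structured finding record keeps every violation documented consistently. A minimal sketch whose field names mirror the checklist above (the class, URL, and sample values are hypothetical):

```python
from dataclasses import dataclass, asdict

@dataclass
class Finding:
    """One documented violation; fields mirror the audit checklist."""
    location: str          # page URL and element selector
    severity: str          # Critical / Serious / Moderate / Minor
    wcag_criterion: str    # e.g. "1.1.1 Non-text Content"
    description: str
    how_to_fix: str
    steps_to_reproduce: str

finding = Finding(
    location="https://example.com/contact, input#email",
    severity="Critical",
    wcag_criterion="3.3.2 Labels or Instructions",
    description="Email field has no programmatically associated label.",
    how_to_fix='Add <label for="email"> or an aria-label to the input.',
    steps_to_reproduce="Tab to the email field with NVDA running; no name is announced.",
)
print(asdict(finding))
```

Exporting findings as dictionaries (via asdict) makes it easy to generate spreadsheets or JSON reports that remediation teams and legal counsel can both consume.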
9. Handling False Positives
Automated tools sometimes report violations that aren't actually problems. A proper audit distinguishes three outcomes for each reported issue:
True Positive: a real violation that needs fixing.
Example: form input without a label.
Status: FIX REQUIRED
False Positive: the tool reports an issue, but the element is actually compliant.
Example: axe reports low contrast on:
<div style="color: #666; background: white"> [Decorative design element] </div>
Manual Review: FALSE POSITIVE
Reason: Element is decorative (aria-hidden), not text
Status: NO ACTION NEEDED
Context-Dependent: the issue requires manual review to confirm.
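The triage outcomes above can be modeled so that only confirmed violations flow into the remediation plan. A sketch with hypothetical tool output:

```python
from enum import Enum

class Triage(Enum):
    """Outcome of manually reviewing a tool-reported issue."""
    TRUE_POSITIVE = "fix required"
    FALSE_POSITIVE = "no action needed"
    NEEDS_REVIEW = "manual review required"

# Hypothetical tool output after a human pass
reported = [
    ("Form input without label", Triage.TRUE_POSITIVE),
    ("Low contrast on decorative aria-hidden element", Triage.FALSE_POSITIVE),
    ("Ambiguous link text", Triage.NEEDS_REVIEW),
]

to_fix = [issue for issue, triage in reported if triage is Triage.TRUE_POSITIVE]
print(to_fix)  # only the confirmed violation remains
```

Recording the false positives, with the reason each was dismissed, matters as much as recording the fixes: it documents that a human actually reviewed the tool output.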
10. Remediation Plan & Timeline
A proper audit includes a remediation roadmap: