Accessibility Audits: What a Proper Audit Includes
Why Audit Quality Matters
A "proper" accessibility audit is far more than running an automated tool and counting violations. Courts increasingly question the methodology of accessibility audits and dismiss weak ones.
When you claim "we had it audited," that claim is only as strong as your audit methodology. A poorly conducted audit can hurt your legal defense more than having no audit at all.
Legal Disclaimer
A11yscan is not a law firm and does not provide legal advice. We operate under best practices based on the WCAG guidelines, ADA requirements, and the rules of applicable jurisdictions. Courts don't always agree on terms and expectations for web accessibility, and legal standards can vary by jurisdiction. However, an accessible website works better for all users regardless of legal requirements. For specific legal guidance, consult a qualified attorney specializing in accessibility law.
What is an Accessibility Audit?
An accessibility audit is a systematic evaluation of your website against accessibility standards (typically WCAG 2.1 Level AA). It includes:
- Automated scanning for technical violations
- Manual testing with real assistive technology
- User pathway testing (forms, checkout, etc.)
- Documentation of all findings
- Prioritization of issues by severity
- Remediation recommendations
Audit vs. Quick Scan
- Quick Scan: Automated tool only, identifies obvious violations, limited scope
- Proper Audit: Automated + manual testing, comprehensive scope, real assistive technology, professional findings
Many companies claim to do "audits" when they only run automated tools. That's not sufficient.
Components of a Proper Accessibility Audit
1. Scope Definition
A proper audit clearly defines what will be tested:
- Which pages will be tested (and why)
- Which standards (WCAG 2.1 Level AA, Section 508, etc.)
- Testing timeframe and dates
- Target audience (all users, or specific personas)
- Browsers and devices tested
Example Scope Document
Audit Scope - Acme Corp Website
Standard: WCAG 2.1 Level AA
Pages Tested: 25 representative pages
- Homepage
- Main content pages (5 samples)
- Contact form
- Product pages (3 samples)
- User account pages
- Checkout flow
- Footer content
Testing Methodology:
- Automated scanning + manual testing
- Screen reader testing (NVDA, JAWS)
- Keyboard navigation
- Browsers: Chrome, Firefox, Safari
- Devices: Desktop, Tablet, Mobile
Testing Dates: October 1-15, 2025
2. Automated Testing
Use multiple automated tools to catch obvious violations:
- axe DevTools: Most comprehensive, good for developers
- WAVE: Visual feedback, good for designers
- Lighthouse: Built-in to Chrome, quick overview
- axe-core: Deque's open-source engine, the underlying rules library for many tools (including axe DevTools and Lighthouse)
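The automated pass can also be scripted, which makes re-scans reproducible and leaves a record for the audit file. A minimal sketch using Playwright with the @axe-core/playwright package; the URL is a placeholder:

```typescript
// Sketch: run axe-core against one page via Playwright.
// Assumes `playwright` and `@axe-core/playwright` are installed.
import { chromium } from 'playwright';
import AxeBuilder from '@axe-core/playwright';

async function scan(url: string): Promise<void> {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto(url);

  // Limit results to WCAG 2.1 A/AA rules using axe's rule tags.
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'])
    .analyze();

  for (const v of results.violations) {
    console.log(`${v.id} (${v.impact ?? 'n/a'}): ${v.help} [${v.nodes.length} node(s)]`);
  }
  await browser.close();
}

scan('https://example.com').catch(console.error);
```

A clean run here proves only that the automated rules passed; it says nothing about the issues these tools cannot see.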
Important: Automated Tools Miss 60-70% of Issues
- Automated tools only catch coding errors
- They can't evaluate user experience
- They miss context and logic errors
- Manual testing is essential
What Automated Tools Can Catch
- Missing alt text on images
- Color contrast violations
- Missing form labels
- Heading structure errors
- Missing ARIA attributes
- Keyboard trap detection (limited)
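Color contrast is on the "can catch" list because it is pure arithmetic from the WCAG spec. A small sketch of the WCAG 2.x relative-luminance and contrast-ratio formulas, useful for verifying a reported failure by hand:

```typescript
// Sketch: the WCAG 2.x contrast-ratio math that automated checkers implement.
function relativeLuminance(r: number, g: number, b: number): number {
  // r, g, b are sRGB channel values in 0..255.
  const [R, G, B] = [r, g, b].map((c) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  });
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

function contrastRatio(
  fg: [number, number, number],
  bg: [number, number, number],
): number {
  const l1 = relativeLuminance(...fg);
  const l2 = relativeLuminance(...bg);
  const [lighter, darker] = l1 >= l2 ? [l1, l2] : [l2, l1];
  return (lighter + 0.05) / (darker + 0.05);
}

// #666666 on white is about 5.7:1, which passes the 4.5:1 AA minimum
// for normal text but fails the 7:1 AAA threshold.
console.log(contrastRatio([102, 102, 102], [255, 255, 255]).toFixed(2));
```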
What Automated Tools Miss
- Bad alt text (technically present, but meaningless)
- Logical tab order issues
- Screen reader usability and the overall experience for screen reader users
- Complex interaction patterns
- Context-dependent violations
3. Manual Testing: The Critical Component
A proper audit includes extensive manual testing with real assistive technology.
Keyboard Testing
Test every interactive element with keyboard only:
- Tab through entire page(s)
- Verify focus order is logical
- Check for focus traps
- Verify focus indicators are visible
- Test form submission via keyboard
- Test modal dialogs
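Parts of this checklist can be scripted as a first pass before hands-on testing. A rough sketch, assuming Playwright, that tabs through a page, logs the focus order, and flags when focus appears stuck (a crude keyboard-trap probe, not a substitute for manual testing):

```typescript
// Sketch: walk the tab order and watch for focus getting stuck.
import { chromium } from 'playwright';

async function tabThrough(url: string, maxTabs = 50): Promise<void> {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto(url);

  let previous = '';
  let stuckCount = 0;

  for (let i = 0; i < maxTabs; i++) {
    await page.keyboard.press('Tab');
    // Describe the focused element: tag name plus a short label.
    const current = await page.evaluate(() => {
      const el = document.activeElement;
      if (!el) return 'none';
      const label = el.getAttribute('aria-label') ?? el.textContent?.trim().slice(0, 40) ?? '';
      return `${el.tagName}: ${label}`;
    });
    console.log(`${i + 1}. ${current}`);

    stuckCount = current === previous ? stuckCount + 1 : 0;
    if (stuckCount >= 3) {
      console.warn(`Possible keyboard trap near: ${current}`);
      break;
    }
    previous = current;
  }
  await browser.close();
}

tabThrough('https://example.com').catch(console.error);
```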
Screen Reader Testing
Test with at least 2-3 screen readers:
NVDA (Free, Windows)
- Download at nvaccess.org
- Free and open-source
- Most common for initial testing
- Growing market share
JAWS (Commercial)
- Industry standard for professionals with disabilities
- Roughly $90/year for a home license, or $1,200+ for a perpetual professional license
- Testing should always include JAWS
- Different behavior than NVDA
VoiceOver (Mac/iOS)
- Built into Mac computers and iOS devices
- Free and widely used
- Different interaction model than Windows readers
Screen Reader Testing Includes
- Page structure (headings, landmarks, sections)
- Navigation menus and links
- Form fields and error messages
- Image descriptions (alt text quality)
- Table structure and headers
- Dynamic content and updates
- Modal dialogs and focus traps
- Skip links functionality
Voice Control Testing
- Dragon NaturallySpeaking (Windows)
- Voice Control (Mac)
- Built-in accessibility on mobile devices
Mobile Testing
- VoiceOver on iPhone/iPad
- TalkBack on Android devices
- Responsive design verification
- Touch target size validation
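Touch-target size is one mobile check that automates well as a complement to on-device testing. A sketch assuming Playwright with an emulated phone; the 44x44 CSS-pixel threshold follows WCAG 2.1's 2.5.5 Target Size (Level AAA) and common platform guidance, so substitute whatever your scope document specifies:

```typescript
// Sketch: flag interactive elements smaller than 44x44 CSS pixels.
import { chromium, devices } from 'playwright';

async function checkTouchTargets(url: string): Promise<void> {
  const browser = await chromium.launch();
  const page = await browser.newPage({ ...devices['iPhone 13'] });
  await page.goto(url);

  const targets = page.locator('a, button, input, select, [role="button"]');
  const count = await targets.count();

  for (let i = 0; i < count; i++) {
    const box = await targets.nth(i).boundingBox();
    if (box && (box.width < 44 || box.height < 44)) {
      const label = (await targets.nth(i).innerText().catch(() => '')).slice(0, 40);
      console.warn(
        `Small target ${Math.round(box.width)}x${Math.round(box.height)}: ${label || '(no text)'}`,
      );
    }
  }
  await browser.close();
}

checkTouchTargets('https://example.com').catch(console.error);
```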
4. Systematic Testing Process
For Each Page Tested:
- Run automated tools (axe, WAVE, Lighthouse)
- Manually inspect page structure
- Test with keyboard (Tab through entire page)
- Test with NVDA screen reader
- Test with JAWS screen reader
- Test with mobile screen reader (VoiceOver)
- Test color contrast for all text
- Test mobile responsiveness
- Document all findings
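The automated steps in this loop can be run across the whole page sample in one script, with the raw output archived as part of the audit record. A sketch assuming @axe-core/playwright; the page list is a stand-in for your scope document. The manual steps (keyboard, NVDA, JAWS, VoiceOver) still happen by hand for every sampled page:

```typescript
// Sketch: run the automated pass over a page sample and save the results.
import { writeFileSync } from 'node:fs';
import { chromium } from 'playwright';
import AxeBuilder from '@axe-core/playwright';

const samplePages = [
  'https://example.com/',
  'https://example.com/contact',
  'https://example.com/checkout', // replace with the pages from your scope
];

async function auditSample(): Promise<void> {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  const record: Record<string, unknown> = {};

  for (const url of samplePages) {
    await page.goto(url);
    const results = await new AxeBuilder({ page }).analyze();
    record[url] = {
      testedAt: new Date().toISOString(),
      violationCount: results.violations.length,
      violations: results.violations,
    };
  }

  writeFileSync('automated-results.json', JSON.stringify(record, null, 2));
  await browser.close();
}

auditSample().catch(console.error);
```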
Example Testing Report
Page: Homepage
Date: October 5, 2025
Tester: [Name]

Automated Testing Results:
- axe DevTools: 12 violations, 8 warnings
- WAVE: 10 errors, 15 warnings
- Lighthouse: 89/100 accessibility score

Manual Testing:
✓ Keyboard navigation: PASS
  - All buttons keyboard accessible
  - Focus order logical (top to bottom)
  - No focus traps
✓ Screen Reader (NVDA): PASS
  - Page structure clear
  - Links descriptive
  - Images have alt text
✗ Screen Reader (JAWS): FAIL
  - Hero image alt text not announced
  - Navigation menu structure unclear
✓ Mobile (VoiceOver): PASS
  - Touch targets sufficient size
  - Responsive layout accessible
✓ Color Contrast: PASS
  - All text meets the 4.5:1 AA minimum

Findings: [List of violations and recommendations]
5. Issue Severity Classification
A proper audit prioritizes issues by severity:
Critical (Must Fix Immediately)
- Makes entire page unusable for people with disabilities
- Prevents form submission
- Prevents access to critical content
- Focus traps or keyboard traps
- Missing form labels
High (Fix Within 2-4 Weeks)
- Severely impacts usability
- Affects multiple pages
- Screen reader compatibility issues
- Heading structure errors
- Color contrast failures
Medium (Fix Within 1-2 Months)
- Impacts specific user actions
- Works but is confusing
- Alt text could be better
- Minor keyboard accessibility issues
Low (Fix Within 3 Months)
- Cosmetic or minor issues
- Affects edge cases
- Polish improvements
Informational
- Best practices recommendations
- Performance optimizations
- Future enhancements
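To keep these tiers consistent across dozens of findings, it helps to encode them. A sketch in TypeScript; the tier names and day counts simply mirror the timelines above and are not a formal standard:

```typescript
// Sketch: severity tiers with remediation deadlines derived mechanically.
type Severity = 'critical' | 'high' | 'medium' | 'low' | 'informational';

const remediationDays: Record<Severity, number | null> = {
  critical: 14, // fix immediately, within ~2 weeks
  high: 28, // 2-4 weeks
  medium: 60, // 1-2 months
  low: 90, // 3 months
  informational: null, // best-practice backlog, no deadline
};

function dueDate(found: Date, severity: Severity): Date | null {
  const days = remediationDays[severity];
  if (days === null) return null;
  return new Date(found.getTime() + days * 24 * 60 * 60 * 1000);
}

// A high-severity finding logged October 5 comes due in early November.
console.log(dueDate(new Date('2025-10-05'), 'high')?.toDateString());
```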
6. WCAG Criterion Coverage
A proper audit systematically checks every relevant WCAG 2.1 success criterion (50 in total across Levels A and AA). Representative examples:
Level A Criteria (Must Meet)
- 1.1.1 Non-text Content
- 1.3.1 Info and Relationships
- 2.1.1 Keyboard
- 2.1.2 No Keyboard Trap
- 2.4.3 Focus Order
- 3.2.1 On Focus
- 4.1.2 Name, Role, Value
Level AA Criteria (Must Also Meet for AA Conformance)
- 1.4.3 Contrast (Minimum)
- 2.4.7 Focus Visible
- 3.3.1 Error Identification
- 3.3.3 Error Suggestion
A proper audit documents compliance with each criterion.
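A simple per-criterion record makes that documentation mechanical and exposes coverage gaps. A sketch; the entries shown are the excerpt above, not the full list of 50 A/AA criteria:

```typescript
// Sketch: track a status for every WCAG criterion in scope.
type CriterionStatus = 'pass' | 'fail' | 'not-applicable' | 'not-tested';

interface CriterionResult {
  criterion: string; // e.g. "1.4.3"
  name: string; // e.g. "Contrast (Minimum)"
  level: 'A' | 'AA';
  status: CriterionStatus;
  notes?: string;
}

const coverage: CriterionResult[] = [
  { criterion: '1.1.1', name: 'Non-text Content', level: 'A', status: 'fail', notes: 'See Finding #47' },
  { criterion: '2.1.2', name: 'No Keyboard Trap', level: 'A', status: 'pass' },
  { criterion: '1.4.3', name: 'Contrast (Minimum)', level: 'AA', status: 'pass' },
  // ...one entry per criterion in scope
];

const gaps = coverage.filter((c) => c.status === 'not-tested');
console.log(`Criteria still without a recorded status: ${gaps.length}`);
```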
7. Representative Sample Size
Testing page samples should be representative of your entire site:
Small Sites (1-25 pages)
- Test ALL pages if feasible
- Otherwise, test a minimum of 10 representative pages
Medium Sites (25-500 pages)
- Test minimum 25-50 pages
- Representative of all content types
- All major workflows
Large Sites (500+ pages)
- Test minimum 50-100 pages
- All templates tested
- Statistical sample if templates are consistent
E-Commerce Sites
- Homepage
- Product listing pages
- Product detail pages
- Shopping cart
- Checkout flow (critical)
- Search results
- Filters and facets
8. Comprehensive Documentation
A proper audit includes detailed documentation:
What Should Be Documented
- Audit scope and objectives
- Audit dates and timeframe
- Testing methodology
- Tools used (versions)
- Standards targeted (WCAG 2.1 Level AA)
- Pages tested (with URLs)
- Screen readers and browsers tested
- Tester names and credentials
- Each WCAG criterion tested and status
- All violations found with:
  - Location (page URL, element)
  - Severity level
  - WCAG criterion violated
  - Description of the problem
  - How to fix it
  - Steps to reproduce
  - Screenshots or examples (when applicable)
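One way to enforce that checklist is a shared record type that every tester fills in the same way. A sketch (the field names are illustrative, not a standard schema); the example finding below maps directly onto it:

```typescript
// Sketch: a structured shape for one audit finding.
interface Finding {
  id: number;
  pageUrl: string;
  element: string; // CSS selector or plain description
  wcagCriterion: string; // e.g. "1.1.1 Non-text Content (Level A)"
  severity: 'critical' | 'high' | 'medium' | 'low' | 'informational';
  foundOn: string; // ISO date, e.g. "2025-10-05"
  problem: string;
  howToFix: string;
  stepsToReproduce: string[];
  screenshots?: string[]; // file paths, when applicable
  screenReaderResults?: Record<string, 'pass' | 'fail'>; // e.g. { NVDA: 'fail' }
}
```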
Example Finding Documentation
Finding #47
Page: https://acme.com/products/widgets
Element: Image in hero section
Criterion: 1.1.1 Non-text Content (Level A)
Severity: CRITICAL
Status: Found October 5, 2025
Problem:
Hero section background image has no alt text.
Image contains "Spring Sale: 50% Off" text that is
important to understanding the page purpose.
How to Fix:
Add alt attribute to image or provide text alternative:
<img src="spring-sale-banner.jpg"
alt="Spring Sale: 50% Off Widgets" />
Steps to Reproduce:
1. Visit https://acme.com/products/widgets
2. Use screen reader to read the hero section
3. Note that no text is announced
4. Inspect HTML to confirm no alt attribute
Screen Reader Testing:
- NVDA: NOT ANNOUNCED (FAIL)
- JAWS: NOT ANNOUNCED (FAIL)
- VoiceOver: NOT ANNOUNCED (FAIL)
9. Handling False Positives
Automated tools sometimes report violations that aren't actually problems. A proper audit distinguishes:
True Positive
Real violation that needs fixing
Form input without label = True Positive
Status: FIX REQUIRED
False Positive
Tool reports an issue, but it's actually compliant
axe reports low contrast on:
<div aria-hidden="true" style="color: #999; background: white">
  [Decorative design element]
</div>
Manual Review: FALSE POSITIVE
Reason: Element is decorative and hidden from assistive technology (aria-hidden), not meaningful text
Status: NO ACTION NEEDED
Context-Dependent
Issue requires manual review to confirm
Tool detects a form field with no visible label, but CSS hides the label off-screen for design reasons.
Status: REVIEW CONTEXT
Result: A visually hidden label that screen readers still announce is compliant; if the label was removed entirely, add an aria-label instead
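If you script the automated pass, note that axe-core already separates results it cannot judge on its own into an `incomplete` array, distinct from `violations`. Routing those items straight to manual review is one practical way to handle context-dependent cases like the hidden-label example. A sketch assuming @axe-core/playwright:

```typescript
// Sketch: triage axe results into confirmed violations vs. manual review.
import { chromium } from 'playwright';
import AxeBuilder from '@axe-core/playwright';

async function triage(url: string): Promise<void> {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto(url);

  const results = await new AxeBuilder({ page }).analyze();

  console.log(`Violations (treat as true positives until reviewed): ${results.violations.length}`);
  console.log(`Incomplete (human judgment required): ${results.incomplete.length}`);
  for (const item of results.incomplete) {
    console.log(`REVIEW: ${item.id}: ${item.description}`);
  }
  await browser.close();
}

triage('https://example.com').catch(console.error);
```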
10. Remediation Plan & Timeline
A proper audit includes a remediation roadmap:
Remediation Priority Matrix
Critical Issues: 12 violations
Timeline: Fix within 2 weeks
Owner: Development lead
Examples:
- Form labels missing (3 instances)
- Keyboard trap in search (1 instance)
- Focus order illogical (2 instances)
- Color contrast failures (6 instances)

High Priority: 24 violations
Timeline: Fix within 4 weeks
Owner: Development team
Examples:
- Bad alt text (12 instances)
- Missing error messages (8 instances)
- Heading structure (4 instances)

Medium Priority: 18 violations
Timeline: Fix within 8 weeks
Owner: Content team + Development

Low Priority: 5 violations
Timeline: Fix within 12 weeks
Owner: Backlog for future release
Key Takeaways
- A proper audit = automated + manual testing
- Automated tools miss 60-70% of issues
- Manual testing with screen readers is mandatory
- Test with NVDA, JAWS, and VoiceOver
- Test keyboard navigation on every page
- Use representative sample of pages
- Document testing methodology clearly
- Classify issues by severity
- Check all WCAG 2.1 criteria
- Provide specific remediation recommendations
- Include remediation timeline
- Professional audits carry more legal weight