Why Screen Reader Testing Matters
Screen readers are assistive technologies used by people who are blind, have low vision, or have reading disabilities. Testing with screen readers reveals accessibility issues that automated tools miss, including illogical reading order, vague link text, broken heading structure, and missing form label associations. WCAG compliance requires content to work with assistive technologies.
Major Screen Readers
Understanding the screen reader landscape helps prioritize testing:
NVDA (NonVisual Desktop Access): Free, open-source Windows screen reader. A common first choice for testing.
JAWS (Job Access With Speech): Commercial Windows screen reader. Popular in enterprise environments.
VoiceOver: Built into macOS and iOS. Important for Apple device testing.
TalkBack: Built into Android devices. Essential for Android app testing.
Narrator: Built into Windows. Less commonly used but worth checking.
Browser compatibility: Pair NVDA or JAWS with Firefox or Chrome on Windows, and VoiceOver with Safari on macOS.
Getting Started with NVDA
NVDA is the recommended starting point for screen reader testing because it's free and widely used. Download from nvaccess.org. Key commands:
Insert (or Caps Lock, if configured) = NVDA modifier key
NVDA + Down Arrow: Say all (read continuously)
Down Arrow: Next item in browse mode
Tab: Next focusable element
H: Next heading
D: Next landmark
F: Next form field
T: Next table
NVDA + F7: Elements list (headings, links, landmarks)
NVDA + Space: Toggle between browse and focus modes
Getting Started with VoiceOver (Mac)
VoiceOver is built into every Mac. Enable with Command + F5 or in System Settings > Accessibility (System Preferences on older macOS versions). Key commands:
VO = Control + Option (VoiceOver modifier)
VO + Right/Left Arrow: Navigate items
VO + Command + H: Next heading
VO + Command + J: Next form control
VO + Command + L: Next link
VO + Command + T: Next table
VO + U: Rotor (navigate by element type)
VO + A: Read all from current position
Tab: Navigate focusable elements
Testing Checklist
Systematically test these aspects with screen readers:
Page title announced correctly when page loads
All headings are announced and properly nested (h1, then h2, then h3, with no skipped levels)
All images have appropriate alt text (or are marked decorative)
Links clearly describe their destination
Form fields have associated labels announced
Error messages are announced when they appear
Tables have proper headers and reading order makes sense
Modal dialogs trap focus and announce their purpose
Dynamic content changes are announced (live regions)
Skip links work and bypass navigation
Keyboard focus is visible and follows logical order
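Several of these checks correspond directly to markup patterns. The following is a minimal sketch of the passing cases; the element IDs, text, and file names are illustrative, not taken from any particular page:

```html
<!-- Skip link: first focusable element, jumps past navigation -->
<a href="#main">Skip to main content</a>

<main id="main">
  <h1>Order history</h1>

  <!-- Informative image: alt text describes the content -->
  <img src="chart.png" alt="Orders per month, January to June">
  <!-- Decorative image: empty alt so screen readers skip it -->
  <img src="divider.png" alt="">

  <!-- Label associated via for/id, so the field name is announced -->
  <label for="email">Email address</label>
  <input id="email" type="email">
  <!-- role="alert" makes error text announce as soon as it is injected -->
  <p id="email-error" role="alert"></p>

  <!-- th + scope lets screen readers announce headers with each cell -->
  <table>
    <tr><th scope="col">Order</th><th scope="col">Status</th></tr>
    <tr><td>#1042</td><td>Shipped</td></tr>
  </table>
</main>
```

With NVDA in browse mode, pressing H should land on the h1, F on the email field (announcing its label), and T on the table.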
Common Issues Revealed by Screen Reader Testing
Screen reader testing commonly reveals:
Missing or inadequate alt text on images
Form fields without programmatic labels
Heading structure that skips levels or is illogical
Links that say 'click here' or 'read more' without context
Custom controls (buttons, menus) lacking accessible names and roles
Dynamic content that updates without announcement
Focus not managed properly in modals and single-page applications
Tables without proper header associations
Reading order that differs from visual order
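Two of the most frequent fixes reduce to small markup changes. A before-and-after sketch (the save() handler and class names are hypothetical):

```html
<!-- Problem: a clickable div exposes no role, no accessible name,
     and no keyboard support to screen readers -->
<div class="save-btn" onclick="save()"></div>

<!-- Fix: a native button provides the role, keyboard handling, and
     focusability for free; its visible text is the accessible name -->
<button type="button" class="save-btn" onclick="save()">Save draft</button>

<!-- Problem: status text swapped into the page silently is never announced -->
<!-- Fix: a polite live region announces text inserted into it,
     without moving keyboard focus -->
<div id="status" aria-live="polite"></div>
```

A screen reader announces the fixed control as "Save draft, button," and reads any text scripted into the status region when it changes.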
Integrating with A11yScan
A11yScan's automated testing catches many issues before screen reader testing, allowing you to focus manual testing on complex interactions. Use A11yScan to identify and fix obvious issues first, then conduct screen reader testing to verify proper implementation and catch issues that require human judgment. A11yScan's guided manual testing workflows include screen reader testing checkpoints.