Chapter 13: Accessibility & Inclusion Research
Executive Summary
Accessibility in B2B IT is both a legal mandate (the ADA and Section 508, which reference the WCAG standards) and a business opportunity: 15% of the global workforce has disabilities, and accessible products also serve aging users, temporary impairments (broken arm, eye strain), and situational limitations (noisy environments, mobile use). Yet accessibility is often bolted on post-launch through checklists and automated tools that miss real-world usage patterns. This chapter presents a research-driven approach to accessibility and inclusion: recruit diverse participants (assistive tech users, cognitive differences, motor impairments), conduct accessibility-specific usability tests, validate against WCAG 2.1 AA+ standards, and embed findings into personas, journey maps, and design systems. By treating accessibility as discovery (not compliance theater), teams build products that work for all users, reduce legal risk, expand market reach, and demonstrate enterprise-grade quality.
Definitions & Scope
Accessibility (A11y)
The practice of designing products usable by people with disabilities: visual (blindness, low vision), auditory (deafness, hard of hearing), motor (limited dexterity, tremor), cognitive (dyslexia, ADHD, memory), and speech impairments.
WCAG (Web Content Accessibility Guidelines)
International standard for web accessibility. Three levels:
- Level A: Basic accessibility (minimum).
- Level AA: Acceptable accessibility (enterprise standard, ADA/Section 508 compliance).
- Level AAA: Optimal accessibility (aspirational for most).
B2B Standard: WCAG 2.1 AA (or 2.2 AA, released 2023).
Assistive Technologies (AT)
Tools used by people with disabilities:
- Screen Readers: JAWS, NVDA, VoiceOver (for blind/low-vision users).
- Screen Magnifiers: ZoomText, built-in OS magnification.
- Keyboard-Only Navigation: Users who can't use a mouse (motor impairments).
- Voice Control: Dragon NaturallySpeaking, OS voice commands.
- Alternative Input: Switch controls, eye-tracking, sip-and-puff devices.
Inclusion
Designing for diverse users beyond disability: age (older users), language (non-native speakers), bandwidth (low connectivity), literacy (plain language), and neurodiversity (ADHD, autism, dyslexia).
Scope
This chapter applies to B2B IT product teams (PM, Design, Research, Eng, QA) building web/mobile apps, back-office tools, and websites. Covers accessibility research methods, WCAG validation, and inclusive design practices.
Customer Jobs & Pain Map
| User Group | Job To Be Done | Current Pain (Inaccessible Products) | Outcome with Accessible Design | CX Opportunity |
|---|---|---|---|---|
| Screen Reader Users (Blind/Low Vision) | Complete tasks (provision users, generate reports) via screen reader | Missing alt text; unlabeled forms; poor heading structure; keyboard traps; unannounced dynamic content | All content accessible via screen reader; logical navigation; clear labels; dynamic updates announced | WCAG AA compliance; semantic HTML; ARIA labels; screen reader testing (JAWS, NVDA, VoiceOver) |
| Keyboard-Only Users (Motor Impairments) | Navigate and complete tasks without mouse | No keyboard access to dropdowns, modals; unclear focus indicators; illogical tab order | All interactions keyboard-accessible; visible focus indicators; logical tab order | Keyboard nav testing; focus management; skip links; accessible modals/dropdowns |
| Low Vision Users (Partial Sight) | Read and interact with UI at high zoom (200–400%) | Low contrast text; text doesn't reflow at zoom; fixed font sizes; reliance on color alone | High contrast (4.5:1); responsive zoom; adjustable font sizes; color + text/icons | Color contrast checks; zoom testing (200%); user preference settings (dark mode, font size) |
| Deaf/Hard of Hearing Users | Access video/audio content; receive alerts | No captions on videos; audio-only alerts (no visual); unclear error messages | Captions/transcripts for all video/audio; visual + audio alerts; clear error messages | Video captions; visual alerts (flash, banner); clear error messages (not just sound) |
| Cognitive/Neurodiverse Users (Dyslexia, ADHD, Autism) | Understand complex workflows; complete multi-step tasks without overwhelm | Jargon-heavy language; complex navigation; no progress indicators; overwhelming UI | Plain language; clear navigation; progress indicators; simplified workflows; help text | Plain language; clear IA; progress indicators; cognitive load reduction; contextual help |
| Older Users (Age 60+) | Complete tasks despite declining vision, dexterity, cognitive speed | Small tap targets; low contrast; fast timeouts; complex gestures | Large tap targets (44×44px); high contrast; generous timeouts; simple gestures | Senior-friendly design; large tap targets; extended timeouts; simple interactions |
| Situational Users (Mobile, Noisy, Bright Sun) | Use product in challenging environments | Small fonts on mobile; poor contrast in sunlight; audio-only feedback in noisy warehouse | Responsive mobile design; high contrast; visual + haptic feedback | Mobile-first design; high contrast mode; multi-modal feedback (visual + haptic + audio) |
| Enterprise Buyers/Legal | Ensure vendor compliance; reduce legal risk | Vendor lacks WCAG compliance; no VPAT (Voluntary Product Accessibility Template); accessibility gaps discovered late | WCAG 2.1 AA compliant; VPAT available; accessibility roadmap transparent | Public accessibility statement; VPAT (Section 508); accessibility roadmap; annual audits |
Framework / Model: The Accessibility Research Framework
Five-Phase Accessibility Research Process
Phase 1: Accessibility Baseline Audit (2–4 Weeks)
Objective: Understand current accessibility state.
Methods:
- Automated Testing: Run tools (Axe, WAVE, Lighthouse) on key flows (onboarding, core tasks, admin). Identifies ~30% of issues (low-hanging fruit: missing alt text, color contrast, HTML semantics).
- Manual Testing: Test with keyboard-only (no mouse), screen reader (NVDA, JAWS), zoom (200%), high-contrast mode. Identifies ~40% of issues (focus order, ARIA, dynamic content).
- WCAG Checklist: Map findings to WCAG 2.1 AA criteria (50 success criteria across Levels A and AA). Track pass/fail/N/A.
Deliverable: Accessibility audit report (issues by severity: Critical, High, Medium, Low). Compliance score (% of WCAG 2.1 AA met).
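The automated portion of this audit is worth scripting so the same flows are scanned the same way on every run. Below is a minimal sketch using Playwright with the axe-core integration (`@axe-core/playwright`); the flow names, URLs, and the "fail only on critical impact" threshold are illustrative assumptions, not part of this chapter's required toolchain.

```typescript
// Minimal sketch of an automated Phase 1 baseline scan.
// Assumes Playwright and @axe-core/playwright are installed; URLs are illustrative.
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

const flows = [
  { name: 'onboarding', url: 'https://app.example.com/onboarding' },          // hypothetical flow
  { name: 'user provisioning', url: 'https://app.example.com/admin/users' },  // hypothetical flow
];

for (const flow of flows) {
  test(`a11y baseline: ${flow.name}`, async ({ page }) => {
    await page.goto(flow.url);

    // Limit the scan to WCAG 2.x A/AA rules so findings map onto the AA checklist.
    const results = await new AxeBuilder({ page })
      .withTags(['wcag2a', 'wcag2aa', 'wcag21a', 'wcag21aa'])
      .analyze();

    // Report every violation with axe's impact rating so issues can be triaged by severity.
    for (const v of results.violations) {
      console.log(`[${v.impact}] ${v.id}: ${v.help} (${v.nodes.length} instances)`);
    }

    // For the baseline, fail only on critical-impact issues; tighten the gate in later phases.
    expect(results.violations.filter(v => v.impact === 'critical')).toEqual([]);
  });
}
```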
Phase 2: Inclusive Persona & Journey Research (2–3 Weeks)
Objective: Understand how diverse users (assistive tech, cognitive, age, situational) interact with product.
Methods:
- Recruit Diverse Participants:
- Assistive Tech Users: Screen reader users (blind/low vision), keyboard-only users (motor impairments), voice control users.
- Cognitive/Neurodiverse: Dyslexia, ADHD, autism, memory impairments.
- Older Users: Age 60+ (declining vision, dexterity, cognitive speed).
- Situational: Mobile users, users in noisy/bright environments.
- Target: 3–5 participants per group, 12–20 total.
- Accessibility-Specific Interviews:
- Ask: "What assistive tech do you use? (Screen reader, magnifier, keyboard-only, voice control?)"
- "Walk me through completing [task] with your setup. What's hard? What works well?"
- "What are common barriers in software you use? How do you work around them?"
- Contextual Inquiry: Observe participants using assistive tech in their environment (home, office, field). Note workarounds, frustrations, successes.
Deliverable: Inclusive personas (1–2 pages each): "Maya, Blind Analyst (JAWS User)," "Carlos, Senior IT Admin (Low Vision, Zoom 200%)," "Priya, ADHD Product Manager (Cognitive Load Sensitivity)." Include: AT used, workflows, pains, accessibility needs.
Phase 3: Accessibility Usability Testing (3–5 Weeks)
Objective: Test product with assistive tech users, identify real-world barriers.
Methods:
- Recruit Participants: 8–15 assistive tech users (screen reader, keyboard-only, low vision, cognitive, older users). Mix of personas.
- Test Protocol:
- Give realistic tasks (e.g., "Provision 10 users," "Generate weekly report").
- Observe with their assistive tech (don't interfere unless stuck >5 min).
- Note: Task success (yes/no), time to complete, errors, workarounds, frustration (1–10 scale).
- Ask: "What was hard? What would make this easier?"
- Test Environment: Remote (user's own setup) or in-person (bring their AT). Ensure representative of real usage.
Deliverable: Usability test report (task success %, barriers by severity, quotes, session recordings). Issues mapped to WCAG criteria and product backlog.
Phase 4: WCAG Validation & Remediation (4–8 Weeks)
Objective: Fix identified issues, achieve WCAG 2.1 AA compliance.
Methods:
- Prioritize Issues: Critical (blocks task completion, WCAG Level A fail) → High (degrades experience, WCAG Level AA fail) → Medium/Low.
- Remediate:
- Quick Wins (1–2 weeks): Alt text, color contrast, keyboard access, ARIA labels.
- Medium (2–4 weeks): Focus management, error handling, dynamic content announcements.
- Long-Term (1–2 months): Complex interactions (drag-drop, rich editors), design system updates.
- Re-Test: After fixes, re-run automated tools + manual testing + assistive tech usability test (5–8 participants). Confirm issues resolved.
Deliverable: Remediation roadmap (issues → backlog → shipped). WCAG 2.1 AA compliance report (pass/fail, % met).
Phase 5: Embed Accessibility in Workflow (Ongoing)
Objective: Prevent regression, maintain compliance, continuous improvement.
Methods:
- Design Phase:
- Accessibility review in design critique (color contrast, keyboard access, focus indicators).
- Use accessible design system (pre-validated components: buttons, forms, modals).
- Development Phase:
- Accessibility linting in CI/CD (Axe, Pa11y). Fail build if Critical issues.
- Developer training (semantic HTML, ARIA, keyboard patterns).
- QA Phase:
- Accessibility test cases (keyboard-only, screen reader, zoom). Part of Definition of Done (DoD).
- Quarterly usability tests with assistive tech users (5–8 participants).
- Documentation:
- Public accessibility statement (website): WCAG compliance level, known issues, roadmap, contact (accessibility@company.com).
- VPAT (Voluntary Product Accessibility Template) for enterprise buyers (Section 508 compliance).
Deliverable: Accessibility playbook (design, dev, QA standards). Public accessibility statement. VPAT. Quarterly accessibility usability test cadence.
Diagram description: Visualize as cycle: Audit (baseline) → Research (personas, journeys) → Usability Test (AT users) → Remediate (fix issues) → Embed (ongoing process) → (Loop back: Quarterly re-audit).
Implementation Playbook
0–30 Days: Accessibility Baseline & Recruit
Week 1: Automated + Manual Audit
- Run automated tools (Axe, WAVE, Lighthouse) on top 10 user flows (onboarding, key tasks, admin).
- Conduct manual keyboard-only test (no mouse). Note: Can you reach all controls? Visible focus? Logical tab order?
- Test with screen reader (NVDA or VoiceOver, free). Note: Missing labels? Unlabeled buttons? Dynamic content not announced?
Week 2: WCAG Gap Analysis
- Map issues to WCAG 2.1 AA criteria. Use a checklist (50 success criteria, Levels A + AA).
- Score: % of criteria met. Example: "27 of 50 met (54%). Major gaps: 1.3.1 Info & Relationships, 2.1.1 Keyboard, 4.1.2 Name/Role/Value."
- Prioritize: Critical (blocks tasks), High (degrades experience), Medium, Low.
Week 3: Recruit Diverse Participants
- Identify participant groups: Screen reader users (3–5), keyboard-only (3), low vision (3), cognitive/neurodiverse (3), older users (3). Total: 12–20.
- Recruit via: Accessibility-focused agencies (Fable, Access Works), disability networks, user research panels, customer base (ask CS for referrals).
- Compensation: $100–200/hour (higher for specialized AT users).
Week 4: Accessibility Interviews
- Interview 5–10 participants (mix of groups). 60 min each.
- Ask: AT used? Daily workflows? Common software barriers? Workarounds?
- Observe: Have them show you their setup (screen reader, magnifier, keyboard shortcuts).
- Document: AT needs, pains, workarounds, success criteria.
Artifacts: Accessibility audit report (automated + manual), WCAG gap analysis, participant recruitment plan, accessibility interview insights.
30–90 Days: Usability Test & Remediate
Month 2: Accessibility Usability Testing
- Week 5–6: Run usability tests with 8–12 AT users (screen reader, keyboard-only, low vision, cognitive, older).
- Tasks: Realistic (onboarding, core workflows, admin tasks).
- Observe with their AT. Note: Task success, time, errors, workarounds, frustration.
- Week 7: Synthesize findings. Map barriers to WCAG criteria. Prioritize issues.
Month 2–3: Remediation Sprint
- Week 8–10: Fix Critical + High issues. Focus on:
- Keyboard access: All interactions keyboard-navigable. Visible focus indicators (2px outline, high contrast).
- Screen reader: Semantic HTML (headings, landmarks, lists). ARIA labels for custom controls. Announce dynamic updates (aria-live).
- Color contrast: Text 4.5:1, UI elements 3:1. Don't rely on color alone (use icons + text).
- Week 11: Re-test with 5 AT users. Confirm fixes work.
Month 3: Achieve WCAG 2.1 AA
- Final automated + manual audit. Confirm all Critical/High issues resolved.
- Generate WCAG 2.1 AA compliance report (pass/fail, % met). Target: ≥95% (some N/A criteria acceptable).
- Publish accessibility statement (website): "We conform to WCAG 2.1 AA. Known issues: [list]. Roadmap: [link]."
Checkpoints: Accessibility usability tests completed (8–12 participants), Critical/High issues fixed, WCAG 2.1 AA achieved (≥95%), accessibility statement published.
Design & Engineering Guidance
Design Patterns for Accessibility
Keyboard Navigation
- All interactive elements keyboard-accessible (tab, enter, space, arrow keys).
- Visible focus indicator (2px outline, high contrast color, not default browser blue).
- Logical tab order (left-to-right, top-to-bottom, follows visual hierarchy).
- Skip links ("Skip to main content") at top of page (hidden until focused).
- Keyboard shortcuts documented, customizable, don't conflict with AT (avoid single-key shortcuts).
Screen Reader Support
- Semantic HTML: `<button>`, `<nav>`, `<main>`, `<h1>`–`<h6>`, `<ul>`/`<ol>`. Not `<div onclick>`.
- ARIA labels for custom controls: `aria-label="Close dialog"`, `role="button"`, `aria-expanded="false"`.
- Announce dynamic content: `aria-live="polite"` (non-urgent), `aria-live="assertive"` (urgent, e.g., errors).
- Heading structure: Logical hierarchy (`<h1>` page title, `<h2>` sections, `<h3>` subsections). Don't skip levels.
- Landmarks: `<header>`, `<nav>`, `<main>`, `<aside>`, `<footer>`. Helps screen reader users navigate.
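As a concrete illustration of the `aria-live` guidance above, one common pattern is a persistent, visually hidden live region that scripts write into whenever the page changes without a navigation. A minimal sketch using standard DOM APIs; the `visually-hidden` utility class is assumed to already exist in your stylesheet.

```typescript
// Minimal sketch of announcing dynamic updates via a live region.
function createAnnouncer(politeness: 'polite' | 'assertive' = 'polite') {
  const region = document.createElement('div');
  region.setAttribute('aria-live', politeness); // "assertive" only for urgent messages (e.g., errors)
  region.setAttribute('aria-atomic', 'true');   // read the whole message, not just the changed part
  region.className = 'visually-hidden';         // assumption: an existing off-screen utility class
  document.body.appendChild(region);

  return (message: string): void => {
    // Clear first so repeating the same message is still announced.
    region.textContent = '';
    window.setTimeout(() => { region.textContent = message; }, 50);
  };
}

// Usage: announce the outcome of an async action that updates the page.
const announce = createAnnouncer('polite');
announce('10 users provisioned successfully.');
```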
Visual Accessibility (Low Vision, Color Blindness)
- Color Contrast: Text 4.5:1 (WCAG AA), 7:1 (AAA). Large text (18pt+) 3:1. UI elements (borders, icons) 3:1.
- Don't Rely on Color: Use color + text/icon. Example: Success = green + checkmark icon. Error = red + X icon + "Error:" text.
- Responsive Zoom: Text resizes to 200% (WCAG AA, 1.4.4) and content reflows at 400% zoom (WCAG AA, 1.4.10). Text reflows, no horizontal scroll.
- Font Size: Minimum 16px body text. Allow user-adjustable (settings: Small, Medium, Large).
- High Contrast Mode: Test in OS high-contrast mode (Windows, macOS). Ensure borders, icons visible.
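The contrast thresholds above come from the WCAG relative-luminance formula, which is easy to encode for spot checks in design reviews or unit tests. A minimal sketch handling 6-digit hex colors (no alpha or shorthand hex); the example colors are illustrative.

```typescript
// Minimal sketch of a WCAG contrast-ratio check (WCAG 2.x relative luminance formula).
function relativeLuminance(hex: string): number {
  const n = parseInt(hex.replace('#', ''), 16);
  const linear = (channel: number): number => {
    const s = channel / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  const r = linear((n >> 16) & 0xff);
  const g = linear((n >> 8) & 0xff);
  const b = linear(n & 0xff);
  return 0.2126 * r + 0.7152 * g + 0.0722 * b;
}

function contrastRatio(a: string, b: string): number {
  const [hi, lo] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// Example: #999999 text on white is roughly 2.8:1, which fails the 4.5:1 body-text threshold.
const ratio = contrastRatio('#999999', '#ffffff');
console.log(ratio.toFixed(2), ratio >= 4.5 ? 'passes AA body text' : 'fails AA body text');
```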
Cognitive Accessibility
- Plain Language: Avoid jargon. Write at 8th-grade reading level. Use short sentences, active voice.
- Clear Navigation: Consistent placement (nav bar always top). Breadcrumbs (Home > Settings > Security).
- Progress Indicators: Multi-step flows show progress (Step 2 of 5). Estimate time ("~10 minutes").
- Error Prevention: Inline validation (real-time). Confirmations for destructive actions ("Delete 1000 users? Confirm/Cancel").
- Help & Context: Tooltips, inline help, "?" icons. FAQs, video tutorials (with captions).
Temporal Accessibility (Timing)
- No timeouts for critical tasks. For session timeouts, warn 2 min before, allow extend.
- Animations pausable (play/pause button). Auto-play videos off by default (or pause button prominent).
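One way to implement the "warn 2 min before, allow extend" behavior is a small scheduler; the sketch below assumes the surrounding app supplies the warning dialog and expiry handler (both hypothetical callbacks).

```typescript
// Minimal sketch of a session-timeout warning with an "extend" option.
// `warn` should open an accessible dialog; `expire` ends the session.
function scheduleSessionTimeout(
  sessionMs: number,
  warn: (extend: () => void) => void,
  expire: () => void
): void {
  const warningLeadMs = 2 * 60 * 1000; // warn 2 minutes before expiry

  const expiryTimer = window.setTimeout(expire, sessionMs);

  window.setTimeout(() => {
    warn(() => {
      // User chose to extend: cancel the pending expiry and restart the clock.
      window.clearTimeout(expiryTimer);
      scheduleSessionTimeout(sessionMs, warn, expire);
    });
  }, sessionMs - warningLeadMs);
}
```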
Engineering Patterns for Accessibility
Semantic HTML & ARIA
- Use native HTML elements first. Add ARIA only when no native equivalent exists.
- Example: Use `<button>` (native), not `<div role="button">` (ARIA).
- ARIA patterns for complex widgets: Modals (`role="dialog"`, `aria-modal="true"`), tabs (`role="tablist"`), accordions (`aria-expanded`).
Focus Management
- When opening modal, move focus to first interactive element (e.g., close button).
- When closing modal, return focus to trigger element (button that opened modal).
- For single-page apps (SPAs), announce route changes to screen readers (`aria-live` region, update `<title>`).
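A minimal sketch of the open/close focus contract described above. The focusable-element selector is a common approximation, and a production dialog would also trap focus (see the next subsection); element structure is illustrative.

```typescript
// Minimal sketch of modal focus management: move focus in on open, restore it on close.
function openModal(modal: HTMLElement): () => void {
  const trigger = document.activeElement as HTMLElement | null; // remember what opened the modal
  modal.hidden = false;

  // Focus the first interactive element (e.g., the close button).
  const first = modal.querySelector<HTMLElement>(
    'button, [href], input, select, textarea, [tabindex]:not([tabindex="-1"])'
  );
  first?.focus();

  // Return a close function that restores focus to the trigger element.
  return () => {
    modal.hidden = true;
    trigger?.focus();
  };
}
```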
Keyboard Event Handling
- Support standard keys: Enter/Space (activate), Escape (close), Arrow keys (navigate lists/menus).
- Don't trap focus (user can always tab out). Exception: Modals (trap until closed).
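The sketch below wires the two conventions that most often break in custom modals: Escape closes, and Tab wraps inside the dialog only while it is open. The `close` callback is assumed to exist (e.g., the function returned by the focus-management sketch above).

```typescript
// Minimal sketch of modal keyboard handling: Escape closes, Tab is trapped while open.
function trapModalKeys(modal: HTMLElement, close: () => void): void {
  modal.addEventListener('keydown', (e: KeyboardEvent) => {
    if (e.key === 'Escape') {
      close();
      return;
    }
    if (e.key !== 'Tab') return;

    const focusable = modal.querySelectorAll<HTMLElement>(
      'button, [href], input, select, textarea, [tabindex]:not([tabindex="-1"])'
    );
    if (focusable.length === 0) return;
    const first = focusable[0];
    const last = focusable[focusable.length - 1];

    // Wrap focus at the edges so Tab cycles inside the open modal only.
    if (e.shiftKey && document.activeElement === first) {
      e.preventDefault();
      last.focus();
    } else if (!e.shiftKey && document.activeElement === last) {
      e.preventDefault();
      first.focus();
    }
  });
}
```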
Accessibility Linting & Testing
- CI/CD integration: Axe, Pa11y (automated checks). Fail build if Critical issues (e.g., missing alt text, color contrast <3:1).
- Unit tests for keyboard interaction (React Testing Library: `userEvent.tab()`, `userEvent.keyboard('{Enter}')`).
- E2E tests with screen reader assertions (e.g., Playwright + screen reader extensions).
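One way to wire the "fail build on Critical issues" policy is a small script run in the pipeline. A minimal sketch using pa11y's Node API, assuming `pa11y` (and type definitions) are installed; the URLs are illustrative, and failing only on "error"-type issues stands in for the Critical threshold.

```typescript
// Minimal sketch of a CI accessibility gate using pa11y's Node API.
import pa11y from 'pa11y';

const urls = [
  'https://app.example.com/login',      // hypothetical
  'https://app.example.com/dashboard',  // hypothetical
];

async function main(): Promise<void> {
  let errorCount = 0;

  for (const url of urls) {
    const result = await pa11y(url, { standard: 'WCAG2AA' });
    const errors = result.issues.filter(issue => issue.type === 'error');
    errorCount += errors.length;
    errors.forEach(e => console.error(`${url} — ${e.code}: ${e.message}`));
  }

  // A non-zero exit code fails the CI job.
  process.exit(errorCount > 0 ? 1 : 0);
}

main();
```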
Accessible Components (Design System)
- Pre-validate all design system components (buttons, forms, modals, dropdowns) for WCAG AA.
- Document accessibility features: "Button: Keyboard accessible (Enter/Space). Focus indicator: 2px blue outline. ARIA: role='button'."
- Use component library with built-in accessibility (e.g., Radix UI, Reach UI, Adobe React Spectrum).
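A design-system component's keyboard contract can also be pinned down with a unit test so regressions fail fast. A minimal sketch assuming Jest, React Testing Library, `@testing-library/user-event`, and the jest-dom matchers; `SaveButton` is a hypothetical component.

```tsx
// Minimal sketch of verifying a design-system button's keyboard contract.
import { render, screen } from '@testing-library/react';
import userEvent from '@testing-library/user-event';
import '@testing-library/jest-dom';
import { SaveButton } from './SaveButton'; // hypothetical design-system component

test('Save is reachable and operable by keyboard alone', async () => {
  const onSave = jest.fn();
  const user = userEvent.setup();
  render(<SaveButton onSave={onSave} />);

  await user.tab(); // the first Tab should land on the button
  expect(screen.getByRole('button', { name: /save/i })).toHaveFocus();

  await user.keyboard('{Enter}'); // Enter must activate it, no mouse required
  expect(onSave).toHaveBeenCalledTimes(1);
});
```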
Back-Office & Ops Integration
CS Accessibility Workflows
Accessibility Support Requests
- Tag support tickets: "Accessibility Issue" (screen reader, keyboard, contrast, etc.).
- Track volume, themes (e.g., "15 tickets/month: Screen reader can't access audit logs").
- Escalate systemic issues to Product (monthly report: top accessibility issues).
Accessibility in Onboarding
- Ask during onboarding: "Do you use assistive tech? (Screen reader, keyboard-only, voice control.)" Capture in CRM.
- Tailor onboarding: For screen reader users, provide keyboard shortcuts guide, screen reader tips.
Accessibility QBRs
- For accounts with AT users, include accessibility in QBRs: "Any accessibility barriers? Features you can't access?"
- Demonstrate accessibility roadmap: "We're fixing screen reader access to admin panel (Q3). WCAG 2.2 AA compliance (Q4)."
Legal/Compliance Workflows
VPAT (Voluntary Product Accessibility Template)
- Create VPAT for Section 508 compliance (required for US federal gov, many enterprises).
- Document: WCAG 2.1 AA conformance per feature (Supports, Partially Supports, Does Not Support, N/A).
- Update annually or when major features ship.
Public Accessibility Statement
- Publish on website (typically /accessibility or /legal/accessibility).
- Include: Conformance level (WCAG 2.1 AA), known issues, roadmap, contact (accessibility@company.com).
- Example: "We conform to WCAG 2.1 AA. Known issue: Admin panel screen reader access (fixing in Q3 2025). Report issues: accessibility@company.com."
Annual Accessibility Audit
- Third-party audit (accessibility consultancy, e.g., Deque, Level Access) annually.
- Identify regressions, new issues, provide remediation roadmap.
- Share audit summary with enterprise buyers (builds trust).
Metrics That Matter
| Metric | Definition | Target | Data Source |
|---|---|---|---|
| WCAG Compliance Score | % of WCAG 2.1 AA success criteria met (50 total, Levels A + AA) | ≥95% (some N/A acceptable) | Automated tools (Axe) + manual audit |
| Accessibility Task Success | Task completion rate for AT users (screen reader, keyboard-only, low vision) | ≥90% (same as non-AT users) | Usability tests with AT users |
| Accessibility Ticket Volume | # of support tickets tagged "Accessibility Issue" per month | Decrease 30% YoY (indicates fewer barriers) | Support system (Zendesk, Intercom) |
| AT User Satisfaction | NPS/CSAT for assistive tech users | ≥40 NPS (same as general population) | Surveys (tag respondents by AT use) |
| Accessibility Regression | # of new accessibility issues introduced per release | 0 Critical, <5 Medium/Low | CI/CD accessibility checks, quarterly audits |
| Inclusive Design Adoption | % of design/dev team trained in accessibility | 100% (all designers, frontend devs) | Training records, LMS |
Instrumentation:
- Automated accessibility checks in CI/CD (Axe, Pa11y). Track issue count, severity, trends.
- Tag support tickets by "Accessibility Issue" + AT type (screen reader, keyboard, etc.).
- Tag survey respondents by AT use (ask: "Do you use assistive tech? Which?"). Segment NPS/CSAT.
- Quarterly usability tests with AT users (task success, time, satisfaction).
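To segment satisfaction by AT use, the only requirement is that survey responses carry the assistive-tech flag. A minimal sketch with illustrative field names:

```typescript
// Minimal sketch of segmenting NPS by assistive-tech use.
interface SurveyResponse {
  score: number;              // 0–10 likelihood-to-recommend
  usesAssistiveTech: boolean; // from the "Do you use assistive tech?" question
}

function nps(responses: SurveyResponse[]): number {
  if (responses.length === 0) return 0;
  const promoters = responses.filter(r => r.score >= 9).length;
  const detractors = responses.filter(r => r.score <= 6).length;
  return Math.round(((promoters - detractors) / responses.length) * 100);
}

function segmentedNps(responses: SurveyResponse[]): { atUsers: number; others: number } {
  return {
    atUsers: nps(responses.filter(r => r.usesAssistiveTech)),
    others: nps(responses.filter(r => !r.usesAssistiveTech)),
  };
}
```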
AI Considerations
Where AI Helps
Auto-Generated Alt Text
- AI generates alt text for images (charts, diagrams, photos).
- Example: Chart image → AI alt text: "Bar chart showing Q1 revenue by region. North America: $2M, Europe: $1.5M, Asia: $1M."
- Caution: AI may miss context (e.g., chart shows declining trend). Human review required.
Accessibility Issue Detection (AI-Augmented Audit)
- AI scans UI, flags potential issues beyond automated tools.
- Example: Detects low contrast in dynamic states (hover, focus) missed by static tools.
Plain Language Suggestions
- AI rewrites complex text in plain language.
- Example: Input "Utilize the aforementioned configuration parameters" → AI: "Use the settings above."
Live Captions & Transcripts
- AI-generated captions for videos, webinars, live calls (Zoom, Teams auto-captions).
- Transcripts for audio content (podcasts, voice notes).
Guardrails
Accuracy of AI Alt Text
- AI may generate generic or inaccurate alt text. Example: Photo of team → AI: "Group of people." Better: "Marketing team at Q2 offsite: 8 people in conference room."
- Avoid: Always human-review AI alt text. Use AI as first draft, not final.
Plain Language Context Loss
- AI may oversimplify, lose necessary detail.
- Example: "Per WCAG 2.1 AA criterion 1.4.3" → AI: "Follow contrast rules." (Loses specificity.)
- Avoid: Use AI for drafts, subject-matter expert (SME) reviews for accuracy.
Bias in AI Tools
- AI alt text trained on biased datasets may misidentify people (e.g., gender, race).
- Avoid: Test AI outputs across diverse images. Correct biases. Prefer human-written alt text for people-focused images.
Risk & Anti-Patterns
Top 5 Pitfalls
1. Compliance Theater: Checklists Without User Testing
- Run automated tools, claim "WCAG AA compliant," but never test with screen reader users. Real AT users can't complete tasks.
- Avoid: Usability test with 8–15 AT users. Automated tools find ~30% of issues. Real users find the rest.
2. Bolted-On Accessibility: Retrofitting Post-Launch
- Build product, launch, then try to add accessibility. Costly rework (redesign, re-engineer). Often incomplete.
- Avoid: Embed accessibility from start (design, dev, QA). Use accessible design system. Accessibility in DoD.
3. One-Size-Fits-All: Designing for "Normal" Users Only
- Design for average user (mouse, good vision, no cognitive impairments). Excludes 15% of workforce (disabilities) + situational users (mobile, noise, sunlight).
- Avoid: Inclusive personas (Chapter 11). Test with diverse users (AT, cognitive, age, situational).
4. No Assistive Tech Representation: Homogeneous User Research
- Recruit only non-disabled users for research. Miss AT barriers, cognitive accessibility, older user needs.
- Avoid: 20–30% of usability test participants should use AT or have accessibility needs. Include in personas, journeys.
5. Accessibility Regression: No Ongoing Validation
- Achieve WCAG AA once, then regress (new features break accessibility, no ongoing testing). Compliance score drops.
- Avoid: Accessibility in CI/CD (automated checks). Quarterly usability tests with AT users. Annual third-party audit.
Case Snapshot
Company: B2B SaaS (HR management platform)
Challenge: Lost 3 enterprise deals (combined $1.2M ARR) due to accessibility non-compliance (buyers required WCAG 2.1 AA, VPAT). 12 accessibility-related support tickets/month (screen reader, keyboard access). No accessibility testing in workflow.
Accessibility Intervention:
- Audit (Week 1–2): Automated (Axe, WAVE) + manual (keyboard, screen reader). Found 180 issues (42 Critical, 68 High, 70 Medium/Low). WCAG AA: 58% compliant.
- Research (Week 3–4): Recruited 15 participants (5 screen reader users, 4 keyboard-only, 3 low vision, 3 cognitive/older). Conducted interviews + contextual inquiry. Created 3 inclusive personas: "Sam, Blind HR Admin (JAWS User)," "Lin, Senior Recruiter (Low Vision, 200% Zoom)," "Jordan, ADHD Talent Manager."
- Usability Testing (Week 5–7): Tested with 12 AT users. Tasks: Onboarding, user provisioning, report generation. Task success: Screen reader users 45%, Keyboard-only 60% (vs 92% for non-AT users). Identified barriers: Missing ARIA labels, keyboard traps in modals, low contrast text, complex navigation.
- Remediation (Week 8–16): 8-week sprint. Fixed: Semantic HTML, ARIA labels, keyboard access (all modals, dropdowns), color contrast (4.5:1), focus indicators, plain language (reduced jargon 40%). Re-tested with 8 AT users. Task success: Screen reader 88%, Keyboard-only 91%.
- Embed (Ongoing): Accessibility in CI/CD (Axe linting, fails build if Critical issues). Accessibility in DoD (keyboard + screen reader QA). Quarterly AT user testing (8 participants). Published accessibility statement + VPAT.
6-Month Results:
- Compliance: WCAG 2.1 AA: 58% → 96%. VPAT published (won 5 deals worth $2.3M ARR, buyers required VPAT).
- Task Success: AT users: 45–60% → 88–91% (on par with non-AT users).
- Support Tickets: Accessibility tickets: 12/month → 2/month (83% reduction). Support costs down $45K/year.
- User Satisfaction: AT user NPS: -5 → +38 (43-point lift). Quote: "Finally, a product I can use without constant workarounds."
- Market Reach: Accessibility compliance unlocked public sector (gov, education) + regulated industries (finance, healthcare). New market TAM: +$8M.
- Legal Risk: Zero accessibility lawsuits or complaints (pre-intervention: 2 complaints, legal risk $500K+).
Checklist & Templates
Accessibility Research Checklist
- Run automated audit (Axe, WAVE, Lighthouse) on key flows. Identify issues (alt text, contrast, keyboard).
- Conduct manual audit: Keyboard-only test (no mouse), screen reader test (NVDA/JAWS), zoom test (200%).
- Map issues to WCAG 2.1 AA criteria (50 total, Levels A + AA). Calculate compliance % (# met / 50).
- Prioritize issues: Critical (blocks tasks, Level A fail), High (degrades, Level AA fail), Medium, Low.
- Recruit diverse participants (15–20): Screen reader (3–5), keyboard-only (3), low vision (3), cognitive (3), older (3).
- Conduct accessibility interviews (5–10 participants). Ask: AT used? Barriers? Workarounds? Needs?
- Create inclusive personas (1–2 pages): AT used, workflows, pains, accessibility needs.
- Run accessibility usability tests (8–15 AT users). Realistic tasks. Observe with their AT. Measure: Task success, time, errors, frustration.
- Synthesize findings. Map barriers to WCAG criteria. Prioritize for remediation.
- Remediate Critical + High issues. Focus: Keyboard access, screen reader (semantic HTML, ARIA), color contrast.
- Re-test with 5–8 AT users. Confirm fixes work.
- Achieve WCAG 2.1 AA (≥95% compliance). Generate compliance report.
- Publish accessibility statement (website): Conformance level, known issues, roadmap, contact.
- Create VPAT (Section 508 compliance) for enterprise buyers.
- Embed accessibility in workflow: CI/CD linting, accessibility in DoD, quarterly AT usability tests, annual third-party audit.
- Track metrics: WCAG compliance %, AT task success, accessibility tickets, AT user NPS.
Templates
- Accessibility Audit Report Template: [Link to Appendix B]
- WCAG 2.1 AA Checklist: [Link to Appendix B]
- Inclusive Persona Template (AT User): [Link to Appendix B]
- Accessibility Usability Test Protocol: [Link to Appendix B]
- Accessibility Statement Template: [Link to Appendix B]
- VPAT Template (Section 508): [Link to Appendix B]
Call to Action (Next Week)
3 Actions for the Next Five Working Days:
1. Run Automated + Manual Audit (Day 1–2): Pick one key flow (onboarding, core task, or admin). Run automated audit (Axe browser extension, free). List top 10 issues (alt text, contrast, keyboard). Then, manual test: Navigate flow with keyboard only (no mouse). Can you reach all controls? Visible focus? Note: Top 3 keyboard issues.
2. Recruit & Interview 3 AT Users (Day 3–4): Find 3 participants (ideally via accessibility agency like Fable, or ask CS for customers who use AT). Interview 60 min each. Ask: "What assistive tech do you use? Walk me through completing [task] with your setup. What's hard?" Observe their screen reader/keyboard/magnifier. Document: AT used, top 3 barriers, workarounds.
3. Fix 3 Quick Wins (Day 5): Pick 3 low-effort, high-impact issues from audit. Examples: (a) Add missing alt text to 5 images, (b) Increase contrast on 3 low-contrast text elements (use 4.5:1 ratio), (c) Add visible focus indicator to primary button (2px outline). Test fixes with keyboard/screen reader. Ship by end of week. Track: Issue resolved? Impact (who benefits)?
Part II Complete. Next: Part III — Strategy & Value Design (Chapter 14 onwards)