
A11Y Audit Excellence: Surpassing Client Expectations

My Role: Product Designer
Platform: Desktop
Timeline: 01.2022 – 06.2022
Background
During my time at Viafoura, I contributed to a key project aimed at improving accessibility and raising audit satisfaction scores in line with our Q1 and Q2 OKRs. This initiative was driven by crucial feedback from major clients, including CBC and AARP, who raised concerns about the accessibility of our comment section and Engagement Starters (ES).
To address these issues:
In Q1, I conducted comprehensive compatibility testing with 20 diverse Fable users to evaluate accessibility.
In Q2, I performed five QA interviews with screen reader users, leading an in-depth audit.
The results were remarkable: 100% of users rated Viafoura's tools as "Easy to Use" and confirmed compliance with accessibility guidelines. Our ongoing goal is to resolve all client concerns and achieve WCAG Level AA compliance, ensuring an inclusive and seamless experience for all users.
The Problem: Accessibility Barriers
Several issues impacted the usability and accessibility of Viafoura’s commenting system:
1. Confusion
Key icons and images lacked essential ARIA labels.
Like/dislike buttons had incorrect ARIA labels, leading to confusion.
2. Uncertainty
Users couldn’t unflag comments, creating frustration.
No confirmation message after posting a comment/reply, leaving users unsure if their action was successful.
3. Overall UX Challenges
Screen reader users struggled to navigate between comments.
The sorting button was missing for screen reader users.
The dropdown menu was inaccessible for certain screen reader software.
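To make the labeling problem concrete, here is a minimal illustrative sketch (hypothetical helper, not Viafoura's actual widget code) of the attributes a like/dislike toggle needs so a screen reader announces both the action and the current state:

```typescript
// Hypothetical helper: computes the ARIA attributes a like/dislike
// toggle button should expose. Without these, screen readers announce
// a bare, ambiguous "button" — the confusion users reported.
type VoteKind = "like" | "dislike";

function voteButtonAria(kind: VoteKind, pressed: boolean, count: number) {
  return {
    // Announces the action and the running total, e.g. "Like comment, 12 likes".
    "aria-label": `${kind === "like" ? "Like" : "Dislike"} comment, ${count} ${kind}s`,
    // aria-pressed makes the control a toggle: state is read as
    // "pressed" / "not pressed" rather than being invisible.
    "aria-pressed": String(pressed),
  };
}
```

The same idea extends to the unlabeled icons and images: each needs an `aria-label` or alt text that names its purpose, not its appearance.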
Approach & User Testing
We conducted a multi-phase accessibility review with a focus on real user feedback.
Compatibility Testing
We partnered with Fable Tech Labs, engaging 20 users with diverse accessibility needs, including individuals who use screen readers, screen magnification, and alternative navigation methods.
Key Result: Most users rated Viafoura’s tools as Easy to Use and compliant with accessibility guidelines.
Flow 1: Replying to a Featured Comment (Mid-Article Widget)
Goal: Assess how accessible it is for users to interact with featured comments.
Test Outcomes:
✅ All five users completed the test and found it easy to use.
⚠️ Two users flagged poor button labeling as an issue.
⚠️ One user reported missing alt-text descriptions for images.
⚠️ One user found the comment/reply layout confusing.
Flow 2: Engaging with Polls as a Regular News Reader
Goal: Determine if users could locate and interact with polls effectively.
Test Outcomes:
✅ All five users completed the test and found it easy to use.
⚠️ Two users flagged poor button labeling.
⚠️ One user reported missing alt-text descriptions.
⚠️ One user found the comment/reply layout disorienting.
QA Testing with Screen Reader Users
I conducted three in-depth interviews with screen reader users across different software to evaluate interactions with:
The comment section
The sorting button & “All Comments” tab
The comment posting flow
Key Findings
The Solution
1. Addressing Key Accessibility Issues
Logged Jira tickets to fix ARIA labels, sorting buttons, and dropdown menus.
Designed new screen reader flows, adding heading level 4 structure so users can jump from comment to comment.
Began discovery work for real-time comment solutions in the mobile SDK.
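The heading and confirmation fixes can be sketched as follows. This is an illustrative mock-up under assumed markup, not the shipped widget code: each comment gets an `<h4>` heading as a screen-reader jump target, and post confirmations are announced through a polite live region.

```typescript
// Illustrative sketch (assumed markup, not Viafoura's shipped code).

// Heading level 4 gives screen-reader users a consistent per-comment
// landmark, so they can navigate comment-to-comment by heading.
function commentHeading(author: string): string {
  return `<h4 class="comment-author">${author}</h4>`;
}

// role="status" implies aria-live="polite": the confirmation is read
// aloud after the user's current action finishes, without moving focus —
// addressing the "did my comment post?" uncertainty users reported.
function confirmationRegion(message: string): string {
  return `<div role="status" aria-live="polite">${message}</div>`;
}
```

A detail worth noting: the live region element should already exist in the DOM before its text changes, or many screen readers will not announce the update.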
2. Real-Time Accessibility Enhancements
Current Desktop Implementation: Improved navigation, button labels, and real-time feedback.
New Mobile Implementation: Adapted accessibility features for mobile comment sections.
Results & Achievements
After implementing these changes, we conducted two rounds of accessibility reviews with CBC, one of our most prominent clients. Their response was overwhelmingly positive—they were "beyond impressed" with our accessibility improvements.
CBC even suggested that Viafoura showcase our work at industry events or publish a case study, recognizing our commitment to inclusive design.
Key Learnings
➡️ Prioritizing accessibility early in development prevents technical debt and improves overall UX.
➡️ Diverse accessibility users have different needs, shaped by their software preferences and experience levels.
➡️ A solution that works for one group (e.g., screen reader users) may not be ideal for another (e.g., users with motor impairments).
➡️ User interviews provide invaluable insights, making direct observation one of the most effective accessibility testing methods.