Designing for dialogue: How RNID and Sonos developed the Arc Ultra's Speech Enhancement feature

Anna Mitchell speaks to RNID’s Dr Lauren Ward to uncover how Sonos and RNID joined forces on a new approach to TV audio and why early, community-led collaboration was crucial.

When Sonos released its new Speech Enhancement feature for the Arc Ultra soundbar, it arrived with a clear promise: clearer, more intelligible dialogue for TV and film. But the significance of the announcement runs deeper than a technical refinement. The feature is the result of an unusually open, early and sustained collaboration between Sonos and RNID, the UK charity supporting the 18 million people who are deaf, have hearing loss or tinnitus. For RNID technology lead Dr Lauren Ward, it represents the kind of partnership that could, and should, become standard practice in consumer electronics.

Real co-design

Ward is one of two technology leads at RNID, where she works across three strands: helping communities understand the technology available to them, advising companies on how to design better products, and contributing to policy work around technology and inclusion. Alongside this strategic remit, she has a PhD in acoustics and audio engineering, focused on accessible TV audio. “We’ve built the RNID team so that we can do real co-design with companies,” she explains. “Accessibility isn’t just about specialist devices anymore. It’s integrated into consumer electronics, which means we need both the engineering expertise and the user research expertise in-house.”

“Installers are absolutely going to be working with clients who have some degree of hearing loss, even if it’s not discussed upfront.”

That blend was central to the Sonos project. Unlike many manufacturers, which approach RNID late in development to validate a near-finished feature, Sonos arrived much earlier. “Historically, we’ve had companies come to us at beta stage saying, ‘Can you test this before release?’ We can, but they’ve already committed to many aspects of the design, so it’s difficult to make any meaningful change,” Ward says. “Sonos came to us when they knew the technology was capable, but they didn’t yet know what people with hearing loss actually needed.”

Lived experience

The collaboration began with a workshop involving Sonos engineers and what RNID calls “expert listeners”, participants who have experience in audio or music and also have hearing loss. “It was a great introduction because they could speak the same technical language as Sonos’ engineers, but also describe the lived experience,” Ward says. The session covered fundamentals: what hearing loss is, how it affects sound perception, and what technologies people typically have in their living rooms. This laid the groundwork for two substantial user studies.

The first study brought in 27 participants, each with differing degrees of hearing loss, to test prototype versions of speech processing. The aim was to understand how far speech should be enhanced and what level of audio artefacts users could tolerate. The findings challenged Sonos’ expectations. “People wanted the enhancement to go much further than the engineers assumed,” says Ward. “And they were much less sensitive to artefacts. For many people with hearing loss, the artefacts simply weren’t perceptible. And even when they were, the trade-off for clearer speech was worth it.”

A second study with 19 participants compared multiple variations of the feature running on hardware. Participants listened to A/B comparisons and selected what worked best. Sonos had originally planned three levels of enhancement; testing showed users needed four. The highest setting, Max, more than doubled the enhancement Sonos first considered as the top level. “It became around seven dB of lift to the speech,” Ward notes.

Loudness recruitment

One of the project’s biggest insights came from watching people respond physically to loudness. “There’s a phenomenon called loudness recruitment, where the comfortable listening range becomes much smaller; both the soft sounds and the tolerance for loud sounds shift,” Ward explains. “So just turning the TV up can very quickly become uncomfortable or even painful.” Sonos engineers saw this first-hand during the workshops. “Watching someone recoil from loudness really drives home why dynamic range has to be carefully controlled,” she says. This influenced how the Max setting behaves, prioritising speech without amplifying other elements beyond an individual’s comfort zone.

A final beta phase followed, with a subset of participants testing the feature in their own homes. “That stage showed how the AI behaved with real content, real rooms and real viewing habits,” says Ward.

“Ask questions. Do they use hearing aids? Do they stream audio directly from the TV? Do they prefer immersive surround sound even if speech is harder to follow, or do they want absolute clarity without relying on subtitles?”

Personalisation

RNID also collaborated with Sonos’ UI designers to ensure the feature would be discoverable and usable. Ward emphasises that presenting speech enhancement purely as an “accessibility” feature can limit adoption. “Subtitles used to be seen as only for people with hearing loss. Now lots of people use them; it’s become personalisation. Making the feature visible on the main Now Playing screen makes it easy to use and normalises it,” she says. The final design places the Speech Enhancement button front and centre in the app, rather than hidden in an accessibility submenu.

While Speech Enhancement arrived as a software update, Ward emphasises that AI-driven features still depend on hardware decisions made long before release. “It’s great that features can be deployed through software updates,” she says, “but this can only happen if the device has the processing capacity. You need that foresight early on: enough CPU space that isn’t already given over to other functions.” Once hardware is in the field, she adds, “you can’t just send out another chip. If the processing headroom isn’t there, your ability to add meaningful AI features is limited.”

“Clear audio straight to someone’s ear will always outperform room-based processing.”

What can installers learn?

For custom installers, Ward says the Sonos project underscores something the industry must internalise: people with hearing loss are not a niche group, and their needs are highly individual. “It’s one in three people in the UK,” she notes. “Installers are absolutely going to be working with clients who have some degree of hearing loss, even if it’s not discussed upfront.”

Her strongest recommendation is simply to start a conversation. “Ask questions. Do they use hearing aids? Do they stream audio directly from the TV? Do they prefer immersive surround sound even if speech is harder to follow, or do they want absolute clarity without relying on subtitles?” Understanding the client’s goals matters more than assuming they require a particular technology.

She also encourages installers to undertake deaf awareness training. “You don’t need to know every hearing aid, but you should understand the basics of what people might be using and how it interacts with home systems,” she says. A lack of interoperability is another key barrier. “Lots of devices have great features, but people don’t know where to find them or how to make the ecosystem work without an engineering degree.”

Whenever possible, installers should offer clients the opportunity to test systems. “These are major purchases. People want to know that the sound system will work with their existing devices, especially hearing technology,” Ward explains.

Looking ahead, she sees Auracast, which allows audio to be broadcast directly to compatible Bluetooth LE Audio devices, as a transformative technology in home environments. “Clear audio straight to someone’s ear will always outperform room-based processing. It’s a game changer for watching TV with family or friends,” she says.

The Sonos partnership, Ward believes, demonstrates what becomes possible when accessibility is not treated as a late-stage refinement: “Early engagement isn’t just good design practice. It’s how you build technology that genuinely changes people’s experience of the world.”

Image credits: N Universe/Shutterstock.com