Press "Enter" to skip to content

Project Understood makes Google voice technology more inclusive for those with Down syndrome

A Canadian project aims to make voice assistants more accessible by having individuals with atypical speech offer recordings of their voices.


The Canadian Down Syndrome Society recently announced its latest program, Project Understood, which helps Google voice technology better understand individuals with Down syndrome. 

“People with Down syndrome have atypical speech; their facial structures are different. Their tongues are larger, thicker, and it means that they speak in a way that is not typical,” said Shelley Brown, chief strategy officer at FCB Toronto and leader of the Canadian Down Syndrome Society’s project. “As a result, they find voice technology really hard to use.” 

SEE: Amazon Alexa: An insider’s guide (free PDF) (TechRepublic)

Digital voice assistants like Amazon Alexa, Google Assistant, and Apple Siri have become staples of daily life, infiltrating smartphones and smart home devices. Juniper Research forecasts that the use of voice assistants will continue to grow worldwide, with eight billion projected to be in use by 2023. 

However, these voice assistants may not be able to help those with disabilities, a population that could benefit most from the tech, Brown said. 

“Voice technology could be so useful and important for people living with Down syndrome. It could be a technology that could make the difference for somebody to be able to live independently or not,” Brown said. “It is a very intuitive, easy way to get access to everything that a computer can do, from giving you a reminder for an appointment, directions on how to get somewhere, or how to do something.” 

Hey Google, how do we make voice assistants more accessible? 

Once the Canadian Down Syndrome Society decided to pursue this mission of making voice assistants more accessible, officials realized they needed the support of a big-name company, Brown said. 

“We did a lot of research and started calling a whole bunch of different places,” Brown said. “When we called Google, they were so open, excited, and receptive to the whole idea.” 

The first step Google and the society took was a test. “One of the questions that we didn’t have the exact answer to was: Are speech patterns of people with Down syndrome similar enough [to those without] that the voice technology can learn from them?” Brown asked.

To find the answer, Project Understood recruited nine people living with Down syndrome and asked them to record 1,700 words and phrases. The recordings would be run through a Google voice platform, and researchers would determine whether the technology could learn from repeated input of the voices, Brown said. 
 
The group received an exciting answer from the test: Yes. “On the basis of that initial test with nine people, it became very clear that the technology could learn from them and it just needed more data,” Brown said. 

Project Understood is still in the very early stages of development. Right now, Google gets approximately one out of three words wrong for someone living with Down syndrome. However, the system is extremely dynamic, meaning it will understand better as it is given more data, Brown said. 
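To put that “one out of three words” figure in context, speech recognition accuracy is commonly reported as word error rate (WER): the number of substituted, deleted, and inserted words divided by the number of words actually spoken. The sketch below is a minimal illustration of that metric only; it is not Google’s evaluation code, and the sample transcripts are hypothetical (the reference sentence borrows one of the project’s example phrases).

```python
# Minimal word error rate (WER) sketch -- illustrative only, not Google's
# evaluation pipeline. The reference/hypothesis transcripts are made up.

def wer(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + deletions + insertions) / words in reference."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard word-level Levenshtein edit distance via dynamic programming.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# A WER of roughly 0.33 corresponds to "one out of three words" misrecognized.
print(wer("the boy ran down the path", "the boy can down a path"))  # ~0.33
```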

“What we’re really looking for is to collect about a thousand voices of people with Down syndrome. The more data we can give the system, the smarter the system will be,” Brown said. “Right now, we have almost 300 participants already in the program, and we will continue to bring on more people living with Down syndrome so that they can teach Google how to understand them.”

“By teaching Google to understand people who are living with Down syndrome, it will have an impact for other people with different disabilities or different forms of atypical speech,” Brown said. “While the initial group of people who will benefit from [the project] are people with Down syndrome, their participation is actually going to make voice technology work better for many other people.”

Individuals with Down syndrome can help the cause by recording their own voices saying certain words and phrases, directly improving the accuracy of Google’s voice recognition models, Brown said. 

The press release offered some example phrases users may be asked to record: 

  • The boy ran down the path
  • Flowers grow in a garden
  • Strawberry jam is sweet
  • I owe you a yo-yo today

“We’ve always tried to reframe how people think about those living with Down syndrome. These people are not victims; they’re not ill. These are engaged individuals, and they have a point of view on the world,” Brown said. “Google is one of the biggest and smartest companies on the planet, and somebody living with Down syndrome is an expert in living with Down syndrome. This is a case where they can share their expertise and actually teach Google to be more helpful, not only for people with Down syndrome, but also people with any form of speech difficulties.”

For more, check out “How to make your sites accessible for all users: 3 tips for business owners” on TechRepublic.


Source: TechRepublic