Using artificial intelligence (AI) and augmented reality (AR), a team of engineers and researchers from tech giant Huawei has created a new method for teaching deaf children to read, with the secondary goal of inspiring a love of reading.

Born from the difficulty hearing parents encounter when attempting to read books to their young deaf children, the StorySign app relies on AI to help children read printed books. For deaf children, learning to read is often plagued with challenges: without the context that spoken language provides, a book can appear as a jumble of words on a page, meaning very little.

As such, developers embedded an AI/AR system in the StorySign app that works on a phone or tablet. To operate it, the child or parent hovers the device above the page of a book; an avatar then appears superimposed alongside the text, translating the words for the child by signing them. As the story progresses, the app highlights each word as it is signed, helping the child make the connection between the printed word and the sign that represents it. Additionally, as the child learns to read, the parent learns sign language.
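For readers curious how such a read-along loop might fit together, here is a minimal Python sketch of the idea: recognize the words on the page, look up a sign-animation clip for each, and highlight the current word while its clip plays. Everything in it (recognize_words, SIGN_CLIPS, the clip names) is a hypothetical stand-in for illustration, not Huawei's actual code or API.

```python
# A minimal, hypothetical sketch of the read-along loop described above.
# This is NOT Huawei's implementation: the word recognizer is stubbed out,
# and SIGN_CLIPS stands in for whatever asset store maps printed words to
# avatar animations in the real app.

SIGN_CLIPS = {
    "the": "clip_the.anim",
    "cat": "clip_cat.anim",
    "sat": "clip_sat.anim",
}


def recognize_words(page_image) -> list[str]:
    """Stub for the camera/OCR step that finds the printed words.

    The real app presumably runs an on-device text recognizer; here we
    simply return a fixed sentence so the sketch is runnable.
    """
    return ["the", "cat", "sat"]


def play_story(page_image) -> None:
    """Highlight each recognized word while its sign clip 'plays'."""
    words = recognize_words(page_image)
    for i, word in enumerate(words):
        clip = SIGN_CLIPS.get(word)
        # Render the line with the current word bracketed, mimicking the
        # on-screen highlight that tracks the avatar's signing.
        line = " ".join(f"[{w}]" if j == i else w for j, w in enumerate(words))
        print(f"{line}  ->  playing {clip or 'fallback animation'}")


if __name__ == "__main__":
    play_story(page_image=None)  # no camera in this sketch
```

The key design point the sketch mirrors is the synchronization: the highlight advances one word at a time, in lockstep with the signing, so the child always sees which printed word corresponds to the sign being shown.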

Source: Huawei

Working with Aardman Animations, the researchers concentrated on creating a relatable avatar. The result is an avatar named Star, who is capable of more than just hand gestures: Star also uses body movements and facial expressions to convey emotion, ultimately helping the child better grasp the context of the book. Currently, the app works only with children’s books published by Penguin Random House.

Developed with support from the European Union of the Deaf and the British Deaf Association, the app is free and available for Android and Apple devices. Currently, it is available only in 10 languages across 10 European countries; if the app proves successful, it will be rolled out to more locations and in more languages.

To see how StorySign works, watch the accompanying video courtesy of Huawei.

To contact the author of this article, email mdonlon@globalspec.com