Animation helps Web-based sign language come alive

The Web is full of pages written in hundreds of different languages, allowing organizations to make information accessible to readers around the world.

But how do you provide a webpage in American Sign Language? Currently, some websites include videos of human signers, but these can be costly and time-consuming to update as information changes.

To make it easier for companies and organizations to offer sign language on their websites, RIT Associate Professor Matt Huenerfauth is creating software that automatically produces editable ASL animations based on online content. Just as someone can edit a single word on a webpage, they could easily change one or two signs in an animation. The research aims to make online information and services more accessible for the nearly half a million users of ASL in the United States.

“If your website only has text—no videos or audio—you might think it’s completely accessible for someone who is deaf or hard of hearing,” said Huenerfauth, who teaches in the Department of Information Sciences and Technologies. “But that’s not necessarily the case.”

Standardized testing has revealed that many deaf adults in the United States have lower levels of English literacy than their hearing peers; the majority of deaf high school graduates read English at only a fourth-grade level.

Huenerfauth first began working with computational linguistics as a student, helping a professor design grammar-checking tools specifically for deaf children.

“The software could identify a problem in somebody’s writing, but when it came time to explain how to fix it, the message would pop up in English,” Huenerfauth said. “That’s not very helpful if your native language is ASL.”

Huenerfauth began looking for ways to create just-in-time messages in sign language, based on an easy-to-update script as input. The challenge was creating linguistically accurate and understandable animations of virtual humans.

“Subtle eyebrow movements, the wrinkling of the forehead and the few milliseconds of pausing between sentences are just a few of the details that dramatically affect comprehension,” said Huenerfauth.

In order to make truly realistic sign-language animations, Huenerfauth began using motion-capture equipment—the same tools used by movie studios to make animated films—to create a digital dictionary of signs. Wearing motion-capture data gloves, a spandex suit and a head-mounted sensor, research participants would perform thousands of ASL sentences and words.

Using the collected data, Huenerfauth’s team bases the movement of its animated characters mathematically on the way humans actually move when signing. His Linguistic and Assistive Technologies Laboratory has recorded almost 20 hours of video with more than a dozen native signers, carefully analyzing each frame (30 per second) to identify the words and linguistic structure.

Huenerfauth began this work at the City University of New York before moving to RIT in 2014, and since 2007 he has secured more than $2 million in grants to support his research. He has also received a prestigious CAREER award from the National Science Foundation.

Today, Huenerfauth is working with RIT graduate students to create a software tool that uses the Microsoft Kinect to give sign language students immediate feedback about how fluid and accurate their signing and facial expressions are.

“This tool is not a replacement for human ASL instructors, who provide sophisticated feedback,” Huenerfauth said. “It’s a way to make it easier for more students to practice and learn sign language.”