Animation helps Web-based sign language come alive
The Web is full of pages written in hundreds of different languages, allowing organizations to make information accessible to readers around the world.
But how do you provide a webpage in American Sign Language? Currently, some websites include videos of human signers, but these can be costly and time-consuming to update as information changes.
To make it easier for companies and organizations to offer sign language on their websites, RIT Associate Professor Matt Huenerfauth is creating software that automatically produces editable ASL animations based on online content. Similar to editing one word on a webpage, someone could easily change one or two signs in an animation. The research aims to make online information and services more accessible for the nearly half a million users of ASL in the United States.
“If your website only has text—no videos or audio—you might think it’s completely accessible for someone who is deaf or hard of hearing,” said Huenerfauth, who teaches in the Department of Information Sciences and Technologies. “But that’s not necessarily the case.”
Standardized testing has revealed that many deaf adults in the United States have lower levels of English literacy; the majority of deaf high school graduates read English at only a fourth-grade level.
Huenerfauth first began working with computational linguistics as a student, helping a professor design grammar check tools specifically for deaf children.
“The software could identify a problem in somebody’s writing, but when it came time to explain how to fix it, the message would pop up in English,” Huenerfauth said. “That’s not very helpful if your native language is ASL.”
Huenerfauth began looking for ways to create just-in-time messages in sign language, based on an easy-to-update script as input. The challenge was creating linguistically accurate and understandable animations of virtual humans.
“Subtle eyebrow movements, the wrinkling of the forehead and the few milliseconds of pausing between sentences are just a few of the details that dramatically affect comprehension,” said Huenerfauth.
In order to make truly realistic sign-language animations, Huenerfauth began using motion-capture equipment—the same tools used by movie studios to make animated films—to create a digital dictionary of signs. Wearing motion-capture data gloves, a spandex suit and a head-mounted sensor, research participants would perform thousands of ASL sentences and words.
Using the collected data, the lab bases the movement of its animated characters mathematically on the way humans actually move when signing. Huenerfauth’s Linguistic and Assistive Technologies Laboratory has recorded almost 20 hours of video with more than a dozen native signers, carefully analyzing each frame of video (30 per second) to identify the words and linguistic structure.
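To give a sense of the scale of this annotation work, here is a minimal sketch (hypothetical sign labels and data layout, not the lab's actual format) of how frame-level sign annotations in 30-frames-per-second video map to timestamps:

```python
# Frame-accurate annotations at 30 fps: each sign is labeled with the
# start and end frame where it appears in the recorded video.
# The sign labels and frame numbers below are made-up examples.
FPS = 30

annotations = [
    ("HELLO", 12, 40),   # (sign label, start frame, end frame)
    ("MY", 45, 60),
    ("NAME", 66, 95),
]

def frame_to_seconds(frame, fps=FPS):
    """Convert a frame index to a timestamp in seconds."""
    return frame / fps

for sign, start, end in annotations:
    t0 = frame_to_seconds(start)
    t1 = frame_to_seconds(end)
    print(f"{sign}: {t0:.2f}s - {t1:.2f}s ({end - start} frames)")
```

At 30 frames per second, the lab's nearly 20 hours of recordings amount to more than two million individual frames, which is why frame-by-frame linguistic annotation is such labor-intensive work.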
Huenerfauth began this work at the City University of New York and moved to RIT in 2014; since 2007, he has secured more than $2 million in grants to support his research. He has also received a prestigious CAREER award from the National Science Foundation.
Today, Huenerfauth is working with RIT graduate students to create a software tool that uses the Microsoft Kinect to give sign language students immediate feedback about how fluid and accurate their signing and facial expressions are.
“This tool is not a replacement for human ASL instructors, who provide sophisticated feedback,” Huenerfauth said. “It’s a way to make it easier for more students to practice and learn sign language.”