Improving ASL communication
One day, Siri might be able to respond to questions using sign language in the same way she now speaks using voice output.
That may happen through work by Matt Huenerfauth and his research team, which is developing animations of American Sign Language (ASL)—a language that requires precise control of hand and body movement as well as facial expressions.
“Those are larger, fundamental aspects of how you make things look natural,” said Huenerfauth, professor in the B. Thomas Golisano College of Computing and Information Sciences and co-lead of the human-centered AI pillar. “Technology for automatically producing animations of ASL could make it easier for companies to provide information in sign language on websites, since updating these animations may be easier than re-recording videos of human ASL signers.”
Taking a machine-learning approach still involves some human linguistic intuition to tell the new system which characteristics are important. For example, machine-learning algorithms are trained to predict where to pause (the spaces between words) so that each pause falls at the actual end of a sentence in a conversation.
Research team members have thousands of hours of recordings of native signers, analyzed by expert linguists, which together make up this distinctive data set. The team applies a range of machine-learning models to the recordings to learn patterns in how humans move during sign language.
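The pause-prediction idea described above can be illustrated as a binary classifier over motion features. Everything in the sketch below is invented for illustration — the feature names, the synthetic data, and the simple perceptron-style model are assumptions, not the RIT team's actual system or data set.

```python
import random

# Hypothetical features for one moment in a sign stream: hand lowering,
# hold duration, and eyebrow change. All values here are synthetic.
def make_example(is_sentence_end):
    base = [0.8, 0.7, 0.6] if is_sentence_end else [0.2, 0.3, 0.1]
    features = [v + random.uniform(-0.1, 0.1) for v in base]
    return features, int(is_sentence_end)

random.seed(0)
data = [make_example(i % 2 == 0) for i in range(200)]

# Train a simple perceptron to flag sentence-final pauses (label 1).
w = [0.0, 0.0, 0.0]
b = 0.0
lr = 0.1
for _ in range(20):
    for x, y in data:
        score = sum(wi * xi for wi, xi in zip(w, x)) + b
        pred = 1 if score > 0 else 0
        err = y - pred                      # 0 when correct, +/-1 otherwise
        w = [wi + lr * err * xi for wi, xi in zip(w, x)]
        b += lr * err

def predict(x):
    """Return 1 if this moment looks like a sentence-final pause."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

accuracy = sum(predict(x) == y for x, y in data) / len(data)
```

In practice, a real system would use richer sequence models over continuous pose and facial-expression data rather than a per-frame linear classifier, but the core task — labeling which moments are true sentence boundaries — is the same.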
“There is a hesitancy among people who do natural language processing—also called computational linguistics—to apply their methods to sign language because it feels unfamiliar. We hope we are reducing a bit of this mystery and trying to create useful tools,” he said.
“A focus of our lab is to take an imperfect AI system and try to do something with it to help people now, even though we know it is not perfect; we wrap it up in some application that someone can derive a benefit from.”
Who is involved?
The Center for Human-Aware Artificial Intelligence (CHAI) was formed last year after more than 200 faculty, students and staff attended a retreat called Move 78 to talk about how RIT can further distinguish itself in new AI discoveries. Center leaders are:
Brain-inspired computing: Dhireesha Kudithipudi and Andreas Savakis, both professors of computer engineering;
Machine learning and perception: Christopher Kanan, assistant professor of imaging science and CHAI’s associate director, and Reynold Bailey, associate professor of computer science;
Automation and robotics: Ray Ptucha, assistant professor of computer engineering, and Ferat Sahin, professor of electrical engineering;
Human-centered AI: Cecilia Ovesdotter Alm, associate professor of computational linguistics and language science, and Matt Huenerfauth, professor of information sciences and technologies.
September 13, 2019
How a person vapes, not just what a person vapes, could also play a big role in vaping harm
Essay by Risa Robinson, professor and department chair, mechanical engineering, published by The Conversation.
September 13, 2019
RIT Sponsored Research garners $74 million in funding
RIT had its second-best year ever in sponsored research funding and a record year for research expenditures in fiscal year 2019. RIT received 366 new awards totaling $74 million in funding, and expenditures grew to $61 million.
September 12, 2019
Scientists developing single photon detector to help search for habitable exoplanets
NASA announced it is awarding a team of researchers from Rochester Institute of Technology and Dartmouth College a grant to develop a detector capable of sensing and counting single photons that could be crucial to future NASA astrophysics missions. The extremely sensitive detector would allow scientists to see the faintest observable objects in space, such as Earth-like planets around other stars.
September 11, 2019
Could a toilet seat help prevent hospital readmissions?
Guest essay by Nicholas Conn '11, '13 MS (electrical engineering), research scientist and founder and CEO of Heart Health Intelligence, published by The Conversation.