Creating tomorrow’s technology today

Researchers find new interactions in Future Everyday Technology research lab


In the FETLab, Amanda Yung, a human-computer interaction master’s student, and Zhiyuan Li, a computing and information sciences Ph.D. student, are making it easier for non-experts to build their own smart devices. (Photo by A. Sue Weisler)

Daniel Ashbrook wants to do more than imagine a future where computers are embedded in the world around us. He wants to help create it.

The assistant professor of information sciences and technologies is working with student researchers in RIT’s Future Everyday Technology—or FET—research lab, to stretch the boundaries of human-computer interaction. Together, they are finding new ways to improve the usability of wearable and mobile computing devices and making it easier for people to create their own smart gadgets.

“In other words, we’re designing, prototyping and studying interactive technology that may one day become woven into the fabric of our everyday lives,” said Ashbrook, director of the lab. “The key is that our work focuses on the people using this technology, just as much as it focuses on the computing.”

Located on the second floor of RIT’s B. Thomas Golisano College of Computing and Information Sciences, the FETLab is home to a dozen Ph.D., graduate and undergraduate students who are excited about creating tools of the future.

Amanda Yung, a human-computer interaction master’s student, and Zhiyuan Li, a computing and information sciences Ph.D. student, are making it easier for non-experts to personally fabricate their own Internet of Things (IoT) devices, such as a gadget that sticks to the side of a dryer and sends a text message when your laundry is done.

Using littleBits—modular circuit components that easily snap together—and a 3D printer, someone can create an IoT gadget in the shape of the Twitter logo, the Batman symbol or any other design, Li explained. Normally, fitting the circuit inside a custom container requires 3D modeling expertise, but Yung and Li want to make the process accessible to anyone with a web browser and a 3D printer.

“Using an HP Sprout computer, we are creating augmented fabrication—where users can physically move the real littleBits around to see how they would fit within a digitally projected 3D object,” said Yung, who is also a senior programmer at the University of Rochester. “The printable container is automatically created based on where the user places the littleBits.”

Rather than the user interacting with the technology, augmented fabrication shifts the focus to the technology interacting with the user, Yung said.
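The core of that workflow—turning the physical placement of modules into a printable container—can be illustrated with a small sketch. This is a hypothetical example, not the FETLab's actual software: the `Module` class, function names and dimensions are all invented, and a real system would emit full 3D geometry rather than a flat footprint.

```python
# Hypothetical sketch: derive a printable container footprint from where a
# user has placed circuit modules. All names and dimensions are invented
# for illustration; this is not the FETLab's actual pipeline.

from dataclasses import dataclass

@dataclass
class Module:
    """A placed circuit module, axis-aligned, in millimeters."""
    x: float       # left edge
    y: float       # bottom edge
    width: float
    height: float

def container_footprint(modules, wall=2.0, clearance=1.0):
    """Return (x, y, width, height) of the smallest container that
    encloses every module, with `clearance` mm of free space around
    the parts and `wall` mm of printed material beyond that."""
    if not modules:
        raise ValueError("no modules placed")
    margin = wall + clearance
    min_x = min(m.x for m in modules) - margin
    min_y = min(m.y for m in modules) - margin
    max_x = max(m.x + m.width for m in modules) + margin
    max_y = max(m.y + m.height for m in modules) + margin
    return (min_x, min_y, max_x - min_x, max_y - min_y)

# Example: two snapped-together modules laid out side by side.
layout = [Module(0, 0, 24, 12), Module(24, 0, 24, 12)]
print(container_footprint(layout))  # (-3.0, -3.0, 54.0, 18.0)
```

As the user drags modules around under the projected 3D object, a function like this would rerun to keep the container sized to the current layout.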

Other research in the lab looks at how people interact with wearable and mobile devices. Dhwanit Mehta, a 2017 HCI master’s graduate, created software that lets round Android smartwatches display icons in a circular interface rather than a vertical list, which can show fewer items at once. Ashbrook worked with computing and information sciences Ph.D. student Carlos Tejada to create Bitey, a wearable microphone that lets people communicate with a computer by clicking their teeth, enabling accessible, hands-free interface control.
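The geometry behind a circular icon interface is simple to sketch: space the icons evenly around a ring inset from the bezel so each one fits on the round face. The snippet below is an illustrative guess at that layout math—the function name, radii and icon count are invented, and it is not Mehta's actual implementation.

```python
# Hypothetical sketch of evenly spacing icons around a round watch face.
# Names and numbers are invented for illustration; this is not the
# FETLab's actual smartwatch software.
import math

def circular_layout(n_icons, face_radius, icon_radius):
    """Place n_icons evenly around a ring inset so each icon fits fully
    inside a round face of radius face_radius. Returns a list of (x, y)
    icon centers, with the face center at (0, 0) and y growing downward
    (screen coordinates), starting from the 12 o'clock position."""
    ring = face_radius - icon_radius   # inset so icons stay inside the bezel
    positions = []
    for i in range(n_icons):
        theta = 2 * math.pi * i / n_icons - math.pi / 2
        positions.append((ring * math.cos(theta), ring * math.sin(theta)))
    return positions

# Eight 24-pixel icons on a face with a 200-pixel radius.
for x, y in circular_layout(8, 200, 24):
    print(round(x), round(y))
```

Unlike a vertical list, which clips to a narrow band of a round display, a ring like this uses the full perimeter, so more items stay visible at once.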

While the projects being created in the FETLab vary, Ashbrook says they are all aimed at letting people be less focused on their technology and more engaged with the world, while still reaping the creative and productive benefits of high-tech devices.