RIT Takes Eye-Tracking Research to Next Level

How do we use our eyes to perceive the world? Could eye movements be windows into human cognition?

Scientist Jeff Pelz thinks so. The director of the Visual Perception Laboratory at Rochester Institute of Technology studies the link between eye movements and cognition. His latest research, in collaboration with the National Technical Institute for the Deaf (NTID), focuses on how deaf students process information in the classroom. Another project tracks how the human eye perceives high-speed motion on large-scale LCD monitors for Sharp Research Laboratory of America.

Until recently, visual perception research was rooted in laboratories where subjects looked at simple patterns on monitors in darkened rooms. Pelz argues that those experiments tell scientists little about how people use their eyes in daily life.

“The overarching question is how much of what we learn in the laboratory can we extend to the real world?” asks Pelz, an associate professor at RIT’s Chester F. Carlson Center for Imaging Science.

The wearable eye tracker—new technology developed in the Visual Perception Laboratory—is helping to answer those questions. The eye tracker has transformed the field of visual perception by enabling subjects to wear the technology outside of the laboratory and even outdoors.

“The system we’ve developed at RIT is unique in its ability to automatically monitor even complex tasks in a large range of environments,” Pelz says. “We can study students in a classroom or people finding their way in the woods.”

The wearable eye tracker extends the laboratory to the real world by recording what people look at and how their eyes move as they perform a specified task, such as attending to a lecture in a classroom, driving a car, walking or playing racquetball. In other words, the device tracks how eye movements support perception and what people pay attention to in order to gather the information they need to perform everyday activities.

Two eye-tracking models unique to RIT have different capabilities: one processes data in real time in any indoor setting outside the laboratory; the second fits neatly in a backpack and can be worn anywhere, even outdoors. The latter trades real-time processing for portability; data recorded in the field is processed later in the lab.

Developing wearable eye-tracking technology has long been one of Pelz’s goals. His own research received a boost from the U.S. Naval Research Laboratory in 2002, which established a cooperative agreement with Pelz’s lab to develop the device. One goal of that project has been to study how people locate difficult-to-find objects in natural scenes.

The portable eye-tracking equipment also has led to a National Science Foundation-funded collaboration with professor Marc Marschark and Carol Convertino of NTID at RIT. This project, now in its second year, uses eye tracking with hearing and deaf students in a simulated classroom. The study, sponsored by the NSF’s Research on Learning and Education program, seeks to understand how deaf students divide their attention between instructor, interpreter and a graphic display.

Another study, conducted with a team including graduate student Justin Laird and faculty members Mitch Rosen and Ethan Montag, is funded by the Sharp Research Laboratory of America. In this case, Pelz’s team is exploring how people view rapid motion on the new class of large displays. Knowing how fast an object is moving across a viewer’s retina—rather than on the screen—will help them write software algorithms to overcome hardware limitations.
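The distinction between screen velocity and retinal velocity comes down to viewing geometry: the image on the retina moves at the target's angular velocity minus the eye's own rotation, so a viewer who smoothly pursues an object sees far less retinal motion than the on-screen speed suggests. The short Python sketch below illustrates that relationship; the viewing distance, speeds, and function names are illustrative assumptions, not the team's actual algorithms.

```python
import math

# Illustrative sketch of screen-vs-retina motion (assumed values,
# not the RIT team's software).

def angular_velocity_deg_s(screen_speed_cm_s: float, distance_cm: float) -> float:
    """Angular velocity of a target crossing the screen center,
    using the small-angle approximation (speed / viewing distance)."""
    return math.degrees(screen_speed_cm_s / distance_cm)

def retinal_slip_deg_s(target_deg_s: float, eye_deg_s: float) -> float:
    """Velocity of the image on the retina: target motion minus
    eye rotation. If the eye tracks the target perfectly, slip is ~0."""
    return target_deg_s - eye_deg_s

# Example: an object moving 100 cm/s on a screen viewed from 200 cm,
# while the viewer's eye pursues it at 20 deg/s.
target = angular_velocity_deg_s(100.0, 200.0)   # ~28.6 deg/s
print(f"on-screen angular velocity: {target:.1f} deg/s")
print(f"retinal velocity: {retinal_slip_deg_s(target, 20.0):.1f} deg/s")
```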

"The challenges to the imaging designers of those systems cannot be solved without understanding how people move their attention, and therefore their eyes, when viewing fast-action sequences” says Pelz.

In the newest incarnation of the wearable eye tracker, Pelz uses a binocular system that monitors both eyes to study how people coordinate them to explore the third dimension.

“By tracking both eyes, we can measure the angle between them and calculate not only the direction, but also the distance to objects in the world,” Pelz says. “This lets us identify the point in 3-D space where they are paying attention instead of just the 2-D direction.”
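The calculation Pelz describes is simple triangulation: given the separation between the eyes and the vergence angle between the two gaze lines, the fixation distance falls out of the geometry. A minimal sketch in Python, assuming a typical interpupillary distance and symmetric fixation straight ahead (all values illustrative, not the lab's software):

```python
import math

IPD_CM = 6.3  # typical adult interpupillary distance (assumed)

def fixation_distance_cm(vergence_deg: float, ipd_cm: float = IPD_CM) -> float:
    """Distance to the fixated point, from the vergence angle between
    the two eyes' gaze lines, assuming symmetric fixation straight ahead."""
    half_angle = math.radians(vergence_deg) / 2.0
    return (ipd_cm / 2.0) / math.tan(half_angle)

# Example: a vergence angle of about 3.6 degrees corresponds to
# fixation roughly 1 meter away.
print(f"{fixation_distance_cm(3.6):.0f} cm")
```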

“We’ve learned a huge amount in the lab about what the visual system can do,” Pelz adds. “Now we’re beginning to learn what the visual system does in the real world.”

