The Multidisciplinary Vision Research Laboratory, or MVRL, studies how humans extract visual information, and how that information is used to make decisions and guide actions. The research relies on eye-tracking equipment that monitors the position of subjects' eyes to determine what they are looking at. The work can be applied to many different fields, including linguistics, computer science, psychology, marketing, and geology.
The geological applications of eye tracking are currently a main focus of the lab. A five-year research grant from the National Science Foundation has allowed researchers from RIT and the University of Rochester (UR) to travel to various locations in California and Nevada to study the differences between how novice and expert geologists perceive scenes in the field. They have traveled through many geologically active areas, including the Sierra Nevada mountain range, formed by the movement of tectonic plates, and Death Valley and its adjacent basins, formed by the same movement. UR geophysicist John Tarduno leads the students and experts on the 10-day field trip, and Robert Jacobs, a brain and cognitive science professor at UR, leads the team's analysis of cognitive and perceptual learning.
"A major component of geology is about what you can see and what you can experience," says Tommy Keane, an RIT imaging science doctoral student. "Being able to go out and have geologists look at scenes in the field will enable us to learn a great deal about how novices and experts see and learn."
This research aims to improve geology education. By studying how a novice explores a scene and comparing it with how an expert examines the same scene, the researchers hope to reveal search patterns that can be used to teach novices how to better identify geologically significant features in a scene.
Keane, who earned his bachelor's and master's degrees from RIT in electrical engineering, and Thomas Kinsman, an RIT imaging science doctoral student, are part of a team working on improving the way in which the immense amounts of data from the mobile eye-tracking devices are processed. Currently, it takes 15-20 hours to process the data from a single eye tracker, and much of the work has to be done by hand. Along with other researchers, Keane and Kinsman are attempting to add more automation that will serve to streamline the process—no small task.
"There is a lot of ambiguity and complexity in the analysis of these data," says Keane. "We have to analyze not only where people look, but also how long they look there, how many times they come back to a point, and the order in which they look at things."
Most of their efforts to date have focused on developing new methods of capturing, analyzing, and evaluating the complex data. The research has shown that traditional low-level metrics, such as how long individuals look at each point in the scene, or how large individual eye movements are, do not reveal differences between novices and experts. Higher-level metrics that measure the cyclical nature of the participants' viewing patterns have revealed the subtle differences between the groups.
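The contrast between low-level and higher-level metrics can be illustrated with a minimal sketch. This is illustrative only: the region labels, the data layout, and the simple revisit-rate measure are assumptions for the example, not the lab's actual analysis pipeline.

```python
def mean_fixation_duration(fixations):
    """Low-level metric: average fixation duration in milliseconds.
    `fixations` is a list of (roi_label, duration_ms) pairs."""
    return sum(d for _, d in fixations) / len(fixations)

def revisit_rate(fixations):
    """Crude higher-level metric: the fraction of fixations that return
    to a previously viewed region after looking elsewhere, capturing
    the cyclical character of a viewing pattern."""
    rois = [roi for roi, _ in fixations]
    seen = set()
    revisits = 0
    for i, roi in enumerate(rois):
        if roi in seen and rois[i - 1] != roi:
            revisits += 1
        seen.add(roi)
    return revisits / len(rois)

# Hypothetical scanpath over labeled regions of a geological scene
scan = [("fault", 300), ("ridge", 250), ("fault", 280),
        ("basin", 400), ("ridge", 220)]
```

On this toy scanpath, two novices and an expert could share the same mean fixation duration while differing sharply in how often their gaze cycles back to earlier regions, which is the kind of distinction the higher-level metrics are designed to surface.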
RIT was one of the first universities to pioneer wearable eye-tracking devices, which afford the ability to study eye movements in real-world environments instead of a laboratory. Jason Babcock, an alumnus of RIT, perfected the system in the late 1990s and now runs his own company called Positive Science, which manufactures eye trackers. Currently, the MVRL uses 12 mobile eye trackers—more than any other research facility in the world.
Jeff Pelz, Frederick Wiedman professor and co-director of the Multidisciplinary Vision Research Laboratory, who describes himself as a "gadget guy," says that one of the goals of the MVRL is to develop new systems that allow researchers to keep pushing the envelope in vision experiments. As technology advances, so too does the MVRL's approach to eye tracking.
"We have to keep up with the available technology," says Pelz. "Cameras are smaller, lighter, higher resolution, and have higher frame rates than ever before. We're exploring environments with eye trackers that would have been impossible 10 years ago."
Keeping up with technology may present some challenges, but keeping up with the human eye presents entirely different ones. The eye makes some of the fastest movements of any part of the human body. That's why it is crucial for researchers in the MVRL to take advantage of the newest cameras and computers.
The mobile eye tracker currently utilizes an Apple MacBook Air in a backpack, hooked up to a wearable headset that uses a pair of cameras and an infrared LED to record eye movements. Weighing in at less than three pounds, the MacBook system is lighter than the older camera-based system that was used for the job. In a pilot project underway in the lab, funded in part by a CIS micro grant and RIT's Kate Gleason College of Engineering, Dorin Patru, professor in electrical engineering, is working toward a wearable computer that weighs only a few ounces, thereby increasing the mobility of the device even further.
Sometimes, however, simple technologies are required to facilitate eye tracking. While working outdoors in places such as California, researchers had to wear large hats to shield the eye trackers from direct sunlight. The eye trackers utilize a small infrared LED, which is limited in power, making it difficult to detect due to the large amount of infrared light coming from the sun.
Originally named the Visual Perception Lab, the MVRL has grown considerably over its 20-year history under the leadership of Pelz and co-directors Anne Haake, professor and associate dean for research and scholarship in RIT's B. Thomas Golisano College of Computing and Information Sciences, and Andrew Herbert, professor and chair of the psychology department in RIT's College of Liberal Arts. About five years ago, the lab was renamed to reflect its multidisciplinary approach to research.
"The MVRL is a lot more than just a collection of rooms," says Pelz. "We wanted the name of the lab to reflect the fact that the research involves students, staff, and faculty from several colleges, departments, and disciplines across RIT. The faculty benefit greatly from the fact that we have undergraduate and graduate students actively involved from imaging science, psychology, motion picture science, computer science, linguistics, interpreting, and engineering, to name a few. We think the students benefit from an environment where faculty from all the colleges come together to collaborate."
Imagine what it would be like if you could see what people were thinking. You could learn quite a bit about people's brains. In a sense, eye tracking does exactly this.
"People don't typically think about where they are looking," says Pelz. "If you can keep track of where people look, you can keep track of people's cognition."
The key to success, according to Pelz, is that eye tracking must be externally observable and must not interfere with the subject. If subjects have to consciously think about where they are looking, the results will be tainted.
"The assumption, of course, is that people are thinking about what they are looking at," he says. "If this is true, eye tracking becomes a powerful tool to measure cognitive processes and cognitive load."
The wide variety of research that has been done at the MVRL reflects the strong connection between vision and the brain. Past research has included examining the eye movements of automobile drivers on the road, doctors while examining patients' images, and deaf and hard-of-hearing people while viewing a presentation and following an interpreter.
Cecilia Ovesdotter Alm, assistant professor of English in RIT's College of Liberal Arts, is part of a multidisciplinary team, led by Haake, working on a research project to create a multi-modal content-based image retrieval system. The goal of the research, funded by the National Science Foundation and the National Institutes of Health, is to leverage several modes of input from experts viewing images. In the current work, trained dermatologists explain to physician assistant students how they reach a diagnosis from a medical image. The eye movements of the dermatologists are recorded along with their spoken explanations. Alm and the students working on the project are developing new methods to extract meaning from the synchronized combination of gaze patterns and utterances, with the goal of using the results to build new teaching systems for medical students.
Alm, who is a computational linguist, says the lab is a good example of how interdisciplinary research produces positive results. "It provides a new perspective to work with people from other areas of expertise," says Alm. "I think the Multidisciplinary Vision Research Laboratory reflects the very spirit of RIT."