Immersive Technologies: Virtually Endless Possibilities

Researchers at RIT are using extended reality (XR) to expand experiences.

They have made it possible to virtually print The Works of Geoffrey Chaucer on a historic printing press and take the stage in a virtual theater. RIT experts are also using XR to study blindness and working to ensure the technology is accessible to all.

Immersive technologies have been studied at RIT for more than 20 years. Today, students and faculty continue to embrace XR as the developing technology grows more powerful, shrinks in size, and becomes more affordable.

In 2016, the university formed a collaborative community and annual symposium for XR enthusiasts called Frameless Labs. In addition to games, the projects include scientific research, experiences, and narratives.

For the last three years, Animation Career Review has ranked RIT one of the top 10 AR/VR colleges in the country. Most recently, students started a Virtual/Augmented Reality club with more than 100 members.

“We’ve been at the forefront of this technology and our students get it—they know it’s here to stay,” said Susan Lakin, professor in the School of Photographic Arts and Sciences and director of Frameless Labs. “As more people get these devices, we want to be ready with interesting applications.”

Read about four RIT projects in the world of XR.

What’s the difference?


Extended reality (XR):
Umbrella term for any technology that adds digital elements to the real world.

Virtual reality (VR):
Immersive experience within a computer-generated simulated world.

Augmented reality (AR):
Technology that combines real and computer-generated content.

Mixed reality (MR):
A hybrid environment blending physical and digital elements.

Metaverse:
A network of immersive digital spaces.

A person performing virtual karaoke in the foreground with the virtual result in the background, showing them as an animated avatar.

Transporting the audience with 
mixed reality theater

For Joe Geigel, professor of computer science, all the world’s a stage—especially the virtual world.

In college, Geigel was working backstage for a student players group. While setting up lights for Cabaret, a friend asked, “Wouldn’t it be cool if we could do all this on the computer?” From that point on, it became Geigel’s dream.

Professor Joe Geigel standing in a computer lab.

Joe Geigel


Professor of computer science

At RIT, Geigel began working with Marla Schweppe, now an emeritus professor of 3D digital design. In 2004, the interdisciplinary duo published a paper on theatrical storytelling in a virtual space. The paper led to an NSF grant to make it happen.

“When you put on that headset and you’re in a 3D world, it becomes an impressively immersive experience that you don’t get through 2D video,” said Geigel. “Using XR in the theatrical process opens new avenues for artistic expression and allows us to create effects that would be difficult or impossible to achieve on a physical stage.”

Since then, Geigel has worked on more than 15 XR performances and productions, several taking place in RIT’s MAGIC Spell Studios. One of them, “Been Set Free,” was a live dance interpretation of a song set on a virtual stage, with participants in different physical locations. The audience used headsets to watch a dancing avatar, whose motions were guided by a live performer outfitted with a full-body motion capture system.

“Next, we’re looking at applications in facial motion capture,” said Geigel. “This has been a hard problem for a long time—it’s even expensive for movie makers. But now, the tech is getting to the point where you can do good facial capture with your smartphone.”

Geigel and seven student researchers are now creating avatars for live performances of virtual karaoke and skits. It’s called the XRLive project, and it’s run through RIT’s new Vertically Integrated Projects research program for undergraduate and graduate students.

Rafa Davis, a first-year game design and development major, started XRLive on the acting side. Davis joined the student-run theater club RIT Players to make new friends and ended up volunteering for an XR performance of the comedy routine “Who’s on First?”

“You’re wearing motion capture bands on your wrists and ankles and a bike helmet with a phone filming your facial expressions, but you eventually forget that it’s all there,” said Davis, who is from Oakland, Calif.

Davis is researching the technical side as well, creating open-source tools and technical support for the live XR-enhanced performances. The team works to integrate and sync motion capture systems with Unreal Engine. Davis said the challenge comes from using different technologies that have never been connected before.
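The hard part of wiring together capture systems that have never been connected is keeping their data streams in step. A minimal sketch of that idea, assuming two hypothetical timestamped streams (body bands and a face-tracking phone) that must be sampled on a common clock before frames are forwarded to the engine:

```python
# Hypothetical sketch: aligning timestamped frames from two capture
# sources to a common clock. Stream contents and names are illustrative;
# the XRLive project's actual pipeline is not public.

from bisect import bisect_left

def interpolate(frames, t):
    """Linearly interpolate a sorted stream of (timestamp, value) frames at time t."""
    times = [ts for ts, _ in frames]
    i = bisect_left(times, t)
    if i == 0:
        return frames[0][1]
    if i == len(frames):
        return frames[-1][1]
    (t0, v0), (t1, v1) = frames[i - 1], frames[i]
    w = (t - t0) / (t1 - t0)
    return v0 + w * (v1 - v0)

def sync(body_frames, face_frames, t):
    """Sample both streams at the same instant so the avatar's body
    and face stay in step despite different capture rates."""
    return interpolate(body_frames, t), interpolate(face_frames, t)
```

Because each device reports on its own schedule, resampling to one shared timeline is what lets a single avatar frame combine body and face data captured milliseconds apart.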

“It’s a win-win because I really like the people on the XR project and this will give me practical experience for my major,” said Davis. “We can create all kinds of full experiences that engage your senses using XR. I think it’s the future.”

Omolayo Olawuyi, a second-year graphic design major, uses an RIT-developed VR app to learn how a famous cast iron printing press works.

19th-century printing press 
comes to life

RIT students and faculty are using a new technology to capture an old experience.

Their VR app simulates printing on a 19th-century cast iron hand press once owned by British designer William Morris.

The app, published earlier this year on the Steam gaming platform, gives virtual life to the Kelmscott/Goudy Albion printing press, a prized object in the Cary Graphic Arts Collection at RIT.

Professor Shaun Foster standing in a hallway.

Shaun Foster


Professor of 3D digital design

The cast iron hand press had noteworthy owners—including Morris, leader of the British Arts and Crafts movement; Frederic Goudy, American type designer and typographer; and Melbert B. Cary Jr., the RIT special collection’s namesake. Its storied past makes the press a destination for cultural heritage enthusiasts around the world, said Steven Galbraith, curator of the Cary Collection.

Galbraith sees a growing role for VR technology in libraries and museums.

“Simulating a rare artifact like the Kelmscott/Goudy Press helps expand teaching and research possibilities to more people,” said Galbraith. “It triggers imagination and curiosity.”

The student-led project grew from a collaboration between Galbraith and Shaun Foster ’01 MFA (3D computer animation), professor of 3D digital design.

“Creating novel uses for access to archival and historical objects for learning is an interest of mine,” Foster said.

Several RIT students contributed to the virtual printing press in its various stages of development.

Boyu Xu ’18 MS (media arts and technology) built the initial 3D model for the Cary Collection. Aidan Grant ’23 (3D digital design) built the VR experience. Grant gamified the hand press using Epic Games’ Unreal Engine, a 3D graphics software tool, for use with HTC Vive VR hardware.

“My job was basically to take the model, trim it down, because it was ultra-high definition, and make it interactive in VR to guide a user through it,” said Grant, who is now a VR software engineer at TRU Simulation, located near Tampa, Fla.

Epic Games featured Grant’s work in the 2023 Unreal Academic Partner Student Showcase. Another student, Hunter Ostrander ’24 (3D digital design), later added menus and sound effects.

The VR application includes haptic feedback in the controller to mimic sounds and tactile pressure of inking a roller or pulling a handle. The software also features diegetic interfaces as a new element within VR to enable pop-up credits and prompts, Foster said.

The software creates a digital twin of the Kelmscott/Goudy press that preserves the experience of printing on the machine.

Users can simulate the physical process of putting paper on the press, inking a brayer, and rolling it over the set type. Prompts within the app direct them to turn the “rounce” handle and pull the bar to make a printed impression.
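The guided sequence above can be thought of as a simple ordered state machine: the app prompts the next step and only advances when the user performs it. A minimal sketch, with step names paraphrasing the prompts described here (the real app's internals are not public):

```python
# Hypothetical sketch of the press simulation's guided step sequence.
# Step names are illustrative paraphrases of the prompts in the app.

PRESS_STEPS = [
    "place_paper",   # put a sheet of paper on the press
    "ink_brayer",    # ink the brayer
    "ink_type",      # roll the brayer over the set type
    "turn_rounce",   # turn the "rounce" handle
    "pull_bar",      # pull the bar to make the impression
]

class PressSimulation:
    def __init__(self):
        self.step = 0

    def perform(self, action):
        """Advance only when the user performs the expected next step."""
        if action != PRESS_STEPS[self.step]:
            return f"Prompt: next, {PRESS_STEPS[self.step]}"
        self.step += 1
        if self.step == len(PRESS_STEPS):
            self.step = 0
            return "Printed proof sheet complete"
        return f"Prompt: next, {PRESS_STEPS[self.step]}"
```

Keeping the steps ordered is what makes the digital twin instructive: a user cannot pull the bar before inking the type, just as on the physical press.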

The result is a virtually printed proof sheet from The Works of Geoffrey Chaucer, the most famous book published by Morris’ Kelmscott Press, Galbraith said. The work includes The Canterbury Tales.

“Books such as the ‘Kelmscott Chaucer,’ as it is more commonly called, influenced generations of fine-press printers who expanded the art of the book and aspired to the highest design standard,” said Galbraith.

Associate Professor Gabriel Diaz, left, and imaging science Ph.D. student Arianna Giguere, right, monitor a participant using a VR headset and steering wheel to study the effects of cortical blindness on the ability to drive.

Rehabilitating vision loss 
with virtual reality

In RIT’s Chester F. Carlson Center for Imaging Science, study participants use VR gear and a steering wheel to drive a simulated vehicle on a road that winds through fields and forests.

Associate Professor Gabriel Diaz is using VR to study the effects of cortical blindness on the processing of visual information used to guide behavior.

Cortical blindness is a condition caused by damage to the brain’s occipital cortex that often results in the loss of vision across half of the visual field. It affects hundreds of thousands of stroke patients each year.

Associate professor Gabriel Diaz standing in front of a checkered background.

Gabriel Diaz


Associate professor of imaging science

For those suffering from cortical blindness, navigating turns and reacting to objects along the roadway can be dangerous. The vision loss can also drastically impact independence.

Diaz has created the VR simulation to see how visually impaired individuals react while navigating a road. Studying how cortical blindness affects a real-world action like driving provides new insight to examine how the loss of vision impacts a person’s quality of life.

“Virtual reality gives us the ability to systematically manipulate variables in a controlled context,” explained Diaz. “It allows interactivity in a way that was never really possible before in a practical manner.”
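Systematic manipulation in a controlled context means every combination of scene parameters can become a repeatable trial. An illustrative sketch, with entirely hypothetical parameter names and values (the study's actual design is not described here):

```python
# Illustrative sketch: enumerating repeatable VR driving trials from a
# small parameter grid. Parameters and values are hypothetical.

from itertools import product

curvatures = [0.0, 0.01, 0.02]    # road curvature (1/m)
hazard_sides = ["left", "right"]  # side of the visual field where an object appears
speeds = [30, 50]                 # vehicle speed (mph)

trials = [
    {"curvature": c, "hazard_side": s, "speed": v}
    for c, s, v in product(curvatures, hazard_sides, speeds)
]
# 3 curvatures x 2 sides x 2 speeds = 12 trials, identical except for
# the variables under study
```

On a real road none of these variables can be held fixed, which is why this kind of controlled interactivity was impractical before VR.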

Diaz’s current work is funded by an award from the Research to Prevent Blindness organization and is in collaboration with Krystel Huxlin and Matthew Cavanaugh from the University of Rochester. Imaging science Ph.D. student Arianna Giguere is also part of the research team.

“VR is incredibly beneficial in studying behavior because we can look for fine details and changes of where people are looking or how they are looking, depending on how much vision loss there is and the unique characteristics of their vision loss,” said Giguere. “We would like to use what we learn to develop a training program to help rehabilitate their vision. Our approach is to try to address the root of the problem.”

The use of this technology opens up new ways for scientists to investigate problems, especially in research areas where real-world applications are necessary.

In the future, Diaz hopes to continue to use VR simulations to research other visual impairments, all with the goal of improving the quality of life for others.

“We’re trying to use this general approach to understand other issues with low vision and how they affect daily life,” said Diaz. “Long term, we want to develop ways to help reduce the effects that low vision has on the quality of life.”

Ph.D. researcher Sanzida Mojib Luna, left, observes a participant in her study of how people play AR mobile games.

Ph.D. student aims to make 
AR more accessible

Pokémon Go is popular. The augmented reality mobile game has been downloaded nearly 630 million times.

However, RIT computing and information sciences Ph.D. student Sanzida Mojib Luna thinks that number should be even higher.

Luna is studying how diverse user groups—including people who are deaf and hard of hearing—use AR mobile games. With those findings, she hopes to enhance the accessibility and inclusivity of AR experiences for all people.

Ph.D. student Sanzida Mojib Luna standing in a hallway.

Sanzida Mojib Luna


Ph.D. student in computing and information sciences

“It took mainstream TV more than four decades to include closed captioning, and we don’t want to make that mistake with AR,” said Luna. “We don’t want to eliminate people with any disability from this emerging technology. We’re building a future, why not build it for everyone?”

As an undergraduate in Bangladesh, Luna enjoyed working on accessibility projects, including an Internet of Things-enabled home security system for people who are blind.

When searching for graduate programs, she noticed work from Assistant Professor Konstantinos Papangelis, who runs an RIT lab focused on location-based games. They first connected through a Reddit post and found that their research interests aligned.

Now working in the Niantic x RIT Geo Games and Media Research Lab, Luna is researching how deaf and hard-of-hearing people communicate, collaborate, and coordinate in co-located collaborative multiplayer AR environments. She creates user studies to observe as participants play through different AR mobile games.

Her research has found that AR game designers should provide communication options across multiple modalities—including verbal, visual, and haptic. To improve the experience, designers can also empower deaf and hard-of-hearing users to customize presentations by including options for adjustable captions, movable floating icons, player-placed map beacons, and camera feeds of fellow participants’ faces and hands.
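One way to picture these recommendations is as a settings surface a game could expose. A minimal sketch, assuming hypothetical option names grouped by modality (not drawn from any shipping game):

```python
# Hypothetical sketch of accessibility options an AR game might expose,
# grouped by modality. All field names are illustrative.

from dataclasses import dataclass, field

@dataclass
class CaptionSettings:
    enabled: bool = True
    size: str = "medium"   # adjustable caption size
    movable: bool = True   # floating captions the player can reposition

@dataclass
class AccessibilityOptions:
    captions: CaptionSettings = field(default_factory=CaptionSettings)
    map_beacons: bool = True        # player-placed beacons on the shared map
    peer_camera_feeds: bool = True  # see teammates' faces and hands for signing
    haptic_alerts: bool = True      # vibration cues for game and safety events

    def active_modalities(self):
        """Report which communication channels are currently available."""
        mods = set()
        if self.captions.enabled or self.map_beacons or self.peer_camera_feeds:
            mods.add("visual")
        if self.haptic_alerts:
            mods.add("haptic")
        return mods
```

Modeling the options per modality makes it easy to check that at least one channel remains available to every player, whatever their hearing status.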

“We also noticed that deaf and hard-of-hearing participants were more concerned about the external physical environment than their hearing peers,” said Luna. “For example, they may rely on vibrations and haptic feedback from the game in order to stay safe while playing.”

With her findings, Luna hopes to build comprehensive accessibility guidelines that AR designers could apply to any AR application. In the future, the research could be expanded to create guidelines for users with low vision or with cognitive or motor impairments, making the technology more inclusive for all.