
Adobe VoCo: Photoshop for Audio

This review of Adobe VoCo is written by IGM Lecturer Cody Van De Mark.


Recently, Adobe announced an experimental technology called #VoCo, billed as a “Photoshop for audio” [link]. The idea behind #VoCo is that it transcribes the dialogue in an audio clip as text that can then be edited; editing the text regenerates the audio so that the speaker’s voice says the newly edited phrase. At first it sounded too good to be true, but Adobe’s engineers have never ceased to amaze. In the live demo of #VoCo, Adobe edited the text and showed that the audio clip was updated to say the new phrase. Astonishingly, it sounded just like the original speaker’s voice. Even after multiple changes to the clip, the audio still sounded authentic, with the speaker saying entirely new phrases never heard in the original recording.

For the games and media industry, this is incredibly exciting news. Numerous games and media experiences use voice recording [link], and in many products it’s a crucial part of the experience. Currently, any required change to recorded audio must be re-recorded with the original voice actor. Such changes may stem from script alterations, poorly recorded audio, background noise, misspoken words, stuttering, or other speech disfluencies. Adobe’s technology may ease that work by letting editors correct phrases without the original voice actor or a new recording session. In mere seconds, an editor could remove disfluencies and change spoken phrases.

We don’t know the current limitations of the #VoCo technology, but there is exciting potential here. Though it is probably beyond today’s technology, it is wonderful to imagine a world where we could create compelling procedural dialogue audio in games. Many games use procedural dialogue to create randomized quests and player goals that keep the player engaged. If the technology advances to the point that we can procedurally generate matching, compelling voice audio, it could add a great deal of immersion to games.

It is already known that voice acting adds an extra layer of immersion to a game. Traditionally, this has not worked well with procedural dialogue, because it would require a voice actor to record every possible combination of lines. Since procedural dialogue can create millions of combinations, it is nigh impossible to record audio clips for every phrase. Recording that many clips would also mean storing millions of additional files, drastically increasing a game’s installation size. Procedural voice generation could eventually solve that problem and create better player experiences. For the time being, we can be excited that Adobe has pushed the technology further and eased the difficulty of voice recording. Hopefully this is a testament to the future of audio manipulation.



Alumni Spotlight on Tom Conroy and Joe Pietruch

In this IGM spotlight we’re highlighting Tom Conroy (New Media Interactive Development ’14) and Joe Pietruch (New Media Design ’08, Game Design & Development ’10, and former IGM instructor) from Forbes Media, LLC.

Current Job Title:

Tom: I'm a front-end web developer for the Design/UX team at Forbes Media, a subsection of the larger Product Team.

Joe: I am a Senior Front-End Developer, Mobile Focused for Forbes Media, LLC.


ImagineCup 2016

IGM hosted the ImagineCup from Friday, November 11th through Sunday, November 13th. Teams of students spent 36 hours (we made them get sleep from 1am to 7am!) building games, solving problems, or creating cool applications. The event was open to students from across RIT. Students had a great time, and the winners were:

Overall Winner: Lyra — Frankie DiPietro, Dave Erbelding, Jaber McCormack, Dillon Guscott
Best 3D Game Play: Animal Stackers — Beau Marwaha, James Troup, Emily Haldeman
Funnest Game: Super Button Masher Extreme Turbo — Norman Greenberg, John Park, Joe Scotchmer, Sam Sternklar
Best Puzzle: ColorCoded — John Palermo, Kenneth Probeck, Robert Bailey
Most Clever Game: Giant Understaffed Mech Warrior — John Palsipher
Best Co-op: Slime Spree — Charles Williams, Joel Shuart, Kevin Idzik, Josh Malmquist, Satcha Puri
Best 2D Platform: Frontier — Logan Guidry, Brandon Valenzuela, Julian Januszka
Greatest Potential: Blind Sight — Nick Wilk

Congratulations to all the winners, and we look forward to seeing what you create next year!


IGM Faculty Shine at the Fringe Festival

Several IGM faculty members participated in the fifth annual Rochester Fringe Festival, held September 15-24, 2016. According to the Rochester Fringe Festival website, the celebration is a “10-day, all-out, no-holds-barred, multi-disciplinary visual and performing arts festival featuring international, national, and local artists.” All entertainment media are welcome, including theater, comedy, music, dance, and more!

ANOMALY with IGM Associate Professor W. Michelle Harris

The trio of BIODANCE, Sound ExChange, and IGM Associate Professor W. Michelle Harris wowed sold-out crowds at the 2013 Fringe Festival, and this year was no different! Playing again to sold-out crowds in the four-story dome of the Strasenburgh Planetarium, ANOMALY combined dance, music, and live cinematic effects into a beautiful, immersive experience. You can read the Rochester City Paper’s review of ANOMALY here:

Algorave Lite: Live Coding with Algorithmic Dance Music with IGM Assistant Professor Charlie Roberts

Charlie performs in an experimental performance genre called “live coding.” In a live coding performance, artists program algorithmic music and visuals in real time while projecting their code for audience members to follow along with. At events called “Algoraves,” live coders perform one after another, creating generative beats and dance music. During the Fringe Festival, Charlie did the same, throwing in some audio-reactive visuals using Gibber, a browser-based creative coding platform he develops. He also performed using a new system, gibberwocky, which adds live-coding capabilities to Ableton Live, a popular commercial audio application. Charlie created gibberwocky in cooperation with Graham Wakefield of York University, and they will present a research paper on it next month at the 2016 International Live Coding Conference.

Resonant Freqs: Surveying the Spectrum with IGM Associate Professor Jay Alan Jackson

Bandmates Jay Alan Jackson, Babak Elahi (Associate Dean and Professor in the College of Liberal Arts at RIT), and Adam Wilcox brought their band Resonant Freqs to the Little Theatre during the Rochester Fringe. This interactive, multimedia experience invites “passengers” to participate in a performance aboard Riff Raft, a make-believe showboat cruising the mysterious Caribbean Trapezoid. Audience members engage in interactive routines, including singing, dancing, and playing rhythmic patterns on percussion instruments made from recycled plastic bottles, tubes, and cardboard. In this immersive sci-fi musical comedy, imagination is key! The production also included motion graphics from IGM Lecturer Sten McKinzie. Follow Resonant Freqs on Facebook here:

Gen Jam with IGM Professor Al Biles

Al Biles performed three sets with GenJam at the Little Theater Café during the Fringe Festival. Each set had a different theme: Hard Bop tunes from the 1950s and ’60s on the first Friday, movie music on that Saturday, and an all-Latin set on the last Friday of the festival. Crowds were good, and a couple of his former students stopped in to listen and touch base. For the uninitiated, GenJam is a real-time interactive improvisation system that Al uses to play straight-up jazz gigs, including ear candy for all the RIT open house recruiting events.


Professor Decker Secures NSF Grant

Congratulations to Assistant Professor Adrienne Decker! She is the Principal Investigator (PI) of a five-year, $965,000 National Science Foundation Improving Undergraduate STEM Education (NSF IUSE) grant titled “Collaborative Research: Establishing and Propagating a Model for Evaluating the Long Term Impact of College Computing Activities.”


