Win up to $250 Tiger Bucks!

The RIT Fram Chair offers awards for Excellence in Applied Critical Thinking (ACT). Winners will be announced prior to Imagine RIT 2018, and awards will be presented at the festival.

The deadline for 2018 submissions is Friday, April 13, 2018, at 12:00 pm. Complete your application using the Google Form.

We seek to recognize excellence in the APPLICATION OF CRITICAL THINKING used in the preparation and creation of an exhibit at Imagine RIT. This recognition is NOT for the exhibit itself, but for the thought processes used to arrive at or create the exhibit.

Application of critical thinking connects the performance chain of knowing-doing-creating. At RIT, ACT is engaged thinking used to effectively comprehend and analyze a context, develop a point of view, and implement a strategy to solve a complex problem or creatively manifest new ideas, both individually and collectively.

Eligibility, Rules & Limitations:

Each application must include at least one actively enrolled RIT student. Names of all RIT contributors (students, faculty, and staff) must be listed on the application, and the group type (small group or large group) must be specified.

All deadlines must be met to be considered, and the final exhibit must be presented at Imagine RIT.

Awarded groups will share $250 Tiger Bucks. This year the committee seeks to recognize excellence in the following categories:

Small group exhibit (1-5 participants)
Large group exhibit (6 or more participants)

Fram Award Winner's Circle

2017 Small Group

2017 Small Group Award (Exhibit BOO-1441): Ambio & Critical Thinking

Team: Julianne Burke, Josh Ladisic, Conner Hasbrouck, Jess Wiltey, Bennoni Thomas, Colton Woytas, Sudarshan Ashok, Vincent Lin

Synopsis: A team of five designers and three developers set out to create a digital platform that bridges distance and helps people reconnect with separated loved ones in a naturally intimate way, leveraging the latest wearable technology. While researching current technological innovations that could serve this goal, the team discovered a technology that infers a user's mood from biological data such as blood pressure, heart rate, breathing pattern, and vocal tone. To shape this technology into a delightful product experience, the team ran ideation sessions and developed a range of concepts through multiple rounds of sketching and iteration. Business research into which platforms would best serve user needs revealed that smartwatch products such as Apple Watch and Android Wear have rapidly growing markets and technological offerings that ease implementation. Leveraging the API (Application Program Interface) created by the development team, the designers iterated on different use cases and produced a quick prototype, which was refined through rounds of feedback and further iteration. Through additional analysis of the wearables market and mentorship from professors and alumni in Silicon Valley, the team identified a unique opportunity to create its own wearable product that lets users share their mood.

2017 Large Group

2017 Large Group Award (GOR-0600): Using Innovations in Technology to Combat Violence (Simulation & Behavioral Health: Meet Avatars/SimMan)

Faculty, Staff & Community Industry Mentors: Dr. Caroline J. Easton (CHST, Professor/Researcher and PI); Dr. Richard L Doolittle (CHST, Professor/Faculty Researcher); Meghan Lewis, Alli O’Malley, Nicole Trabold, Cassandra Berbary, Lindsay Chatmon, Brittany St. Jean, Joshua Aldred, Akshay Kumar Arun, Anthony Perez, Karie Carita, Jason Chung, Keli DiRisio

Synopsis: Violence is escalating and continues to contribute to the worldwide burden of disease. The negative consequences are devastating to families and to society as a whole. The health consequences are numerous and include trauma, substance misuse, depression, anxiety disorders, medical problems, and loss of work. The estimated cost to treat victims of violence and offenders is $5.8 billion each year. Compounding this problem is the absence of effective treatments for violence, especially for those who perpetrate it. Clinicians and policy makers look for ways to intervene and treat the aftereffects of violence. We believe that advances in science (e.g., targeted, evidence-based behavioral therapy strategies) can be delivered through interactive technological platforms, which can standardize the way behavioral health problems are screened and treated. Violence is contagious and often multi-generational, which requires intervening by all necessary means in ways that are easily disseminated, allowing access to evidence-based screening and treatment strategies.

2016 Small Group

2016 Small Group Award (Exhibit INS-1160): Robotic Eye Motion Simulator

Team: Amy Zeller, Joshua Long, Nathan Twichel, Peter Cho, Jordan Blandford

Synopsis: The objective of this project is to develop a robotic eye that mimics human eye movement, providing a standard for eye tracker testing, within a budget of $2,000. Our senior design team has applied critical thinking since day one; it has allowed us to evaluate ideas and make informed decisions about our project. One of the biggest challenges our team had to overcome was selecting a motor for the design that both our team and our customer agreed upon. An eye tracker is a device that tracks human eye movement and estimates gaze position. Eye trackers have long been used in psychology research, visual system research, and marketing, and, more recently, as an input device for human-computer interaction. The quality of the data eye trackers output is fundamental to any research based on eye tracking, yet there is currently no standardized test method for evaluating it. The lack of a standard may lead to research being based on unreliable data: different manufacturers measure quality using their own methods, and researchers either measure it again using different methods or simply report whatever numbers the manufacturer provides. A further goal of this project is to make the robotic eye affordable, which is necessary for eye tracker manufacturers and eye tracking researchers to adopt it as a practical standard. Therefore, our team set out to find a motor that cost less than $2,000, achieved a velocity of 8.73 rad/s, and offered a repeatability of 0.015 degrees.

2016 Large Group

2016 Large Group Award (Exhibit SUS-3260): Your Decisions Make Sustainability Possible!

Team: Jennifer Russell (Coordinator, Golisano Institute for Sustainability); Reema Aldossari, Yi Feng, Shih-Hsuan Huang, Michael Kelly, Nicolas Matthew Miclette, Wilson Sparberg Patton, Wenjing Qi, Kaining Qiu, Elizabeth Stegner, Jiahe Tian, Akanksha Vishwakarma, Hui-Yu Yang, Yue Zhang, Runhao Zhao (Industrial Design Graduate Students)

Synopsis: Our society is facing significant environmental and social challenges. Some of these must be tackled through government and industry initiative, but for many of them the most effective solutions can start right at home with the individual. Our increasing consumption of goods and services is putting significant pressure on our natural systems, as well as on our communities as they deal with growing waste and resource constraints. We believe that, although consumers may feel helpless to fix some of these problems, they in fact have the potential to be among the most powerful drivers of needed change.

We use this Imagine RIT exhibit to explore how consumers make choices about the products they consume, and how their behaviors and evaluations are affected by new information. Specifically, we seek to understand how the consumer evaluates a ‘greener’ product relative to a ‘normal’ product, and how the product characteristics of ‘cool’ and ‘innovative’ interplay in their choices.
