The RIT Fram Chair will offer awards for Excellence in Applied Critical Thinking (ACT) for exhibits at Imagine RIT.
Critical thinking analyzes information to produce reliable knowledge. It can involve hypothesis testing and the application of methods that support the evaluation and creation of new ideas, products, or views. Critical thinking also seeks out weaknesses in thinking, such as insufficient inquiry, ambiguity, unexamined assumptions, biases, and subjectivity. At RIT, ACT is an active form of engagement, drawing on our diverse domains and deep expertise to address the questions and challenges that we face.
Small Group Award (Exhibit BOO-1441): Ambio & Critical Thinking
Team: Julianne Burke, Josh Ladisic, Conner Hasbrouck, Jess Wiltey, Bennoni Thomas, Colton Woytas, Sudarshan Ashok, Vincent Lin
Synopsis: Our team of five designers and three developers set out to create a digital platform that bridges distance, helping people reconnect with separated loved ones in a naturally intimate way by leveraging the latest wearable technology. While researching current technological innovations that could serve our goals, we discovered a technology that determines a user's mood from biological data such as blood pressure, heart rate, breathing pattern, and vocal tone. To explore how this technology could be shaped into a delightful product experience that solves our problem, we conducted ideation sessions and developed various concepts through multiple rounds of sketching and iteration. Business research into which platforms would best serve our users' needs revealed trends showing that smartwatch products such as the Apple Watch and Android Wear have steadily growing market sizes and technological offerings that make implementing our solution straightforward. Leveraging the API (Application Program Interface) created by the development team, we iterated on different use cases and produced a quick prototype, which then went through further rounds of feedback and iteration. Through additional analysis of the wearables market and advice from professors and alumni mentors from Silicon Valley, we recognized a unique opportunity to create our own wearable product that lets users share their mood.
Large Group Award (Exhibit GOR-0600): Using Innovations in Technology to Combat Violence (Simulation & Behavioral Health: Meet Avatars/SimMan)
Faculty, Staff & Community Industry Mentors: Dr. Caroline J. Easton (CHST, Professor/Researcher and PI); Dr. Richard L. Doolittle (CHST, Professor/Faculty Researcher); Meghan Lewis, Alli O’Malley, Nicole Trabold, Cassandra Berbary, Lindsay Chatmon, Brittany St. Jean, Joshua Aldred, Akshay Kumar Arun, Anthony Perez, Karie Carita, Jason Chung, Keli DiRisio
Synopsis: Violence is escalating and continues to contribute to the worldwide burden of disease. The negative consequences are devastating to families and to society as a whole. The health consequences are numerous and include trauma, substance misuse, depression, anxiety disorders, medical problems, and loss of work. The estimated cost of treating victims of violence and offenders is $5.8 billion each year. Compounding this problem is the absence of effective treatments for violence, especially for those who commit it. Clinicians and policy makers look for ways to intervene and treat the aftereffects of violence. We believe that advances in science (e.g., targeted, evidence-based behavioral therapy strategies) can be delivered through interactive technological platforms, which can standardize the way behavioral health problems are screened and treated. Violence is contagious and often multi-generational, which requires intervening by all necessary means, in ways that are easily disseminated and that allow access to evidence-based screening and treatment strategies.
2016 Small Group
Small Group Award (Exhibit INS-1160): Robotic Eye Motion Simulator
Team: Amy Zeller, Joshua Long, Nathan Twichel, Peter Cho, Jordan Blandford
Synopsis: The objective of this project is to develop a robotic eye that mimics human eye movement, providing a standard for eye tracker testing, within a budget of $2,000. Our senior design team has used critical thinking since day one; it has allowed us to evaluate ideas and make informed decisions about our project. One of the biggest challenges our team had to overcome was selecting a motor for our design that both our team and our customer agreed upon. An eye tracker is a device that tracks human eye movement and estimates gaze position. Eye trackers have long been used in psychology research, visual system research, and marketing, and, more recently, as an input device for human-computer interaction. The quality of the data eye trackers output is fundamental to any research based on eye tracking, yet there is currently no standardized test method for evaluating that quality. The lack of a standard may lead to research being based on unreliable data: different manufacturers measure quality using their own methods, and researchers either measure it again using different methods or simply report whatever numbers the manufacturer provides. A further goal of this project is to make the robotic eye affordable, which is necessary for eye tracker manufacturers and eye tracking researchers to adopt it as a practical standard. Therefore, our team set out to find a motor that cost less than $2,000, achieved a velocity of 8.73 rad/s, and had a repeatability of 0.015 degrees.
2016 Large Group
Large Group Award (Exhibit SUS-3260): Your Decisions Make Sustainability Possible!
Team: Jennifer Russell (Coordinator, Golisano Institute for Sustainability); Reema Aldossari, Yi Feng, Shih-Hsuan Huang, Michael Kelly, Nicolas Matthew Miclette, Wilson Sparberg Patton, Wenjing Qi, Kaining Qiu, Elizabeth Stegner, Jiahe Tian, Akanksha Vishwakarma, Hui-Yu Yang, Yue Zhang, Runhao Zhao (Industrial Design Graduate Students)
Synopsis: Our society is facing some significant environmental and social challenges; some of these must be tackled through government and industry initiative, but for many of those challenges the most effective solutions can start right at home with the individual. Our increasing consumption of goods and services is putting significant pressure on our natural systems, as well as on our communities as they deal with increasing waste and resource constraints. We believe that, although consumers may feel helpless to fix some of these problems, in fact they have the potential to be among the most powerful drivers of needed change.
We use this Imagine RIT exhibit to explore how consumers make choices about the products they consume, and how their behaviors and evaluations are affected by new information. Specifically, we seek to understand how the consumer evaluates a ‘greener’ product relative to a ‘normal’ product, and how the product characteristics of ‘cool’ and ‘innovative’ interplay in their choices.
The Fram Advisory Board (FAB) or designated committee will select an exhibit to be recognized for demonstrating Applied Critical Thinking:
Submissions are due Friday, April 13, 2018 at NOON to RITFramChair@rit.edu.
Since Applied Critical Thinking at RIT is an active process, awards will be based on the applicant’s ability to show, in their response, evidence of the quality of the critical thinking process used to arrive at the exhibit’s final outcome.
Responses will be evaluated on the degree to which they demonstrate the criteria below. Because applied critical thinking can take various forms, not all criteria must be demonstrated.
Applied: Explain how you considered an issue, solved a problem, or arrived at or delivered a solution, project, etc. Assessment will consider the degree to which the application is clearly defined, with attention to addressing the root issue at hand.
Critical: The critical review of data and information, resolving weaknesses in thinking through:
Thinking: The measured evaluation and possible resolution through: