Cog. Sci. Connections, From sounds to signs: What drives deaf signers’ skilled and efficient reading behaviors?
Speaker: Frances Cooley, Ph.D.
Title: From sounds to signs: What drives deaf signers’ skilled and efficient reading behaviors?
Short Bio: Frances Cooley graduated from the University of Rochester in 2013 with a double major in Brain and Cognitive Science and American Sign Language, then spent two years at Boston Children’s Hospital conducting eye-tracking and EEG studies with children with developmental disabilities. She completed her PhD in Linguistics at UT Austin in 2021 under David Quinto-Pozos, investigating the eye movements of deaf child signers during reading. During her time at UT, she was a National Academy of Education/Spencer Foundation Dissertation Fellow. After completing her PhD, she worked for two years as a postdoctoral researcher at the University of South Florida under Liz Schotter on a nationwide eye-tracking study assessing the individual differences in deaf and hearing readers that best predict their visual and perceptual reading spans. Since arriving at RIT/NTID in January 2024, Frances has published several first- and middle-authored manuscripts, presented at several international conferences, and recently received an NIH R21 Exploratory Research Grant to use eye-tracking and behavioral testing to assess the role of first- and second-language vocabulary knowledge during reading.
Abstract: Reading is tricky business regardless of your hearing status. However, because reading instruction relies widely on phonics and teaches children to exploit sound-to-letter correspondences, reading is particularly tricky for deaf individuals who do not have auditory access to speech sounds. Despite this, profoundly deaf children who have early and robust access to a first language in the visual modality (e.g., American Sign Language; ASL) develop into highly efficient and skilled readers of the written form of the ambient spoken language (e.g., English) as a second language, and their eye movements suggest that this efficiency may be due in part to reduced activation of speech sounds when reading. In this talk, I will describe the small literature on the eye movements of deaf child signers while reading, including my own dissertation study (Cooley & Quinto-Pozos, 2023), which tested the degree to which deaf children who were native signers of ASL activate speech sounds when reading. These results suggest that deaf children do not activate speech sounds to the same extent as age- and reading-matched hearing children, but one fundamental question remains: what mechanism do they rely on to activate word meaning in their second language? Here, I will pass the mic to Dr. Agnes Villwock, who will explain how we can use EEG to test those neural mechanisms and begin to unpack what the eye movements cannot. After Dr. Villwock’s presentation, we will reconvene to propose a study in which we co-register the eye movements and event-related potentials of deaf readers to assess these mechanisms.
ASL-English interpreters have been requested. Light refreshments will be provided.
Event Snapshot
Who: Open to the Public
Interpreter Requested? Yes