Regular colloquia are Wednesdays, 2:00–4:00 p.m., in 280 Park Hall (unless otherwise noted), North Campus, and are open to the public. To receive email announcements of each event, please subscribe to one of our mailing lists by clicking the link that best describes you: student, UB Faculty and Staff, or Non-UB Cognitive Scientist. You can also subscribe to our calendar.
Background readings for each lecture are available to UB faculty and students on UB Learns. To access them, please log in to UB Learns and select "Center for Cognitive Science" → "Course Documents" → "Background Readings for (Semester/Year)." If you are affiliated with UB and do not have access to the UB Learns website, please contact Eduardo Mercado III, director of the Center for Cognitive Science.
September 6, 2 p.m.
Speaker: Allison Fitch
Assistant Professor, Department of Psychology, Rochester Institute of Technology
Early word learning involves forming connections between input (the words) and the objects and events to which the words refer. In spoken language, this typically occurs through a multimodal process of auditory linguistic input and visual information about objects and events. In contrast, deaf children learning American Sign Language (ASL) perceive both linguistic and non-linguistic information in a single modality: vision. Thus, in order to perceive language, deaf children must learn to look in the right place (e.g., at their interaction partners) at the right time (i.e., when language input is directed toward them). In this presentation, I will discuss my recent observational and experimental work describing attentional dynamics in young deaf children acquiring ASL. I will provide evidence that mainstream theories of joint attention, sustained attention, and gaze-shifting do not account for the demands of signed interactions, and that deaf children are well equipped to manage these demands from an early age. Finally, I will argue that studying language development in the context of ASL and other signed languages can provide unique insight into the language acquisition machinery that is made opaque in spoken language development.
September 27, 2 p.m.
Speaker: Christopher Hoadley
Professor, Department of Learning and Instruction, UB, SUNY
In this talk, I describe the winding path of a research project on bringing multilingual learners into computer science. In 2015, the New York City Department of Education undertook a ten-year program to bring CS education to all learners citywide. A multidisciplinary team of experts in bilingual education, applied linguistics, computer science, learning sciences, and design-based research convened to study how multilingual learners could be supported in this ambitious project. This talk describes some of the key empirical insights derived from the project, and how those insights led to explorations of educational change far from the original research aims. I relate this trajectory to how research and design in the learning sciences may support responses to pressing social problems more generally.
October 18, 2 p.m.
Speaker: Jon Rawski
Assistant Professor, Department of Linguistics, San Jose State University/Massachusetts Institute of Technology
October 25, 2 p.m.
Speaker: Lewis Powell
Associate Professor, Department of Philosophy, UB, SUNY
November 8, 2 p.m.
Speaker: Robert Hawkins
Assistant Professor, Department of Psychology, University of Wisconsin–Madison
December 8, 2 p.m.