How Important is “Starting Small” in Language Acquisition – David Plaut (Carnegie Mellon University)

October 30, 2001

Elman (1993, Cognition) reported that recurrent connectionist networks could learn the structure of English-like artificial grammars by performing implicit word prediction, but that learning was successful only when “starting small” (e.g., starting with limited memory that only gradually improves). This finding provided critical computational support for Newport’s (1990, Cognitive Science) “less is more” account of critical period effects in language acquisition—that young children are aided rather than hindered by limited cognitive resources. The current talk presents connectionist simulations that indicate, to the contrary, that language learning by recurrent networks does not depend on starting small; in fact, such restrictions hinder acquisition as the languages are made more natural by introducing graded semantic constraints. Such networks can nonetheless exhibit apparent critical-period effects as a result of the entrenchment of representations learned in the service of performing other tasks, including other languages. Finally, although the word prediction task may appear unrelated to actual language processing, a preliminary large-scale simulation illustrates how performing implicit prediction during sentence comprehension can provide indirect training for sentence production. The results suggest that language learning may succeed in the absence of innate maturational constraints or explicit negative evidence by taking advantage of the indirect negative evidence that is available in performing online implicit prediction.
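The networks described above are simple recurrent ("Elman") networks trained on implicit next-word prediction: at each time step the hidden state combines the current word with a copy of the previous hidden state, and the network is trained to predict the upcoming word. The sketch below illustrates the idea on a hypothetical toy grammar (the vocabulary, network size, and training details are illustrative assumptions, not Elman's or Plaut's actual setup), using one-step weight updates rather than full backpropagation through time as a simplification:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy grammar for illustration: sentences of the form
# NOUN VERB NOUN "." over a five-word vocabulary.
vocab = ["boy", "girl", "sees", "chases", "."]
V = len(vocab)
idx = {w: i for i, w in enumerate(vocab)}

def sample_sentence():
    return [rng.choice(["boy", "girl"]), rng.choice(["sees", "chases"]),
            rng.choice(["boy", "girl"]), "."]

H = 16                              # hidden/context units (arbitrary choice)
Wxh = rng.normal(0, 0.1, (H, V))    # input -> hidden weights
Whh = rng.normal(0, 0.1, (H, H))    # context (previous hidden) -> hidden
Why = rng.normal(0, 0.1, (V, H))    # hidden -> output (next-word prediction)
lr = 0.1

def one_hot(i):
    v = np.zeros(V)
    v[i] = 1.0
    return v

def avg_loss(n=200):
    """Mean cross-entropy of next-word predictions over n sampled sentences."""
    total, count = 0.0, 0
    for _ in range(n):
        words = [idx[w] for w in sample_sentence()]
        h = np.zeros(H)
        for t in range(len(words) - 1):
            h = np.tanh(Wxh @ one_hot(words[t]) + Whh @ h)
            p = np.exp(Why @ h)
            p /= p.sum()
            total += -np.log(p[words[t + 1]])
            count += 1
    return total / count

before = avg_loss()
for _ in range(2000):
    words = [idx[w] for w in sample_sentence()]
    h = np.zeros(H)
    for t in range(len(words) - 1):
        x = one_hot(words[t])
        h_prev = h.copy()
        h = np.tanh(Wxh @ x + Whh @ h_prev)       # hidden state carries context
        p = np.exp(Why @ h)
        p /= p.sum()                              # softmax over next word
        # Cross-entropy gradient on the next-word prediction; weights are
        # updated with one-step backprop only (no BPTT), a simplification.
        dy = p - one_hot(words[t + 1])
        dh = (Why.T @ dy) * (1 - h ** 2)
        Why -= lr * np.outer(dy, h)
        Wxh -= lr * np.outer(dh, x)
        Whh -= lr * np.outer(dh, h_prev)
after = avg_loss()
print(f"mean cross-entropy before: {before:.3f}, after: {after:.3f}")
```

Because the target at each step is simply the next word in the corpus, prediction error serves as the "indirect negative evidence" mentioned above: no explicit correction is needed, since every continuation the grammar does not produce is penalized implicitly.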

Dr. Plaut is an Associate Professor of Psychology and Computer Science at Carnegie Mellon University, with a joint appointment in the Center for the Neural Basis of Cognition. His research uses connectionist modeling, complemented by empirical studies, to extend our understanding of normal and impaired cognitive processing in the domains of reading, language, and semantics. For more details, see his curriculum vitae.

Center for Language and Speech Processing