From schuler.77 at osu.edu  Sat Apr  1 16:58:34 2023
From: schuler.77 at osu.edu (Schuler, William)
Date: Sat, 1 Apr 2023 20:58:34 +0000
Subject: [CaCL] CaCL reading for this week
Message-ID: 

Hi all,

Let's read this, on HPSG:

http://www.sfs.uni-tuebingen.de/~dm/papers/ell2-hpsg.pdf

wm

From oh.531 at buckeyemail.osu.edu  Thu Apr  6 14:19:41 2023
From: oh.531 at buckeyemail.osu.edu (Oh, Byung-Doh)
Date: Thu, 6 Apr 2023 18:19:41 +0000
Subject: [CaCL] CaCL 4/13: Modern language models refute Chomsky's approach to language
Message-ID: 

Hi everyone,

Next week, we'll discuss the following paper:

Modern language models refute Chomsky's approach to language
https://ling.auf.net/lingbuzz/007180

The rise and success of large language models undermines virtually every strong claim for the innateness of language that has been proposed by generative linguistics. Modern machine learning has subverted and bypassed the entire theoretical framework of Chomsky's approach, including its core claims to particular insights, principles, structures, and processes. I describe the sense in which modern language models implement genuine theories of language, including representations of syntactic and semantic structure. I highlight the relationship between contemporary models and prior approaches in linguistics, namely those based on gradient computations and memorized constructions. I also respond to several critiques of large language models, including claims that they can't answer "why" questions, and skepticism that they are informative about real-life acquisition. Most notably, large language models have attained remarkable success at discovering grammar without using any of the methods that some in linguistics insisted were necessary for a science of language to progress.

Best,
Byung-Doh

=================
Byung-Doh Oh (he/him/his)
Ph.D. Student
Department of Linguistics
The Ohio State University

From clark.3664 at buckeyemail.osu.edu  Thu Apr 13 14:22:05 2023
From: clark.3664 at buckeyemail.osu.edu (Clark, Christian)
Date: Thu, 13 Apr 2023 18:22:05 +0000
Subject: [CaCL] CaCL reading for 4/20
Message-ID: 

Hi CaCL members,

Our reading for next Thursday will be "A resource-rational model of human processing of recursive linguistic structure" by Hahn et al. (2022).

Link: https://www.pnas.org/doi/10.1073/pnas.2122602119

Abstract: A major goal of psycholinguistic theory is to account for the cognitive constraints limiting the speed and ease of language comprehension and production. Wide-ranging evidence demonstrates a key role for linguistic expectations: A word's predictability, as measured by the information-theoretic quantity of surprisal, is a major determinant of processing difficulty. But surprisal, under standard theories, fails to predict the difficulty profile of an important class of linguistic patterns: the nested hierarchical structures made possible by recursion in human language. These nested structures are better accounted for by psycholinguistic theories of constrained working memory capacity. However, progress on a theory unifying expectation-based and memory-based accounts has been limited.

Here we present a unified theory of a rational trade-off between precision of memory representations and ease of prediction, a scaled-up computational implementation using contemporary machine learning methods, and experimental evidence in support of the theory's distinctive predictions. We show that the theory makes nuanced and distinctive predictions for difficulty patterns in nested recursive structures that neither expectation-based nor memory-based theories alone can account for. These predictions are confirmed 1) in two language comprehension experiments in English, and 2) in sentence completions in English, Spanish, and German. More generally, our framework offers a computationally explicit theory and methods for understanding how memory constraints and prediction interact in human language comprehension and production.
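If you want to play with the surprisal measure the abstract mentions before we meet, below is a minimal sketch (not the authors' code; GPT-2 and the Hugging Face transformers API are just one convenient stand-in). It estimates per-token surprisal, -log2 P(token | preceding context), for a doubly center-embedded sentence of the kind the paper studies:

    # Unofficial sketch: per-token surprisal, -log2 P(token | preceding context),
    # estimated with pretrained GPT-2 (an illustrative model choice, not the
    # paper's). The test sentence is doubly center-embedded.
    import torch
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    sentence = "The report that the senator who the journalist interviewed filed was leaked."
    ids = tokenizer(sentence, return_tensors="pt").input_ids  # shape: (1, seq_len)

    with torch.no_grad():
        logits = model(ids).logits  # shape: (1, seq_len, vocab_size)

    # Log2-probability of each token given its left context (positions shift by
    # one); the first token is skipped since it has no preceding context.
    log2_probs = torch.log_softmax(logits[0, :-1], dim=-1) / torch.log(torch.tensor(2.0))
    targets = ids[0, 1:]
    surprisals = -log2_probs[torch.arange(len(targets)), targets]

    for token, s in zip(tokenizer.convert_ids_to_tokens(targets.tolist()), surprisals):
        print(f"{token!r}: {s.item():.2f} bits")

Per-word surprisals are then obtained by summing over the subword tokens that make up each word.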
----
Christian Clark
Ph.D. Student
Department of Linguistics
The Ohio State University