From schuler.77 at osu.edu Mon Aug 8 15:53:55 2022
From: schuler.77 at osu.edu (Schuler, William)
Date: Mon, 8 Aug 2022 19:53:55 +0000
Subject: [CaCL] FW: Seminar announcement and RA opportunity
In-Reply-To:
References:
Message-ID:

Hi all,

You may be interested in Dan Parker's new seminar in psycholinguistics, below (he is our new faculty member in psycholinguistics)!

wm

From: Ling-Faculty on behalf of Parker, Dan via Lingosu via Ling-Faculty
Date: Monday, August 8, 2022 at 2:55 PM
To: lingosu at ling.osu.edu
Subject: [Ling-Faculty] [Lingosu] Seminar announcement and RA opportunity

Hi Everyone,

I'm recirculating the announcement that I will be offering a graduate seminar in psycholinguistics for the upcoming autumn semester (LING 8700-10, TR 11:10am-12:30pm, Oxley 122), and I'd like to encourage those interested to enroll. Please see below for the course description. The course is highly interdisciplinary, drawing on work across the language sciences and cognitive (neuro)science, and assumes no specific background in psycholinguistics (including experimentation or statistics).

I'd also like to announce that I'll be hiring a graduate RA in spring 2023 to work on my NSF grant, which focuses on exactly the topic of the course, so the seminar is a great opportunity to learn more!

Feel free to email me (parker.1758 at osu.edu) if you have any questions!

Thanks,
Dan
______________________
Dan Parker
Associate Professor
Department of Linguistics
The Ohio State University

Course description

This seminar will focus on so-called "linguistic illusions," which arise when people fail to accurately apply grammatical constraints during real-time sentence processing, leading to systematic misinterpretations. Linguistic illusions have played an important role in our understanding of how we mentally build and manipulate linguistic representations, but there is still a lot that we do not know. In this seminar, we will examine leading accounts of the source and scope of linguistic illusions, drawing on data from diverse languages and linguistic phenomena.

From schuler.77 at osu.edu Wed Aug 24 14:42:54 2022
From: schuler.77 at osu.edu (Schuler, William)
Date: Wed, 24 Aug 2022 18:42:54 +0000
Subject: [CaCL] Cognitive and Computational Approaches to Language (CaCL) Discussion Group: Thursdays 12:45 in Oxley 122
Message-ID:

Hello all,

The reading group on Cognitive and Computational Approaches to Language (CaCL) will meet Thursdays at 12:45 starting this week in Oxley 122 (in person). We can also accommodate online participation through Zoom; please email me if you would prefer to participate online.

Topics include: broad-coverage computational models of sentence processing in human memory, computational models of human memory, statistical modeling of human linguistic performance data, neural sentence processing models, Bayesian and neural induction models, experimental techniques in neurolinguistics and brain imaging, and much more.

The first session will be an organizational meeting, during which we will vote on papers to discuss (feel free to suggest any papers you'd like to discuss!). We will read papers and give software tutorials and practice talks all semester. Please join us!
Also, if you are a student this semester and you are planning to attend, please enroll for 1-3 credits: CaCL is listed as "LING 7890.12". You can also sign up for the CaCL mailing list at https://lists.osu.edu/mailman/listinfo/cacl, which continues to serve the regular CaCL reading group throughout the academic year.

Hope to see you there!
William

From oh.531 at buckeyemail.osu.edu Thu Aug 25 13:29:40 2022
From: oh.531 at buckeyemail.osu.edu (Oh, Byung-Doh)
Date: Thu, 25 Aug 2022 17:29:40 +0000
Subject: [CaCL] Cognitive and Computational Approaches to Language (CaCL) Discussion Group: Thursdays 12:45 in Oxley 122
In-Reply-To:
References:
Message-ID:

CaCL paper selection form: https://docs.google.com/spreadsheets/d/1_GQV39sobbik4kKS5tIuCeEuRGRLcGvfgxq4rJt_tOU/edit#gid=0

[Link preview: CaCL paper selection form, Sheet1. Nominated so far: Liang et al., ICLR 2021, "Can a Fruit Fly Learn Word Embeddings?" (learning of binary vectors, inspired by fruit fly brains), https://openreview.net/pd...]

=================
Byung-Doh Oh (he/him/his)
Ph.D. Student
Department of Linguistics
The Ohio State University

From cheung.179 at buckeyemail.osu.edu Tue Aug 30 10:15:11 2022
From: cheung.179 at buckeyemail.osu.edu (Cheung, Willy)
Date: Tue, 30 Aug 2022 14:15:11 +0000
Subject: [CaCL] paper for Thursday 9/1
Message-ID:

Hi CaCLers,

Sorry for the late posting of the paper - this Thursday we will discuss Schuster and Linzen 2022.

Title: When a sentence does not introduce a discourse entity, Transformer-based models still sometimes refer to it.
Paper link: https://arxiv.org/pdf/2205.03472.pdf (arXiv:2205.03472v1 [cs.CL], 6 May 2022)

Abstract: Understanding longer narratives or participating in conversations requires tracking of discourse entities that have been mentioned. Indefinite noun phrases (NPs), such as "a dog", frequently introduce discourse entities, but this behavior is modulated by sentential operators such as negation. For example, "a dog" in "Arthur doesn't own a dog" does not introduce a discourse entity due to the presence of negation. In this work, we adapt the psycholinguistic assessment of language models paradigm to higher-level linguistic phenomena and introduce an English evaluation suite that targets the knowledge of the interactions between sentential operators and indefinite NPs. We use this evaluation suite for a fine-grained investigation of the entity tracking abilities of the Transformer-based models GPT-2 and GPT-3. We find that while the models are to a certain extent sensitive to the interactions we investigate, they are all challenged by the presence of multiple NPs, and their behavior is not systematic, which suggests that even models at the scale of GPT-3 do not fully acquire basic entity tracking abilities.
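
The evaluation paradigm described in the abstract amounts to comparing how strongly a language model licenses a continuation that refers back to an indefinite NP under different sentential operators. For discussion on Thursday, here is a minimal sketch of that idea, assuming the Hugging Face transformers library and off-the-shelf GPT-2; the example items, model choice, and scoring function are illustrative stand-ins, not the paper's actual evaluation suite or released code.

# Minimal sketch of the evaluation idea (illustrative, not the paper's code):
# compare how probable GPT-2 finds a continuation that refers back to an
# indefinite NP when the introducing sentence is affirmative vs. negated.
# Assumes the Hugging Face `transformers` library is installed.

import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def continuation_logprob(context: str, continuation: str) -> float:
    """Sum of log-probabilities of the continuation's tokens given the context."""
    # Assumes the context's tokenization is a prefix of the concatenation's tokenization.
    ctx_len = tokenizer(context, return_tensors="pt").input_ids.size(1)
    full_ids = tokenizer(context + continuation, return_tensors="pt").input_ids
    with torch.no_grad():
        log_probs = torch.log_softmax(model(full_ids).logits, dim=-1)
    total = 0.0
    # Each continuation token is scored from the logits at the preceding position.
    for i in range(ctx_len, full_ids.size(1)):
        total += log_probs[0, i - 1, full_ids[0, i]].item()
    return total

# Hypothetical minimal pair: a model that tracks discourse entities should find
# the referential continuation less likely after the negated context.
continuation = " The dog is very friendly."
print("affirmative:", continuation_logprob("Arthur owns a dog.", continuation))
print("negated:    ", continuation_logprob("Arthur doesn't own a dog.", continuation))

The paper's suite goes well beyond this single contrast, covering additional operators (e.g., modals such as "want") and items with multiple NPs, which is where the abstract reports the models' behavior becomes unsystematic.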