Tue, Jul 19, 2022 @ 03:30 PM - 05:30 PM
Conferences, Lectures, & Seminars
Speaker: Yury Zemlyanskiy, PhD Candidate
Talk Title: Parametric and semi-parametric methods for knowledge acquisition from text
Abstract: Knowledge acquisition, the process of extracting, processing, and storing new information, is critical to any intelligent system. Nonetheless, modern neural networks used in natural language processing (e.g., BERT) typically do not have an explicit memory component; instead, the knowledge about the world that the models acquire is stored implicitly in the models' parameters. This proves unreliable and makes the models ill-suited for knowledge-intensive tasks that require reasoning over vast amounts of textual data. My thesis explores alternative parametric and semi-parametric methods to extract and represent knowledge from text. The main hypothesis is that we can improve the performance of modern NLP models by representing acquired knowledge in a dedicated memory, which the models can access explicitly. The thesis consists of three parts: the first focuses on parametric memory for a pre-defined set of entities; the second explores a semi-parametric approach to representing entity-centric knowledge in a long document or an entire corpus; and the last discusses memory for semantic parsing.
Committee: Leana Golubchik, Fei Sha, Robin Jia, and Meisam Razaviyayn (external).
Host: CSCI Department
More Info: https://usc.zoom.us/j/5525470171
Audiences: Everyone Is Invited
Contact: Lizsl De Leon