PhD Dissertation Defense - Neal Lawton
Thomas Lord Department of Computer Science
Title: Learning at the Local Level
Date: Thursday, October 24, 2024
Time: 1:00 PM - 3:00 PM
Location: DMC 160
Committee: Aram Galstyan, Greg Ver Steeg, Bistra Dilkina, and Assad Oberai
Abstract
In this dissertation, I present a perspective on machine learning that views feature learning as the fundamental strategy by which deep machine learning models learn to solve complex problems: when trained to perform one specific task, deep models tend to learn generalizable features that are useful for solving many different tasks. In this way, deep models learn at a local level, automatically breaking complex problems down into simple, relevant subproblems. I then present a diverse collection of works that put this perspective into action to design better machine learning algorithms. These works include efficient optimization algorithms, including an algorithm for block-free parallel inference in exponential families (Chapter 2) and a novel second-order algorithm for training neural networks (Chapter 3); algorithms for efficient neural architecture search (NAS), including a morphism-based NAS algorithm for growing neural networks (Chapter 4) and a pruning-based NAS algorithm for finding more parameter-efficient PEFT architectures (Chapter 5); and algorithms for efficient fine-tuning of large language models, including an algorithm for improving the performance of fine-tuned quantized models (Chapter 6) and a joint fine-tuning algorithm for retrieval-augmented generation (RAG) pipelines (Chapter 7).
Audiences: Everyone Is Invited
Contact: Ellecia Williams