Events for October 24, 2024
-
NL Seminar - Mission: Impossible Language Models
Thu, Oct 24, 2024 @ 11:00 AM - 12:00 PM
Information Sciences Institute
Conferences, Lectures, & Seminars
Speaker: Julie Kallini, Stanford University
Talk Title: Mission: Impossible Language Models
REMINDER: Meeting hosts only admit online guests that they know to the Zoom meeting, so you are highly encouraged to use your USC account to sign into Zoom. If you are an outside visitor, please inform us at (nlg-seminar-host(at)isi.edu) so that we can admit you: specify whether you will attend remotely or in person at least one business day before the event, provide your full name, job title, and professional affiliation, and arrive at least 10 minutes before the seminar begins. If you do not have access to the 6th floor for in-person attendance, please check in at the 10th-floor main reception desk to register as a visitor, and someone will escort you to the conference room.
ZOOM INFO: https://usc.zoom.us/j/97400245543?pwd=uo9TL9Ss4TA4Wa4TPtfDQnedE7Va8B.1 (Meeting ID: 974 0024 5543, Passcode: 407395)
Abstract: Chomsky and others have very directly claimed that large language models (LLMs) are equally capable of learning languages that are possible and impossible for humans to learn, yet there is very little published experimental evidence to support such a claim. Here, we develop a set of synthetic impossible languages of differing complexity, each designed by systematically altering English data with unnatural word orders and grammar rules. These languages lie on an impossibility continuum: at one end are languages that are inherently impossible, such as random and irreversible shuffles of English words, and at the other are languages that may not be intuitively impossible but are often considered so in linguistics, particularly those with rules based on counting word positions. We report on a wide range of evaluations assessing the capacity of GPT-2 small models to learn these uncontroversially impossible languages, and crucially, we perform these assessments at various stages throughout training to compare the learning process for each language.
Our core finding is that GPT-2 struggles to learn impossible languages when compared to English as a control, challenging the core claim. More importantly, we hope our approach opens up a productive line of inquiry in which different LLM architectures are tested on a variety of impossible languages in an effort to learn more about how LLMs can be used as tools for these cognitive and typological investigations.
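To make the kinds of perturbations the abstract describes concrete, here is a minimal illustrative sketch (not the paper's actual implementation; the function names and the specific counting rule are invented for this example) of two ways English data might be systematically altered: a shuffle of word order, which is irreversible when unseeded, and a rule keyed to absolute word positions.

```python
import random

def shuffle_perturbation(sentence: str, seed=None) -> str:
    """Shuffle a sentence's words.

    With seed=None the shuffle is random and irreversible, the kind of
    inherently impossible language the abstract places at one end of the
    continuum; a fixed seed makes the perturbation deterministic.
    """
    words = sentence.split()
    random.Random(seed).shuffle(words)
    return " ".join(words)

def count_based_perturbation(sentence: str) -> str:
    """Apply a hypothetical position-counting rule: reverse every
    even-indexed word. Rules defined over absolute word positions are
    the sort often considered impossible in linguistics."""
    words = sentence.split()
    return " ".join(w[::-1] if i % 2 == 0 else w
                    for i, w in enumerate(words))
```

A perturbation like either of these can be applied uniformly to a training corpus, after which a language model is trained on the altered data and compared against a model trained on unmodified English as a control.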
Biography: Julie Kallini is a second-year Computer Science Ph.D. student at Stanford University advised by Christopher Potts and Dan Jurafsky. Her research spans several topics in natural language processing, including computational linguistics, cognitive science, interpretability, and model architecture. Julie's work is generously supported by the NSF Graduate Research Fellowship, the Stanford School of Engineering Graduate Fellowship, and the Stanford EDGE Fellowship. Before starting her Ph.D., Julie was a software engineer at Meta, where she worked on machine learning for advertisements. Julie graduated summa cum laude from Princeton University with a B.S.E. in Computer Science and a minor in Linguistics.
Host: Jonathan May and Katy Felkner
More Info: https://www.isi.edu/research-groups-nlg/nlg-seminars/
Webcast: https://www.youtube.com/watch?v=sDMUu8rrgV8
Location: Information Science Institute (ISI) - Conf Rm#689
Audiences: Everyone Is Invited
Contact: Pete Zamar
Event Link: https://www.isi.edu/research-groups-nlg/nlg-seminars/
-
PhD Dissertation Defense - Neal Lawton
Thu, Oct 24, 2024 @ 01:00 PM - 03:00 PM
Thomas Lord Department of Computer Science
University Calendar
Title: Learning at the Local Level
Date: Thursday, October 24, 2024
Time: 1:00 PM - 3:00 PM
Location: DMC 160
Committee: Aram Galstyan, Greg Ver Steeg, Bistra Dilkina, and Assad Oberai
Abstract
In this dissertation, I present a perspective of machine learning that views feature learning as the fundamental strategy by which deep machine learning models learn to solve complex problems: when trained to perform one specific task, deep machine learning models tend to learn generalizable features that are useful for solving many different tasks. In this way, deep machine learning models learn at a local level by automatically breaking down complex problems into simple relevant subproblems. I then present a diverse collection of works that put this perspective into action to design better machine learning algorithms. These works include efficient optimization algorithms, including an algorithm for block-free parallel inference in exponential families (Chapter 2) and a novel second-order algorithm for training neural networks (Chapter 3); algorithms for efficient neural architecture search (NAS), including a morphism-based NAS algorithm for growing neural networks (Chapter 4) and a pruning-based NAS algorithm for finding more parameter-efficient PEFT architectures (Chapter 5); and algorithms for efficient fine-tuning of large language models, including an algorithm for increasing the performance of fine-tuning quantized models (Chapter 6) and a joint fine-tuning algorithm for retrieval-augmented generation (RAG) pipelines (Chapter 7).
Location: DMC 160
Audiences: Everyone Is Invited
Contact: Ellecia Williams
-
Viterbi - Interview Success: Turning Interviews into Offers
Thu, Oct 24, 2024 @ 04:00 PM - 05:00 PM
Viterbi School of Engineering Career Connections
Workshops & Infosessions
This event is for Viterbi engineering students only. Please register through Handshake.
Receiving an offer is exciting, but should you accept it right away? Learn an offer's ins and outs to ensure you get the best compensation package. Join Viterbi Career Connections for "Increase Your Salary: Evaluating & Negotiating Your Job Offer," an interactive workshop designed to help you evaluate and get the most out of your offer.
In this interactive session, you will:
Assess how well a job offer matches your expectations.
Understand the components of a job offer, including salary, benefits, bonuses, and more.
Discover how to assess the full value of a job offer, considering both financial and non-financial aspects.
Learn negotiation techniques to discuss better compensation and benefits.
Learn how to articulate your value and negotiate without compromising the job offer.
Participate in a role-playing exercise to practice your negotiation skills and receive feedback.
Walk away with action items and resources, making you confident in your offer negotiation skills.
Location: Ronald Tutor Hall of Engineering (RTH) - 211
Audiences: Everyone Is Invited
Contact: RTH 218 Viterbi Career Connections
Event Link: https://usc.joinhandshake.com/
-
Whiting-Turner Contracting Information Session
Thu, Oct 24, 2024 @ 05:30 PM - 06:30 PM
Viterbi School of Engineering Career Connections
Workshops & Infosessions
This event is for Viterbi engineering students only. Please register on Handshake to attend the event.
Come meet and greet project managers from The Whiting-Turner Contracting Company. Dinner will be provided, and opportunities for internships and full-time construction engineer positions will be discussed.
Degree Levels: Bachelors, Masters
Majors: Civil Engineering, Industrial and Systems Engineering, Materials Science
Are you able to potentially sponsor or hire on CPT/OPT? Unfortunately, no.
Location: Ronald Tutor Hall of Engineering (RTH) - 211
Audiences: Everyone Is Invited
Contact: RTH 218 Viterbi Career Connections
Event Link: https://usc.joinhandshake.com/