DevConf.US 2025

Continual Post-Training - "Sculpting Subspaces: Constrained Full Fine-Tuning for Continual Learning in LLMs"
2025-09-19, Ladd Room (Capacity 96)

Large Language Models (LLMs) are increasingly used in real-world applications, but continually adapting them to new tasks without catastrophic forgetting remains a major challenge. In this talk, I will introduce a novel full-parameter continual learning method that leverages adaptive Singular Value Decomposition (SVD) to dynamically identify and protect task-critical subspaces. By constraining updates to orthogonal low-rank directions, our method enables models to retain previous knowledge without adding extra parameters. Compared to strong baselines such as O-LoRA, our approach achieves up to 7% higher accuracy while maintaining general language capabilities and safety. The talk will present the methodology, theoretical foundations, and extensive empirical results, demonstrating a scalable path toward continually evolving LLMs.
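
To make the core idea concrete, here is a minimal sketch (not the speaker's implementation) of SVD-based subspace protection for a single weight matrix, assuming a PyTorch-style setup. The top-k left singular vectors are treated as the task-critical subspace, and gradient updates are projected onto its orthogonal complement before the optimizer step; the rank k and the helper names below are illustrative assumptions.

import torch

def protected_subspace(weight: torch.Tensor, k: int) -> torch.Tensor:
    # Left singular vectors spanning the high-energy (task-critical) subspace.
    U, _, _ = torch.linalg.svd(weight.detach(), full_matrices=False)
    return U[:, :k]  # shape: (out_features, k)

def project_out(grad: torch.Tensor, U_k: torch.Tensor) -> torch.Tensor:
    # Remove the gradient component inside the protected subspace, so the
    # update only moves the weights in directions orthogonal to it.
    return grad - U_k @ (U_k.T @ grad)

# Usage (after loss.backward(), before optimizer.step()):
#   for p, U_k in zip(protected_params, protected_bases):
#       p.grad = project_out(p.grad, U_k)

The full-parameter update stays unconstrained in rank only within the orthogonal complement, which is how prior knowledge is preserved without adding adapter parameters.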


What level of experience should the audience have to best understand your session?

Intermediate - attendees should be familiar with the subject

Nikhil Nayak is a researcher specializing in large language models, with a focus on parameter-efficient adaptation, fine-tuning strategies, and trustworthy machine learning. He holds a Master’s in Data Science from Harvard University and has been working on advancing continual learning and interpretability in LLMs.