BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//pretalx//pretalx.devconf.info//devconf-us-2025//talk//LCCYW7
BEGIN:VTIMEZONE
TZID:EST
BEGIN:STANDARD
DTSTART:20001029T030000
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=10;UNTIL=20061029T070000Z
TZNAME:EST
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
END:STANDARD
BEGIN:STANDARD
DTSTART:20071104T030000
RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=11
TZNAME:EST
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
END:STANDARD
BEGIN:DAYLIGHT
DTSTART:20000402T030000
RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=4;UNTIL=20060402T080000Z
TZNAME:EDT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
END:DAYLIGHT
BEGIN:DAYLIGHT
DTSTART:20070311T030000
RRULE:FREQ=YEARLY;BYDAY=2SU;BYMONTH=3
TZNAME:EDT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
END:DAYLIGHT
END:VTIMEZONE
BEGIN:VEVENT
UID:pretalx-devconf-us-2025-LCCYW7@pretalx.devconf.info
DTSTART;TZID=EST:20250919T130000
DTEND;TZID=EST:20250919T133500
DESCRIPTION:Large Language Models (LLMs) are increasingly used in real-worl
 d applications\, but continually adapting them to new tasks without catast
 rophic forgetting remains a major challenge. In this talk\, I introduce a 
 novel full-parameter continual learning method that leverages adaptive Sin
 gular Value Decomposition (SVD) to dynamically identify and protect task-c
 ritical subspaces. By constraining updates to orthogonal low-rank directio
 ns\, our method enables models to retain previous knowledge without adding
  extra parameters. Compared to strong baselines like O-LoRA\, our approach
  achieves up to 7% higher accuracy while maintaining general language capa
 bilities and safety. This talk will present the methodology\, theoretical 
 foundations\, and extensive empirical results\, demonstrating a scalable p
 ath toward continually evolving LLMs.
DTSTAMP:20260310T061005Z
LOCATION:Ladd Room (Capacity 170)
SUMMARY:Continual Post-Training - "Sculpting Subspaces: Constrained Full Fi
 ne-Tuning for Continual Learning in LLMs" - Nikhil Shivakumar Nayak
URL:https://pretalx.devconf.info/devconf-us-2025/talk/LCCYW7/
END:VEVENT
END:VCALENDAR
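
Editor's note on the DESCRIPTION above: it outlines the talk's core technique — use SVD to identify a task-critical subspace of a weight matrix, then constrain full-parameter updates to directions orthogonal to that subspace. Below is a minimal, illustrative PyTorch sketch of that general idea, not the speaker's actual method; the function names (critical_subspace, project_out), the fixed rank of 16, and the learning rate are all hypothetical choices made for this example.

import torch

def critical_subspace(weight: torch.Tensor, rank: int) -> torch.Tensor:
    # Top-`rank` left-singular vectors of the weight matrix, treated here
    # as the subspace encoding earlier tasks that must be protected.
    U, _, _ = torch.linalg.svd(weight, full_matrices=False)
    return U[:, :rank]                        # shape: (out_features, rank)

def project_out(grad: torch.Tensor, basis: torch.Tensor) -> torch.Tensor:
    # Remove the component of the gradient lying in span(basis), so the
    # update cannot move the weights along the protected directions.
    return grad - basis @ (basis.T @ grad)

# Usage sketch: one constrained SGD step on a single linear layer.
W = torch.randn(512, 512, requires_grad=True)
x, y = torch.randn(64, 512), torch.randn(64, 512)
loss = torch.nn.functional.mse_loss(x @ W.T, y)
loss.backward()

with torch.no_grad():
    U_crit = critical_subspace(W, rank=16)    # hypothetical rank budget
    W -= 1e-3 * project_out(W.grad, U_crit)   # full-parameter update,
                                              # orthogonal to U_crit

Note that no extra parameters are introduced: the weights themselves are updated at full rank, only with the gradient projected away from the protected subspace. In the method as the abstract describes it, the rank and the protected subspace would be chosen adaptively per layer and per task rather than fixed as above.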
