In-Context Learning for Differential Equations

Abstract

This talk introduces In-Context Operator Networks (ICON), a neural framework that learns and applies operators directly from prompted data at inference—without weight updates. Rather than training a new model for each equation, boundary condition, or inverse setting, ICON trains a single network to act as a general operator learner. At test time, a small set of input–output demonstrations conditions the model, which then maps new queries to solutions. By exploiting shared structure across operator families, ICON achieves few-shot generalization to previously unseen forward and inverse problems in ordinary differential equations (ODEs), partial differential equations (PDEs), and mean-field control (MFC). We present training objectives, prompting formats, and ablations clarifying when in-context operator learning succeeds, and compare ICON against retraining/fine-tuning baselines. Empirically, ICON matches or exceeds task-specific models while adapting through prompting alone. Joint work with Liu Yang (NUS), Tingwei Meng (Amazon), and Stanley Osher (UCLA).
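The abstract's core idea, adapting to a new operator from a handful of prompted input-output demonstrations with no weight updates, can be illustrated with a deliberately simple analogue. The sketch below is not the ICON architecture (which uses a trained transformer); it substitutes a least-squares fit as the "in-context learner" for an unknown linear operator, purely to show the prompting pattern: condition on demos at inference time, then answer a new query. All names here are hypothetical.

```python
import numpy as np

# Toy analogue of in-context operator learning (NOT the ICON model):
# at inference time we receive demonstration pairs (x_i, y_i) generated
# by an unknown linear operator, and must apply that operator to a new
# query without any weight updates. Here the unknown operator is a
# matrix A, and "learning from the prompt" is a least-squares solve.

def apply_in_context(demo_inputs, demo_outputs, query):
    """Infer a linear operator from the demos, then apply it to `query`.

    demo_inputs, demo_outputs: arrays of shape (k, n), one demo per row.
    query: array of shape (n,).
    """
    # Solve demo_inputs @ A ≈ demo_outputs for A (shape n x n).
    A, *_ = np.linalg.lstsq(demo_inputs, demo_outputs, rcond=None)
    return query @ A

# Example: an unseen operator, a few demos, one query.
rng = np.random.default_rng(0)
n = 4
A_true = rng.normal(size=(n, n))          # hidden operator
X = rng.normal(size=(8, n))               # demo inputs (prompt)
Y = X @ A_true                            # demo outputs (prompt)
q = rng.normal(size=n)                    # new query
pred = apply_in_context(X, Y, q)
```

With enough demos to pin down the operator, `pred` matches `q @ A_true`. ICON plays the same game with a single trained transformer in place of the least-squares solve, which is what lets it handle nonlinear forward and inverse problems across ODE, PDE, and MFC operator families.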
Description

DoMSS Seminar and Research Innovations in Mathematical Sciences
Monday, October 6
12:00pm MST/AZ
ECA 221

Speaker

Siting Liu
Assistant Professor of Mathematics
UC Riverside
