news

Mar 12, 2025 Our new paper, Denoising Score Distillation: From Noisy Diffusion Pretraining to One-Step High-Quality Generation, introduces a novel approach to training a one-step image generator using only noisy images. Remarkably, our method achieves FID scores comparable to diffusion models trained on clean images. Beyond proposing a new solution to inverse problems through distillation, our work demonstrates that distillation is not merely an acceleration technique but also enhances generation quality compared to the teacher diffusion model—both empirically and theoretically.
Feb 01, 2025 Our new paper, Conditional diffusions for amortized neural posterior estimation, which uses diffusion models for amortized simulation-based inference, has been accepted by AISTATS 2025.
Oct 01, 2024 Our new papers, Diffusion Policies Creating a Trust Region for Offline Reinforcement Learning and Identifying General Mechanism Shifts in Linear Causal Representations, have been accepted by NeurIPS 2024. See you in Vancouver! :rocket: