
Diffusion-based generative models offer a principled framework for probabilistic forecasting, but we show they suffer from a fundamental spectral collapse when applied to turbulent flows. A Fourier-space analysis of the forward SDE reveals that the mode-wise signal-to-noise ratio decays monotonically in wavenumber for power-law spectra, rendering high-wavenumber content indistinguishable from noise. We reinterpret the noise schedule as a spectral regularizer and introduce power-law schedules that preserve fine-scale structure deeper into diffusion time. We further propose Lazy Diffusion, a one-step distillation method that leverages the learned score geometry to bypass long reverse trajectories and prevent high-wavenumber degradation. Applied to high-Reynolds-number 2D Kolmogorov turbulence and ocean reanalysis data, these methods resolve spectral collapse and enable stable long-horizon autoregressive emulation.
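The spectral-collapse argument can be illustrated with a minimal numeric sketch. Assuming a standard variance-preserving forward process, x_t = sqrt(abar)·x_0 + sqrt(1−abar)·ε, the per-mode SNR is abar·E(k)/(1−abar); the power-law slope p = 3 (the 2D enstrophy-cascade value) and the specific noise level are illustrative assumptions, not the talk's exact formulation:

```python
import numpy as np

# Hedged sketch: per-mode SNR for a variance-preserving forward process
# x_t = sqrt(abar) * x0 + sqrt(1 - abar) * eps, with a power-law energy
# spectrum E(k) ~ k^{-p}. The slope p = 3 (2D enstrophy-cascade value)
# is assumed here for illustration.
def mode_snr(k, abar, p=3.0):
    """SNR of Fourier mode k at a diffusion time with cumulative signal level abar."""
    signal_power = k ** (-p)           # E(k) ~ k^{-p}
    return abar * signal_power / (1.0 - abar)

k = np.array([1.0, 8.0, 64.0])         # low, mid, high wavenumbers
abar = 0.5                             # mid-trajectory noise level
snr = mode_snr(k, abar)
# SNR falls monotonically with wavenumber: high-k content is the first
# to become indistinguishable from noise as diffusion time advances.
assert snr[0] > snr[1] > snr[2]
```

Because the SNR ratio at fixed diffusion time scales like k^{-p}, any uniform (wavenumber-agnostic) noise schedule erases the fine scales long before the large scales, which is the motivation stated above for schedules that preserve high-wavenumber structure deeper into diffusion time.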
Event Host: Anish Sambamurthy, Ph.D. Student, Applied Mathematics
Advisor: Ashesh Chattopadhyay
Zoom: https://ucsc.zoom.us/j/5144530307?pwd=TllaWnNDc01tcVNpa1NNeVVIMnp5QT09
Passcode: 55555