Block Diffusion: Interpolating Between Autoregressive and Diffusion Language Models, by Marianne Arriola and 7 other authors
Abstract: Diffusion language models offer unique benefits over autoregressive models due to their potential for parallelized generation and controllability, yet they lag in likelihood modeling and are limited to fixed-length generation. In this work, we introduce a class of block diffusion language models that interpolate between discrete denoising diffusion and autoregressive models. Block diffusion overcomes key limitations of both approaches by supporting flexible-length generation and improving inference efficiency through KV caching and parallel token sampling. We propose a recipe for building effective block diffusion models that includes an efficient training algorithm, estimators of gradient variance, and data-driven noise schedules that minimize that variance. Block diffusion achieves state-of-the-art performance among diffusion models on language modeling benchmarks and enables generation of arbitrary-length sequences. We provide the code, model weights, and a blog post on the project page: this https URL
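The core idea sketched in the abstract — autoregressive generation *across* blocks combined with parallel, diffusion-style denoising *within* each block — can be illustrated with a toy sampler. This is a minimal sketch under assumed names (`denoise_step`, `sample_block`, `generate` are all hypothetical); it uses uniform random sampling in place of a trained denoiser and is not the authors' implementation:

```python
import random

VOCAB = list(range(10))
MASK = -1  # placeholder for a masked (noised) token

def denoise_step(prefix, block, unmask_prob=0.5):
    """One toy reverse-diffusion step on a block: each masked token is
    independently unmasked with some probability. A real model would
    condition on `prefix` (the previously generated blocks, reusable
    via a KV cache); here we just sample uniformly."""
    return [tok if tok != MASK or random.random() > unmask_prob
            else random.choice(VOCAB)
            for tok in block]

def sample_block(prefix, block_size, num_steps=8):
    """Denoise a fully masked block over a few parallel steps."""
    block = [MASK] * block_size
    for _ in range(num_steps):
        block = denoise_step(prefix, block)
    # Final step: force any tokens that remain masked.
    return [random.choice(VOCAB) if t == MASK else t for t in block]

def generate(num_blocks=3, block_size=4):
    """Autoregressive loop over blocks: because blocks are appended one
    at a time, the total sequence length is flexible, unlike a
    fixed-length diffusion model."""
    seq = []
    for _ in range(num_blocks):
        seq.extend(sample_block(seq, block_size))
    return seq

print(generate())
```

The two nested loops mirror the interpolation the paper describes: setting `block_size=1` degenerates to token-by-token autoregression, while a single block the length of the whole sequence recovers vanilla fixed-length diffusion.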
Submission history
From: Marianne Arriola
[v1] Wed, 12 Mar 2025 17:43:40 UTC (294 KB)
[v2] Tue, 18 Mar 2025 15:58:18 UTC (291 KB)