Progressive Tempering Sampler with Diffusion


Access rights

openAccess
CC BY
publishedVersion

A4 Article in a conference publication

Language

en

Pages

23

Series

Proceedings of Machine Learning Research, Volume 267, pp. 51724-51746

Abstract

Recent research has focused on designing neural samplers that amortize sampling from unnormalized densities. Despite significant advances, however, they still fall short of the state-of-the-art MCMC approach, Parallel Tempering (PT), in the efficiency of target evaluations. Conversely, unlike a well-trained neural sampler, PT yields only dependent samples and must be rerun, at considerable computational cost, whenever new samples are required. To address these weaknesses, we propose the Progressive Tempering Sampler with Diffusion (PTSD), which trains diffusion models sequentially across temperatures, leveraging the advantages of PT to improve the training of neural samplers. We also introduce a novel method that combines high-temperature diffusion models to generate approximate lower-temperature samples, which are minimally refined using MCMC and then used to train the next diffusion model. PTSD enables efficient reuse of sample information across temperature levels while generating well-mixed, uncorrelated samples. Our method significantly improves target-evaluation efficiency, outperforming diffusion-based neural samplers.
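The progressive loop described in the abstract (fit a sampler at a hot temperature, propose approximate samples at the next lower temperature, refine them briefly with MCMC, fit the next sampler) can be sketched as follows. This is a hypothetical illustration only: the target density, the temperature ladder, the Metropolis refiner, and the stand-in `fit_sampler` (a noisy resampler in place of an actual diffusion model) are all assumptions for the sake of a runnable example, and the paper's specific scheme for combining high-temperature diffusion models is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x, beta):
    # Placeholder unnormalized log-density at inverse temperature beta:
    # a simple double-well standing in for the true target.
    return -beta * (x**2 - 1.0)**2

def mcmc_refine(x, beta, n_steps=50, step=0.3):
    # A few Metropolis steps to "minimally refine" approximate samples
    # at the new (lower) temperature.
    for _ in range(n_steps):
        prop = x + step * rng.standard_normal(x.shape)
        log_alpha = log_target(prop, beta) - log_target(x, beta)
        accept = np.log(rng.random(x.shape)) < log_alpha
        x = np.where(accept, prop, x)
    return x

def fit_sampler(samples):
    # Stand-in for diffusion-model training: keep the training samples
    # and resample them with small noise (a crude KDE-like "model").
    def sample(n):
        idx = rng.integers(0, len(samples), size=n)
        return samples[idx] + 0.05 * rng.standard_normal(n)
    return sample

# Temperature ladder from hot (small beta) to the cold target (beta = 1).
betas = [0.1, 0.3, 1.0]

# Initialize the hottest level by refining noise with MCMC, then fit.
x = mcmc_refine(rng.standard_normal(2000), betas[0], n_steps=200)
sampler = fit_sampler(x)

for beta in betas[1:]:
    # Proposals from the previous level's model, refined at the new
    # temperature, become training data for the next model.
    x = mcmc_refine(sampler(2000), beta)
    sampler = fit_sampler(x)

# The final model produces uncorrelated samples without rerunning MCMC.
cold_samples = sampler(4000)
print(cold_samples.shape)  # (4000,)
```

In the actual method the resampler above would be a trained diffusion model, and the lower-temperature proposals would come from combining the higher-temperature models rather than from a single predecessor.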

Description

Publisher Copyright: © 2025, by the authors.

Citation

Rissanen, S, Ouyang, R, He, J, Chen, W, Heinonen, M, Solin, A & Hernández-Lobato, J M 2025, 'Progressive Tempering Sampler with Diffusion', Proceedings of Machine Learning Research, vol. 267, pp. 51724-51746. <https://proceedings.mlr.press/v267/rissanen25a.html>