by R. De Santi, M. Vlastelica, Y. P. Hsieh, Z. Shen, N. He, A. Krause
Abstract:
Adapting large-scale foundation flow and diffusion generative models to optimize task-specific objectives while preserving prior information is crucial for real-world applications such as molecular design, protein docking, and creative image generation. Existing principled fine-tuning methods aim to maximize the expected reward of generated samples while retaining knowledge from the pre-trained model via KL-divergence regularization. In this work, we tackle the significantly more general problem of optimizing utilities beyond average rewards, including risk-averse and novelty-seeking reward maximization, diversity measures for exploration, and experiment design objectives, among others. Likewise, we consider more general ways to preserve prior information beyond KL divergence, such as optimal transport distances and Rényi divergences. To this end, we introduce Flow Density Control (FDC), a simple algorithm that reduces this complex problem to a specific sequence of simpler fine-tuning tasks, each solvable via scalable established methods. We derive convergence guarantees for the proposed scheme under realistic assumptions by leveraging recent understanding of mirror flows. Finally, we validate our method on illustrative settings, text-to-image, and molecular design tasks, showing that it can steer pre-trained generative models to optimize objectives and solve practically relevant tasks beyond the reach of current fine-tuning schemes.
Reference:
Flow Density Control: Generative Optimization Beyond Entropy-Regularized Fine-Tuning
R. De Santi, M. Vlastelica, Y. P. Hsieh, Z. Shen, N. He, A. Krause
In Advances in Neural Information Processing Systems (NeurIPS), 2025
Spotlight (NeurIPS) and Oral (ICML Generative AI and Biology Workshop)
Bibtex Entry:
@inproceedings{de2025flow,
  author = {De Santi, Riccardo and Vlastelica, Marin and Hsieh, Ya-Ping and Shen, Zebang and He, Niao and Krause, Andreas},
  booktitle = {Advances in Neural Information Processing Systems (NeurIPS)},
  month = {December},
  pdf = {https://www.arxiv.org/abs/2511.22640},
  title = {Flow Density Control: Generative Optimization Beyond Entropy-Regularized Fine-Tuning},
  year = {2025}
}