
Neural Network Parameter Diffusion

Authors :
Wang, Kai
Xu, Zhaopan
Zhou, Yukun
Zang, Zelin
Darrell, Trevor
Liu, Zhuang
You, Yang
Publication Year :
2024

Abstract

Diffusion models have achieved remarkable success in image and video generation. In this work, we demonstrate that diffusion models can also generate high-performing neural network parameters. Our approach is simple, utilizing an autoencoder and a standard latent diffusion model. The autoencoder extracts latent representations of a subset of the trained network parameters. A diffusion model is then trained to synthesize these latent parameter representations from random noise. It then generates new representations that are passed through the autoencoder's decoder, whose outputs are ready to use as new subsets of network parameters. Across various architectures and datasets, our diffusion process consistently generates models of comparable or improved performance over trained networks, with minimal additional cost. Notably, we empirically find that the generated models are not memorizing the trained networks. Our results encourage more exploration on the versatile use of diffusion models.

Comment: We introduce a novel approach for parameter generation, named neural network parameter diffusion (p-diff), which employs a standard latent diffusion model to synthesize a new set of parameters.
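The abstract describes a three-stage pipeline: an autoencoder compresses a subset of trained network parameters into latents, a diffusion model learns to denoise those latents, and sampled latents are decoded back into usable parameters. The sketch below illustrates that pipeline under stated assumptions; it is not the authors' released p-diff code. The parameter-subset size, latent width, network shapes, and the simplified DDPM training/sampling loops are all illustrative choices.

```python
# Illustrative sketch of the p-diff pipeline described in the abstract.
# Assumptions: PARAM_DIM, LATENT_DIM, T, and all module sizes are made up;
# the diffusion part is a plain DDPM over latent vectors.
import torch
import torch.nn as nn

PARAM_DIM = 2048   # assumed size of the trained-parameter subset (e.g. one layer, flattened)
LATENT_DIM = 128   # assumed latent width
T = 1000           # diffusion steps

class ParamAutoencoder(nn.Module):
    """Compresses flattened parameter vectors into latents and reconstructs them."""
    def __init__(self):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(PARAM_DIM, 512), nn.SiLU(), nn.Linear(512, LATENT_DIM))
        self.dec = nn.Sequential(nn.Linear(LATENT_DIM, 512), nn.SiLU(), nn.Linear(512, PARAM_DIM))
    def forward(self, p):
        z = self.enc(p)
        return self.dec(z), z

class LatentDenoiser(nn.Module):
    """Predicts the noise added to a latent at timestep t."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(LATENT_DIM + 1, 256), nn.SiLU(), nn.Linear(256, LATENT_DIM))
    def forward(self, z_t, t):
        t_emb = t.float().unsqueeze(-1) / T          # crude scalar timestep embedding
        return self.net(torch.cat([z_t, t_emb], dim=-1))

betas = torch.linspace(1e-4, 0.02, T)
alphas_bar = torch.cumprod(1.0 - betas, dim=0)

def diffusion_loss(denoiser, z0):
    """Standard DDPM noise-prediction objective on the parameter latents."""
    t = torch.randint(0, T, (z0.size(0),))
    noise = torch.randn_like(z0)
    ab = alphas_bar[t].unsqueeze(-1)
    z_t = ab.sqrt() * z0 + (1 - ab).sqrt() * noise    # forward-noised latent
    return nn.functional.mse_loss(denoiser(z_t, t), noise)

@torch.no_grad()
def sample_parameters(denoiser, decoder, n):
    """Reverse diffusion from pure noise, then decode latents into new parameters."""
    z = torch.randn(n, LATENT_DIM)
    for t in reversed(range(T)):
        eps = denoiser(z, torch.full((n,), t))
        a = 1.0 - betas[t]
        ab = alphas_bar[t]
        z = (z - (1 - a) / (1 - ab).sqrt() * eps) / a.sqrt()
        if t > 0:
            z = z + betas[t].sqrt() * torch.randn_like(z)
    return decoder(z)   # each row is a candidate parameter subset, ready to load into a model

# Usage sketch: first train the autoencoder on flattened parameter vectors collected
# from trained networks, then train the denoiser with diffusion_loss on the encoded
# latents, and finally call sample_parameters(denoiser, ae.dec, n=8).
```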

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2402.13144
Document Type :
Working Paper