
Generalized deterministic annealing

Authors :
Acton, Scott Thomas
Bovik, Alan Conrad
Source :
IEEE Transactions on Neural Networks. May, 1996, Vol. 7 Issue 3, p686, 14 p.
Publication Year :
1996

Abstract

We develop a general formalism for computing high-quality, low-cost solutions to nonconvex combinatorial optimization problems expressible as distributed interacting local constraints. For problems of this type, generalized deterministic annealing (GDA) avoids the performance-related sacrifices of current techniques. GDA exploits the localized structure of such problems by assigning K-state neurons to each optimization variable. The neuron values correspond to the probability densities of K-state local Markov chains and may be updated serially or in parallel; the Markov model is derived from that of simulated annealing (SA), although it is greatly simplified. Theorems are presented that firmly establish the convergence properties of GDA, as well as supply practical guidelines for selecting the initial and final temperatures in the annealing process. A benchmark image enhancement application is provided in which the performance of GDA is compared to other optimization methods. The empirical data, taken in conjunction with the formal analytical results, suggest that GDA enjoys significant performance advantages relative to current methods for combinatorial optimization.
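The abstract describes the mechanics only at a high level. As a rough illustration of the kind of update GDA performs, the sketch below applies a deterministic-annealing-style rule to a toy image-smoothing problem: each pixel carries a probability vector over K gray levels (the "K-state neuron" of the abstract), the vector is recomputed from a Boltzmann (softmax) distribution over local energies, and the temperature is lowered on a geometric schedule. The function name, parameters, and energy model are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def deterministic_annealing_smooth(noisy, K=4, T0=4.0, Tf=0.05, n_sweeps=30, lam=2.0):
    """Illustrative deterministic-annealing-style smoothing sketch (not the
    paper's exact GDA algorithm).

    Each pixel holds a probability vector over K gray levels.  At every sweep
    the vector is re-estimated from a Boltzmann (softmax) distribution over
    the local energy of each candidate level, while the temperature T is
    lowered geometrically from T0 to Tf.
    """
    H, W = noisy.shape
    levels = np.linspace(noisy.min(), noisy.max(), K)

    # Initialize each pixel's state probabilities uniformly.
    P = np.full((H, W, K), 1.0 / K)

    # Geometric cooling schedule from T0 down to Tf.
    Ts = T0 * (Tf / T0) ** (np.arange(n_sweeps) / max(n_sweeps - 1, 1))

    for T in Ts:
        # Expected gray level of each pixel under its current distribution.
        mean_img = P @ levels

        # Average of the 4 neighbors' expected values
        # (wrap-around boundary, for simplicity of the sketch).
        nbr_avg = sum(np.roll(mean_img, (dy, dx), axis=(0, 1))
                      for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1))) / 4.0

        # Local energy of assigning level k to each pixel:
        # data fidelity + smoothness w.r.t. the neighbors' expectations.
        E = ((levels[None, None, :] - noisy[..., None]) ** 2
             + lam * (levels[None, None, :] - nbr_avg[..., None]) ** 2)

        # Boltzmann/softmax update of the state probabilities at temperature T.
        logits = -E / T
        logits -= logits.max(axis=-1, keepdims=True)   # numerical stability
        P = np.exp(logits)
        P /= P.sum(axis=-1, keepdims=True)

    # Final estimate: the most probable level at each pixel.
    return levels[P.argmax(axis=-1)]
```

As the temperature falls, the per-pixel distributions sharpen toward a single level, which is the deterministic counterpart of SA's stochastic acceptance rule; the parallel, local nature of the update mirrors the distributed local constraints the abstract refers to.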

Details

ISSN :
1045-9227
Volume :
7
Issue :
3
Database :
Gale General OneFile
Journal :
IEEE Transactions on Neural Networks
Publication Type :
Academic Journal
Accession number :
edsgcl.18445798