Prior Distribution Selection for a Mixture of Experts
- Author
- Grabovoy, A. V. and Strijov, V. V.
- Subjects
- Problem solving; Image recognition (computer vision); Expectation-maximization algorithms; Test methods
- Abstract
The paper investigates the mixture-of-experts model. A mixture of experts is a combination of experts (local approximation models) and a gate function that weighs these experts and forms their ensemble. In this work, each expert is a linear model, and the gate function is a neural network with a softmax on the last layer. The paper analyzes various prior distributions for each expert and proposes a method that takes into account the relationship between the prior distributions of different experts. The EM algorithm optimizes both the parameters of the local models and the parameters of the gate function. As an application, the paper solves a shape recognition problem on images: each expert fits one circle in an image and recovers its parameters, the coordinates of the center and the radius. The computational experiment tests the proposed method on synthetic and real data; the real data are human eye images from the iris detection problem.
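The sketch below illustrates the general scheme the abstract describes: linear experts combined by a softmax gate and fitted with an EM-style loop. It is a minimal illustration, not the authors' method: the prior distributions over expert parameters and the circle-fitting parameterization from the paper are omitted, the gate is reduced to a single linear layer with softmax, and all names and hyperparameters (`K`, `sigma`, `n_iter`, `gate_lr`) are assumptions made for the example.

```python
# Minimal sketch of a mixture of linear experts with a softmax gate, fitted by
# an EM-style loop. Illustrative only: the paper's prior distributions and
# neural-network gate are not reproduced here.
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_mixture_of_linear_experts(X, y, K=2, sigma=0.1, n_iter=50, gate_lr=0.1):
    """X: (n, d) design matrix, y: (n,) targets.
    Returns expert weights W of shape (K, d) and gate parameters V of shape (d, K)."""
    n, d = X.shape
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(K, d))   # one linear expert per row
    V = np.zeros((d, K))                     # linear-softmax gate (stand-in for a network gate)

    for _ in range(n_iter):
        # E-step: posterior responsibility of each expert for each sample
        gate = softmax(X @ V)                                    # (n, K)
        preds = X @ W.T                                          # (n, K)
        lik = np.exp(-0.5 * ((y[:, None] - preds) / sigma) ** 2)
        resp = gate * lik
        resp /= resp.sum(axis=1, keepdims=True) + 1e-12

        # M-step for the experts: responsibility-weighted least squares
        for k in range(K):
            r = resp[:, k]
            A = X.T @ (X * r[:, None]) + 1e-6 * np.eye(d)
            W[k] = np.linalg.solve(A, X.T @ (r * y))

        # M-step for the gate: a few ascent steps on the responsibility-weighted log-likelihood
        for _ in range(10):
            gate = softmax(X @ V)
            V += gate_lr * X.T @ (resp - gate) / n

    return W, V

# Toy usage on synthetic data with two linear regimes (second column is an intercept).
X = np.column_stack([np.linspace(-1.0, 1.0, 200), np.ones(200)])
y = np.where(X[:, 0] < 0, 2.0 * X[:, 0], -3.0 * X[:, 0]) + 0.05 * np.random.default_rng(1).normal(size=200)
W, V = fit_mixture_of_linear_experts(X, y)
print("expert weights:\n", W)
```

In the toy run the gate learns to route samples with negative abscissa to one expert and the rest to the other; the paper's formulation additionally introduces prior distributions that couple the parameters of different experts and uses a neural-network gate rather than the single linear layer assumed here.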
- Published
- 2021