An outer-approximation guided optimization approach for constrained neural network inverse problems
- Author
- Myun-Seok Cheon
- Subjects
FOS: Computer and information sciences, Computer Science - Machine Learning, Machine Learning (cs.LG), FOS: Mathematics, Mathematics - Optimization and Control, Optimization and Control (math.OC), Mathematical optimization, Operations research, Optimization problem, Artificial neural network, Inverse problem, Gradient method, Numerical analysis, Computation, Process (computing), Convergence (routing), General Mathematics, Software, Mathematics
- Abstract
This paper presents an outer-approximation guided optimization method for constrained neural network inverse problems with rectified linear units. A constrained neural network inverse problem is an optimization problem that seeks the input values of a given trained neural network that best produce a predefined desired output, subject to constraints on those input values. This paper analyzes the characteristics of optimal solutions of neural network inverse problems with rectified activation units and proposes an outer-approximation algorithm that exploits these characteristics. The proposed outer-approximation guided optimization comprises primal and dual phases. The primal phase incorporates neighbor curvatures along with neighbor outer-approximations to expedite the process. The dual phase identifies and exploits the structure of local convex regions to improve convergence to a local optimal solution. Finally, computational experiments demonstrate the superiority of the proposed algorithm over a projected gradient method.
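To make the problem setting concrete, the following is a minimal sketch of the projected gradient baseline mentioned in the abstract, applied to a constrained inverse problem for a tiny ReLU network with box constraints on the inputs. All weights, dimensions, and parameters here are illustrative assumptions, not the paper's implementation; the paper's outer-approximation method replaces this kind of baseline by exploiting the piecewise-linear structure of the ReLU network.

```python
import numpy as np

# Hypothetical "trained" network: one ReLU hidden layer with fixed random weights.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 2)), rng.normal(size=4)
W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)

def forward(x):
    """Network output for input x (2-dim input, 1-dim output)."""
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

def grad_loss(x, y_star):
    """Gradient of 0.5 * ||forward(x) - y_star||^2 with respect to x."""
    pre = W1 @ x + b1
    r = (W2 @ np.maximum(pre, 0.0) + b2) - y_star  # residual
    active = (pre > 0).astype(float)               # ReLU activation pattern
    return W1.T @ (active * (W2.T @ r))

# Inverse problem: find x in the box [lo, hi]^2 whose output is close to y_star.
y_star = np.array([0.5])
lo, hi = -1.0, 1.0
x = np.zeros(2)
for _ in range(1000):
    # Gradient step, then projection onto the box constraints.
    x = np.clip(x - 0.02 * grad_loss(x, y_star), lo, hi)
```

Because the ReLU network is piecewise linear, the loss is piecewise quadratic in x; a projected gradient method can stall near the nondifferentiable boundaries between activation regions, which is the kind of local structure the paper's primal and dual phases are designed to handle.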
- Published
- 2021