96 results for "Spectral gradient method"
Search Results
2. A convergence analysis of hybrid gradient projection algorithm for constrained nonlinear equations with applications in compressed sensing.
- Author
-
Li, Dandan, Wang, Songhua, Li, Yong, and Wu, Jiaqi
- Subjects
NONLINEAR equations, COMPRESSED sensing, IMAGE reconstruction, CONJUGATE gradient methods, ALGORITHMS, ORTHOGONAL matching pursuit - Abstract
In this paper, we propose a projection-based hybrid spectral gradient algorithm for nonlinear equations with convex constraints, based on a certain line search strategy. A convex combination technique is used to design a novel spectral parameter inspired by some classical spectral gradient methods. The search direction also satisfies the sufficient descent condition and the trust region feature. The global convergence of the proposed algorithm is established under reasonable assumptions. Experimental results demonstrate that the proposed algorithm is more promising and robust than some similar methods and is capable of handling large-scale optimization problems. Furthermore, we apply it to problems involving sparse signal recovery and blurred image restoration. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. Structured adaptive spectral-based algorithms for nonlinear least squares problems with robotic arm modelling applications.
- Author
-
Yahaya, Mahmoud Muhammad, Kumam, Poom, Chaipunya, Parin, and Seangwattana, Thidaporn
- Subjects
MULTI-degree of freedom ,QUASI-Newton methods ,ALGORITHMS ,BENCHMARK problems (Computer science) ,ROBOTICS - Abstract
This research article develops two adaptive, efficient, structured nonlinear least-squares (NLS) algorithms. The approach taken to formulate these algorithms is motivated by the classical Barzilai and Borwein (BB) (IMA J Numer Anal 8(1):141–148, 1988) parameters. The structured vector approximation, which is an action of a matrix on a vector, is derived from higher-order Taylor series approximations of the Hessian of the objective function, such that a quasi-Newton condition is satisfied. This structured approximation is incorporated into a weighted adaptive combination of the BB parameters. We show that the algorithm is globally convergent under some standard assumptions. Moreover, the algorithms' robustness and effectiveness were tested numerically by solving some benchmark test problems. Finally, we apply one of the algorithms to solve a robotic motion control model with three degrees of freedom (3DOF). [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
4. A new spectral method with inertial technique for solving system of nonlinear monotone equations and applications
- Author
-
Aliyu Muhammed Awwal, Ahmadu Bappah Muhammadu, Chainarong Khunpanuk, Nuttapol Pakkaranang, and Bancha Panyanak
- Subjects
derivative-free method ,spectral gradient method ,inertial step ,nonlinear monotone equations ,Mathematics ,QA1-939 - Abstract
Many problems arising from science and engineering take the form of a system of nonlinear equations. In this work, a new derivative-free inertial-based spectral algorithm for solving such systems is proposed. The search direction of the proposed algorithm is defined as a convex combination of the modified long and short Barzilai and Borwein spectral parameters. Also, an inertial step is introduced into the search direction to enhance its efficiency. The global convergence of the proposed algorithm is established under the assumption that the mapping under consideration is Lipschitz continuous and monotone. Numerical experiments are performed on some test problems to demonstrate the efficiency of the proposed algorithm in comparison with some existing ones. Subsequently, the proposed algorithm is applied to problems arising from robotic motion control.
- Published
- 2023
- Full Text
- View/download PDF
5. A new spectral method with inertial technique for solving system of nonlinear monotone equations and applications.
- Author
-
Aji, Sani, Awwal, Aliyu Muhammed, Muhammadu, Ahmadu Bappah, Khunpanuk, Chainarong, Pakkaranang, Nuttapol, and Panyanak, Bancha
- Subjects
NONLINEAR equations ,CONVEX functions ,STOCHASTIC convergence ,LIPSCHITZ spaces ,CONTINUOUS functions - Abstract
Many problems arising from science and engineering take the form of a system of nonlinear equations. In this work, a new derivative-free inertial-based spectral algorithm for solving such systems is proposed. The search direction of the proposed algorithm is defined as a convex combination of the modified long and short Barzilai and Borwein spectral parameters. Also, an inertial step is introduced into the search direction to enhance its efficiency. The global convergence of the proposed algorithm is established under the assumption that the mapping under consideration is Lipschitz continuous and monotone. Numerical experiments are performed on some test problems to demonstrate the efficiency of the proposed algorithm in comparison with some existing ones. Subsequently, the proposed algorithm is applied to problems arising from robotic motion control. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
6. A new inertial-based method for solving pseudomonotone operator equations with application.
- Author
-
Aji, Sani, Kumam, Poom, Awwal, Aliyu Muhammed, Yahaya, Mahmoud Muhammad, and Bakoji, Abubakar Muhammad
- Subjects
OPERATOR equations ,QUASI-Newton methods ,NONLINEAR equations ,JACOBIAN matrices ,NEWTON-Raphson method - Abstract
Many efforts have been made to develop efficient algorithms for solving systems of nonlinear equations due to their applications in different branches of science. Some of the classical techniques, such as Newton and quasi-Newton methods, involve computing the Jacobian matrix or an approximation to it at every iteration, which limits their ability to handle large-scale problems. Recently, derivative-free algorithms have been developed to solve such systems. To establish global convergence, most of these algorithms assume the operator under consideration to be monotone. In this work, instead of being monotone, the operator under consideration is assumed to be pseudomonotone, which is more general than the usual monotonicity assumption in most of the existing literature. The proposed method is derivative-free, and an inertial step is incorporated to accelerate its convergence. The global convergence of the proposed algorithm is proved under the assumptions that the underlying mapping is Lipschitz continuous and pseudomonotone. Numerical experiments on some test problems are presented to show the advantages of the proposed algorithm in comparison with some existing ones. Finally, an application of the proposed algorithm is shown in motion control involving a 3-degree-of-freedom (3DOF) planar robot arm manipulator. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
7. Nonmonotone spectral gradient method based on memoryless symmetric rank-one update for large-scale unconstrained optimization.
- Author
-
Sim, Hong Seng, Chen, Chuei Yee, Leong, Wah June, and Li, Jiao
- Subjects
CONJUGATE gradient methods ,SYMMETRIC matrices ,CONVEX functions ,EIGENVALUES ,MATRICES (Mathematics) - Abstract
This paper proposes a nonmonotone spectral gradient method for solving large-scale unconstrained optimization problems. The spectral parameter is derived from the eigenvalues of an optimally sized memoryless symmetric rank-one matrix, obtained under the measure defined as the ratio of the determinant of the updating matrix to its largest eigenvalue. Coupled with a nonmonotone line search strategy in which backtracking-type line search is applied selectively, the spectral parameter acts as a stepsize during iterations when no line search is performed and as a milder form of quasi-Newton update when backtracking line search is employed. Convergence properties of the proposed method are established for uniformly convex functions. Extensive numerical experiments are conducted, and the results indicate that our proposed spectral gradient method outperforms some standard conjugate gradient methods. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
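Entries like the one above pair a BB-type spectral stepsize with a nonmonotone line search. The following is a minimal sketch of that combination, assuming a Grippo-style max-over-recent-values acceptance rule; the parameter values (delta, window size, safeguards) are illustrative and not the paper's exact algorithm:

```python
import numpy as np

def nonmonotone_backtrack(f, x, d, g, hist, delta=1e-4, tau=0.5):
    """Backtrack until f(x + a*d) <= max(recent f-values) + delta*a*(g'd)."""
    fmax = max(hist)                     # reference value over a sliding window
    alpha = 1.0
    while f(x + alpha * d) > fmax + delta * alpha * (g @ d):
        alpha *= tau
    return alpha

def spectral_descent(f, grad, x0, iters=500, window=5, tol=1e-8):
    """Spectral (BB) gradient descent with a nonmonotone line search."""
    x = np.asarray(x0, float)
    hist = [f(x)]
    s_prev = y_prev = None
    for _ in range(iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # BB spectral stepsize, safeguarded so the direction stays descent
        if s_prev is not None and s_prev @ y_prev > 1e-12:
            sigma = (s_prev @ s_prev) / (s_prev @ y_prev)
        else:
            sigma = 1.0
        d = -sigma * g                   # spectral search direction
        alpha = nonmonotone_backtrack(f, x, d, g, hist[-window:])
        x_new = x + alpha * d
        s_prev, y_prev = x_new - x, grad(x_new) - g
        x = x_new
        hist.append(f(x))
    return x
```

On an ill-conditioned convex quadratic, this kind of iteration typically needs far fewer steps than plain steepest descent, which is the usual motivation for spectral methods.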
8. On the Barzilai–Borwein gradient methods with structured secant equation for nonlinear least squares problems.
- Author
-
Awwal, Aliyu Muhammed, Kumam, Poom, Wang, Lin, Yahaya, Mahmoud Muhammad, and Mohammad, Hassan
- Subjects
NONLINEAR equations, CONJUGATE gradient methods, BENCHMARK problems (Computer science), LEAST squares, ALGORITHMS - Abstract
We propose structured spectral gradient algorithms for solving nonlinear least squares problems based on a modified structured secant equation. The idea is to integrate more details of the Hessian of the objective function into the standard spectral parameters with the goal of improving numerical efficiency. We safeguard the structured spectral parameters to avoid negative curvature search directions. The sequence of search directions generated by each proposed algorithm satisfies the sufficient descent condition. Using a nonmonotone line search strategy, we establish the global convergence of the proposed algorithms under some standard assumptions. Numerical experiments on some benchmark test problems show that the proposed algorithms are efficient and outperform some existing competitors. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
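For intuition, here is a hedged sketch of a plain (unstructured) spectral gradient loop for nonlinear least squares, min ½‖r(x)‖², with the BB parameter safeguarded to stay positive. The paper's structured secant equation, which folds residual and Jacobian structure into the parameter, is richer than this generic version:

```python
import numpy as np

def nls_spectral(r, jac, x0, iters=500, tol=1e-8):
    """Spectral gradient method for min 0.5*||r(x)||^2. The gradient is
    g = J(x)' r(x); the spectral parameter is the BB ratio (s'y)/(s's),
    reset to 1 whenever s'y <= 0 so the direction remains descent."""
    x = np.asarray(x0, float)
    f = lambda v: 0.5 * np.dot(r(v), r(v))
    g = jac(x).T @ r(x)
    tau = 1.0                            # initial spectral parameter
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        d = -g / tau                     # spectral search direction
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d):
            alpha *= 0.5                 # Armijo backtracking
        x_new = x + alpha * d
        g_new = jac(x_new).T @ r(x_new)
        s, y = x_new - x, g_new - g
        tau = (s @ y) / (s @ s) if s @ y > 1e-12 else 1.0  # safeguard
        x, g = x_new, g_new
    return x
```

A small two-residual test system, e.g. r(x) = (x₀² + x₁² − 4, x₀ − x₁), drives the iterates to a zero residual from a nearby starting point.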
9. Spectral Gradient Method with Log-determinant Norm for Solving Non-Linear System of Equations.
- Author
-
Yeong Lin Koay, Hong Seng Sim, Yong Kheng Goh, and Sing Yee Chua
- Subjects
NONLINEAR equations ,NONLINEAR systems ,DETERMINANTS (Mathematics) - Abstract
Solving a system of non-linear equations is a challenging task for which various methods have been developed, most of them optimization-based. This paper modifies the spectral gradient method with the backtracking line search technique to solve non-linear systems. The efficiency of the modified spectral gradient method is tested by comparing the number of iterations, the number of function calls, and the computational time with some existing methods. As a result, the proposed method shows better performance and gives more stable results than some existing methods. Moreover, it can be useful in solving some non-linear application problems. Therefore, the proposed method can be considered an alternative for solving non-linear systems. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
10. Inexact variable metric method for convex-constrained optimization problems.
- Author
-
Gonçalves, Douglas S., Gonçalves, Max L. N., and Menezes, Tiago C.
- Subjects
CONVEX sets, CONVEX functions, CONSTRAINED optimization, SHIFT registers - Abstract
This paper is concerned with the inexact variable metric method for solving convex-constrained optimization problems. At each iteration of this method, the search direction is obtained by inexactly minimizing a strictly convex quadratic function over the closed convex feasible set. Here, we propose a new inexactness criterion for the search direction subproblems. Under mild assumptions, we prove that any accumulation point of the sequence generated by the new method is a stationary point of the problem under consideration. In order to illustrate the practical advantages of the new approach, we report some numerical experiments. In particular, we present an application where our concept of the inexact solutions is quite appealing. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
11. A Modified Spectral Gradient Projection Method for Solving Non-Linear Monotone Equations With Convex Constraints and Its Application
- Author
-
Li Zheng, Lei Yang, and Yong Liang
- Subjects
Non-linear equations ,non-smooth equation ,derivative free method ,spectral gradient method ,projection method ,signal reconstruction ,Electrical engineering. Electronics. Nuclear engineering ,TK1-9971 - Abstract
In this paper, we propose a derivative-free algorithm for solving non-linear monotone equations with convex constraints. The proposed algorithm combines the spectral gradient method with the projection method. We also modify the backtracking line search technique. The global convergence of the proposed method is guaranteed under mild conditions. Furthermore, numerical experiments show that large-scale non-linear equations with convex constraints can be solved effectively with our method. The L1-norm regularized problems in signal reconstruction are studied using our method.
- Published
- 2020
- Full Text
- View/download PDF
12. A note on the spectral gradient projection method for nonlinear monotone equations with applications.
- Author
-
Abubakar, Auwal Bala, Kumam, Poom, and Mohammad, Hassan
- Abstract
In this work, we provide a note on the spectral gradient projection method for solving nonlinear equations. Motivated by recent extensions of the spectral gradient method for solving nonlinear monotone equations with convex constraints, we note that choosing the search direction as a convex combination of two different positive spectral coefficients multiplied with the residual vector is more efficient and robust than the standard choice of spectral gradient coefficients combined with the projection strategy of Solodov and Svaiter (A globally convergent inexact Newton method for systems of monotone equations. In: Reformulation: Nonsmooth, Piecewise Smooth, Semismooth and Smoothing Methods, pp. 355–369. Springer, 1998). Under suitable assumptions, the convergence of the proposed method is established. Preliminary numerical experiments show that the method is promising. The proposed method is also used to recover sparse signals and restore blurred images arising in compressive sensing. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
13. A positive spectral gradient-like method for large-scale nonlinear monotone equations
- Author
-
Hassan Mohammad and Auwal Bala Abubakar
- Subjects
Non-linear equations ,monotone equations ,spectral gradient method ,projection method ,Mathematics ,QA1-939 - Abstract
In this work, we propose a combination of a positive spectral gradient-like method and the projection method for solving nonlinear monotone equations. The spectral gradient-like coefficient is obtained as a convex combination of two different positive spectral coefficients. Under the monotonicity and Lipschitz continuity assumptions, the method is shown to be globally convergent. We demonstrate the numerical efficiency of the method by comparing it with existing methods.
- Published
- 2017
14. A modified conjugate gradient method for monotone nonlinear equations with convex constraints.
- Author
-
Awwal, Aliyu Muhammed, Kumam, Poom, and Abubakar, Auwal Bala
- Subjects
CONJUGATE gradient methods, NONLINEAR equations - Abstract
In this paper, a modified Hestenes-Stiefel (HS) spectral conjugate gradient (CG) method for monotone nonlinear equations with convex constraints is proposed based on a projection technique. The method can be viewed as an extension of a modified HS-CG method for unconstrained optimization proposed by Amini et al. (Optimization Methods and Software, pp. 1–13, 2018). A new search direction is obtained by incorporating the idea of the spectral gradient parameter and some modification of the conjugate gradient parameter. The proposed method is derivative-free and requires low memory, which makes it suitable for large-scale monotone nonlinear equations. Global convergence of the method is established under suitable assumptions. Preliminary numerical comparisons with some existing methods are given to show the efficiency of our proposed method. Furthermore, the proposed method is successfully applied to solve sparse signal reconstruction in compressive sensing. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
15. A family of spectral gradient methods for optimization.
- Author
-
Dai, Yu-Hong, Huang, Yakui, and Liu, Xin-Wei
- Subjects
LEAST squares - Abstract
We propose a family of spectral gradient methods whose stepsize is determined by a convex combination of the long Barzilai–Borwein (BB) stepsize and the short BB stepsize. Each member of the family is shown to share a certain quasi-Newton property in the sense of least squares. The family also includes some other gradient methods as special cases. We prove that the family of methods is R-superlinearly convergent for two-dimensional strictly convex quadratics. Moreover, the family is R-linearly convergent in the general n-dimensional case. Numerical results of the family with different settings are presented, which demonstrate that the proposed family is promising. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
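The long and short BB stepsizes and their convex combination described in the abstract can be sketched as follows; the weight gamma and the conservative first stepsize are illustrative choices, not the paper's tuned family members:

```python
import numpy as np

def bb_stepsizes(s, y):
    """Long and short Barzilai-Borwein stepsizes from s = x_k - x_{k-1}
    and y = g_k - g_{k-1} (both formulas assume s'y > 0)."""
    long_bb = (s @ s) / (s @ y)          # alpha^LBB
    short_bb = (s @ y) / (y @ y)         # alpha^SBB
    return long_bb, short_bb

def family_gradient(grad, x0, gamma=0.5, iters=500, tol=1e-8):
    """Gradient method whose stepsize is the convex combination
    gamma*long + (1 - gamma)*short of the two BB stepsizes."""
    x = np.asarray(x0, float)
    g = grad(x)
    alpha = 1e-3                         # conservative first step (no s, y yet)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        if s @ y > 1e-12:                # curvature check before updating alpha
            long_bb, short_bb = bb_stepsizes(s, y)
            alpha = gamma * long_bb + (1 - gamma) * short_bb
        x, g = x_new, g_new
    return x
```

With gamma = 1 this reduces to the classical BB1 method and with gamma = 0 to BB2; intermediate weights interpolate between the two, which is the family the abstract analyzes.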
16. Structured Two-Point Stepsize Gradient Methods for Nonlinear Least Squares.
- Author
-
Mohammad, Hassan and Waziri, Mohammed Yusuf
- Subjects
CONJUGATE gradient methods, LEAST squares - Abstract
In this paper, we present two choices of structured spectral gradient methods for solving nonlinear least squares problems. In the proposed methods, the scalar-multiple-of-identity approximation of the Hessian inverse is obtained by imposing the structured quasi-Newton condition. Moreover, we propose a simple strategy for choosing the structured scalar in the case of a negative curvature direction. Using the nonmonotone line search with the quadratic interpolation backtracking technique, we prove that these proposed methods are globally convergent under suitable conditions. Numerical experiments show that the methods are competitive with some recently developed methods. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
17. A convex optimization approach for solving large scale linear systems
- Author
-
Debora Cores and Johanna Figueroa
- Subjects
Nonlinear convex optimization ,spectral gradient method ,large-scale linear systems ,Mathematics ,QA1-939 - Abstract
The well-known Conjugate Gradient (CG) method minimizes a strictly convex quadratic function for solving large-scale linear systems of equations when the coefficient matrix is symmetric and positive definite. In this work we present and analyze a non-quadratic convex function for solving any large-scale linear system of equations regardless of the characteristics of the coefficient matrix. For finding the global minimizers of this new convex function, any low-cost iterative optimization technique could be applied. In particular, we propose to use the low-cost, globally convergent Spectral Projected Gradient (SPG) method, which allows us to extend this optimization approach to consistent square and rectangular linear systems, as well as linear feasibility problems, with and without convex constraints and with and without preconditioning strategies. Our numerical results indicate that the new scheme outperforms state-of-the-art iterative techniques for solving linear systems when the symmetric part of the coefficient matrix is indefinite, and also for solving linear feasibility problems.
- Published
- 2016
18. A Two-Step Spectral Gradient Projection Method for System of Nonlinear Monotone Equations and Image Deblurring Problems
- Author
-
Aliyu Muhammed Awwal, Lin Wang, Poom Kumam, and Hassan Mohammad
- Subjects
spectral gradient method ,nonlinear monotone equations ,projection method ,line search ,image deblurring ,Mathematics ,QA1-939 - Abstract
In this paper, we propose a two-step iterative algorithm based on a projection technique for solving systems of monotone nonlinear equations with convex constraints. The proposed two-step algorithm uses two search directions which are defined using the well-known Barzilai and Borwein (BB) spectral parameters. The BB spectral parameters can be viewed as approximations of Jacobians by scalar multiples of identity matrices. If the Jacobians are close to symmetric matrices with clustered eigenvalues, then the BB parameters are expected to behave nicely. We present a new line search technique for generating the separating hyperplane projection step of Solodov and Svaiter (1998) that generalizes the one used in most of the existing literature. We establish the convergence result of the algorithm under some suitable assumptions. Preliminary numerical experiments demonstrate the efficiency and computational advantage of the algorithm over some existing algorithms designed for solving similar problems. Finally, we apply the proposed algorithm to solve an image deblurring problem.
- Published
- 2020
- Full Text
- View/download PDF
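The separating-hyperplane projection step of Solodov and Svaiter (1998) that several of these entries build on can be sketched as below. The residual search direction and the constants sigma and beta are illustrative; the papers above add spectral scaling, two-step directions, and constraint-set projections on top of this basic loop:

```python
import numpy as np

def hyperplane_projection_solve(F, x0, sigma=1e-4, beta=0.5, iters=500, tol=1e-8):
    """Derivative-free method for monotone F(x) = 0: take the residual
    direction, backtrack to a trial point z where F(z) separates x from
    the solution set, then project x onto that separating hyperplane."""
    x = np.asarray(x0, float)
    for _ in range(iters):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx                          # residual search direction
        alpha = 1.0
        # Solodov-Svaiter-type line search: require -F(z)'d >= sigma*alpha*||d||^2
        while -F(x + alpha * d) @ d < sigma * alpha * (d @ d):
            alpha *= beta
        z = x + alpha * d
        Fz = F(z)
        # project x onto the hyperplane {u : F(z)'(u - z) = 0}
        x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
    return x
```

By monotonicity, every solution lies on the far side of the hyperplane through z, so each projection moves the iterate closer to the solution set; this is what makes the scheme globally convergent without derivatives.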
19. Spectral Projected Gradient Methods
- Author
-
Birgin, Ernesto G., Martínez, J. M., Raydan, Marcos, Floudas, Christodoulos A., editor, and Pardalos, Panos M., editor
- Published
- 2009
- Full Text
- View/download PDF
20. Inexact variable metric method for convex-constrained optimization problems
- Author
-
Douglas Soares Gonçalves, Max L. N. Gonçalves, and Tiago C. Menezes
- Subjects
Mathematical optimization, Control and Optimization, Applied Mathematics, Spectral gradient method, Management Science and Operations Research, Numerical Analysis, Constrained optimization problem, Metric (mathematics), Approximate solution, Mathematics - Abstract
This paper is concerned with the inexact variable metric method for solving convex-constrained optimization problems. At each iteration of this method, the search direction is obtained by inexactly...
- Published
- 2021
21. On the Barzilai–Borwein gradient methods with structured secant equation for nonlinear least squares problems
- Author
-
Mahmoud Muhammad Yahaya, Hassan Mohammad, Poom Kumam, Aliyu Muhammed Awwal, and Lin Wang
- Subjects
Hessian matrix, Operations research, Control and Optimization, Applied Mathematics, Spectral gradient method, Numerical & computational mathematics, Non-linear least squares, Software, Mathematics - Abstract
We propose structured spectral gradient algorithms for solving nonlinear least squares problems based on a modified structured secant equation. The idea was to integrate more details of the Hessian...
- Published
- 2020
22. Alternating Direction Multiplier Method for Matrix l2,1-Norm Optimization in Multitask Feature Learning Problems
- Author
-
Yujie Wang, Yaping Hu, and Liying Liu
- Subjects
Optimization problem, Computer science, General Mathematics, Spectral gradient method, General Engineering, Feature selection, Regularization (mathematics), Matrix (mathematics), Multiplier method, Norm (mathematics), Feature learning, Algorithm - Abstract
The joint feature selection problem can be resolved by solving a matrix l2,1-norm minimization problem. One of the most attractive features of l2,1-norm regularization is that it allows similar sparsity structures to be shared by multiple predictors. However, the nonsmooth nature of the problem poses great challenges. In this paper, an alternating direction multiplier method combined with the spectral gradient method is proposed for solving the matrix l2,1-norm optimization problem involved in multitask feature learning. Numerical experiments show the effectiveness of the proposed algorithm.
- Published
- 2020
23. Newton Spectral Gradient Method of Unconstrained Optimization Problems
- Subjects
Spectral gradient method, Applied mathematics, Unconstrained optimization, Mathematics - Published
- 2020
24. Solving Nonlinear Systems of Equations Via Spectral Residual Methods: Stepsize Selection and Applications
- Author
-
Margherita Porcelli, Benedetta Morini, Cristina Sgattoni, and Enrico Meli
- Subjects
Numerical Analysis, Spectral gradient methods, Applied Mathematics, General Engineering, Approximate norm descent methods, Nonlinear systems of equations, Theoretical Computer Science, Computational Mathematics, Steplength selection, Computational Theory and Mathematics, Software, Mathematics - Abstract
Spectral residual methods are derivative-free, low-cost-per-iteration procedures for solving nonlinear systems of equations. They are generally coupled with a nonmonotone linesearch strategy and compare well with Newton-based methods for large nonlinear systems and sequences of nonlinear systems. The residual vector is used as the search direction, and the choice of steplength has a crucial impact on performance. In this work we address the steplength selection both theoretically and experimentally, and we provide results on a real application, a rolling contact problem.
- Published
- 2021
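A minimal spectral residual iteration of the kind this entry studies might look like the following. The simple norm-decrease backtracking stands in for the nonmonotone linesearch used in the literature, and the safeguarding interval for the spectral coefficient is an illustrative choice:

```python
import numpy as np

def spectral_residual(F, x0, iters=500, tol=1e-8):
    """Spectral residual iteration for F(x) = 0: the search direction is
    the residual itself and the steplength is the spectral coefficient
    sigma_k = (s's)/(s'y), clipped into a positive safeguard interval."""
    x = np.asarray(x0, float)
    Fx = F(x)
    sigma = 1.0
    for _ in range(iters):
        norm_F = np.linalg.norm(Fx)
        if norm_F < tol:
            break
        alpha = 1.0
        while True:                      # backtrack until the residual norm drops
            x_new = x - alpha * sigma * Fx
            F_new = F(x_new)
            if np.linalg.norm(F_new) < (1.0 - 1e-4 * alpha) * norm_F:
                break
            alpha *= 0.5
            if alpha < 1e-12:            # give up on decrease, keep the tiny step
                break
        s, y = x_new - x, F_new - Fx
        if abs(s @ y) > 1e-12:           # spectral update, forced positive by clip
            sigma = np.clip((s @ s) / (s @ y), 1e-10, 1e10)
        x, Fx = x_new, F_new
    return x
```

No Jacobian is ever formed: only residual evaluations are needed, which is the appeal of spectral residual methods for large systems.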
25. A family of spectral gradient methods for optimization
- Author
-
Yakui Huang, Xin-Wei Liu, and Yu-Hong Dai
- Subjects
021103 operations research ,Control and Optimization ,Property (philosophy) ,Applied Mathematics ,Spectral gradient method ,Mathematics::Optimization and Control ,0211 other engineering and technologies ,010103 numerical & computational mathematics ,02 engineering and technology ,Unconstrained optimization ,01 natural sciences ,Least squares ,90C20, 90C25, 90C30 ,Computational Mathematics ,Optimization and Control (math.OC) ,FOS: Mathematics ,Method of steepest descent ,Applied mathematics ,Convex combination ,0101 mathematics ,Convex function ,Mathematics - Optimization and Control ,Mathematics - Abstract
We propose a family of spectral gradient methods whose stepsize is determined by a convex combination of the long Barzilai–Borwein (BB) stepsize and the short BB stepsize. Each member of the family is shown to share a certain quasi-Newton property in the sense of least squares. The family also includes some other gradient methods as special cases. We prove that the family of methods is R-superlinearly convergent for two-dimensional strictly convex quadratics. Moreover, the family is R-linearly convergent in the general n-dimensional case. Numerical results of the family with different settings are presented, which demonstrate that the proposed family is promising.
- Published
- 2019
26. Spectral gradient method for impulse noise removal.
- Author
-
Liu, Jinkui and Li, Shengjie
- Abstract
In this paper, we propose a new spectral gradient method for removing impulse noise in the second phase of the two-phase method. An attractive property of the proposed method is that the search direction satisfies the sufficient descent property at each iteration, independently of any line search. Under an Armijo-type line search, the global convergence of the proposed method is established for general smooth functions. Preliminary numerical experiments are given to indicate the efficiency of the proposed method for impulse noise removal. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
27. Two spectral gradient projection methods for constrained equations and their linear convergence rate.
- Author
-
Liu, Jing and Duan, Yongrui
- Subjects
LINEAR equations, STOCHASTIC convergence, MATHEMATICAL optimization, CONJUGATE gradient methods, MATHEMATICAL mappings, LIPSCHITZ spaces, NUMERICAL analysis - Abstract
Due to its simplicity and numerical efficiency for unconstrained optimization problems, the spectral gradient method has received more and more attention in recent years. In this paper, two spectral gradient projection methods for constrained equations are proposed, which combine the well-known spectral gradient method with the hyperplane projection method. The new methods are not only derivative-free but also completely matrix-free, and consequently they can be applied to solve large-scale constrained equations. Under the condition that the underlying mapping of the constrained equations is Lipschitz continuous or strongly monotone, we establish the global convergence of the new methods. Compared with existing gradient methods for solving such problems, the new methods possess a linear convergence rate under some error bound conditions. Furthermore, a relaxation factor γ is included in the update step to accelerate convergence. Preliminary numerical results show that they are efficient and promising in practice. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
28. A Two-Step Spectral Gradient Projection Method for System of Nonlinear Monotone Equations and Image Deblurring Problems
- Author
-
Lin Wang, Aliyu Muhammed Awwal, Hassan Mohammad, and Poom Kumam
- Subjects
Deblurring, Physics and Astronomy (miscellaneous), Computer science, Iterative method, General Mathematics, Spectral gradient method, Computer Science (miscellaneous), Projection method, Applied mathematics, Nonlinear monotone equations, Eigenvalues and eigenvectors, Line search, Image deblurring, Nonlinear system, Mathematics - Abstract
In this paper, we propose a two-step iterative algorithm based on a projection technique for solving systems of monotone nonlinear equations with convex constraints. The proposed two-step algorithm uses two search directions which are defined using the well-known Barzilai and Borwein (BB) spectral parameters. The BB spectral parameters can be viewed as approximations of Jacobians by scalar multiples of identity matrices. If the Jacobians are close to symmetric matrices with clustered eigenvalues, then the BB parameters are expected to behave nicely. We present a new line search technique for generating the separating hyperplane projection step of Solodov and Svaiter (1998) that generalizes the one used in most of the existing literature. We establish the convergence result of the algorithm under some suitable assumptions. Preliminary numerical experiments demonstrate the efficiency and computational advantage of the algorithm over some existing algorithms designed for solving similar problems. Finally, we apply the proposed algorithm to solve an image deblurring problem.
- Published
- 2020
- Full Text
- View/download PDF
29. Multi-step spectral gradient methods with modified weak secant relation for large scale unconstrained optimization
- Author
-
Chuei Yee Chen, Siti Nur Iqmal Ibrahim, Wah June Leong, and Hong Seng Sim
- Subjects
Control and Optimization, Algebra and Number Theory, Applied Mathematics, Norm (mathematics), Spectral gradient method, Numerical analysis, Unconstrained optimization, Mathematics - Abstract
In this paper, we propose some spectral gradient methods via a variational technique under the log-determinant norm. The spectral parameters satisfy modified weak secant relations, inspired by the multistep approximation, for solving large-scale unconstrained optimization problems. An executable code is developed to test the efficiency of the proposed method against the spectral gradient method that uses the standard weak secant relation as a constraint. Numerical results are presented which suggest that better performance has been achieved.
- Published
- 2018
30. A SEQUENTIAL LINEAR CONSTRAINT PROGRAMMING ALGORITHM FOR NLP.
- Author
-
FLETCHER, ROGER
- Subjects
- *
LINEAR programming , *CONSTRAINT programming , *HESSIAN matrices , *COMPUTER software , *DERIVATIVES (Mathematics) - Abstract
A new method for nonlinear programming (NLP) using sequential linear constraint programming (SLCP) is described. Linear constraint programming (LCP) subproblems are solved by a new code using a recently developed spectral gradient method for minimization. The method requires only first derivatives and avoids having to store and update approximate Hessian or reduced Hessian matrices. Globalization is provided by a trust region filter scheme. Open-source, production-quality software is available. Results on a large selection of CUTEr test problems are presented and discussed; they show that the method is reliable and reasonably efficient. [ABSTRACT FROM AUTHOR]
- Published
- 2012
- Full Text
- View/download PDF
31. Non-smooth equations based method for ℓ1-norm problems with applications to compressed sensing
- Author
-
Xiao, Yunhai, Wang, Qiuyu, and Hu, Qingjie
- Subjects
- *
INVERSE problems , *NONSMOOTH optimization , *STOCHASTIC convergence , *LEAST squares , *JACOBIAN matrices , *ALGORITHMS , *NUMERICAL analysis , *MATRICES (Mathematics) - Abstract
Abstract: In this paper, we propose, analyze, and test a new method for solving ℓ1-norm regularization problems arising from sparse solution recovery in compressed sensing, basis-pursuit problems, and linear inverse problems. The method aims to minimize a non-smooth objective consisting of a least-squares data fitting term and an ℓ1-norm regularization term. The problem is first formulated as a convex quadratic program, and then as an equivalent non-smooth equation. At each iteration, a spectral gradient method is applied to the resulting problem without requiring Jacobian matrix information. Convergence of the proposed method follows directly from existing results. The algorithm is easy to implement; only matrix–vector products are required at each step. Numerical experiments on decoding a sparse signal arising in compressed sensing and on image deconvolution are performed. The numerical results illustrate that the proposed method is practical and promising. [Copyright Elsevier]
- Published
- 2011
- Full Text
- View/download PDF
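The ℓ1-regularized least-squares objective in this record can be minimized with a proximal (soft-thresholding) gradient iteration; the sketch below pairs that iteration with a BB spectral stepsize. This is a hedged, generic illustration of the problem class, not the authors' non-smooth-equation reformulation:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_ls_spectral(A, b, lam, iters=200):
    """Sketch: minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal
    gradient with a Barzilai-Borwein (spectral) stepsize."""
    x = np.zeros(A.shape[1])
    alpha = 1.0
    x_old = g_old = None
    for _ in range(iters):
        g = A.T @ (A @ x - b)          # gradient of the smooth part
        if x_old is not None:
            s, y = x - x_old, g - g_old
            if s @ y > 1e-12:          # keep the stepsize positive
                alpha = (s @ s) / (s @ y)
        x_old, g_old = x, g
        x = soft_threshold(x - alpha * g, alpha * lam)
    return x
```

For example, with `A` the identity the minimizer is exactly the soft-thresholded data, `soft_threshold(b, lam)`, which the iteration reaches in one step.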
32. Modified active set projected spectral gradient method for bound constrained optimization
- Author
-
Xiao, Yun-Hai, Hu, Qing-Jie, and Wei, Zengxin
- Subjects
- *
CONSTRAINED optimization , *STOCHASTIC convergence , *NUMERICAL analysis , *SET theory , *MATHEMATICAL optimization , *EXPERIMENTS - Abstract
Abstract: In this paper, by means of an active set strategy, we present a projected spectral gradient algorithm for solving large-scale bound constrained optimization problems. A nice property of the active set estimation technique is that it can identify the active set at the optimal point without requiring the strict complementarity condition, which makes it potentially useful for degenerate optimization problems. Under appropriate conditions, we show that the proposed method is globally convergent. We also report numerical experiments on bound constrained problems from the CUTEr library. The numerical comparisons with SPG, TRON, and L-BFGS-B show that the proposed method is effective and promising. [Copyright Elsevier]
- Published
- 2011
- Full Text
- View/download PDF
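The box projection and an active-set estimate of the kind described above can be sketched componentwise. The estimator below is a hedged guess at the flavor of such a technique; the paper's precise estimator may differ:

```python
import numpy as np

def project_box(x, lo, hi):
    """Projection onto the box {lo <= x <= hi}."""
    return np.minimum(np.maximum(x, lo), hi)

def estimate_active(x, g, lo, hi, eps=1e-8):
    """Illustrative active-set estimate: a bound is flagged active when
    x sits on it and the gradient pushes outward (no strict
    complementarity needed for the flag itself)."""
    at_lower = (x - lo <= eps) & (g > 0)
    at_upper = (hi - x <= eps) & (g < 0)
    return at_lower | at_upper

x = project_box(np.array([-2.0, 0.5, 3.0]), 0.0, 1.0)  # -> [0, 0.5, 1]
```

A projected spectral gradient step would then restrict the spectral direction to the components not flagged active.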
33. Notes on the Dai–Yuan–Yuan modified spectral gradient method
- Author
-
Xiao, Yunhai, Wang, Qiuyu, and Wang, Dong
- Subjects
- *
SPECTRAL theory , *MATHEMATICAL formulas , *GAUSS-Newton method , *STOCHASTIC convergence , *NUMERICAL analysis - Abstract
Abstract: In this paper, we give some notes on the two modified spectral gradient methods developed by Dai, Yuan, and Yuan. These notes present the relationship between their stepsize formulae and some new secant equations in the quasi-Newton method. In particular, we also introduce two further new choices of stepsize. By using an efficient nonmonotone line search technique, we propose some new spectral gradient methods. Under mild conditions, we show that these proposed methods are globally convergent. Numerical experiments on a large number of test problems from the CUTEr library are also reported, which show the efficiency of the proposed methods. [Copyright Elsevier]
- Published
- 2010
- Full Text
- View/download PDF
34. Residual algorithm for large-scale positive definite generalized eigenvalue problems.
- Author
-
Bello, Lenys, La Cruz, William, and Raydan, Marcos
- Subjects
EIGENVALUES ,NONLINEAR systems ,SYSTEMS theory ,MATRICES (Mathematics) ,NUMERICAL analysis - Abstract
In the positive definite case, the extreme generalized eigenvalues can be obtained by solving a suitable nonlinear system of equations. In this work, we adapt and study the use of recently developed low-cost derivative-free residual schemes for nonlinear systems to solve large-scale generalized eigenvalue problems. We demonstrate the effectiveness of our approach on some standard test problems, and also on a problem associated with the vibration analysis of large structures. In our numerical results we use preconditioning strategies based on incomplete factorizations, and we compare, with and without preconditioning, against a well-known available package. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
35. Spectral gradient projection method for monotone nonlinear equations with convex constraints
- Author
-
Yu, Zhensheng, Lin, Ji, Sun, Jing, Xiao, Yunhai, Liu, Liying, and Li, Zhanhui
- Subjects
- *
SPECTRAL theory , *CONJUGATE gradient methods , *MONOTONIC functions , *NONLINEAR differential equations , *CONVEX functions , *ITERATIVE methods (Mathematics) , *STOCHASTIC convergence , *NUMERICAL analysis - Abstract
Abstract: A spectral gradient projection algorithm for monotone nonlinear equations with convex constraints is proposed. This new procedure is obtained by combining a modified spectral gradient method and a projection method. A relevant property of the presented method is that the computation of the iteration sequence does not require the solution of any subproblem. The algorithm is shown to be globally convergent under mild assumptions, and it can also be applied to nonsmooth equations. Preliminary numerical tests show that the proposed method works quite well even for large-scale problems. [Copyright Elsevier]
- Published
- 2009
- Full Text
- View/download PDF
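The projection step common to this family of methods uses F at a trial point to build a hyperplane separating the current iterate from the solution set. A bare-bones sketch in that spirit, with the spectral parameter fixed at 1 and an identity feasible-set projection for an unconstrained demo (both simplifications are mine, not the paper's):

```python
import numpy as np

def hyperplane_projection_method(F, x, iters=200, sigma=1e-4):
    """Sketch of a projection method for monotone F(x) = 0 in the spirit
    of Solodov and Svaiter: a backtracking line search finds z such that
    F(z) defines a separating hyperplane, and x is projected onto it."""
    for _ in range(iters):
        Fx = F(x)
        if np.linalg.norm(Fx) < 1e-10:
            break
        d = -Fx        # spectral parameter taken as 1 for simplicity
        t = 1.0
        # backtrack until -F(x + t d)^T d >= sigma * t * ||d||^2
        while -(F(x + t * d) @ d) < sigma * t * (d @ d):
            t *= 0.5
        z = x + t * d
        Fz = F(z)
        if Fz @ Fz == 0.0:
            return z   # z already solves the system
        # project x onto the hyperplane {u : F(z)^T (u - z) = 0}
        x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
    return x

# Demo on the monotone map F(x) = x, whose unique zero is the origin:
sol = hyperplane_projection_method(lambda v: v, np.array([1.0, 2.0]))
```

No subproblem is solved anywhere in the loop, matching the "no subproblem" property highlighted in the abstract.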
36. The spectral gradient method for unconstrained optimal control problems.
- Author
-
ARDENGHI, J. I., GIBELLI, T. I., and MACIEL, M. C.
- Subjects
MATHEMATICAL optimization ,NEWTON-Raphson method ,ALGORITHMS ,STOCHASTIC convergence ,NUMERICAL analysis - Abstract
Optimal control problems and their discretized forms can be viewed as optimization problems. Kelley and Sachs have already solved the discretized problem using quasi-Newton methods. In this contribution, the problem is solved by a low-cost algorithm, the spectral gradient method, which is suitable for large-scale problems. The convergence behaviour of the method applied to the finite-dimensional approximation is analysed. Numerical examples are given, and the reported results show the good performance of the algorithm when it is applied to large optimal control problems. [ABSTRACT FROM PUBLISHER]
- Published
- 2009
- Full Text
- View/download PDF
37. Nonsmooth spectral gradient methods for unconstrained optimization
- Author
-
Loreto, Milagros, Aponte, Hugo, Cores, Debora, and Raydan, Marcos
- Published
- 2017
- Full Text
- View/download PDF
38. A scaled nonlinear conjugate gradient algorithm for unconstrained optimization.
- Author
-
Andrei, Neculai
- Subjects
- *
MATRIX derivatives , *CONJUGATE gradient methods , *MATHEMATICAL optimization , *NONLINEAR difference equations , *NEWTON-Raphson method , *CONVEX functions , *REAL variables - Abstract
The best spectral conjugate gradient algorithm of Birgin and Martínez (2001, A spectral conjugate gradient method for unconstrained optimization. Applied Mathematics and Optimization, 43, 117-128), which is mainly a scaled variant of Perry (1977, A class of conjugate gradient algorithms with a two-step variable metric memory. Discussion Paper 269, Center for Mathematical Studies in Economics and Management Science, Northwestern University), is modified in such a way as to overcome the lack of positive definiteness of the matrix defining the search direction. This modification is based on the quasi-Newton BFGS updating formula. The computational scheme is embedded into the restart philosophy of Beale-Powell. The parameter scaling the gradient is selected as the spectral gradient or in an anticipative way by means of a formula using the function values at two successive points. Under very mild conditions it is shown that, for strongly convex functions, the algorithm is globally convergent. Computational results and performance profiles for a set consisting of 700 unconstrained optimization problems show that this new scaled nonlinear conjugate gradient algorithm substantially outperforms known conjugate gradient methods, including the spectral conjugate gradient SCG of Birgin and Martínez, the scaled Fletcher-Reeves and Polak-Ribière algorithms, and CONMIN of Shanno and Phua (1976, Algorithm 500: minimization of unconstrained multivariate functions. ACM Transactions on Mathematical Software, 2, 87-94). [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
39. Residual iterative schemes for large-scale nonsymmetric positive definite linear systems.
- Author
-
LA CRUZ, WILLIAM and RAYDAN, MARCOS
- Subjects
LINEAR systems ,ITERATIVE methods (Mathematics) ,NUMERICAL analysis ,FUNCTIONAL analysis ,NONLINEAR theories - Abstract
A new iterative scheme that uses the residual vector as search direction is proposed and analyzed for solving large-scale nonsymmetric linear systems whose matrix has a positive (or negative) definite symmetric part. It is closely related to Richardson's method, although the stepsize and some other new features are inspired by the success of recently proposed residual methods for nonlinear systems. Numerical experiments are included to show that, without preconditioning, the proposed scheme outperforms some recently proposed variations on Richardson's method, and competes with well-known and well-established Krylov subspace methods: GMRES and BiCGSTAB. Our computational experiments also show that, in the presence of suitable preconditioning strategies, residual iterative methods can be competitive, and sometimes advantageous, when compared with Krylov subspace methods. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
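A residual-direction iteration of the kind discussed in this record can be sketched as a Richardson-type scheme. The minimal-residual stepsize below is one natural choice for the step along the residual, not necessarily the authors' spectral stepsize:

```python
import numpy as np

def residual_iteration(A, b, x, iters=1000, tol=1e-10):
    """Richardson-type sketch using the residual r = b - Ax as search
    direction, with the step beta minimizing ||b - A(x + beta*r)||.
    Converges when the symmetric part of A is positive definite."""
    for _ in range(iters):
        r = b - A @ x
        if np.linalg.norm(r) < tol:
            break
        Ar = A @ r
        beta = (r @ Ar) / (Ar @ Ar)   # minimal-residual steplength
        x = x + beta * r
    return x

# Nonsymmetric matrix whose symmetric part is positive definite:
A = np.array([[3.0, 1.0], [0.0, 2.0]])
b = np.array([1.0, 1.0])
x = residual_iteration(A, b, np.zeros(2))
```

Only matrix-vector products with A appear, which is what makes such schemes attractive for large sparse systems.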
40. Scaled conjugate gradient algorithms for unconstrained optimization.
- Author
-
Andrei, Neculai
- Subjects
CONJUGATE gradient methods ,ALGORITHMS ,SPECTRUM analysis ,MATRICES (Mathematics) ,MATHEMATICAL functions ,COMPUTATIONAL complexity - Abstract
In this work we present and analyze a new scaled conjugate gradient algorithm and its implementation, based on an interpretation of the secant equation and on the inexact Wolfe line search conditions. The best spectral conjugate gradient algorithm SCG of Birgin and Martínez (2001), which is mainly a scaled variant of Perry's (1977), is modified in such a manner as to overcome the lack of positive definiteness of the matrix defining the search direction. This modification is based on the quasi-Newton BFGS updating formula. The computational scheme is embedded in the restart philosophy of Beale-Powell. The parameter scaling the gradient is selected as the spectral gradient or in an anticipative manner by means of a formula using the function values at two successive points. Under very mild conditions it is shown that, for strongly convex functions, the algorithm is globally convergent. Preliminary computational results, for a set consisting of 500 unconstrained optimization test problems, show that this new scaled conjugate gradient algorithm substantially outperforms the spectral conjugate gradient SCG algorithm. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
41. Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization.
- Author
-
Andrei, Neculai
- Subjects
- *
ALGORITHMS , *APPROXIMATION theory , *MATHEMATICAL functions , *MATHEMATICAL optimization , *MATHEMATICS , *MATHEMATICAL programming - Abstract
A scaled memoryless BFGS preconditioned conjugate gradient algorithm for solving unconstrained optimization problems is presented. The basic idea is to combine the scaled memoryless BFGS method and the preconditioning technique in the frame of the conjugate gradient method. The preconditioner, which is also a scaled memoryless BFGS matrix, is reset when the Beale-Powell restart criterion holds. The parameter scaling the gradient is selected as the spectral gradient. In very mild conditions, it is shown that, for strongly convex functions, the algorithm is globally convergent. Computational results for a set consisting of 750 unconstrained optimization test problems show that this new scaled conjugate gradient algorithm substantially outperforms the known conjugate gradient methods including the spectral conjugate gradient by Birgin and Martínez [Birgin, E. and Martínez, J.M., 2001, A spectral conjugate gradient method for unconstrained optimization. Applied Mathematics and Optimization, 43, 117-128], the conjugate gradient by Polak and Ribière [Polak, E. and Ribière, G., 1969, Note sur la convergence de méthodes de directions conjuguées. Revue Française d'Informatique et de Recherche Opérationnelle, 16, 35-43], as well as the most recent conjugate gradient method with guaranteed descent by Hager and Zhang [Hager, W.W. and Zhang, H., 2005, A new conjugate gradient method with guaranteed descent and an efficient line search. SIAM Journal on Optimization, 16, 170-192; Hager, W.W. and Zhang, H., 2004, CG-DESCENT, a conjugate gradient method with guaranteed descent. ACM Transactions on Mathematical Software, 32, 113-137]. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
42. Spectral gradient projection method for solving nonlinear monotone equations
- Author
-
Zhang, Li and Zhou, Weijun
- Subjects
- *
ALGEBRA , *EQUATIONS , *ALGORITHMS , *MATHEMATICS - Abstract
Abstract: An algorithm for solving nonlinear monotone equations is proposed, which combines a modified spectral gradient method and a projection method. The method is shown to be globally convergent to a solution of the system if the nonlinear equations to be solved are monotone and Lipschitz continuous. An attractive property of the proposed method is that it can be applied to solving nonsmooth equations. We also give some preliminary numerical results to show the efficiency of the proposed method. [Copyright Elsevier]
- Published
- 2006
- Full Text
- View/download PDF
43. Optimal control of viscous Burgers equation via an adaptive nonmonotone Barzilai–Borwein gradient method
- Author
-
Omid Solaymani Fard, Hadi Nosratipour, and Akbar Hashemi Borzabadi
- Subjects
Applied Mathematics ,Spectral gradient method ,Mathematical analysis ,010103 numerical & computational mathematics ,Optimal control ,01 natural sciences ,Implicit function theorem ,Computer Science Applications ,Burgers' equation ,010101 applied mathematics ,Computational Theory and Mathematics ,0101 mathematics ,Gradient method ,Mathematics - Abstract
An adaptive nonmonotone spectral gradient method for the solution of the distributed optimal control problem (OCP) for the viscous Burgers equation is presented in a black-box framework. Regarding the ...
- Published
- 2017
44. Spectral Gradient Methods for Linearly Constrained Optimization.
- Author
-
Martínez, J. M., Pilotta, E. A., and Raydan, M.
- Subjects
- *
MATHEMATICAL optimization , *MATHEMATICAL analysis , *MATHEMATICS , *NEWTON-Raphson method , *ITERATIVE methods (Mathematics) , *EXPONENTIAL functions , *CONJUGATE gradient methods - Abstract
Linearly constrained optimization problems with simple bounds are considered in the present work. First, a preconditioned spectral gradient method is defined for the case in which no simple bounds are present. This algorithm can be viewed as a quasi-Newton method in which the approximate Hessians satisfy a weak secant equation. The spectral choice of steplength is embedded into the Hessian approximation, and the whole process is combined with a nonmonotone line search strategy. The simple bounds are then taken into account by placing them in an exponential penalty scheme that defines the outer iterations of the process. Each outer iteration involves the application of the previously defined preconditioned spectral gradient method to a linear equality constrained problem. Therefore, an equality constrained convex quadratic programming problem needs to be solved at every inner iteration. The associated extended KKT matrix remains constant unless the process is reinitiated. In ordinary inner iterations, only the right-hand side of the KKT system changes. Therefore, suitable sparse factorization techniques can be applied and exploited effectively. Encouraging numerical experiments are presented. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
45. PRECONDITIONED SPECTRAL PROJECTED GRADIENT METHOD ON CONVEX SETS.
- Author
-
Bello, Lenys and Raydan, Marcos
- Subjects
- *
CONVEX sets , *MATHEMATICAL optimization , *CONJUGATE gradient methods , *SPECTRUM analysis , *SET theory , *STOCHASTIC convergence - Abstract
The spectral gradient method has proved to be effective for solving large-scale unconstrained optimization problems. It has been recently extended and combined with the projected gradient method for solving optimization problems on convex sets. This combination includes the use of nonmonotone line search techniques to preserve the fast local convergence. In this work we further extend the spectral choice of steplength to accept preconditioned directions when a good preconditioner is available. We present an algorithm that combines the spectral projected gradient method with preconditioning strategies to increase the local speed of convergence while keeping the global properties. We discuss implementation details for solving large-scale problems. [ABSTRACT FROM AUTHOR]
- Published
- 2005
46. NONMONOTONE SPECTRAL METHODS FOR LARGE-SCALE NONLINEAR SYSTEMS.
- Author
-
LA CRUZ, WILLIAM and RAYDAN, MARCOS
- Subjects
- *
SPECTRAL theory , *NONLINEAR systems , *EQUATIONS , *STOCHASTIC convergence , *MATHEMATICAL optimization - Abstract
The spectral gradient method has proved to be effective for solving large-scale optimization problems. In this work we extend the spectral approach to solve nonlinear systems of equations. We consider a strategy based on nonmonotone line search techniques to guarantee global convergence, and discuss implementation details for solving large-scale problems. We compare the performance of our new method with recent implementations of inexact Newton schemes based on Krylov subspace inner iterative methods for the linear systems. Our numerical experiments indicate that the spectral approach for solving nonlinear systems competes favorably with well-established numerical methods. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
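The spectral approach for nonlinear systems uses the residual itself as the search direction, with a spectral steplength updated from consecutive residuals. A bare-bones sketch of that idea, with the nonmonotone line search safeguard deliberately omitted (a practical solver needs it for global convergence):

```python
import numpy as np

def spectral_residual(F, x, iters=500, tol=1e-10):
    """Bare-bones spectral residual iteration x+ = x - sigma * F(x),
    with sigma updated by the Barzilai-Borwein formula. Sketch only:
    the globalizing nonmonotone line search is omitted."""
    Fx = F(x)
    sigma = 1.0
    for _ in range(iters):
        x_new = x - sigma * Fx
        F_new = F(x_new)
        s, y = x_new - x, F_new - Fx
        if abs(s @ y) > 1e-12:
            sigma = (s @ s) / (s @ y)   # spectral steplength
        x, Fx = x_new, F_new
        if np.linalg.norm(Fx) < tol:
            break
    return x

# Demo on the linear system F(x) = 3x - b, with solution b/3:
b = np.array([1.0, 2.0, 3.0])
x = spectral_residual(lambda v: 3.0 * v - b, np.zeros(3))
```

Like the inexact Newton schemes it is compared against, the iteration needs only evaluations of F, never its Jacobian.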
47. Preconditioned Spectral Gradient Method.
- Author
-
Luengo, F., Raydan, M., Glunt, W., and Hayden, T.L.
- Abstract
The spectral gradient method is a nonmonotone gradient method for large-scale unconstrained minimization. We strengthen the algorithm by modifications which globalize the method, and present strategies to apply preconditioning techniques. The modified algorithm replaces a condition of uniform positive definiteness of the preconditioning matrices with mild conditions on the search directions. The result is a robust algorithm which is effective on very large problems. Encouraging numerical experiments are presented for a variety of standard test problems, for solving nonlinear Poisson-type equations, and also for finding molecular conformations by distance geometry. [ABSTRACT FROM AUTHOR]
- Published
- 2002
- Full Text
- View/download PDF
48. Spectral Variants of Krylov Subspace Methods.
- Author
-
Molina, Brígida and Raydan, Marcos
- Abstract
Krylov iterative methods usually solve an optimization problem, per iteration, to obtain a vector whose components are the step lengths associated with the previous search directions. This vector can be viewed as the solution of a multiparameter optimization problem. In that sense, Krylov methods can be combined with the spectral choice of step length that has recently been developed to accelerate descent methods in optimization. In this work, we discuss different spectral variants of Krylov methods and present encouraging preliminary numerical experiments, with and without preconditioning. [ABSTRACT FROM AUTHOR]
- Published
- 2002
- Full Text
- View/download PDF
49. Exact Spectral-Like Gradient Method for Distributed Optimization
- Author
-
Nataša Krejić, Natasa Krklec Jerinkic, and Dusan Jakovetic
- Subjects
Mathematical optimization ,021103 operations research ,Control and Optimization ,Optimization problem ,Scale (ratio) ,Applied Mathematics ,Spectral gradient method ,0211 other engineering and technologies ,Regular polygon ,Context (language use) ,010103 numerical & computational mathematics ,02 engineering and technology ,01 natural sciences ,Computational Mathematics ,Optimization and Control (math.OC) ,Convergence (routing) ,FOS: Mathematics ,90C25, 90C53, 65K05 ,0101 mathematics ,Gradient method ,Mathematics - Optimization and Control ,Mathematics - Abstract
Since the initial proposal in the late 1980s, spectral gradient methods continue to receive significant attention, especially due to their excellent numerical performance on various large-scale applications. However, to date, they have not been sufficiently explored in the context of distributed optimization. In this paper, we consider unconstrained distributed optimization problems in which n nodes constitute an arbitrary connected network and collaboratively minimize the sum of their local convex cost functions. In this setting, building on existing exact distributed gradient methods, we propose a novel exact distributed gradient method wherein the nodes' step-sizes are designed according to rules akin to those in spectral gradient methods. We refer to the proposed method as the Distributed Spectral Gradient method. The method exhibits R-linear convergence under standard assumptions on the nodes' local costs and safeguarding of the algorithm step-sizes. We illustrate the method's performance through simulation examples.
- Published
- 2019
50. New hybrid algorithm based on nonmonotone spectral gradient and simultaneous perturbation
- Author
-
Habbal Abderrahmane, Tabbakh Zineb, and Ellaia Rachid (Laboratoire d'Etudes et Recherche en Mathématiques Appliquées (LERMA), Ecole Mohammadia d'Ingénieurs (EMI); Analysis and Control of Unsteady Models for Engineering Sciences (ACUMES), Inria Sophia Antipolis - Méditerranée; Laboratoire Jean Alexandre Dieudonné (LJAD), Université Côte d'Azur)
- Subjects
Numerical Analysis ,Simultaneous perturbation stochastic approximation ,Line search ,Nonmonotone line search ,Spectral gradient method ,Applied Mathematics ,Modeling and Simulation ,Perturbation (astronomy) ,Applied mathematics ,[MATH.MATH-OC]Mathematics [math]/Optimization and Control [math.OC] ,Mathematics - Abstract
In this paper, we introduce a new hybrid method called nonmonotone spectral gradient and simultaneous perturbation (NSGSP). It combines the advantages of the nonmonotone spectral gradient (NSG) and simultaneous perturbation (SP) methods. The main idea of our approach is to use the simultaneous perturbation (SP) method to obtain an inexpensive estimate of the gradient, and to exploit the good properties of the nonmonotone spectral gradient (NSG) method to compute an efficient line search. Several numerical experiments are provided. The results indicate that the new method is effective and outperforms most other popular methods.
- Published
- 2019
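The simultaneous perturbation idea above estimates the entire gradient from just two function evaluations, regardless of dimension. A minimal sketch of that estimator alone (the NSGSP combination with the nonmonotone spectral line search involves more machinery):

```python
import numpy as np

def spsa_gradient(f, x, c=1e-3, seed=0):
    """Simultaneous perturbation gradient estimate: perturb every
    coordinate at once with a random +/-1 vector and use only two
    evaluations of f. Illustrative sketch."""
    rng = np.random.default_rng(seed)
    delta = rng.choice([-1.0, 1.0], size=x.shape)
    # central difference along delta, rescaled componentwise by 1/delta
    return (f(x + c * delta) - f(x - c * delta)) / (2.0 * c) * (1.0 / delta)

# For f(v) = v.v the true gradient at [1, 0] is [2, 0]; each SPSA draw
# recovers the first component exactly for this quadratic:
g = spsa_gradient(lambda v: v @ v, np.array([1.0, 0.0]))
```

Each component of the estimate is unbiased only in expectation over the random perturbation, which is precisely why pairing it with a robust nonmonotone line search is attractive.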