12 results for "Karas, Elizabeth"
Search Results
2. AN OPTIMAL ALGORITHM FOR CONSTRAINED DIFFERENTIABLE CONVEX OPTIMIZATION.
- Author
- GONZAGA, CLÓVIS C., KARAS, ELIZABETH W., and ROSSETTO, DIANE R.
- Subjects
- ALGORITHMS; MATHEMATICAL optimization; SET theory; ITERATIVE methods (Mathematics); CONVEX domains
- Abstract
We describe two algorithms for solving differentiable convex optimization problems constrained to simple sets in ℝⁿ, i.e., sets on which it is easy to project an arbitrary point. The algorithms are optimal in the sense that they achieve an absolute precision of ε in relation to the optimal value in O(1/√ε) iterations using only first-order information. This complexity depends on an (unknown) Lipschitz constant L* for the function derivatives and on a (known) strong convexity constant μ* ≥ 0. The algorithms are extensions of well-known methods devised by Nesterov [Introductory Lectures on Convex Optimization, Kluwer Academic, Boston, 2004], without the need for estimates of L* and including (in the second algorithm) line searches and an adaptive procedure for estimating a strong convexity constant. The algorithms are such that all computed points are feasible, and the complexity analysis follows a simple geometric approach. Numerical tests for box-constrained quadratic problems are presented. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
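The abstract above concerns accelerated first-order methods over sets that are easy to project onto. As a minimal sketch of that idea (a generic Nesterov-style accelerated projected-gradient iteration with a known Lipschitz constant L, not the paper's parameter-free algorithm; all names are illustrative):

```python
def project_box(x, lo, hi):
    """Projection onto the box [lo, hi] -- a typical 'simple set'."""
    return [min(max(xi, l), h) for xi, l, h in zip(x, lo, hi)]

def accel_proj_grad(grad, L, x0, lo, hi, iters=200):
    """Accelerated projected gradient for min f(x) s.t. lo <= x <= hi.
    grad: gradient of f; L: a Lipschitz constant for grad (assumed known
    here, unlike in the paper); every iterate stays feasible."""
    x = list(x0)
    y = list(x0)
    t = 1.0
    for _ in range(iters):
        g = grad(y)
        # gradient step from the extrapolated point, then project
        x_new = project_box([yi - gi / L for yi, gi in zip(y, g)], lo, hi)
        # standard momentum-parameter update
        t_new = 0.5 * (1.0 + (1.0 + 4.0 * t * t) ** 0.5)
        y = [xn + ((t - 1.0) / t_new) * (xn - xo)
             for xn, xo in zip(x_new, x)]
        x, t = x_new, t_new
    return x
```

For a separable quadratic over a box (as in the paper's numerical tests), the projected minimizer is reached by clamping each coordinate, and the iteration converges to it quickly.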
3. Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming.
- Author
- Gonzaga, Clóvis and Karas, Elizabeth
- Subjects
- ALGORITHMS; CONVEX programming; LIPSCHITZ spaces; MATHEMATICAL optimization; MATHEMATICAL programming; STOCHASTIC convergence; MATHEMATICAL analysis
- Abstract
We modify the first-order algorithm for convex programming described by Nesterov in his book Introductory Lectures on Convex Optimization: A Basic Course (Kluwer, Boston). In his algorithms, Nesterov makes explicit use of a Lipschitz constant L for the function gradient, which is either assumed known or estimated by an adaptive procedure. We eliminate the use of L at the cost of an extra imprecise line search, and obtain an algorithm which keeps the optimal complexity properties and also inherits the global convergence properties of the steepest descent method for general continuously differentiable optimization. Besides this, we develop an adaptive procedure for estimating a strong convexity constant for the function. Numerical tests on a limited set of toy problems show an improvement in performance when compared with Nesterov's original algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
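The key move in the abstract above is replacing a known Lipschitz constant L by a line search. One common way to do this (a generic backtracking rule sketched here for illustration; this is not the paper's specific line search, and the constants are hypothetical tuning values):

```python
def backtracking_step(f, grad, x, t0=1.0, beta=0.5, max_halvings=50):
    """One gradient step with a backtracking line search: try a step
    length t, and shrink it by beta until a sufficient-decrease
    condition (a quadratic upper-bound test) holds. No Lipschitz
    constant is needed; the accepted t plays the role of 1/L."""
    g = grad(x)
    gnorm2 = sum(gi * gi for gi in g)
    t = t0
    for _ in range(max_halvings):
        x_new = [xi - t * gi for xi, gi in zip(x, g)]
        # accept if f(x_new) <= f(x) - (t/2) * ||grad f(x)||^2
        if f(x_new) <= f(x) - 0.5 * t * gnorm2:
            return x_new, t
        t *= beta
    return x, 0.0  # no acceptable step found
```

Repeating such steps yields a method that needs no prior estimate of L, at the cost of extra function evaluations per iteration, which is the trade-off the abstract describes.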
4. Fenchel–Moreau conjugation for lower semi-continuous functions.
- Author
- Cotrina, John, Karas, Elizabeth W., Ribeiro, Ademir A., Sosa, Wilfredo, and Yuan, Jin Y.
- Subjects
- CONJUGATE gradient methods; CONTINUOUS functions; CONVEX functions; DUALITY theory (Mathematics); TOPOLOGICAL spaces; MATHEMATICAL optimization
- Abstract
We introduce a modification of Fenchel's conjugation which is a particular case of Moreau's conjugation. We obtain desirable properties, such as convexity of the conjugate function even when the original function is not convex. We also introduce the concept of a conjugate dual space as a class of continuous operators, whereas in the Fenchel conjugation the conjugate dual space is the classical topological dual space. Finally, we present some examples illustrating the difference between the Fenchel–Moreau conjugation and our modification. [ABSTRACT FROM PUBLISHER]
- Published
- 2011
- Full Text
- View/download PDF
5. Local convergence of filter methods for equality constrained non-linear programming.
- Author
- Karas, Elizabeth W., Gonzaga, Clóvis C., and Ribeiro, Ademir A.
- Subjects
- STOCHASTIC convergence; ALGORITHMS; NONLINEAR programming; LINEAR substitutions; QUADRATIC programming
- Abstract
In Gonzaga et al. [A globally convergent filter method for nonlinear programming, SIAM J. Optimiz. 14 (2003), pp. 646-669] we discuss general conditions to ensure global convergence of inexact restoration filter algorithms for non-linear programming. In this article we show how to avoid the Maratos effect by means of a second-order correction. The algorithms are based on feasibility and optimality phases, which can be either independent or not. The optimality phase differs from the original one only when a full Newton step for the tangential minimization of the Lagrangian is efficient but not acceptable by the filter method. In this case a second-order corrector step tries to produce an acceptable point keeping the efficiency of the rejected step. The resulting point is tested by trust region criteria. Under the usual hypotheses, the algorithm inherits the quadratic convergence properties of the feasibility and optimality phases. This article includes a comparison between classical Sequential Quadratic Programming (SQP) and Inexact Restoration (IR) iterations, showing that both methods share the same asymptotic convergence properties. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
6. A bundle-filter method for nonsmooth convex constrained optimization.
- Author
- Karas, Elizabeth, Ribeiro, Ademir, Sagastizábal, Claudia, and Solodov, Mikhail
- Subjects
- CONSTRAINED optimization; NONSMOOTH optimization; MATHEMATICAL optimization; FILTERS (Mathematics); ALGORITHMS; MATHEMATICAL programming
- Abstract
For solving nonsmooth convex constrained optimization problems, we propose an algorithm which combines the ideas of the proximal bundle methods with the filter strategy for evaluating candidate points. The resulting algorithm inherits some attractive features from both approaches. On the one hand, it allows effective control of the size of quadratic programming subproblems via the compression and aggregation techniques of proximal bundle methods. On the other hand, the filter criterion for accepting a candidate point as the new iterate is sometimes easier to satisfy than the usual descent condition in bundle methods. Some encouraging preliminary computational results are also reported. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
7. GLOBAL CONVERGENCE OF FILTER METHODS FOR NONLINEAR PROGRAMMING.
- Author
- RIBEIRO, ADEMIR A., KARAS, ELIZABETH W., and GONZAGA, CLÓVIS C.
- Subjects
- ALGORITHMS; NONLINEAR programming; STOCHASTIC convergence; ASYMPTOTIC expansions; QUADRATIC programming
- Abstract
We present a general filter algorithm that allows a great deal of freedom in the step computation. Each iteration of the algorithm consists basically in computing a point which is not forbidden by the filter, from the current point. We prove its global convergence, assuming that the step must be efficient, in the sense that, near a feasible nonstationary point, the reduction of the objective function is "large." We show that this condition is reasonable, by presenting two classical ways of performing the step which satisfy it. In the first one, the step is obtained by the inexact restoration method of Martínez and Pilotta. In the second, the step is computed by sequential quadratic programming. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
8. Global convergence of slanting filter methods for nonlinear programming
- Author
- Karas, Elizabeth W., Oening, Ana P., and Ribeiro, Ademir A.
- Subjects
- NONLINEAR programming; ALGORITHMS; MATHEMATICAL optimization; MATHEMATICAL analysis
- Abstract
In this paper, we present a general algorithm for nonlinear programming which uses a slanting filter criterion for accepting the new iterates. Independently of how these iterates are computed, we prove that all accumulation points of the sequence generated by the algorithm are feasible. Computing the new iterates by the inexact restoration method, we prove stationarity of all accumulation points of the sequence. [Copyright Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
9. A GLOBALLY CONVERGENT FILTER METHOD FOR NONLINEAR PROGRAMMING.
- Author
- Gonzaga, Clóvis C., Karas, Elizabeth, and Vanti, Márcia
- Subjects
- ALGORITHMS; NONLINEAR programming; STOCHASTIC convergence; MATHEMATICAL functions; MATHEMATICAL programming
- Abstract
In this paper we present a filter algorithm for nonlinear programming and prove its global convergence to stationary points. Each iteration is composed of a feasibility phase, which reduces a measure of infeasibility, and an optimality phase, which reduces the objective function in a tangential approximation of the feasible set. These two phases are totally independent, and the only coupling between them is provided by the filter. The method is independent of the internal algorithms used in each iteration, as long as these algorithms satisfy reasonable assumptions on their efficiency. Under standard hypotheses, we show two results: for a filter with minimum size, the algorithm generates a stationary accumulation point; for a slightly larger filter, all accumulation points are stationary. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
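Several of the entries above rest on the filter mechanism: a trial point is accepted if it sufficiently improves either the objective f or the infeasibility measure h relative to every pair stored in the filter. A minimal sketch of one common acceptance rule (a generic variant with an illustrative margin constant, not the exact rule of any one of these papers):

```python
GAMMA = 0.1  # margin parameter; hypothetical tuning value

def acceptable(f, h, filter_pairs):
    """A trial point with objective f and infeasibility h >= 0 is
    acceptable if, against every stored pair (fj, hj), it improves
    either f or h by a margin proportional to hj."""
    return all(f <= fj - GAMMA * hj or h <= (1.0 - GAMMA) * hj
               for fj, hj in filter_pairs)

def add_to_filter(f, h, filter_pairs):
    """Add (f, h) to the filter, dropping entries it dominates
    (an entry is dominated if the new pair is at least as good
    in both objective and infeasibility)."""
    kept = [(fj, hj) for fj, hj in filter_pairs
            if not (f <= fj and h <= hj)]
    kept.append((f, h))
    return kept
```

The appeal, as the bundle-filter abstract notes, is that a point can be accepted for progress on feasibility alone, which is sometimes easier to achieve than a descent condition on the objective.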
10. Optimal non-anticipative scenarios for nonlinear hydro-thermal power systems.
- Author
- Periçaro, Gislaine A., Karas, Elizabeth W., Gonzaga, Clóvis C., Marcílio, Débora C., Oening, Ana Paula, Matioli, Luiz Carlos, Detzel, Daniel H.M., de Geus, Klaus, and Bessa, Marcelo R.
- Subjects
- QUADRATIC programming; LAPTOP computers; NONLINEAR programming; ALGORITHMS; HYDROELECTRIC power plants; NONLINEAR equations; STOCHASTIC models
- Abstract
The long-term operation of hydro-thermal power generation systems is modeled by a large-scale stochastic optimization problem that includes nonlinear constraints due to the head computation in hydroelectric plants. We develop the problem model in detail and state it by a non-anticipative scenario analysis, leading to a large-scale nonlinear programming problem. This is solved by a filter algorithm with sequential quadratic programming iterations that minimize quadratic Lagrangian approximations using exact Hessians in L∞ trust regions. The method is applied to the long-term planning of the Brazilian system, with over 100 hydroelectric and 50 thermoelectric plants, distributed in 5 interconnected subsystems. This problem, with 50 synthetically generated inflow scenarios and a horizon of 60 months, amounting to about one million variables and 15,000 nonlinear constraints, was solved by the filter algorithm on a standard 2016 notebook computer in 10 hours of CPU time. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
11. Global convergence of a general filter algorithm based on an efficiency condition of the step.
- Author
- Periçaro, Gislaine A., Ribeiro, Ademir A., and Karas, Elizabeth W.
- Subjects
- STOCHASTIC convergence; COMPUTER algorithms; FEASIBILITY studies; MATHEMATICAL functions; PROOF theory; PERFORMANCE evaluation; CRITERION (Theory of knowledge)
- Abstract
In this work we discuss global convergence of a general filter algorithm that depends neither on the definition of the forbidden region, which can be given by the original or slanting filter rule, nor on the way in which the step is computed. This algorithm basically consists of calculating a point not forbidden by the filter from the current point. Assuming that this step must be efficient, in the sense that near a feasible non-stationary point the decrease in the objective function is relatively large, we prove the global convergence of the algorithm. We also show that such a condition is satisfied if the step is computed by the SQP or Inexact Restoration methods. For SQP we present a general proof of this result that is valid for both the original and the slanting filter criterion. In order to compare the performance of the general filter algorithm according to the method used to calculate the step and the filter rule used, we present numerical experiments performed with problems from the CUTEr collection. [Copyright Elsevier]
- Published
- 2013
- Full Text
- View/download PDF
12. Examples of ill-behaved central paths in convex optimization.
- Author
- Gilbert, J. Charles, Gonzaga, Clovis C., and Karas, Elizabeth
- Subjects
- NONLINEAR programming; MATHEMATICAL programming; MATHEMATICAL optimization; MATHEMATICS; ALGORITHMS
- Abstract
This paper presents some examples of ill-behaved central paths in convex optimization. Some contain infinitely many fixed length central segments; others manifest oscillations with infinite variation. These central paths can be encountered even for infinitely differentiable data. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF