Abstract
Convergence rates of Newton-Goldstein sequences are estimated for convex constrained minimization problems with singular solutions ξ, i.e., solutions at which the local quadratic approximation Q(ξ, x) to the objective function F grows more slowly than ∥x − ξ∥² for admissible vectors x near ξ. For a large class of iterative minimization methods with quadratic subproblems, it is shown that the values r_n = F(x_n) − inf_Ω F are at least of order O(n^{−1/3}). For the Newton-Goldstein method this estimate is sharpened slightly to r_n = O(n^{−1/2}) when the second Fréchet differential F″ is Lipschitz continuous and the admissible set Ω is bounded. Still sharper estimates are derived when certain growth conditions are satisfied by F or by its local linear approximation at ξ. The most surprising conclusion is that Newton-Goldstein sequences can converge superlinearly to a singular extremal ξ when ⟨F′(ξ), x − ξ⟩ ≥ A∥x − ξ∥^v for some A > 0, some v ∈ (2, 2.5), and all x in Ω near ξ, and that this growth condition on F′(ξ) is entirely natural for a nontrivial class of constrained minimization problems on feasible sets Ω = ℒ¹{[0, 1], U} with U a uniformly convex set in ℝ^d. Feasible sets of this kind are commonly encountered in the optimal control of continuous-time dynamical systems governed by differential equations, and may be viewed as infinite-dimensional limits of Cartesian product sets U^k in ℝ^{kd}. Superlinear convergence of Newton-Goldstein sequences for the problem (Ω, F) suggests that analogous sequences for increasingly refined finite-dimensional approximations (U^k, F_k) to (Ω, F) will exhibit convergence properties that are in some sense "uniformly good" in k as k → ∞.
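The iteration analyzed above replaces F at each iterate by its local quadratic model, minimizes that model over the admissible set Ω, and damps the resulting step with Goldstein's rule (accept the largest step fraction in {1, 1/2, 1/4, …} whose actual decrease in F is at least a fixed fraction δ of the model's predicted decrease). A minimal one-dimensional sketch, not taken from the paper: the test problem F(x) = x⁴ on Ω = [−1, 1] (whose solution ξ = 0 is singular, since F″(0) = 0), the tolerance δ = 0.25, and the helper name `newton_goldstein` are all illustrative assumptions.

```python
def newton_goldstein(F, dF, d2F, x0, lo, hi, delta=0.25, iters=40):
    """Sketch of a Newton-Goldstein iteration for min F over [lo, hi]."""
    x = x0
    for _ in range(iters):
        g, h = dF(x), d2F(x)
        # Subproblem: minimize the quadratic model
        #   Q(y) = g*(y - x) + 0.5*h*(y - x)**2  over [lo, hi].
        if h > 0:
            xbar = min(hi, max(lo, x - g / h))
        else:
            # No positive curvature: the model decreases toward a boundary.
            xbar = lo if g > 0 else hi
        d = xbar - x
        pred = -(g * d + 0.5 * h * d * d)  # model's predicted decrease
        if pred <= 0:
            break  # x already minimizes the model over [lo, hi]
        # Goldstein step length rule: halve the step fraction w until the
        # actual decrease is at least delta * w * (predicted decrease).
        w = 1.0
        while w > 1e-12 and F(x) - F(x + w * d) < delta * w * pred:
            w *= 0.5
        x = x + w * d
    return x


# Usage: for F(x) = x**4 the subproblem step is xbar = (2/3)*x, the full
# step is always accepted, and the iterates shrink geometrically toward
# the singular solution 0 rather than quadratically.
xstar = newton_goldstein(lambda x: x**4, lambda x: 4 * x**3,
                         lambda x: 12 * x**2, 0.9, -1.0, 1.0)
```

On this example the damped slow (here linear, ratio 2/3) approach to ξ = 0 illustrates why singular solutions degrade the usual quadratic convergence of Newton steps; the paper's estimates quantify the worst-case degradation over Ω.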
Additional information
Communicated by A. V. Balakrishnan
Investigation partially supported by the U.S. Air Force through the Air Force Institute of Technology, and by NSF Grant ECS-8005958.
Cite this article
Hughes, G.C., Dunn, J.C. Newton—Goldstein convergence rates for convex constrained minimization problems with singular solutions. Appl Math Optim 12, 203–230 (1984). https://doi.org/10.1007/BF01449042