ALBERT

All Library Books, journals and Electronic Records Telegrafenberg


  • 1
    Electronic Resource
    Springer
Mathematical programming 2 (1972), pp. 383-387
    ISSN: 1436-4646
    Source: Springer Online Journal Archives 1860-2000
Topics: Computer Science, Mathematics
Notes: Abstract Four theorems are presented indicating that the sequences of points generated by different members of Broyden's (1967) family are identical if the linear search routine is accurate.
    Type of Medium: Electronic Resource
  • 2
    Electronic Resource
    Springer
Mathematical programming 3 (1972), pp. 345-358
    ISSN: 1436-4646
    Source: Springer Online Journal Archives 1860-2000
Topics: Computer Science, Mathematics
Notes: Abstract This report contains the proofs of four new theorems relating to the behaviour of Broyden's [1] family of variable metric formulae for solving unconstrained minimisation problems. In particular, it is shown that if the linear search at each iteration is perfect, then the sequence of points that is generated is independent of the member of the family used at each iteration, provided the matrix remains nonsingular. This result extends Powell's [14] proof of convergence for the original formula on any convex function to all members of the family.
    Type of Medium: Electronic Resource
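The theorem summarized in the two abstracts above can be seen numerically: on a convex quadratic with an exact line search, two different members of Broyden's family (DFP and BFGS) produce identical iterates. The following is a minimal sketch, not code from the papers; the test quadratic is an arbitrary illustrative choice.

```python
# Sketch: DFP and BFGS (two members of Broyden's family) generate the same
# iterates on a convex quadratic when the line search is exact.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def mv(M, v):                       # matrix-vector product
    return [dot(row, v) for row in M]

def outer(a, b):                    # rank-one matrix a b^T
    return [[x * y for y in b] for x in a]

def madd(*Ms):                      # sum of square matrices
    n = len(Ms[0])
    return [[sum(M[i][j] for M in Ms) for j in range(n)] for i in range(n)]

def scale(c, M):
    return [[c * e for e in row] for row in M]

# Illustrative quadratic f(x) = 0.5 x^T A x - b^T x, A symmetric positive definite.
A = [[4.0, 1.0, 0.0], [1.0, 3.0, 1.0], [0.0, 1.0, 2.0]]
b = [1.0, 2.0, 3.0]

def dfp(H, s, y):                   # DFP update of the inverse-Hessian estimate
    Hy = mv(H, y)
    return madd(H, scale(1.0 / dot(s, y), outer(s, s)),
                scale(-1.0 / dot(y, Hy), outer(Hy, Hy)))

def bfgs(H, s, y):                  # BFGS update of the inverse-Hessian estimate
    Hy = mv(H, y)
    sy = dot(s, y)
    return madd(H,
                scale((1.0 + dot(y, Hy) / sy) / sy, outer(s, s)),
                scale(-1.0 / sy, madd(outer(s, Hy), outer(Hy, s))))

def variable_metric(update, x0, steps):
    n = len(x0)
    H = [[float(i == j) for j in range(n)] for i in range(n)]  # H0 = I
    x = x0[:]
    g = [gi - bi for gi, bi in zip(mv(A, x), b)]   # gradient A x - b
    traj = []
    for _ in range(steps):
        d = [-t for t in mv(H, g)]
        Ad = mv(A, d)
        alpha = -dot(g, d) / dot(d, Ad)            # exact line search on a quadratic
        s = [alpha * t for t in d]
        x = [xi + si for xi, si in zip(x, s)]
        g_new = [gi + alpha * t for gi, t in zip(g, Ad)]
        y = [a - c for a, c in zip(g_new, g)]
        H = update(H, s, y)
        g = g_new
        traj.append(x[:])
    return traj
```

Comparing `variable_metric(dfp, x0, 3)` with `variable_metric(bfgs, x0, 3)` shows the trajectories coincide to rounding error, even though the intermediate H matrices differ.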
  • 3
    Electronic Resource
    Springer
Journal of optimization theory and applications 32 (1980), pp. 123-133
    ISSN: 1573-2878
    Keywords: Nondifferentiable optimization ; subgradients ; ball gradients
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract In considering the nondifferentiable optimization problem, a new concept is introduced, known as the ball gradient. The ball-gradient magnitude is positive at any local minimum point, independently of whether the minimum point is well behaved, a cusp, or a sheet minimum. The ball-gradient magnitude is negative outside a ball of radius ε around a local minimum point and thus is usable as a terminating criterion on nondifferentiable functions.
    Type of Medium: Electronic Resource
  • 4
    Electronic Resource
    Springer
Journal of optimization theory and applications 32 (1980), pp. 259-275
    ISSN: 1573-2878
    Keywords: Nondifferentiable optimization ; subgradients ; ball gradients
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract This paper is concerned with the minimization of nondifferentiable functions. Three main results are obtained: (i) convergence of the ball-gradient algorithm, introduced by Dixon, for convex functions; (ii) convergence of the generalized gradient algorithm, as implemented by Shor and Ermol'ev, to a stationary point; and (iii) convergence of an algorithm introduced by Goldstein to a local minimum.
    Type of Medium: Electronic Resource
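Item (ii) of the abstract above refers to the Shor/Ermol'ev generalized gradient iteration. The following is a minimal sketch of a subgradient method with diminishing steps in that spirit; the test function and step rule are illustrative choices, not taken from the paper.

```python
# Sketch of a subgradient iteration with diminishing (non-summable) steps,
# in the spirit of the Shor/Ermol'ev generalized gradient method.
# f(x) = |x0 - 1| + |x1 + 2| is an illustrative nondifferentiable function.

def f(x):
    return abs(x[0] - 1.0) + abs(x[1] + 2.0)

def subgradient(x):
    # A valid subgradient of f at x: the sign pattern of the residuals.
    def sgn(u):
        return (u > 0) - (u < 0)
    return [sgn(x[0] - 1.0), sgn(x[1] + 2.0)]

def subgradient_method(x0, iters):
    x = list(x0)
    for k in range(iters):
        g = subgradient(x)
        step = 1.0 / (k + 1)        # diminishing steps: sum diverges, step -> 0
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

x = subgradient_method([0.0, 0.0], 5000)
```

Unlike gradient descent, the iteration need not decrease f at every step; convergence to the minimizer (1, -2) comes from the step-size rule, and the final error is bounded by roughly the last step length.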
  • 5
    Electronic Resource
    Springer
Journal of optimization theory and applications 80 (1994), pp. 175-179
    ISSN: 1573-2878
    Keywords: Unconstrained optimization ; variable metric methods ; quasi-Newton methods ; rounding errors
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract It has become customary to compare the performance of unconstrained optimization algorithms on families of extended symmetric test functions. In this paper, results are presented which indicate that the performance of the variable metric algorithm on such functions is greatly distorted by rounding errors that destroy the special nature of these functions. A simple method of overcoming this difficulty is demonstrated, and it confirms the theoretical result that the number of iterations required to solve such problems is independent of the dimension.
    Type of Medium: Electronic Resource
  • 6
    Electronic Resource
    Springer
Journal of optimization theory and applications 47 (1985), pp. 285-300
    ISSN: 1573-2878
    Keywords: Unconstrained optimization ; conjugate gradients
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract In this paper, we describe an implementation and give performance results for a conjugate gradient algorithm for unconstrained optimization. The algorithm is based upon the Nazareth three-term formula and incorporates Allwright preconditioning matrices and restart tests. The performance results for this combination compare favorably with existing codes.
    Type of Medium: Electronic Resource
  • 7
    Electronic Resource
    Springer
Journal of optimization theory and applications 60 (1989), pp. 261-275
    ISSN: 1573-2878
    Keywords: Optimization ; truncated Newton method ; automatic differentiation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
Notes: Abstract When solving large complex optimization problems, the user is faced with three major problems. These are (i) the cost in human time of obtaining accurate expressions for the derivatives involved; (ii) the need to store second derivative information; and (iii), of lesser importance, the time taken to solve the problem on the computer. For many problems, a significant part of the latter can be attributed to solving Newton-like equations. In the algorithm described, the equations are solved using a conjugate direction method that needs the Hessian at the current point only in the form of its product with a trial vector. In this paper, we present a method that finds this product using automatic differentiation while requiring only vector storage. The method takes advantage of any sparsity in the Hessian matrix and computes exact derivatives. It avoids the complexity of symbolic differentiation, the inaccuracy of numerical differentiation, the labor of finding analytic derivatives, and the need for matrix storage. When far from a minimum, an accurate solution to the Newton equations is not justified, so an approximate solution is obtained by using a version of Dembo and Steihaug's truncated Newton algorithm (Ref. 1).
    Type of Medium: Electronic Resource
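The inner solve described in the abstract above, a conjugate direction method that touches the Hessian only through Hessian-times-vector products, can be sketched as follows. This is a generic illustration: the product is supplied analytically here rather than by automatic differentiation, and the truncation is a simple tolerance and iteration cap in the spirit of Dembo-Steihaug, not the paper's implementation.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def truncated_cg(hvp, g, max_iter=None, tol=1e-10):
    """Approximately solve H d = -g using only Hessian-vector products.

    hvp(v) must return H @ v; H itself is never formed, so only a few
    vectors of storage are needed.  Capping max_iter or loosening tol
    yields a truncated (inexact) Newton step.
    """
    n = len(g)
    d = [0.0] * n
    r = [-gi for gi in g]          # residual of H d = -g at d = 0
    p = r[:]
    rs = dot(r, r)
    for _ in range(max_iter if max_iter is not None else n):
        Hp = hvp(p)
        alpha = rs / dot(p, Hp)
        d = [di + alpha * pi for di, pi in zip(d, p)]
        r = [ri - alpha * hpi for ri, hpi in zip(r, Hp)]
        rs_new = dot(r, r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return d

# Illustrative 2x2 Hessian; in the paper's setting hvp would come from
# automatic differentiation rather than an explicit matrix.
H = [[4.0, 1.0], [1.0, 3.0]]
hvp = lambda v: [dot(row, v) for row in H]
g = [-1.0, -2.0]
d = truncated_cg(hvp, g)           # Newton step: solves H d = -g
```

Because the loop only ever holds d, r, p, and one product Hp, storage stays at a handful of n-vectors regardless of how dense H is.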
  • 8
    Electronic Resource
    Springer
Journal of optimization theory and applications 68 (1991), pp. 217-232
    ISSN: 1573-2878
Keywords: Mathematical programming ; constrained optimization ; l1-problem ; decomposition
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
Notes: Abstract A constrained l1-problem, involving linear functions only, is considered, and the application of the Benders decomposition method to its solution is discussed. This approach, in principle, seems to be promising and is also applicable to the unconstrained case. Some small illustrative examples are also presented.
    Type of Medium: Electronic Resource
  • 9
    Electronic Resource
    Springer
Journal of optimization theory and applications 68 (1991), pp. 407-421
    ISSN: 1573-2878
    Keywords: Sparse equations ; ABS methods ; linear algebra
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
Notes: Abstract In this paper, we examine three algorithms in the ABS family and consider their storage requirements on sparse band systems. It is shown that, when using the implicit Cholesky algorithm on a band matrix with band width 2q+1, only q additional vectors are required. Indeed, for any matrix with upper band width q, only q additional vectors are needed. More generally, if a_kj ≠ 0 for j > k, then the j-th row of H_i is effectively nonzero if j > i > k. The arithmetic operations involved in solving a band matrix by this method are dominated by (1/2)n²q. Special results are obtained for q-band tridiagonal matrices and cyclic band matrices. The implicit Cholesky algorithm may require pivoting if the matrix A does not possess positive-definite principal minors, so two further algorithms were considered that do not require this property. When using the implicit QR algorithm, a matrix with band width q needs at most 2q additional vectors. Similar results for q-band tridiagonal matrices and cyclic band matrices are obtained. For the symmetric Huang algorithm, a matrix with band width q requires q−1 additional vectors. The storage required for q-band tridiagonal matrices and cyclic band matrices is again analyzed.
    Type of Medium: Electronic Resource
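The storage claim above can be illustrated with a standard band Cholesky factorization (a generic sketch, not the implicit ABS variants analyzed in the paper): for a symmetric positive-definite matrix of band width 2q+1, the factor has the same band, so only q+1 diagonals of length n ever need to be stored.

```python
import math

def band_cholesky(ab, q):
    """Cholesky factor of a symmetric positive-definite band matrix.

    ab[d][j] holds A[j+d][j] for d = 0..q (the diagonal and the q
    subdiagonals, stored column-wise).  Entries outside the band are never
    touched, so storage is (q+1) vectors of length n rather than n^2.
    """
    n = len(ab[0])
    lb = [[0.0] * n for _ in range(q + 1)]   # L in the same band layout
    for j in range(n):
        s = ab[0][j] - sum(lb[j - k][k] ** 2 for k in range(max(0, j - q), j))
        lb[0][j] = math.sqrt(s)
        for d in range(1, min(q, n - 1 - j) + 1):
            i = j + d
            t = ab[d][j] - sum(lb[i - k][k] * lb[j - k][k]
                               for k in range(max(0, i - q), j))
            lb[d][j] = t / lb[0][j]
    return lb
```

The inner sums range only over the q columns inside the band, so the operation count is O(nq²) instead of the O(n³) of a dense factorization, consistent with the kind of band-dependent bounds quoted in the abstract.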
  • 10
    Electronic Resource
    Springer
Journal of optimization theory and applications 10 (1972), pp. 34-40
    ISSN: 1573-2878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
Notes: Abstract Huang (Ref. 1) introduced a general family of variable metric updating formulas and showed that, for a convex quadratic function, all members of this family generate the same sequence of points and converge in at most n steps. Huang and Levy (Ref. 2) published numerical data showing the behavior of this family for nonquadratic functions and concluded that this family could be divided into subsets that also generate sequences of identical points on more general functions. In this paper, the necessary and sufficient conditions for a group of algorithms to form part of one of these subsets are given.
    Type of Medium: Electronic Resource