ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

Filter
Collection
  • Articles  (9)
Publisher
  • Springer  (9)
  • American Institute of Physics
  • Cell Press
  • Oxford University Press
Years
  • 2005-2009
  • 1985-1989
  • 1975-1979  (9)
  • 1945-1949
Year
  • 2010
  • 2007
  • 2005
  • 1978  (9)
Topic
  • Mathematics  (9)
  • Nature of Science, Research, Systems of Higher Education, Museum Science
  • Economics
  • Natural Sciences in General
  • Electrical Engineering, Measurement and Control Technology
  • 1
    Electronic Resource
    Springer
    Mathematical programming 14 (1978), pp. 208-223
    ISSN: 1436-4646
    Keywords: Optimization ; Linear Constraints ; Minimax ; Quadratic Convergence
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract We present an algorithm for nonlinear minimax optimization subject to linear equality and inequality constraints which requires first order partial derivatives. The algorithm is based on successive linear approximations to the functions defining the problem. The resulting linear subproblems are solved in the minimax sense subject to the linear constraints. This ensures a feasible-point algorithm. Further, we introduce local bounds on the solutions of the linear subproblems, the bounds being adjusted automatically, depending on the quality of the linear approximations. It is proved that the algorithm will always converge to the set of stationary points of the problem, a stationary point being defined in terms of the generalized gradients of the minimax objective function. It is further proved that, under mild regularity conditions, the algorithm is identical to a quadratically convergent Newton iteration in its final stages. We demonstrate the performance of the algorithm by solving a number of numerical examples with up to 50 variables, 163 functions, and 25 constraints. We have also implemented a version of the algorithm which is particularly suited for the solution of restricted approximation problems. (See the sketch after the result list.)
    Type of Medium: Electronic Resource
  • 2
    Electronic Resource
    Springer
    Mathematical programming 15 (1978), pp. 200-210
    ISSN: 1436-4646
    Keywords: Minimization ; Optimization ; Variable Metric ; Conjugate-Gradient ; Quasi-Newton
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract Although quasi-Newton algorithms generally converge in fewer iterations than conjugate gradient algorithms, they have the disadvantage of requiring substantially more storage. An algorithm will be described which uses an intermediate (and variable) amount of storage and which demonstrates convergence which is also intermediate, that is, generally better than that observed for conjugate gradient algorithms but not so good as in a quasi-Newton approach. The new algorithm uses a strategy of generating a form of conjugate gradient search direction for most iterations, but it periodically uses a quasi-Newton step to improve the convergence. Some theoretical background for the new algorithm has been presented in an earlier paper; here we examine properties of the new algorithm and its implementation. We also present the results of some computational experience. (See the sketch after the result list.)
    Type of Medium: Electronic Resource
  • 3
    Electronic Resource
    Springer
    Mathematical programming 15 (1978), pp. 36-52
    ISSN: 1436-4646
    Keywords: Optimization ; Non-linear Programming ; Unconstrained Optimization ; Gradient-path Algorithms ; Quasi-Newton Methods ; Arc Algorithms
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract The gradient path of a real valued differentiable function is given by the solution of a system of differential equations. For a quadratic function the above equations are linear, resulting in a closed form solution. A quasi-Newton type algorithm for minimizing an n-dimensional differentiable function is presented. Each stage of the algorithm consists of a search along an arc corresponding to some local quadratic approximation of the function being minimized. The algorithm uses a matrix approximating the Hessian in order to represent the arc. This matrix is updated each stage and is stored in its Cholesky product form. This simplifies the representation of the arc and the updating process. Quadratic termination properties of the algorithm are discussed as well as its global convergence for a general continuously differentiable function. Numerical experiments indicating the efficiency of the algorithm are presented. (See the sketch after the result list.)
    Type of Medium: Electronic Resource
  • 4
    Electronic Resource
    Springer
    Mathematical programming 14 (1978), pp. 41-72
    ISSN: 1436-4646
    Keywords: Large-scale Systems ; Linear Constraints ; Linear Programming ; Nonlinear Programming ; Optimization ; Quasi-Newton Method ; Reduced-gradient Method ; Simplex Method ; Sparse Matrix ; Variable-metric Method
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract An algorithm for solving large-scale nonlinear programs with linear constraints is presented. The method combines efficient sparse-matrix techniques as in the revised simplex method with stable quasi-Newton methods for handling the nonlinearities. A general-purpose production code (MINOS) is described, along with computational experience on a wide variety of problems. (See the sketch after the result list.)
    Type of Medium: Electronic Resource
  • 5
    Electronic Resource
    Springer
    Mathematical programming 15 (1978), pp. 343-348
    ISSN: 1436-4646
    Keywords: Minimization ; Optimization ; Variable metric ; Conjugate-gradient ; Quasi-Newton
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science , Mathematics
    Notes: Abstract We wish to examine the conjugate gradient and quasi-Newton minimization algorithms. A relation noted by Nazareth is extended to an algorithm in which conjugate gradient and quasi-Newton search directions occur together and which can be interpreted as a conjugate gradient algorithm with a changing metric. (See the sketch after the result list.)
    Type of Medium: Electronic Resource
  • 6
    Electronic Resource
    Springer
    Journal of optimization theory and applications 26 (1978), pp. 601-636
    ISSN: 1573-2878
    Keywords: Optimization ; calculus of variations ; state constraints
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract This paper combines the separate works of two authors. Tan proves a set of necessary conditions for a control problem with second-order state inequality constraints (see Ref. 1). Russak proves necessary conditions for an extended version of that problem. Specifically, the extended version augments the original problem by including state equality constraints, differential and isoperimetric equality and inequality constraints, and endpoint constraints. In addition, Russak (i) relaxes the solvability assumption on the state constraints, (ii) extends the maximum principle to a larger set, (iii) obtains modified forms of the relation Ḣ = H_t and of the transversality relation usually obtained in problems of this type, and (iv) proves a condition concerning μ̇_α(t_1), the derivative of the multiplier functions at the final time.
    Type of Medium: Electronic Resource
  • 7
    Electronic Resource
    Springer
    Journal of optimization theory and applications 24 (1978), pp. 325-335
    ISSN: 1573-2878
    Keywords: Optimization ; integral constraint ; Lagrange multipliers ; isoperimetric problem ; numerical solution ; convergence
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract The classical method for optimizing a functional subject to an integral constraint is to introduce the Lagrange multiplier and apply the Euler-Lagrange equations to the augmented integrand. The Lagrange multiplier is a constant whose value is selected such that the integral constraint is satisfied. This value is frequently an eigenvalue of the boundary-value problem and is determined by a trial-and-error procedure. A new approach for solving this isoperimetric problem is presented. The Lagrange multiplier is introduced as a state variable and evaluated simultaneously with the optimum solution. A numerical example is given and is shown to have a large region of convergence. (See the sketch after the result list.)
    Type of Medium: Electronic Resource
  • 8
    Electronic Resource
    Springer
    Journal of optimization theory and applications 25 (1978), pp. 1-9
    ISSN: 1573-2878
    Keywords: Optimization ; unconstrained minimization ; gradient methods
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract This paper is intended to give a characterization of the minimum point of a function in terms of the gradient of the function at some other point using some concepts from differential geometry. The function is assumed to have continuous partial derivatives up to and including order four. It is also assumed to have a positive-definite Hessian matrix on R^n and a unique minimum point.
    Type of Medium: Electronic Resource
  • 9
    Electronic Resource
    Springer
    Journal of optimization theory and applications 26 (1978), pp. 453-455
    ISSN: 1573-2878
    Keywords: Optimization ; Newton-like methods ; numerical integration ; orthogonal trajectories
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract This note discusses a generalization of the trapezoidal method for numerical integration of the differential equations of the orthogonal trajectories of a function as a means of finding maximum or minimum values of the function. Connections between the choice of a parameter in the integration scheme and various modifications of Newton's method are indicated. (See the sketch after the result list.)
    Type of Medium: Electronic Resource
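
Record 1's linearized minimax step can be sketched in a few lines. This is a minimal illustration under assumptions of our own, not the paper's algorithm: the general linear constraints are omitted, scipy's `linprog` stands in for the paper's minimax LP solver, and the bound-update constants (0.75, halving/doubling) are invented.

```python
import numpy as np
from scipy.optimize import linprog

def minimax_slp(fs, grads, x0, delta=1.0, iters=50, tol=1e-8):
    """Minimize max_i f_i(x) by successive linear minimax subproblems."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(iters):
        F = np.array([f(x) for f in fs])        # current function values
        G = np.array([g(x) for g in grads])     # current gradients (rows)
        # LP in (d, z): minimize z  s.t.  F_i + g_i.d <= z,  |d_j| <= delta
        c = np.r_[np.zeros(n), 1.0]
        A_ub = np.c_[G, -np.ones(len(fs))]
        b_ub = -F
        bounds = [(-delta, delta)] * n + [(None, None)]
        lp = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        d, z = lp.x[:n], lp.x[n]
        predicted = F.max() - z                 # reduction promised by the model
        if predicted < tol:
            break
        actual = F.max() - max(f(x + d) for f in fs)
        # adjust the local bound by model quality (constants are assumptions)
        delta = 2.0 * delta if actual > 0.75 * predicted else 0.5 * delta
        if actual > 0.0:
            x = x + d                           # accept only improving steps
    return x
```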
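Record 2's interleaving of conjugate-gradient and quasi-Newton directions can be caricatured as below. The toy uses a fixed step length in place of a line search, a Polak-Ribière CG direction, and an L-BFGS-style two-loop recursion over the last m stored pairs for the periodic quasi-Newton step; all of these choices are ours, since the paper's storage scheme and update rules are more refined.

```python
import numpy as np

def hybrid_cg_qn(grad, x0, m=5, iters=200, step=1e-2, tol=1e-8):
    """Mostly CG directions; every m-th iteration, a limited-memory QN step."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    pairs = []                                   # the variable-storage memory
    for k in range(1, iters + 1):
        if np.linalg.norm(g) < tol:
            break
        if k % m == 0 and pairs:                 # periodic quasi-Newton step
            q, alphas = g.copy(), []
            for s, y in reversed(pairs):         # two-loop recursion, newest first
                a = (s @ q) / (y @ s)
                q -= a * y
                alphas.append(a)
            s, y = pairs[-1]
            q *= (s @ y) / (y @ y)               # simple initial Hessian scaling
            for (s, y), a in zip(pairs, reversed(alphas)):
                q += (a - (y @ q) / (y @ s)) * s
            d = -q
        x_new = x + step * d                     # fixed step; a line search belongs here
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # Polak-Ribiere+
        pairs = (pairs + [(x_new - x, g_new - g)])[-m:]  # cap stored pairs
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```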
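Record 3's arc has a closed form on the local quadratic model m(d) = f + g·d + d·Bd/2: the gradient path solves d'(t) = -(g + B d), d(0) = 0, giving d(t) = -(I - exp(-tB)) B^{-1} g, which bends from the steepest-descent direction (small t) to the quasi-Newton step (t → ∞). The crude grid search over t and the dense matrix exponential below stand in for the paper's Cholesky-form updates and arc search.

```python
import numpy as np
from scipy.linalg import expm, solve

def arc_point(g, B, t):
    # d(t) = -(I - exp(-t B)) B^{-1} g : ~ -t g for small t,
    # the (quasi-)Newton step -B^{-1} g as t -> infinity
    n = g.size
    return -(np.eye(n) - expm(-t * B)) @ solve(B, g)

def arc_search_step(f, x, g, B):
    ts = np.geomspace(1e-3, 1e3, 25)      # crude grid in place of an arc search
    return min((x + arc_point(g, B, t) for t in ts), key=f)
```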
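The reduced-gradient idea behind record 4 can be compressed to: with linear constraints Ax = b, all feasible moves live in the null space of A, so one can optimize the reduced function φ(y) = f(x0 + Zy) with an off-the-shelf quasi-Newton method. Using scipy's BFGS and a dense null-space basis here is a stand-in for MINOS's sparse, simplex-partitioned machinery, not a description of it.

```python
import numpy as np
from scipy.linalg import lstsq, null_space
from scipy.optimize import minimize

def reduced_gradient_solve(f, A, b):
    x0 = lstsq(A, b)[0]                      # one feasible point (min-norm)
    Z = null_space(A)                        # feasible directions: A Z = 0
    phi = lambda y: f(x0 + Z @ y)            # objective on the feasible set
    res = minimize(phi, np.zeros(Z.shape[1]), method="BFGS")
    return x0 + Z @ res.x

# Example: minimize a quadratic on the plane x1 + x2 + x3 = 1
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([1.0])
f = lambda x: (x ** 2).sum() + x[0] * x[1]
print(reduced_gradient_solve(f, A, b))
```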
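One concrete reading of record 5's "conjugate gradient algorithm with a changing metric" is a CG recursion in which -g is replaced by -Hg for a quasi-Newton matrix H, here a memoryless BFGS update of the identity from the most recent (s, y) pair. This pairing is our illustration of the viewpoint, not the paper's algorithm.

```python
import numpy as np

def memoryless_bfgs_apply(g, s, y):
    # H g for H = (I - rho*s*y^T)(I - rho*y*s^T) + rho*s*s^T, rho = 1/(y.s)
    rho = 1.0 / (y @ s)
    sg, yg = s @ g, y @ g
    return g - rho * sg * y - rho * yg * s + (rho ** 2 * (y @ y) + rho) * sg * s

def metric_cg_direction(g, g_prev, d_prev, s, y):
    Hg = memoryless_bfgs_apply(g, s, y)               # quasi-Newton metric
    beta = (g @ (g - g_prev)) / (g_prev @ g_prev)     # Polak-Ribiere (assumed)
    return -Hg + beta * d_prev                        # CG recursion, new metric
```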
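Record 7's device is directly runnable: make the Lagrange multiplier a state with λ' = 0 and carry the integral constraint as another state, so the multiplier is found by the BVP solver together with the optimum. The toy problem below (minimize ∫ y'² dx with ∫ y dx fixed at c and y(0) = y(1) = 0, whose Euler-Lagrange equation is y'' = λ/2) is our own illustration, not the paper's example.

```python
import numpy as np
from scipy.integrate import solve_bvp

c = 0.1                                   # prescribed value of integral of y

def rhs(x, z):
    y, yp, lam, I = z                     # states: y, y', lambda, running integral
    return np.vstack([yp, lam / 2.0, np.zeros_like(x), y])

def bc(za, zb):
    # y(0) = y(1) = 0, I(0) = 0, I(1) = c
    return np.array([za[0], zb[0], za[3], zb[3] - c])

x = np.linspace(0.0, 1.0, 11)
sol = solve_bvp(rhs, bc, x, np.zeros((4, x.size)))
print("multiplier:", sol.y[2, 0])         # exact value here is -24*c = -2.4
```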
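Record 9's connection made concrete: a θ-weighted (generalized trapezoidal) step for the gradient flow x' = -∇f, with the implicit term linearized, solves (I + θhH) dx = -hg. θ = 0 gives the explicit gradient step; θ = 1 with h → ∞ recovers the Newton step, and intermediate choices give damped or modified Newton methods. The parameter names are ours.

```python
import numpy as np

def theta_step(g, H, h=1.0, theta=1.0):
    # Linearized theta-trapezoidal step for x' = -grad f:
    #   (I + theta*h*H) dx = -h*g
    # theta = 0: explicit gradient step -h*g
    # theta = 1, h -> infinity: the Newton step -H^{-1} g
    n = g.size
    return np.linalg.solve(np.eye(n) + theta * h * H, -h * g)
```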