ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 1
    Electronic Resource
    Springer
    Mathematical Programming 47 (1990), pp. 305-336 
    ISSN: 1436-4646
    Keywords: Trust region ; linear constraints ; convex constraints ; global convergence ; local convergence ; degeneracy ; rate of convergence ; identification of active constraints ; Newton's method ; sequential quadratic programming ; gradient projection
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science, Mathematics
    Notes: Abstract: We develop a convergence theory for convex and linearly constrained trust region methods which requires only that the step between iterates produce a sufficient reduction in the trust region subproblem. Global convergence is established for general convex constraints, while the local analysis is for linearly constrained problems. The main local result establishes that if the sequence converges to a nondegenerate stationary point, then the active constraints at the solution are identified in a finite number of iterations. As a consequence of the identification properties, we develop rate of convergence results by assuming that the step is a truncated Newton method. Our development is mainly geometrical; this approach allows the development of a convergence theory without any linear independence assumptions. (A sketch of the sufficient-reduction idea appears after this results list.)
    Type of Medium: Electronic Resource
  • 2
    Electronic Resource
    Springer
    Computational Optimization and Applications 7 (1997), pp. 27-40 
    ISSN: 1573-2894
    Keywords: large-scale optimization ; partial separability ; automatic differentiation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Computer Science
    Notes: Abstract: ELSO is an environment for the solution of large-scale optimization problems. With ELSO the user is required to provide only code for the evaluation of a partially separable function. ELSO exploits the partial separability structure of the function to compute the gradient efficiently using automatic differentiation. We demonstrate ELSO's efficiency by comparing the various options available in ELSO. Our conclusion is that the hybrid option in ELSO provides performance comparable to the hand-coded option, while having the significant advantage of not requiring a hand-coded gradient or the sparsity pattern of the partially separable function. In our test problems, which have carefully coded gradients, the computing time for the hybrid AD option is within a factor of two of the hand-coded option. (A sketch of partially separable gradient computation appears after this results list.)
    Type of Medium: Electronic Resource
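The first record above hinges on a sufficient-reduction condition: the trust region step only has to reduce the model subproblem by enough, not solve it exactly. The Python sketch below is a hypothetical, much-simplified illustration of that idea, not the method analyzed in the paper: it assumes bound constraints instead of general convex constraints, a projected-gradient (Cauchy-like) step, a unit model Hessian, and standard ratio-test radius updates.

```python
import numpy as np

def trust_region_minimize(f, grad, x0, lo, hi, delta0=1.0, eta=0.1, max_iter=100):
    """Bound-constrained trust-region sketch.

    The step is only required to give a sufficient reduction of a simple
    quadratic model inside the trust region; here a projected-gradient
    (Cauchy-like) step plays that role.
    """
    x, delta = np.asarray(x0, dtype=float), delta0
    for _ in range(max_iter):
        g = grad(x)
        # Projected steepest-descent step, then clipped to the trust region
        # (infinity norm); clipping toward zero keeps the point feasible.
        step = np.clip(x - delta * g, lo, hi) - x
        step = np.clip(step, -delta, delta)
        if np.linalg.norm(step) < 1e-10:
            break
        # Predicted reduction of the model with a unit Hessian (an assumption).
        pred = -(g @ step) - 0.5 * (step @ step)
        ared = f(x) - f(x + step)
        rho = ared / pred if pred > 0 else -1.0
        if rho >= eta:           # sufficient actual reduction: accept the step
            x = x + step
            if rho > 0.75:
                delta *= 2.0     # model agrees well: enlarge the region
        else:
            delta *= 0.5         # reject the step and shrink the region
    return x

# Usage: a convex quadratic over the box [0, 2]^2; the solution is (2, 0).
b = np.array([3.0, -1.0])
f = lambda x: 0.5 * (x @ x) - b @ x
g = lambda x: x - b
print(trust_region_minimize(f, g, np.array([1.0, 1.0]), 0.0, 2.0))
```

The acceptance test is the point of the sketch: any step giving enough actual reduction relative to the predicted reduction is accepted, which is the only demand the abstract's global theory places on the step.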
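For the second record (ELSO), the abstract describes computing the gradient of a partially separable function f(x) = sum_i f_i(x_{S_i}) by applying automatic differentiation to the element functions. The sketch below illustrates that structure with a toy forward-mode AD class; `Dual`, `gradient_partially_separable`, and the Rosenbrock-style test problem are hypothetical names chosen for this example and are not part of ELSO.

```python
import numpy as np

class Dual:
    """Minimal forward-mode AD value: a number plus one directional derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def _lift(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._lift(o)
        return Dual(self.val - o.val, self.dot - o.dot)
    def __rsub__(self, o):
        return self._lift(o) - self
    def __mul__(self, o):
        o = self._lift(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__
    def __pow__(self, p):
        return Dual(self.val ** p, p * self.val ** (p - 1) * self.dot)

def gradient_partially_separable(elements, x):
    """Gradient of f(x) = sum_i f_i(x[idx_i]) by differentiating each element.

    Each element function depends on only a few variables, so one cheap
    forward-mode pass per element variable suffices, independent of len(x).
    """
    g = np.zeros(len(x))
    for f_i, idx in elements:
        for j in idx:
            args = [Dual(float(x[i]), 1.0 if i == j else 0.0) for i in idx]
            g[j] += f_i(*args).dot
    return g

# Usage: an extended Rosenbrock-style sum of two-variable element functions.
n = 6
elements = [(lambda a, b: 100.0 * (b - a ** 2) ** 2 + (1.0 - a) ** 2, (i, i + 1))
            for i in range(n - 1)]
print(gradient_partially_separable(elements, np.zeros(n)))
```

Because each element function touches only a few variables, the work per gradient scales with the number of elements and their sizes rather than with the full problem dimension, which is the structure the abstract says ELSO exploits.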