ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

Filter
  • nonlinear programming  (15)
  • Collection: Articles  (15)
  • Source: Articles: DFG German National Licenses  (15)
  • Years: 1950-1954, 1980-1984  (15), 2010-2014
  • Topic: History, Mathematics  (15)
  • 1
    Electronic Resource
    Springer
    Applied mathematics & optimization 6 (1980), pp. 335-360
    ISSN: 1432-0606
    Keywords: nonlinear programming ; multiplier methods ; penalty methods ; global convergence ; penalty limitation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract This paper deals with penalty function and multiplier methods for the solution of constrained nonconvex nonlinear programming problems. Starting from an idea introduced several years ago by Polak, we develop a class of implementable methods which, under suitable assumptions, produce a sequence of points converging to a strong local minimum for the problem, regardless of the location of the initial guess. In addition, for sequential minimization type multiplier methods, we make use of a rate of convergence result due to Bertsekas and Polyak, to develop a test for limiting the growth of the penalty parameter and thereby prevent ill-conditioning in the resulting sequence of unconstrained optimization problems.
    Type of Medium: Electronic Resource
  • 2
    Electronic Resource
    Springer
    Journal of optimization theory and applications 33 (1981), pp. 479-495
    ISSN: 1573-2878
    Keywords: Lagrangians ; nonlinear programming ; Kuhn-Tucker theory ; convex optimization
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract For convex optimization in R^n, we show how a minor modification of the usual Lagrangian function (unlike that of the augmented Lagrangians), plus a limiting operation, allows one to close duality gaps even in the absence of a Kuhn-Tucker vector [see the introductory discussion, and see the discussion in Section 4 regarding Eq. (2)]. The cardinality of the convex constraining functions can be arbitrary (finite, countable, or uncountable). In fact, our main result (Theorem 4.3) reveals much finer detail concerning our limiting Lagrangian. There are affine minorants (for any value 0 < θ ≤ 1 of the limiting parameter θ) of the given convex functions, plus an affine form nonpositive on K, for which a general linear inequality holds on R^n. After substantial weakening, this inequality leads to the conclusions of the previous paragraph. This work is motivated by, and is a direct outgrowth of, research carried out jointly with R. J. Duffin.
    Type of Medium: Electronic Resource
  • 3
    Electronic Resource
    Springer
    Journal of optimization theory and applications 36 (1982), pp. 495-519
    ISSN: 1573-2878
    Keywords: Optimization ; nonlinear programming ; Numerical methods ; computational methods ; augmented Lagrangian functions
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract In this paper, a new augmented Lagrangian function is introduced for solving nonlinear programming problems with inequality constraints. The relevant feature of the proposed approach is that, under suitable assumptions, it enables one to obtain the solution of the constrained problem by a single unconstrained minimization of a continuously differentiable function, so that standard unconstrained minimization techniques can be employed. Numerical examples are reported.
    Type of Medium: Electronic Resource
  • 4
    Electronic Resource
    Springer
    Journal of optimization theory and applications 30 (1980), pp. 161-179
    ISSN: 1573-2878
    Keywords: Optimization techniques ; nonlinear programming ; direct methods ; numerical methods ; conjugate directions ; nongradient methods ; ridge-path methods
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract A modification based on a linearization of a ridge-path optimization method is presented. The linearized ridge-path method is a nongradient, conjugate direction method which converges quadratically in half the number of search directions required for Powell's method of conjugate directions. The ridge-path method and its modification are compared with some basic algorithms, namely, univariate method, steepest descent method, Powell's conjugate direction method, conjugate gradient method, and variable-metric method. The assessment indicates that the ridge-path method, with modifications, could present a promising technique for optimization.
    Type of Medium: Electronic Resource
  • 5
    Electronic Resource
    Springer
    Journal of optimization theory and applications 31 (1980), pp. 27-39
    ISSN: 1573-2878
    Keywords: Least-square methods ; variable-metric methods ; Levenberg-Marquardt methods ; nonlinear programming ; testing algorithms
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract Computational results are presented for Davidon's new least-square algorithm. Computational experience with this algorithm is reported which motivated the development of a production code version of the algorithm. Several heuristic modifications, which have been added, are described. Fifteen zero-residual test problems have been used in comparing the new production code version with two established versions of the Levenberg-Marquardt algorithm. The production code version of Davidon's least-square algorithm performed faster and used fewer function evaluations than the Levenberg-Marquardt algorithm in almost every case of the test problems.
    Type of Medium: Electronic Resource
  • 6
    Electronic Resource
    Springer
    Journal of optimization theory and applications 31 (1980), pp. 361-371
    ISSN: 1573-2878
    Keywords: Nash-equilibrium solutions ; partially controllable strategies ; nonlinear programming ; complementary eigenvalue problems
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract The present paper deals with a class of nonzero-sum, two-person games with finite strategies when there are constraints on the strategies selected by the players. The constraints arise due to the subjective difficulty that each player often has in assigning to the states probabilities with which he is completely satisfied, and the model specifies how much each player must perturb his initial probability estimate in order to change his maximum utility alternative from the alternative originally best under the initial estimate. It is shown that the Nash-equilibrium solution of this class of nonzero-sum games can be characterized by an equivalent nonlinear program which leads in some cases to a pair of complementary eigenvalue problems. Applications to normal or approximate solutions of linear programming problems are also indicated.
    Type of Medium: Electronic Resource
  • 7
    Electronic Resource
    Springer
    Journal of optimization theory and applications 32 (1980), pp. 407-425
    ISSN: 1573-2878
    Keywords: Generalized convexity ; global minimality ; nonlinear programming ; nonconvex programming ; optimization theorems
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract In this paper, new classes of generalized convex functions are introduced, extending the concepts of quasi-convexity, pseudoconvexity, and their associate subclasses. Functions belonging to these classes satisfy certain local-global minimum properties. Conversely, it is shown that, under some mild regularity conditions, functions for which the local-global minimum properties hold must belong to one of the classes of functions introduced.
    Type of Medium: Electronic Resource
  • 8
    Electronic Resource
    Springer
    Journal of optimization theory and applications 43 (1984), pp. 237-263
    ISSN: 1573-2878
    Keywords: Geometric programming ; computational comparisons ; nonlinear programming ; ellipsoid algorithm ; generalized reduced gradient algorithm
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract We study the performance of four general-purpose nonlinear programming algorithms and one special-purpose geometric programming algorithm when used to solve geometric programming problems. Experiments are reported which show that the special-purpose algorithm GGP often finds approximate solutions more quickly than the general-purpose algorithm GRG2, but is usually not significantly more efficient than GRG2 when greater accuracy is required. However, for some of the most difficult test problems attempted, GGP was dramatically superior to all of the other algorithms. The other algorithms are usually not as efficient as GGP or GRG2. The ellipsoid algorithm is most robust.
    Type of Medium: Electronic Resource
  • 9
    Electronic Resource
    Springer
    Journal of optimization theory and applications 43 (1984), pp. 527-541
    ISSN: 1573-2878
    Keywords: Linear complementarity ; nonlinear programming ; gradient projection method
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract The Levitin-Poljak gradient-projection method is applied to solve the linear complementarity problem with a nonsymmetric matrix M, which is either a positive-semidefinite matrix or a P-matrix. Furthermore, if the quadratic function x^T(Mx + q) is pseudoconvex on the feasible region {x ∈ R^n | Mx + q ≥ 0, x ≥ 0}, then the gradient-projection method generates a sequence converging to a solution, provided that the problem has a solution. For the case when the matrix M is a P-matrix and the solution is nondegenerate, the gradient-projection method is finite.
    Type of Medium: Electronic Resource
  • 10
    Electronic Resource
    Springer
    Journal of optimization theory and applications 35 (1981), pp. 517-533
    ISSN: 1573-2878
    Keywords: Two-level planning ; multi-objective systems ; decentralized systems ; resource allocation ; nonlinear programming
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract We consider optimization methods for hierarchical power-decentralized systems composed of a coordinating central system and plural semi-autonomous local systems in the lower level, each of which possesses a decision making unit. Such a decentralized system where both central and local systems possess their own objective function and decision variables is a multi-objective system. The central system allocates resources so as to optimize its own objective, while the local systems optimize their own objectives using the given resources. The lower level composes a multi-objective programming problem, where local decision makers minimize a vector objective function in cooperation. Thus, the lower level generates a set of noninferior solutions, parametric with respect to the given resources. The central decision maker, then, chooses an optimal resource allocation and the best corresponding noninferior solution from among a set of resource-parametric noninferior solutions. A computational method is obtained based on parametric nonlinear mathematical programming using directional derivatives. This paper is concerned with a combined theory for the multi-objective decision problem and the general resource allocation problem.
    Type of Medium: Electronic Resource
  • 11
    Electronic Resource
    Springer
    Journal of optimization theory and applications 37 (1982), pp. 1-21
    ISSN: 1573-2878
    Keywords: Sensitivity analysis ; geometric programming ; nonlinear programming
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract A unified approach to computing first, second, or higher-order derivatives of any of the primal and dual variables or multipliers of a geometric programming problem, with respect to any of the problem parameters (term coefficients, exponents, and constraint right-hand sides) is presented. Conditions under which the sensitivity equations possess a unique solution are developed, and ranging results are also derived. The analysis for approximating second and higher-order sensitivity generalizes to any sufficiently smooth nonlinear program.
    Type of Medium: Electronic Resource
  • 12
    Electronic Resource
    Springer
    Journal of optimization theory and applications 40 (1983), pp. 333-348
    ISSN: 1573-2878
    Keywords: Numerical optimization ; global search ; nonlinear programming
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract The paper describes a new version, known as CRS2, of the author's controlled random search procedure for global optimization (CRS). The new procedure is simpler and requires less computer storage than the original version, yet it has a comparable performance. The results of comparative trials of the two procedures, using a set of standard test problems, are given. These test problems are examples of unconstrained optimization. The controlled random search procedure can also be effective in the presence of constraints. The technique of constrained optimization using CRS is illustrated by means of examples taken from the field of electrical engineering.
    Type of Medium: Electronic Resource
  • 13
    Electronic Resource
    Springer
    Journal of optimization theory and applications 44 (1984), pp. 701-721
    ISSN: 1573-2878
    Keywords: Kuhn-Tucker points ; local and global minima ; nonlinear programming ; Morse functions ; convex transformable programs
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract Consider minimizing f on D which is diffeomorphic to a disk. Under a genericity assumption, the number of points on D satisfying the Kuhn-Tucker necessary conditions for minimum is odd. We give conditions which imply that a local minimum is global and a necessary and sufficient condition that a Kuhn-Tucker point is the solution. Convex transformable problems satisfy the latter condition. D may be of full dimension or be embedded on a manifold or it may be given by a system of concave inequalities.
    Type of Medium: Electronic Resource
  • 14
    Electronic Resource
    Springer
    Journal of optimization theory and applications 36 (1982), pp. 477-494
    ISSN: 1573-2878
    Keywords: Unconstrained optimization ; variable-metric methods ; quasi-Newton methods ; numerical algorithms ; nonlinear programming
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract Quasi-Newton algorithms minimize a function F(x), x ∈ R^n, searching at any iteration k along the direction s_k = −H_k g_k, where g_k = ∇F(x_k) and H_k approximates in some sense the inverse Hessian of F(x) at x_k. When the matrix H is updated according to the formulas in Broyden's family and when an exact line search is performed at any iteration, a compact algorithm (free from the Broyden's family parameter) can be conceived in terms of the following n × n matrix: $$H_R = H - H g g^T H / g^T H g,$$ which can be viewed as an approximating reduced inverse Hessian. In this paper, a new algorithm is proposed which uses at any iteration an (n−1)×(n−1) matrix K related to H_R by $$H_R = Q \left[ \begin{array}{cc} 0 & 0 \\ 0 & K \end{array} \right] Q$$ where Q is a suitable orthogonal n × n matrix. The updating formula in terms of the matrix K incorporated in this algorithm is only moderately more complicated than the standard updating formulas for variable-metric methods, but, at the same time, it updates at any iteration a positive definite matrix K, instead of a singular matrix H_R. Other than the compactness with respect to the algorithms with updating formulas in Broyden's class, a further noticeable feature of the reduced Hessian algorithm is that the downhill condition can be stated in a simple way, and thus efficient line searches may be implemented.
    Type of Medium: Electronic Resource
  • 15
    Electronic Resource
    Springer
    Journal of optimization theory and applications 35 (1981), pp. 159-182
    ISSN: 1573-2878
    Keywords: Variable penalty methods ; nonlinear programming ; sequential unconstrained minimization technique ; approximations ; Hessian matrix ; penalty methods ; ill-conditioning
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract A class of generalized variable penalty formulations for solving nonlinear programming problems is presented. The method poses a sequence of unconstrained optimization problems with mechanisms to control the quality of the approximation for the Hessian matrix, which is expressed in terms of the constraint functions and their first derivatives. The unconstrained problems are solved using a modified Newton's algorithm. The method is particularly applicable to solution techniques where an approximate analysis step has to be used (e.g., constraint approximations, etc.), which often results in the violation of the constraints. The generalized penalty formulation contains two floating parameters, which are used to meet the penalty requirements and to control the errors in the approximation of the Hessian matrix. A third parameter is used to vary the class of standard barrier or quasibarrier functions, forming a branch of the variable penalty formulation. Several possibilities for choosing such floating parameters are discussed. The numerical effectiveness of this algorithm is demonstrated on a relatively large set of test examples.
    Type of Medium: Electronic Resource
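
The penalty and multiplier methods surveyed in result 1 share a common pattern: minimize an augmented Lagrangian without constraints, update the multipliers, and raise the penalty parameter only when the constraint violation is not shrinking fast enough. The sketch below is a generic textbook version of that pattern for equality constraints, not the authors' algorithm; the function names, the BFGS inner solver, and the parameters c0, gamma, and beta are illustrative choices.

```python
# Minimal sketch of a multiplier (augmented Lagrangian) method with a simple
# test that raises the penalty parameter only when the constraint violation is
# not reduced enough -- a textbook variant, not the algorithm of result 1.
import numpy as np
from scipy.optimize import minimize

def multiplier_method(f, h, x0, c0=1.0, gamma=10.0, beta=0.25, tol=1e-8, max_outer=50):
    """Minimize f(x) subject to h(x) = 0, where h returns a 1-D numpy array."""
    x = np.asarray(x0, dtype=float)
    lam = np.zeros(len(h(x)))           # multiplier estimates
    c = c0                              # penalty parameter
    viol_prev = np.inf
    for _ in range(max_outer):
        # Inner step: unconstrained minimization of the augmented Lagrangian.
        aug = lambda z: f(z) + lam @ h(z) + 0.5 * c * np.sum(h(z) ** 2)
        x = minimize(aug, x, method="BFGS").x
        viol = np.linalg.norm(h(x))
        if viol < tol:
            break
        lam = lam + c * h(x)            # first-order multiplier update
        if viol > beta * viol_prev:     # violation not reduced enough: grow the penalty
            c *= gamma
        viol_prev = viol
    return x, lam

# Tiny check: minimize x0^2 + x1^2 subject to x0 + x1 = 1; the minimizer is (0.5, 0.5).
print(multiplier_method(lambda x: x @ x,
                        lambda x: np.array([x[0] + x[1] - 1.0]),
                        x0=[2.0, -1.0])[0])
```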
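
Result 4 benchmarks its linearized ridge-path method against Powell's method of conjugate directions, a nongradient baseline that builds conjugate search directions from successive line searches. For orientation, the sketch below is the basic (unmodified) Powell cycle, assuming a smooth objective and using a one-dimensional scipy line search; it illustrates only the comparison baseline, not the ridge-path method itself, and the names and iteration limits are illustrative.

```python
# Basic Powell conjugate-direction search (the comparison baseline named in
# result 4): cycle through the current directions with 1-D line searches,
# then replace the oldest direction by the net displacement of the cycle.
import numpy as np
from scipy.optimize import minimize_scalar

def powell_basic(f, x0, n_cycles=30, tol=1e-10):
    x = np.asarray(x0, dtype=float)
    directions = list(np.eye(len(x)))
    for _ in range(n_cycles):
        x_start = x.copy()
        for d in directions:
            alpha = minimize_scalar(lambda a: f(x + a * d)).x   # line search along d
            x = x + alpha * d
        new_dir = x - x_start
        if np.linalg.norm(new_dir) < tol:
            break
        directions = directions[1:] + [new_dir]                 # rotate in the new direction
        alpha = minimize_scalar(lambda a: f(x + a * new_dir)).x
        x = x + alpha * new_dir
    return x

# Example: a convex quadratic; the exact minimizer is (10/7, -6/7) ≈ (1.4286, -0.8571).
quad = lambda x: (x[0] - 1.0) ** 2 + 2.0 * (x[1] + 0.5) ** 2 + x[0] * x[1]
print(powell_basic(quad, x0=[5.0, 5.0]))
```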
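
Result 5 compares Davidon's least-squares code against two versions of the Levenberg-Marquardt algorithm on zero-residual test problems. The following is a bare-bones damped Gauss-Newton (Levenberg-Marquardt) iteration included only as a reference point; it is neither Davidon's algorithm nor either production code from the paper, and the halving/doubling damping schedule is a common simple choice, not the one tested there.

```python
# Bare-bones Levenberg-Marquardt: solve the damped normal equations and
# accept the step only if the sum of squared residuals decreases.
import numpy as np

def levenberg_marquardt(residual, jacobian, x0, mu=1e-3, tol=1e-10, max_iter=200):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r, J = residual(x), jacobian(x)
        g = J.T @ r
        if np.linalg.norm(g) < tol:
            break
        # Damped normal equations: (J^T J + mu I) step = -J^T r.
        step = np.linalg.solve(J.T @ J + mu * np.eye(len(x)), -g)
        if np.sum(residual(x + step) ** 2) < np.sum(r ** 2):
            x, mu = x + step, 0.5 * mu      # success: accept step, relax damping
        else:
            mu *= 2.0                       # failure: increase damping and retry
    return x

# Example (zero-residual): Rosenbrock residuals r = (10*(x1 - x0^2), 1 - x0); minimum at (1, 1).
res = lambda x: np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])
jac = lambda x: np.array([[-20.0 * x[0], 10.0], [-1.0, 0.0]])
print(levenberg_marquardt(res, jac, x0=[-1.2, 1.0]))
```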
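
The controlled random search procedure of result 12 keeps a population of sample points and repeatedly replaces the worst stored point with a simplex-reflection trial point. The loop below is a simplified CRS-style global search over a box, not Price's CRS2 as published; the population size, trial budget, and replacement rule are generic assumptions made for the sketch.

```python
# Simplified controlled-random-search (CRS-style) minimization over a box:
# keep a population, propose reflected trial points, replace the worst point
# whenever a trial improves on it.
import numpy as np

def crs_minimize(f, lower, upper, pop_size=50, max_trials=20000, seed=None):
    rng = np.random.default_rng(seed)
    lower, upper = np.asarray(lower, dtype=float), np.asarray(upper, dtype=float)
    n = len(lower)
    pts = rng.uniform(lower, upper, size=(pop_size, n))    # initial random population
    vals = np.array([f(p) for p in pts])
    for _ in range(max_trials):
        worst = np.argmax(vals)
        # Pick n+1 distinct stored points; reflect the last through the centroid of the first n.
        idx = rng.choice(pop_size, size=n + 1, replace=False)
        centroid = pts[idx[:n]].mean(axis=0)
        trial = 2.0 * centroid - pts[idx[n]]
        if np.all(trial >= lower) and np.all(trial <= upper):
            f_trial = f(trial)
            if f_trial < vals[worst]:                       # controlled replacement of the worst point
                pts[worst], vals[worst] = trial, f_trial
    best = np.argmin(vals)
    return pts[best], vals[best]

# Example: 2-D Rastrigin function on [-5.12, 5.12]^2 (its global minimum value is 0 at the origin).
rastrigin = lambda x: 10.0 * len(x) + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x))
print(crs_minimize(rastrigin, [-5.12, -5.12], [5.12, 5.12], seed=0))
```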
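
Result 14 is built around the n × n "approximating reduced inverse Hessian" H_R = H - H g g^T H / (g^T H g). The snippet below only evaluates that formula and checks the property that makes H_R singular (it annihilates the gradient direction), which is why the paper works with a positive definite (n−1)×(n−1) matrix K instead; the paper's actual update of K is not reproduced here.

```python
# Evaluate the reduced inverse Hessian H_R quoted in result 14 and verify H_R g = 0.
import numpy as np

def reduced_inverse_hessian(H, g):
    Hg = H @ g
    return H - np.outer(Hg, Hg) / (g @ Hg)   # equals H g g^T H / (g^T H g) for symmetric H

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
H = A @ A.T + 4.0 * np.eye(4)                # a symmetric positive definite inverse-Hessian approximation
g = rng.standard_normal(4)
H_R = reduced_inverse_hessian(H, g)
print(np.allclose(H_R @ g, 0.0))             # True: H_R is singular along the gradient direction
```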
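
Result 15 generalizes standard penalty and barrier functions inside a sequential unconstrained minimization technique (SUMT). For orientation, the sketch below is a plain logarithmic-barrier SUMT loop for inequality-constrained problems, i.e. the classical special case that the variable penalty formulation extends; the shrink factor, outer iteration count, and derivative-free inner solver are arbitrary illustrative choices, not the paper's.

```python
# Plain log-barrier SUMT: solve a sequence of unconstrained problems whose
# barrier weight mu is shrunk between outer iterations.
import numpy as np
from scipy.optimize import minimize

def log_barrier_sumt(f, constraints, x0, mu0=1.0, shrink=0.1, n_outer=8):
    """Minimize f(x) subject to g_i(x) >= 0 for g_i in constraints; x0 must be strictly feasible."""
    x, mu = np.asarray(x0, dtype=float), mu0
    for _ in range(n_outer):
        def barrier(z):
            gvals = np.array([g(z) for g in constraints])
            if np.any(gvals <= 0.0):
                return np.inf                      # outside the strict interior: reject
            return f(z) + mu * np.sum(-np.log(gvals))
        x = minimize(barrier, x, method="Nelder-Mead").x
        mu *= shrink                               # tighten the barrier
    return x

# Example: minimize (x0-2)^2 + (x1-2)^2 subject to 2 - x0 - x1 >= 0, x0 >= 0, x1 >= 0;
# the constrained minimizer is (1, 1).
obj = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 2.0) ** 2
cons = [lambda x: 2.0 - x[0] - x[1], lambda x: x[0], lambda x: x[1]]
print(log_barrier_sumt(obj, cons, x0=[0.5, 0.5]))
```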