Nonlinear Programming Capability
The key computational kernels of the SOCS library are the sparse nonlinear programming algorithms SPRNLP and BARNLP. For optimal control applications SPRNLP and BARNLP are fully integrated with the SOCS software and are transparent to the control practitioner. However, it is also possible to use SPRNLP and BARNLP independently for applications other than optimal control. The SPRNLP software implements a state-of-the-art sequential quadratic programming (SQP) method, using an augmented Lagrangian merit function and a safeguarded line search. SPRNLP can accommodate nonlinear equality and inequality constraints, as well as simple bounds. The BARNLP software implements a sparse primal-dual interior point algorithm, in conjunction with a filter method for globalization. The outstanding performance of the software in benchmark tests has been documented. The package supports four different levels of functionality.
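The core SQP idea can be sketched on a toy problem. This is only an illustration of the method class, not the SPRNLP interface: each iteration linearizes the constraints, builds a quadratic model of the Lagrangian, and solves one KKT system for the step (all function names below are hypothetical).

```python
import numpy as np

# Toy equality-constrained problem (illustrative, not the SPRNLP API):
#   minimize  x0^2 + x1^2   subject to  x0 + x1 = 1
# Each SQP iteration solves the KKT system
#   [ H  A^T ] [ dx  ]   [ -g ]
#   [ A   0  ] [ lam ] = [ -c ]
def grad(x): return np.array([2 * x[0], 2 * x[1]])
def c(x):    return np.array([x[0] + x[1] - 1.0])
def jac(x):  return np.array([[1.0, 1.0]])
def hess(x): return np.array([[2.0, 0.0], [0.0, 2.0]])  # exact Hessian here

x = np.array([5.0, -3.0])
for _ in range(10):
    g, A, H, cv = grad(x), jac(x), hess(x), c(x)
    n, m = len(x), len(cv)
    K = np.block([[H, A.T], [A, np.zeros((m, m))]])
    step = np.linalg.solve(K, np.concatenate([-g, -cv]))
    x = x + step[:n]
    if np.linalg.norm(step[:n]) < 1e-10:
        break

print(x)  # optimum at (0.5, 0.5)
```

Because the problem is quadratic with linear constraints, a single Newton step lands on the optimum; a production SQP code adds a merit function and line search to make this reliable for general nonlinear problems.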
Sparse NLP
Provides general-purpose constrained optimization capability for very large applications. Unique features include:
- Sparse Quadratic Programming -- Schur-Complement QP Method needs only one sparse matrix factorization, even with active set changes!
- Primal-Dual Interior Point Method -- Efficient even with many inequality constraints.
- Sparse Linear Algebra -- Multifrontal solution of symmetric indefinite systems with pivoting for stability, using award-winning¹ Boeing software BCSLIB-EXT.
- Arbitrary Jacobian and Hessian Sparsity -- Not restricted to block diagonal or other special form.
- Quadratic Convergence -- Efficient solution for very large problems.
- No Restriction on Degrees of Freedom. Unlike reduced Hessian methods, SPRNLP and BARNLP converge efficiently when the final active set is small or large.
- Reverse Communication Format -- Permits analytic and/or finite difference gradients.
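The reverse-communication format mentioned above can be sketched as follows. The class and request codes are illustrative, not the SPRNLP interface: the solver never calls user code; it returns a request, and the caller supplies the requested quantity (here a gradient, which could equally come from finite differences) before re-entering the solver.

```python
# Hypothetical sketch of the reverse-communication pattern (names are
# illustrative, not the SPRNLP interface).
class RCMinimizer:
    NEED_G, DONE = 1, 0

    def __init__(self, x0, step=0.1, tol=1e-8, max_iter=500):
        self.x = list(x0)
        self.step, self.tol, self.max_iter = step, tol, max_iter
        self.it = 0
        self.g = None          # caller fills this in between calls

    def iterate(self):
        """Return a request code; the caller owns all evaluations."""
        if self.g is not None:
            if max(abs(gi) for gi in self.g) < self.tol or self.it >= self.max_iter:
                return self.DONE
            # steepest-descent step using the caller-supplied gradient
            self.x = [xi - self.step * gi for xi, gi in zip(self.x, self.g)]
            self.it += 1
            self.g = None
        return self.NEED_G

# Driver loop: because the caller performs the evaluation, analytic or
# finite-difference gradients can be plugged in freely.
solver = RCMinimizer([4.0, -2.0])
while True:
    req = solver.iterate()
    if req == RCMinimizer.DONE:
        break
    if req == RCMinimizer.NEED_G:
        x = solver.x
        solver.g = [2 * x[0], 2 * x[1]]   # gradient of f = x0^2 + x1^2
print(solver.x)
```

The inversion of control is the point: the optimizer needs no callback signature, so it fits naturally into legacy Fortran drivers and simulations that cannot be wrapped in a function pointer.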
Sparse Least Squares
Provides nonlinearly constrained least squares capability with all of the features of the general sparse NLP. Unique features include:
- Numerically Stable Solution -- Augmented QP format avoids formation of the normal matrix.
- Linear Least Squares -- Special option for linearly constrained (e.g., data fitting) applications.
Dense NLP -- Simplified Usage
Provides general-purpose constrained optimization capability for small- to moderate-size applications with limited user requirements. Its features include:
- User Supplies Function Box -- SPRNLP or BARNLP does the rest!
- Hessian Options -- Quasi-Newton (SR1, BFGS, SSQN) and finite difference Newton.
- Finite Difference Jacobian/gradient.
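A quasi-Newton option such as BFGS builds curvature information from successive gradients, so the user never supplies second derivatives. The following sketch is illustrative, not the packaged implementation; the problem and helper name are invented.

```python
import numpy as np

# Sketch of the BFGS update (one of the quasi-Newton options named
# above).  Curvature pairs (s, y) from successive gradients build a
# Hessian approximation B without any second derivatives:
#   B+ = B - (B s s^T B)/(s^T B s) + (y y^T)/(y^T s)
def bfgs_update(B, s, y):
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)

# Minimize f(x) = 0.5 x^T A x with A = diag(1, 10); gradient g = A x.
A = np.diag([1.0, 10.0])
x = np.array([1.0, 1.0])
B = np.eye(2)                        # initial Hessian guess
for _ in range(50):
    g = A @ x
    if np.linalg.norm(g) < 1e-10:
        break
    step = -np.linalg.solve(B, g)    # quasi-Newton direction, unit step
    s, y = step, A @ (x + step) - g  # curvature pair
    if abs(y @ s) > 1e-12:           # skip degenerate updates
        B = bfgs_update(B, s, y)
    x = x + step

print(x)   # converges to the minimizer at the origin
```

After a few updates B closely matches the true Hessian diag(1, 10), which is what gives quasi-Newton methods their fast local convergence without second-derivative code.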
Dense NLP -- Sophisticated Usage
Provides general-purpose constrained optimization capability for small- to moderate-size applications with more complex interface requirements. In addition to the capabilities of the simplified-usage version, it uses a Reverse Communication Format that permits the user to supply Jacobian and, optionally, Hessian information.
Sparse Finite Difference Derivatives
In addition to the optimization tools, this package provides a collection of tools for computing first and second derivatives of functions with sparse Jacobian and Hessian matrices. Unique features include:
- Few Perturbations -- The number of perturbations is much smaller than the number of variables!
- Jacobian/Hessian Evaluation -- These procedures compute first and second derivative information using sparse differences.
- Index Set Construction -- Given matrix sparsity, this procedure determines how to group the variables for efficient differentiation.
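The two bullets above fit together as follows; this sketch (with invented helper names, not the packaged routines) uses the classic idea of grouping structurally orthogonal columns: columns whose nonzero rows never overlap can share one perturbation, so the number of function evaluations equals the number of groups, not the number of variables.

```python
import numpy as np

def build_groups(pattern):
    """Greedy grouping of structurally orthogonal columns (sketch of
    index set construction).  pattern[i, j] is True where dF_i/dx_j
    may be nonzero."""
    m, n = pattern.shape
    groups = []
    for j in range(n):
        for grp in groups:
            rows = np.any(pattern[:, grp], axis=1)
            if not np.any(pattern[:, j] & rows):   # no row overlap
                grp.append(j)
                break
        else:
            groups.append([j])
    return groups

def fd_jacobian(F, x, pattern, h=1e-7):
    """Forward-difference Jacobian using one perturbation per group."""
    m, n = pattern.shape
    J = np.zeros((m, n))
    f0 = F(x)
    for grp in build_groups(pattern):
        xp = x.copy()
        xp[grp] += h                  # perturb the whole group at once
        df = (F(xp) - f0) / h
        for j in grp:
            J[pattern[:, j], j] = df[pattern[:, j]]
    return J

# Tridiagonal-style example: F_i depends only on x_{i-1}, x_i, x_{i+1},
# so 3 groups suffice regardless of the number of variables.
n = 6
def F(x):
    out = np.empty(n)
    for i in range(n):
        out[i] = x[i]**2 + (x[i-1] if i > 0 else 0.0) \
                         + (x[i+1] if i < n - 1 else 0.0)
    return out

pattern = np.zeros((n, n), dtype=bool)
for i in range(n):
    for j in (i - 1, i, i + 1):
        if 0 <= j < n:
            pattern[i, j] = True

x = np.linspace(1.0, 2.0, n)
J = fd_jacobian(F, x, pattern)
print(len(build_groups(pattern)))   # 3 evaluations beyond the base point
```

For banded or otherwise sparse structures the group count depends on the sparsity, not the dimension, which is why the perturbation count stays far below the number of variables.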
Minimum Curvature Data Approximation
Multivariate tabular data can be approximated using tensor product spline functions. The software computes spline coefficients to:
- minimize the "wiggles" (curvature)
- interpolate and/or approximate table data
The software is fully integrated with the sparse nonlinear programming algorithms.
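The fidelity-versus-curvature trade-off can be sketched in a 1-D analogue (the library itself works with multivariate tensor-product splines; the grid, data, and `smooth_fit` helper below are invented for illustration). Second differences stand in for curvature, and a penalty weight balances the two goals.

```python
import numpy as np

# Hypothetical 1-D analogue of minimum-curvature approximation:
# choose values f on a grid to trade data fidelity against curvature,
#   minimize ||f - y||^2 + lam * ||D2 f||^2,
# where D2 is the second-difference operator standing in for curvature.
def smooth_fit(y, lam):
    n = len(y)
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]
    # Stationarity condition of the penalized problem:
    #   (I + lam * D2^T D2) f = y
    return np.linalg.solve(np.eye(n) + lam * D2.T @ D2, y)

def curvature(v):
    """Sum of squared second differences -- the 'wiggle' measure."""
    return float(np.sum(np.diff(v, 2) ** 2))

t = np.linspace(0.0, 1.0, 25)
y = t**2 + 0.05 * np.sin(40 * t)     # smooth trend plus "wiggles"
f = smooth_fit(y, lam=5.0)

print(curvature(y) > curvature(f))   # True: smoothing removes wiggles
```

Increasing the penalty weight flattens the fit toward pure smoothness; decreasing it approaches interpolation of the table data, which mirrors the interpolate/approximate options listed above.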
¹ 1988 Gordon Bell Award for High-Performance Computing