Professors Byrd and Schnabel will continue their research in unconstrained and constrained optimization. Four topics are investigated. First, tensor methods for unconstrained optimization, and the initial development of such methods for constrained optimization. Second, trust region methods for constrained optimization that are relatively easy to implement, have satisfactory local and global convergence theory even in the presence of linearly dependent constraint gradients, and perform efficiently and robustly in practice. Third, the development of algorithms for solving implicit least squares problems, a version of least squares that arises in curve fitting when there are no dependent variables. Fourth, a thorough analysis of several quasi-Newton methods for nonlinearly constrained optimization, with the goal of guaranteeing local superlinear convergence from an arbitrary positive definite initial Hessian approximation, extending Powell's results for the BFGS method. This research addresses important optimization problems that occur in many applications.
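To illustrate the third topic: in an implicit least squares problem the model is a curve f(x, y) = 0 with no designated dependent variable. A minimal sketch (illustrative only, not the proposal's algorithm; the circle model and synthetic data are assumptions) fits a circle written in the implicit form x² + y² = Dx + Ey + F, which reduces to an ordinary linear least squares problem in (D, E, F):

```python
import numpy as np

# Synthetic data: noisy points on a circle with center (2, 1), radius 3.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, size=50)
x = 2.0 + 3.0 * np.cos(t) + 0.01 * rng.normal(size=50)
y = 1.0 + 3.0 * np.sin(t) + 0.01 * rng.normal(size=50)

# Implicit model x^2 + y^2 - D*x - E*y - F = 0 has no dependent variable,
# but is linear in the parameters (D, E, F), so least squares applies directly.
A = np.column_stack([x, y, np.ones_like(x)])
b = x**2 + y**2
(D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)

# Recover geometric parameters: center (D/2, E/2), radius sqrt(F + cx^2 + cy^2).
cx, cy = D / 2.0, E / 2.0
r = np.sqrt(F + cx**2 + cy**2)
print(cx, cy, r)
```

This algebraic formulation minimizes residuals of the implicit equation rather than geometric distances to the curve; the proposed research concerns the harder general case, where the implicit model is nonlinear in its parameters.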