Many interesting optimization problems are nonsmooth, i.e., either the function to be optimized or an associated constraint is not differentiable on some part of its domain. The main goal of this project is the design of efficient algorithms for some important classes of nonsmooth optimization problems. This is possible only by taking full advantage of the structure present in these problems, since the lack of smoothness makes standard algorithms inapplicable. The investigation primarily addresses three issues. The first is the optimization of eigenvalues of symmetric matrices that depend on parameters. Typically, solutions involve multiple eigenvalues, which are not smooth functions of the matrix elements. Applications of interest include the optimal design of columns and plates against buckling. The second issue is the design of methods for a broad problem class: convex composite optimization. By appropriately parameterizing the generalized gradient set, it is hoped that efficient methods with superlinear convergence can be obtained. The third issue concerns non-Lipschitz optimization problems involving eigenvalues of nonsymmetric matrices; for example, an important application arising in the stability of control systems is to minimize the spectral radius of a matrix.
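As a side illustration of the nonsmoothness at the heart of the first issue, the following minimal sketch (in Python with NumPy; the matrix A(t) is a hypothetical example, not taken from the project) shows that the largest eigenvalue of a parameter-dependent symmetric matrix fails to be differentiable at a point where eigenvalues coalesce.

```python
import numpy as np

def max_eig(t):
    # A(t) is a simple symmetric matrix with eigenvalues 1 + t and 1 - t,
    # so its largest eigenvalue equals 1 + |t|: smooth away from t = 0,
    # but nonsmooth at t = 0, where the eigenvalue is multiple (double).
    A = np.array([[1.0 + t, 0.0],
                  [0.0, 1.0 - t]])
    return np.linalg.eigvalsh(A)[-1]  # eigvalsh returns eigenvalues in ascending order

h = 1e-6
right_slope = (max_eig(h) - max_eig(0.0)) / h   # approximately +1
left_slope = (max_eig(0.0) - max_eig(-h)) / h   # approximately -1
print(right_slope, left_slope)                  # one-sided derivatives disagree at t = 0
```

The disagreement of the one-sided slopes at t = 0 is exactly the kind of structure that standard derivative-based algorithms cannot handle and that the proposed methods must exploit directly.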