This proposal concerns the theoretical and computational development of Lagrangian dual methods designed specifically for large-scale convex network problems. These methods combine conjugate gradient directions with efficient line search procedures, and they possess the two properties essential for practical large-scale nonlinear optimization: they exploit the structure and sparsity of the problem to reduce memory requirements and to simplify matrix-vector products, and they achieve rapid convergence because Newton directions are employed. The first objective of the proposed research is to continue the principal investigator's preliminary work on separable quadratic networks; theoretical issues such as the finite convergence property and the use of preconditioning matrices to accelerate the conjugate gradient method will be investigated. The second objective is the development of an efficient line search technique needed to extend the Lagrangian dual approach to convex separable networks. A third objective concerns the design of an algorithm, based on this dual approach, for implementation on parallel computers. Finally, an extensive computational study on large real-world network problems will be performed to test this new methodology for large-scale convex network optimization.
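To make the approach concrete, the following is a minimal sketch (not the investigator's implementation) of a Lagrangian dual method for the separable quadratic network problem min (1/2)x'Qx + c'x subject to Ax = b, where A is the node-arc incidence matrix of a directed graph and Q = diag(q) with q > 0. Dualizing the flow-balance constraints and eliminating the flows x reduces the dual maximization to the sparse positive semidefinite system (A diag(1/q) A')p = b + A(c/q), which conjugate gradients can solve using only incidence matrix-vector products; in exact arithmetic the iteration terminates finitely, which is the finite convergence property mentioned above. The function name and interface are hypothetical, and the sketch assumes a connected, balanced network (the entries of b sum to zero) and omits arc capacity bounds.

    import numpy as np

    def dual_cg_quadratic_network(arcs, q, c, b, tol=1e-10, max_iter=None):
        """Lagrangian dual sketch for  min 0.5*x'Qx + c'x  s.t.  Ax = b,
        Q = diag(q) > 0, A the node-arc incidence matrix of (tail, head) arcs.
        The dual optimality condition is the sparse semidefinite system
            (A diag(1/q) A') p = b + A (c/q),
        solved here by conjugate gradients with incidence mat-vecs only.
        Assumes a connected, balanced network (sum(b) == 0); no capacities."""
        n = b.size
        tails = np.array([t for t, _ in arcs])
        heads = np.array([h for _, h in arcs])
        invq = 1.0 / q

        def A_mv(x):                      # node imbalance of arc flows, A x
            y = np.zeros(n)
            np.add.at(y, tails, x)        # +1 entry at each arc's tail
            np.add.at(y, heads, -x)       # -1 entry at each arc's head
            return y

        def At_mv(p):                     # price difference per arc, A' p
            return p[tails] - p[heads]

        def K_mv(p):                      # dual Hessian action, A diag(1/q) A' p
            return A_mv(invq * At_mv(p))

        rhs = b + A_mv(invq * c)
        p = np.zeros(n)                   # node prices (dual variables)
        r = rhs.copy()                    # residual = dual gradient at p = 0
        d = r.copy()
        rr = r @ r
        for _ in range(max_iter or 2 * n):
            if rr**0.5 < tol:
                break                     # finite termination in exact arithmetic
            Kd = K_mv(d)
            alpha = rr / (d @ Kd)
            p += alpha * d
            r -= alpha * Kd
            rr, rr_old = r @ r, rr
            d = r + (rr / rr_old) * d
        x = invq * (At_mv(p) - c)         # recover optimal flows from prices
        return x, p

    # Ship one unit from node 0 to node 2 along the path 0 -> 1 -> 2.
    arcs = [(0, 1), (1, 2)]
    x, p = dual_cg_quadratic_network(arcs, q=np.array([1.0, 2.0]),
                                     c=np.zeros(2), b=np.array([1.0, 0.0, -1.0]))
    # x is approximately [1.0, 1.0]: the unit of flow uses both arcs.

In the spirit of the preconditioning objective above, a natural first refinement of this sketch would be to precondition the conjugate gradient iteration, for instance with the diagonal of A diag(1/q) A' (Jacobi preconditioning); which preconditioners perform best on network Laplacians of this kind is exactly the sort of question the proposal raises.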