In this talk, we present an incremental Broyden-Fletcher-Goldfarb-Shanno (BFGS) method, a quasi-Newton algorithm with a cyclic update scheme for solving large-scale optimization problems. The proposed incremental quasi-Newton (IQN) algorithm reduces computational cost relative to traditional quasi-Newton methods by restricting each update to a single component function per iteration, and relative to incremental second-order methods by removing the need to compute Hessian inverses. We establish a local superlinear convergence rate and demonstrate numerically a strong improvement over first-order methods on a set of common large-scale optimization problems.
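
To make the idea concrete, the following is a minimal illustrative sketch of an incremental BFGS scheme for minimizing a finite sum f(x) = sum_i f_i(x), not the authors' exact method: each component i keeps its own iterate z_i, stored gradient, and Hessian approximation B_i; every iteration takes an aggregate quasi-Newton step and then refreshes only one component, in cyclic order, with a standard BFGS update. All function and variable names here are assumptions for illustration.

```python
import numpy as np

def bfgs_update(B, s, y, eps=1e-10):
    """Standard BFGS update of a Hessian approximation B,
    given step s and gradient change y."""
    sy = s @ y
    if sy <= eps:                      # skip if curvature condition fails
        return B
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / sy

def iqn(grads, x0, n_passes=50):
    """Illustrative incremental BFGS: one component updated per iteration,
    cyclic order.  grads[i] returns the gradient of component f_i."""
    N, d = len(grads), len(x0)
    z = [x0.copy() for _ in range(N)]          # per-component iterates
    B = [np.eye(d) for _ in range(N)]          # per-component Hessian approximations
    g = [grads[i](x0) for i in range(N)]       # stored gradients at the z_i
    for k in range(n_passes * N):
        i = k % N                              # cyclic index selection
        # Aggregate step: x = (sum_i B_i)^{-1} (sum_i B_i z_i - sum_i grad f_i(z_i))
        B_sum = sum(B)
        rhs = sum(Bi @ zi for Bi, zi in zip(B, z)) - sum(g)
        x = np.linalg.solve(B_sum, rhs)
        # Refresh only component i with its new gradient
        gi_new = grads[i](x)
        s, y = x - z[i], gi_new - g[i]
        B[i] = bfgs_update(B[i], s, y)
        z[i], g[i] = x.copy(), gi_new
    return x
```

For clarity this sketch recomputes the aggregate sums and re-solves the d-by-d system at every iteration; an efficient implementation would instead maintain these aggregates with rank-one corrections, since only one B_i, z_i, and gradient change per step.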