We study networks that display community structure: groups of nodes within which connections are unusually dense. Using methods from random matrix theory, we calculate the spectra of such networks in the limit of large size, and hence demonstrate a phase transition in the performance of matrix methods for community detection, such as the popular modularity maximization method. The transition separates a regime in which such methods successfully detect the community structure from one in which the structure is present but is not detected. Comparing these results with recent analyses of maximum-likelihood methods suggests that spectral modularity maximization is an optimal detection method, in the sense that no other method will succeed in the regime where the modularity method fails.
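For orientation, the spectral modularity maximization referred to above is, in its standard form, the eigenvector method built on the modularity matrix; the notation below is the conventional one and is not taken from the body of the paper. For an undirected network with adjacency matrix $A$, node degrees $k_i$, and $m$ edges in total, the modularity matrix is
\[
  B_{ij} = A_{ij} - \frac{k_i k_j}{2m},
\]
and a division into two communities is obtained by assigning each node $i$ to a group according to the sign of the $i$th element of the leading eigenvector of $B$. Roughly speaking, the transition discussed above is the point at which, in large networks, this eigenvector ceases to correlate with the underlying communities.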