For sufficiently small bit error probability p, most errors in a long LDPC code, quantum or classical, are composed of many clusters that affect disjoint sets of parity checks [A. A. Kovalev and L. P. Pryadko, Phys. Rev. A 87, 020304(R) (2013)]. The typical cluster size scales logarithmically with the block length n. We propose a decoding algorithm based on constructing error clusters such that each cluster eliminates one or more non-zero syndrome bits but does not create new ones. Clusters of progressively increasing size are added to a list until the entire syndrome vector can be represented as a linear combination of the syndromes of the listed clusters. The total number of connected clusters of weight w scales exponentially with w; since typical clusters have w ~ log n, the overall complexity is polynomial in n.
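The procedure described above can be sketched in code. The following Python sketch is illustrative only and makes simplifying assumptions: it brute-forces all weight-w error patterns rather than growing connected clusters on the code's Tanner graph, and all function names are invented here. A cluster is accepted if its syndrome is nonzero and its support lies inside the support of the target syndrome (it flips some nonzero syndrome bits and creates no new ones); the loop stops once the syndrome lies in the GF(2) span of the accepted cluster syndromes.

```python
import itertools
import numpy as np

def syndrome(H, e):
    # Syndrome of error pattern e under parity-check matrix H, over GF(2).
    return H.dot(e) % 2

def in_gf2_span(vectors, target):
    # Gaussian elimination over GF(2): is `target` a linear combination
    # of the row vectors in `vectors`?
    if not vectors:
        return not target.any()
    M = np.array(vectors, dtype=np.uint8) % 2
    t = target.astype(np.uint8) % 2
    row = 0
    for col in range(M.shape[1]):
        pivot = next((r for r in range(row, M.shape[0]) if M[r, col]), None)
        if pivot is None:
            continue
        M[[row, pivot]] = M[[pivot, row]]     # move pivot row into place
        for r in range(M.shape[0]):           # clear column above and below
            if r != row and M[r, col]:
                M[r] ^= M[row]
        if t[col]:                            # reduce target by pivot row
            t ^= M[row]
        row += 1
    return not t.any()                        # fully reduced => in span

def cluster_decode(H, s, w_max):
    # Collect error clusters of increasing weight w whose syndrome support
    # lies inside the support of s, until s is representable as a GF(2)
    # linear combination of the listed cluster syndromes.
    n = H.shape[1]
    support = set(np.flatnonzero(s))
    listed = []                               # syndromes of accepted clusters
    for w in range(1, w_max + 1):
        for bits in itertools.combinations(range(n), w):
            e = np.zeros(n, dtype=np.uint8)
            e[list(bits)] = 1
            se = syndrome(H, e)
            if se.any() and set(np.flatnonzero(se)) <= support:
                listed.append(se)
        if in_gf2_span(listed, s):
            return True, w                    # syndrome covered at this size
    return False, w_max
```

For example, on the length-5 classical repetition code, an error on bits 0 and 3 produces a syndrome that is already covered by weight-1 clusters, so the loop terminates at w = 1. The real algorithm avoids the exponential-in-n enumeration by restricting to connected clusters, whose count grows only exponentially in w ~ log n.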