Packet collisions are generally considered a nuisance in networks such as wireless sensor networks. By contrast, we observe that collisions can in fact be used to our advantage to compute aggregate functions over distributed data more \emph{efficiently}. Formally, we model the situation as the following generalization of the well-studied decision trees, which we call $2^+$ decision trees. Algorithms in this new model may query any \emph{subset} of the boolean inputs, and the answer to the query tells whether the number of ones (or zeroes) in the subset is $0$, $1$, or at least $2$. We also study the natural generalization to $k^+$ decision trees (for $k \ge 1$), where the answers tell whether the number of ones (or zeroes) is $0, \dots, k-1$, or at least $k$. (The query model for $k^+$ decision trees has connections to combinatorial search, compressed sensing, and group testing.) Informally, our main result states that the complexity of a boolean function $f$ under $k^+$ decision trees is determined by the ``fatness'' of the best possible plain-vanilla decision tree that decides $f$. (The notion of fatness of a decision tree is captured by its \emph{rank}, which has been studied before in computational learning theory.) This is joint work with Jim Aspnes (Yale), Murat Demirbas (Buffalo), Ryan O'Donnell (CMU), and Steve Uurtamo (Buffalo).
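
To make the query model concrete, here is a small illustrative example (the particular input and query sets are ours, chosen only for illustration, not taken from the talk): on the input $x = (1, 0, 1, 0, 0)$, a $2^+$ query on the subset $S = \{1, 2, 3\}$ answers ``at least $2$'' ones (since $x_1 = x_3 = 1$), a query on $S = \{2, 4\}$ answers ``$0$'' ones, and a query on $S = \{2, 3, 4\}$ answers ``exactly $1$''; under the $3^+$ model, the query on $\{1, 2, 3\}$ would instead report the exact count $2$, since $2 < 3$.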