We propose a framework for binary hard-margin classification in non-Euclidean Banach spaces, centred on a supporting semi-inner-product that takes the place of the inner product in Hilbert spaces. The theory of semi-inner-product spaces permits a geometric, Hilbert-like formulation of the problem, and we show that a surprising number of results from the Euclidean case generalise appropriately. These include the representer theorem, convexity of the associated optimisation programs, and even, for a particular class of Banach spaces, a ``kernel trick'' for non-linear classification. We also discuss statistical properties of the resulting hyperplanes, as well as applications to learning in metric spaces.