While base 10 is natural for humans, computers prefer to store bits, since a binary choice is the easiest to represent physically (say, as a charged versus uncharged capacitor). However, it turns out that non-binary data is not much harder to store efficiently. I will show how to represent an array $A[1..n]$ of decimal digits on a regular computer, using the optimal $\lceil n \log_2 10 \rceil$ bits, such that reading or writing any $A[i]$ takes {\bf constant} time.
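Before the optimal scheme, it may help to see a simpler warm-up (my own illustration, not the construction described here): since $10^3 = 1000 \le 1024 = 2^{10}$, we can pack three digits into 10 bits, spending $10/3 \approx 3.33$ bits per digit versus the optimal $\log_2 10 \approx 3.32$, while still supporting constant-time reads and writes. A minimal sketch:

```python
# Warm-up sketch (not the optimal scheme): pack 3 decimal digits
# into one 10-bit group, since 10^3 = 1000 <= 2^10 = 1024.
# Each group is stored as a small integer in [0, 1000); reading or
# writing any digit touches exactly one group, so both take O(1) time.

class PackedDigits:
    POW10 = (1, 10, 100)  # place value of each digit within a group

    def __init__(self, n):
        self.n = n
        # one group per 3 digits; ceil(n/3) groups of 10 bits each
        self.groups = [0] * ((n + 2) // 3)

    def read(self, i):
        # 1-indexed, as in A[1..n]; constant time
        g, r = divmod(i - 1, 3)
        return (self.groups[g] // self.POW10[r]) % 10

    def write(self, i, d):
        # overwrite digit i with d in constant time
        g, r = divmod(i - 1, 3)
        old = (self.groups[g] // self.POW10[r]) % 10
        self.groups[g] += (d - old) * self.POW10[r]
```

For example, after `a = PackedDigits(7); a.write(1, 9); a.write(7, 5)`, we have `a.read(1) == 9`, `a.read(2) == 0`, and `a.read(7) == 5`. The scheme below improves the $10/3$ bits per digit down to the optimal $\lceil n \log_2 10 \rceil$ total bits.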