I’ve read here that in HoTT, compilers could swap out less efficient representations of data for more efficient ones and I’m wondering whether my interpretation of this statement is correct.

Say we have two different ways of representing the natural numbers, unary (zero and successor) and binary. Here is a function that checks evenness on the former representation:

```
even : UnaryNat -> Bool
even zero            = true
even (succ zero)     = false
even (succ (succ n)) = even n
```

If we then have an isomorphism between the unary and binary representations, we trivially get an evenness function for the binary representation “for free”, simply by converting a given binary natural number to a unary one and applying the `even` function (the result is already a `Bool`, so nothing needs to be converted back). Obviously, this is not very efficient, and we also don’t need HoTT for it.
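To make the idea concrete, here is a sketch of that “for free” function in Haskell rather than a proof assistant; the names `UnaryNat`, `BinaryNat` (stored least significant bit first), `toUnary`, and `evenU` are my own, not anything HoTT-specific:

```haskell
-- Unary naturals: zero and successor.
data UnaryNat = Zero | Succ UnaryNat

-- Binary naturals, least significant bit first:
-- B0 b represents 2*b, B1 b represents 1 + 2*b.
data BinaryNat = BZero | B0 BinaryNat | B1 BinaryNat

-- The evenness check from the question, on unary naturals.
evenU :: UnaryNat -> Bool
evenU Zero            = True
evenU (Succ Zero)     = False
evenU (Succ (Succ n)) = evenU n

-- Doubling on unary naturals, used by the conversion.
double :: UnaryNat -> UnaryNat
double Zero     = Zero
double (Succ n) = Succ (Succ (double n))

-- One direction of the isomorphism: binary to unary.
toUnary :: BinaryNat -> UnaryNat
toUnary BZero  = Zero
toUnary (B0 b) = double (toUnary b)
toUnary (B1 b) = Succ (double (toUnary b))

-- The "for free" evenness check: convert, then test.
-- This walks the whole number, so it is linear in its value.
evenBin :: BinaryNat -> Bool
evenBin = evenU . toUnary
```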

A better way to check whether a binary natural number is even is to check whether its least significant bit is zero. My question is: could we derive this more efficient algorithm for binary natural numbers from our definition of evenness for unary natural numbers using HoTT? If so, would this also be possible for other data types? I haven’t studied any HoTT yet, and since it appears to be a pretty complex subject, I’d like to find out whether it’s as exciting as I think it is. Thanks!
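For reference, the efficient check I have in mind looks like this (again a Haskell sketch with my own least-significant-bit-first encoding, not something derived by HoTT machinery):

```haskell
-- Binary naturals, least significant bit first:
-- B0 b represents 2*b, B1 b represents 1 + 2*b.
data BinaryNat = BZero | B0 BinaryNat | B1 BinaryNat

-- Evenness by inspecting only the least significant bit.
-- This is constant time, unlike the version that converts
-- the whole number to unary first.
evenBinFast :: BinaryNat -> Bool
evenBinFast BZero  = True   -- zero is even
evenBinFast (B0 _) = True   -- least significant bit is 0
evenBinFast (B1 _) = False  -- least significant bit is 1
```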