Data compression – Entropy

Let’s say I have an alphabet $$\Sigma = \{A, B, C, D, E\}$$

with probabilities $$P(A) = P(B) = P(C) = 0.25 \quad \text{and} \quad P(D) = P(E) = 0.125.$$

I know that the entropy is then $$H(\Sigma) = \sum_{x \in \Sigma} P(x) \log_2 \frac{1}{P(x)} = 3 \cdot 0.25 \cdot \log_2 4 + 2 \cdot 0.125 \cdot \log_2 8 = 1.5 + 0.75 = 2.25 \text{ bits per symbol}.$$
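To double-check that arithmetic, here is a minimal sketch in Python (the `probs` dictionary is just my own encoding of the distribution above):

```python
import math

# The distribution over Sigma given above
probs = {"A": 0.25, "B": 0.25, "C": 0.25, "D": 0.125, "E": 0.125}

# Shannon entropy in bits: H = sum over symbols of p * log2(1/p)
entropy = sum(p * math.log2(1 / p) for p in probs.values())

print(entropy)  # 2.25
```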

My question now is: what does this mean for the lower limit of compression? How many bits per symbol will I need, at a minimum, to compress a text made up of symbols from the above alphabet?
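To make the question concrete, here is a toy encoder with a prefix code table I made up by hand (I am not claiming it is optimal; whether one can do better is exactly what I am asking):

```python
# A hypothetical prefix code for this alphabet, chosen by hand
code = {"A": "00", "B": "01", "C": "10", "D": "110", "E": "111"}

def encode(text: str) -> str:
    # Concatenate the codeword of each symbol
    return "".join(code[symbol] for symbol in text)

bits = encode("AABCDE")
print(len(bits))  # 14 bits for this 6-symbol text
```

Under the given probabilities this table averages $3 \cdot 0.25 \cdot 2 + 2 \cdot 0.125 \cdot 3 = 2.25$ bits per symbol, the same number as $H(\Sigma)$. Is that the lower limit, or can some code do better?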