Prove that the upper bound in the noiseless coding theorem is strict

Given a probability distribution $p$ on an alphabet, we define the redundancy of an encoding as the expected codeword length minus the entropy of $p$:

$$ \text{redundancy} = E(S) - h(p). $$

Prove that for each $\epsilon$ with $0 \le \epsilon < 1$ there exists a $p$ such that the optimal encoding has redundancy exactly $\epsilon$.
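For reference, the bound I have in mind is the standard one for binary prefix codes (I'm assuming this version of the theorem):

$$ h(p) \le E(S) < h(p) + 1, $$

so the redundancy always lies in $[0, 1)$, and the claim is that every value in this interval is actually achieved.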

Attempts

I have tried constructing a probability distribution of the form $p_0 = \epsilon$, $p_1 = 1 - \epsilon$, based on a previous answer, but I can't get it to work.
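Concretely, here is what I get, assuming $0 < \epsilon < 1$ and that the optimal binary prefix code on two symbols assigns one bit to each symbol, so $E(S) = 1$:

$$ E(S) - h(p) = 1 - \bigl( -\epsilon \log_2 \epsilon - (1 - \epsilon) \log_2 (1 - \epsilon) \bigr), $$

which does not equal $\epsilon$ in general (for example, at $\epsilon = 1/2$ the redundancy is $0$, not $1/2$), so it seems the same parameter can't play both roles.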

Any help would be much appreciated.