A tricky mutual information inequality


Let $X_0, X_1, X_2$ be three independent, uniformly distributed bits, and let $B$ be a random variable such that $I(X_0:B) = 0$, $I(X_1:B) = 0$, and $I(X_2:B) = 0$. I need to prove that $I(X_0, X_1, X_2 : B) \leq 1$

(where $I(M:N)$ is Shannon's mutual information).
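As a sanity check, here is a small brute-force computation; it is only a sketch under my own assumptions, and the choice $B = X_0 \oplus X_1$ is an illustrative example I made up, not part of the question. This $B$ satisfies all three single-bit constraints and attains $I(X_0, X_1, X_2 : B) = 1$, so the conjectured bound, if true, would be tight.

```python
import itertools
from collections import Counter
from math import log2

def entropy(dist):
    """Shannon entropy (in bits) of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def mutual_information(pairs):
    """I(U:V) for a list of equally likely (u, v) outcome pairs."""
    n = len(pairs)
    joint = Counter(pairs)
    pu = Counter(u for u, _ in pairs)
    pv = Counter(v for _, v in pairs)
    h_u = entropy({k: c / n for k, c in pu.items()})
    h_v = entropy({k: c / n for k, c in pv.items()})
    h_uv = entropy({k: c / n for k, c in joint.items()})
    return h_u + h_v - h_uv  # I(U:V) = H(U) + H(V) - H(U,V)

# All 8 equally likely values of (X0, X1, X2); B = X0 XOR X1 is my
# hypothetical example, not taken from the question.
outcomes = list(itertools.product([0, 1], repeat=3))

for i in range(3):
    mi = mutual_information([(x[i], x[0] ^ x[1]) for x in outcomes])
    print(f"I(X{i}:B) = {mi:.3f}")        # all 0.000

mi_joint = mutual_information([(x, x[0] ^ x[1]) for x in outcomes])
print(f"I(X0,X1,X2:B) = {mi_joint:.3f}")  # 1.000
```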

I can demonstrate that $I(X_0, X_1, X_2 : B) \leq 2$ by using the chain rule of mutual information: since $I(X_0:B) = 0$,
$$I(X_0, X_1, X_2 : B) = I(X_0:B) + I(X_1, X_2 : B \mid X_0) = H(X_1, X_2 \mid X_0) - H(X_1, X_2 \mid B, X_0) = 2 - H(X_1, X_2 \mid B, X_0) \leq 2,$$
where $H(X_1, X_2 \mid X_0) = 2$ because the bits are independent and uniform.

(where $H(\cdot)$ is Shannon's entropy, measured in bits).
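The intermediate quantities in this chain can also be checked numerically for the same illustrative $B = X_0 \oplus X_1$; the snippet below reuses `entropy`, `Counter`, and `outcomes` from the sketch above.

```python
def cond_entropy(pairs):
    """H(U | V) for a list of equally likely (u, v) outcome pairs."""
    n = len(pairs)
    joint = Counter(pairs)
    pv = Counter(v for _, v in pairs)
    h_uv = entropy({k: c / n for k, c in joint.items()})
    h_v = entropy({k: c / n for k, c in pv.items()})
    return h_uv - h_v  # H(U|V) = H(U,V) - H(V)

# H(X1,X2 | X0) = 2, since the bits are independent and uniform.
print(cond_entropy([((x[1], x[2]), x[0]) for x in outcomes]))  # 2.0

# H(X1,X2 | B, X0): with B = X0 XOR X1, knowing B and X0 reveals X1
# but nothing about X2, so this is 1, giving I = 2 - 1 = 1 here.
print(cond_entropy([((x[1], x[2]), (x[0] ^ x[1], x[0])) for x in outcomes]))  # 1.0
```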

But I am unable to go further; please help.