A lower bound on a sum of Bernoulli random variables given a constraint on its distribution

Given Bernoulli random variables $x_1, \dots, x_n$ (not necessarily identically distributed) with $X = \sum_{i=1}^{n} x_i$, I am interested in finding a lower bound for $\frac{\mathbb{E}[\min(X,k)]}{\mathbb{E}[X]}$ in terms of $k$ and $\alpha$, where $\alpha > \Pr[X>k]$. For example, I want to show that this ratio is at least a reasonably large constant for $\alpha = 0.2$ and $k = 4$.
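For numerical intuition, here is a minimal Monte Carlo sketch in Python that estimates the ratio for one illustrative choice of the $p_i$; the particular parameters below are my own assumption, not part of the question, and the constraint $\Pr[X>k] < \alpha$ is only checked empirically.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative, non-identical Bernoulli parameters -- an assumption for the demo.
p = np.array([0.9, 0.5, 0.5, 0.3, 0.2, 0.2, 0.1, 0.1])
k, alpha, trials = 4, 0.2, 200_000

# X = sum of independent Bernoulli(p_i) variables, sampled `trials` times.
X = (rng.random((trials, p.size)) < p).sum(axis=1)

print("Pr[X > k]        ~", (X > k).mean(), "(constraint: <", alpha, ")")
print("E[min(X,k)]/E[X] ~", np.minimum(X, k).mean() / X.mean())
```

Scanning over many admissible choices of the $p_i$ in this way gives a feel for where the worst-case ratio might lie before attempting a proof.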

Expected values of packing distance between vectors with Bernoulli trials?

Pick a set $\mathcal T$ of $2^k$ vectors in $\{-1,+1\}^n$, each coordinate chosen by an independent Bernoulli trial.

What are the exact expected values, or at least tight bounds, for $$\min_{\substack{u,v\in\mathcal T\\u\neq v}}\|u-v\|_2^2 \qquad\text{and}\qquad \min_{\substack{u,v\in\mathcal T\\u\neq v}}\|u-v\|_1,$$ and what are their distributions?
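One observation that simplifies simulation: for $u, v \in \{-1,+1\}^n$, each disagreeing coordinate contributes $2$ to $\|u-v\|_1$ and $4$ to $\|u-v\|_2^2$, so both minima are determined by the minimum Hamming distance over distinct pairs in $\mathcal T$. A rough Monte Carlo sketch, assuming fair (parameter $1/2$) trials and illustrative sizes of my own choosing:

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)
n, k, trials = 64, 5, 2_000          # illustrative sizes -- my assumption

mins = []
for _ in range(trials):
    # 2^k vectors with i.i.d. fair +/-1 coordinates (Bernoulli trials).
    T = rng.choice([-1, 1], size=(2**k, n))
    # pdist(..., "hamming") returns the fraction of differing coordinates.
    mins.append(pdist(T, metric="hamming").min() * n)

D = np.array(mins)                   # samples of the minimum Hamming distance
print("E[min Hamming]     ~", D.mean())
print("E[min ||u-v||_1]   ~", 2 * D.mean())
print("E[min ||u-v||_2^2] ~", 4 * D.mean())
```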

Reference Request: Total Variation Between Dependent and Independent Bernoulli Processes

Let $X$ be a random variable taking values in $\{0,1\}^n$ with the following distribution. For each coordinate $i$, we have $p_i = P(X_i = 1) = c/\sqrt n$, where $c$ is a (very small) constant. Coordinates $i$ and $j$ have positive correlations decaying exponentially in $|i-j|$, with prefactor $1/\sqrt n$, in the following sense: writing $p_{i,j} = P(X_i = 1, \, X_j = 1)$, we have $$0 \le p_{i,j} - p_i p_j \le c \exp(-|i-j|)/\sqrt n.$$ This is a dependent Bernoulli process; let $\mu$ denote its law.

Also, let $Y$ be an independent Bernoulli process with the same marginals: $P(Y_i = 1) = p_i$, with all coordinates independent. Let $\nu$ denote its law.

I want to bound the total variation distance $\|\mu-\nu\|_\text{TV}$. In particular, I want to show that the TV distance decays with $c$, i.e. taking $c \to 0$ gives $\text{TV} \to 0$.


I am aware of the Chen–Stein method for approaching questions like this, but it seems better suited to the regime where the probabilities $p_i$ are of order $1/n$, so that the number of $1$s is approximately Poisson in the independent case (and the method shows that the same holds in the dependent case, under certain conditions). Perhaps one can apply Stein's method more generally?

Also, the aim of this question isn't to get a precise answer from someone, but rather a reference or a suggested method of approach. The above is a simplified version of my actual problem, but I feel that if I can get a good handle on it, I can adapt the argument to my specific case.
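For what it's worth, for small $n$ the TV distance can be computed exactly by enumerating $\{0,1\}^n$. The sketch below uses a stationary two-state Markov chain as a stand-in dependent process (my own toy model, not the actual process), chosen so that the marginals are $p = c/\sqrt n$ and $\operatorname{Cov}(X_i, X_j) = p(1-p)e^{-|i-j|} \le c\,e^{-|i-j|}/\sqrt n$, i.e. it satisfies the stated constraints.

```python
import itertools
import numpy as np

n, c = 10, 0.1                 # small n so that enumeration over {0,1}^n is feasible
p   = c / np.sqrt(n)           # common marginal P(X_i = 1)
lam = np.exp(-1.0)             # correlation decay rate, matching exp(-|i-j|)

# Stationary two-state Markov chain: P(0 -> 1) = a, P(1 -> 1) = b.
# This gives marginal p and Cov(X_i, X_j) = p(1-p) * lam^{|i-j|}.
a, b = p * (1 - lam), p + lam * (1 - p)

def mu(seq):                   # dependent (Markov) law
    prob = p if seq[0] else 1 - p
    for prev, cur in zip(seq, seq[1:]):
        to_one = b if prev else a
        prob *= to_one if cur else 1 - to_one
    return prob

def nu(seq):                   # independent law with the same marginals
    ones = sum(seq)
    return p**ones * (1 - p)**(n - ones)

tv = 0.5 * sum(abs(mu(s) - nu(s)) for s in itertools.product((0, 1), repeat=n))
print(f"c = {c}: TV(mu, nu) ~ {tv:.4g}")
```

Rerunning with decreasing $c$ shows, for this toy chain at least, the TV distance decaying roughly linearly in $c$, which supports the conjecture numerically before looking for a Stein-type proof.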

Conditional problem with Bernoulli variables

Let $S_{n}=\sum_{k=1}^{n}X_{k}$ and $T_{n}=\sum_{k=1}^{n}Y_{k}X_{k}$, where the $X_{k}$ and $Y_{k}$ are all mutually independent, with the $X_{k}$ Bernoulli of parameter $p$ and the $Y_{k}$ Bernoulli of parameter $q$. Let $N=\inf\{n>0 : T_{n+1}=1\}$.

The question is to show that, for all $k \le n$, the conditional law of $X_{k}$ given $N=n$ coincides with its conditional law given $Y_{k}X_{k}=0$, i.e.

$$P(X_{k} = x_{k} \mid N=n) = P(X_{k} = x_{k} \mid Y_{k}X_{k}=0),$$

and that

$$P\Big(\bigcap_{k=1}^{n}\{X_{k}=x_{k}\} \,\Big|\, N=n\Big) = \prod_{k=1}^{n} P(X_{k}=x_{k} \mid N=n) \quad \text{for all } (x_{1},\dots,x_{n}) \in \{0,1\}^{n}.$$
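Reading the original "$P(\cdot/\cdot)$" as conditional probability, here is a quick Monte Carlo sanity check of both claims; the parameters $p$, $q$, $n$ are illustrative assumptions. It uses the fact that, for $n \ge 2$, $\{N = n\}$ is exactly the event that $Y_kX_k = 0$ for all $k \le n$ and $Y_{n+1}X_{n+1} = 1$, and that $P(X_k = 1 \mid Y_kX_k = 0) = p(1-q)/(1-pq)$.

```python
import numpy as np

rng = np.random.default_rng(0)
p, q, n = 0.6, 0.5, 3             # illustrative parameters -- my assumption
trials, horizon = 400_000, 30     # horizon only needs to comfortably exceed n

X = rng.random((trials, horizon)) < p
Y = rng.random((trials, horizon)) < q
T = np.cumsum(X & Y, axis=1)      # T[:, m-1] = T_m = sum_{k<=m} Y_k X_k

hit = T[:, 1:] == 1               # hit[:, j] is the event {T_{j+2} = 1}
N = np.where(hit.any(axis=1), hit.argmax(axis=1) + 1, -1)  # N = inf{m>0 : T_{m+1}=1}

sel = X[N == n]                   # runs conditioned on {N = n}
print("P(X_1 = 1 | N = n)         ~", sel[:, 0].mean())
print("P(X_1 = 1 | Y_1 X_1 = 0)   =", p * (1 - q) / (1 - p * q))
print("P(X_1 = 1, X_2 = 1 | N = n)~", (sel[:, 0] & sel[:, 1]).mean())
print("product of marginals       ~", sel[:, 0].mean() * sel[:, 1].mean())
```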

About Bernoulli polynomials

My question is about Bernoulli numbers and Bernoulli polynomials in the $p$-adic context. Bernoulli numbers are defined as global objects, so they do not depend on $p$. If $B_k(x)$ is the $k$-th Bernoulli polynomial, then the $k$-th Bernoulli number is $B_k = B_k(0)$. It is well known that in general $B_k \notin \mathbb{Z}_p$, although for some $k$ this does hold; consequently $B_k(x) \notin \mathbb{Z}_p[x]$ in general. For a fixed $p$, is there a criterion on $k$ which ensures that $B_k(x) \in \mathbb{Z}_p[x]$?
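By the von Staudt–Clausen theorem, for even $k$ one has $B_k \in \mathbb{Z}_p$ iff $(p-1) \nmid k$; but $p$-integrality of the number $B_k$ and of the polynomial $B_k(x)$ are genuinely different questions, since the coefficients of $B_k(x)$ are $\binom{k}{j}B_j$. A small sympy experiment (my own illustration) comparing the two notions for $p = 5$:

```python
from sympy import Poly, Rational, bernoulli, symbols

x = symbols('x')

def Bk_in_Zp(k, p):
    """True iff the k-th Bernoulli number is p-integral."""
    return Rational(bernoulli(k)).q % p != 0

def Bkx_in_Zp(k, p):
    """True iff every coefficient of B_k(x) is p-integral."""
    return all(Rational(c).q % p != 0
               for c in Poly(bernoulli(k, x), x).all_coeffs())

p = 5
for k in range(1, 16):
    print(f"k={k:2d}  B_k in Z_{p}: {Bk_in_Zp(k, p)}   "
          f"B_k(x) in Z_{p}[x]: {Bkx_in_Zp(k, p)}")
```

For instance it flags $k = 9$: $B_9 = 0 \in \mathbb{Z}_5$, yet $B_9(x)$ contains the term $-\tfrac{21}{5}x^5$ (from $\binom{9}{4}B_4 = -\tfrac{21}{5}$), so $B_9(x) \notin \mathbb{Z}_5[x]$. So any criterion must involve the lower Bernoulli numbers $B_j$ with $(p-1) \mid j$, not just $B_k$ itself.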

Thanks in advance for any answer!