Do i.i.d. Sums Concentrate Any Faster Than Martingales?

Suppose $X_1, X_2, \ldots, X_N \in \mathbb{R}^d$ are mean-zero random variables with each $\|X_n\|_2 \le 1/2$ (this choice of constant simplifies the formulae below).

The simplest concentration inequality I know applies only in the case $d=1$ and only when $X_1, X_2, \ldots, X_N$ are i.i.d. Hoeffding's inequality gives, for each $\epsilon > 0$, the bound

$$P(|X_1 + \ldots + X_N| \ge \epsilon) \le 2\exp\left(-\frac{2\epsilon^2}{N}\right).\tag{1}$$
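As a quick Monte Carlo sanity check (the values of $N$ and $\epsilon$ are chosen purely for illustration), one can compare the empirical tail of a sum of i.i.d. $\pm 1/2$ signs against the exponential factor in $(1)$:

```python
import math
import random

def empirical_tail(N, eps, trials=20000, seed=0):
    """Estimate P(|X_1 + ... + X_N| >= eps) for i.i.d. signs X_n = +-1/2."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(rng.choice((-0.5, 0.5)) for _ in range(N))
        hits += abs(s) >= eps
    return hits / trials

N, eps = 100, 8.0
empirical = empirical_tail(N, eps)
bound = math.exp(-2 * eps**2 / N)  # exponential factor in (1)
print(empirical, bound)
```

The empirical tail comes out well below the bound, as it should; the simulation is of course no substitute for the inequality itself.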

On the other end of the spectrum are results that hold under the weaker assumption that $X_1, X_2, \ldots, X_N$ is a martingale difference sequence, and that work for any $d \in \mathbb{N}$, or indeed for infinite-dimensional Banach spaces provided some variant of the parallelogram law is satisfied. For example, Theorem 3.5 of this paper of Pinelis leads to the following variant of the Azuma-Hoeffding inequality.

$$P(\|X_1 + \ldots + X_n\|_2 \ge \epsilon \text{ for some } n \le N) \le 2\exp\left(-\frac{\epsilon^2}{2N}\right).\tag{2}$$

The exponent is the same as in the scalar Azuma-Hoeffding inequality: notice the $2$ is now in the denominator rather than the numerator, so the exponent in $(2)$ is a factor of $4$ smaller than in $(1)$.
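To see the gap concretely, a one-line computation (sample values of $N$ and $\epsilon$, chosen for illustration) compares the two exponential factors:

```python
import math

N, eps = 100, 8.0
exponent_iid = 2 * eps**2 / N     # exponent in (1)
exponent_mart = eps**2 / (2 * N)  # exponent in (2)
print(math.exp(-exponent_iid))       # ~0.278
print(math.exp(-exponent_mart))      # ~0.726
print(exponent_iid / exponent_mart)  # 4.0: the i.i.d. exponent is 4x larger
```

At this scale the i.i.d. bound is already well under half the martingale one, and the gap widens rapidly as $\epsilon$ grows.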

If we are dealing only with i.i.d. scalars and interested only in the final partial sum, we should use $(1)$ because it gives the better bound. If we are dealing with vectors or martingales, or want a uniform inequality, we had better use $(2)$ instead.

My problem sits between the two extremes: I am dealing with a sequence of i.i.d. vectors in $\mathbb{R}^d$ and I am interested in a uniform bound. Does there exist an appropriate middle ground between these two results? Perhaps one combining the $-2\epsilon^2/N$ exponent of the first with the uniform nature of the second, at the expense of applying only to i.i.d. sequences rather than general martingales.