If $d_n = \frac{\beta_n}{10^n}$, where each $\beta_n$ is an integer between $0$ and $9$, does $\sum_{n=1}^{\infty} d_n$ converge?

The series looks like a convergent geometric series, $$\sum_{n=1}^{\infty}\frac{1}{10^n} = \sum_{n=0}^{\infty}\frac{1}{10^n} - 1 = \frac{10}{9} - 1 = \frac{1}{9},$$ except that each term is multiplied by an arbitrary constant between $0$ and $9$. I’m not sure how this affects the convergence of the series. I suspect it still converges, because the constants are bounded while the geometric terms shrink rapidly, but I’m not sure how I could prove it.

The “worst case” would be the one where $\beta_n = 9$ for all $n$. In that case:

$$\sum_{n=1}^{\infty} d_n = 9\sum_{n=1}^{\infty}\frac{1}{10^n} = 1$$

So again I’m inclined to think it converges in any case.
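Trying to make that precise (this is just my sketch of a comparison argument, writing $s_N$ for the $N$-th partial sum; I’m not sure it’s the right tool): since $0 \le \beta_n \le 9$ for every $n$, each term satisfies $$0 \le d_n \le \frac{9}{10^n},$$ so the partial sums $s_N = \sum_{n=1}^{N} d_n$ are nondecreasing and bounded above by $1$ by the worst case above. A nondecreasing bounded sequence converges, which would give convergence in every case.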

Thanks.