Why does a range query on a segment tree return at most $\lceil \log_2{N} \rceil$ nodes?


If an array $A[1 \ldots N]$ is represented using a segment tree that stores a set in each node (interval), why does a range query $[L \ldots R]$ return at most $\lceil \log_2{N} \rceil$ sets (i.e., disjoint intervals)?

I came across this statement while reading this answer.

To quote:

Find a disjoint coverage of the query range using the standard segment tree query procedure. We get $ O(\log n)$ disjoint nodes, the union of whose multisets is exactly the multiset of values in the query range. Let’s call those multisets $ s_1, \dots, s_m$ (with $ m \le \lceil \log_2 n \rceil$ ).
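For context, here is a minimal sketch (in Python, with hypothetical names) of what I understand by "the standard segment tree query procedure": the recursive descent that splits the query range into canonical node intervals. The number of intervals it returns is exactly the quantity $m$ in the quote.

```python
# Sketch of the standard segment tree query decomposition.
# Each node covers an array interval [node_lo, node_hi] (0-based, inclusive).

def query_nodes(node_lo, node_hi, lo, hi):
    """Return the disjoint node intervals covering the query range [lo, hi]."""
    if hi < node_lo or node_hi < lo:
        return []                      # no overlap: this subtree contributes nothing
    if lo <= node_lo and node_hi <= hi:
        return [(node_lo, node_hi)]    # fully covered: this node is one piece of the coverage
    mid = (node_lo + node_hi) // 2     # partial overlap: recurse into both children
    return (query_nodes(node_lo, mid, lo, hi)
            + query_nodes(mid + 1, node_hi, lo, hi))

# Example with N = 8, query range [2, 6]:
print(query_nodes(0, 7, 2, 6))   # [(2, 3), (4, 5), (6, 6)]
```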

I tried searching for a proof, but couldn't find one on any site. Can anyone help me prove it?