In general, how does a DFA know how to successfully process a string the intended way?

Suppose we have:

$$A := \{x, y, z\}$$

$$M := \text{some DFA using } A$$

$$S := xyzxyzxyz$$

Intuitively, one might say $S$ is fed to $M$ one character at a time.

This presupposes some undisclosed mechanism that can tell where one symbol ends and the next begins.
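To make the per-character intuition concrete, here is a minimal sketch of feeding a string to a DFA one symbol at a time. The particular machine (states `q0`–`q2`, accepting strings that cycle through $x, y, z$) is a hypothetical example, not from the question; the point is the loop, where symbol boundaries come for free because each symbol is one character.

```python
# Hypothetical DFA over A = {x, y, z} that accepts repetitions of "xyz".
# Transitions are a partial function; a missing entry means reject.
DELTA = {
    ("q0", "x"): "q1",
    ("q1", "y"): "q2",
    ("q2", "z"): "q0",
}
START, ACCEPT = "q0", {"q0"}

def run(delta, s):
    state = START
    for ch in s:                      # one character == one symbol
        state = delta.get((state, ch))
        if state is None:             # no transition defined: reject
            return False
    return state in ACCEPT

print(run(DELTA, "xyzxyzxyz"))  # True
print(run(DELTA, "xyx"))        # False
```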

One might say: simply take the maximum valid substring (longest match), similar to how lexers tokenise plaintext. To that I say, suppose instead that we defined $A$ as: $$A := \{x, xx, xxx\}$$

Now we have 3 unique symbols, and, as it happens, taking the maximum valid substring restricts what our $M$ can actually process: any string longer than 2 characters will always be assumed to start with $xxx$ rather than, say, $x$ followed by $xx$.
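The restriction can be demonstrated with a short sketch of greedy longest-match segmentation over $A = \{x, xx, xxx\}$ (the helper name `greedy_segment` is mine, not standard):

```python
# Symbols of A, tried longest-first, the way a maximal-munch lexer would.
SYMBOLS = ["xxx", "xx", "x"]

def greedy_segment(s):
    """Greedily split s into symbols of A; None if no symbol matches."""
    out = []
    while s:
        for sym in SYMBOLS:
            if s.startswith(sym):
                out.append(sym)
                s = s[len(sym):]
                break
        else:
            return None
    return out

# Greedy matching commits to one reading and hides the others:
print(greedy_segment("xxx"))    # ['xxx'] -- never ['x', 'xx'], ['xx', 'x'], or ['x', 'x', 'x']
print(greedy_segment("xxxxx"))  # ['xxx', 'xx']
```

Every string of three or more $x$'s gets the $xxx$ prefix forced on it, so $M$ can never be handed the alternative readings.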

One way I see around this is to make each symbol atomic, i.e. synonymous with a single character. That is, $x$ and $xxx$ (from $A$) would each count as a single character.
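One way to realise this atomicity in code (a sketch under my own assumptions, including the hypothetical even-length DFA below) is to feed the machine a sequence of symbols rather than a flat character string, so boundaries live in the data structure instead of being recovered by scanning:

```python
ALPHABET = {"x", "xx", "xxx"}

def run(delta, start, accept, symbols):
    """Run a DFA on a pre-segmented sequence of symbols of A."""
    state = start
    for sym in symbols:
        if sym not in ALPHABET:
            raise ValueError(f"not a symbol of A: {sym!r}")
        state = delta.get((state, sym))
        if state is None:
            return False
    return state in accept

# Toy DFA (assumed example): accepts inputs containing an even number of symbols.
DELTA = {(q, s): 1 - q for q in (0, 1) for s in ALPHABET}

print(run(DELTA, 0, {0}, ("x", "xxx")))       # True: two symbols
print(run(DELTA, 0, {0}, ("xx", "xx", "x")))  # False: three symbols
```

Here `("x", "xxx")` and `("xx", "xx")` are different inputs even though both flatten to `"xxxx"`, which is exactly the distinction the flat string loses.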