Fixed physics time step and input lag

Everything I read says that I should fix the time step of my physics simulation and interpolate the result to the actual frame rate of my graphics. This supposedly helps with simplicity, networking, reproducibility, numerical stability, etc.
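
For reference, this is roughly the loop I have in mind (a bare-bones sketch of the usual accumulator loop; the 1-D `State`, `integrate()`, `lerp()`, the printf "render", and the ~60 fps sleep are placeholders I made up, not any particular engine's API):

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

struct State {
    double position = 0.0;
    double velocity = 1.0;
};

// One fixed physics step of length dt.
State integrate(State s, double dt) {
    s.position += s.velocity * dt;
    return s;
}

// Blend the previous and current states for rendering.
State lerp(const State& a, const State& b, double alpha) {
    State out;
    out.position = a.position * (1.0 - alpha) + b.position * alpha;
    out.velocity = a.velocity * (1.0 - alpha) + b.velocity * alpha;
    return out;
}

double now() {
    using namespace std::chrono;
    return duration<double>(steady_clock::now().time_since_epoch()).count();
}

int main() {
    const double dt = 1.0 / 90.0;   // 90 Hz physics
    double accumulator = 0.0;
    double previousTime = now();

    State previous, current;

    for (int frame = 0; frame < 300; ++frame) {   // stand-in render loop
        double newTime = now();
        accumulator += newTime - previousTime;
        previousTime = newTime;

        // Run physics in fixed increments; input would be sampled here,
        // once per step, so it only takes effect at step boundaries.
        while (accumulator >= dt) {
            previous = current;
            current = integrate(current, dt);
            accumulator -= dt;
        }

        // Render a blend of the last two completed steps. The displayed
        // state is always behind the newest physics state, which is where
        // the 1..2 dt of input lag in my question comes from.
        double alpha = accumulator / dt;
        State drawn = lerp(previous, current, alpha);
        std::printf("pos = %.4f (alpha = %.2f)\n", drawn.position, alpha);

        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // ~60 fps
    }
}
```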

But as I understand it, a fixed time step guarantees between 1 and 2 Δt of input lag, because you have to calculate one step ahead in order to interpolate. With 90 Hz physics, that gives me an average input lag of about 17 ms.
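
If my math is right: Δt = 1000 ms / 90 ≈ 11.1 ms, so the lag would range from about 11.1 ms (1 Δt) to 22.2 ms (2 Δt), averaging 1.5 Δt ≈ 16.7 ms.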

Since I often see gaming enthusiasts talking about 1 ms of input lag and how that makes a difference, I wonder how fast-action games handle this, and how they reduce input lag while using a fixed time step.

Or do they not, and is the "1 ms delay" just marketing mumbo jumbo?