If one were to aim for an increased degree of realism and try to build a probability curve that produces mostly sensible results (simplified, of course, since there is no such thing as a perfect simulation), then approximately how should the chance to hit a target correlate with the distance to the target and the speed of the target (under otherwise identical circumstances, i.e. same aiming time, weapon, character/skill, etc.)?

**Examples:** There are systems which reduce the chance to hit by a fixed percentage per fixed increment added to the range of the target. There are systems which stack range penalties by a logarithmic function of range (e.g. a stacking penalty per doubling of range until reaching some cutoff range). There are systems which apply a constant speed penalty entirely separately from range, and systems which add speed and range together when calculating the penalty. Some of these systems’ probability effects are complicated by the fact that they use non-linear dice curves. Some argue that the probability reduction should be a quadratic function of range, since for each doubling of range the target’s projection (the percentage of the field of view it occupies) becomes ¼ of its previous value, but I don’t recall any system that explicitly and deliberately implemented anything like that.
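To make the candidate shapes concrete, here is a minimal sketch of the three families of range penalty described above. All function names and parameter values are illustrative assumptions, not drawn from any particular game system:

```python
import math

def hit_chance_linear(base, rng, increment, penalty):
    """Linear model: lose a fixed fraction of hit chance per range increment.
    (Illustrative; parameter values are arbitrary assumptions.)"""
    return max(0.0, base - penalty * (rng / increment))

def hit_chance_doubling(base, rng, base_range, penalty):
    """Logarithmic model: a stacking flat penalty per doubling of range
    beyond a 'free' base range."""
    if rng <= base_range:
        return base
    return max(0.0, base - penalty * math.log2(rng / base_range))

def hit_chance_inverse_square(base, rng, base_range):
    """Quadratic model: chance scales with the target's apparent (angular)
    area, which falls off as 1/r^2 -- each doubling of range quarters it."""
    if rng <= base_range:
        return base
    return base * (base_range / rng) ** 2
```

Note how differently the tails behave: the linear model hits 0% at a finite range, the logarithmic one decays slowly and needs an explicit cutoff, and the inverse-square one never quite reaches zero but collapses quickly past the base range.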

After asking elsewhere, I’ve been pointed to the steering law and Fitts’s law, but these seem to be meant for fixed accuracy and variable time, while in RPGs a fixed aim time and a variable chance to hit are much more workable.
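For what it’s worth, Fitts’s law (time = a + b·log₂(2D/W), with a and b empirically fitted constants) can be algebraically inverted for a fixed time budget: instead of asking how long a target takes to acquire, you ask how small a target can be acquired in the allotted time, and treat anything smaller as carrying a miss chance. A sketch of that inversion, with purely hypothetical constants:

```python
import math

def fitts_time(a, b, distance, width):
    """Fitts's law: predicted acquisition time for a target of a given
    width at a given distance. a and b are empirically fitted constants
    (the values used below are hypothetical)."""
    return a + b * math.log2(2.0 * distance / width)

def effective_width(a, b, distance, time_budget):
    """Inverted Fitts's law for a fixed aim time: the smallest target
    width that can just be acquired within the time budget. Targets
    wider than this are 'easy' hits; narrower ones imply a miss chance."""
    return 2.0 * distance / 2 ** ((time_budget - a) / b)
```

The inversion is exact: feeding the time predicted for a given width back into `effective_width` recovers that width, which suggests one route from a time-based law to a fixed-time hit probability.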

Note that I’m not asking which dice mechanics to use for modelling those probability adjustments; I’m assuming there are multiple ways of fitting dice to a desired probability function. First I’d like to learn which probability functions are the most fitting (simplified and generalised, of course) representations of real-life shooting situations.