Is it possible for the runtime of an algorithm to be inversely related to its input size?


Just as a fun mental exercise: is it possible for an algorithm to have a runtime that decreases monotonically as the input size grows? If such an algorithm cannot exist, is it possible to prove that? I haven't been able to come up with an example or a counterexample so far, and it seems like an interesting problem.

P.S. Something like $O(\frac{1}{n})$, I guess (if such a class even exists).
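
To make the P.S. concrete, here is a minimal Python sketch of the kind of thing I have in mind (the function name and the constant 1000 are made up purely for illustration, and I'm only counting loop iterations, not the cost of receiving $n$ itself):

```python
def shrinking_work(n: int) -> int:
    # Hypothetical example: the loop runs about 1000 // n times,
    # so the amount of work decreases as n grows (never below one iteration).
    total = 0
    for _ in range(max(1, 1000 // n)):
        total += 1
    return total

# The iteration count drops as n increases:
for n in (1, 10, 100, 1000):
    print(n, shrinking_work(n))
```

Of course, this only shows the loop count shrinking with $n$; whether it counts as a genuine $O(\frac{1}{n})$ algorithm presumably depends on what model of computation and input representation we assume.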