Suppose $X, Y$ are two independent non-negative random variables. The conditions

(i) $\mathbb{P}(X > t) = \frac{C}{t^p} + o(t^{-p})$

(ii) $\mathbb{P}(Y > t) = o(t^{-q})$ for every $q > 0$

imply

(iii) $\mathbb{P}(XY > t) = \frac{C\,\mathbb{E}[Y^p]}{t^p} + o(t^{-p})$.

(Of course, here I am talking about the asymptotic behaviour as $t \to \infty$, and $p > 0$.)
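For concreteness, here is a quick Monte Carlo sanity check of the forward direction (i) + (ii) $\Rightarrow$ (iii). The specific choices are mine: $X$ is Pareto with exact tail $\mathbb{P}(X > t) = t^{-p}$ for $t \ge 1$ (so $C = 1$ and the $o(t^{-p})$ term vanishes), $Y$ is standard lognormal (which satisfies (ii)), $p = 2$, and $\mathbb{E}[Y^p] = e^{p^2/2}$ for a standard lognormal.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**7
p, C = 2.0, 1.0  # X has exact Pareto tail P(X > t) = C / t^p for t >= 1
t = 50.0

# X via inverse CDF of the Pareto(p) distribution; Y standard lognormal
X = rng.uniform(size=n) ** (-1.0 / p)
Y = rng.lognormal(mean=0.0, sigma=1.0, size=n)

# Prediction (iii): P(XY > t) ~ C * E[Y^p] / t^p, with E[Y^p] = exp(p^2/2)
predicted = C * np.exp(p**2 / 2) / t**p
empirical = np.mean(X * Y > t)
print(empirical / predicted)  # close to 1 for large t and n
```

At moderate $t$ the ratio sits slightly below $1$, consistent with the (negative) $o(t^{-p})$ correction coming from truncating $\mathbb{E}[Y^p]$ at $Y \le t$.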

My question concerns a converse of this statement: if I know (ii) and (iii), does that imply (i)?

(While I would very much like this to be true, my impression is that the claim is false; I just haven't come up with a counterexample.)

I am aware that the converse holds at the exponential level, i.e. $$\lim_{t \to \infty} \frac{\log \mathbb{P}(X > t)}{\log t} = -p.$$

One may assume that $Y$ has a density (which suffices for my purposes), if that helps. In case a counterexample to the "full" converse can be found, I would like to know whether the "full" converse still holds when (a) $Y$ is a lognormal random variable, or, slightly more generally, (b) the tail of $Y$ is bounded above by that of some lognormal.

**Update**: to clarify, $Y$ is a given random variable, whose distribution is therefore fixed and cannot be chosen freely. In particular, $Y$ is not a constant (otherwise the converse is trivially true, unless $Y = 0$ a.s., in which case it is trivially false).