Expected Solution of a Stochastic Differential Equation Expressed as Conditional Expectation

To all you geniuses out there: this is a tough one.

Preliminaries and Rigorous Technical Framework

  • Let $T \in (0, \infty)$ be fixed.

  • Let $d \in \mathbb{N}_{\geq 1}$ be fixed.

  • Let $$(\Omega, \mathcal{G}, (\mathcal{G}_t)_{t \in [0,T]}, \mathbb{P})$$ be a complete probability space with a complete, right-continuous filtration $(\mathcal{G}_t)_{t \in [0,T]}$.

  • Let $$B : [0,T] \times \Omega \rightarrow \mathbb{R}^d, \quad (t,\omega) \mapsto B_t(\omega)$$ be a standard $d$-dimensional $(\mathcal{G}_t)_{t \in [0,T]}$-adapted Brownian motion on $\mathbb{R}^d$ such that, for every pair $(t,s) \in [0,T]^2$ with $t < s$, the random variable $B_s - B_t$ is independent of $\mathcal{G}_t$.

  • Let \begin{align} &\sigma: \mathbb{R}^d \rightarrow \mathbb{R}^{d \times d}, \\ &\mu: \mathbb{R}^d \rightarrow \mathbb{R}^{d}, \end{align} be affine-linear transformations, i.e. let there be matrices $(A^{(\sigma)}_1, \dots, A^{(\sigma)}_d, \bar{A}^{(\sigma)}) := \theta_{\sigma} \in (\mathbb{R}^{d \times d})^{d+1}$ such that, for all $x \in \mathbb{R}^d$, \begin{equation} \sigma(x) = ( A^{(\sigma)}_1 x \mid \dots \mid A^{(\sigma)}_d x) + \bar{A}^{(\sigma)}, \end{equation} where $A^{(\sigma)}_i x$ denotes the $i$-th column of the matrix $\sigma(x) \in \mathbb{R}^{d \times d}$, and let there be a matrix-vector pair $(A^{(\mu)}, \bar{a}^{(\mu)}) := \theta_{\mu} \in \mathbb{R}^{d \times d} \times \mathbb{R}^d$ such that, for all $x \in \mathbb{R}^d$, \begin{equation} \mu(x) = A^{(\mu)} x + \bar{a}^{(\mu)}. \end{equation}

  • Let \begin{equation} \varphi : \mathbb{R}^d \rightarrow \mathbb{R} \end{equation} be a fixed, continuous and at most polynomially growing function, i.e. let $\varphi$ be continuous and let there be a constant $C \in [1, \infty)$ such that, for all $x \in \mathbb{R}^d$, it holds that \begin{equation} \lvert \varphi(x) \rvert \leq C (1 + \lVert x \rVert)^C. \end{equation}

  • Let $x_0 \in \mathbb{R}^d$ be fixed.
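For concreteness, the parametrization of the affine-linear coefficients can be sketched in NumPy; the helper name `make_coefficients` is my own and not part of the framework above:

```python
import numpy as np

def make_coefficients(theta_sigma, theta_mu):
    """Build the affine-linear maps sigma and mu from the parameter tuples.

    theta_sigma = (A_1, ..., A_d, A_bar), each a (d, d) matrix;
    theta_mu    = (A_mu, a_bar) with A_mu a (d, d) matrix and a_bar a vector.
    """
    *A_cols, A_bar = theta_sigma
    A_mu, a_bar = theta_mu

    def sigma(x):
        # the i-th column of sigma(x) is A_i @ x; then add the constant matrix
        return np.column_stack([A @ x for A in A_cols]) + A_bar

    def mu(x):
        return A_mu @ x + a_bar

    return sigma, mu
```

For instance, with all $A^{(\sigma)}_i$ equal to the identity and $\bar{A}^{(\sigma)} = 0$, every column of $\sigma(x)$ is just $x$.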


Consider the following stochastic differential equation, given as an equivalent stochastic integral equation, where the multidimensional integrals are to be read componentwise:

\begin{equation} S_t = x_0 + \int_{0}^{t} \mu(S_s) \, ds + \int_{0}^{t} \sigma(S_s) \, dB_s. \end{equation}

Under our assumptions, it is the case that an (up to indistinguishability) unique solution process

$$S^{(x_0, \theta_{\sigma}, \theta_{\mu})} : [0,T] \times \Omega \rightarrow \mathbb{R}^d, \quad (t, \omega) \mapsto S^{(x_0, \theta_{\sigma}, \theta_{\mu})}_t(\omega),$$

for this equation exists (see, for example, Theorem 8.3 in Le Gall's Brownian Motion, Martingales, and Stochastic Calculus).
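For intuition, the solution for fixed parameter values can be simulated by an Euler–Maruyama Monte Carlo scheme; the sketch below (function name, step and path counts are my own choices, not part of the rigorous framework) estimates the expectation functional discussed next:

```python
import numpy as np

def euler_maruyama_mc(x0, sigma, mu, phi, T, n_steps=200, n_paths=10_000, rng=None):
    """Monte Carlo estimate of E[phi(S_T)] for dS = mu(S) dt + sigma(S) dB."""
    rng = np.random.default_rng(rng)
    d = len(x0)
    dt = T / n_steps
    # one row per simulated path, started at x0
    S = np.tile(np.asarray(x0, dtype=float), (n_paths, 1))
    for _ in range(n_steps):
        dB = rng.normal(scale=np.sqrt(dt), size=(n_paths, d))
        # componentwise Euler step: S += mu(S) dt + sigma(S) dB
        drift = np.array([mu(s) for s in S])
        diffusion = np.array([sigma(s) @ b for s, b in zip(S, dB)])
        S += drift * dt + diffusion
    return np.mean([phi(s) for s in S])
```

As a sanity check, with $d = 1$, $\mu \equiv 0$, $\sigma \equiv 1$ and $\varphi = \mathrm{id}$, the process is $S_T = x_0 + B_T$, so the estimate should be close to $x_0$.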

I am interested in the expectation of $S^{(x_0, \theta_{\sigma}, \theta_{\mu})}$ at time $T$ when passed through the function $\varphi$: $$\mathbb{E}[\varphi(S^{(x_0, \theta_{\sigma}, \theta_{\mu})}_T)].$$ More specifically, I want to express $\mathbb{E}[\varphi(S^{(x_0, \theta_{\sigma}, \theta_{\mu})}_T)]$ in the following way as a conditional expectation: $$\mathbb{E}[\varphi(S^{(x_0, \theta_{\sigma}, \theta_{\mu})}_T)] = \mathbb{E}[\varphi(S^{(X_0, \Theta_{\sigma}, \Theta_{\mu})}_T) \mid (X_0, \Theta_{\sigma}, \Theta_{\mu}) = (x_0, \theta_{\sigma}, \theta_{\mu})].$$

Here, $$X_0 : \Omega \rightarrow \mathbb{R}^d,$$ $$\Theta_{\sigma} : \Omega \rightarrow (\mathbb{R}^{d \times d})^{d+1},$$ $$\Theta_{\mu} : \Omega \rightarrow \mathbb{R}^{d \times d} \times \mathbb{R}^d$$ are $\mathcal{G}_0$-measurable random variables, which define the initial value of the process at $t = 0$ as well as the entries of the affine-linear coefficient functions $\mu$ and $\sigma$. In particular, the coefficient functions themselves become random.

The random variable

$ $ S^{(X_0, \Theta_{\sigma}, \Theta_{\mu})}_T : \Omega \rightarrow \mathbb{R}^d$ $

is implicitly defined by the procedure of first “drawing” the random variables $(X_0, \Theta_{\sigma}, \Theta_{\mu})$ at time $t = 0$ in order to obtain fixed values $$(X_0, \Theta_{\sigma}, \Theta_{\mu}) = (\tilde{x}_0, \tilde{\theta}_{\sigma}, \tilde{\theta}_{\mu})$$ and then “afterwards” setting $$S^{(X_0, \Theta_{\sigma}, \Theta_{\mu})}_T := S^{(\tilde{x}_0, \tilde{\theta}_{\sigma}, \tilde{\theta}_{\mu})}_T,$$ where
$$S^{(\tilde{x}_0, \tilde{\theta}_{\sigma}, \tilde{\theta}_{\mu})} : [0,T] \times \Omega \rightarrow \mathbb{R}^d, \quad (t, \omega) \mapsto S^{(\tilde{x}_0, \tilde{\theta}_{\sigma}, \tilde{\theta}_{\mu})}_t(\omega)$$ is the (up to indistinguishability) unique solution process of the stochastic differential equation

\begin{equation} S_t = \tilde{x}_0 + \int_{0}^{t} \tilde{\mu}(S_s) \, ds + \int_{0}^{t} \tilde{\sigma}(S_s) \, dB_s. \end{equation}

Here, $\tilde{\sigma}$ and $\tilde{\mu}$ are the affine-linear maps associated with the parameter values $\tilde{\theta}_{\sigma}$ and $\tilde{\theta}_{\mu}$ as described above.
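Numerically, this two-stage procedure can be mimicked directly: first draw $(X_0, \Theta_{\sigma}, \Theta_{\mu})$, then run an SDE solver with the drawn values frozen. The sketch below is a heuristic illustration only (the function names are my own); it sidesteps exactly the measurability issues the questions below are about:

```python
import numpy as np

def sample_randomized_sde(draw_params, solve_sde, n_outer=100, rng=None):
    """Average phi(S_T) over random parameter draws.

    draw_params(rng) -> (x0, theta_sigma, theta_mu): one draw of the
    G_0-measurable parameters.  solve_sde(x0, theta_sigma, theta_mu, rng)
    returns one sample (or Monte Carlo estimate) of phi(S_T) for the
    frozen parameter values -- the "afterwards" step described above.
    """
    rng = np.random.default_rng(rng)
    estimates = [solve_sde(*draw_params(rng), rng) for _ in range(n_outer)]
    return float(np.mean(estimates))
```

Conditioning on a fixed parameter value then corresponds to replacing `draw_params` by a constant function, which recovers the fixed-parameter expectation.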

Now, my questions:

  1. I know that there are technical problems with the way I “defined” the random variable $S^{(X_0, \Theta_{\sigma}, \Theta_{\mu})}$, although I hope the idea is clear. How can I make the definition of $S^{(X_0, \Theta_{\sigma}, \Theta_{\mu})}$ rigorous in the above framework?
  2. After having obtained a rigorous definition of $S^{(X_0, \Theta_{\sigma}, \Theta_{\mu})}$, how can I then show that $$\mathbb{E}[\varphi(S^{(x_0, \theta_{\sigma}, \theta_{\mu})}_T)] = \mathbb{E}[\varphi(S^{(X_0, \Theta_{\sigma}, \Theta_{\mu})}_T) \mid (X_0, \Theta_{\sigma}, \Theta_{\mu}) = (x_0, \theta_{\sigma}, \theta_{\mu})]?$$

If further regularity assumptions (for example on the random variables $X_0, \Theta_{\sigma}, \Theta_{\mu}$) are necessary to answer the above questions satisfactorily, feel free to impose them.

These questions are at the core of my current research. I am stuck and I would be extremely grateful for any advice!