How to compute $\mathbb{E}[X_{s}^{2}e^{\lambda X_{s}}]$ where $(X_s)$ is a Brownian motion with drift $\mu$?
I'm working on a problem and at a certain point I ran into the question described in the title. We have that $\{W_t,\ t\geq 0\}$ is a Brownian motion and $(\mathscr{F}_t)$ is the corresponding filtration. A constant $\mu>0$ is given, and the process $\{X_t,\ t\geq 0\}$ is defined via $X_t:=\mu t+W_t$.
I don't want to post the full problem I was solving yet; rather, I'd like to know whether what I ended up with is even solvable, because if not, I'll know I'm definitely wrong.
As in the title, I reached a point where I was left to compute the expectation
$\mathbb{E}[X_{s}^{2}e^{\lambda X_{s}}].$
Earlier in the exercise (it consisted of multiple parts) I used the moment generating function of the normal distribution. However, as far as I know, I cannot take the $X_{s}^{2}$ out of the expectation, which stops me from applying the moment generating function directly.
Can this expectation be computed in a relatively easy way? If not, I'll know I'm wrong and start over.
probability-theory stochastic-processes brownian-motion expected-value
Differentiate twice $E(e^{\lambda X_t})$ with respect to $\lambda$.
– Did
Jan 4 at 20:16
@Did That works. Thank you very much!
– S. Crim
Jan 5 at 8:50
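The hint can also be carried out mechanically with a computer algebra system. Below is a minimal SymPy sketch (included purely as an illustration; the symbols mu, s, lam mirror the notation in the question):

```python
import sympy as sp

# Symbols: drift mu, time s, and the MGF argument lambda (written lam)
mu, s, lam = sp.symbols('mu s lam', positive=True)

# MGF of X_s ~ N(mu*s, s):  E[e^{lam X_s}] = exp(lam*mu*s + s*lam**2/2)
mgf = sp.exp(lam * mu * s + sp.Rational(1, 2) * s * lam**2)

# Differentiating twice with respect to lam gives E[X_s^2 e^{lam X_s}]
second = sp.simplify(sp.diff(mgf, lam, 2))
print(second)  # equivalent to ((mu*s + s*lam)**2 + s) * exp(lam*mu*s + s*lam**2/2)
```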
1 Answer
Using the setting stated in the question and the tip given in the comments, the expectation can be worked out as follows. The key observation is that the expectation we want to compute equals the second derivative with respect to $\lambda$ of the moment generating function of a normal distribution. For $X\sim N(\mu,\sigma^2)$ the moment generating function is
$\mathbb{E}[e^{\lambda X}]=e^{\lambda \mu + \frac{1}{2}\sigma^2 \lambda^2}.$
In our setting we have $X_s = \mu s + W_s \sim N(\mu s,\ s)$, so the moment generating function becomes
$\mathbb{E}[e^{\lambda X_s}]=e^{\lambda \mu s + \frac{1}{2}s \lambda^2}.$
Taking the first derivative with respect to $\lambda$ yields
$\mathbb{E}[X_{s}e^{\lambda X_s}]=(\mu s+s\lambda)e^{\lambda \mu s + \frac{1}{2}s \lambda^2}.$
Differentiating once more yields the desired answer:
$\mathbb{E}[X_{s}^{2}e^{\lambda X_s}]=(\mu s+s\lambda)^{2}e^{\lambda \mu s + \frac{1}{2}s \lambda^2}+s\, e^{\lambda \mu s + \frac{1}{2}s \lambda^2}=\bigl((\mu s+s\lambda)^{2}+s\bigr)e^{\lambda \mu s + \frac{1}{2}s \lambda^2}.$
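As a quick numerical sanity check of this closed form, a short Monte Carlo simulation with NumPy (a minimal sketch; the parameter values below are arbitrary) produces a matching estimate:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, s, lam = 0.5, 2.0, 0.3          # arbitrary test values
n = 1_000_000

# Simulate X_s = mu*s + W_s with W_s ~ N(0, s)
x = mu * s + np.sqrt(s) * rng.standard_normal(n)

mc = np.mean(x**2 * np.exp(lam * x))                                  # Monte Carlo estimate
closed = ((mu*s + s*lam)**2 + s) * np.exp(lam*mu*s + 0.5*s*lam**2)    # derived formula
print(mc, closed)  # the two values should be close, up to Monte Carlo error
```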