Variance of a random sum of i.i.d. random variables - spot the mistake?
Probably a trivial mistake, but can't seem to spot it:
Assume $X_1, \ldots, X_\tau$ are i.i.d. random variables and $\tau \in \{1, \ldots, n\}$ is a random variable, where $S_\tau = X_1 + \ldots + X_\tau$ denotes the random sum.
It can be shown that the following holds:
$$Var(S_\tau | \tau )=\tau Var(X_1)$$
However, from what I know,
$$Var(S_\tau | \tau) = \mathbb{E}\big( (S_\tau - \mathbb{E}(S_\tau|\tau))^2 \,\big|\, \tau\big) = \mathbb{E}(S_\tau^2|\tau) - \tau^2\mathbb{E}(X_1)^2$$
Here $$ \mathbb{E}(S^2_\tau|\tau)=\mathbb{E}((X_1+\ldots+X_\tau)^2|\tau)=\sum_{i,j}^{n}\mathbb{E}(X_iX_j|\tau)\mathbb{1}_{\{i,j\leq \tau\}}=\mathbb{E}(X_1^2)\tau^2$$
due to independence and identical distributions.
So with my calculations I'm getting $$Var(S_\tau|\tau)=\tau^2 Var(X_1)$$
and I'm not sure where the squared $\tau$ should disappear.
I seem to be missing something, but can't spot it.
Would be grateful for any observations!
probability probability-theory conditional-expectation expected-value
4 Answers
I think one oopsie is: since $X_i$ and $X_j$ are independent for $i \ne j$, $\mathbb{E}[X_iX_j] = \mathbb{E}[X_i]\mathbb{E}[X_j] = \mathbb{E}[X_1]^2$ and not $\mathbb{E}[X_1^2]$! There must be another mistake, because plugging that in returns zero, but I don't see it quite yet.
There's also a much simpler proof: $Var(S_\tau|\tau) = Var(\sum_{i=1}^\tau X_i|\tau)$. The variance of a sum of independent random variables is the sum of their variances, so $Var(\sum_{i=1}^\tau X_i|\tau) = \sum_{i=1}^\tau Var(X_1) = \tau Var(X_1)$.
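To see the two quantities side by side, here is a minimal numerical sketch, assuming for illustration that the $X_i$ are Exponential(1), so $\mathbb{E}[X_1]^2 = 1$ while $\mathbb{E}[X_1^2] = 2$:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 1_000_000

# Two independent Exponential(1) draws play the roles of X_i and X_j with i != j.
x_i = rng.exponential(scale=1.0, size=n_samples)
x_j = rng.exponential(scale=1.0, size=n_samples)

print("E[X_i X_j], i != j:", np.mean(x_i * x_j))  # close to E[X_1]^2 = 1
print("E[X_1^2]:          ", np.mean(x_i ** 2))   # close to 2, a different number
```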
First of all, $\mathbb{E}[X_iX_j] = \mathbb{E}[X_i]\mathbb{E}[X_j] = (\mathbb{E}[X_1])^2 \ne \mathbb{E}[X_1^2]$ for $i \ne j$.
Also, there will be $\tau$ terms of the form $X_i^2$ and $\tau(\tau-1)$ terms of the form $X_iX_j$ with $i \ne j$ in the expansion of the square. Therefore your second equation becomes
$$\mathbb{E}(S^2_\tau|\tau)=\mathbb{E}((X_1+\ldots+X_\tau)^2|\tau)=\sum_{i,j}^{n}\mathbb{E}(X_iX_j|\tau)\mathbb{1}_{\{i,j\leq \tau\}}=\mathbb{E}(X_1^2)\tau + \tau(\tau-1)(\mathbb{E}[X_1])^2$$
$$Var(S_\tau|\tau)=\mathbb{E}(X_1^2)\tau + \tau(\tau-1)(\mathbb{E}[X_1])^2 - \tau^2(\mathbb{E}[X_1])^2$$
$$= \tau\mathbb{E}(X_1^2) -\tau(\mathbb{E}[X_1])^2 $$
$$= \tau(\mathbb{E}(X_1^2)-(\mathbb{E}[X_1])^2)$$
$$=\tau Var(X_1)$$
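As a quick sanity check of $Var(S_\tau|\tau=k)=k\,Var(X_1)$, here is a minimal Monte Carlo sketch; the Exponential(1) distribution and the values of $k$ are arbitrary choices, and $\tau$ is taken to be independent of the $X_i$:

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples = 500_000
var_x1 = 1.0  # Var(X_1) for Exponential(1)

for k in (1, 3, 10):
    # Conditional on tau = k, S_tau is simply the sum of k i.i.d. draws.
    s_k = rng.exponential(scale=1.0, size=(n_samples, k)).sum(axis=1)
    print(f"k={k:2d}: empirical Var = {s_k.var():.3f}, k*Var(X_1) = {k * var_x1:.3f}")
```

The empirical variances grow linearly in $k$, matching $\tau\,Var(X_1)$ rather than $\tau^2\,Var(X_1)$.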
The most succinct calculation writes the covariance of two i.i.d. variables using the Kronecker delta, with $\sigma^2 = \operatorname{Var}(X_1)$:$$\operatorname{Var}(S_\tau\mid\tau)=\sum_{1\le i,\,j\le\tau}\operatorname{Cov}(X_i,\,X_j)=\sum_{ij}\sigma^2\delta_{ij}=\tau\sigma^2.$$Your mistake, essentially, was to replace $\delta_{ij}$ with $1$ throughout.
Conditional on $\tau=k$, where $1\leq k \leq n$, $S_\tau=S_k=X_1+\dotsc +X_k$. For any $(X_i)$ that are i.i.d., we have that
$$Var(X_1+\dotsc+X_k)=Var(X_1)+\dotsc +Var(X_k)=kVar(X_1).$$
Thus, assuming $\tau$ is independent of the $(X_i)$,
$$Var(X_1+\dotsc +X_\tau \mid \tau =k)=Var(X_1 \mid \tau =k)+\dotsc+ Var(X_k \mid \tau=k)$$
$$=Var(X_1)+\dotsc +Var(X_k)=k Var(X_1),$$
and finally we obtain $Var(S_\tau \mid \tau)= \tau Var(X_1)$.