Find the orthogonal projection of $[n,0,0,\dots,0]^T$ onto the subspace $V$
$n>1$. Given is $$V = \left\{ \vec{x} \in \mathbb{R}^n : x_1 + x_2 + \dots + x_n = 0 \right\}.$$
a) Find an orthogonal basis of $V^{\perp}$.
b) Find the orthogonal projection of $\vec{x} = [n,0,0,\dots,0]^T$ onto the subspace $V$.
As for a):
$$\dim V^{\perp} = n - \dim V = n - (n-1) = 1,$$ so $V^{\perp}$ is the span of a single vector perpendicular to $V$.
Take $[1,1,\dots,1]^T$; it is perpendicular to $V$.
Let's start the Gram–Schmidt process; but we have only $1$ vector, so $u_1 = [1,1,\dots,1]^T$ already is an orthogonal basis of $V^{\perp}$.
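Part a) can be sanity-checked numerically: any vector whose coordinates sum to zero should be orthogonal to $[1,1,\dots,1]^T$. A minimal sketch, assuming NumPy is available (the test vector is generated at random purely for illustration):

```python
import numpy as np

n = 6  # any n > 1; chosen only for illustration
ones = np.ones(n)

# Any vector whose entries sum to zero lies in V.
rng = np.random.default_rng(0)
v = rng.standard_normal(n)
v -= v.mean()  # force sum(v) = 0, so v is in V

# Its inner product with the all-ones vector vanishes (up to rounding).
print(abs(np.dot(ones, v)) < 1e-10)  # prints True
```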
b) This part seems to be very interesting and hard. I found a basis of $V$:
$$[-1,1,0,0,\dots,0] = \vec{v}_1$$
$$[-1,0,1,0,\dots,0] = \vec{v}_2$$
$$[-1,0,0,1,\dots,0] = \vec{v}_3$$
$$\vdots$$
$$[-1,0,0,0,\dots,1] = \vec{v}_{n-1}$$
Now I start the Gram–Schmidt process:
$$u_1 = v_1$$
$$u_2 = v_2 - \frac{1}{2} \cdot v_1$$
$$u_3 = v_3 - \frac{1}{4} \cdot v_2 + \frac{1}{8} \cdot v_1$$
$$u_4 = v_4 - \frac{7}{16} \cdot v_3 + \frac{7}{64} \cdot v_2 - \frac{7}{64} \cdot v_1$$
I am not even sure whether these coefficients are correct. Moreover, the calculations keep getting harder, and I still don't see any regular pattern in them. Can somebody help me with this task?
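Rather than computing the coefficients by hand, the Gram–Schmidt recursion can be run numerically to check them and to reveal the pattern. A sketch, assuming NumPy; the `gram_schmidt` helper below is written here for illustration, not taken from any library:

```python
import numpy as np

def gram_schmidt(vectors):
    """Modified Gram-Schmidt, without normalisation."""
    ortho = []
    for v in vectors:
        u = np.asarray(v, dtype=float)
        for w in ortho:
            u = u - (np.dot(u, w) / np.dot(w, w)) * w  # remove component along w
        ortho.append(u)
    return ortho

n = 5  # a small case, just to inspect the pattern
# v_k = (-1, 0, ..., 0, 1, 0, ..., 0) with the 1 in position k+1
basis = [[-1] + [0] * k + [1] + [0] * (n - 2 - k) for k in range(n - 1)]

us = gram_schmidt(basis)
for u in us:
    print(u)
```

For $n=5$ the output suggests $u_k = \bigl(-\tfrac1k, \dots, -\tfrac1k, 1, 0, \dots, 0\bigr)$ (the first $k$ entries equal to $-\tfrac1k$), which is easy to verify directly; in particular $u_2 = v_2 - \tfrac12 v_1$ matches the hand computation above, while $u_3$ comes out as $(-\tfrac13,-\tfrac13,-\tfrac13,1,0)$, hinting at a slip in the coefficients for $u_3$ and $u_4$.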
linear-algebra orthogonality
Why do you need an orthogonal basis of $V$? One can find the projection if one follows the vector $(1, 1, \ldots, 1)$ until the line meets $V$.
– A.Γ.
yesterday
I need an orthogonal basis because I have this formula for the orthogonal projection: $P_z(x) = \sum_{j=1}^k \langle z_j, x \rangle\, z_j$, where $z_1, \dots, z_k$ is an orthogonal basis, and I don't have any other idea how to do this task. @A.Γ.
– VirtualUser
yesterday
You can save yourself a lot of work by taking advantage of the fact that the orthogonal projection onto a subspace is what’s left after subtracting the orthogonal projection onto its complement.
– amd
yesterday
asked yesterday
VirtualUser
1 Answer
Let us call $\vec u$ the projection of $\vec{x} = [n,0,0,\dots,0]^T$ onto $V$.
$\vec x - \vec u$ is orthogonal to $V$, i.e. $\vec x - \vec u = a \vec v$ with $\vec v = (1, \dots, 1)^T$, as you already showed.
Therefore $\vec x = a \vec v + \vec u$, with $\vec u$ satisfying $\sum_i u_i = 0$.
Then, since $u_i = x_i - a$,
$$ \sum_i (x_i - a) = 0, $$
i.e. $n - na = 0$. Finally $a = 1$ and
$$\vec u = (n-1, -1, \dots, -1)^T.$$
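This result is easy to verify numerically: $\vec u$ should have zero coordinate sum (so it lies in $V$), and $\vec x - \vec u$ should be a multiple of $(1, \dots, 1)^T$. A quick check, assuming NumPy ($n = 7$ is an arbitrary illustrative value):

```python
import numpy as np

n = 7  # arbitrary illustrative value, n > 1
x = np.zeros(n)
x[0] = n                # x = (n, 0, ..., 0)

u = np.full(n, -1.0)
u[0] = n - 1            # claimed projection (n-1, -1, ..., -1)

print(u.sum())                            # prints 0.0, so u lies in V
print(np.array_equal(x - u, np.ones(n)))  # prints True: x - u = 1 * (1, ..., 1)
```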
Should the answer be $\vec u = (n-1, 1, \dots, 1)^T$ or $\vec u = (n-1, -1, \dots, -1)^T$?
– VirtualUser
yesterday
Great, this is a lot simpler than my approach, thanks.
– VirtualUser
yesterday
@VirtualUser Corrected. You read my answer before I had time to check it!
– Damien
yesterday
Your Answer
StackExchange.ifUsing("editor", function () {
return StackExchange.using("mathjaxEditing", function () {
StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix) {
StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
});
});
}, "mathjax-editing");
StackExchange.ready(function() {
var channelOptions = {
tags: "".split(" "),
id: "69"
};
initTagRenderer("".split(" "), "".split(" "), channelOptions);
StackExchange.using("externalEditor", function() {
// Have to fire editor after snippets, if snippets enabled
if (StackExchange.settings.snippets.snippetsEnabled) {
StackExchange.using("snippets", function() {
createEditor();
});
}
else {
createEditor();
}
});
function createEditor() {
StackExchange.prepareEditor({
heartbeatType: 'answer',
autoActivateHeartbeat: false,
convertImagesToLinks: true,
noModals: true,
showLowRepImageUploadWarning: true,
reputationToPostImages: 10,
bindNavPrevention: true,
postfix: "",
imageUploader: {
brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
allowUrls: true
},
noCode: true, onDemand: true,
discardSelector: ".discard-answer"
,immediatelyShowMarkdownHelp:true
});
}
});
Sign up or log in
StackExchange.ready(function () {
StackExchange.helpers.onClickDraftSave('#login-link');
});
Sign up using Google
Sign up using Facebook
Sign up using Email and Password
Post as a guest
Required, but never shown
StackExchange.ready(
function () {
StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fmath.stackexchange.com%2fquestions%2f3060582%2ffind-orthogonal-projection-of-n-0-0-0t-on-subspace-v%23new-answer', 'question_page');
}
);
Post as a guest
Required, but never shown
1 Answer
1
active
oldest
votes
1 Answer
1
active
oldest
votes
active
oldest
votes
active
oldest
votes
edited yesterday
answered yesterday
Damien