Is $V$ isomorphic to the direct sum of a subspace $U$ and $V/U$?


























Given a vector space $V$ and a subspace $U$ of $V$, does
$$ V \cong U \oplus (V/U) $$
always hold? Here $\oplus$ denotes the external direct sum.
For a finite-dimensional vector space $V$, here is my attempt at a proof:



Let the dimension of $U$ be $m$ and the dimension of $V$ be $n$.
Choose a basis of $U$: $\{ \mathbf{u_1}, \mathbf{u_2}, \cdots, \mathbf{u_m} \}$, and extend it to a basis for $V$: $\{ \mathbf{u_1}, \mathbf{u_2}, \cdots, \mathbf{u_m}, \mathbf{v_1}, \cdots, \mathbf{v_{n-m}} \}$.



For every vector $\mathbf{x} \in V$, we can write $\mathbf{x}= c_1 \mathbf{u_1}+ \cdots + c_m \mathbf{u_m} + d_1 \mathbf{v_1}+ \cdots + d_{n-m} \mathbf{v_{n-m}}$ uniquely. Define a linear map $T$ by $$T(\mathbf{x})=\bigl(c_1 \mathbf{u_1}+ \cdots + c_m \mathbf{u_m},\ [d_1 \mathbf{v_1}+ \cdots + d_{n-m} \mathbf{v_{n-m}}]\bigr),$$
where $[\,\cdot\,]$ denotes the equivalence class in $V/U$. We claim $T$ is an isomorphism onto $U \oplus (V/U)$.



Surjectivity is obvious. As for injectivity,

if $T(\mathbf{x})=(\mathbf{0},[\mathbf{0}])$, then $c_1 \mathbf{u_1}+ \cdots + c_m \mathbf{u_m}= \mathbf{0}$
$\Rightarrow c_1=0, c_2=0, \cdots, c_m=0$
$\Rightarrow \mathbf{x}= d_1 \mathbf{v_1}+ \cdots + d_{n-m} \mathbf{v_{n-m}}$.

Since $[d_1 \mathbf{v_1}+ \cdots + d_{n-m} \mathbf{v_{n-m}}] = [\mathbf{0}]$, we have
$(d_1 \mathbf{v_1}+ \cdots + d_{n-m} \mathbf{v_{n-m}}-\mathbf{0})\in U$; as the $\mathbf{v_i}$ are linearly independent from the basis vectors of $U$, this means $d_1=0, d_2=0, \cdots, d_{n-m}=0$
$\Rightarrow \mathbf{x}=\mathbf{0}$, so $T$ is injective.
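(As a concrete sanity check, not part of the argument itself: take $V=\mathbb{R}^3$ and $U=\operatorname{span}(\mathbf{e_1},\mathbf{e_2})$ with the extended basis $\{\mathbf{e_1},\mathbf{e_2},\mathbf{e_3}\}$. The map above then reads
$$T(x_1\mathbf{e_1}+x_2\mathbf{e_2}+x_3\mathbf{e_3})=\bigl(x_1\mathbf{e_1}+x_2\mathbf{e_2},\ [x_3\mathbf{e_3}]\bigr),$$
which is visibly a bijection onto $U \oplus (V/U)$, since $V/U$ is spanned by $[\mathbf{e_3}]$.)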



Is the above proof correct? Does this mean $V \cong \ker F \oplus (V/ \ker F) \cong \ker F \oplus \mathrm{im}\, F$ for any linear map $F$, because $\ker F$ is a subspace of $V$?
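(The second isomorphism in that chain is, as I understand it, the first isomorphism theorem; a minimal sketch of the map I have in mind:
$$\bar F : V/\ker F \to \mathrm{im}\, F, \qquad \bar F\bigl([\mathbf{x}]\bigr)=F(\mathbf{x}),$$
which is well defined because $F(\mathbf{x})=F(\mathbf{y})$ whenever $\mathbf{x}-\mathbf{y}\in\ker F$, injective because $F(\mathbf{x})=\mathbf{0}$ forces $[\mathbf{x}]=[\mathbf{0}]$, and surjective onto $\mathrm{im}\, F$ by construction.)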



My final question: how should I prove this when the dimension of $V$ is infinite?










2 Answers






It is true for every finite-dimensional vector space $V$ with a vector subspace $U$ that
$$
V \cong U \oplus (V / U).
$$

I think your proof is essentially correct. And yes, it is true that for any linear map $T: V \rightarrow W$ we have
$$
V \cong \operatorname{ker}(T) \oplus \operatorname{Im}(T).
$$

The rank-nullity theorem is a direct consequence of this.
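(To spell that consequence out with a short dimension count, assuming $V$ is finite dimensional:
$$
\dim V = \dim\bigl(\operatorname{ker}(T)\bigr) + \dim\bigl(V/\operatorname{ker}(T)\bigr) = \dim\bigl(\operatorname{ker}(T)\bigr) + \dim\bigl(\operatorname{Im}(T)\bigr),
$$
which is exactly the rank-nullity theorem.)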



In more technical language, we say that every "short exact sequence" of finite-dimensional vector spaces over a field $k$ $\textit{splits}$. What this means is that if $T : U \rightarrow V$, $S : V \rightarrow W$ are linear maps such that $T$ is injective, $\operatorname{ker}(S) = \operatorname{Im}(T)$, and $S$ is surjective, then $V \cong U \oplus W$.



Note then that this directly gives us our result, since if $U$ is a subspace of $V$, then the inclusion map $\iota : U \rightarrow V$ and projection map $\pi : V \rightarrow (V / U)$ set up exactly a short exact sequence.
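In diagram form (a sketch of the sequence being referred to, with the exactness conditions written out), this is
$$
0 \longrightarrow U \overset{\iota}{\longrightarrow} V \overset{\pi}{\longrightarrow} V/U \longrightarrow 0,
$$
meaning $\iota$ is injective, $\pi$ is surjective, and $\operatorname{ker}(\pi) = \operatorname{Im}(\iota) = U$.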





In terms of whether or not this extends to infinite-dimensional vector spaces, the result does hold again (assuming the axiom of choice), and the proof is essentially the same. All your proof relies on is the ability to extend a basis of a subspace to a basis of your entire space. We can do this with the axiom of choice.
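A minimal sketch of how the same map looks in the infinite-dimensional case (assuming the axiom of choice to extend the basis): let $\mathcal{B}_{U}$ be a basis of $U$ and extend it to a basis $\mathcal{B}_{V}$ of $V$. Every $\mathbf{x} \in V$ is a finite linear combination
$$
\mathbf{x} \;=\; \sum_{\mathbf{u} \in \mathcal{B}_{U}} c_{\mathbf{u}}\,\mathbf{u} \;+\; \sum_{\mathbf{v} \in \mathcal{B}_{V} \setminus \mathcal{B}_{U}} d_{\mathbf{v}}\,\mathbf{v}
$$
with only finitely many nonzero coefficients, and
$$
T(\mathbf{x}) \;=\; \Bigl(\,\sum c_{\mathbf{u}}\,\mathbf{u},\ \bigl[\,\sum d_{\mathbf{v}}\,\mathbf{v}\,\bigr]\Bigr)
$$
is an isomorphism $V \to U \oplus (V/U)$ by the same injectivity/surjectivity argument as in the question.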






• I am wondering if the following statement is false for infinite dimensional spaces. "If $V$ is a vector space and $U$ is a subspace of $V$, then there exists another subspace of $V$ called $U^\perp$ such that every element of $V$ can be uniquely expressed as the sum of an element from $U$ with an element from $U^\perp$." (I am thinking a counterexample would be if $U$ was the set of real number sequences with finite support (i.e. eventually zero) and $V$ is the set of all real number sequences.)
  – irchans
  2 days ago












• @irchans If we take the axiom of choice, then every subspace of a vector space has a direct sum complement (what you call the perpendicular space, but this language is typically reserved for a space equipped with some bilinear form). The proof is pretty simple. Let $V$ be a $k$-vector space, with $U$ a vector subspace. Let $\mathcal{B}_{U}$ be a basis for $U$ and extend it to a basis $\mathcal{B}_{V}$ (using the axiom of choice) for $V$. Then let $W = \operatorname{Span}_{k}\left( \mathcal{B}_{V} \backslash \mathcal{B}_{U} \right)$. Then $V = U \oplus W$.
  – Adam Higgins
  2 days ago












• @irchans Perhaps the reason you think that your example is a counterexample is because of the $\textit{weirdness}$ of bases of infinite-dimensional vector spaces. Notice that a subset $S$ of a vector space $V$ is said to be a basis if and only if every element $v \in V$ can be written as a $\textbf{finite}$ linear combination of the elements of $S$, and there is no finite non-trivial linear relation amongst the elements of $S$.
  – Adam Higgins
  2 days ago
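(To make that concrete for the sequence-space example above: in the space $V$ of all real sequences, the coordinate sequences $\mathbf{e_1}=(1,0,0,\dots),\ \mathbf{e_2}=(0,1,0,\dots),\ \dots$ span exactly the finitely supported sequences $U$, since a linear combination may use only finitely many of them, so
$$
(1,1,1,\dots) \notin \operatorname{Span}(\mathbf{e_1},\mathbf{e_2},\dots),
$$
and $\{\mathbf{e_i}\}$ is a basis of $U$ but not of $V$. A complement of $U$ in $V$ still exists by the comment above, but the construction goes through the axiom of choice rather than an explicit formula.)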






• Thank you very much!
  – irchans
  2 days ago






• This set of notes seems relevant: math.lsa.umich.edu/~kesmith/infinite.pdf
  – irchans
  2 days ago



















Your proof works. The answer to your second question is yes, that is true. For an infinite-dimensional vector space, take any linear map $F: V \to W$. Then $U = \ker F$ is a subspace of $V$. Note that we have a short exact sequence (if you don't know what that means, don't worry, the explanation is coming)
$$0\to U\to V\to V/U\to 0$$



(That is, there's an injective map $U \to V$ (inclusion, I'll call it $i$) and a surjective map $V \to V/U$ (the quotient map, I'll call it $q$) such that the image of the injection is the kernel of the surjection).



But there's also a surjective map $V \to U$ (a projection onto $U$, I'll call it $p$; such a map exists once we fix a complement of $U$ in $V$, e.g. by extending a basis of $U$ to a basis of $V$), and note that for any $u \in U$, $p(i(u)) = u$ (since $p$ fixes $u$).



Now, we're going to show that $V$ is the (internal) direct sum of the kernel of $p$ and the image of $i$. First, note that it's the sum of the two: for any $v \in V$, $v = (v - i(p(v))) + i(p(v))$, where $i(p(v))$ is obviously in the image of $i$, and $p(v - i(p(v))) = p(v) - p(i(p(v))) = 0$ (with the last equality being due to our note about $p$, since $p(v)\in U$).



And further, the intersection is trivial: if $v \in \ker(p)\cap\mathrm{im}(i)$ then there is some $u\in U$ such that $i(u) = v$, and $p(i(u)) = p(v) = 0$, but $p(i(u)) = u$, so $u = 0$, hence $v = 0$. Thus, $V = \ker(p)\oplus \mathrm{im}(i)$.
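(Side remark, not part of the original argument: the two steps above can be packaged by observing that $e = i \circ p : V \to V$ is idempotent, since $p(i(u)) = u$ gives
$$
e^2 = i\,(p\,i)\,p = i\,p = e,
$$
and any idempotent linear map satisfies $V = \ker(e) \oplus \operatorname{im}(e)$, with $\ker(e) = \ker(p)$ because $i$ is injective and $\operatorname{im}(e) = \operatorname{im}(i)$ because $p$ is surjective.)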



Now, it's clear that $\mathrm{im}(i)\cong U$ since it's the image of $U$ under an injective map, so we need only show that $\ker(p)\cong V/U$.



For that purpose, since $q$ is surjective, for any $w \in V/U$ there is some $v \in V$ such that $w = q(v)$. But since $V = \ker(p)\oplus\mathrm{im}(i)$, there are unique $u \in U$, $x \in \ker(p)$ such that $v = i(u) + x$, so $w = q(v) = q(i(u)+x) = q(i(u)) + q(x) = q(x)$ (since the image of $i$ is the kernel of $q$, in particular $q \circ i = 0$). Thus, $q|_{\ker(p)}: \ker(p)\to V/U$ is surjective. But also, if $q(v) = 0$ and $v \in \ker(p)$, then $v \in \ker(q) = \mathrm{im}(i)$; since $p$ sends $i(u)$ to $u$, the only element of $\mathrm{im}(i)$ that $p$ sends to $0$ is $0$ itself, so we must have $v = 0$. Hence $q|_{\ker(p)}$ is also injective, so it is an isomorphism.



Thus, we have $V \cong \mathrm{im}(i)\oplus\ker(p) \cong U\oplus (V/U)$, as required.



This is precisely a special case of the Splitting Lemma, and my proof is essentially just one part of the proof of that, translated: there are easier ways to prove it, but I thought that it would be useful to see it in a more broadly applicable form.
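For reference, the statement being invoked, in the form used above (a sketch, not the most general version): if
$$
0 \longrightarrow U \overset{i}{\longrightarrow} V \overset{q}{\longrightarrow} W \longrightarrow 0
$$
is a short exact sequence of vector spaces and there is a linear map $p : V \to U$ with $p \circ i = \operatorname{id}_U$ (a retraction), then $V \cong U \oplus W$; the argument above is exactly the proof of that implication with $W = V/U$.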





