Noncommuting complex matrices: Existence of a simultaneous eigenvector

Let $A$ and $B$ be $n\times n$ matrices with complex entries such that
$AB - BA$ is a linear combination of $A$ and $B$.



I'd like to prove that there exists a non-zero vector $v$ that is an eigenvector of both $A$ and $B$.

linear-algebra matrices eigenvalues-eigenvectors lie-algebras

asked Nov 29 '15 at 22:13 by Mark Frazier; edited Jan 5 at 7:28 by Hanno

  • Anything you tried to solve this problem?
    – Hetebrij
    Nov 29 '15 at 22:17

  • So, I know that eigenvectors are the solutions such that the determinant is zero when you know the eigenvalues to each. I assume that, since the matrices are distinct, their eigenvalues will have to be different for that matching eigenvector. Otherwise, I am very lost about why the linear combination of A and B is important.
    – Mark Frazier
    Nov 29 '15 at 22:27

  • What is given means $AB-BA=\lambda A+\mu B$. Use that.
    – Denis Düsseldorf
    Nov 29 '15 at 22:33

  • The linear combination of $A$ and $B$ is important because in general we do not know if $A$ and $B$ are diagonalizable w.r.t. the same basis. However, in the special case that $AB-BA=0$ (and each of $A$ and $B$ is diagonalizable), we do know that $A$ and $B$ are diagonalizable w.r.t. the same basis.
    – Hetebrij
    Nov 29 '15 at 23:41

  • In fact $A,B$ are simultaneously triangularizable. If you know Lie theory, then it is very easy. Otherwise do as follows. 1. Reduce the problem to the case $AB-BA=A$. 2. Calculate, by a recurrence reasoning, $A^kB-BA^k$. 3. Show that $A$ is nilpotent. 4. Consider $\ker(A)$.
    – loup blanc
    Dec 3 '15 at 15:13

1 Answer

It is assumed throughout that




The given matrices $A$ and $B$ are linearly independent in the $\mathbb C$-vector space $M_n(\mathbb C)$,

thus ignoring the unremarkable setup where one matrix is a multiple of the other.




My next guess is that the question doesn't focus upon a vanishing commutator of $A$ and $B$, hence consider
$$0\;\neq\;[A,B]\;:=\;AB-BA\;=\;\alpha A+\beta B$$
where at least one of the scalars $\alpha,\beta$ is not zero. Divide by such a scalar, say $\alpha\neq0$, and set
$$B'=\frac1\alpha B\;\text{ and }\; A'= A+\beta B'\quad\text{to obtain}\;
\big[\,A',B'\big]\;=\;A'\,.$$

Thus in the sequel we may start w.l.o.g. from
$\,\mathbf{AB-BA=A}\qquad\qquad(\star)$
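
A quick numerical sanity check of this normalization (a NumPy sketch; the concrete pair below is built by undoing the normalization of the $2\times 2$ example given further down, with arbitrary values of $\alpha\neq0$ and $\beta$):

    import numpy as np

    # Start from a normalized pair (A0, B0) with A0 B0 - B0 A0 = A0,
    # then "un-normalize" it with arbitrary scalars alpha != 0 and beta.
    A0 = np.array([[0., 1.], [0., 0.]])
    B0 = 0.5 * np.array([[-1., 0.], [0., 1.]])
    alpha, beta = 2.0 - 1.0j, 3.0 + 0.5j

    A = A0 - beta * B0
    B = alpha * B0
    assert np.allclose(A @ B - B @ A, alpha * A + beta * B)

    # The normalization from the answer recovers the case (star):
    B_prime = B / alpha
    A_prime = A + beta * B_prime
    assert np.allclose(A_prime @ B_prime - B_prime @ A_prime, A_prime)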



I decided to spill some more MathJax ink here and now, to give examples and a bit of Lie algebra background as well. Anyone not willing to go through all this should ...

simply jump ahead to the paragraph The shared eigenvector below.



Examples

Here is how to satisfy the relation $(\star)$:




  • If $n=2$, then $A=\begin{pmatrix}0&1\\ 0&0\end{pmatrix}$ and
    $B=\frac 12\begin{pmatrix} -1&0\\ 0&1\end{pmatrix}$ do the job.
    $\begin{pmatrix}1\\ 0\end{pmatrix}$ is a joint eigenvector.


  • For $n=3$ one may consider
    $$A=\begin{pmatrix}0&a_2&a_3\\ 0&0&0\\ 0&0&0\end{pmatrix},\quad
    B=\begin{pmatrix}b&0&0\\ 0&b+1&0\\ 0&0&b+1\end{pmatrix}$$

    where $a_2,a_3,b\in\mathbb C$, and $a_2,a_3$ are not both zero. (A numerical check of both examples follows this list.)
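
A minimal NumPy sketch verifying both examples; the values of $a_2,a_3,b$ are arbitrarily chosen sample values for the free parameters above:

    import numpy as np

    # n = 2 example: AB - BA = A, joint eigenvector (1, 0).
    A2 = np.array([[0., 1.], [0., 0.]])
    B2 = 0.5 * np.array([[-1., 0.], [0., 1.]])
    v2 = np.array([1., 0.])
    assert np.allclose(A2 @ B2 - B2 @ A2, A2)
    assert np.allclose(A2 @ v2, 0 * v2)        # eigenvalue 0 for A
    assert np.allclose(B2 @ v2, -0.5 * v2)     # eigenvalue -1/2 for B

    # n = 3 example, with sample parameters a2, a3, b.
    a2, a3, b = 1.0, 2.0, 0.3
    A3 = np.array([[0., a2, a3], [0., 0., 0.], [0., 0., 0.]])
    B3 = np.diag([b, b + 1., b + 1.])
    v3 = np.array([1., 0., 0.])
    assert np.allclose(A3 @ B3 - B3 @ A3, A3)
    assert np.allclose(A3 @ v3, 0 * v3)        # again a joint eigenvector
    assert np.allclose(B3 @ v3, b * v3)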



It is not by accident that $A$ is represented by a nilpotent matrix, since this cannot be avoided:

For each $k\in\mathbb N$ one has
$$\big[\,A^k,B\big]\;=\;k\,A^k,$$
the proof by induction using
$\,[AA^{k-1},B] = A[A^{k-1},B] + [A,B]\,A^{k-1}$ (derivation property) is straightforward. Let $\|\cdot\|$ be a submultiplicative matrix norm; then
$$k\,\|A^k\|\;=\;\|A^kB-BA^k\|\;\leq\;2\,\|B\|\,\|A^k\|\quad\forall\,k,$$
which implies $A^k=0$ at the latest when $k>2\|B\|$. Thus $A$ is nilpotent.
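
A small numerical illustration of this step, using the $n=3$ example with the same sample parameters: the commutator identity for small $k$, and the resulting nilpotency.

    import numpy as np

    a2, a3, b = 1.0, 2.0, 0.3
    A = np.array([[0., a2, a3], [0., 0., 0.], [0., 0., 0.]])
    B = np.diag([b, b + 1., b + 1.])

    Ak = np.eye(3)
    for k in range(1, 6):
        Ak = Ak @ A                                   # Ak holds A^k
        assert np.allclose(Ak @ B - B @ Ak, k * Ak)   # [A^k, B] = k A^k

    # Here A^2 = 0 already; the norm bound only forces A^k = 0 for k > 2*||B||.
    assert np.allclose(np.linalg.matrix_power(A, 2), 0)
    assert 2 * np.linalg.norm(B, 2) < 6               # spectral norm: so A^6 = 0 in any case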



The shared eigenvector

for $B$ and $A$ does exist: Pick an eigenvector $w\neq 0$ for an eigenvalue $\mu$ of $B$ such that $\mu-1$ is not an eigenvalue of $B$; such a $\mu$ exists because $B$ has only finitely many eigenvalues (take, e.g., one of minimal real part). Insert $w$ into $(\star)$:
$$BAw\;=\;ABw - Aw\;=\;\mu Aw - Aw\;=\;(\mu-1)Aw\quad\Longrightarrow\; Aw\;=\;0\,,$$
since otherwise $Aw$ would be an eigenvector of $B$ for the eigenvalue $\mu-1$. Thus $w$ is also an eigenvector of $A$, with eigenvalue zero.
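
A sketch of this construction in code (the helper name is made up), picking an eigenvalue of $B$ with minimal real part and checking the conclusion on the $n=3$ example:

    import numpy as np

    def shared_eigenvector(B):
        """Eigenvalue mu of B with minimal real part (so mu - 1 cannot be an
        eigenvalue of B) together with one of its eigenvectors."""
        vals, vecs = np.linalg.eig(B)
        i = np.argmin(vals.real)
        return vals[i], vecs[:, i]

    a2, a3, b = 1.0, 2.0, 0.3
    A = np.array([[0., a2, a3], [0., 0., 0.], [0., 0., 0.]])
    B = np.diag([b, b + 1., b + 1.])

    mu, w = shared_eigenvector(B)
    assert np.allclose(A @ w, 0 * w)     # eigenvector of A, eigenvalue 0
    assert np.allclose(B @ w, mu * w)    # eigenvector of B, eigenvalue mu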



Lie algebra background
$A$ and $B$ span a complex Lie algebra with Lie bracket given by $(\star)$. Up to (Lie algebra) isomorphism it is the unique complex Lie algebra that is 2-dimensional and nonabelian. It is solvable, but not nilpotent.

Lie's theorem guarantees (in any finite-dimensional complex representation) the existence of a shared eigenvector for all elements of a solvable Lie algebra. One implication is that all matrices of such a representation may be simultaneously brought to triangular form.
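
To illustrate the triangularization claim, here is a minimal sketch, assuming the normalized relation $AB-BA=A$: repeatedly extract a shared eigenvector as above and pass to the lower-right block, which satisfies $(\star)$ again. The function names are made up and no numerical robustness is claimed; it is only meant to work on small examples like the one above.

    import numpy as np

    def shared_eigenvector(B):
        # Eigenvector of B for an eigenvalue of minimal real part; by the
        # argument above it is automatically killed by A when AB - BA = A.
        vals, vecs = np.linalg.eig(B)
        w = vecs[:, np.argmin(vals.real)]
        return w / np.linalg.norm(w)

    def simultaneous_triangularization(A, B):
        """Return a unitary U with U*AU and U*BU upper triangular,
        assuming AB - BA = A."""
        n = A.shape[0]
        U = np.eye(n, dtype=complex)
        for k in range(n - 1):
            Bk = (U.conj().T @ B @ U)[k:, k:]   # lower-right block; the block pair still satisfies (star)
            w = shared_eigenvector(Bk)
            # Extend w to a unitary change of basis of the remaining block (QR trick).
            Q, _ = np.linalg.qr(np.column_stack([w, np.eye(len(w), dtype=complex)]))
            V = np.eye(n, dtype=complex)
            V[k:, k:] = Q
            U = U @ V
        return U

    a2, a3, b = 1.0, 2.0, 0.3
    A = np.array([[0., a2, a3], [0., 0., 0.], [0., 0., 0.]], dtype=complex)
    B = np.diag([b, b + 1., b + 1.]).astype(complex)

    U = simultaneous_triangularization(A, B)
    for M in (A, B):
        T = U.conj().T @ M @ U
        assert np.allclose(np.tril(T, -1), 0)   # strictly lower triangular part vanishes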



Related posts




  • $AB + BA = A \implies A$ and $B$ have a common eigenvector


  • Questions about non-abelian 2 dimensional lie algebras


  • A Problem about Common Eigenvector


  • Representations of the two dimensional non-abelian Lie algebra @MO


  • Examples of finite dimensional non simple non abelian Lie algebras @MO

answered Jan 4 at 22:22 by Hanno