A function with a non-zero derivative, with an inverse function that has no derivative

While studying calculus, I encountered the following statement:

"Given a function $f(x)$ with $f'(x_0)\neq 0$, such that $f$ has an inverse in some neighborhood of $x_0$, and such that $f$ is continuous on said neighborhood, then $f^{-1}$ has a derivative at $f(x_0)$ given by
$${(f^{-1})}'(f(x_0))=\frac{1}{f'(x_0)}.$$"

My question is: why does $f$ have to be continuous on a whole neighborhood of $x_0$, and not just at $x_0$? Is there a known counter-example?

Tags: calculus, derivatives, proof-explanation, inverse-function, inverse-function-theorem

Asked by Ran Kiri (a new contributor); edited by LoveTooNap29.

Comments:

  • Welcome to MSE. Nice first question! – José Carlos Santos
  • Example 1 in this post shows that continuity is not necessary (for pointwise differentiability of the inverse). – user21820

4 Answers

Answer by jmerry (12 votes):

The suggestion in the title isn't how it'll work. Instead of having an inverse that doesn't have a derivative, we'll fail to have a continuous inverse. Also, the required condition for the theorem isn't just that $f$ is continuous on an interval - it's that $f'$ is continuous on an interval around the key point.

Example: $f(x)=\begin{cases}x+2x^2\sin\frac{1}{x} & x\neq 0\\ 0 & x = 0\end{cases}$.
This $f$ is differentiable everywhere, with derivative $1$ at zero, but it doesn't have an inverse in any neighborhood of zero. Why? Because it isn't monotone on any neighborhood of zero. We have $f'(x)=1+4x\sin\frac{1}{x}-2\cos\frac{1}{x}$ for $x\neq 0$, which is negative whenever $\frac{1}{x}\equiv 0\pmod{2\pi}$. We can find a one-sided inverse $g$ with $f(g(x))=x$, but this $g$ will necessarily have infinitely many jump discontinuities near zero.

The calculation of the derivative of $f^{-1}$ is just an application of the chain rule. The real meat of the inverse function theorem is the existence of a differentiable inverse.
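
For concreteness, here is a quick numerical sanity check of the example above (a sketch in Python; the sample points $x_k=\frac{1}{2\pi k}$ and the function names are illustrative choices, not part of the original answer):

    import numpy as np

    # The answer's example: f(x) = x + 2x^2 sin(1/x) for x != 0, with f(0) = 0.
    def f(x):
        return x + 2 * x**2 * np.sin(1 / x) if x != 0 else 0.0

    # Its derivative for x != 0; at x = 0 the difference quotient gives f'(0) = 1.
    def f_prime(x):
        return 1 + 4 * x * np.sin(1 / x) - 2 * np.cos(1 / x)

    # At x_k = 1/(2*pi*k) we have sin(1/x_k) = 0 and cos(1/x_k) = 1, so
    # f'(x_k) = -1 even though x_k -> 0.
    for k in (1, 10, 100, 1000):
        x_k = 1 / (2 * np.pi * k)
        print(f"k = {k:4d}   x_k = {x_k:.2e}   f'(x_k) = {f_prime(x_k):+.6f}")

Each printed derivative should be, up to rounding, $-1$, so $f$ decreases at points arbitrarily close to $0$ even though $f'(0)=1$; hence it is not monotone (and so not invertible) on any neighborhood of $0$.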






  • Yes I can see it now. Thank you very much! – Ran Kiri

Answer by Henno Brandsma (2 votes):

First off, any function has an inverse "at $x_0$", because we can just assign to $f(x_0)$ the value $x_0$; it's really meaningless to talk about an inverse existing at a single point. We need a whole neighbourhood because only then can we use the derivative, which is defined by a limit, so we must be able to "approach" $x_0$ arbitrarily closely.






Answer by Milo Brandt (0 votes):

The continuity condition is not necessary. It's enough that $f$ be injective on some neighborhood. That said, if your function has a sequence of jump discontinuities near $x_0$, there may be no open interval $U$ around $x_0$ for which $f(U)$ is also an interval. This means that $f^{-1}$ might be defined on a strange domain, though we can still technically differentiate it to get the desired result.

Formally, the statement you would need to prove is the following:

Let $A$ and $B$ be subsets of $\mathbb R$, and let $f:A\rightarrow B$ and $g:B\rightarrow \mathbb R$. Suppose that $x_0\in A$ is an accumulation point of $A$ and $f(x_0)$ is an accumulation point of $B$. Then,

  • If two of the derivatives $f'(x_0)$, $g'(f(x_0))$ and $(g\circ f)'(x_0)$ exist and are non-zero, the third exists as well.

  • If all of the derivatives exist, then $(g\circ f)'(x_0)=f'(x_0)\cdot g'(f(x_0)).$

Once you have this statement, you can apply it with $g=f^{-1}$. Note that we can make this work even if $f$ isn't defined on an interval around $x_0$ - it's okay as long as we have enough points to define the relevant limit towards $x_0$.

Granted, it is a bit unusual to talk about derivatives on sets that aren't open, but there are no technical limitations preventing it, though the proof of the suggested lemma is a pain.






Answer by Mindlack (0 votes):

I think that as long as $f^{-1}$ is well-defined on a neighborhood of $f(x_0)$, and continuous at $f(x_0)$, there is no issue.

Indeed, $f(f^{-1}(f(x_0)+h))=f(x_0)+h$, so $h=f(f^{-1}(f(x_0)+h))-f(x_0) \sim f'(x_0)\,(f^{-1}(f(x_0)+h)-x_0)$ as $h\to 0$, and the conclusion (both the differentiability and the value of the derivative) follows.
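
As a quick numerical illustration of the conclusion ${(f^{-1})}'(f(x_0))=1/f'(x_0)$, here is a small Python sketch; the test function $f(x)=x^3+x$, the point $x_0=1$, and the bisection-based inverse are illustrative choices of mine, not part of this answer:

    # f(x) = x^3 + x is strictly increasing, so f^{-1} exists on all of R.
    def f(x):
        return x**3 + x

    def f_inv(y, lo=-10.0, hi=10.0, tol=1e-12):
        # Bisection works because f is strictly increasing on [lo, hi].
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if f(mid) < y:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    x0 = 1.0
    y0 = f(x0)                    # y0 = 2, and f'(x0) = 3*x0**2 + 1 = 4
    h = 1e-6
    estimate = (f_inv(y0 + h) - f_inv(y0 - h)) / (2 * h)   # central difference
    print(estimate)               # approximately 0.25
    print(1 / (3 * x0**2 + 1))    # exact value 1/f'(x0) = 0.25

The central-difference estimate of ${(f^{-1})}'(2)$ should agree with $1/f'(1)=1/4$ to several decimal places.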





