Primitive of $\int R(\cos x, \sin x)\,dx$
Suppose we are seeking the primitive
$$\int R(\cos x, \sin x)\, dx$$
where $R(u,v)=\frac{P(u,v)}{Q(u,v)}$ is a two-variable rational function. Show that
(a) if $R(-u,v)=R(u,v)$, then $R(u,v)$ has the form $R_1(u^2,v)$;
(b) if $R(-u,v)=-R(u,v)$, then $R(u,v)=u\cdot R_2(u^2,v)$ and the substitution $t=\sin x$ rationalizes the integral above;
(c) if $R(-u,-v)=R(u,v)$, then $R(u,v)$ has the form $R_3\left(\frac{u}{v},v^2\right)$, and the substitution $t=\tan x$ rationalizes the integral above.
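For concreteness, here is a small example of the kind of rationalization meant in (b) (my own illustration, not part of the textbook problem): the integrand $\frac{\cos x}{1+\sin^2 x}$ corresponds to $R(u,v)=\frac{u}{1+v^2}$, which satisfies $R(-u,v)=-R(u,v)$, and with $t=\sin x$, $dt=\cos x\,dx$,
$$\int\frac{\cos x}{1+\sin^2 x}\,dx=\int\frac{dt}{1+t^2}=\arctan(\sin x)+C.$$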
My attempt: Suppose
$$P(x,y)=\sum_{i,j\geq0,\ i+j\leq n}{a_{ij}x^iy^j}
=\sum_{i,j\geq0,\ 2i+j\leq n}{a_{2i,j}x^{2i}y^{j}}+\sum_{i,j\geq0,\ 2i+1+j\leq n}{a_{2i+1,j}x^{2i+1}y^{j}}=P_1(x^2,y)+xP_2(x^2,y)$$
Similarly,
$$Q(x,y)=Q_1(x^2,y)+xQ_2(x^2,y)$$
Then
$$R(x,y)=\frac{P_1(x^2,y)+xP_2(x^2,y)}{Q_1(x^2,y)+xQ_2(x^2,y)}=\frac{[P_1(x^2,y)+xP_2(x^2,y)][Q_1(x^2,y)-xQ_2(x^2,y)]}{[Q_1(x^2,y)]^2-[xQ_2(x^2,y)]^2}$$
$$=\frac{P_3(x^2,y)+xP_4(x^2,y)}{Q_3(x^2,y)}$$
where
$$P_3(x^2,y)=P_1(x^2,y)Q_1(x^2,y)-x^2P_2(x^2,y)Q_2(x^2,y)$$
$$P_4(x^2,y)=-P_1(x^2,y)Q_2(x^2,y)+P_2(x^2,y)Q_1(x^2,y)$$
$$Q_3(x^2,y)=[Q_1(x^2,y)]^2-x^2[Q_2(x^2,y)]^2$$
For part (a), since $R(-u,v)=R(u,v)$, we have
$$R(-u,v)=\frac{P_3(u^2,v)-uP_4(u^2,v)}{Q_3(u^2,v)}.$$
How can I eliminate the term $uP_4(u^2,v)$? Put differently, this is like the even-function case in one variable: how do I show that $R(u,v)$ of this form contains no odd powers of $u$? Once part (a) is done, part (b) should follow in a similar way.
Update:
Since $R(-u,v)=R(u,v)$, we have
$$R(-u,v)=\frac{P_3(u^2,v)-uP_4(u^2,v)}{Q_3(u^2,v)}=\frac{P_3(u^2,v)+uP_4(u^2,v)}{Q_3(u^2,v)}=R(u,v),$$
which implies $uP_4(u^2,v)=0$ for all $u$ and $v$ (not only for $u=0$). Does this prove part (a)? I am confused, but this idea suddenly came to mind.
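As a quick sanity check of the even/odd split used above, here is a small sympy sketch; the polynomials P and Q below are arbitrary test examples, not taken from the problem.

# Split P(x, y) and Q(x, y) into even and odd parts in x and verify
# F(x, y) = F1(x^2, y) + x*F2(x^2, y) for each of them.
import sympy as sp

x, y = sp.symbols('x y')
P = 3*x**4*y + 2*x**3 - x*y**2 + 5*y   # arbitrary test polynomial
Q = x**2*y**2 + 4*x*y + 1              # arbitrary test polynomial

def even_odd_split(F):
    F_even = sp.expand((F + F.subs(x, -x)) / 2)   # only even powers of x survive
    F_odd = sp.expand((F - F.subs(x, -x)) / 2)    # only odd powers of x survive
    return F_even, sp.cancel(F_odd / x)           # F = F_even + x*(F_odd/x)

P1, P2 = even_odd_split(P)
Q1, Q2 = even_odd_split(Q)
assert sp.expand(P - (P1 + x*P2)) == 0
assert sp.expand(Q - (Q1 + x*Q2)) == 0
print(P1, P2, Q1, Q2, sep='\n')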
integration rational-functions
(Asked Jan 1 at 9:44 by weilam06; edited Jan 4 at 5:09.)
Bounty (+50 from weilam06): a detailed explanation for parts (b) and (c), and a little improvement and correction of my working.
2 Answers
Answer by Alex Ravsky (Jan 4 at 8:08, edited Jan 4 at 9:44):
This topic is considered in a Russian book, “Differential and Integral Calculus” by Grigorii Fichtenholz (vol. II, 7th edition, M.: Nauka, 1970); the relevant pages are 74, 75, and 76. According to Wikipedia, the “book was translated, among others, into German, Chinese, and Persian; however, a translation into English has still not been done.” Below I translate the key points relevant to your question.
(a) The conclusion $uP_4(u^2,v)=0$ looks OK. Indeed, even taking into account possible zeroes of the denominator $Q_3(u^2,v)$, we have that the polynomial $S(u,v)=uP_4(u^2,v)Q_3(u^2,v)$ is zero for every $u,v$. Then $S(u,v)$ is the zero polynomial.
Indeed, assume to the contrary that $S(u,v)=s_0(u)+s_1(u)v+\dots+s_m(u)v^m$ for some polynomials $s_0,\dots,s_m$ such that $s_m(u)$ is not the zero polynomial. Since $s_m(u)$ has only finitely many roots, we can pick $u_0$ such that $s_m(u_0)\ne 0$. Then $S(u_0,v)$ is not the zero polynomial in $v$, so there exists $v_0$ such that $S(u_0,v_0)\ne 0$, a contradiction.
(b) Fichtenholz says that the first part follows from (a) applied to the function $\frac{R(u,v)}{u}$. The second part is case I on p. 75, which states
$$R(\sin x,\cos x)\,dx=R_0(\sin^2 x,\cos x)\sin x\, dx=-R_0(1-\cos^2 x,\cos x)\, d\cos x.$$
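For instance (an illustration of this case, not part of the translation): for $\int\sin^3 x\cos^2 x\,dx$ the substitution $t=\cos x$, $dt=-\sin x\,dx$ gives
$$\int\sin^3 x\cos^2 x\,dx=-\int(1-t^2)\,t^2\,dt=\frac{t^5}{5}-\frac{t^3}{3}+C=\frac{\cos^5 x}{5}-\frac{\cos^3 x}{3}+C.$$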
(c) I translated case III from pp. 75-76:
Substituting $\frac uv\cdot v$ for $u$, we have
$$R(u,v)=R\left(\frac uv\cdot v,v\right)=R^*\left(\frac uv,v\right).$$
By the property of the function $R$ with respect to the change of signs of $u$ and $v$ (which does not change the fraction $\frac uv$),
$$R^*\left(\frac uv,-v\right)= R^*\left(\frac uv,v\right),$$
so, as we know,
$$R^*\left(\frac uv,v\right)=R_1^*\left(\frac uv,v^2\right).$$
Thus
$$R(\sin x,\cos x)=R^*_1(\tan x,\cos^2 x)= R^*_1\left(\tan x,\frac 1{1+\tan^2 x}\right),$$
that is,
$$R(\sin x,\cos x)=\tilde R(\tan x).$$
Here the substitution $t=\tan x$ $\left(-\frac{\pi}{2}<x<\frac{\pi}{2}\right)$ reaches the goal, because
$$R(\sin x,\cos x)\,dx=\tilde R(t)\,\frac{dt}{1+t^2}.$$
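As a concrete illustration of (c) (again not part of the translation): the integrand $\frac{1}{1+\cos^2 x}$ is unchanged under $(\sin x,\cos x)\mapsto(-\sin x,-\cos x)$, and with $t=\tan x$ we have $\cos^2 x=\frac{1}{1+t^2}$ and $dx=\frac{dt}{1+t^2}$, so
$$\int\frac{dx}{1+\cos^2 x}=\int\frac{dt}{2+t^2}=\frac{1}{\sqrt2}\arctan\frac{\tan x}{\sqrt2}+C.$$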
Answer by Mostafa Ayaz (Jan 4 at 13:22):
For part (a), note that if
$$R(u,v)=R(-u,v),$$
then
$$P(-u,v)Q(u,v)=P(u,v)Q(-u,v).$$
Define $S(u,v)=P(-u,v)Q(u,v)$. Then $S(u,v)$ is a polynomial in $u,v$, since it is the product of two polynomials, and therefore it can be expressed as
$$S(u,v)=\sum_{i=0}^{m}\sum_{j=0}^{n}a_{ij}u^iv^j.$$
Note that $S(-u,v)=P(u,v)Q(-u,v)=S(u,v)$ by the identity above. From $S(u,v)=S(-u,v)$ we obtain
$$\sum_{i=0}^{m}\sum_{j=0}^{n}a_{ij}u^iv^j=\sum_{i=0}^{m}\sum_{j=0}^{n}a_{ij}(-u)^iv^j,$$
which yields
$$\sum_{\substack{i=0\\ i\text{ even}}}^{m}\sum_{j=0}^{n}a_{ij}u^iv^j+\sum_{\substack{i=0\\ i\text{ odd}}}^{m}\sum_{j=0}^{n}a_{ij}u^iv^j=\sum_{\substack{i=0\\ i\text{ even}}}^{m}\sum_{j=0}^{n}a_{ij}(-u)^iv^j+\sum_{\substack{i=0\\ i\text{ odd}}}^{m}\sum_{j=0}^{n}a_{ij}(-u)^iv^j,$$
from which we obtain
$$\sum_{\substack{i=0\\ i\text{ odd}}}^{m}\sum_{j=0}^{n}a_{ij}u^iv^j=0$$
for all $u,v$. Therefore we can write
$$S(u,v)=\sum_{i=0}^{m}\sum_{j=0}^{n}a_{ij}u^iv^j=\sum_{\substack{i=0\\ i\text{ even}}}^{m}\sum_{j=0}^{n}a_{ij}u^iv^j=\sum_{i'=0}^{m'}\sum_{j=0}^{n}a_{2i',j}u^{2i'}v^j=S_1(u^2,v).$$
This means that $S(u,v)=P(-u,v)Q(u,v)$ is a polynomial in $u^2$ and $v$. Then both $P(-u,v)$ and $Q(u,v)$ must be polynomials in $u^2$ and $v$ (otherwise at least one term of the form $u^{2k+1}v^l$ would appear in $S(u,v)$), and by dividing $P(u,v)$ by $Q(u,v)$ we conclude that
$$R(u,v)=R_1(u^2,v).$$
The other parts can be proved similarly.
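Here is a small sympy check of the key step in this answer on one concrete example; $R(u,v)=\frac{u^2+v}{u^2v+1}$ is just an arbitrary choice that is even in $u$.

# Verify that P(-u,v)Q(u,v) = P(u,v)Q(-u,v) and that S(u,v) = P(-u,v)Q(u,v)
# contains only even powers of u, for one even-in-u example R = P/Q.
import sympy as sp

u, v = sp.symbols('u v')
P = u**2 + v          # numerator of the test R(u, v)
Q = u**2*v + 1        # denominator of the test R(u, v)

assert sp.expand(P.subs(u, -u)*Q - P*Q.subs(u, -u)) == 0   # the starting identity

S = sp.expand(P.subs(u, -u)*Q)
odd_part_in_u = sp.expand((S - S.subs(u, -u)) / 2)
assert odd_part_in_u == 0    # S has no odd powers of u, i.e. S = S_1(u^2, v)
print(S)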