
Sine, Cosine, and Polynomials

There is another way to solve this Laplace transform involving derivatives (which we will do in this post). I am also aware that we can decompose $\sin$ and $\cos$ into exponentials using Euler's identity. I do this in the next post. However, I came up with this proof when I was 14, before I had a good understanding of complex numbers. So, I wanted to showcase it.

One benefit of this proof is that I give a closed-form polynomial rather than an addition/subtraction of two polynomials. This form may be easier to use in certain applications.


Deriving a Recurrence

We want to evaluate $\mathcal{L}\{t^n\sin t\}$ and $\mathcal{L}\{t^n\cos t\}$. First, let's do one iteration of integration by parts, starting with sine.

$$\begin{aligned}
\mathcal{L}\{t^n\sin t\} &= \int_0^\infty t^n\sin(t)\,e^{-st}\,dt \quad\text{(using integration by parts)}\\
&= \frac{1}{s}f(0) - \frac{1}{s}\lim_{t\to\infty} f(t)e^{-st} + \frac{1}{s}\int_0^\infty f'(t)\,e^{-st}\,dt\\
&= \frac{1}{s}\cdot 0 - \frac{1}{s}\lim_{t\to\infty} t^n\sin(t)e^{-st} + \frac{1}{s}\int_0^\infty \left(n t^{n-1}\sin(t) + t^n\cos(t)\right)e^{-st}\,dt\\
&= \frac{n}{s}\int_0^\infty t^{n-1}\sin(t)\,e^{-st}\,dt + \frac{1}{s}\int_0^\infty t^n\cos(t)\,e^{-st}\,dt\\
&= \frac{n}{s}\mathcal{L}\{t^{n-1}\sin t\} + \frac{1}{s}\mathcal{L}\{t^n\cos t\}
\end{aligned}$$

The limit goes to $0$ since the $\sin t$ term is bounded in $[-1, 1]$, so it can be disregarded; we are then left with the same limit as when we proved $\mathcal{L}\{t^n\}$. We now do the same with cosine.

$$\begin{aligned}
\mathcal{L}\{t^n\cos t\} &= \int_0^\infty t^n\cos(t)\,e^{-st}\,dt \quad\text{(using integration by parts)}\\
&= \frac{1}{s}f(0) - \frac{1}{s}\lim_{t\to\infty} f(t)e^{-st} + \frac{1}{s}\int_0^\infty f'(t)\,e^{-st}\,dt\\
&= \frac{1}{s}\cdot 0 - \frac{1}{s}\lim_{t\to\infty} t^n\cos(t)e^{-st} + \frac{1}{s}\int_0^\infty \left(n t^{n-1}\cos(t) - t^n\sin(t)\right)e^{-st}\,dt\\
&= \frac{n}{s}\int_0^\infty t^{n-1}\cos(t)\,e^{-st}\,dt - \frac{1}{s}\int_0^\infty t^n\sin(t)\,e^{-st}\,dt\\
&= \frac{n}{s}\mathcal{L}\{t^{n-1}\cos t\} - \frac{1}{s}\mathcal{L}\{t^n\sin t\}
\end{aligned}$$

Now, we can combine these two facts, and with a little bit of algebra we get

$$\mathcal{L}\{t^n\sin t\} = \frac{n}{s^2+1}\left[s\,\mathcal{L}\{t^{n-1}\sin t\} + \mathcal{L}\{t^{n-1}\cos t\}\right]$$
$$\mathcal{L}\{t^n\cos t\} = \frac{n}{s^2+1}\left[s\,\mathcal{L}\{t^{n-1}\cos t\} - \mathcal{L}\{t^{n-1}\sin t\}\right]$$
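As a quick sanity check (my own verification sketch, not part of the original derivation), the recurrences can be confirmed symbolically with SymPy for a few small $n$:

```python
# Sketch: symbolically confirm the two recurrences with SymPy for small n.
import sympy as sp

t, s = sp.symbols("t s", positive=True)

def L(f):
    # Laplace transform of f(t), with convergence conditions suppressed.
    return sp.laplace_transform(f, t, s, noconds=True)

for n in range(1, 5):
    rec_sin = n / (s**2 + 1) * (s * L(t**(n - 1) * sp.sin(t)) + L(t**(n - 1) * sp.cos(t)))
    rec_cos = n / (s**2 + 1) * (s * L(t**(n - 1) * sp.cos(t)) - L(t**(n - 1) * sp.sin(t)))
    assert sp.simplify(L(t**n * sp.sin(t)) - rec_sin) == 0
    assert sp.simplify(L(t**n * sp.cos(t)) - rec_cos) == 0
```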


Proving a Closed-Form Formula

Now that we have $\mathcal{L}\{t^n\sin t\}$ and $\mathcal{L}\{t^n\cos t\}$ in terms of each other, I am going to assert the following solution, which I derived through a series of substitutions and pattern matching.

$$\mathcal{L}\{t^n\sin t\} = \frac{n!}{(s^2+1)^{n+1}}\sum_{k=0}^{n+1} A(n-k)\binom{n+1}{k}s^k$$
$$\mathcal{L}\{t^n\cos t\} = \frac{n!}{(s^2+1)^{n+1}}\sum_{k=0}^{n+1} A(n-k+1)\binom{n+1}{k}s^k$$


The function $A(m)$ is defined as follows:

$$A(m) = \begin{cases} 1 & \text{if } m \equiv 0 \pmod 4\\ 0 & \text{if } m \equiv 1 \pmod 4\\ -1 & \text{if } m \equiv 2 \pmod 4\\ 0 & \text{if } m \equiv 3 \pmod 4 \end{cases}$$

There are a few ways we can represent $A(m)$ mathematically (which we will discuss at the end), but for now we just need its interpretation: it gives an alternating sign ($+1$/$-1$) for even values of $m$ and is $0$ for odd values of $m$.
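Before proving the formula, here is a small sanity check I find useful (my own sketch, assuming SymPy is available; the helper names `A`, `closed_sin`, and `closed_cos` are mine): implement $A(m)$ directly from the piecewise definition and compare the closed form against SymPy's `laplace_transform` for small $n$.

```python
# Sketch: sanity-check the asserted closed forms against SymPy for small n.
import sympy as sp

t, s = sp.symbols("t s", positive=True)

def A(m):
    # Piecewise definition from above; Python's % keeps this valid for negative m.
    return (1, 0, -1, 0)[m % 4]

def closed_sin(n):
    return sp.factorial(n) / (s**2 + 1)**(n + 1) * sum(
        A(n - k) * sp.binomial(n + 1, k) * s**k for k in range(n + 2))

def closed_cos(n):
    return sp.factorial(n) / (s**2 + 1)**(n + 1) * sum(
        A(n - k + 1) * sp.binomial(n + 1, k) * s**k for k in range(n + 2))

for n in range(5):
    assert sp.simplify(closed_sin(n) - sp.laplace_transform(t**n * sp.sin(t), t, s, noconds=True)) == 0
    assert sp.simplify(closed_cos(n) - sp.laplace_transform(t**n * sp.cos(t), t, s, noconds=True)) == 0
```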


Now, I'll prove my asserted equations by induction on $n \ge 0$. Using the base case of $n=0$, we get the following. Note that $\binom{1}{0}=1$, $\binom{1}{1}=1$, $0!=1$, $A(0)=1$, and $A(1)=A(-1)=0$.

$$\frac{0!}{(s^2+1)^{1}}\sum_{k=0}^{1} A(-k)\binom{1}{k}s^k = \frac{1}{s^2+1}\left[A(0)\binom{1}{0}s^0 + A(-1)\binom{1}{1}s^1\right] = \frac{1}{s^2+1} = \mathcal{L}\{\sin t\}$$
$$\frac{0!}{(s^2+1)^{1}}\sum_{k=0}^{1} A(-k+1)\binom{1}{k}s^k = \frac{1}{s^2+1}\left[A(1)\binom{1}{0}s^0 + A(0)\binom{1}{1}s^1\right] = \frac{s}{s^2+1} = \mathcal{L}\{\cos t\}$$


Now, for the induction step, we use the recurrence from above to evaluate $\mathcal{L}\{t^n\sin t\}$, assuming that the asserted equations hold for $\mathcal{L}\{t^{n-1}\sin t\}$ and $\mathcal{L}\{t^{n-1}\cos t\}$.

$$\begin{aligned}
\frac{(s^2+1)^{n+1}}{n!}\mathcal{L}\{t^n\sin t\}
&= \frac{(s^2+1)^{n}}{(n-1)!}\left[s\,\mathcal{L}\{t^{n-1}\sin t\} + \mathcal{L}\{t^{n-1}\cos t\}\right]\\
&= \frac{(s^2+1)^{n}}{(n-1)!}\cdot\frac{(n-1)!}{(s^2+1)^{n}}\left[s\sum_{k=0}^{n}A(n-k-1)\binom{n}{k}s^k + \sum_{k=0}^{n}A(n-k)\binom{n}{k}s^k\right]\\
&= \sum_{k=0}^{n}A(n-k-1)\binom{n}{k}s^{k+1} + \sum_{k=0}^{n}A(n-k)\binom{n}{k}s^k\\
&\qquad\text{Let } j=k+1 \text{ and reindex the first summation.}\\
&= \sum_{j=1}^{n+1}A(n-j)\binom{n}{j-1}s^{j} + \sum_{k=0}^{n}A(n-k)\binom{n}{k}s^k\\
&\qquad\text{Adding a } j=0 \text{ term to the first summation contributes } 0 \text{ since } \textstyle\binom{n}{-1}=0.\\
&\qquad\text{Adding a } k=n+1 \text{ term to the second summation contributes } 0 \text{ since } A(-1)=0.\\
&\qquad\text{Finally, recall the identity } \textstyle\binom{n+1}{k}=\binom{n}{k}+\binom{n}{k-1}; \text{ thus, we combine the two summations.}\\
&= \sum_{k=0}^{n+1}A(n-k)\binom{n+1}{k}s^k
\end{aligned}$$

We do likewise for $\mathcal{L}\{t^n\cos t\}$.

$$\begin{aligned}
\frac{(s^2+1)^{n+1}}{n!}\mathcal{L}\{t^n\cos t\}
&= \frac{(s^2+1)^{n}}{(n-1)!}\left[s\,\mathcal{L}\{t^{n-1}\cos t\} - \mathcal{L}\{t^{n-1}\sin t\}\right]\\
&= \frac{(s^2+1)^{n}}{(n-1)!}\cdot\frac{(n-1)!}{(s^2+1)^{n}}\left[s\sum_{k=0}^{n}A(n-k)\binom{n}{k}s^k - \sum_{k=0}^{n}A(n-k-1)\binom{n}{k}s^k\right]\\
&= \sum_{k=0}^{n}A(n-k)\binom{n}{k}s^{k+1} - \sum_{k=0}^{n}A(n-k-1)\binom{n}{k}s^k\\
&\qquad\text{Let } j=k+1 \text{ and reindex the first summation.}\\
&= \sum_{j=1}^{n+1}A(n-j+1)\binom{n}{j-1}s^{j} - \sum_{k=0}^{n}A(n-k-1)\binom{n}{k}s^k\\
&\qquad\text{Recall the property that } A(m)=-A(m+2). \text{ Thus, } -A(n-k-1)=A(n-k+1).\\
&\qquad\text{Now, it's the same argument as before to combine the summations.}\\
&= \sum_{k=0}^{n+1}A(n-k+1)\binom{n+1}{k}s^k
\end{aligned}$$
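The heart of the induction step is a pair of polynomial identities: writing $P_{\sin}(n)$ and $P_{\cos}(n)$ for the summations in the closed forms (names I'm introducing here, not from the derivation above), the sine step shows $s\,P_{\sin}(n-1) + P_{\cos}(n-1) = P_{\sin}(n)$ and the cosine step shows $s\,P_{\cos}(n-1) - P_{\sin}(n-1) = P_{\cos}(n)$. A short SymPy sketch can confirm these for several $n$:

```python
# Sketch: verify the polynomial identities behind the induction step.
# P_sin / P_cos are my names for the summations in the closed forms.
import sympy as sp

s = sp.symbols("s")

def A(m):
    return (1, 0, -1, 0)[m % 4]  # piecewise A(m), valid for negative m too

def P_sin(n):
    return sum(A(n - k) * sp.binomial(n + 1, k) * s**k for k in range(n + 2))

def P_cos(n):
    return sum(A(n - k + 1) * sp.binomial(n + 1, k) * s**k for k in range(n + 2))

for n in range(1, 9):
    assert sp.expand(s * P_sin(n - 1) + P_cos(n - 1) - P_sin(n)) == 0
    assert sp.expand(s * P_cos(n - 1) - P_sin(n - 1) - P_cos(n)) == 0
```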


Scaling

To derive the scaled versions, we use the scaling property $\mathcal{L}\{f(bt)\} = \frac{1}{b}F\!\left(\frac{s}{b}\right)$.

$$\begin{aligned}
\mathcal{L}\{t^n\sin(bt)\} &= \frac{1}{b^n}\mathcal{L}\{(bt)^n\sin(bt)\}\\
&= \frac{1}{b^n}\cdot\frac{1}{b}\cdot\frac{n!}{\left((s/b)^2+1\right)^{n+1}}\sum_{k=0}^{n+1}A(n-k)\binom{n+1}{k}\left(\frac{s}{b}\right)^k\\
&= \frac{b^{n+1}\,n!}{(s^2+b^2)^{n+1}}\sum_{k=0}^{n+1}A(n-k)\binom{n+1}{k}\left(\frac{s}{b}\right)^k\\
&= \frac{n!}{(s^2+b^2)^{n+1}}\sum_{k=0}^{n+1}A(n-k)\binom{n+1}{k}s^k\,b^{n-k+1}
\end{aligned}$$

Similarly

$$\begin{aligned}
\mathcal{L}\{t^n\cos(bt)\} &= \frac{1}{b^n}\mathcal{L}\{(bt)^n\cos(bt)\}\\
&= \frac{1}{b^n}\cdot\frac{1}{b}\cdot\frac{n!}{\left((s/b)^2+1\right)^{n+1}}\sum_{k=0}^{n+1}A(n-k+1)\binom{n+1}{k}\left(\frac{s}{b}\right)^k\\
&= \frac{b^{n+1}\,n!}{(s^2+b^2)^{n+1}}\sum_{k=0}^{n+1}A(n-k+1)\binom{n+1}{k}\left(\frac{s}{b}\right)^k\\
&= \frac{n!}{(s^2+b^2)^{n+1}}\sum_{k=0}^{n+1}A(n-k+1)\binom{n+1}{k}s^k\,b^{n-k+1}
\end{aligned}$$
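Again as a sanity check (my own sketch, with $b$ kept symbolic and SymPy assumed), the scaled formulas can be compared against `laplace_transform` directly:

```python
# Sketch: compare the scaled closed forms with SymPy, keeping b symbolic.
import sympy as sp

t, s, b = sp.symbols("t s b", positive=True)

def A(m):
    return (1, 0, -1, 0)[m % 4]  # piecewise A(m) from above

def scaled_sin(n):
    return sp.factorial(n) / (s**2 + b**2)**(n + 1) * sum(
        A(n - k) * sp.binomial(n + 1, k) * s**k * b**(n - k + 1) for k in range(n + 2))

def scaled_cos(n):
    return sp.factorial(n) / (s**2 + b**2)**(n + 1) * sum(
        A(n - k + 1) * sp.binomial(n + 1, k) * s**k * b**(n - k + 1) for k in range(n + 2))

for n in range(4):
    assert sp.simplify(scaled_sin(n) - sp.laplace_transform(t**n * sp.sin(b * t), t, s, noconds=True)) == 0
    assert sp.simplify(scaled_cos(n) - sp.laplace_transform(t**n * sp.cos(b * t), t, s, noconds=True)) == 0
```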


Translation

Same as in the previous post, we can use the identities

$$\sin(\theta+\phi) = \sin(\theta)\cos(\phi) + \cos(\theta)\sin(\phi)$$
$$\cos(\theta+\phi) = \cos(\theta)\cos(\phi) - \sin(\theta)\sin(\phi)$$

Thus, we have all of the ingredients to write a closed-form solution to $\mathcal{L}\{t^n\sin(bt+c)\}$ and $\mathcal{L}\{t^n\cos(bt+c)\}$. But I'll leave that up to you if you so choose.
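To see how the pieces combine, expanding $\sin(bt+c)$ with the first identity and using linearity of the transform gives

$$\mathcal{L}\{t^n\sin(bt+c)\} = \cos(c)\,\mathcal{L}\{t^n\sin(bt)\} + \sin(c)\,\mathcal{L}\{t^n\cos(bt)\},$$

and the analogous expansion handles $\mathcal{L}\{t^n\cos(bt+c)\}$; substituting the scaled formulas above then yields the full closed form.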


Alternative Forms

As I stated earlier, there are various ways to represent $A(m)$. Algebraically, I think these are the most straightforward.

$$A(m) = \frac{1}{2}\left(i^m + (-i)^m\right) = \frac{1}{2}\left(1 + (-1)^m\right)i^m$$
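A quick way to convince yourself that both algebraic forms match the piecewise definition is to check them over a full period, including negative arguments (a small Python check of my own):

```python
# Check both algebraic forms of A(m) against the piecewise definition,
# over a full period and for negative m.
for m in range(-4, 8):
    piecewise = (1, 0, -1, 0)[m % 4]
    form1 = ((1j)**m + (-1j)**m) / 2
    form2 = (1 + (-1)**m) * (1j)**m / 2
    assert abs(form1 - piecewise) < 1e-12
    assert abs(form2 - piecewise) < 1e-12
```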

Another alternative is to get rid of the summations altogether. We can do this as follows:

$$\mathcal{L}\{t^n\sin(bt)\} = \frac{n!}{(s^2+b^2)^{n+1}}\cdot\frac{(s+ib)^{n+1}-(s-ib)^{n+1}}{2i}$$
$$\mathcal{L}\{t^n\cos(bt)\} = \frac{n!}{(s^2+b^2)^{n+1}}\cdot\frac{(s+ib)^{n+1}+(s-ib)^{n+1}}{2}$$

Notice that all terms will be real because the imaginary terms always cancel out.
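As with the other formulas, these summation-free forms can be spot-checked with SymPy (my own sketch; `I` is SymPy's imaginary unit):

```python
# Sketch: verify the summation-free (complex) forms against SymPy's transforms.
import sympy as sp

t, s, b = sp.symbols("t s b", positive=True)
I = sp.I  # SymPy's imaginary unit

for n in range(4):
    prefactor = sp.factorial(n) / (s**2 + b**2)**(n + 1)
    sin_form = prefactor * ((s + I * b)**(n + 1) - (s - I * b)**(n + 1)) / (2 * I)
    cos_form = prefactor * ((s + I * b)**(n + 1) + (s - I * b)**(n + 1)) / 2
    assert sp.simplify(sin_form - sp.laplace_transform(t**n * sp.sin(b * t), t, s, noconds=True)) == 0
    assert sp.simplify(cos_form - sp.laplace_transform(t**n * sp.cos(b * t), t, s, noconds=True)) == 0
```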
