\chapter{Higher order linear ODEs} \label{ho:chapter}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\section{Second order linear ODEs}
\label{solinear:section}
\sectionnotes{1 lecture, reduction of order optional\EPref{,
first part of \S3.1 in \cite{EP}}\BDref{,
parts of \S3.1 and \S3.2 in \cite{BD}}}
Consider the general
\emph{\myindex{second order linear differential equation}}
\begin{equation*}
A(x) y'' + B(x)y' + C(x)y = F(x) .
\end{equation*}
We usually divide through by $A(x)$ to get
\begin{equation} \label{sol:eqlin}
y'' + p(x)y' + q(x)y = f(x) ,
\end{equation}
where $p(x) = \nicefrac{B(x)}{A(x)}$, $q(x) = \nicefrac{C(x)}{A(x)}$, and
$f(x) = \nicefrac{F(x)}{A(x)}$.
The word \emph{linear\index{linear equation}} means that the equation contains no powers nor
functions of $y$, $y'$, and $y''$.
In the special case when $f(x) = 0$, we have a so-called
\emph{homogeneous\index{homogeneous linear equation}}
equation
\begin{equation} \label{sol:eqlinhom}
y'' + p(x)y' + q(x)y = 0 .
\end{equation}
We have already seen some second order linear homogeneous equations.
\begin{align*}
\qquad y'' + k^2 y & = 0 &
& \text{Two solutions are:} \quad y_1 = \cos (kx), \quad y_2 = \sin(kx) . \qquad \\
\qquad y'' - k^2 y & = 0 &
& \text{Two solutions are:} \quad y_1 = e^{kx}, \quad y_2 = e^{-kx} . \qquad
\end{align*}
If we know two solutions of a linear homogeneous equation, we know many
more of them.
\begin{theorem}[Superposition]\index{superposition}
Suppose $y_1$ and $y_2$ are two solutions of the
homogeneous equation \eqref{sol:eqlinhom}. Then
\begin{equation*}
y(x) = C_1 y_1(x) + C_2 y_2(x) ,
\end{equation*}
also solves \eqref{sol:eqlinhom} for arbitrary constants $C_1$ and $C_2$.
\end{theorem}
That is, we can add solutions together and multiply them by constants to
obtain new and different solutions. We call
the expression $C_1 y_1 + C_2 y_2$ a
\emph{\myindex{linear combination}} of $y_1$ and $y_2$.
Let us
prove this theorem; the
proof is very enlightening and illustrates how linear equations work.
\medskip
\emph{Proof:}
Let
$y = C_1 y_1 + C_2 y_2$. Then
\begin{equation*}
\begin{split}
y'' + py' + qy & =
(C_1 y_1 + C_2 y_2)'' + p(C_1 y_1 + C_2 y_2)' + q(C_1 y_1 + C_2 y_2) \\
& = C_1 y_1'' + C_2 y_2'' + C_1 p y_1' + C_2 p y_2' + C_1 q y_1 + C_2 q y_2 \\
& = C_1 ( y_1'' + p y_1' + q y_1 ) + C_2 ( y_2'' + p y_2' + q y_2 ) \\
& = C_1 \cdot 0 + C_2 \cdot 0 = 0 . \qed
\end{split}
\end{equation*}
\medskip
The proof becomes even simpler to state if we use the
operator notation.
An \emph{\myindex{operator}} is an object that eats functions and spits out functions (kind of
like what a function is, but a function eats numbers and spits out numbers).
Define the operator $L$ by
\begin{equation*}
Ly = y'' + py' + qy .
\end{equation*}
The differential equation now becomes $Ly=0$.
The operator (and the equation)
$L$ being \emph{linear}\index{linear operator} means that $L(C_1y_1 + C_2y_2) =
C_1 Ly_1 + C_2 Ly_2$. It is almost as if we were \myquote{multiplying} by $L$. The proof above becomes
\begin{equation*}
Ly = L(C_1y_1 + C_2y_2) =
C_1 Ly_1 + C_2 Ly_2 = C_1 \cdot 0 + C_2 \cdot 0 = 0 .
\end{equation*}
\medskip
Two different solutions to the second equation $y'' - k^2y = 0$ are
$y_1 = \cosh (kx)$ and $y_2 = \sinh (kx)$.
Recalling the definition of $\sinh$ and $\cosh$,
we note that these are solutions by
superposition as they
are linear combinations of the two
exponential solutions:
$\cosh(kx) = \frac{e^{kx} + e^{-kx}}{2} = (\nicefrac{1}{2}) e^{kx} +
(\nicefrac{1}{2}) e^{-kx}$ and
$\sinh(kx) = \frac{e^{kx} - e^{-kx}}{2} = (\nicefrac{1}{2}) e^{kx} -
(\nicefrac{1}{2}) e^{-kx}$.
The functions $\sinh$ and $\cosh$ are sometimes more convenient to use than the
exponential. Let us review some of their properties:
\begin{align*}
& \cosh 0 = 1 , & & \sinh 0 = 0 , \\
& \frac{d}{dx} \Bigl[ \cosh x \Bigr] = \sinh x , & & \frac{d}{dx} \Bigl[ \sinh x \Bigr] = \cosh x , \\
& \cosh^2 x - \sinh^2 x = 1 .
\end{align*}
\begin{exercise}
Derive these properties using the definitions of $\sinh$
and $\cosh$ in terms of exponentials.
\end{exercise}
Linear equations have nice and simple
answers to the existence and uniqueness question.
\begin{theorem}[Existence and uniqueness]\index{existence and uniqueness}
Suppose $p, q, f$ are continuous functions on some interval
$I$, $a$ is a number in $I$,
and $b_0, b_1$ are constants.
Then the equation
\begin{equation*}
y'' + p(x) y' + q(x) y = f(x) ,
\end{equation*}
has exactly one solution $y(x)$ defined on the interval $I$ satisfying the initial conditions
\begin{equation*}
y(a) = b_0 , \qquad y'(a) = b_1 .
\end{equation*}
\end{theorem}
For example, the equation $y'' + k^2 y = 0$ with $y(0) = b_0$ and $y'(0) = b_1$
has the solution
\begin{equation*}
y(x) = b_0 \cos (kx) + \frac{b_1}{k} \sin (kx) .
\end{equation*}
The equation $y'' - k^2 y = 0$ with $y(0) = b_0$ and $y'(0) = b_1$
has the solution
\begin{equation*}
y(x) = b_0 \cosh (kx) + \frac{b_1}{k} \sinh (kx) .
\end{equation*}
Using $\cosh$ and $\sinh$ in this solution allows us to solve for
the initial conditions in a cleaner way than if we had used the exponentials.
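Indeed, using $\cosh 0 = 1$, $\sinh 0 = 0$, and the derivative rules above,
we can check directly that the initial conditions are satisfied:
\begin{equation*}
y(0) = b_0 \cosh 0 + \frac{b_1}{k} \sinh 0 = b_0 ,
\qquad
y'(0) = b_0 k \sinh 0 + b_1 \cosh 0 = b_1 .
\end{equation*}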
\medskip
The initial conditions for a second order ODE consist of two
equations. Common sense tells us that
if we have two arbitrary constants and two equations, then we should
be able to solve
for the constants and find a solution to the differential equation
satisfying the initial conditions.
\emph{Question:} Suppose we find two different solutions $y_1$ and $y_2$ to the
homogeneous equation \eqref{sol:eqlinhom}. Can every solution
be written (using superposition) in the form
$y = C_1 y_1 + C_2 y_2$?
The answer is affirmative, provided that $y_1$ and $y_2$ are different enough in
the following sense. We say $y_1$ and $y_2$ are \emph{\myindex{linearly
independent}} if one is not a constant multiple of the other.
\begin{theorem}
Let $p, q$ be continuous functions.
Let $y_1$ and $y_2$ be two linearly independent
solutions to the homogeneous equation \eqref{sol:eqlinhom}.
Then every other solution is
of the form
\begin{equation*}
y = C_1 y_1 + C_2 y_2 .
\end{equation*}
That is, $y = C_1 y_1 + C_2 y_2$ is the general solution.
\end{theorem}
For example, we found the solutions
$y_1 = \sin x$ and $y_2 = \cos x$ for the
equation $y'' + y = 0$. It is not hard to see that sine and cosine are not
constant
multiples of each other. Indeed, if $\sin x = A \cos x$ for some constant $A$,
plugging in $x=0$ would imply $A = 0$. But then $\sin x = 0$ for all
$x$, which is preposterous.
So $y_1$ and $y_2$ are linearly independent. Hence,
\begin{equation*}
y = C_1 \cos x + C_2 \sin x
\end{equation*}
is the general solution to $y'' + y = 0$.
For two functions, checking linear independence is rather simple. Let us
see another example. Consider $y''-2x^{-2}y = 0$. Then $y_1 = x^2$ and $y_2 =
\nicefrac{1}{x}$ are solutions. To see that they are linearly independent,
suppose one is a multiple of the other, $y_1 = A y_2$; we just have to show
that $A$ cannot be a constant. In this case, $A =
\nicefrac{y_1}{y_2} = x^3$, which is most decidedly not a constant.
So $y = C_1 x^2 + C_2 \nicefrac{1}{x}$ is the general solution.
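As a quick check that $y_1$ and $y_2$ really are solutions, compute
$y_1'' = 2$ and $y_2'' = 2x^{-3}$, so
\begin{equation*}
y_1'' - 2x^{-2} y_1 = 2 - 2x^{-2} x^2 = 0 ,
\qquad
y_2'' - 2x^{-2} y_2 = 2x^{-3} - 2x^{-2} x^{-1} = 0 .
\end{equation*}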
\medskip
If you have one nonzero solution to a second order linear homogeneous
equation, then you can find another one. This is the \emph{\myindex{reduction of
order method}}. The idea is that if we somehow found $y_1$ as a solution of
$y'' + p(x) y' + q(x) y = 0$, then we try a second
solution of the form $y_2(x) = y_1(x) v(x)$.
We just need to find $v$. We plug $y_2$ into the equation:
\begin{equation*}
\begin{split}
0 =
y_2'' + p(x) y_2' + q(x) y_2
& =
\underbrace{y_1'' v + 2 y_1' v' + y_1 v''}_{y_2''}
+ p(x) \underbrace{( y_1' v + y_1 v' )}_{y_2'}
+ q(x) \underbrace{y_1 v}_{y_2}
\\
& =
y_1 v''
+ (2 y_1' + p(x) y_1) v'
+
\cancelto{0}{\bigl( y_1'' + p(x) y_1' + q(x) y_1 \bigr)} v .
\end{split}
\end{equation*}
In other words,
$y_1 v'' + (2 y_1' + p(x) y_1) v' = 0$. Using $w = v'$, we have the
first order linear equation
$y_1 w' + (2 y_1' + p(x) y_1) w = 0$. After solving this equation for $w$
(integrating factor),
we find $v$ by antidifferentiating $w$. We then form $y_2$ by computing
$y_1 v$. For example, suppose we somehow know $y_1 = x$ is a solution
to $y''+x^{-1}y'-x^{-2} y=0$.
Here $p(x) = x^{-1}$, so $2 y_1' + p(x) y_1 = 2 + 1 = 3$, and the equation for $w$ is
$xw' + 3 w = 0$. We find a solution, $w = Cx^{-3}$, and we find an
antiderivative $v = \frac{-C}{2x^2}$.
Hence $y_2 = y_1 v = \frac{-C}{2x}$.
Any $C$ works and so $C=-2$ makes $y_2 = \nicefrac{1}{x}$. Thus, the
general solution is $y = C_1 x + C_2\nicefrac{1}{x}$.
Since we have a formula for the solution to the first order linear equation,
we can write a formula for $y_2$:
\begin{equation*}
y_2(x) = y_1(x) \int \frac{e^{-\int p(x)\,dx}}{{\bigl(y_1(x)\bigr)}^2} \,dx
\end{equation*}
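The formula comes from solving the equation for $w$ directly: dividing
$y_1 w' + (2 y_1' + p(x) y_1) w = 0$ by $y_1 w$ gives
\begin{equation*}
\frac{w'}{w} = -2 \frac{y_1'}{y_1} - p(x) ,
\qquad
\ln \lvert w \rvert = -2 \ln \lvert y_1 \rvert - \int p(x)\,dx ,
\qquad
w = \frac{e^{-\int p(x)\,dx}}{{\bigl(y_1(x)\bigr)}^2} ,
\end{equation*}
where we chose the constant of integration for convenience, and then
$y_2 = y_1 v = y_1 \int w\,dx$.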
However, it is much easier to remember that we just need to try $y_2(x) =
y_1(x) v(x)$ and find $v(x)$ as we did above. The technique
works for higher order equations too: You get to reduce the order by one for each
solution you find. So it is better to remember how to do
it rather than a specific formula.
\medskip
We will study the solution of nonhomogeneous equations in
\sectionref{sec:nonhom}. We will first focus on finding general solutions to
homogeneous equations.
\subsection{Exercises}
\begin{exercise}
Show that $y=e^x$ and $y=e^{2x}$ are linearly independent.
\end{exercise}
\begin{exercise}
Take $y'' + 5 y = 10 x + 5$. Find (guess!) a solution.
\end{exercise}
\begin{exercise}
Prove the superposition principle for nonhomogeneous equations. Suppose that
$y_1$ is a solution to $L y_1 = f(x)$ and $y_2$ is a solution to
$L y_2 = g(x)$ (same linear operator $L$). Show that $y = y_1+y_2$ solves
$Ly = f(x) + g(x)$.
\end{exercise}
\begin{exercise}
For the equation $x^2 y'' - x y' = 0$, find two solutions, show that they
are linearly independent and find the general solution.
Hint: Try $y = x^r$.
\end{exercise}
\pagebreak[2]
Equations of the form $a x^2 y'' + b x y' + c y = 0$ are called
\emph{Euler's equations\index{Euler's equation}} or
\emph{Cauchy--Euler equations\index{Cauchy--Euler equation}}.
They are solved by trying
$y=x^r$ and solving for $r$ (assume $x \geq 0$ for simplicity).
\begin{exercise} \label{sol:eulerex}
\pagebreak[2]
Suppose that ${(b-a)}^2-4ac > 0$.
\begin{tasks}
\task Find a formula for the general solution
of Euler's equation (see above) $a x^2 y'' + b x y' + c y = 0$.
Hint: Try $y=x^r$ and find a formula for $r$.
\task What happens when ${(b-a)}^2-4ac = 0$ or ${(b-a)}^2-4ac < 0$?
\end{tasks}
\end{exercise}
We will revisit the case when ${(b-a)}^2-4ac < 0$ later.
\begin{exercise} \label{sol:eulerexln}
Same equation as in \exerciseref{sol:eulerex}.
Suppose ${(b-a)}^2-4ac = 0$. Find a formula for the general solution
of $a x^2 y'' + b x y' + c y = 0$. Hint: Try $y=x^r \ln x$ for the second
solution.
\end{exercise}
\begin{exercise}[reduction of order] \label{exercise:reductionoforder}
Suppose $y_1$ is a solution to $y'' + p(x) y' + q(x) y = 0$.
By directly plugging into the equation,
show that
\begin{equation*}
y_2(x) = y_1(x) \int \frac{e^{-\int p(x)\,dx}}{{\bigl(y_1(x)\bigr)}^2} \,dx
\end{equation*}
is also a solution.
\end{exercise}
\begin{exercise}[\myindex{Chebyshev's equation of order 1}]
Take
$(1-x^2)y''-xy' + y = 0$.
\begin{tasks}
\task Show that $y=x$ is a solution.
\task Use reduction of order to find a second linearly independent solution.
\task Write down the general solution.
\end{tasks}
\end{exercise}
\begin{exercise}[\myindex{Hermite's equation of order 2}]
Take
$y''-2xy' + 4y = 0$.
\begin{tasks}
\task Show that $y=1-2x^2$ is a solution.
\task Use reduction of order to find a second linearly independent solution.
(It's OK to leave a definite integral in the formula.)
\task Write down the general solution.
\end{tasks}
\end{exercise}
\setcounter{exercise}{100}
\begin{exercise}
Are $\sin(x)$ and $e^x$ linearly independent? Justify.
\end{exercise}
\exsol{%
Yes. To justify try to find a constant $A$ such that $\sin(x) = A e^x$
for all $x$.
}
\begin{exercise}
Are $e^x$ and $e^{x+2}$ linearly independent? Justify.
\end{exercise}
\exsol{%
No. $e^{x+2} = e^2 e^x$.
}
\begin{exercise}
Guess a solution to $y'' + y' + y= 5$.
\end{exercise}
\exsol{%
$y=5$
}
\begin{exercise}
Find the general solution to
$x y'' + y' = 0$. Hint: It is a first order ODE in $y'$.
\end{exercise}
\exsol{%
$y=C_1 \ln(x) + C_2$
}
\begin{exercise}
Write down an equation (guess) for which we have the solutions
$e^x$ and $e^{2x}$. Hint: Try an equation of the form
$y''+Ay'+By = 0$ for constants $A$ and $B$,
plug in both $e^x$ and $e^{2x}$ and solve for $A$ and $B$.
\end{exercise}
\exsol{%
$y''-3y'+2y = 0$
}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\sectionnewpage
\section{Constant coefficient second order linear ODEs}
\label{sec:ccsol}
%mbxINTROSUBSECTION
\sectionnotes{more than 1 lecture\EPref{,
second part of \S3.1 in \cite{EP}}\BDref{,
\S3.1 in \cite{BD}}}
\subsection{Solving constant coefficient equations}
Consider the problem
\begin{equation*}
y''-6y'+8y = 0, \qquad y(0) = - 2, \qquad y'(0) = 6 .
\end{equation*}
This is a second order linear homogeneous equation with constant
coefficients. \emph{Constant coefficients\index{constant coefficient}}
means that the functions
in front of $y''$, $y'$, and $y$ are constants; they do not depend on $x$.
To guess a solution, think of a function that stays essentially the
same when we differentiate it, so that we can take the function and its
derivatives, add some multiples of these together, and end up with zero.
Yes, we are talking about the exponential.
Let us try\footnote{%
Making an educated guess with some parameters to solve for
is such a central technique in differential equations that people sometimes use
a fancy name for such a guess: \emph{\myindex{ansatz}}, German for \myquote{initial
placement of a tool at a work piece.} Yes, the Germans have a word for that.}
a solution of the form $y = e^{rx}$. Then $y' = r e^{rx}$ and
$y'' = r^2 e^{rx}$. Plug in to get
\begin{align*}
y''-6y'+8y & = 0 , \\
\underbrace{r^2 e^{rx}}_{y''} -6 \underbrace{r e^{rx}}_{y'}+8 \underbrace{e^{rx}}_{y} & = 0 , \\
r^2 -6 r +8 & = 0 \qquad \text{(divide through by } e^{rx} \text{)},\\
(r-2)(r-4) & = 0 .
\end{align*}
Hence, if $r=2$ or $r=4$, then $e^{rx}$ is a solution. So let $y_1 = e^{2x}$
and $y_2 = e^{4x}$.
\begin{exercise}
Check that $y_1$ and $y_2$ are solutions.
\end{exercise}
The functions $e^{2x}$ and $e^{4x}$ are linearly independent. If they
were not linearly independent, we could write $e^{4x} = C e^{2x}$ for
some constant $C$,
implying that $e^{2x} = C$ for all $x$, which is clearly not possible.
Hence, we can write the general solution as
\begin{equation*}
y = C_1 e^{2x} + C_2 e^{4x} .
\end{equation*}
We need to solve for $C_1$ and $C_2$. To apply the initial conditions,
we first find $y' = 2 C_1 e^{2x} + 4 C_2 e^{4x}$. We plug $x=0$ into
$y$ and $y'$ and solve.
\begin{equation*}
\begin{aligned}
-2 & = y(0) = C_1 + C_2 , \\
6 & = y'(0) = 2 C_1 + 4 C_2 .
\end{aligned}
\end{equation*}
Either apply some matrix algebra, or just solve these by high school
math. For example, divide the second equation by 2
to obtain $3 = C_1 + 2 C_2$, and subtract the two equations to
get $5 = C_2$. Then $C_1 = -7$ as $-2 = C_1 + 5$. Hence, the solution we
are
looking for is
\begin{equation*}
y = -7 e^{2x} + 5 e^{4x} .
\end{equation*}
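It is good to check the answer: $y(0) = -7 + 5 = -2$ and, since
$y' = -14 e^{2x} + 20 e^{4x}$, also $y'(0) = -14 + 20 = 6$, exactly as required.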
\medskip
We generalize this example into a method.
Suppose that we have an equation
\begin{equation} \label{ccsol:eq}
a y'' + b y' + c y = 0 ,
\end{equation}
where $a, b, c$ are constants. Try the solution $y = e^{rx}$ to obtain
\begin{equation*}
a r^2 e^{rx} +
b r e^{rx} +
c e^{rx} = 0 .
\end{equation*}
Divide by $e^{rx}$ to obtain the so-called
\emph{\myindex{characteristic equation}} of the ODE:
\begin{equation*}
a r^2 +
b r +
c = 0 .
\end{equation*}
Solve for $r$ by using the \myindex{quadratic formula}:
\begin{equation*}
r_1, r_2 = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a} .
\end{equation*}
Suppose that $b^2 -4ac \geq 0$ for now so that $r_1$ and $r_2$ are real.
So $e^{r_1 x}$ and $e^{r_2 x}$ are solutions. There is
still a difficulty if $r_1 = r_2$, but it is not hard to overcome.
\begin{theorem}
Suppose that $r_1$ and $r_2$ are the roots of the characteristic equation.
\begin{enumerate}[(i)]
\item If $r_1$ and $r_2$ are distinct and real (when $b^2 - 4ac > 0$),
then \eqref{ccsol:eq} has the general solution
\begin{equation*}
y = C_1 e^{r_1 x} + C_2 e^{r_2 x} .
\end{equation*}
\item If $r_1 = r_2$ (happens when $b^2 - 4ac = 0$),
then \eqref{ccsol:eq} has the general solution
\begin{equation*}
y = (C_1 + C_2 x)\, e^{r_1 x} .
\end{equation*}
\end{enumerate}
\end{theorem}
\begin{example} \label{example:expsecondorder}
Solve
\begin{equation*}
y'' - k^2 y = 0 .
\end{equation*}
The characteristic equation is $r^2 - k^2 = 0$ or
$(r-k)(r+k) = 0$. Consequently, $e^{-k x}$ and $e^{kx}$ are the two
linearly independent solutions, and the general solution is
\begin{equation*}
y = C_1 e^{kx} + C_2e^{-kx} .
\end{equation*}
Since
$\cosh s = \frac{e^s+e^{-s}}{2}$
and
$\sinh s = \frac{e^s-e^{-s}}{2}$,
we can also write the general solution
as
\begin{equation*}
y = D_1 \cosh(kx) + D_2 \sinh(kx) .
\end{equation*}
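The two forms are related. Writing the hyperbolic functions in terms of exponentials,
\begin{equation*}
D_1 \cosh(kx) + D_2 \sinh(kx)
= \frac{D_1 + D_2}{2} \, e^{kx} + \frac{D_1 - D_2}{2} \, e^{-kx} ,
\end{equation*}
so the two general solutions agree with $C_1 = \frac{D_1+D_2}{2}$ and $C_2 = \frac{D_1-D_2}{2}$.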
\end{example}
\begin{example}
Find the general solution of
\begin{equation*}
y'' -8 y' + 16 y = 0 .
\end{equation*}
The characteristic equation is $r^2 - 8 r + 16 = {(r-4)}^2 = 0$.
The equation has a
double root $r_1 = r_2 = 4$. The general solution is, therefore,
\begin{equation*}
y = (C_1 + C_2 x)\, e^{4 x} = C_1 e^{4x} + C_2 x e^{4x} .
\end{equation*}
\begin{exercise}
Check that $e^{4x}$ and $x e^{4x}$ are linearly independent.
\end{exercise}
It is good to check your work.
That $e^{4x}$ solves the equation is clear.
Let us check that
$x e^{4x}$ solves the equation.
Compute
$y' = e^{4x} + 4xe^{4x}$ and
$y'' = 8 e^{4x} + 16xe^{4x}$. Plug in,
\begin{equation*}
y'' - 8 y' + 16 y =
8 e^{4x} + 16xe^{4x} - 8(e^{4x} + 4xe^{4x}) + 16 xe^{4x} =
0 .
\end{equation*}
\end{example}
In some sense, a doubled root rarely happens. If coefficients are
picked randomly, a doubled root is unlikely.
There are, however, some real-world problems
where a doubled root does happen naturally (e.g., critically damped
mass-spring system as we will see).
Let us give a short argument for why the solution $x e^{r x}$ works for a
doubled root.
This case is a limiting case of
two distinct but very close roots. Note that
$\frac{e^{r_2 x} - e^{r_1 x}}{r_2 - r_1}$ is a solution when the roots are
distinct. When we take the limit as $r_1$ goes to $r_2$, we are really
taking the
derivative of $e^{rx}$ using $r$ as the variable. Therefore, the limit is
$x e^{rx}$, and hence this is a solution in the doubled root case.
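We can also verify directly that $x e^{rx}$ works. If $r$ is a doubled root
of $a r^2 + b r + c = 0$, then $r = \frac{-b}{2a}$, so $2ar + b = 0$. For
$y = x e^{rx}$, we have $y' = (1 + rx) e^{rx}$ and $y'' = (2r + r^2 x) e^{rx}$, and so
\begin{equation*}
a y'' + b y' + c y
= \bigl( (2ar + b) + (a r^2 + b r + c) x \bigr) e^{rx}
= (0 + 0 \cdot x) \, e^{rx} = 0 .
\end{equation*}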
We remark that in some numerical computations,
two very close roots may lead to numerical instability while
a doubled root will not.
\subsection{Complex numbers and Euler's formula}
A polynomial may have complex roots. The
equation $r^2 + 1 = 0$ has no real roots, but it does have two complex roots.
Here we review some properties of complex numbers\index{complex number}.
Complex numbers may seem a strange concept, especially because of the
terminology. There is nothing imaginary or really complicated about complex
numbers.
A complex number is simply a pair of real numbers, $(a,b)$.
Think of a complex number as a point in the plane. We add complex numbers
in the straightforward way: $(a,b)+(c,d)=(a+c,b+d)$. We define
multiplication\index{multiplication of complex numbers} by
\begin{equation*}
(a,b) \times (c,d) \overset{\text{def}}{=} (ac-bd,ad+bc) .
\end{equation*}
It turns out that with this multiplication rule, all the standard properties
of arithmetic hold. Further, and most importantly, $(0,1) \times (0,1) =
(-1,0)$.
Generally we write $(a,b)$ as $a+ib$, and we treat $i$ as if it were an
unknown. When $b$ is zero, then $(a,0)$ is just the number $a$.
We do arithmetic with complex numbers just as we would
with polynomials.
The property we just mentioned becomes $i^2 = -1$.
So whenever we see $i^2$, we replace it by $-1$.
For example,
\begin{equation*}
(2+3i)(4i) - 5i =
(2\times 4)i + (3 \times 4) i^2 - 5i
=
8i + 12 (-1) - 5i
=
-12 + 3i .
\end{equation*}
The numbers
$i$ and $-i$ are the two roots of $r^2 + 1 = 0$.
Some engineers use the letter $j$ instead of $i$ for the square
root of $-1$. We use the mathematicians' convention and use $i$.
\begin{exercise}
Make sure you understand (that you can justify)
the following identities:
\begin{tasks}(2)
\task $i^2 = -1$, $i^3 = -i$, $i^4 = 1$,
\task $\dfrac{1}{i} = -i$,
\task $(3-7i)(-2-9i) = \cdots = -69-13i$,
\task $(3-2i)(3+2i) = 3^2 - {(2i)}^2 = 3^2 + 2^2 = 13$,
\task $\frac{1}{3-2i} = \frac{1}{3-2i} \frac{3+2i}{3+2i} = \frac{3+2i}{13}
= \frac{3}{13}+\frac{2}{13}i$.
\end{tasks}
\end{exercise}
\pagebreak[2]
We also define the exponential $e^{a+ib}$ of a complex number. We do
this by writing down the Taylor series and plugging in the complex
number. Because most properties of the exponential can be proved by looking
at the Taylor series, these
properties still hold for the complex
exponential. For example, the very important property $e^{x+y} = e^x e^y$ still holds. This means that
$e^{a+ib} = e^a e^{ib}$. Hence if we can compute $e^{ib}$, we can
compute $e^{a+ib}$. For $e^{ib}$, we use the so-called
\emph{\myindex{Euler's formula}}.
\begin{theorem}[Euler's formula] \label{eulersformula}
\begin{equation*}
\mybxbg{~~
e^{i \theta} = \cos \theta + i \sin \theta
\qquad \text{ and } \qquad
e^{- i \theta} = \cos \theta - i \sin \theta .
~~}
\end{equation*}
\end{theorem}
In other words, $e^{a+ib} = e^a \bigl( \cos(b) + i \sin(b) \bigr) = e^a \cos(b) + i e^a \sin(b)$.
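For instance,
\begin{equation*}
e^{i\pi} = \cos \pi + i \sin \pi = -1
\qquad \text{and} \qquad
e^{2 + i \, \nicefrac{\pi}{2}} = e^2 \bigl( \cos \nicefrac{\pi}{2} + i \sin \nicefrac{\pi}{2} \bigr) = i e^2 .
\end{equation*}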
\begin{exercise}
Using Euler's formula, check the identities:
\begin{equation*}
\cos \theta = \frac{e^{i \theta} + e^{-i \theta}}{2}
\qquad \text{and} \qquad
\sin \theta = \frac{e^{i \theta} - e^{-i \theta}}{2i}.
\end{equation*}
\end{exercise}
\begin{exercise}
Double angle identities:
Start with $e^{i(2\theta)} = {\bigl(e^{i \theta} \bigr)}^2$. Use Euler on
each side and deduce:
\begin{equation*}
\cos (2\theta) = \cos^2 \theta - \sin^2 \theta
\qquad \text{and} \qquad
\sin (2\theta) = 2 \sin \theta \cos \theta .
\end{equation*}
\end{exercise}
For a complex number $a+ib$, we call
$a$ the \emph{\myindex{real part}} and $b$ the \emph{\myindex{imaginary part}} of the number.
Often the following notation is used:
\begin{equation*}
\operatorname{Re}(a+ib) = a
\qquad \text{and} \qquad
\operatorname{Im}(a+ib) = b.
\end{equation*}
\subsection{Complex roots}
Suppose the differential equation $ay'' + by' + cy = 0$ has the
characteristic equation
$a r^2 + b r + c = 0$ that has \myindex{complex roots}.
By the quadratic
formula, the roots are
$\frac{-b \pm \sqrt{b^2 - 4ac}}{2a}$.
These roots are complex if $b^2 - 4ac < 0$. In this case, we
write the roots as
\begin{equation*}
r_1, r_2 = \frac{-b}{2a} \pm i\frac{\sqrt{4ac - b^2}}{2a} .
\end{equation*}
As you can see, we get a pair of roots of the form $\alpha \pm i
\beta$. We could still write the solution as
\begin{equation*}
y = C_1 e^{(\alpha+i\beta)x} + C_2 e^{(\alpha-i\beta)x} .
\end{equation*}
However, the exponential is now complex-valued. We need to allow
$C_1$ and $C_2$ to be complex numbers to obtain a real-valued solution (which
is what we are after). While there is nothing particularly wrong with this
approach,
it can make calculations harder and it is generally preferred
to find two real-valued
solutions.
\hyperref[eulersformula]{Euler's formula} comes to the rescue. Let
\begin{equation*}
y_1 = e^{(\alpha+i\beta)x} \qquad \text{and} \qquad y_2 = e^{(\alpha-i\beta)x} .
\end{equation*}
Then
\begin{equation*}
\begin{aligned}
y_1 & = e^{\alpha x} \cos (\beta x) + i e^{\alpha x} \sin (\beta x) , \\
y_2 & = e^{\alpha x} \cos (\beta x) - i e^{\alpha x} \sin (\beta x) .
\end{aligned}
\end{equation*}
Linear combinations of solutions are also solutions. Hence,
\begin{equation*}
\begin{aligned}
y_3 & = \frac{y_1 + y_2}{2} = e^{\alpha x} \cos (\beta x) , \\
y_4 & = \frac{y_1 - y_2}{2i} = e^{\alpha x} \sin (\beta x) ,
\end{aligned}
\end{equation*}
are also solutions.
It is not hard to
see that $y_3$ and $y_4$ are linearly independent (not multiples of each other).
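Indeed, if $e^{\alpha x} \sin (\beta x)$ were a constant multiple of
$e^{\alpha x} \cos (\beta x)$, then dividing by $e^{\alpha x}$ and plugging in
$x = 0$ would force the constant to be zero, making $\sin (\beta x) = 0$ for
all $x$, which is impossible as $\beta \not= 0$.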
So the general solution can be written in terms of $y_3$ and $y_4$.
And as they are real-valued,
no complex numbers need to be used for the arbitrary constants in the
general solution.
We summarize what we found as a theorem.
\begin{theorem}
Take the equation
\begin{equation*}
ay'' + by' + cy = 0 .
\end{equation*}
If the characteristic equation has the roots $\alpha \pm i \beta$
(when $b^2 - 4ac < 0$),
then the general solution is
\begin{equation*}
y = C_1 e^{\alpha x} \cos (\beta x) + C_2 e^{\alpha x} \sin (\beta x) .
\end{equation*}
\end{theorem}
\begin{example} \label{example:sincossecondorder}
Find the general solution of $y'' + k^2 y = 0$, for a constant
$k > 0$.
The characteristic equation is $r^2 + k^2 = 0$. Therefore,
the roots are $r = \pm ik$, and by the theorem, we have the general solution
\begin{equation*}
y = C_1 \cos (kx) + C_2 \sin (kx) .
\end{equation*}
\end{example}
\begin{example}
Find the solution of $y'' - 6 y' + 13 y = 0$, $y(0) = 0$, $y'(0) =
10$.
The characteristic equation is $r^2 - 6 r + 13 = 0$. Completing the
square, we get ${(r-3)}^2 + 2^2 = 0$ and hence the roots are
$r = 3 \pm 2i$.
Per the theorem, the general solution is
\begin{equation*}
y = C_1 e^{3x} \cos (2x) + C_2 e^{3x} \sin (2x) .
\end{equation*}
To find the solution satisfying the initial conditions, we first plug in zero
to get
\begin{equation*}
0 = y(0) = C_1 e^{0} \cos 0 + C_2 e^{0} \sin 0 = C_1 .
\end{equation*}
Hence, $C_1 = 0$ and $y = C_2 e^{3x} \sin (2x)$. We differentiate,
\begin{equation*}
y' = 3C_2 e^{3x} \sin (2x) + 2C_2 e^{3x} \cos (2x) .
\end{equation*}
We again plug in the initial condition and obtain $10 = y'(0) = 2C_2$, or
$C_2 = 5$. The solution we are seeking is
\begin{equation*}
y = 5 e^{3x} \sin (2x) .
\end{equation*}
\end{example}
\subsection{Exercises}
\begin{exercise}
Find the general solution of $2y'' + 2y' -4 y = 0$.
\end{exercise}
\begin{exercise}
Find the general solution of $y'' + 9y' - 10 y = 0$.
\end{exercise}
\begin{exercise}
Solve $y'' - 8y' + 16 y = 0$ for $y(0) = 2$, $y'(0) = 0$.
\end{exercise}
\begin{exercise}
Solve $y'' + 9y' = 0$ for $y(0) = 1$, $y'(0) = 1$.
\end{exercise}
\begin{exercise}
Find the general solution of $2y'' + 50y = 0$.
\end{exercise}
\begin{exercise}
Find the general solution of $y'' + 6 y' + 13 y = 0$.
\end{exercise}
\begin{exercise}
Find the general solution of $y'' = 0$ using the methods of this section.
\end{exercise}
\begin{exercise}
The method of this section applies to equations of other orders than two.
We will see higher orders later.
Solve the first order equation
$2y' + 3y = 0$ using the methods of this section.
\end{exercise}
\begin{exercise}
Let us revisit the Cauchy--Euler equations\index{Cauchy--Euler equation} of
\exercisevref{sol:eulerex}. Suppose now
that ${(b-a)}^2-4ac < 0$. Find a formula for the general solution
of $a x^2 y'' + b x y' + c y = 0$. Hint: Note that $x^r = e^{r \ln x}$.
\end{exercise}
\begin{exercise}
Find the solution to
$y''-(2\alpha) y' + \alpha^2 y=0$, $y(0) = a$, $y'(0)=b$,
where $\alpha$, $a$, and $b$ are real numbers.
\end{exercise}
\begin{exercise}
Construct an equation such that $y = C_1 e^{-2x} \cos(3x) + C_2 e^{-2x}
\sin(3x)$ is the general
solution.
\end{exercise}
\setcounter{exercise}{100}
\begin{exercise}
Find the general solution to
$y''+4y'+2y=0$.
\end{exercise}
\exsol{%
$y =
C_1 e^{(-2+\sqrt{2}) x}
+
C_2 e^{(-2-\sqrt{2}) x}$
}
\begin{exercise}
Find the general solution to
$y''-6y'+9y=0$.
\end{exercise}
\exsol{%
$y =
C_1 e^{3x}
+
C_2 x e^{3x}$
}
\begin{exercise}
Find the solution to
$2y''+y'+y=0$, $y(0) = 1$, $y'(0)=-2$.
\end{exercise}
\exsol{%
$y =
e^{-x/4} \cos\bigl((\nicefrac{\sqrt{7}}{4})x\bigr)
-
\sqrt{7}
e^{-x/4} \sin\bigl((\nicefrac{\sqrt{7}}{4})x\bigr)$
}
\begin{exercise}
Find the solution to
$2y''+y'-3y=0$, $y(0) = a$, $y'(0)=b$.
\end{exercise}
\exsol{%
$y = \frac{2(a-b)}{5} \, e^{-3x/2}+\frac{3 a+2 b}{5} \, e^x$
}
\begin{exercise}
Find the solution to
$z''(t) = -2z'(t)-2z(t)$, $z(0) = 2$, $z'(0)= -2$.
\end{exercise}
\exsol{%
$z(t) =
2e^{-t} \cos(t)$
}
\begin{exercise}
Find the solution to
$y''-(\alpha+\beta) y' + \alpha \beta y=0$, $y(0) = a$, $y'(0)=b$,
where $\alpha$, $\beta$, $a$, and $b$ are real numbers, and $\alpha \not=
\beta$.
\end{exercise}
\exsol{%
$y =
\frac{a \beta-b}{\beta-\alpha} e^{\alpha x} +
\frac{b-a \alpha}{\beta-\alpha} e^{\beta x}$
}
\begin{exercise}
Construct an equation such that $y = C_1 e^{3x} + C_2 e^{-2x}$ is the general
solution.
\end{exercise}
\exsol{%
$y'' -y'-6y=0$
}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\sectionnewpage
\section{Higher order linear ODEs} \label{sec:hol}
%mbxINTROSUBSECTION
\sectionnotes{somewhat more than 1 lecture\EPref{, \S3.2 and \S3.3 in
\cite{EP}}\BDref{,
\S4.1 and \S4.2 in \cite{BD}}}
%After reading this lecture, it may be good to try
%Project III\index{IODE software!Project III} from the
%IODE website: \url{http://www.math.uiuc.edu/iode/}.
%
%\medskip
We briefly study higher order equations.
Equations appearing in applications tend to be second
order. Higher order equations do appear from time to time, but
generally the world around us is \myquote{second order.}
The basic results about linear ODEs of higher order are essentially
the same as for second order equations, with 2 replaced by $n$.
The important concept
of linear independence is somewhat more complicated when more than two
functions are involved.
For higher order constant coefficient ODEs, the methods developed are also
somewhat harder to apply,
but we will not dwell on these complications.
It is also possible to use the methods for systems
of linear equations from \chapterref{sys:chapter} to solve higher order
constant coefficient equations.
Let us start with a general homogeneous linear equation
\begin{equation} \label{hol:eqlinhom}
y^{(n)} + p_{n-1}(x)y^{(n-1)} + \cdots + p_1(x) y' + p_0(x) y = 0 .
\end{equation}
\begin{theorem}[Superposition]\index{superposition}
If $y_1$, $y_2$, \ldots, $y_n$ are solutions of the
homogeneous equation \eqref{hol:eqlinhom}, then
\begin{equation*}
y(x) = C_1 y_1(x) + C_2 y_2(x) + \cdots + C_n y_n(x)
\end{equation*}
also solves \eqref{hol:eqlinhom}
for arbitrary constants $C_1, C_2, \ldots, C_n$.
\end{theorem}
That is, a \emph{\myindex{linear combination}} of solutions
to \eqref{hol:eqlinhom}
is a solution to \eqref{hol:eqlinhom}.
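The proof is the same computation as for second order. Writing
$L y = y^{(n)} + p_{n-1}(x) y^{(n-1)} + \cdots + p_0(x) y$, the operator $L$ is linear, and so
\begin{equation*}
L ( C_1 y_1 + C_2 y_2 + \cdots + C_n y_n )
= C_1 L y_1 + C_2 L y_2 + \cdots + C_n L y_n
= C_1 \cdot 0 + C_2 \cdot 0 + \cdots + C_n \cdot 0 = 0 .
\end{equation*}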
There is also the existence and uniqueness theorem for linear
equations, including nonhomogeneous ones.
\begin{theorem}[Existence and uniqueness]\index{existence and uniqueness}
Suppose $p_0, p_1, \ldots, p_{n-1}$, and $f$ are continuous functions
on some interval $I$,
$a$ is a number in $I$,
and $b_0, b_1, \ldots, b_{n-1}$ are constants.
Then the equation
\begin{equation*} %\label{hol:eqlin}
y^{(n)} + p_{n-1}(x)y^{(n-1)} + \cdots + p_1(x) y' + p_0(x) y = f(x)
\end{equation*}
has exactly one solution $y(x)$ defined on the same interval $I$
satisfying the initial conditions
\begin{equation*}
y(a) = b_0, \quad y'(a) = b_1, \quad \ldots, \quad y^{(n-1)}(a) = b_{n-1} .
\end{equation*}
\end{theorem}
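For instance, the equation $y''' = 0$ with $y(0) = b_0$, $y'(0) = b_1$,
$y''(0) = b_2$ can be solved by integrating three times, and the theorem
guarantees that
\begin{equation*}
y(x) = b_0 + b_1 x + \frac{b_2}{2} x^2
\end{equation*}
is the only solution.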
\subsection{Linear independence}
When we had two functions $y_1$ and $y_2$, we said they were linearly
independent if one was not a multiple of the other. The same idea holds for
$n$ functions, although in this case it is easier to state it as follows. The functions
$y_1$, $y_2$, \ldots, $y_n$ are \emph{\myindex{linearly independent}} if
the equation
\begin{equation*}
c_1 y_1 + c_2 y_2 + \cdots + c_n y_n = 0
\end{equation*}
has only the trivial solution $c_1 = c_2 = \cdots = c_n = 0$, where the
equation must hold for all $x$. If we can