# HG changeset patch
# User Rob Beezer
# Date 1352941699 28800
# Node ID be31a794f0d272f5134c0abc4c76fe040b52e45a
# Parent  5e80f9993ea51839ddee80b62a4c658227efcd8d
Line breaks, Chapter EE

diff --git a/src/section-EE.xml b/src/section-EE.xml
--- a/src/section-EE.xml
+++ b/src/section-EE.xml
@@ -310,23 +310,23 @@

 Put it all together and
-
-m$}\\]]>
-\text{}\\
+
+m$}\\]]>
+\text{}\\
-
+

 Let $k$ be the smallest integer such that
-(A-b_kI_n)(A-b_{k-1}I_n)\cdots(A-b_3I_n)(A-b_2I_n)(A-b_1I_n)\vect{x}=\zerovector.
+(A-b_kI_n)(A-b_{k-1}I_n)\cdots(A-b_2I_n)(A-b_1I_n)\vect{x}=\zerovector.

 From the preceding equation, we know that $k\leq m$. Define the vector $\vect{z}$ by
-\vect{z}=(A-b_{k-1}I_n)\cdots(A-b_3I_n)(A-b_2I_n)(A-b_1I_n)\vect{x}
+\vect{z}=(A-b_{k-1}I_n)\cdots(A-b_2I_n)(A-b_1I_n)\vect{x}

@@ -759,7 +759,7 @@
-Since every eigenvalue must have at least one eigenvector, the associated eigenspace cannot be trivial, and so $\geomult{A}{\lambda}\geq 1$.
+Every eigenvalue must have at least one eigenvector, so the associated eigenspace cannot be trivial, and so $\geomult{A}{\lambda}\geq 1$.

 Eigenvalue multiplicities, matrix of size 4
@@ -921,7 +921,8 @@

 Computing eigenvectors,
-
+
+
 \begin{bmatrix}
@@ -937,10 +938,11 @@
 \end{bmatrix}\\
-
+
 =\spn{\set{\colvector{-1\\\frac{3}{2}\\0\\1\\0},\,\colvector{0\\\frac{1}{2}\\1\\0\\1}}}
 =\spn{\set{\colvector{-2\\3\\0\\2\\0},\,\colvector{0\\1\\2\\0\\2}}}\\
-
+
+
 \begin{bmatrix}
@@ -956,7 +958,7 @@
 \end{bmatrix}\\
-
+

@@ -988,10 +990,10 @@
 So the eigenvalues are $\lambda=2,\,-1,\,2+i,\,2-i$ with algebraic multiplicities $\algmult{F}{2}=1$, $\algmult{F}{-1}=1$, $\algmult{F}{2+i}=2$ and $\algmult{F}{2-i}=2$.
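The hunk above lists eigenvalues of the archetype matrix together with their algebraic multiplicities. The matrix entries themselves are not reproduced in this patch, so the following sketch uses a small stand-in matrix (an assumption, not the archetype) to show how algebraic multiplicities can be read off with SymPy:

```python
import sympy as sp

# Hypothetical 3x3 stand-in matrix; the archetype matrix F is not in this hunk.
A = sp.Matrix([[2, 1, 0],
               [0, 2, 0],
               [0, 0, -1]])

# eigenvals() returns a dictionary {eigenvalue: algebraic multiplicity}.
mults = A.eigenvals()
assert mults[2] == 2    # lambda = 2 has algebraic multiplicity 2
assert mults[-1] == 1   # lambda = -1 has algebraic multiplicity 1
```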

-Computing eigenvectors,
+We compute eigenvectors, noting that the last two basis vectors are each a scalar multiple of what will provide,
-
-
+
+
 \begin{bmatrix}
@@ -1009,13 +1011,13 @@
 \end{bmatrix}\\
-
+
 =\spn{\set{\colvector{-\frac{1}{5}\\0\\-\frac{3}{5}\\\frac{1}{5}\\-\frac{4}{5}\\1}}}
 =\spn{\set{\colvector{-1\\0\\-3\\1\\-4\\5}}}\\
-
-
+
+
 \begin{bmatrix}
@@ -1033,13 +1035,13 @@
 \end{bmatrix}\\
-
+
 =\spn{\set{\colvector{-\frac{1}{2}\\\frac{3}{2}\\-\frac{1}{2}\\0\\\frac{1}{2}\\1}}}
 =\spn{\set{\colvector{-1\\3\\-1\\0\\1\\2}}}\\
-
+
 \begin{bmatrix}
@@ -1058,13 +1060,12 @@
 \end{bmatrix}\\
-
-=\spn{\set{\colvector{-\frac{1}{5}(7+i)\\\frac{1}{5}(9+2i)\\-1\\1\\-1\\1}}}
+
 =\spn{\set{\colvector{-7-i\\9+2i\\-5\\5\\-5\\5}}}\\
-
+
 \begin{bmatrix}
@@ -1073,8 +1074,7 @@
 \end{bmatrix}\\
-
-\rref
+
 \begin{bmatrix}
@@ -1083,13 +1083,12 @@
 \end{bmatrix}\\
-
-=\spn{\set{\colvector{\frac{1}{5}(-7+i)\\\frac{1}{5}(9-2i)\\-1\\1\\-1\\1}}}
+
 =\spn{\set{\colvector{-7+i\\9-2i\\-5\\5\\-5\\5}}}\\

-So the eigenspace dimensions yield geometric multiplicities $\geomult{F}{2}=1$, $\geomult{F}{-1}=1$, $\geomult{F}{2+i}=1$ and $\geomult{F}{2-i}=1$. This example demonstrates some of the possibilities for the appearance of complex eigenvalues, even when all the entries of the matrix are real. Notice how all the numbers in the analysis of $\lambda=2-i$ are conjugates of the corresponding number in the analysis of $\lambda=2+i$. This is the content of the upcoming .
+Eigenspace dimensions yield geometric multiplicities of $\geomult{F}{2}=1$, $\geomult{F}{-1}=1$, $\geomult{F}{2+i}=1$ and $\geomult{F}{2-i}=1$. This example demonstrates some of the possibilities for the appearance of complex eigenvalues, even when all the entries of the matrix are real. Notice how all the numbers in the analysis of $\lambda=2-i$ are conjugates of the corresponding numbers in the analysis of $\lambda=2+i$. This is the content of the upcoming .
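The paragraph above observes that complex eigenvalues of a real matrix occur in conjugate pairs. A quick numerical check of that fact, using an illustrative $2\times 2$ real matrix (not the example matrix from the text), can be sketched with NumPy:

```python
import numpy as np

# Any real matrix with complex eigenvalues serves; this one has eigenvalues 2 + i and 2 - i.
A = np.array([[2.0, -1.0],
              [1.0,  2.0]])
evals = np.linalg.eigvals(A)

# The spectrum of a real matrix is closed under conjugation:
# conjugating every eigenvalue returns the same multiset of eigenvalues.
assert np.allclose(np.sort_complex(evals), np.sort_complex(np.conj(evals)))
```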

@@ -1115,7 +1114,7 @@

 Computing eigenvectors,
-
+
 \begin{bmatrix}
@@ -1131,11 +1130,11 @@
 \end{bmatrix}\\
-
+
 =\spn{\set{\colvector{1\\-1\\-2\\-1\\1}}}
-
+
 \begin{bmatrix}
@@ -1151,12 +1150,12 @@
 \end{bmatrix}\\
-
+
 =\spn{\set{\colvector{\frac{1}{2}\\0\\-\frac{1}{2}\\-1\\1}}}
 =\spn{\set{\colvector{1\\0\\-1\\-2\\2}}}
-
+
 \begin{bmatrix}
@@ -1172,11 +1171,11 @@
 \end{bmatrix}\\
-
+
 =\spn{\set{\colvector{-1\\2\\2\\0\\1}}}
-
+
 \begin{bmatrix}
@@ -1192,12 +1191,12 @@
 \end{bmatrix}\\
-
+
 =\spn{\set{\colvector{\frac{1}{2}\\0\\0\\-\frac{1}{2}\\1}}}
 =\spn{\set{\colvector{1\\0\\0\\-1\\2}}}
-
+
 \begin{bmatrix}
@@ -1213,7 +1212,7 @@
 \end{bmatrix}\\
-
+
 =\spn{\set{\colvector{1\\-\frac{1}{2}\\-1\\-2\\1}}}
 =\spn{\set{\colvector{-2\\1\\2\\4\\-2}}}

diff --git a/src/section-PEE.xml b/src/section-PEE.xml
--- a/src/section-PEE.xml
+++ b/src/section-PEE.xml
@@ -29,7 +29,7 @@

 We will prove this result by contradiction (). Suppose to the contrary that $S$ is a linearly dependent set. Define $S_i=\set{\vectorlist{x}{i}}$ and let $k$ be an integer such that $S_{k-1}=\set{\vectorlist{x}{k-1}}$ is linearly independent and $S_k=\set{\vectorlist{x}{k}}$ is linearly dependent. Is there even such an integer $k$? First, since eigenvectors are nonzero, the set $\set{\vect{x}_1}$ is linearly independent. Since we are assuming that $S=S_p$ is linearly dependent, there must be an integer $k$, $2\leq k\leq p$, where the sets $S_i$ transition from linear independence to linear dependence (and stay that way). In other words, $\vect{x}_k$ is the vector with the smallest index that is a linear combination of vectors with smaller indices.

-Since $\set{\vectorlist{x}{k}}$ is linearly dependent there are scalars, $\scalarlist{a}{k}$, some non-zero (), so that
+Since $\set{\vectorlist{x}{k}}$ is a linearly dependent set there must be scalars, $\scalarlist{a}{k}$, not all zero (), so that
 \zerovector=\lincombo{a}{x}{k}
@@ -43,55 +43,47 @@
 \text{}\\
-\left(A-\lambda_kI_n\right)a_2\vect{x}_2+\cdots+\left(A-\lambda_kI_n\right)a_k\vect{x}_k
 \text{}\\
-a_2\left(A-\lambda_kI_n\right)\vect{x}_2+\cdots+a_k\left(A-\lambda_kI_n\right)\vect{x}_k
 \text{}\\
-a_2\left(A\vect{x}_2-\lambda_kI_n\vect{x}_2\right)+\cdots+a_k\left(A\vect{x}_k-\lambda_kI_n\vect{x}_k\right)
 \text{}\\
-a_2\left(A\vect{x}_2-\lambda_k\vect{x}_2\right)+\cdots+a_k\left(A\vect{x}_k-\lambda_k\vect{x}_k\right)
 \text{}\\
-a_2\left(\lambda_2\vect{x}_2-\lambda_k\vect{x}_2\right)+\cdots+a_k\left(\lambda_k\vect{x}_k-\lambda_k\vect{x}_k\right)
 \text{}\\
-a_2\left(\lambda_2-\lambda_k\right)\vect{x}_2+\cdots+a_k\left(\lambda_k-\lambda_k\right)\vect{x}_k
 \text{}\\
-a_2\left(\lambda_2-\lambda_k\right)\vect{x}_2+\cdots+
+a_{k-1}\left(\lambda_{k-1}-\lambda_k\right)\vect{x}_{k-1}+
 a_k\left(0\right)\vect{x}_k
 \text{}\\
-a_2\left(\lambda_2-\lambda_k\right)\vect{x}_2+\cdots+
 a_{k-1}\left(\lambda_{k-1}-\lambda_k\right)\vect{x}_{k-1}+
 \zerovector
 \text{}\\
-a_2\left(\lambda_2-\lambda_k\right)\vect{x}_2+\cdots+
 a_{k-1}\left(\lambda_{k-1}-\lambda_k\right)\vect{x}_{k-1}
 \text{}

-This is a relation of linear dependence on the linearly independent set $\set{\vectorlist{x}{k-1}}$, so the scalars must all be zero. That is, $a_i\left(\lambda_i-\lambda_k\right)=0$ for $1\leq i\leq k-1$. However, we have the hypothesis that the eigenvalues are distinct, so $\lambda_i\neq\lambda_k$ for $1\leq i\leq k-1$. Thus $a_i=0$ for $1\leq i\leq k-1$.
+This equation is a relation of linear dependence on the linearly independent set $\set{\vectorlist{x}{k-1}}$, so the scalars must all be zero. That is, $a_i\left(\lambda_i-\lambda_k\right)=0$ for $1\leq i\leq k-1$. However, we have the hypothesis that the eigenvalues are distinct, so $\lambda_i\neq\lambda_k$ for $1\leq i\leq k-1$. Thus $a_i=0$ for $1\leq i\leq k-1$.

This reduces the original relation of linear dependence on $\set{\vectorlist{x}{k}}$ to the simpler equation $a_k\vect{x}_k=\zerovector$. By we conclude that $a_k=0$ or $\vect{x}_k=\zerovector$. Eigenvectors are never the zero vector (), so $a_k=0$. So all of the scalars $a_i$, $1\leq i\leq k$ are zero, contradicting their introduction as the scalars creating a nontrivial relation of linear dependence on the set $\set{\vectorlist{x}{k}}$. With a contradiction in hand, we conclude that $S$ must be linearly independent.
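The proof concluded above shows that eigenvectors belonging to distinct eigenvalues form a linearly independent set. As a numerical sanity check of this theorem, the sketch below uses an illustrative matrix (chosen here, not taken from the text) with three distinct eigenvalues and verifies that the matrix of eigenvectors has full rank:

```python
import numpy as np

# Upper-triangular, so the eigenvalues 1, 3, -2 can be read off the diagonal.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, -2.0]])
evals, evecs = np.linalg.eig(A)

# Three distinct eigenvalues ...
assert len(set(np.round(evals, 8))) == 3
# ... so the three eigenvectors (columns of evecs) are linearly independent.
assert np.linalg.matrix_rank(evecs) == 3
```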

@@ -478,7 +470,7 @@
 Number of Eigenvalues of a Matrix
-Suppose that $A$ is a square matrix of size $n$ with distinct eigenvalues $\scalarlist{\lambda}{k}$. Then
+Suppose that $\scalarlist{\lambda}{k}$ are the distinct eigenvalues of a square matrix $A$ of size $n$. Then
 \sum_{i=1}^{k}\algmult{A}{\lambda_i}=n

diff --git a/src/section-SD.xml b/src/section-SD.xml
--- a/src/section-SD.xml
+++ b/src/section-SD.xml
@@ -53,7 +53,7 @@
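The theorem restated in the hunk above says the algebraic multiplicities of the distinct eigenvalues of a size-$n$ matrix sum to $n$. A sketch of that count, on an illustrative $4\times 4$ matrix with a repeated eigenvalue (an assumption; not a matrix from the text):

```python
import sympy as sp

# Illustrative 4x4 matrix: eigenvalue 5 repeated, plus 3 and 1.
A = sp.Matrix([[5, 1, 0, 0],
               [0, 5, 0, 0],
               [0, 0, 3, 0],
               [0, 0, 0, 1]])

mults = A.eigenvals()   # {eigenvalue: algebraic multiplicity}
# The multiplicities total the size of the matrix.
assert sum(mults.values()) == A.rows
```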

 Check that $S$ is nonsingular and then compute
-
+
 \begin{bmatrix}
@@ -404,7 +404,8 @@

 Then consider,
-
+
+
 \text{}\\
@@ -534,11 +535,12 @@

 We next show that $S$ is a linearly independent set. So we will begin with a relation of linear dependence on $S$, using doubly-subscripted scalars and eigenvectors,
-
-\left(a_{11}\vect{x}_{11}+a_{12}\vect{x}_{12}+\cdots+a_{1\geomult{A}{\lambda_1}}\vect{x}_{1\geomult{A}{\lambda_1}}\right)+
-\left(a_{21}\vect{x}_{21}+a_{22}\vect{x}_{22}+\cdots+a_{2\geomult{A}{\lambda_2}}\vect{x}_{2\geomult{A}{\lambda_2}}\right)\\
-
-\left(a_{k1}\vect{x}_{k1}+a_{k2}\vect{x}_{k2}+\cdots+a_{k\geomult{A}{\lambda_k}}\vect{x}_{k\geomult{A}{\lambda_k}}\right)
+\zerovector=
+
+
+
+

@@ -769,11 +771,9 @@
 and so is a $5\times 5$ matrix with 5 distinct eigenvalues.

-By we know $H$ must be diagonalizable. But just for practice, we exhibit the diagonalization itself. The matrix $S$ contains eigenvectors of $H$ as columns, one from each eigenspace, guaranteeing linear independent columns and thus the nonsingularity of $S$. The diagonal matrix has the eigenvalues of $H$ in the same order that their respective eigenvectors appear as the columns of $S$. Notice that we are using the versions of the eigenvectors from that have integer entries.
+By we know $H$ must be diagonalizable. But just for practice, we exhibit a diagonalization. The matrix $S$ contains eigenvectors of $H$ as columns, one from each eigenspace, guaranteeing linearly independent columns and thus the nonsingularity of $S$. Notice that we are using the versions of the eigenvectors from that have integer entries. The diagonal matrix has the eigenvalues of $H$ in the same order that their respective eigenvectors appear as the columns of $S$. With these matrices, verify computationally that $\similar{H}{S}=D$.
-
-
-\inverse{
+
 \begin{bmatrix}
@@ -781,44 +781,7 @@
 \end{bmatrix}
-}
-\begin{bmatrix}
-\end{bmatrix}
-\begin{bmatrix}
-\end{bmatrix}\\
-
-\begin{bmatrix}
-\end{bmatrix}
-\begin{bmatrix}
-\end{bmatrix}
-\begin{bmatrix}
-\end{bmatrix}\\
-
+
 \begin{bmatrix}
@@ -827,6 +790,7 @@
 \end{bmatrix}
+Note that there are many different ways to diagonalize $H$. We could replace eigenvectors by nonzero scalar multiples, or we could rearrange the order of the eigenvectors as the columns of $S$ (which would subsequently reorder the eigenvalues along the diagonal of $D$).
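The new paragraph above invites the reader to verify $\similar{H}{S}=D$ computationally, and notes that scaling eigenvectors leaves a diagonalization valid. Since $H$ itself is not reproduced in this hunk, the sketch below uses a small stand-in matrix to demonstrate both points with SymPy:

```python
import sympy as sp

# Illustrative diagonalizable matrix (the 5x5 matrix H is not in this hunk);
# its eigenvalues are 2 and 5.
A = sp.Matrix([[4, 1],
               [2, 3]])

# Columns of S are eigenvectors of A; D carries the eigenvalues on its diagonal.
S, D = A.diagonalize()
assert S.inv() * A * S == D   # the similarity S^{-1} A S = D holds

# Replacing an eigenvector by a nonzero scalar multiple still diagonalizes A.
S2 = S * sp.diag(2, 1)        # scale the first column of S by 2
assert S2.inv() * A * S2 == D
```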

@@ -860,7 +824,7 @@
 we find
-D=\similar{A}{S}
+
 \begin{bmatrix}