# HG changeset patch
# User Rob Beezer
# Date 1352941699 28800
# Node ID 5e80f9993ea51839ddee80b62a4c658227efcd8d
# Parent ad83ec5affedd8a49a34c246bb51f53aae443b91
Line breaks, Chapters VS, D

diff --git a/src/section-B.xml b/src/section-B.xml
--- a/src/section-B.xml
+++ b/src/section-B.xml
@@ -35,8 +35,9 @@
 Standard Unit Vectors are a Basis
-The set of standard unit vectors for $\complex{m}$ (), $B=\set{\vectorlist{e}{m}}=\setparts{\vect{e}_i}{1\leq i\leq m}$ is a basis for the vector space $\complex{m}$.
+The set of standard unit vectors for $\complex{m}$ (),
+$B=\setparts{\vect{e}_i}{1\leq i\leq m}$
+is a basis for the vector space $\complex{m}$.
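As an aside (not part of the patch itself), the claim in this hunk is easy to check numerically. A minimal sketch, assuming numpy is available: for a concrete size $m=4$, the standard unit vectors are the columns of the identity matrix, which has full rank, so they are linearly independent and span $\complex{4}$.

```python
import numpy as np

# Sketch with m = 4: the standard unit vectors e_1, ..., e_m are exactly
# the columns of the m x m identity matrix.
m = 4
E = np.eye(m, dtype=complex)

# Full column rank: the e_i are linearly independent and span C^m,
# hence a basis.
assert np.linalg.matrix_rank(E) == m

# Any vector is the obvious linear combination of the e_i; the scalars
# are just its entries.
w = np.array([2, -3, 1, 4], dtype=complex)
assert np.allclose(E @ w, w)
```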

@@ -115,13 +116,13 @@
 is a spanning set for $W=\setparts{p(x)}{p\in P_4,\ p(2)=0}$. We will now show that $S$ is also linearly independent in $W$. Begin with a relation of linear dependence,
-0+0x+0x^2+0x^3+0x^4
- - + + +
 \left(\alpha_3-8\alpha_4\right)x^3+
 \left(\alpha_2-6\alpha_3+24\alpha_4\right)x^2\\
- +
 \left(\alpha_1-4\alpha_2+12\alpha_3-32\alpha_4\right)x+
 \left(-2\alpha_1+4\alpha_2-8\alpha_3+16\alpha_4\right)
@@ -595,10 +596,10 @@

 Now, to illustrate , choose any vector from $\complex{4}$, say $\vect{w}=\colvector{2\\-3\\1\\4}$, and compute
- - -
-\innerproduct{\vect{w}}{\vect{v}_4}=\frac{6+12i}{\sqrt{119}}
+ + + +

@@ -1012,7 +1013,7 @@
-Find a basis for the subspace $Q$ of $P_2$, defined by $Q = \setparts{p(x) = a + bx + cx^2}{p(0) = 0}$.
+Find a basis for the subspace $Q$ of $P_2$, $Q = \setparts{p(x) = a + bx + cx^2}{p(0) = 0}$.
 If $p(0) = 0$, then $a + b(0) + c(0^2) = 0$, so $a = 0$. Thus, we can write $Q = \setparts{p(x) = bx + cx^2}{b, c\in\complexes}$.
@@ -1021,7 +1022,7 @@
-Find a basis for the subspace $R$ of $P_2$ defined by $R = \setparts{p(x) = a + bx + cx^2}{p'(0) = 0}$, where $p'$ denotes the derivative.
+Find a basis for the subspace $R$ of $P_2$, $R = \setparts{p(x) = a + bx + cx^2}{p'(0) = 0}$, where $p'$ denotes the derivative.
 The derivative of $p(x) = a + bx + cx^2$ is $p^\prime(x) = b + 2cx$. Thus, if $p \in R$, then $p^\prime(0) = b + 2c(0) = 0$,
diff --git a/src/section-D.xml b/src/section-D.xml
--- a/src/section-D.xml
+++ b/src/section-D.xml
@@ -40,7 +40,7 @@
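The two exercises above have natural candidate answers: $\set{x,\,x^2}$ for $Q$ and $\set{1,\,x^2}$ for $R$. A quick sketch outside the patch (assuming numpy) that models $P_2$ by coefficient vectors $(a,\,b,\,c)$ and checks both candidates:

```python
import numpy as np

# P_2 elements as coefficient vectors (a, b, c) for a + b x + c x^2.
def p_at(coeffs, x):
    a, b, c = coeffs
    return a + b * x + c * x ** 2

# Candidate basis for Q = {p : p(0) = 0}: the polynomials x and x^2.
basis_Q = [np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])]
assert all(p_at(v, 0) == 0 for v in basis_Q)                  # each lies in Q
assert np.linalg.matrix_rank(np.column_stack(basis_Q)) == 2   # independent

# Candidate basis for R = {p : p'(0) = 0}: the polynomials 1 and x^2.
# Since p'(x) = b + 2 c x, membership in R means b = 0.
basis_R = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])]
assert all(v[1] == 0 for v in basis_R)                        # p'(0) = b = 0
assert np.linalg.matrix_rank(np.column_stack(basis_R)) == 2   # independent
```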

We want to prove that any set of $t+1$ or more vectors from $V$ is linearly dependent. So we will begin with a totally arbitrary set of vectors from $V$, $R=\set{\vectorlist{u}{m}}$, where $m>t$. We will now construct a nontrivial relation of linear dependence on $R$.
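A concrete illustration of this proof strategy (not part of the patch; a sketch assuming numpy): take $t=3$ and $m=4$, so four vectors living in a space spanned by three vectors, here modeled as $\mathbb{R}^3$. Any such collection must admit a nontrivial relation of linear dependence.

```python
import numpy as np

# Four vectors in R^3 (columns of U): more vectors than the dimension
# of the spanning set, so they must be linearly dependent.
rng = np.random.default_rng(1)
U = rng.integers(-5, 6, size=(3, 4)).astype(float)

# rank(U) <= 3 < 4 columns, so the homogeneous system U x = 0 has a
# nontrivial solution -- the promised relation of linear dependence.
assert np.linalg.matrix_rank(U) < 4

# Extract one such relation from the null space via the SVD and verify it.
_, _, Vt = np.linalg.svd(U)
x = Vt[-1]                       # unit vector, so certainly nonzero
assert np.linalg.norm(x) > 0
assert np.allclose(U @ x, 0)
```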

-

Each vector $\vectorlist{u}{m}$ can be written as a linear combination of $\vectorlist{v}{t}$ since $S$ is a spanning set of $V$. This means there exist scalars $a_{ij}$, $1\leq i\leq t$, $1\leq j\leq m$, so that +

Each vector $\vectorlist{u}{m}$ can be written as a linear combination of the vectors $\vectorlist{v}{t}$ since $S$ is a spanning set of $V$. This means there exist scalars $a_{ij}$, $1\leq i\leq t$, $1\leq j\leq m$, so that
@@ -676,7 +676,7 @@
-The archetypes listed below are matrices, or systems of equations with coefficient matrices. For each, compute the nullity and rank of the matrix. This information is listed for each archetype (along with the number of columns in the matrix, so as to illustrate ), and notice how it could have been computed immediately after the determination of the sets $D$ and $F$ associated with the reduced row-echelon form of the matrix.

+The archetypes listed below are matrices, or systems of equations with coefficient matrices. For each, compute the nullity and rank of the matrix. This information is listed for each archetype (along with the number of columns in the matrix, so as to illustrate ), and notice how it could have been computed immediately after the determination of the sets $D$ and $F$ associated with the reduced row-echelon form of the matrix.
, , ,
diff --git a/src/section-DM.xml b/src/section-DM.xml
--- a/src/section-DM.xml
+++ b/src/section-DM.xml
@@ -613,7 +613,7 @@
 Determinant Expansion about Rows
-

Suppose that $A$ is a square matrix of size $n$. Then +

Suppose that $A$ is a square matrix of size $n$. Then for $1\leq i\leq n$
 (-1)^{i+1}\matrixentry{A}{i1}\detname{\submatrix{A}{i}{1}}+
@@ -621,8 +621,6 @@
 \cdots+
 (-1)^{i+n}\matrixentry{A}{in}\detname{\submatrix{A}{i}{n}}
-
-1\leq i\leq n
 which is known as expansion about row $i$.
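Both this theorem and its column-expansion companion later in this section are easy to spot-check numerically. A sketch (not part of the patch, assuming numpy): expand a random $4\times 4$ matrix about every row and every column and compare with the library determinant. Note the 0-based sign $(-1)^{i+j}$ matches the 1-based $(-1)^{(i+1)+(j+1)}$ in the text.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.integers(-4, 5, size=(4, 4)).astype(float)
n = A.shape[0]

def minor(A, i, j):
    """Determinant of the submatrix with row i and column j removed (0-based)."""
    return np.linalg.det(np.delete(np.delete(A, i, axis=0), j, axis=1))

det = np.linalg.det(A)
for i in range(n):   # expansion about each row i
    assert np.isclose(sum((-1) ** (i + j) * A[i, j] * minor(A, i, j)
                          for j in range(n)), det)
for j in range(n):   # expansion about each column j
    assert np.isclose(sum((-1) ** (i + j) * A[i, j] * minor(A, i, j)
                          for i in range(n)), det)
```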

@@ -655,38 +653,38 @@

 Now,
-\detname{A}
- + +
 \sum_{j=1}^{n}(-1)^{1+j}\matrixentry{A}{1j}\detname{\submatrix{A}{1}{j}}
 \text{}\\
- +
 \sum_{j=1}^{n}(-1)^{1+j}\matrixentry{A}{1j}
 \sum_{\substack{1\leq\ell\leq n\\\ell\neq j}}
 (-1)^{i-1+\ell-\epsilon_{\ell j}}\matrixentry{A}{i\ell}\detname{\submatrix{A}{1,i}{j,\ell}}
- - + +
 \sum_{j=1}^{n}\sum_{\substack{1\leq\ell\leq n\\\ell\neq j}}
 (-1)^{j+i+\ell-\epsilon_{\ell j}}
 \matrixentry{A}{1j}\matrixentry{A}{i\ell}\detname{\submatrix{A}{1,i}{j,\ell}}
 \text{}\\
- +
 \sum_{\ell=1}^{n}\sum_{\substack{1\leq j\leq n\\j\neq\ell}}
 (-1)^{j+i+\ell-\epsilon_{\ell j}}
 \matrixentry{A}{1j}\matrixentry{A}{i\ell}\detname{\submatrix{A}{1,i}{j,\ell}}
 \text{}\\
- +
 \sum_{\ell=1}^{n}(-1)^{i+\ell}\matrixentry{A}{i\ell}
 \sum_{\substack{1\leq j\leq n\\j\neq\ell}}
 (-1)^{j-\epsilon_{\ell j}}
 \matrixentry{A}{1j}\detname{\submatrix{A}{1,i}{j,\ell}}
 \text{}\\
- +
 \sum_{\ell=1}^{n}(-1)^{i+\ell}\matrixentry{A}{i\ell}
 \sum_{\substack{1\leq j\leq n\\j\neq\ell}}
 (-1)^{\epsilon_{\ell j}+j}
 \matrixentry{A}{1j}\detname{\submatrix{A}{i,1}{\ell,j}}
- +
 \sum_{\ell=1}^{n}(-1)^{i+\ell}\matrixentry{A}{i\ell}\detname{\submatrix{A}{i}{\ell}}
 \text{}
@@ -748,7 +746,7 @@
 Determinant Expansion about Columns
-

Suppose that $A$ is a square matrix of size $n$. Then +

Suppose that $A$ is a square matrix of size $n$. Then for $1\leq j\leq n$
 (-1)^{1+j}\matrixentry{A}{1j}\detname{\submatrix{A}{1}{j}}+
@@ -756,8 +754,6 @@
 \cdots+
 (-1)^{n+j}\matrixentry{A}{nj}\detname{\submatrix{A}{n}{j}}
-
-1\leq j\leq n
 which is known as expansion about column $j$.

diff --git a/src/section-LISS.xml b/src/section-LISS.xml
--- a/src/section-LISS.xml
+++ b/src/section-LISS.xml
@@ -47,14 +47,14 @@
 Linear independence in $P_4$
-

In the vector space of polynomials with degree 4 or less, $P_4$ () consider the set
-
-S=\set{

In the vector space of polynomials with degree 4 or less, $P_4$ (), consider the set $S$ below
+
+\set{
 2x^4+3x^3+2x^2-x+10,\,
 -x^4-2x^3+x^2+5x-8,\,
 2x^4+x^3+10x^2+17x-2
-}.
-
+}
+

Is this set of vectors linearly independent or dependent? Consider that
@@ -88,14 +88,9 @@
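The question can be settled numerically as well (a sketch outside the patch, assuming numpy): encode each polynomial of $S$ by its coefficient vector. The resulting matrix has rank 2, so the three polynomials are linearly dependent; indeed $3p_1+4p_2-p_3=0$.

```python
import numpy as np

# The three polynomials of S as coefficient vectors
# (constant, x, x^2, x^3, x^4).
p1 = np.array([10, -1, 2, 3, 2], dtype=float)    # 2x^4+3x^3+2x^2-x+10
p2 = np.array([-8, 5, 1, -2, -1], dtype=float)   # -x^4-2x^3+x^2+5x-8
p3 = np.array([-2, 17, 10, 1, 2], dtype=float)   # 2x^4+x^3+10x^2+17x-2

# Rank 2 with 3 vectors: S is linearly dependent.
S = np.column_stack([p1, p2, p3])
assert np.linalg.matrix_rank(S) == 2

# One nontrivial relation of linear dependence: 3 p1 + 4 p2 - p3 = 0.
assert np.allclose(3 * p1 + 4 * p2 - p3, 0)
```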

 Using our definitions of vector addition and scalar multiplication in $P_4$ (), we arrive at,
-
-\left(3\alpha_1-3\alpha_2+4\alpha_3+2\alpha_4\right)x^4+
-\left(-2\alpha_1+\alpha_2+5\alpha_3-7\alpha_4\right)x^3\\
-
-\left(4\alpha_1+
-2\alpha_3+4\alpha_4\right)x^2+
-\left(6\alpha_1+4\alpha_2+3\alpha_3+2\alpha_4\right)x\\
-
-\left(-\alpha_1+2\alpha_2+\alpha_3+\alpha_4\right).
+ + +

@@ -317,7 +312,7 @@
 and then massage it to a point where we can apply the definition of equality in $C$. Recall that the definitions of vector addition and scalar multiplication in $C$ are not what you would expect.
-(-1,\,-1)
+
 \text{}\\
@@ -410,15 +405,15 @@

 Any solution to this system of equations will provide the linear combination we need to determine if $r\in\spn{S}$, but we need to be convinced there is a solution for any values of $a,\,b,\,c,\,d,\,e$ that qualify $r$ to be a member of $W$. So the question is: is this system of equations consistent? We will form the augmented matrix, and row-reduce. (We probably need to do this by hand; since the matrix is symbolic, reversing the order of the first four rows is the best way to start.) We obtain a matrix in reduced row-echelon form
-
-\begin{bmatrix}
+ +
-\end{bmatrix}
-=
+\end{bmatrix}\\
+
 \begin{bmatrix}
@@ -426,7 +421,7 @@
 \end{bmatrix}
- +

For your results to match our first matrix, you may find it necessary to multiply the final row of your row-reduced matrix by the appropriate scalar, and/or add multiples of this row to some of the other rows. To obtain the second version of the matrix, the last entry of the last column has been simplified to zero according to the one condition we were able to impose on an arbitrary polynomial from $W$. So with no leading 1's in the last column, tells us this system is consistent. Therefore, any polynomial from $W$ can be written as a linear combination of the polynomials in $S$, so $W\subseteq\spn{S}$. Therefore, $W=\spn{S}$ and $S$ is a spanning set for $W$ by .

@@ -504,14 +499,12 @@

-

We will act as if this equation is true and try to determine just what $a_1$ and $a_2$ would be (as functions of $x$ and $y$). +

We will act as if this equation is true and try to determine just what $a_1$ and $a_2$ would be (as functions of $x$ and $y$). Recall that our vector space operations are unconventional and are defined in . - - + - - +

@@ -543,7 +536,7 @@

We could chase through the above implications backwards and take the existence of these solutions as sufficient evidence for $R$ being a spanning set for $C$. Instead, let us view the above as simply scratchwork and now get serious with a simple direct proof that $R$ is a spanning set. Ready? Suppose $(x,\,y)$ is any vector from $C$, then compute the following linear combination using the definitions of the operations in $C$, - + @@ -597,7 +590,7 @@

That $\vect{w}$ can be written as a linear combination of the vectors in $B$ follows from the spanning property of the set (). This is good, but not the meat of this theorem. We now know that for any choice of the vector $\vect{w}$ there exist some scalars that will create $\vect{w}$ as a linear combination of the basis vectors. The real question is: Is there more than one way to write $\vect{w}$ as a linear combination of $\{\vectorlist{v}{m}\}$? Are the scalars $a_1,\,a_2,\,a_3,\,\ldots,\,a_m$ unique? ()

-

Assume there are two ways to express $\vect{w}$ as a linear combination of $\{\vectorlist{v}{m}\}$. In other words there exist scalars $a_1,\,a_2,\,a_3,\,\ldots,\,a_m$ and $b_1,\,b_2,\,b_3,\,\ldots,\,b_m$ so that +

Assume there are two different linear combinations of $\{\vectorlist{v}{m}\}$ that equal the vector $\vect{w}$. In other words there exist scalars $a_1,\,a_2,\,a_3,\,\ldots,\,a_m$ and $b_1,\,b_2,\,b_3,\,\ldots,\,b_m$ so that diff --git a/src/section-PDM.xml b/src/section-PDM.xml --- a/src/section-PDM.xml +++ b/src/section-PDM.xml @@ -219,6 +219,14 @@
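The argument then finishes in the standard way. Not part of the patch, but for reference, a sketch of the remaining steps in the notation above:

```latex
\vect{0}
&=\vect{w}-\vect{w}\\
&=\left(a_1\vect{v}_1+a_2\vect{v}_2+\cdots+a_m\vect{v}_m\right)
 -\left(b_1\vect{v}_1+b_2\vect{v}_2+\cdots+b_m\vect{v}_m\right)\\
&=\left(a_1-b_1\right)\vect{v}_1+\left(a_2-b_2\right)\vect{v}_2+\cdots+\left(a_m-b_m\right)\vect{v}_m
```

Since the basis is a linearly independent set, the only relation of linear dependence on it is trivial, so $a_i-b_i=0$ for $1\leq i\leq m$; that is, $a_i=b_i$ and the two representations coincide.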

 We will perform a sequence of row operations on this matrix, shooting for an upper triangular matrix, whose determinant will be simply the product of its diagonal entries. For each row operation, we will track the effect on the determinant via , , .
+
+\begin{bmatrix}
+
+
+
+
+\end{bmatrix}
+\\
 \xrightarrow{\rowopswap{1}{2}}
 \begin{bmatrix}
@@ -228,7 +236,7 @@
 \end{bmatrix}
-\text{}\\
+\text{}\\
 \xrightarrow{\rowopadd{-2}{1}{2}}
 \begin{bmatrix}
@@ -603,7 +611,7 @@
-

It is amazing that matrix multiplication and the determinant interact this way. Might it also be true that $\detname{A+B}=\detname{A}+\detname{B}$? (See .)

+

It is amazing that matrix multiplication and the determinant interact this way. Might it also be true that $\detname{A+B}=\detname{A}+\detname{B}$? ()
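The answer to the question posed here is no in general, and a tiny numeric check makes the point (a sketch outside the patch, assuming numpy): with $A=B=I_2$ we get $\detname{A+B}=\detname{2I_2}=4$ while $\detname{A}+\detname{B}=2$, even though the multiplicative property holds.

```python
import numpy as np

A = np.eye(2)   # 2 x 2 identity
B = np.eye(2)

# det(AB) = det(A) det(B) holds for all square A, B of the same size.
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))

# But determinants are not additive: det(A + B) = det(2I) = 4,
# while det(A) + det(B) = 1 + 1 = 2.
assert np.isclose(np.linalg.det(A + B), 4.0)
assert not np.isclose(np.linalg.det(A) + np.linalg.det(B),
                      np.linalg.det(A + B))
```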

Nonsingular Matrices, Round 7 diff --git a/src/section-VS.xml b/src/section-VS.xml --- a/src/section-VS.xml +++ b/src/section-VS.xml @@ -129,10 +129,10 @@

Set: $P_n$, the set of all polynomials of degree $n$ or less in the variable $x$ with coefficients from $\complex{\null}$.

 Equality:
-
-a_0+a_1x+a_2x^2+\cdots+a_nx^n=b_0+b_1x+b_2x^2+\cdots+b_nx^n
-\text{ if and only if }a_i=b_i\text{ for }0\leq i\leq n
-

+ + + +
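The equality rule in this hunk, coefficient-by-coefficient agreement, is exactly componentwise equality once a polynomial is modeled by its coefficient vector. A minimal sketch outside the patch, assuming numpy, with $n=2$:

```python
import numpy as np

# Model elements of P_n (here n = 2) by coefficient vectors (a_0, a_1, a_2);
# two polynomials are equal exactly when all corresponding coefficients agree.
p = np.array([1.0, 2.0, 3.0])   # 1 + 2x + 3x^2
q = np.array([1.0, 2.0, 3.0])   # the same polynomial
r = np.array([1.0, 2.0, 4.0])   # differs in the x^2 coefficient

assert np.array_equal(p, q)       # a_i = b_i for 0 <= i <= n
assert not np.array_equal(p, r)   # one coefficient differs, so not equal
```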