<![CDATA[3x_1+x_2+x_4-3x_5&=0\\]]>
<![CDATA[-2x_1+7x_2-5x_3+2x_4+2x_5&=-3]]>
+has coefficient matrix and vector of constants
<![CDATA[2 & 4 & -3 & 5 & 1\\]]>
<![CDATA[3 & 1 & 0 & 1 & -3\\]]>
<![CDATA[-2 & 7 & -5 & 2 & 2]]>
-\vect{b}=\colvector{9\\0\\-3}
+&\vect{b}&=\colvector{9\\0\\-3}
and so will be described compactly by the vector equation $A\vect{x}=\vect{b}$.
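As a quick computational sketch (plain Python; the helper `matvec` and the test vector `x` are illustrative, not from the text), each entry of $A\vect{x}$ is a row of $A$ dotted with $\vect{x}$:

```python
# Entry i of A*x is the sum over j of A[i][j] * x[j].
def matvec(A, x):
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

# Coefficient matrix from the system above.
A = [[ 2, 4, -3, 5,  1],
     [ 3, 1,  0, 1, -3],
     [-2, 7, -5, 2,  2]]

# A hypothetical choice of x, just to exercise the product:
# the standard unit vector e_1 picks out the first column of A.
x = [1, 0, 0, 0, 0]

print(matvec(A, x))  # [2, 3, -2], the first column of A
```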
<p>If a spreadsheet were used to make these computations, a row of weights would be entered somewhere near the table of data and the formulas in the spreadsheet would effect a matrix-vector product. This example is meant to illustrate how <q>linear</q> computations (addition, multiplication) can be organized as a matrix-vector product.</p>
-<p>Another example would be the matrix of numerical scores on examinations and exercises for students in a class. The rows would correspond to students and the columns to exams and assignments. The instructor could then assign weights to the different exams and assignments, and via a matrix-vector product, compute a single score for each student.</p>
+<p>Another example would be the matrix of numerical scores on examinations and exercises for students in a class. The rows would be indexed by students and the columns would be indexed by exams and assignments. The instructor could then assign weights to the different exams and assignments, and via a matrix-vector product, compute a single score for each student.</p>
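A small sketch of that grading scheme (the scores and weights below are made up for illustration): each student's final score is that student's row of the score matrix dotted with the weight vector, so all the final scores together form one matrix-vector product.

```python
# Rows indexed by students; columns indexed by two exams and one
# assignment (hypothetical data).
scores = [[80, 90, 70],
          [60, 75, 95]]

# Instructor-chosen weights for exam 1, exam 2, assignment.
weights = [0.4, 0.4, 0.2]

# The matrix-vector product: one weighted total per student.
totals = [sum(s * w for s, w in zip(row, weights)) for row in scores]
print(totals)  # [82.0, 73.0]
```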
-<p>We are assuming $A\vect{x}=B\vect{x}$ for all $\vect{x}\in\complex{n}$, so we can employ this equality for <em>any</em> choice of the vector $\vect{x}$. However, we'll limit our use of this equality to the standard unit vectors, $\vect{e}_j$, $1\leq j\leq n$ (<acroref type="definition" acro="SUV" />). For all $1\leq j\leq n$, $1\leq i\leq m$,
+<p>We are assuming $A\vect{x}=B\vect{x}$ for all $\vect{x}\in\complex{n}$, so we can employ this equality for <em>any</em> choice of the vector $\vect{x}$. However, we will limit our use of this equality to the standard unit vectors, $\vect{e}_j$, $1\leq j\leq n$ (<acroref type="definition" acro="SUV" />). For all $1\leq j\leq n$, $1\leq i\leq m$,
<![CDATA[&\matrixentry{A}{ij}\\]]>
-<p>Is this the definition of matrix multiplication you expected? Perhaps our previous operations for matrices caused you to think that we might multiply two matrices of the <em>same</em> size, <em>entry-by-entry</em>? Notice that our current definition uses matrices of different sizes (though the number of columns in the first must equal the number of rows in the second), and the result is of a third size. Notice too in the previous example that we cannot even consider the product $BA$, since the sizes of the two matrices in this order aren't right.</p>
+<p>Is this the definition of matrix multiplication you expected? Perhaps our previous operations for matrices caused you to think that we might multiply two matrices of the <em>same</em> size, <em>entry-by-entry</em>? Notice that our current definition uses matrices of different sizes (though the number of columns in the first must equal the number of rows in the second), and the result is of a third size. Notice too in the previous example that we cannot even consider the product $BA$, since the sizes of the two matrices in this order are not right.</p>
-<p>But it gets weirder than that. Many of your old ideas about <q>multiplication</q> won't apply to matrix multiplication, but some still will. So make no assumptions, and don't do anything until you have a theorem that says you can. Even if the sizes are right, matrix multiplication is not commutative <mdash /> order matters.</p>
+<p>But it gets weirder than that. Many of your old ideas about <q>multiplication</q> will not apply to matrix multiplication, but some still will. So make no assumptions, and do not do anything until you have a theorem that says you can. Even if the sizes are right, matrix multiplication is not commutative <mdash /> order matters.</p>
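As a concrete check (a Python sketch with two made-up $2\times 2$ matrices), even when both products are defined and have the same size, $AB$ and $BA$ can differ:

```python
# Entry-by-entry matrix multiplication for small square matrices:
# entry (i, j) of AB is the sum over k of A[i][k] * B[k][j].
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2],
     [0, 1]]
B = [[1, 0],
     [3, 1]]

print(matmul(A, B))  # [[7, 2], [3, 1]]
print(matmul(B, A))  # [[1, 2], [3, 7]]  -- not the same: AB != BA
```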
<example acro="MMNC" index="matrix multiplication!noncommutative">
<title>Matrix multiplication is not commutative</title>
<title>Matrix Multiplication, Entry-by-Entry</title>
-<p>While certain <q>natural</q> properties of multiplication don't hold, many more do. In the next subsection, we'll state and prove the relevant theorems. But first, we need a theorem that provides an alternate means of multiplying two matrices. In many texts, this would be given as the <em>definition</em> of matrix multiplication. We prefer to turn it around and have the following formula as a consequence of our definition. It will prove useful for proofs of matrix equality, where we need to examine products of matrices, entry-by-entry.</p>
+<p>While certain <q>natural</q> properties of multiplication do not hold, many more do. In the next subsection, we will state and prove the relevant theorems. But first, we need a theorem that provides an alternate means of multiplying two matrices. In many texts, this would be given as the <em>definition</em> of matrix multiplication. We prefer to turn it around and have the following formula as a consequence of our definition. It will prove useful for proofs of matrix equality, where we need to examine products of matrices, entry-by-entry.</p>
<theorem acro="EMP" index="matrix multiplication!entry-by-entry">
<title>Entries of Matrix Products</title>
<![CDATA[=&(0)(2)+(-4)(3)+(1)(2)+(2)(-1)+(3)(3)=-3]]>
-<p>Notice how there are 5 terms in the sum, since 5 is the common dimension of the two matrices (column count for $A$, row count for $B$). In the conclusion of <acroref type="theorem" acro="EMP" />, it would be the index $k$ that would run from 1 to 5 in this computation. Here's a bit more practice.</p>
+<p>Notice how there are 5 terms in the sum, since 5 is the common dimension of the two matrices (column count for $A$, row count for $B$). In the conclusion of <acroref type="theorem" acro="EMP" />, it would be the index $k$ that would run from 1 to 5 in this computation. Here is a bit more practice.</p>
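That single-entry computation can be checked directly (a Python sketch; the row of $A$ and column of $B$ are copied from the sum above):

```python
# Row of A and column of B used in the entry computation above.
row_of_A = [0, -4, 1, 2, 3]
col_of_B = [2,  3, 2, -1, 3]

# The entry of the product is the sum of the 5 term-by-term products,
# the index k running over the common dimension of the two matrices.
entry = sum(a * b for a, b in zip(row_of_A, col_of_B))
print(entry)  # -3
```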
<p>The entry in the third row, first column:
conjugation (<acroref type="theorem" acro="MMCC" />),
the transpose (<acroref type="definition" acro="TM" />).
-Whew! Here we go. These are great proofs to practice with, so try to concoct the proofs before reading them, they'll get progressively more complicated as we go.</p>
+Whew! Here we go. These are great proofs to practice with, so try to concoct the proofs before reading them, they will get progressively more complicated as we go.</p>
<theorem acro="MMZM" index="matrix multiplication!zero matrix">
<title>Matrix Multiplication and the Zero Matrix</title>
-<p>We'll prove (1) and leave (2) to you. Entry-by-entry, for $1\leq i\leq m$, $1\leq j\leq p$,
+<p>We will prove (1) and leave (2) to you. Entry-by-entry, for $1\leq i\leq m$, $1\leq j\leq p$,
\matrixentry{A\zeromatrix_{n\times p}}{ij}
<![CDATA[&=\sum_{k=1}^{n}\matrixentry{A}{ik}\matrixentry{\zeromatrix_{n\times p}}{kj}]]>
-<p>Again, we'll prove (1) and leave (2) to you. Entry-by-entry, For $1\leq i\leq m$, $1\leq j\leq n$,
+<p>Again, we will prove (1) and leave (2) to you. Entry-by-entry, for $1\leq i\leq m$, $1\leq j\leq n$,
<![CDATA[\matrixentry{AI_n}{ij}=&]]>
\sum_{k=1}^{n}\matrixentry{A}{ik}\matrixentry{I_n}{kj}
-<p>We'll do (1), you do (2). Entry-by-entry, for $1\leq i\leq m$, $1\leq j\leq p$,
+<p>We will do (1), you do (2). Entry-by-entry, for $1\leq i\leq m$, $1\leq j\leq p$,
-<p>These are equalities of matrices. We'll do the first one, the second is similar and will be good practice for you. For $1\leq i\leq m$, $1\leq j\leq p$,
+<p>These are equalities of matrices. We will do the first one, the second is similar and will be good practice for you. For $1\leq i\leq m$, $1\leq j\leq p$,
\matrixentry{\alpha(AB)}{ij}
<![CDATA[&=\alpha\matrixentry{AB}{ij}&&]]><acroref type="definition" acro="MSM" />\\
-<p>A matrix equality, so we'll go entry-by-entry, no surprise there. For $1\leq i\leq m$, $1\leq j\leq s$,
+<p>A matrix equality, so we will go entry-by-entry, no surprise there. For $1\leq i\leq m$, $1\leq j\leq s$,
<![CDATA[&=\sum_{k=1}^{n}\matrixentry{A}{ik}\matrixentry{BD}{kj}]]>
-<p>Another theorem in this style, and it's a good one. If you've been practicing with the previous proofs you should be able to do this one yourself.</p>
+<p>Another theorem in this style, and it is a good one. If you have been practicing with the previous proofs you should be able to do this one yourself.</p>
<theorem acro="MMT" index="matrix multiplication!transposes">
<title>Matrix Multiplication and Transposes</title>
-<p>This theorem may be surprising but if we check the sizes of the matrices involved, then maybe it will not seem so far-fetched. First, $AB$ has size $m\times p$, so its transpose has size $p\times m$. The product of $\transpose{B}$ with $\transpose{A}$ is a $p\times n$ matrix times an $n\times m$ matrix, also resulting in a $p\times m$ matrix. So at least our objects are compatible for equality (and would not be, in general, if we didn't reverse the order of the matrix multiplication).</p>
+<p>This theorem may be surprising but if we check the sizes of the matrices involved, then maybe it will not seem so far-fetched. First, $AB$ has size $m\times p$, so its transpose has size $p\times m$. The product of $\transpose{B}$ with $\transpose{A}$ is a $p\times n$ matrix times an $n\times m$ matrix, also resulting in a $p\times m$ matrix. So at least our objects are compatible for equality (and would not be, in general, if we did not reverse the order of the matrix multiplication).</p>
<p>Here we go again, entry-by-entry. For $1\leq i\leq m$, $1\leq j\leq p$,
<p>This theorem seems odd at first glance, since we have to switch the order of $A$ and $B$. But if we simply consider the sizes of the matrices involved, we can see that the switch is necessary for this reason alone. That the individual entries of the products then come along to be equal is a bonus.</p>
-<p>As the adjoint of a matrix is a composition of a conjugate and a transpose, its interaction with matrix multiplication is similar to that of a transpose. Here's the last of our long list of basic properties of matrix multiplication.</p>
+<p>As the adjoint of a matrix is a composition of a conjugate and a transpose, its interaction with matrix multiplication is similar to that of a transpose. Here is the last of our long list of basic properties of matrix multiplication.</p>
<theorem acro="MMAD" index="matrix multiplication!adjoints">
<title>Matrix Multiplication and Adjoints</title>
<p>Notice how none of these proofs above relied on writing out huge general matrices with lots of ellipses (<q><ellipsis /></q>) and trying to formulate the equalities a whole matrix at a time. This messy business is a <q>proof technique</q> to be avoided at all costs. Notice too how the proof of <acroref type="theorem" acro="MMAD" /> does not use an entry-by-entry approach, but simply builds on previous results about matrix multiplication's interaction with conjugation and transposes.</p>
-<p>These theorems, along with <acroref type="theorem" acro="VSPM" /> and the other results in <acroref type="section" acro="MO" />, give you the <q>rules</q> for how matrices interact with the various operations we have defined on matrices (addition, scalar multiplication, matrix multiplication, conjugation, transposes and adjoints). Use them and use them often. But don't try to do anything with a matrix that you don't have a rule for. Together, we would informally call all these operations, and the attendant theorems, <q>the algebra of matrices.</q> Notice, too, that every column vector is just a $n\times 1$ matrix, so these theorems apply to column vectors also. Finally, these results, taken as a whole, may make us feel that the definition of matrix multiplication is not so unnatural.</p>
+<p>These theorems, along with <acroref type="theorem" acro="VSPM" /> and the other results in <acroref type="section" acro="MO" />, give you the <q>rules</q> for how matrices interact with the various operations we have defined on matrices (addition, scalar multiplication, matrix multiplication, conjugation, transposes and adjoints). Use them and use them often. But do not try to do anything with a matrix that you do not have a rule for. Together, we would informally call all these operations, and the attendant theorems, <q>the algebra of matrices.</q> Notice, too, that every column vector is just an $n\times 1$ matrix, so these theorems apply to column vectors also. Finally, these results, taken as a whole, may make us feel that the definition of matrix multiplication is not so unnatural.</p>
<sageadvice acro="PMM" index="matrix multiplication, properties">
<title>Properties of Matrix Multiplication</title>
-<p>So, informally, Hermitian matrices are those that can be tossed around from one side of an inner product to the other with reckless abandon. We'll see later what this buys us.</p>
+<p>So, informally, Hermitian matrices are those that can be tossed around from one side of an inner product to the other with reckless abandon. We will see later what this buys us.</p>
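As a sketch of that <q>tossing around</q> (Python, with a small made-up Hermitian matrix and the inner product $\innerproduct{\vect{u}}{\vect{v}}=\sum_i\conjugate{u_i}v_i$): for a Hermitian $A$ we can move $A$ across the inner product, $\innerproduct{A\vect{x}}{\vect{y}}=\innerproduct{\vect{x}}{A\vect{y}}$.

```python
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def inner(u, v):
    # Conjugate the first argument, as in the text's inner product.
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

# A small Hermitian matrix: equal to its own conjugate transpose
# (hypothetical entries).
A = [[2, 1 - 1j],
     [1 + 1j, 3]]

# Two arbitrary complex vectors.
x = [1 + 2j, -1j]
y = [3, 2 + 1j]

# A moves from one side of the inner product to the other.
assert inner(matvec(A, x), y) == inner(x, matvec(A, y))
print("inner(A x, y) == inner(x, A y) for this Hermitian A")
```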
<exercise type="T" number="23" rough="Theorem MMSMM, part (2)">
<problem contributor="robertbeezer">Prove the second part of <acroref type="theorem" acro="MMSMM" />.
-<solution contributor="robertbeezer">We'll run the proof entry-by-entry.
+<solution contributor="robertbeezer">We will run the proof entry-by-entry.
<![CDATA[\matrixentry{\alpha(AB)}{ij}=&]]>
<![CDATA[\alpha\matrixentry{AB}{ij}&&]]><acroref type="definition" acro="MSM" />\\