The set of standard unit vectors for $\complex{m}$ (
Now, to illustrate
We want to prove that any set of $t+1$ or more vectors from $V$ is linearly dependent. So we will begin with a totally arbitrary set of vectors from $V$, $R=\set{\vectorlist{u}{m}}$, where $m>t$. We will now construct a nontrivial relation of linear dependence on $R$.
Each vector $\vectorlist{u}{m}$ can be written as a linear combination of the vectors $\vectorlist{v}{t}$ since $S$ is a spanning set of $V$. This means there exist scalars $a_{ij}$, $1\leq i\leq t$, $1\leq j\leq m$, so that
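The display that follows is elided; a sketch consistent with the scalars and indices just introduced, expressing each $\vect{u}_j$ as a linear combination of the spanning set, would be
\begin{align*}
\vect{u}_j&=a_{1j}\vect{v}_1+a_{2j}\vect{v}_2+a_{3j}\vect{v}_3+\cdots+a_{tj}\vect{v}_t&&1\leq j\leq m
\end{align*}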
Suppose that $A$ is a square matrix of size $n$. Then for $1\leq i\leq n$
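The statement here is truncated; it is expansion of the determinant about row $i$. A sketch, assuming the text's notation $\matrixentry{A}{ij}$ for the entry in row $i$, column $j$, and $\submatrix{A}{i}{j}$ for the submatrix obtained by deleting row $i$ and column $j$:
\begin{align*}
\detname{A}&=
(-1)^{i+1}\matrixentry{A}{i1}\detname{\submatrix{A}{i}{1}}+
(-1)^{i+2}\matrixentry{A}{i2}\detname{\submatrix{A}{i}{2}}+\cdots+
(-1)^{i+n}\matrixentry{A}{in}\detname{\submatrix{A}{i}{n}}
\end{align*}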
Now,
Suppose that $A$ is a square matrix of size $n$. Then for $1\leq j\leq n$
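This companion statement is also truncated; it is expansion of the determinant about column $j$. A sketch in the same assumed notation:
\begin{align*}
\detname{A}&=
(-1)^{1+j}\matrixentry{A}{1j}\detname{\submatrix{A}{1}{j}}+
(-1)^{2+j}\matrixentry{A}{2j}\detname{\submatrix{A}{2}{j}}+\cdots+
(-1)^{n+j}\matrixentry{A}{nj}\detname{\submatrix{A}{n}{j}}
\end{align*}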
In the vector space of polynomials with degree 4 or less, $P_4$ (
Is this set of vectors linearly independent or dependent? Consider that
Using our definitions of vector addition and scalar multiplication in $P_4$ (
Any solution to this system of equations will provide the linear combination we need to determine if $r\in\spn{S}$, but we need to be convinced there is a solution for any values of $a,\,b,\,c,\,d,\,e$ that qualify $r$ to be a member of $W$. So the question is: is this system of equations consistent? We will form the augmented matrix and row-reduce. (We probably need to do this by hand, since the matrix is symbolic.)
For your results to match our first matrix, you may find it necessary to multiply the final row of your row-reduced matrix by the appropriate scalar, and/or add multiples of this row to some of the other rows. To obtain the second version of the matrix, the last entry of the last column has been simplified to zero according to the one condition we were able to impose on an arbitrary polynomial from $W$. So with no leading 1's in the last column,
We will act as if this equation is true and try to determine just what $a_1$ and $a_2$ would be (as functions of $x$ and $y$). Recall that our vector space operations are unconventional and are defined in
We could chase through the above implications backwards and take the existence of these solutions as sufficient evidence for $R$ being a spanning set for $C$. Instead, let us view the above as simply scratchwork and now get serious with a simple direct proof that $R$ is a spanning set. Ready? Suppose $(x,\,y)$ is any vector from $C$, then compute the following linear combination using the definitions of the operations in $C$,
That $\vect{w}$ can be written as a linear combination of the vectors in $B$ follows from the spanning property of the set (
Assume there are two different linear combinations of $\{\vectorlist{v}{m}\}$ that equal the vector $\vect{w}$. In other words there exist scalars $a_1,\,a_2,\,a_3,\,\ldots,\,a_m$ and $b_1,\,b_2,\,b_3,\,\ldots,\,b_m$ so that
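The two combinations and the step that compares them are elided; a sketch of the usual argument, subtracting one expression for $\vect{w}$ from the other:
\begin{align*}
\zerovector=\vect{w}-\vect{w}
&=\left(a_1\vect{v}_1+a_2\vect{v}_2+\cdots+a_m\vect{v}_m\right)-\left(b_1\vect{v}_1+b_2\vect{v}_2+\cdots+b_m\vect{v}_m\right)\\
&=(a_1-b_1)\vect{v}_1+(a_2-b_2)\vect{v}_2+\cdots+(a_m-b_m)\vect{v}_m
\end{align*}
Assuming the set $\{\vectorlist{v}{m}\}$ is linearly independent (as for a basis), this relation of linear dependence must be trivial, so $a_i=b_i$ for each $i$ and the two combinations are identical.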
We will perform a sequence of row operations on this matrix, shooting for an upper triangular matrix, whose determinant will be simply the product of its diagonal entries. For each row operation, we will track the effect on the determinant via
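The effects being tracked are the standard ones; as a summary sketch, if $B$ is obtained from the square matrix $A$ by a single row operation, then
\begin{align*}
\detname{B}&=-\detname{A}&&\text{swap two rows}\\
\detname{B}&=\alpha\detname{A}&&\text{multiply a row by the scalar }\alpha\\
\detname{B}&=\detname{A}&&\text{add a multiple of one row to another row}
\end{align*}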
It is amazing that matrix multiplication and the determinant interact this way. Might it also be true that $\detname{A+B}=\detname{A}+\detname{B}$? (
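A quick $2\times 2$ experiment is instructive before settling the question in general; taking $A=B=I_2$,
\begin{align*}
\detname{A+B}&=\detname{2I_2}=4&\detname{A}+\detname{B}&=1+1=2
\end{align*}
so the determinant is not additive for this choice of $A$ and $B$.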
Set: $P_n$, the set of all polynomials of degree $n$ or less in the variable $x$ with coefficients from $\complex{\null}$.
Equality:

Vector Addition:
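The definitions themselves are elided here; they would typically be coefficientwise. A sketch, writing $p(x)=a_0+a_1x+\cdots+a_nx^n$ and $q(x)=b_0+b_1x+\cdots+b_nx^n$:
\begin{align*}
p(x)&=q(x)\text{ if and only if }a_i=b_i\text{ for }0\leq i\leq n\\
p(x)+q(x)&=(a_0+b_0)+(a_1+b_1)x+(a_2+b_2)x^2+\cdots+(a_n+b_n)x^n
\end{align*}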