# fcla / src / section-LISS.xml

Linear Independence and Spanning Sets

A vector space is defined as a set with two operations, meeting ten properties (). Just as the definition of the span of a set of vectors only required knowing how to add vectors and how to multiply vectors by scalars, so it is with linear independence. A definition of a linearly independent set of vectors in an arbitrary vector space only requires knowing how to form linear combinations and equating these with the zero vector. Since every vector space must have a zero vector (), we always have a zero vector at our disposal.

In this section we will also put a twist on the notion of the span of a set of vectors. Rather than beginning with a set of vectors and creating a subspace that is the span, we will instead begin with a subspace and look for a set of vectors whose span equals the subspace.

The combination of linear independence and spanning will be very important going forward.

Linear Independence

Our previous definition of linear independence () employed a relation of linear dependence that was a linear combination on one side of an equality and a zero vector on the other side. As a linear combination in a vector space () depends only on vector addition and scalar multiplication, and every vector space must have a zero vector (), we can extend our definition of linear independence from the setting of $\complex{m}$ to the setting of a general vector space $V$ with almost no changes. Compare these next two definitions with their earlier counterparts.

Relation of Linear Dependence

Suppose that $V$ is a vector space. Given a set of vectors $S=\set{\vectorlist{u}{n}}$, an equation of the form $$\lincombo{\alpha}{u}{n}=\zerovector$$ is a relation of linear dependence on $S$. If this equation is formed in a trivial fashion, $\alpha_i=0$, $1\leq i\leq n$, then we say it is a trivial relation of linear dependence on $S$.

Linear Independence

Suppose that $V$ is a vector space. The set of vectors $S=\set{\vectorlist{u}{n}}$ from $V$ is linearly dependent if there is a relation of linear dependence on $S$ that is not trivial. In the case where the only relation of linear dependence on $S$ is the trivial one, then $S$ is a linearly independent set of vectors.

Notice the emphasis on the word only. This might remind you of the definition of a nonsingular matrix, where if the matrix is employed as the coefficient matrix of a homogeneous system then the only solution is the trivial one.

Linear independence in $P_4$

In the vector space of polynomials with degree 4 or less, $P_4$ (), consider the set $S$ below, $$S=\set{2x^4+3x^3+2x^2-x+10,\, -x^4-2x^3+x^2+5x-8,\, 2x^4+x^3+10x^2+17x-2}$$

Is this set of vectors linearly independent or dependent? Consider that
\begin{align*}
&3\left(2x^4+3x^3+2x^2-x+10\right)+4\left(-x^4-2x^3+x^2+5x-8\right)+(-1)\left(2x^4+x^3+10x^2+17x-2\right)\\
&\quad=0x^4+0x^3+0x^2+0x+0=\zerovector
\end{align*}
This is a nontrivial relation of linear dependence () on the set $S$ and so convinces us that $S$ is linearly dependent ().

Now, I hear you say, "Where did those scalars come from?" Do not worry about that right now, just be sure you understand why the above explanation is sufficient to prove that $S$ is linearly dependent. The remainder of the example will demonstrate how we might find these scalars if they had not been provided so readily.
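For readers who want to see where scalars like these come from computationally, the coefficient vectors of the polynomials in $S$ can be made the columns of a matrix whose null space holds every relation of linear dependence. A minimal SymPy sketch (ours, not part of the original text; the name `A` is an assumption):

```python
import sympy as sp

# Columns hold the coefficients (x^4, x^3, x^2, x, 1) of the three
# polynomials in S; a relation of linear dependence on S is exactly
# a vector in the null space of this matrix.
A = sp.Matrix([[ 2, -1,  2],
               [ 3, -2,  1],
               [ 2,  1, 10],
               [-1,  5, 17],
               [10, -8, -2]])

print(A.nullspace())  # [Matrix([[-3], [-4], [1]])]; scale by -1 to get (3, 4, -1)
```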

Let's look at another set of vectors (polynomials) from $P_4$. Let $$T=\set{3x^4-2x^3+4x^2+6x-1,\, -3x^4+1x^3+0x^2+4x+2,\, \ldots,\, 2x^4-7x^3+4x^2+2x+1}$$ be a set of four polynomials from $P_4$.

Suppose we have a relation of linear dependence on this set, $\lincombo{\alpha}{p}{4}=\zerovector$, where $\vect{p}_1,\,\vect{p}_2,\,\vect{p}_3,\,\vect{p}_4$ denote the four polynomials of $T$ in the order listed. Using our definitions of vector addition and scalar multiplication in $P_4$ (), we can collect the left-hand side by powers of $x$. Equating each of the five coefficients with zero, we arrive at a homogeneous system of five equations in the four variables $\alpha_1,\,\alpha_2,\,\alpha_3,\,\alpha_4$.

We form the coefficient matrix of this homogeneous system of equations and row-reduce to find $$\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1\\0&0&0&0\end{bmatrix}$$

We expected the system to be consistent () and so can compute $n-r=4-4=0$, which tells us that the solution is unique. Since this is a homogeneous system, this unique solution is the trivial solution (), $\alpha_1=0$, $\alpha_2=0$, $\alpha_3=0$, $\alpha_4=0$. So the set $T$ is linearly independent ().

A few observations. If we had discovered infinitely many solutions, then we could have used one of the non-trivial solutions to provide a linear combination in the manner we used to show that $S$ was linearly dependent. It is important to realize that it is not interesting that we can create a relation of linear dependence with all-zero scalars; we can always do that. But for $T$, this is the only way to create a relation of linear dependence. It was no accident that we arrived at a homogeneous system of equations in this example; it is related to our use of the zero vector in defining a relation of linear dependence. It is easy to present a convincing statement that a set is linearly dependent (just exhibit a nontrivial relation of linear dependence), but a convincing statement of linear independence requires demonstrating that there is no relation of linear dependence other than the trivial one. Notice how we relied on earlier theorems to provide this demonstration. Whew! There's a lot going on in this example. Spend some time with it; we'll be waiting patiently right here when you get back.

Linear independence in $M_{32}$

Consider the two sets of vectors $R=\set{U_1,\,U_2,\,U_3,\,U_4}$ and $S=\set{V_1,\,V_2,\,V_3,\,V_4}$, each a set of four matrices from the vector space of all $3\times 2$ matrices, $M_{32}$ ().

One set is linearly independent, the other is not. Which is which? Let's examine $R$ first. Build a generic relation of linear dependence (), $$\alpha_1U_1+\alpha_2U_2+\alpha_3U_3+\alpha_4U_4=\zerovector$$

Massaging the left-hand side with our definitions of vector addition and scalar multiplication in $M_{32}$ () we obtain a single $3\times 2$ matrix, each of whose entries is a linear combination of the scalars; three of these entries are $-\alpha_1+3\alpha_2-6\alpha_3+9\alpha_4$, $4\alpha_1-3\alpha_2-5\alpha_4$ and $-6\alpha_1-6\alpha_2-9\alpha_3+5\alpha_4$. This matrix is set equal to the $3\times 2$ zero matrix, $\begin{bmatrix}0&0\\0&0\\0&0\end{bmatrix}$.

Using our definition of matrix equality () and equating corresponding entries we get a homogeneous system of six equations in four variables.

Form the coefficient matrix of this homogeneous system and row-reduce to obtain $$\begin{bmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1\\0&0&0&0\\0&0&0&0\end{bmatrix}$$

Analyzing this matrix we are led to conclude that $\alpha_1=0$, $\alpha_2=0$, $\alpha_3=0$, $\alpha_4=0$. This means there is only a trivial relation of linear dependence on the vectors of $R$ and so we call $R$ a linearly independent set ().

So it must be that $S$ is linearly dependent. Let's see if we can find a non-trivial relation of linear dependence on $S$. We will begin as with $R$, by constructing a relation of linear dependence () with unknown scalars, $$\alpha_1V_1+\alpha_2V_2+\alpha_3V_3+\alpha_4V_4=\zerovector$$

Massaging the left-hand side with our definitions of vector addition and scalar multiplication in $M_{32}$ () we obtain a single $3\times 2$ matrix, each of whose entries is a linear combination of the scalars; three of these entries are $\alpha_3+3\alpha_4$, $-\alpha_1+2\alpha_2+\alpha_3+7\alpha_4$ and $3\alpha_1-6\alpha_2+4\alpha_3$. This matrix is set equal to the $3\times 2$ zero matrix, $\begin{bmatrix}0&0\\0&0\\0&0\end{bmatrix}$.

Using our definition of matrix equality () and equating corresponding entries we get a homogeneous system of six equations in four variables.

Form the coefficient matrix of this homogeneous system and row-reduce to obtain $$\begin{bmatrix}1&-2&0&-4\\0&0&1&3\\0&0&0&0\\0&0&0&0\\0&0&0&0\\0&0&0&0\end{bmatrix}$$

Analyzing this we see that the system is consistent (we expected this since the system is homogeneous, ) and has $n-r=4-2=2$ free variables, namely $\alpha_2$ and $\alpha_4$. This means there are infinitely many solutions, and in particular, we can find a non-trivial solution, so long as we do not pick all of our free variables to be zero. The mere presence of a nontrivial solution for these scalars is enough to conclude that $S$ is a linearly dependent set (). But let's go ahead and explicitly construct a non-trivial relation of linear dependence.

Choose $\alpha_2=1$ and $\alpha_4=-1$. There is nothing special about this choice; there are infinitely many possibilities, some easier than this one, just avoid picking both variables to be zero. Then we find the corresponding dependent variables to be $\alpha_1=-2$ and $\alpha_3=3$. So the relation of linear dependence, $$(-2)V_1+(1)V_2+(3)V_3+(-1)V_4=\begin{bmatrix}0&0\\0&0\\0&0\end{bmatrix}$$ is an iron-clad demonstration that $S$ is linearly dependent. Can you construct another such demonstration?
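Checking linear independence of matrices mechanizes nicely: vectorize each matrix, stack the results as columns, and compute a null space. A SymPy sketch with stand-in data of our own (the entries of $R$ and $S$ are not reproduced above, so the matrices below are hypothetical):

```python
import sympy as sp

def dependence_relations(mats):
    """Stack the vectorized matrices as columns; each null space vector
    is a relation of linear dependence, so an empty list means the set
    is linearly independent."""
    A = sp.Matrix.hstack(*[m.vec() for m in mats])
    return A.nullspace()

# Hypothetical 3x2 matrices, standing in for the sets R and S of the example.
mats = [sp.Matrix([[1, 0], [0, 1], [0, 0]]),
        sp.Matrix([[0, 1], [1, 0], [0, 0]]),
        sp.Matrix([[1, 1], [1, 1], [0, 0]])]  # third = first + second

print(dependence_relations(mats))  # [Matrix([[-1], [-1], [1]])], so dependent
```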

Linearly independent set in the crazy vector space

Is the set $R=\set{(1,\,0),\,(6,\,3)}$ linearly independent in the crazy vector space $C$ ()?

We begin with an arbitrary relation of linear dependence on $R$, $a_1(1,\,0)+a_2(6,\,3)=\zerovector$, and then massage it to a point where we can apply the definition of equality in $C$. Recall that the definitions of vector addition and scalar multiplication in $C$ are not what you would expect: $(x_1,\,y_1)+(x_2,\,y_2)=(x_1+x_2+1,\,y_1+y_2+1)$, $\alpha(x,\,y)=(\alpha x+\alpha-1,\,\alpha y+\alpha-1)$, and $\zerovector=(-1,\,-1)$. Then
\begin{align*}
(-1,\,-1)=\zerovector&=a_1(1,\,0)+a_2(6,\,3)\\
&=(2a_1-1,\,a_1-1)+(7a_2-1,\,4a_2-1)&&\text{Scalar multiplication in $C$}\\
&=(2a_1+7a_2-1,\,a_1+4a_2-1)&&\text{Vector addition in $C$}
\end{align*}

Equality in $C$ () then yields the two equations, $-1=2a_1+7a_2-1$ and $-1=a_1+4a_2-1$, which become the homogeneous system $2a_1+7a_2=0$, $a_1+4a_2=0$.

Since the coefficient matrix of this system is nonsingular (check this!) the system has only the trivial solution $a_1=a_2=0$. By () the set $R$ is linearly independent. Notice that even though the zero vector of $C$ is not what we might have first suspected, a question about linear independence still concludes with a question about a homogeneous system of equations. Hmmm.
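As a check on the arithmetic, the crazy operations can be modeled symbolically; the helper names `cadd` and `csmult` below are ours, encoding the operations exactly as recalled above:

```python
import sympy as sp

# Model of the crazy operations (names ours):
# (x1,y1)+(x2,y2) = (x1+x2+1, y1+y2+1) and a(x,y) = (ax+a-1, ay+a-1).
def cadd(u, v):
    return (u[0] + v[0] + 1, u[1] + v[1] + 1)

def csmult(a, u):
    return (a * u[0] + a - 1, a * u[1] + a - 1)

a1, a2 = sp.symbols('a1 a2')
lhs = cadd(csmult(a1, (1, 0)), csmult(a2, (6, 3)))
print([sp.expand(t) for t in lhs])  # [2*a1 + 7*a2 - 1, a1 + 4*a2 - 1]

# Equating with the zero vector (-1, -1) gives 2*a1 + 7*a2 = 0 and
# a1 + 4*a2 = 0; a nonsingular coefficient matrix forces the trivial solution.
print(sp.Matrix([[2, 7], [1, 4]]).det())  # 1, nonzero
```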

Spanning Sets

In a vector space $V$, suppose we are given a set of vectors $S\subseteq V$. Then we can immediately construct a subspace, $\spn{S}$, using the definition of the span () and then be assured () that the construction does provide a subspace. We now turn the situation upside-down. Suppose we are first given a subspace $W\subseteq V$. Can we find a set $S$ so that $\spn{S}=W$? Typically $W$ is infinite and we are searching for a finite set of vectors $S$ that we can combine in linear combinations to build all of $W$.

I like to think of $S$ as the raw materials that are sufficient for the construction of $W$. If you have nails, lumber, wire, copper pipe, drywall, plywood, carpet, shingles, paint (and a few other things), then you can combine them in many different ways to create a house (or infinitely many different houses for that matter). A fast-food restaurant may have beef, chicken, beans, cheese, tortillas, taco shells and hot sauce and from this small list of ingredients build a wide variety of items for sale. Or maybe a better analogy comes from Ben Cordes: the additive primary colors (red, green and blue) can be combined to create many different colors by varying the intensity of each. The intensity is like a scalar multiple, and the combination of the three intensities is like vector addition. The three individual colors, red, green and blue, are the elements of the spanning set.

Because we will use terms like spanned by and spanning set, there is the potential for confusion with the span. Come back and reread the first paragraph of this subsection whenever you are uncertain about the difference. Here's the working definition.

Spanning Set of a Vector Space

Suppose $V$ is a vector space. A subset $S$ of $V$ is a spanning set of $V$ if $\spn{S}=V$. In this case, we also frequently say $S$ spans $V$.

The definition of a spanning set requires that two sets (subspaces, actually) be equal. If $S$ is a subset of $V$, then $\spn{S}\subseteq V$, always. Thus it is usually only necessary to prove that $V\subseteq\spn{S}$. Now would be a good time to review the technique for proving two sets are equal ().

Spanning set in $P_4$

In an earlier example () we showed that $$W=\setparts{p(x)}{p\in P_4,\ p(2)=0}$$ is a subspace of $P_4$, the vector space of polynomials with degree at most $4$ (). In this example, we will show that the set $$S=\set{x-2,\,x^2-4x+4,\,x^3-6x^2+12x-8,\,x^4-8x^3+24x^2-32x+16}$$ is a spanning set for $W$. To do this, we require that $W=\spn{S}$. This is an equality of sets. We can check that every polynomial in $S$ has $x=2$ as a root and therefore $S\subseteq W$. Since $W$ is closed under addition and scalar multiplication, $\spn{S}\subseteq W$ also.
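This membership check is easy to verify mechanically. A short SymPy sketch (ours, not part of the original text):

```python
import sympy as sp

x = sp.symbols('x')
S = [x - 2,
     x**2 - 4*x + 4,
     x**3 - 6*x**2 + 12*x - 8,
     x**4 - 8*x**3 + 24*x**2 - 32*x + 16]

# Every polynomial in S should vanish at x = 2, placing it in W.
print([p.subs(x, 2) for p in S])  # [0, 0, 0, 0]
```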

So it remains to show that $W\subseteq \spn{S}$ (). To do this, begin by choosing an arbitrary polynomial in $W$, say $r(x)=ax^4+bx^3+cx^2+dx+e\in W$. This polynomial is not as arbitrary as it would appear, since we also know it must have $x=2$ as a root. This translates to 0=a(2)^4+b(2)^3+c(2)^2+d(2)+e=16a+8b+4c+2d+e as a condition on $r$.

We wish to show that $r$ is a polynomial in $\spn{S}$, that is, we want to show that $r$ can be written as a linear combination of the vectors (polynomials) in $S$. So let's try.
\begin{align*}
r(x)&=ax^4+bx^3+cx^2+dx+e\\
&=\alpha_1\left(x-2\right)+\alpha_2\left(x^2-4x+4\right)+\alpha_3\left(x^3-6x^2+12x-8\right)+\alpha_4\left(x^4-8x^3+24x^2-32x+16\right)\\
&=\alpha_4x^4+\left(\alpha_3-8\alpha_4\right)x^3+\left(\alpha_2-6\alpha_3+24\alpha_4\right)x^2+\left(\alpha_1-4\alpha_2+12\alpha_3-32\alpha_4\right)x+\left(-2\alpha_1+4\alpha_2-8\alpha_3+16\alpha_4\right)
\end{align*}

Equating coefficients (vector equality in $P_4$) gives the system of five equations in four variables,
\begin{align*}
\alpha_4&=a\\
\alpha_3-8\alpha_4&=b\\
\alpha_2-6\alpha_3+24\alpha_4&=c\\
\alpha_1-4\alpha_2+12\alpha_3-32\alpha_4&=d\\
-2\alpha_1+4\alpha_2-8\alpha_3+16\alpha_4&=e
\end{align*}

Any solution to this system of equations will provide the linear combination we need to determine if $r\in\spn{S}$, but we need to be convinced there is a solution for any values of $a,\,b,\,c,\,d,\,e$ that qualify $r$ to be a member of $W$. So the question is: is this system of equations consistent? We will form the augmented matrix and row-reduce. (We probably need to do this by hand, since the matrix is symbolic; reversing the order of the first four rows is the best way to start.) We obtain a matrix in reduced row-echelon form
$$\begin{bmatrix}
1&0&0&0&32a+12b+4c+d\\
0&1&0&0&24a+6b+c\\
0&0&1&0&8a+b\\
0&0&0&1&a\\
0&0&0&0&16a+8b+4c+2d+e
\end{bmatrix}\qquad
\begin{bmatrix}
1&0&0&0&32a+12b+4c+d\\
0&1&0&0&24a+6b+c\\
0&0&1&0&8a+b\\
0&0&0&1&a\\
0&0&0&0&0
\end{bmatrix}$$

For your results to match our first matrix, you may find it necessary to multiply the final row of your row-reduced matrix by the appropriate scalar, and/or add multiples of this row to some of the other rows. To obtain the second version of the matrix, the last entry of the last column has been simplified to zero according to the one condition we were able to impose on an arbitrary polynomial from $W$. So with no leading 1's in the last column, we know this system is consistent (). Therefore, any polynomial from $W$ can be written as a linear combination of the polynomials in $S$, so $W\subseteq\spn{S}$. Therefore, $W=\spn{S}$ and $S$ is a spanning set for $W$ by ().

Notice that an alternative to row-reducing the augmented matrix by hand would be to use () to express the column space of the coefficient matrix as a null space, and then verify that the condition on $r$ guarantees that $r$ is in the column space, thus implying that the system is always consistent. Give it a try; we'll wait. This has been a complicated example, but worth studying carefully.
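That null space alternative is also the easiest route by machine: the system is consistent exactly when the vector of constants is orthogonal to the left null space of the coefficient matrix. A SymPy sketch (ours):

```python
import sympy as sp

a, b, c, d, e = sp.symbols('a b c d e')

# Coefficient matrix from equating coefficients of
# r(x) = alpha1*(x-2) + alpha2*(x-2)**2 + alpha3*(x-2)**3 + alpha4*(x-2)**4.
A = sp.Matrix([[ 0,  0,  0,   1],
               [ 0,  0,  1,  -8],
               [ 0,  1, -6,  24],
               [ 1, -4, 12, -32],
               [-2,  4, -8,  16]])

# rank(A) = 4, so the left null space has a single basis vector; consistency
# of the system is exactly orthogonality of the constants to this vector.
v = A.T.nullspace()[0]
print(sp.expand(v.dot(sp.Matrix([a, b, c, d, e]))))  # 16*a + 8*b + 4*c + 2*d + e
```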

Given a subspace and a set of vectors, as in the example above, it can take some work to determine that the set actually is a spanning set. An even harder problem is to be confronted with a subspace and required to construct a spanning set with no guidance. We will now work an example of this flavor, but some of the steps will be unmotivated. Fortunately, we will have some better tools for this type of problem later on.

Spanning set in $M_{22}$

In the space of all $2\times 2$ matrices, $M_{22}$, consider a subspace $Z$ defined by homogeneous linear conditions on the entries of its matrices, and find a spanning set for $Z$.

Membership of $\begin{bmatrix}a&b\\c&d\end{bmatrix}$ in $Z$ is equivalent to the column vector $\colvector{a\\b\\c\\d}$ lying in the null space of the coefficient matrix of the defining conditions.

Row-reducing this matrix and applying () we obtain an equivalent statement of membership, now expressing the dependent variables $a$ and $c$ in terms of the free variables $b$ and $d$.

We can then express the subspace $Z$ in a sequence of equal forms, finishing with every matrix of $Z$ decomposed as a linear combination: $b$ times one fixed matrix of $Z$ plus $d$ times another, with $b,\,d\in\complex{\null}$.

So the set $Q$ consisting of these two matrices spans $Z$ by ().
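When a subspace is carved out by homogeneous conditions on the entries, a spanning set falls out of a null space basis. Since the defining conditions of $Z$ are not restated above, this SymPy sketch uses hypothetical conditions of our own to illustrate the procedure:

```python
import sympy as sp

# Hypothetical defining conditions on the entries a, b, c, d of a 2x2
# matrix, standing in for the conditions that define Z.
conds = sp.Matrix([[1, 2, -1, 0],
                   [0, 1,  0, -3]])

# Each null space basis vector, reshaped to 2x2, goes into the spanning set Q.
for v in conds.nullspace():
    sp.pprint(sp.Matrix(2, 2, list(v)))
```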

Spanning set in the crazy vector space

In the example above we determined that the set $R=\set{(1,\,0),\,(6,\,3)}$ is linearly independent in the crazy vector space $C$ (). We now show that $R$ is a spanning set for $C$.

Given an arbitrary vector $(x,\,y)\in C$ we desire to show that it can be written as a linear combination of the elements of $R$. In other words, are there scalars $a_1$ and $a_2$ so that $$(x,\,y)=a_1(1,\,0) + a_2(6,\,3)?$$

We will act as if this equation is true and try to determine just what $a_1$ and $a_2$ would be (as functions of $x$ and $y$). Recall that our vector space operations are unconventional; they are defined in () and recalled in the previous example.

Equality in $C$ then yields the two equations, $x=2a_1+7a_2-1$ and $y=a_1+4a_2-1$, which become the linear system with a matrix representation $$\begin{bmatrix}2&7\\1&4\end{bmatrix}\colvector{a_1\\a_2}=\colvector{x+1\\y+1}$$

The coefficient matrix of this system is nonsingular, hence invertible (), and we can employ its inverse to find a solution (), $$\colvector{a_1\\a_2}=\begin{bmatrix}4&-7\\-1&2\end{bmatrix}\colvector{x+1\\y+1}=\colvector{4x-7y-3\\-x+2y+1}$$

We could chase through the above implications backwards and take the existence of these solutions as sufficient evidence for $R$ being a spanning set for $C$. Instead, let us view the above as simply scratchwork and now get serious with a simple direct proof that $R$ is a spanning set. Ready? Suppose $(x,\,y)$ is any vector from $C$, then compute the following linear combination using the definitions of the operations in $C$,
\begin{align*}
(4x-7y-3)(1,\,0)+(-x+2y+1)(6,\,3)&=(8x-14y-7,\,4x-7y-4)+(-7x+14y+6,\,-4x+8y+3)\\
&=(x,\,y)
\end{align*}

This final sequence of computations in $C$ is sufficient to demonstrate that any element of $C$ can be written (or expressed) as a linear combination of the two vectors in $R$, so $C\subseteq\spn{R}$. Since the reverse inclusion $\spn{R}\subseteq C$ is trivially true, $C=\spn{R}$ and we say $R$ spans $C$ (). Notice that this demonstration is no more or less valid if we hide from the reader our scratchwork that suggested $a_1=4x-7y-3$ and $a_2=-x+2y+1$.
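The final computation can be replayed symbolically as one more check; the helper names are ours, modeling the operations of $C$ as before:

```python
import sympy as sp

def cadd(u, v):
    return (u[0] + v[0] + 1, u[1] + v[1] + 1)

def csmult(a, u):
    return (a * u[0] + a - 1, a * u[1] + a - 1)

x, y = sp.symbols('x y')
a1 = 4*x - 7*y - 3
a2 = -x + 2*y + 1

w = cadd(csmult(a1, (1, 0)), csmult(a2, (6, 3)))
print([sp.expand(t) for t in w])  # [x, y], so R really spans C
```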

Vector Representation

In a later chapter we will take up the matter of representations fully, where this theorem will be critical. We will now motivate and prove an important theorem that tells us how to represent a vector. This theorem could wait, but working with it now will provide some extra insight into the nature of linearly independent spanning sets. First an example, then the theorem.

A vector representation

Consider the set $$S=\set{\colvector{-7\\5\\1},\,\colvector{-6\\5\\0},\,\colvector{-12\\7\\4}}$$ from the vector space $\complex{3}$. Let $A$ be the matrix whose columns are the set $S$, and verify that $A$ is nonsingular. By () the elements of $S$ form a linearly independent set. Suppose that $\vect{b}\in\complex{3}$. Then $\linearsystem{A}{\vect{b}}$ has a (unique) solution () and hence is consistent. By (), $\vect{b}\in\spn{S}$. Since $\vect{b}$ is arbitrary, this is enough to show that $\spn{S}=\complex{3}$, and therefore $S$ is a spanning set for $\complex{3}$ (). (This set comes from the columns of the coefficient matrix of one of our archetype systems.)

Now examine the situation for a particular choice of $\vect{b}$, say $\vect{b}=\colvector{-33\\24\\5}$. Because $S$ is a spanning set for $\complex{3}$, we know we can write $\vect{b}$ as a linear combination of the vectors in $S$, \colvector{-33\\24\\5}= (-3)\colvector{-7\\5\\1}+(5)\colvector{-6\\5\\0}+(2)\colvector{-12\\7\\4}.

The nonsingularity of the matrix $A$ tells us that the scalars in this linear combination are unique. More precisely, it is the linear independence of $S$ that provides the uniqueness. We will refer to the scalars $a_1=-3$, $a_2=5$, $a_3=2$ as a representation of $\vect{b}$ relative to $S$. In other words, once we settle on $S$ as a linearly independent set that spans $\complex{3}$, the vector $\vect{b}$ is recoverable just by knowing the scalars $a_1=-3$, $a_2=5$, $a_3=2$ (use these scalars in a linear combination of the vectors in $S$). This is all an illustration of the following important theorem, which we prove in the setting of a general vector space.
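Finding the representation amounts to solving a nonsingular system, so it is a one-liner by machine. A SymPy sketch (ours, not part of the original text):

```python
import sympy as sp

# Columns of A are the three vectors of S.
A = sp.Matrix([[-7, -6, -12],
               [ 5,  5,   7],
               [ 1,  0,   4]])
b = sp.Matrix([-33, 24, 5])

print(A.det())       # -2, nonzero, so A is nonsingular and the solution is unique
print(A.solve(b).T)  # Matrix([[-3, 5, 2]]): the scalars a1, a2, a3
```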

Vector Representation Relative to a Basis

Suppose that $V$ is a vector space and $B=\set{\vectorlist{v}{m}}$ is a linearly independent set that spans $V$. Let $\vect{w}$ be any vector in $V$. Then there exist unique scalars $a_1,\,a_2,\,a_3,\,\ldots,\,a_m$ such that \vect{w}=\lincombo{a}{v}{m}.

That $\vect{w}$ can be written as a linear combination of the vectors in $B$ follows from the spanning property of the set (). This is good, but not the meat of this theorem. We now know that for any choice of the vector $\vect{w}$ there exist some scalars that will create $\vect{w}$ as a linear combination of the basis vectors. The real question is: Is there more than one way to write $\vect{w}$ as a linear combination of $\{\vectorlist{v}{m}\}$? Are the scalars $a_1,\,a_2,\,a_3,\,\ldots,\,a_m$ unique? ()

Assume there are two different linear combinations of $\{\vectorlist{v}{m}\}$ that equal the vector $\vect{w}$. In other words there exist scalars $a_1,\,a_2,\,a_3,\,\ldots,\,a_m$ and $b_1,\,b_2,\,b_3,\,\ldots,\,b_m$ so that $$\vect{w}=\lincombo{a}{v}{m}\qquad\text{and}\qquad\vect{w}=\lincombo{b}{v}{m}$$

Then notice that
\begin{align*}
\zerovector&=\vect{w}-\vect{w}\\
&=\left(\lincombo{a}{v}{m}\right)-\left(\lincombo{b}{v}{m}\right)\\
&=\left(a_1-b_1\right)\vect{v}_1+\left(a_2-b_2\right)\vect{v}_2+\left(a_3-b_3\right)\vect{v}_3+\cdots+\left(a_m-b_m\right)\vect{v}_m
\end{align*}

But this is a relation of linear dependence on a linearly independent set of vectors ()! Now we are using the other assumption about $B$, that $\{\vectorlist{v}{m}\}$ is a linearly independent set. So by () it must happen that the scalars are all zero. That is, $a_i-b_i=0$ for $1\leq i\leq m$, and hence $a_i=b_i$ for $1\leq i\leq m$.

And so we find that the scalars are unique.

The converse of this theorem is true as well, but is not important enough to rise beyond an exercise (see the exercises at the end of this section).

This is a very typical use of the hypothesis that a set is linearly independent: obtain a relation of linear dependence and then conclude that the scalars must all be zero. The result of this theorem tells us that we can write any vector in a vector space as a linear combination of the vectors in a linearly independent spanning set, but only just. There is only enough raw material in the spanning set to write each vector one way as a linear combination. So in this sense, we could call a linearly independent spanning set a minimal spanning set. These sets are so important that we will give them a simpler name (basis) and explore their properties further in the next section.

1. Is a given set of three matrices linearly independent or linearly dependent in the vector space $M_{22}$? Why or why not?
2. Explain the difference between the following two uses of the term span:
1. $S$ is a subset of the vector space $V$ and the span of $S$ is a subspace of $V$.
2. $W$ is a subspace of the vector space $Y$ and $T$ spans $W$.
3. The set S=\set{ \colvector{6\\2\\1},\, \colvector{4\\-3\\1},\, \colvector{5\\8\\2} } is linearly independent and spans $\complex{3}$. Write the vector $\vect{x}=\colvector{-6\\2\\2}$ as a linear combination of the elements of $S$. How many ways are there to answer this question, and which theorem allows you to say so?
In the vector space of $2\times 2$ matrices, $M_{22}$, determine if a given set $S=\set{S_1,\,S_2,\,S_3}$ of three matrices is linearly independent. Begin with a relation of linear dependence on the vectors in $S$, $a_1S_1+a_2S_2+a_3S_3=\zerovector$, and massage it according to the definitions of vector addition and scalar multiplication in $M_{22}$ into a single $2\times 2$ matrix, with entries such as $-a_1+4a_2+2a_3$ and $3a_1+2a_2+3a_3$, set equal to the $2\times 2$ zero matrix. By our definition of matrix equality () we arrive at a homogeneous system of linear equations. The coefficient matrix of this system row-reduces to a matrix with a leading 1 in each of its three columns, and from this we conclude that the only solution is $a_1=a_2=a_3=0$. Since the relation of linear dependence () is trivial, the set $S$ is linearly independent ().

In the crazy vector space $C$ (), is the set $S=\set{(0,\,2),\ (2,\,8)}$ linearly independent? We begin with a relation of linear dependence using unknown scalars $a$ and $b$, and wish to know if these scalars must both be zero. Recall that the zero vector in $C$ is $(-1,\,-1)$ and that the definitions of vector addition and scalar multiplication are not what we might expect.
\begin{align*}
\zerovector=(-1,\,-1)&=a(0,\,2)+b(2,\,8)\\
&=(a-1,\,3a-1)+(3b-1,\,9b-1)&&\text{Scalar multiplication in $C$}\\
&=(a+3b-1,\,3a+9b-1)&&\text{Vector addition in $C$}
\end{align*}
From this we obtain two equalities, $-1=a+3b-1$ and $-1=3a+9b-1$, which can be converted to a homogeneous system of equations, $a+3b=0$ and $3a+9b=0$. This homogeneous system has a singular coefficient matrix, and so has more than just the trivial solution (). Any nontrivial solution will give us a nontrivial relation of linear dependence on $S$. So $S$ is linearly dependent ().

In the vector space of polynomials $P_3$, determine if the set $S=\set{2+x-3x^2-8x^3,\,1+x+x^2+5x^3,\,3-4x^2-7x^3}$ is linearly independent or linearly dependent. Begin with a relation of linear dependence (), $$a_1\left(2+x-3x^2-8x^3\right)+a_2\left(1+x+x^2+5x^3\right)+a_3\left(3-4x^2-7x^3\right)=\zerovector$$ Massage according to the definitions of scalar multiplication and vector addition in the definition of $P_3$ () and use the zero vector for this vector space, $$\left(2a_1+a_2+3a_3\right)+\left(a_1+a_2\right)x+\left(-3a_1+a_2-4a_3\right)x^2+\left(-8a_1+5a_2-7a_3\right)x^3=0+0x+0x^2+0x^3$$ The definition of the equality of polynomials allows us to deduce the four equations $2a_1+a_2+3a_3=0$, $a_1+a_2=0$, $-3a_1+a_2-4a_3=0$ and $-8a_1+5a_2-7a_3=0$. Row-reducing the coefficient matrix of this homogeneous system leads to the unique solution $a_1=a_2=a_3=0$. So the only relation of linear dependence on $S$ is the trivial one, and this is linear independence for $S$ ().

Determine if the set $S=\set{(3,\,1),\,(7,\,3)}$ is linearly independent in the crazy vector space $C$ (). Notice, or discover, that the following gives a nontrivial relation of linear dependence on $S$ in $C$, so by (), the set $S$ is linearly dependent. $$2(3,\,1)+(-1)(7,\,3)=(7,\,3)+(-9,\,-5)=(-1,\,-1)=\zerovector$$

In the vector space of real-valued functions $F=\setparts{f}{f\colon\mathbb{R}\rightarrow\mathbb{R}}$, determine if the following set $S$ is linearly independent. $$S=\set{\sin^2{x},\,\cos^2{x},\,2}$$ One of the fundamental identities of trigonometry is $\sin^2(x)+\cos^2(x)=1$. Thus, we have a dependence relation $2(\sin^2{x})+2(\cos^2{x})+(-1)(2)=0$, and the set is linearly dependent.

Let $S$ be a given set of three matrices from $M_{2,2}$.
1. Determine if $S$ spans $M_{2,2}$.
2. Determine if $S$ is linearly independent.
1. If $S$ spans $M_{2,2}$, then for every $2\times 2$ matrix $\begin{bmatrix}x&y\\z&w\end{bmatrix}$ there exist constants $\alpha,\,\beta,\,\gamma$ expressing the matrix as a linear combination of the elements of $S$. Applying our definition of matrix equality (), this leads to a linear system in $\alpha,\,\beta,\,\gamma$. We need to row-reduce the augmented matrix of this system by hand due to the symbols $x$, $y$, $z$, and $w$ in the vector of constants. With the appearance of a leading 1 possible in the last column, there will exist some matrices in $M_{2,2}$ that cannot be written as linear combinations of the elements of $S$ (), so $S$ does not span $M_{2,2}$.
2. To check for linear independence, we need to see if there are nontrivial coefficients $\alpha,\,\beta,\,\gamma$ that solve the corresponding homogeneous system. This requires the same work that was done in part (a), with $x = y = z = w = 0$. In that case, the coefficient matrix row-reduces to have a leading 1 in each of the first three columns and a row of zeros on the bottom, so we know that the only solution is $\alpha = \beta = \gamma = 0$. So the set $S$ is linearly independent.
Let $S$ be a given set of five matrices from $M_{2,2}$.
1. Determine if $S$ spans $M_{2,2}$.
2. Determine if $S$ is linearly independent.
1. Expressing an arbitrary matrix $\begin{bmatrix}x&y\\z&w\end{bmatrix}$ as a linear combination of the five elements of $S$ leads, via our definition of matrix equality, to the matrix equation $$B\colvector{a\\b\\c\\d\\e}=\colvector{x\\y\\z\\w}$$ where the columns of the $4\times 5$ coefficient matrix $B$ come from vectorizing the elements of $S$. This system will have a solution for every vector on the right side if the row-reduced coefficient matrix has a leading 1 in every row, since then it is never possible to have a leading 1 appear in the final column of a row-reduced augmented matrix.
2. The matrices in $S$ are linearly independent if the only solution to the corresponding homogeneous system, $B\colvector{a\\b\\c\\d\\e}=\colvector{0\\0\\0\\0}$, is $a = b = c = d = e = 0$.

That is, we need to find the null space of the matrix $B$. We row-reduced this matrix in part (a), and found that there is a column without a leading 1, which corresponds to a free variable in a description of the solution set to the homogeneous system. So the null space is nontrivial and there are infinitely many solutions. Thus, this set of matrices is not linearly independent.
In the example of linear independence in $M_{32}$ above, find another nontrivial relation of linear dependence on the linearly dependent set of $3\times 2$ matrices, $S$.

Determine if the set $T=\set{x^2-x+5,\,4x^3-x^2+5x,\,3x+2}$ spans the vector space of polynomials with degree 4 or less, $P_4$. The polynomial $x^4$ is an element of $P_4$. Can we write this element as a linear combination of the elements of $T$? To wit, are there scalars $a_1$, $a_2$, $a_3$ such that $$x^4=a_1\left(x^2-x+5\right)+a_2\left(4x^3-x^2+5x\right)+a_3\left(3x+2\right)?$$ Massaging the right side of this equation, according to the definitions of vector addition and scalar multiplication in $P_4$, and then equating coefficients, leads to an inconsistent system of equations (check this!). As such, $T$ is not a spanning set for $P_4$.

The set $$W=\setparts{\begin{bmatrix}a&b\\c&d\end{bmatrix}}{2a-3b+4c-d=0}$$ is a subspace of $M_{22}$, the vector space of all $2\times 2$ matrices. Prove that $$S=\set{\begin{bmatrix}1&0\\0&2\end{bmatrix},\,\begin{bmatrix}0&1\\0&-3\end{bmatrix},\,\begin{bmatrix}0&0\\1&4\end{bmatrix}}$$ is a spanning set for $W$. We want to show that $W=\spn{S}$ (), which is an equality of sets ().

First, show that $\spn{S}\subseteq W$. Begin by checking that each of the three matrices in $S$ is a member of the set $W$. Then, since $W$ is a vector space, the closure properties () guarantee that every linear combination of elements of $S$ remains in $W$.

Second, show that $W\subseteq\spn{S}$. We want to convince ourselves that an arbitrary element of $W$ is a linear combination of elements of $S$. Choose $\vect{x}=\begin{bmatrix}a&b\\c&d\end{bmatrix}\in W$. The values of $a,\,b,\,c,\,d$ are not totally arbitrary, since membership in $W$ requires that $2a-3b+4c-d=0$. Now, rewrite as follows,
\begin{align*}
\vect{x}&=\begin{bmatrix}a&b\\c&d\end{bmatrix}\\
&=\begin{bmatrix}a&b\\c&2a-3b+4c\end{bmatrix}&&\text{$2a-3b+4c-d=0$}\\
&=a\begin{bmatrix}1&0\\0&2\end{bmatrix}+b\begin{bmatrix}0&1\\0&-3\end{bmatrix}+c\begin{bmatrix}0&0\\1&4\end{bmatrix}&&\text{Operations in $M_{22}$}\\
&\in\spn{S}
\end{align*}
Determine if the set $S=\set{(3,\,1),\,(7,\,3)}$ spans the crazy vector space $C$ (). We will try to show that $S$ spans $C$. Let $(x,\,y)$ be an arbitrary element of $C$ and search for scalars $a_1$ and $a_2$ such that $(x,\,y)=a_1(3,\,1)+a_2(7,\,3)$. Equality in $C$ leads to the system $4a_1+8a_2=x+1$, $2a_1+4a_2=y+1$. This system has a singular coefficient matrix whose column space is simply $\spn{\set{\colvector{2\\1}}}$. So any choice of $x$ and $y$ that causes the column vector $\colvector{x+1\\y+1}$ to lie outside the column space will lead to an inconsistent system, and hence create an element $(x,\,y)$ that is not in the span of $S$. So $S$ does not span $C$.

For example, choose $x=0$ and $y=5$, and then we can see that $\colvector{1\\6}\not\in\spn{\colvector{2\\1}}$ and we know that $(0,\,5)$ cannot be written as a linear combination of the vectors in $S$. A shorter solution might begin by asserting that $(0,\,5)$ is not in $\spn{S}$ and then establishing this claim alone.
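That inconsistency is quick to confirm by machine. A SymPy sketch (ours), using the system derived above with $(x,\,y)=(0,\,5)$:

```python
import sympy as sp

a1, a2 = sp.symbols('a1 a2')

# System from a1*(3,1) + a2*(7,3) = (x,y) in C, with (x,y) = (0,5):
# 4*a1 + 8*a2 = x + 1 = 1 and 2*a1 + 4*a2 = y + 1 = 6.
A = sp.Matrix([[4, 8], [2, 4]])
b = sp.Matrix([1, 6])

print(A.det())                      # 0: singular coefficient matrix
print(sp.linsolve((A, b), a1, a2))  # EmptySet: (0, 5) is not in the span of S
```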
Halfway through the spanning set example in $P_4$, we need to show that the system of equations $$\linearsystem{\begin{bmatrix}0&0&0&1\\0&0&1&-8\\0&1&-6&24\\1&-4&12&-32\\-2&4&-8&16\end{bmatrix}}{\colvector{a\\b\\c\\d\\e}}$$ is consistent for every choice of the vector of constants satisfying $16a+8b+4c+2d+e=0$.

Express the column space of the coefficient matrix of this system as a null space, using (). From this, use () to establish that the system is always consistent. Notice that this approach removes the need to row-reduce a symbolic matrix.
The referenced theorem provides the matrix $$L=\begin{bmatrix}1&\frac{1}{2}&\frac{1}{4}&\frac{1}{8}&\frac{1}{16}\end{bmatrix}$$ and so if $A$ denotes the coefficient matrix of the system, then $\csp{A}=\nsp{L}$. The single homogeneous equation in $\homosystem{L}$ is equivalent to the condition on the vector of constants (use $a,\,b,\,c,\,d,\,e$ as variables and then multiply by 16).
Suppose that $S$ is a finite linearly independent set of vectors from the vector space $V$. Let $T$ be any subset of $S$. Prove that $T$ is linearly independent. We will prove the contrapositive (): If $T$ is linearly dependent, then $S$ is linearly dependent. This might be an interesting statement in its own right.

Write $S=\set{\vectorlist{v}{m}}$ and without loss of generality we can assume that the subset $T$ is the first $t$ vectors of $S$, $t\leq m$, so $T=\set{\vectorlist{v}{t}}$. Since $T$ is linearly dependent, by () there are scalars, not all zero, $\scalarlist{a}{t}$, so that $\lincombo{a}{v}{t}=\zerovector$. Then the equation $$a_1\vect{v}_1+a_2\vect{v}_2+\cdots+a_t\vect{v}_t+0\vect{v}_{t+1}+0\vect{v}_{t+2}+\cdots+0\vect{v}_m=\zerovector$$ is a nontrivial relation of linear dependence () on the set $S$, so we can say $S$ is linearly dependent.
Prove the following variant of an earlier result, with a weaker hypothesis: Suppose that $C=\set{\vectorlist{u}{p}}$ is a linearly independent spanning set for $\complex{n}$. Suppose also that $A$ and $B$ are $m\times n$ matrices such that $A\vect{u}_i=B\vect{u}_i$ for every $1\leq i\leq p$. Then $A=B$.

Can you weaken the hypothesis even further while still preserving the conclusion?
Suppose that $V$ is a vector space and $\vect{u},\,\vect{v}\in V$ are two vectors in $V$. Use the definition of linear independence to prove that $S=\set{\vect{u},\,\vect{v}}$ is a linearly dependent set if and only if one of the two vectors is a scalar multiple of the other. Prove this directly in the context of an abstract vector space ($V$), without simply giving an upgraded version of the analogous result for column vectors in the special case of just two vectors. If $S$ is linearly dependent, then there are scalars $\alpha$ and $\beta$, not both zero, such that $\alpha\vect{u}+\beta\vect{v}=\zerovector$. Suppose that $\alpha\neq 0$; the proof proceeds similarly if $\beta\neq 0$. Now,
\begin{align*}
\vect{u}&=1\vect{u}\\
&=\left(\frac{1}{\alpha}\alpha\right)\vect{u}\\
&=\frac{1}{\alpha}\left(\alpha\vect{u}\right)\\
&=\frac{1}{\alpha}\left(\alpha\vect{u}+\zerovector\right)\\
&=\frac{1}{\alpha}\left(\alpha\vect{u}+\beta\vect{v}-\beta\vect{v}\right)\\
&=\frac{1}{\alpha}\left(\zerovector-\beta\vect{v}\right)\\
&=\frac{-\beta}{\alpha}\vect{v}
\end{align*}
which shows that $\vect{u}$ is a scalar multiple of $\vect{v}$.

Suppose now that $\vect{u}$ is a scalar multiple of $\vect{v}$. More precisely, suppose there is a scalar $\gamma$ such that $\vect{u}=\gamma\vect{v}$. Then
\begin{align*}
(-1)\vect{u}+\gamma\vect{v}&=(-1)\left(\gamma\vect{v}\right)+\gamma\vect{v}\\
&=\left((-1)\gamma\right)\vect{v}+\gamma\vect{v}\\
&=\left((-1)\gamma+\gamma\right)\vect{v}\\
&=0\vect{v}=\zerovector
\end{align*}
This is a relation of linear dependence on $S$ (), which is nontrivial since one of the scalars is $-1$. Therefore $S$ is linearly dependent by ().

Be careful using this theorem. It is only applicable to sets of two vectors. In particular, linear dependence in a set of three or more vectors can be more complicated than just one vector being a scalar multiple of another.
Carefully formulate the converse of the theorem on vector representation relative to a basis and provide a proof. The converse could read: Suppose that $V$ is a vector space and $S=\set{\vectorlist{v}{m}}$ is a set of vectors in $V$. If, for each $\vect{w}\in V$, there are unique scalars $a_1,\,a_2,\,a_3,\,\ldots,\,a_m$ such that $$\vect{w}=\lincombo{a}{v}{m}$$ then $S$ is a linearly independent set that spans $V$.

Since every vector $\vect{w}\in V$ is assumed to be a linear combination of the elements of $S$, it is easy to see that $S$ is a spanning set for $V$ ().

To establish linear independence, begin with an arbitrary relation of linear dependence on the vectors in $S$ (). One way to form such a relation is the trivial way, where each scalar is zero. But our hypothesis of uniqueness then implies that the only way to form this relation of linear dependence is the trivial way. And this establishes the linear independence of $S$ ().