# Linear Algebra: Linear Combinations

## Linear Independence

A linear combination is **trivial** if all of its coefficients are $0$. A trivial combination is, of course, always equal to $0$:

$$0v_1 + 0v_2 + \ldots = 0$$

A linear combination is **nontrivial** if it is not trivial, i.e., if at least one coefficient is nonzero.

A set of vectors is **linearly independent** if the only linear combination of them that equals $0$ is the trivial combination. The empty set is defined to be linearly independent. A set of vectors is **linearly dependent** if it is not linearly independent.

For example, consider the vectors $[1,0]$ and $[0,1]$ in $\mathbb{R}^2$. They are linearly independent, since $c_1[1,0] + c_2[0,1] = [c_1,c_2] = [0,0]$ shows that $c_1$ and $c_2$ are both $0$. Meanwhile, if we add a third vector to this set, $[1,1]$, we see that the vectors are now linearly dependent, since $c_1[1,0] + c_2[0,1] + c_3[1,1] = [c_1 + c_3, c_2 + c_3] = [0,0]$ shows that any set of coefficients where $c_1 = c_2 = -c_3$ will sum the vectors to $0$.
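Over $\mathbb{R}$, checks like this can be automated: a set of vectors is linearly independent exactly when the matrix with those vectors as rows has rank equal to the number of vectors. A minimal numerical sketch (the helper name `is_linearly_independent` is ours, not standard):

```python
import numpy as np

def is_linearly_independent(vectors):
    """Vectors are independent iff the matrix with them as rows
    has rank equal to the number of vectors."""
    A = np.array(vectors, dtype=float)
    return bool(np.linalg.matrix_rank(A) == len(vectors))

print(is_linearly_independent([[1, 0], [0, 1]]))          # True
print(is_linearly_independent([[1, 0], [0, 1], [1, 1]]))  # False
```

Note this is a floating-point rank test, so it is a numerical check rather than an exact proof.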

This definition of linear independence is equivalent (as you'll prove below) to the statement that no vector in a linearly independent set is a linear combination of the others. Linear independence underlies many core notions of vector spaces, such as basis, dimension, and uniqueness of solutions to systems of equations.

## Problems

Explain why a set of vectors containing $0$ is never linearly independent.

Linear independence requires that the only combination of coefficients making the vectors sum to $0$ is the trivial one. However, the $0$ vector can take any coefficient from the field: giving $0$ the coefficient $1$ and every other vector the coefficient $0$ yields a nontrivial linear combination equal to $0$.

If you thought you could be clever and use a field with only one element, then you're out of luck: a field must by definition have at least two elements, the $0$ element and the $1$ element.

What are the conditions for a set of a single vector to be linearly independent?

Let $v$ be the vector in question. If it is the $0$ vector, the set is linearly dependent (any coefficient $c$ gives $c \cdot 0 = 0$). If not, then the only coefficient $c$ satisfying $cv = 0$ is $c = 0$. Therefore the set $\{v\}$ is linearly independent exactly when $v$ is not the $0$ vector.

Show that the set containing a single nonzero vector is linearly independent.

Let $v \in V$ where $v \neq 0$. We must show that the only coefficient $c$ satisfying $cv=0$ is $c=0$. We have previously shown that $0v=0$ for all $v$, so $c=0$ works. Assume there is another coefficient $d$ such that $dv=0$ and $d \neq 0$. Then there exists a multiplicative inverse $d^{-1} \in F$, and so $d^{-1}dv = d^{-1}0$, which simplifies to $v=0$. This is a contradiction, as $v \neq 0$, so $0$ was the only possible coefficient for $v$ after all. Therefore the set containing $v$ is linearly independent.

Under what conditions is a set of two vectors linearly independent?

Let $v_1, v_2$ be vectors in a vector space $V$ over a field $F$. The set is linearly dependent exactly when there are coefficients $c_1, c_2 \in F$, not both zero, such that $c_1v_1 + c_2v_2 = 0$. If $c_1 \neq 0$, rewrite the equation as

$$v_1 = -c_1^{-1}c_2v_2$$

so $v_1$ is a scalar multiple of $v_2$. If $c_1 = 0$, then $c_2 \neq 0$ forces $v_2 = 0 = 0 \cdot v_1$, again a scalar multiple. Thus in order to be linearly independent, two vectors cannot be scalar multiples of each other.
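In $\mathbb{R}^2$ this criterion reduces to a determinant test (our sketch, not from the text): the determinant of the $2 \times 2$ matrix with $v_1, v_2$ as rows is zero exactly when one vector is a scalar multiple of the other.

```python
import numpy as np

def independent_pair(v1, v2):
    """In R^2, v1 and v2 are independent iff det([v1; v2]) != 0,
    i.e. neither is a scalar multiple of the other."""
    return bool(abs(np.linalg.det(np.array([v1, v2], dtype=float))) > 1e-12)

print(independent_pair([1, 2], [2, 4]))  # scalar multiples -> False
print(independent_pair([1, 2], [3, 1]))  # not multiples   -> True
```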

Determine whether the vectors $\langle 1, 0, 0 \rangle, \langle 0, 1, 0 \rangle, \langle 0, 0, 1 \rangle \in \mathbb{R}^3$ are linearly independent.

The set of vectors is linearly independent if the only set of coefficients satisfying the equation

$$0 = a_1 \langle 1, 0, 0 \rangle + a_2 \langle 0, 1, 0 \rangle + a_3 \langle 0, 0, 1 \rangle$$

is $a_1=a_2=a_3=0$. Suppose there is another set of coefficients also satisfying the equation:

$$0 = b_1 \langle 1, 0, 0 \rangle + b_2 \langle 0, 1, 0 \rangle + b_3 \langle 0, 0, 1 \rangle$$

Let's see if they're unique.

$$\begin{aligned} 0 &= (a_1 - b_1) \langle 1, 0, 0 \rangle + (a_2 - b_2) \langle 0, 1, 0 \rangle + (a_3 - b_3) \langle 0, 0, 1 \rangle \\ 0 &= \langle a_1 - b_1, 0, 0 \rangle + \langle 0, a_2 - b_2, 0 \rangle + \langle 0, 0, a_3 - b_3 \rangle \\ \langle 0, 0, 0 \rangle &= \langle a_1 - b_1, 0, 0 \rangle + \langle 0, a_2 - b_2, 0 \rangle + \langle 0, 0, a_3 - b_3 \rangle \\ \langle 0, 0, 0 \rangle &= \langle a_1 - b_1, a_2 - b_2, a_3 - b_3 \rangle \end{aligned}$$

Each component $a_i - b_i$ must equal $0$, so $b_i = a_i$ for every $i$. The coefficients are therefore unique; since $a_1 = a_2 = a_3 = 0$ certainly satisfies the equation, it is the only solution, and the three vectors are linearly independent.

Recall the vectors $e_1, \ldots, e_n \in F^n$, where $F$ is a field, such that $e_1 = \langle 1, 0, \ldots, 0 \rangle, e_2 = \langle 0, 1, \ldots, 0 \rangle, \ldots, e_n = \langle 0, 0, \ldots, 1 \rangle$. Show that these vectors are linearly independent.

To show a set of vectors is linearly independent, we must show that the only coefficients that make the weighted sum equal $0$ are $c_1=c_2=\ldots=c_n=0$. To show this uniqueness, we imagine there is another set of coefficients, $d_1, d_2, \ldots, d_n$, that also satisfies the equation, and prove that the two sets of coefficients are equal.

$$\begin{aligned} c_1e_1 + c_2e_2 + \ldots + c_ne_n &= 0 \\ c_1e_1 + c_2e_2 + \ldots + c_ne_n &= d_1e_1 + d_2e_2 + \ldots + d_ne_n \\ 0 &= (d_1 - c_1)e_1 + (d_2 - c_2)e_2 + \ldots + (d_n - c_n)e_n \\ 0 &= \langle d_1-c_1, 0, \ldots, 0 \rangle + \langle 0, d_2-c_2, \ldots, 0 \rangle + \ldots + \langle 0, 0, \ldots, d_n - c_n \rangle \\ \langle 0, 0, \ldots, 0 \rangle &= \langle d_1-c_1, d_2-c_2, \ldots, d_n - c_n \rangle \end{aligned}$$

Clearly each $d_i=c_i$, therefore the coefficients are all uniquely $0$, and the set of vectors $e_1, e_2, \ldots, e_n$ is linearly independent.
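As a numerical sanity check over $F = \mathbb{R}$ (a sketch; the setup is ours): the rows of the identity matrix are exactly $e_1, \ldots, e_n$, and full rank confirms their independence. A weighted sum of the $e_i$ simply reassembles the coefficient vector, which is why only the trivial combination can vanish.

```python
import numpy as np

n = 5
E = np.eye(n)  # rows are e_1, ..., e_n in R^n

# A set of k vectors is independent iff the k-row matrix has rank k.
assert np.linalg.matrix_rank(E) == n

# A weighted sum of the e_i recovers the coefficient vector itself,
# so it is zero only when every coefficient is zero.
c = np.array([0., 2., 0., -1., 3.])
print(c @ E)  # prints c itself
```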

**Equivalent Definition of Linear Independence:** Show that a set of vectors $v_1, \ldots, v_n$ is linearly independent if and only if for each $v \in \text{span}(v_1, \ldots, v_n)$ there is a unique set of coefficients $c_1, \ldots, c_n$ such that $v = c_1v_1 + \ldots + c_nv_n$.

First suppose each $v \in \text{span}(v_1, \ldots, v_n)$ has a unique set of coefficients. In particular $0$ does: since $0 = 0v_1 + \ldots + 0v_n$, the trivial combination is the only linear combination equal to $0$, so the vectors are linearly independent.

Conversely, suppose $v_1, \ldots, v_n$ are linearly independent, and let $v \in \text{span}(v_1, \ldots, v_n)$. Then

$$v = c_1v_1 + \ldots + c_nv_n$$

for some set of coefficients $c_1, \ldots, c_n$. Suppose further that there is another set of coefficients, $d_1, \ldots, d_n$ that also form a linear combination:

$$v = d_1v_1 + \ldots + d_nv_n$$

Then

$$\begin{aligned} c_1v_1 + \ldots + c_nv_n &= d_1v_1 + \ldots + d_nv_n \\ 0 &= (d_1 - c_1)v_1 + \ldots + (d_n - c_n)v_n \end{aligned}$$

Because the set of vectors $v_1, \ldots, v_n$ is linearly independent, all the above coefficients $(d_i-c_i)$ equal $0$. This means that $c_i=d_i$, and therefore that the coefficients $c_1, \ldots, c_n$ are unique.
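Numerically, this uniqueness is why linear systems with independent columns have exactly one solution. A sketch in $\mathbb{R}^3$ (the vectors below are our assumed example, arranged as columns):

```python
import numpy as np

# Columns are three independent vectors v1, v2, v3 (our example).
V = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [0., 0., 1.]])
v = np.array([2., 3., 4.])

# V is invertible precisely because its columns are independent,
# so the coefficient vector c is unique.
c = np.linalg.solve(V, v)
assert np.allclose(V @ c, v)
print(c)  # the unique coefficients c1, c2, c3
```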

**Linear Dependence Lemma:** Let $S$ be a finite, linearly dependent set of vectors that spans a vector space $V$. Then there is some vector $u \in S$ such that $u \in \text{span}(S - \{u\})$ and $\text{span}(S) = \text{span}(S - \{u\})$.

Because $S$ is linearly dependent, there is a set of coefficients $\{a_v\}_{v \in S}$, not all zero, such that $0 =\sum\limits_{v \in S} a_v v$. Select some $u \in S$ whose coefficient $a_u$ is nonzero; we show that $u$ is a linear combination of the other vectors in $S$:

$$\begin{aligned} 0 &= \sum\limits_{v \in S} a_v v \\ 0 &= \left(\sum\limits_{v \in S - \{u\}} a_v v \right) + a_u u \\ -a_u u &= \sum\limits_{v \in S - \{u\}} a_v v \\ u &= \sum\limits_{v \in S - \{u\}} \dfrac{-a_v}{a_u} v \end{aligned}$$

Since $u$ is a linear combination of the vectors in $S - \{u\}$, then $u \in \text{span}(S - \{u\})$.

To show that $\text{span}(S) = \text{span}(S - \{u\})$, let $w$ be any vector in $\text{span}(S)$, express it as a linear combination of the vectors in $S$ with coefficients $b_v$, and eliminate $u$ by substituting the linear combination above:

$$\begin{aligned} w &= \sum\limits_{v \in S} b_v v \\ w &= \left(\sum\limits_{v \in S - \{u\}} b_v v\right) + b_u u \\ w &= \left(\sum\limits_{v \in S - \{u\}} b_v v\right) + b_u\left( \sum\limits_{v \in S - \{u\}} \dfrac{-a_v}{a_u} v \right) \\ w &= \sum\limits_{v \in S - \{u\}} \left(b_v - \dfrac{b_ua_v}{a_u}\right) v \end{aligned}$$

Thus $w \in \text{span}(S - \{u\})$, so the two spans are equal.
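A small numerical illustration of the lemma (our example in $\mathbb{R}^2$; the span is measured as the row space, via matrix rank):

```python
import numpy as np

# S spans R^2 but is linearly dependent: u = [1, 1] = [1, 0] + [0, 1].
S = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])
S_minus_u = S[:2]  # remove the redundant vector u

# Removing u does not shrink the span: both row spaces have rank 2,
# i.e. both still span all of R^2.
assert np.linalg.matrix_rank(S) == np.linalg.matrix_rank(S_minus_u) == 2
```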

Show that a set of vectors is linearly dependent if one of the vectors can be written as a linear combination of the others.

Let $S = \{v_1, \ldots, v_n\}$ be a set of vectors in $V$ such that $v_n = c_1v_1 + \ldots + c_{n-1}v_{n-1}$. Then $0$ can be expressed as the nontrivial linear combination $0 = -v_n + c_1v_1 + \ldots + c_{n-1}v_{n-1}$, in which the coefficient of $v_n$ is $-1 \neq 0$. Therefore $S$ is linearly dependent.

Let $S$ be a set of vectors that spans $V$. Show that $S \cup \{u\}$ is linearly dependent for any vector $u \in V$ where $u \notin S$.

Since $S$ spans $V$, $u$ can be written as a linear combination of vectors in $S$. By the preceding proof, $S \cup \{u\}$ is linearly dependent.

Let $v$ be a linear combination of vectors $v_1, \ldots, v_n$ where $v \neq 0$. Show that there is some $v_i$ that is a nontrivial linear combination of the other vectors $v_{j \neq i}$ and $v$.

Express $v$ as a linear combination of $v_1, \ldots, v_n$ with coefficients $c_1, \ldots, c_n$:

$$v = c_1v_1 + \ldots + c_nv_n$$

Because $v \neq 0$, at least one coefficient $c_i$ must be nonzero. Then rewrite the linear combination as follows:

$$\begin{aligned} v - c_iv_i &= c_1v_1 + \ldots + c_{i-1}v_{i-1} + c_{i+1}v_{i+1} + \ldots + c_{n}v_n \\ -c_iv_i &= -v + c_1v_1 + \ldots + c_{i-1}v_{i-1} + c_{i+1}v_{i+1} + \ldots + c_{n}v_n \\ v_i &= -c_i^{-1}\left(-v + c_1v_1 + \ldots + c_{i-1}v_{i-1} + c_{i+1}v_{i+1} + \ldots + c_{n}v_n\right) \\ v_i &= c_i^{-1}v - c_i^{-1}c_1v_1 - \ldots - c_i^{-1}c_{i-1}v_{i-1} - c_i^{-1}c_{i+1}v_{i+1} - \ldots - c_i^{-1}c_{n}v_n \end{aligned}$$

Since the coefficient $c_i^{-1}$ of $v$ is nonzero, it follows that $v_i$ is a nontrivial linear combination of the vectors $v$ and $v_{j \neq i}$.

Show that a set of linearly independent vectors has at most as many vectors as a set of spanning vectors.

*Hint: Use induction.*

*Hint: Put linearly independent vectors into a set of spanning vectors one at a time, and find a way to remove the original spanning vectors one at a time while still maintaining the same span.*

We show by induction that for every linearly independent set $A = \{a_1, \ldots, a_n\}$, any spanning set $B = \{b_1, \ldots, b_m\}$ must have at least $n$ vectors.

**Base case:** Let $G_1 = \{a_1\} \cup B = \{a_1, b_1, \ldots, b_m\}$. Then $G_1$ is linearly dependent, since adding any vector to a spanning set yields a linearly dependent set (as shown in the preceding problems). Specifically, since $B$ spans $V$, we can write $a_1$ as a linear combination of the other vectors:

$$a_1 = c_1b_1 + \ldots + c_mb_m$$

for some coefficients $c_1, \ldots, c_m$. Since $A$ is linearly independent, $a_1 \neq 0$. Therefore the above linear combination is nontrivial, so there is at least one $b_i \in B$ with a nonzero coefficient $c_i$. Without loss of generality, assume this is $b_1$. Then $b_1$ is a linear combination of the other vectors in $G_1$. By the Linear Dependence Lemma, we can remove $b_1$ from $G_1$ to form a new set $G_1' = \{a_1, b_2, \ldots, b_m\}$, which still spans $V$.

**Inductive step:** Assume the set $G_{i-1}' = \{a_1, \ldots, a_{i-1}, b_{i}, \ldots, b_m\}$ spans $V$. Form $G_i = \{a_i\} \cup G_{i-1}'$. Then $G_i$ is linearly dependent, so some vector in it is a nontrivial linear combination of the others. If a vector originally from $B$ is such a combination, remove it to form $G_i'$, which still spans $V$ by the Linear Dependence Lemma. If instead some $a_j$ originally from $A$ is a linear combination of the other vectors, that combination must give a nonzero coefficient to at least one vector from $B$, since by the linear independence of $A$ the vector $a_j$ cannot be a linear combination of vectors from $A$ alone. Select such a vector from $B$; it is in turn a linear combination of the other vectors in $G_i$, so it can be removed to form $G_i'$, which still spans $V$.

Thus each vector from $A$ added to form $G_i$ forces the removal of a distinct vector from $B$. Since all $n$ vectors of $A$ can be added this way, $B$ must contain at least $n$ vectors, so $m \geq n$. The result follows by induction.
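The bound can be checked numerically in $\mathbb{R}^m$ (a sketch with random vectors; `matrix_rank` stands in for the dimension of the span): $m$ vectors can span $\mathbb{R}^m$, but $m + 1$ vectors in $\mathbb{R}^m$ can never be independent, so an independent set never outnumbers a spanning set.

```python
import numpy as np

rng = np.random.default_rng(0)
m = 4

# m random vectors almost surely span R^m ...
B = rng.standard_normal((m, m))
assert np.linalg.matrix_rank(B) == m

# ... but m + 1 vectors in R^m have rank at most m,
# so they cannot be linearly independent.
A = rng.standard_normal((m + 1, m))
assert np.linalg.matrix_rank(A) <= m
```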