Linear Algebra: Linear Transformations
Motivation: Systems of Linear Equations
Consider the following system of linear equations:
$$ \begin{array}{rcrcrcr} 3x & + & 2y & - & z & = & 4 \\ 2x & + & y & + & 3z & = & 13 \\ 4x & - & 3y & + & 7z & = & 19 \\ \end{array} $$
Rather than think of these as three separate equations, each a function from $\mathbb{R}^3$ to $\mathbb{R}$, we can instead think of them as a single function $T : \mathbb{R}^3 \rightarrow \mathbb{R}^3$ where $T(x,y,z) = [3x + 2y - z, 2x + y + 3z, 4x - 3y + 7z]$. Asking for a solution to the system is then the same as asking for values of $x$, $y$, and $z$ such that $T(x, y, z) = [4, 13, 19]$. Functions like $T$, which describe systems of linear equations, belong to a general class we call linear transformations. We would like to determine the conditions under which such solutions exist, and if they do, how many there are and what each one is.
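To make this concrete, here is a minimal numerical sketch (using NumPy; the names `A`, `b`, and `T` are ours, and the matrix form anticipates material we'll encounter later) that treats the three equations as one function and asks a solver for the input that $T$ sends to $[4, 13, 19]$:

```python
import numpy as np

# Coefficient matrix of the system and the target vector [4, 13, 19].
A = np.array([[3.0,  2.0, -1.0],
              [2.0,  1.0,  3.0],
              [4.0, -3.0,  7.0]])
b = np.array([4.0, 13.0, 19.0])

def T(v):
    """Apply the transformation T(x, y, z) from the text."""
    x, y, z = v
    return np.array([3*x + 2*y - z, 2*x + y + 3*z, 4*x - 3*y + 7*z])

# Solving T(x, y, z) = [4, 13, 19] amounts to solving A @ v = b.
v = np.linalg.solve(A, b)
print(v)       # [1. 2. 3.]
print(T(v))    # [ 4. 13. 19.]
```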
Linear Transformations
A linear transformation is a function $T : V \rightarrow W$, where $V$ and $W$ are vector spaces over the same field $F$, such that the following two properties hold:
- Additivity: $T(u + v) = T(u) + T(v)$ for all $u, v \in V$.
- Homogeneity: $T(cv) = cT(v)$ for all $c \in F$ and $v \in V$.
Linear transformations are also sometimes called linear transforms or linear maps.
A linear transformation $T$ applied to a vector $v$ is usually not written the way a typical function is, as $T(v)$, but instead as $Tv$, with the parentheses omitted. This notation reflects the fact that algebra on linear transformations works very similarly to algebra on matrices, which we'll encounter later.
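As a quick sanity check of these two properties, here is a small sketch (NumPy, random test vectors; a spot check rather than a proof) applied to the $T$ from the motivating example:

```python
import numpy as np

def T(v):
    """The transformation T(x, y, z) from the motivating example."""
    x, y, z = v
    return np.array([3*x + 2*y - z, 2*x + y + 3*z, 4*x - 3*y + 7*z])

rng = np.random.default_rng(0)
u, v = rng.standard_normal(3), rng.standard_normal(3)
c = rng.standard_normal()

# Additivity: T(u + v) == T(u) + T(v)
print(np.allclose(T(u + v), T(u) + T(v)))   # True
# Homogeneity: T(c * v) == c * T(v)
print(np.allclose(T(c * v), c * T(v)))      # True
```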
The Vector Space of Linear Transformations $\mathcal{L}(V, W)$
The set of all linear transformations from $V$ to $W$ is denoted $\mathcal{L}(V, W)$. If $W = V$, we simply write $\mathcal{L}(V)$, which we refer to as the set of all linear operators on $V$. For example, $T(x, y, z) = [2x - z, 3y - x]$ is a member of $\mathcal{L}(\mathbb{R}^3, \mathbb{R}^2)$, and $H(x, y) = [x + y, y - x]$ is a member of $\mathcal{L}(\mathbb{R}^2)$.
An interesting property of $\mathcal{L}(V,W)$ is that it forms a vector space. That is, the set of linear transformations between two given vector spaces is itself a vector space. Of course, we need definitions for vector addition and scalar multiplication, which we give here. Let $T, S \in \mathcal{L}(V,W)$. Then define vector addition on $\mathcal{L}(V,W)$ as
$$(T + S)(v) = Tv + Sv,$$
and define scalar multiplication on $\mathcal{L}(V,W)$ as
$$(cT)(v) = c(Tv).$$
The proof that this forms a vector space is found in the problems below.
Since the vectors in $\mathcal{L}(V,W)$ are themselves functions, we can define some additional algebra on them. Consider $T \in \mathcal{L}(V,W)$ and $S \in \mathcal{L}(U,V)$. Define the product of these two linear transformations as their composition, which is a member of $\mathcal{L}(U,W)$:
$$(TS)(v) = T(Sv).$$
In other words, $TS$ is just shorthand for $T \circ S$.
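The following sketch (plain Python; the helper names `add`, `scale`, and `compose`, and the extra operator `G`, are our own illustrative choices) mirrors these three definitions pointwise:

```python
# A small sketch of the algebra on linear maps, using plain Python
# functions on tuples; the helper names are ours, not a standard API.

def add(T, S):
    """(T + S)(v) = Tv + Sv, defined pointwise."""
    return lambda v: tuple(t + s for t, s in zip(T(v), S(v)))

def scale(c, T):
    """(cT)(v) = c(Tv)."""
    return lambda v: tuple(c * t for t in T(v))

def compose(T, S):
    """(TS)(v) = T(Sv), i.e. T composed with S."""
    return lambda v: T(S(v))

# H(x, y) = [x + y, y - x] is the operator on R^2 from the text;
# G is another linear operator on R^2 chosen for illustration.
H = lambda v: (v[0] + v[1], v[1] - v[0])
G = lambda v: (2 * v[0], 3 * v[1])

v = (1.0, 2.0)
print(add(H, G)(v))      # (5.0, 7.0)  = Hv + Gv
print(scale(2, H)(v))    # (6.0, 2.0)  = 2(Hv)
print(compose(H, G)(v))  # (8.0, 4.0)  = H(Gv)
```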
The Identity Transformation
A linear transformation $I_V \in \mathcal{L}(V)$ is called an identity if $I_Vv = v$ for all $v \in V$. When $V$ is clear from the context, the identity is simply written as $I$. The identity transformation for each vector space is unique. While identity transformations don't "do" anything by themselves, they are useful in two ways. First, they serve as a functional equivalent of the number $1$: transforming by $I$ and multiplying by $1$ both leave the input unchanged. Second, if some sequence of algebraic operations ends up producing $I$, you know you've probably done something interesting. Well, hopefully, anyway.
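Continuing in the same spirit, here is a quick sketch (plain Python; names ours) of the identity behaving like a functional $1$ under composition:

```python
# Composing with the identity changes nothing, just like multiplying by 1.
compose = lambda T, S: (lambda v: T(S(v)))   # (TS)(v) = T(Sv)

I = lambda v: v                              # identity: Iv = v
H = lambda v: (v[0] + v[1], v[1] - v[0])     # operator on R^2 from the text

v = (1.0, 2.0)
print(compose(H, I)(v) == H(v))   # True: (HI)v = Hv
print(compose(I, H)(v) == H(v))   # True: (IH)v = Hv
```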
Problems
Show that the following functions are linear transformations:
- Zero: $T(v) = 0$
- Identity: $T(v) = v$
- Negation: $T(v) = -v$
For each transform, we check both conditions:
- Zero:
  - Additivity: $T(v + w) = 0 = 0 + 0 = T(v) + T(w)$
  - Homogeneity: $T(cv) = 0 = c0 = cT(v)$
- Identity:
  - Additivity: $T(v + w) = v + w = T(v) + T(w)$
  - Homogeneity: $T(cv) = cv = cT(v)$
- Negation:
  - Additivity: $T(v + w) = -(v + w) = -v - w = T(v) + T(w)$
  - Homogeneity: $T(cv) = -(cv) = c(-v) = cT(v)$
Let $T$ be a linear transformation. Show that $T0 = 0$.
$ T0 = T(0 + 0) \\ T0 = T0 + T0 \\ T0 + (-(T0)) = (T0 + T0) + (-(T0)) \\ 0 = T0 $
Determine whether the following functions $T : \mathbb{R}^2 \rightarrow \mathbb{R}^2$ are members of $\mathcal{L}(\mathbb{R}^2, \mathbb{R}^2)$:
- $T_1(x, y) = [4x, 2y + 9x]$
- $T_2(x,y) = [xy, 2x]$
- $T_3(x,y) = [x + a, y + b]$, where $a$ and $b$ are fixed nonzero constants
For each transform, we check both conditions:
- $T_1(x, y) = [4x, 2y + 9x]$
  - Additivity:
    $ T_1(v + w) = [4(v_x + w_x), 2(v_y + w_y) + 9(v_x + w_x)] \\ T_1(v + w) = [4v_x + 4w_x, 2v_y + 2w_y + 9v_x + 9w_x] \\ T_1(v + w) = [4v_x, 2v_y + 9v_x] + [4w_x, 2w_y + 9w_x] \\ T_1(v + w) = T_1(v) + T_1(w) $
  - Homogeneity:
    $ T_1(cv) = [4cv_x, 2cv_y + 9cv_x] \\ T_1(cv) = [c(4v_x), c(2v_y + 9v_x)] \\ T_1(cv) = c[4v_x, 2v_y + 9v_x] \\ T_1(cv) = cT_1(v) $

  $T_1$ fulfills both the additivity and homogeneity properties, so $T_1 \in \mathcal{L}(\mathbb{R}^2, \mathbb{R}^2)$.
- $T_2(x,y) = [xy, 2x]$
  - Additivity:
    $ T_2(v + w) = [(v_x + w_x)(v_y + w_y), 2(v_x + w_x)] \\ T_2(v + w) = [v_xv_y + v_xw_y + v_yw_x + w_xw_y, 2v_x + 2w_x] \\ T_2(v + w) = [v_xv_y + w_xw_y, 2v_x + 2w_x] + [v_xw_y + v_yw_x, 0] \\ T_2(v + w) = [v_xv_y, 2v_x] + [w_xw_y, 2w_x] + [v_xw_y + v_yw_x, 0] \\ T_2(v + w) = T_2(v) + T_2(w) + [v_xw_y + v_yw_x, 0] \\ T_2(v + w) \neq T_2(v) + T_2(w) \text{ in general} $
  - Homogeneity:
    $ T_2(cv) = [(cv_x)(cv_y), 2(cv_x)] \\ T_2(cv) = [c^2(v_xv_y), c(2v_x)] \\ cT_2(v) = c[v_xv_y, 2v_x] \\ cT_2(v) = [c(v_xv_y), c(2v_x)] \\ T_2(cv) \neq cT_2(v) \text{ in general} $

  $T_2$ fulfills neither the additivity nor the homogeneity property, so $T_2 \notin \mathcal{L}(\mathbb{R}^2, \mathbb{R}^2)$.
- $T_3(x,y) = [x + a, y + b]$
  - Additivity:
    $ T_3(v + w) = [(v_x + w_x) + a, (v_y + w_y) + b] \\ T_3(v + w) = [(v_x + w_x) + a + (a - a), (v_y + w_y) + b + (b - b)] \\ T_3(v + w) = [(v_x + a) + (w_x + a) - a , (v_y + b) + (w_y + b) - b] \\ T_3(v + w) = [(v_x + a) + (w_x + a), (v_y + b) + (w_y + b)] + [-a, -b] \\ T_3(v + w) = [v_x + a, v_y + b] + [w_x + a, w_y + b] + [-a, -b] \\ T_3(v + w) = T_3(v) + T_3(w) + [-a, -b] \\ T_3(v + w) \neq T_3(v) + T_3(w) $
  - Homogeneity:
    $ T_3(cv) = [cv_x + a, cv_y + b] \\ cT_3(v) = c[v_x + a, v_y + b] \\ cT_3(v) = [c(v_x + a), c(v_y + b)] \\ cT_3(v) = [cv_x + ca, cv_y + cb] \\ T_3(cv) \neq cT_3(v) \text{ in general} $

  $T_3$ fulfills neither the additivity nor the homogeneity property, so $T_3 \notin \mathcal{L}(\mathbb{R}^2, \mathbb{R}^2)$.
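A quick numeric sketch (NumPy; the constants for $T_3$, the seed, and the test vectors are our own choices) is consistent with these conclusions: $T_1$ passes random spot checks, while $T_2$ and $T_3$ produce counterexamples to both properties.

```python
import numpy as np

# Numeric spot checks of the three candidate maps -- a sketch, not a proof:
# random vectors can exhibit counterexamples but cannot rule them out.
T1 = lambda v: np.array([4*v[0], 2*v[1] + 9*v[0]])
T2 = lambda v: np.array([v[0]*v[1], 2*v[0]])
a, b = 1.0, 2.0                               # nonzero constants for T3 (ours)
T3 = lambda v: np.array([v[0] + a, v[1] + b])

rng = np.random.default_rng(1)
v, w = rng.standard_normal(2), rng.standard_normal(2)
c = 2.0

for name, T in [("T1", T1), ("T2", T2), ("T3", T3)]:
    additive = np.allclose(T(v + w), T(v) + T(w))
    homogeneous = np.allclose(T(c * v), c * T(v))
    print(name, additive, homogeneous)
# T1 True True     -- consistent with T1 being linear
# T2 False False   -- counterexamples to both properties
# T3 False False
```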
Unique Transformation of Basis: Let $A = \{a_1, \ldots, a_n\}$ be a basis for $V$ and $B = \{b_1, \ldots, b_n\}$ be a basis for $W$. Show that there is a unique linear transform $T$ such that $Ta_i = b_i$.
Hint: Map coefficients for linear combinations of vectors in $A$ to vectors in $B$.
First we must show that such a function $T$ exists. Then we must show that it is unique.
Let $v \in V$. Because $A$ is a basis, we can write $v$ as a unique linear combination of basis elements:
$v = c_1a_1 + \ldots + c_na_n$
Now define $T$ in terms of the coefficients:
$T(v) = c_1b_1 + \ldots + c_nb_n$
That is, we take the unique coefficients for vectors in $A$ that form $v$ and multiply them instead by the corresponding basis elements in $B$. From here, we can confirm that $Ta_i = b_i$, since the coefficient $c_i = 1$ for $a_i$, and the other coefficients all equal $0$.
Next, we check that $T$ is a linear transformation. First we check additivity. Let $u, v \in V$. Then $u = c_1a_1 + \ldots + c_na_n$ and $v = d_1a_1 + \ldots + d_na_n$ for some unique coefficients $c_i$ and $d_i$. Then
$ u + v = (c_1a_1 + \ldots + c_na_n) + (d_1a_1 + \ldots + d_na_n) \\ u + v = (c_1 + d_1)a_1 + \ldots + (c_n + d_n)a_n \\ T(u + v) = (c_1 + d_1)b_1 + \ldots + (c_n + d_n)b_n \\ T(u + v) = c_1b_1 + \ldots + c_nb_n + d_1b_1 + \ldots + d_nb_n \\ T(u + v) = Tu + Tv $
Thus $T$ is additive. Next we check homogeneity:
$ dv = d(c_1a_1 + \ldots + c_na_n) \\ dv = dc_1a_1 + \ldots + dc_na_n \\ T(dv) = dc_1b_1 + \ldots + dc_nb_n \\ T(dv) = d(c_1b_1 + \ldots + c_nb_n) \\ T(dv) = d(Tv) $
Thus $T$ is homogeneous. Therefore $T$ is a linear transformation.
To show uniqueness, we must show that there isn't any other linear transformation $H$ such that $Ha_i = b_i$. So assume $H : V \rightarrow W$ is a linear transformation such that $Ha_i = b_i$, and let $c_1, \ldots, c_n$ be any coefficients. By homogeneity, we see that $H(c_ia_i) = c_ib_i$. Likewise, by additivity, we see that
$H(c_1a_1 + \ldots + c_na_n) = c_1b_1 + \ldots + c_nb_n$
But this is exactly our construction of $T$, so any such linear transform $H$ must equal $T$ after all. In other words, the requirement that $Ha_i = b_i$ implies a unique output for all other vectors in the domain of $H$.
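Here is a small sketch of this construction (NumPy; the bases and names are ours): expand $v$ in the basis $A$, then recombine the resulting coefficients against the vectors of $B$.

```python
import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])    # columns a_1, a_2: a basis of V = R^2
B = np.array([[2.0, 0.0],
              [1.0, 3.0]])    # columns b_1, b_2: a basis of W = R^2

def T(v):
    """T(v) = c_1 b_1 + ... + c_n b_n, where v = c_1 a_1 + ... + c_n a_n."""
    c = np.linalg.solve(A, v)   # unique coefficients of v in the basis A
    return B @ c                # recombine them against b_1, ..., b_n

# T maps each basis vector a_i to b_i, as the construction requires.
print(np.allclose(T(A[:, 0]), B[:, 0]))   # True
print(np.allclose(T(A[:, 1]), B[:, 1]))   # True
```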
Let $T : V \rightarrow W$ be a linear transformation and let $T(v) = w$. Show that $T(-v) = -w$.
$ T(-v) = T((-1)v) \\ T(-v) = (-1)(Tv) \\ T(-v) = (-1)w \\ T(-v) = -w $
Show that $\mathcal{L}(V, W)$ is a vector space.
To show that $\mathcal{L}(V, W)$ is a vector space, we must show that it meets all of the definitional requirements. Let $R, S, T \in \mathcal{L}(V, W)$ and let $c, d \in F$ be scalars.
- Closure under Vector Addition: To show that $S + T \in \mathcal{L}(V,W)$, we must show that it is a linear transformation.
  - Additivity:
    $ (S + T)(v + w) = S(v + w) + T(v + w) \\ (S + T)(v + w) = (Sv + Sw) + (Tv + Tw) \\ (S + T)(v + w) = (Sv + Tv) + (Sw + Tw) \\ (S + T)(v + w) = (S + T)v + (S + T)w $
  - Homogeneity:
    $ (S + T)(cv) = S(cv) + T(cv) \\ (S + T)(cv) = c(Sv) + c(Tv) \\ (S + T)(cv) = c(Sv + Tv) \\ (S + T)(cv) = c((S + T)(v)) $
- Associativity of Addition:
  $ (R + (S + T))(v) = Rv + (S + T)(v) \\ (R + (S + T))(v) = Rv + (Sv + Tv) \\ (R + (S + T))(v) = (Rv + Sv) + Tv \\ (R + (S + T))(v) = (R + S)(v) + Tv \\ (R + (S + T))(v) = ((R + S) + T)(v) $
- Commutativity of Addition:
  $ (S + T)(v) = Sv + Tv \\ (S + T)(v) = Tv + Sv \\ (S + T)(v) = (T + S)v $
- Additive Identity: Define $Z : V \rightarrow W$ as $Z(v) = 0$. We first check that $Z$ is a linear transformation:
  - Additivity:
    $ Z(v + w) = 0 \\ Z(v + w) = 0 + 0 \\ Z(v + w) = Zv + Zw $
  - Homogeneity:
    $ Z(cv) = 0 \\ Z(cv) = c0 \\ Z(cv) = c(Zv) $

  From here, we simply check that $Z$ behaves like an additive identity:
  $ (T + Z)(v) = Tv + Zv \\ (T + Z)(v) = Tv + 0 \\ (T + Z)(v) = Tv $
- Additive Inverse: For $T \in \mathcal{L}(V, W)$, define $-T$ as $(-1)T$. Then
  $ (T + (-T))(v) = Tv + (-T)v \\ (T + (-T))(v) = Tv + ((-1)T)v \\ (T + (-T))(v) = Tv + (-1)(Tv) \\ (T + (-T))(v) = Tv - Tv \\ (T + (-T))(v) = 0 $
  Since this holds for every $v$, we have $T + (-T) = Z$, the additive identity.
- Closure under Scalar Multiplication: To show that $cT \in \mathcal{L}(V, W)$, we must show that it is a linear transformation:
  - Additivity:
    $ (cT)(v + w) = c(T(v + w)) \\ (cT)(v + w) = c(Tv + Tw) \\ (cT)(v + w) = c(Tv) + c(Tw) \\ (cT)(v + w) = (cT)v + (cT)w $
  - Homogeneity:
    $ (cT)(dv) = c(T(dv)) \\ (cT)(dv) = c(d(Tv)) \\ (cT)(dv) = d(c(Tv)) \\ (cT)(dv) = d((cT)(v)) $
- Scalar Multiplicative Identity: $1 \in F$ by definition. Then $(1T)(v) = 1(Tv) = Tv$.
- Distributivity 1:
  $ (c(S + T))(v) = c((S + T)(v)) \\ (c(S + T))(v) = c(Sv + Tv) \\ (c(S + T))(v) = c(Sv) + c(Tv) \\ (c(S + T))(v) = (cS)v + (cT)v \\ (c(S + T))(v) = (cS + cT)(v) $
- Distributivity 2:
  $ ((c + d)T)(v) = (c + d)(Tv) \\ ((c + d)T)(v) = c(Tv) + d(Tv) \\ ((c + d)T)(v) = (cT)(v) + (dT)(v) \\ ((c + d)T)(v) = (cT + dT)(v) $
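As a numeric companion to the closure arguments above, here is a small sketch (NumPy; the operators and helper names are ours) that spot-checks that sums and scalar multiples of linear maps still satisfy additivity and homogeneity:

```python
import numpy as np

# Spot-checking closure -- a sketch, not a proof: if S and T are linear
# maps R^2 -> R^2, then S + T and cT should again behave like linear maps.
S = lambda v: np.array([v[0] + v[1], v[1] - v[0]])
T = lambda v: np.array([2*v[0], 3*v[1]])

add = lambda S, T: (lambda v: S(v) + T(v))    # (S + T)(v) = Sv + Tv
scale = lambda c, T: (lambda v: c * T(v))     # (cT)(v) = c(Tv)

def looks_linear(F, trials=100, rng=np.random.default_rng(2)):
    """Random spot check of additivity and homogeneity."""
    for _ in range(trials):
        u, v = rng.standard_normal(2), rng.standard_normal(2)
        c = rng.standard_normal()
        if not (np.allclose(F(u + v), F(u) + F(v)) and
                np.allclose(F(c * v), c * F(v))):
            return False
    return True

print(looks_linear(add(S, T)))       # True
print(looks_linear(scale(-3.0, T)))  # True
```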
Let $V$ be a $1$-dimensional vector space. Show that any operator $T \in \mathcal{L}(V)$ is equivalent to multiplication by a scalar. In other words, show that there is some $c \in F$ such that for all $v \in V$, $Tv = cv$.
Let $B$ be a basis of $V$. Because $\text{dim}(V) = 1$, there is only one vector in $B$, call it $b$. Since $Tb \in V$, we can write $Tb$ as a scalar multiple of $b$ (i.e. a linear combination involving just $b$), call it $cb$. By the same logic, every vector $v \in V$ can be written as a scalar multiple of $b$, call it $c_vb$. By homogeneity we can then show that $Tv = cv$:
$ Tv = T(c_vb) \\ Tv = c_vTb \\ Tv = c_v(cb) \\ Tv = c(c_vb) \\ Tv = cv $
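As a tiny illustration of this argument (a sketch; the operator and names are ours), take $V = \mathbb{R}$ with basis vector $b = 1$; the scalar $c$ is just the coordinate of $Tb$ in that basis.

```python
import numpy as np

# A sketch of the 1-dimensional case: V = R as a vector space over R,
# with basis vector b = 1 and some linear operator T on V.
T = lambda v: v + v + v          # a linear operator on R (here, tripling)

b = 1.0
c = T(b)                         # Tb = cb with b = 1, so c is just T(1)

rng = np.random.default_rng(3)
v = rng.standard_normal()
print(np.isclose(T(v), c * v))   # True: Tv = cv for every v in V
```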