The Matrix of a Linear Transformation
Let V be an n-dimensional vector space over the field F and let W be an m-dimensional vector space over the same field F.
Let \( B=\{\alpha_1, \alpha_2, … , \alpha_n\} \) be an ordered basis for V and \( B'=\{\beta_1, \beta_2, … , \beta_m\} \) be an ordered basis for W.
Let T be a linear transformation from V into W. Then T is determined by its action on each basis vector \( \alpha_j \).
Each \( T(\alpha_j) \) can be expressed as
\( T(\alpha_j)=\displaystyle\sum_{i=1}^{m}A_{ij}\beta_i \ \ \ \ (1\le j \le n) \)
where the scalars \( A_{1j}, A_{2j}, … , A_{mj} \) are the coordinates of \( T(\alpha_j) \) relative to the ordered basis B'.
The \( m\times n \) matrix A defined by \( A(i, j)=A_{ij} \ (1\le i \le m, \ 1\le j \le n) \) is called the matrix of T relative to the ordered bases B and B'.
Now, if \( \alpha \in V \), then
\( \alpha=a_1\alpha_1+a_2\alpha_2+ … +a_n\alpha_n \)
\( =\displaystyle\sum_{j=1}^{n}a_j\alpha_j \)
\( \implies T(\alpha)=T\left(\displaystyle\sum_{j=1}^{n}a_j\alpha_j\right) \)
\( =\displaystyle\sum_{j=1}^{n}a_jT(\alpha_j) \)
\( \implies T(\alpha)=\displaystyle\sum_{j=1}^{n}\displaystyle\sum_{i=1}^{m}a_jA_{ij}\beta_i \ \ … \ \ (1) \)
Conversely, if A is any \( m \times n \) matrix over the field F, then (1) defines a linear transformation T from V into W whose matrix relative to B and B' is A.
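The construction above can be carried out numerically. The following sketch is not from the text: the map T, the choice \( V=\mathbb{R}^3 \), \( W=\mathbb{R}^2 \), and the bases are illustrative assumptions. It builds the matrix of T column by column by solving for the coordinates of each \( T(\alpha_j) \) in the basis B'.

```python
import numpy as np

# Illustrative sketch: V = R^3, W = R^2, T(x, y, z) = (x + y, 2z).
def T(v):
    x, y, z = v
    return np.array([x + y, 2.0 * z])

B = [np.array([1.0, 0.0, 0.0]),          # alpha_1
     np.array([0.0, 1.0, 0.0]),          # alpha_2
     np.array([0.0, 0.0, 1.0])]          # alpha_3
B_prime = [np.array([1.0, 0.0]),         # beta_1
           np.array([0.0, 1.0])]         # beta_2

# Column j of A holds the coordinates of T(alpha_j) in B', i.e. it solves
# [beta_1 ... beta_m] @ col = T(alpha_j).
M = np.column_stack(B_prime)
A = np.column_stack([np.linalg.solve(M, T(alpha)) for alpha in B])
print(A)            # the 2 x 3 matrix of T relative to B and B'
```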
Definition (1)
Let V be a vector space over the field F. A linear transformation from V into V is called a linear operator defined on V.
Example (1)
Let F be a field and let T be the linear operator on \( F^2 \) defined by \( T(x_1, x_2)=(x_1, 0) \). Find the matrix of T w.r.t. the standard ordered basis for \( F^2 \).
Solution
Let \( B=\{e_1, e_2\} \) be the standard ordered basis for \( F^2 \), where \( e_1=(1, 0) \) and \( e_2=(0,1) \).
Then,
\( T(e_1)=T(1, 0)=(1, 0)=1.e_1+0.e_2 \)
\( T(e_2)=T(0, 1)=(0, 0)=0.e_1+0.e_2 \)
Therefore, the matrix of T w.r.t. B is \( [T]_B=\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} \)
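As a quick numerical check of Example (1), assuming \( F=\mathbb{R} \) so the arithmetic can be done in floating point: since B is the standard basis, the columns of \( [T]_B \) are simply the images \( T(e_1) \) and \( T(e_2) \).

```python
import numpy as np

# T(x1, x2) = (x1, 0); the columns of [T]_B are T(e_1) and T(e_2).
T = lambda v: np.array([v[0], 0.0])

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print(np.column_stack([T(e1), T(e2)]))
# [[1. 0.]
#  [0. 0.]]
```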
Example (2)
Let V be the space of all polynomial functions on \( \mathbb{R} \) of the form \( f(x)=a_0+a_1x+a_2x^2+a_3x^3 \). Let D be the differentiation operator on V and let \( B=\{f_1, f_2, f_3, f_4\} \) be the ordered basis with \( f_j(x)=x^{j-1} \). Find \( [D]_B \).
Solution
\( D(f_1(x))=D(x^0)=D(1)=0=0.f_1+0.f_2+0.f_3+0.f_4 \)
\( D(f_2(x))=D(x^1)=D(x)=1=1.f_1+0.f_2+0.f_3+0.f_4 \)
\( D(f_3(x))=D(x^2)=2x=0.f_1+2.f_2+0.f_3+0.f_4 \)
\( D(f_4(x))=D(x^3)=3x^2=0.f_1+0.f_2+3.f_3+0.f_4 \)
\( [D]_B=\begin{bmatrix} 0 & 1 & 0 & 0 \\ 0 & 0 & 2 & 0 \\ 0 & 0 & 0 & 3 \\ 0 & 0 & 0 & 0 \end{bmatrix} \)
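Example (2) can also be verified symbolically. The sketch below uses sympy (an illustrative choice, not mentioned in the text): it reads off the coordinates of each derivative \( D(f_j) \) in the basis \( \{1, x, x^2, x^3\} \) and assembles them as columns.

```python
import sympy as sp

x = sp.symbols('x')
basis = [x**j for j in range(4)]          # f_1 = 1, f_2 = x, f_3 = x^2, f_4 = x^3

# Column j of [D]_B = coordinates of D(f_j) = f_j' in the same basis.
cols = [[sp.diff(f, x).expand().coeff(x, i) for i in range(4)] for f in basis]
D_B = sp.Matrix(cols).T                   # transpose turns coordinate lists into columns
print(D_B)
# Matrix([[0, 1, 0, 0], [0, 0, 2, 0], [0, 0, 0, 3], [0, 0, 0, 0]])
```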
Definition (2)
Let A and B be \( n \times n \) matrices over the field F. Then B is said to be similar to A if there exists an invertible \( n \times n \) matrix P over F such that \( B=P^{-1}AP \).
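A small numerical illustration of this definition (the matrices A and P below are arbitrary choices, not from the text): forming \( B=P^{-1}AP \) produces a matrix similar to A, and similar matrices share, for example, the trace and the eigenvalues, which is easy to check.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
P = np.array([[1.0, 1.0],
              [1.0, 2.0]])          # det P = 1, so P is invertible

B = np.linalg.inv(P) @ A @ P        # B is similar to A

print(np.trace(A), np.trace(B))                     # both equal 5 (up to round-off)
print(np.linalg.eigvals(A), np.linalg.eigvals(B))   # {2, 3} in both cases (order may differ)
```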
Linear Functionals
Let V be a vector space over the field F. Then a linear transformation f from V into F is called a linear functional, i.e. \( f:V \rightarrow F \) is a linear functional if for any vectors \( \alpha, \beta \in V \ \text{and} \ c \in F \),
\( f(c\alpha+\beta)=cf(\alpha)+f(\beta) \)
Example (3)
Let F be a field and, for fixed scalars \( a_1, a_2, … , a_n \in F \), define f on \( F^n \) by
\( f(x_1, x_2, … , x_n)=a_1x_1+a_2x_2+ … +a_nx_n. \)
Then show that f is a linear functional on \( F^n \).
Solution
Let \( x=(x_1, x_2, …, x_n), \ y=(y_1, y_2, …, y_n) \in F^n \) and let \( c \in F \).
Consider,
\( f(cx+y) \)
\( =f(c(x_1, x_2, …, x_n)+(y_1, y_2, …, y_n)) \)
\( =f((cx_1, cx_2, …, cx_n)+(y_1, y_2, …, y_n)) \)
\( =f(cx_1+y_1, cx_2+y_2, …, cx_n+y_n) \)
\( =a_1(cx_1+y_1)+a_2(cx_2+y_2)+ … +a_n(cx_n+y_n) \)
\( =ca_1x_1+a_1y_1+ca_2x_2+a_2y_2+ … +ca_nx_n+a_ny_n \)
\( =(ca_1x_1+ca_2x_2+…+ca_nx_n)+(a_1y_1+a_2y_2+ … +a_ny_n) \)
\( =c(a_1x_1+a_2x_2+…+a_nx_n)+(a_1y_1+a_2y_2+ … +a_ny_n) \)
\( =cf(x_1, x_2, …, x_n)+f(y_1, y_2, …, y_n) \)
\( \therefore f(cx+y)=cf(x)+f(y) \)
\( \therefore \) f is a linear functional on \( F^n \).
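A quick numerical check of Example (3), assuming \( F=\mathbb{R} \), \( n=3 \), and illustrative values for the scalars \( a_i \), the vectors x, y, and the scalar c:

```python
import numpy as np

a = np.array([2.0, -1.0, 3.0])            # the fixed scalars a_1, a_2, a_3
f = lambda v: float(a @ v)                # f(x_1, x_2, x_3) = a_1 x_1 + a_2 x_2 + a_3 x_3

x = np.array([1.0, 4.0, 0.5])
y = np.array([-2.0, 0.0, 1.0])
c = 5.0

print(f(c * x + y), c * f(x) + f(y))      # both equal -3.5, as the computation above shows
```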
Note
The matrix of f w.r.t. the standard ordered basis \( B=\{e_1, e_2, … , e_n\} \) for \( F^n \) (where \( e_i=(0, … , 0, 1, … , 0) \) with 1 in the \( i^{th} \) place) and the basis \( B'=\{1\} \) for F is the \( 1\times n \) matrix \( [a_1 \ a_2 \ … \ a_n] \).
Here \( f(e_j)=a_j \), for each \( j \ (1\le j\le n) \)
Now, \( f(x_1, x_2, …, x_n)=f\left(\displaystyle\sum_{j=1}^{n}x_je_j\right) \)
\( =\displaystyle\sum_{j=1}^{n}x_jf(e_j) \)
\( =\displaystyle\sum_{j=1}^{n}a_jx_j \)
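Numerically, with the same illustrative scalars as in the previous sketch and \( F=\mathbb{R} \): the row matrix \( [a_1 \ a_2 \ … \ a_n] \) reproduces f by matrix multiplication, and applying it to the standard basis vectors recovers the entries \( f(e_j)=a_j \).

```python
import numpy as np

a = np.array([2.0, -1.0, 3.0])
row = a.reshape(1, -1)                    # the 1 x n matrix of f w.r.t. B and B' = {1}

x = np.array([1.0, 4.0, 0.5])
print((row @ x)[0])                       # f(x) = 2*1 - 1*4 + 3*0.5 = -0.5

print([float((row @ e)[0]) for e in np.eye(3)])   # f(e_j) = a_j : [2.0, -1.0, 3.0]
```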
Definition (3)
Let A be an \( n\times n \) matrix over the field F, i.e. \( A=(A_{ij}) \). Then \( trace(A)=A_{11}+A_{22}+ … +A_{nn} \).
Example (4)
Define \( tr:F^{n\times n}\rightarrow F \) by \( tr(A)=A_{11}+A_{22}+ … +A_{nn} \), where \( F^{n\times n} \) is the set of all \( n\times n \) matrices over F. Then show that tr is a linear functional.
Solution
Let \( A, B \in F^{n\times n} \) and \( c\in F \).
\( tr(cA+B)=\displaystyle\sum_{i=1}^{n}(cA+B)_{ii} \)
\( =\displaystyle\sum_{i=1}^{n}\left(cA_{ii}+B_{ii}\right) \)
\( =c\displaystyle\sum_{i=1}^{n}A_{ii}+\displaystyle\sum_{i=1}^{n}B_{ii} \)
\( =c \ tr(A)+tr(B) \)
\( \therefore \) tr is a linear functional on \( F^{n\times n} \).
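A numerical check of Example (4), assuming \( F=\mathbb{R} \) and two arbitrary \( 2\times 2 \) matrices (illustrative values only):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [5.0, -2.0]])
c = 3.0

print(np.trace(c * A + B), c * np.trace(A) + np.trace(B))   # both 13.0
```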
Note
If V is a vector space over the field F, the set of all linear functionals \( f:V\rightarrow F \) is denoted by \( L(V, F) \), and \( L(V, F) \) is a vector space over the field F w.r.t. the following operations:
(i) \( (f+g)(\alpha)=f(\alpha)+g(\alpha) \)
(ii) \( (cf)(\alpha)=c[f(\alpha)] \)
\( \forall \ \alpha \in V, \ c\in F \) and \( f, g \in L(V, F) \)
If V is a finite dimensional vector space over the field F, then \( dim \ V=dim \ L(V, F) \).
The space of all linear functionals, i.e. \( L(V, F) \), is denoted by \( V^* \). Thus \( dim \ V^* =dim \ V \) if V is a finite dimensional vector space. Let \( B=\{\alpha_1, \alpha_2, … , \alpha_n\} \) be an ordered basis for an n-dimensional vector space V over the field F. Then we can determine n distinct linear functionals \( f_1, f_2, …, f_n \) on V from B such that
\( f_i(\alpha_j)=\delta_{ij}=\begin{cases} 1, & \text{if} \ i=j \\ 0, & \text{if} \ i\ne j \end{cases} \)
Claim: \( \{f_1, f_2, …, f_n\} \) is a linearly independent set in \( V^* \).
Consider, \( f=\sum c_if_i \)
\( \implies f(\alpha_j)=\left(\sum c_if_i\right)(\alpha_j) \)
\( =\sum c_if_i(\alpha_j) \)
\( =\sum c_i\delta_{ij} \)
\( =c_j \)
Then \( f(\alpha_j)=c_j \) , for each j.
If \( f=0 \) then \( c_j=0 \) for all j.
Hence \( \{f_1, f_2, …, f_n\} \) is linearly independent.
Since \( dim \ V=n=dim \ V^* \)
\( \therefore B^*= \{f_1, f_2, …, f_n\} \) is a basis for \( V^* \). This particular basis for \( V^* \) is called the dual basis of B.
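The dual basis can be computed explicitly in coordinates. In the sketch below the choice \( V=\mathbb{R}^3 \) and the particular basis B (the columns of M) are illustrative assumptions: each \( f_i \) is represented by the i-th row of \( M^{-1} \), which is exactly the condition \( f_i(\alpha_j)=\delta_{ij} \).

```python
import numpy as np

M = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])       # columns are alpha_1, alpha_2, alpha_3

dual_rows = np.linalg.inv(M)          # row i of M^{-1} represents f_i

# f_i(alpha_j) = delta_ij  <=>  M^{-1} M = I
print(np.round(dual_rows @ M, 10))    # the 3 x 3 identity matrix
```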