% DEFINE some information that will be populated throughout the course notes.
\def \coursename {Advanced Linear Algebra}
\def \coursecode {MATH 3221}
\def \courseterm {Fall 2020}
\def \instructorname {Nathan Johnston}
% END DEFINITIONS

% IMPORT the course note formatting and templates
\input{course_notes_template}
% END IMPORT

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\setcounter{chapter}{3} % Set to one less than the week number
\chapter{Isomorphisms and Properties\\ of Linear Transformations}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

{\large This week we will learn about:
\begin{itemize}
\item Invertibility of linear transformations,
\item Isomorphisms,
\item Properties of linear transformations, and
\item Non-integer powers of linear transformations.
\end{itemize}\bigskip\bigskip

\noindent Extra reading and watching:
\begin{itemize}
\item Sections 1.2.4 and 1.3.1 in the textbook
\item Lecture videos \href{https://www.youtube.com/watch?v=J2IbR6FAXG4&list=PLOAf1ViVP13jdhvy-wVS7aR02xnDxueuL&index=14}{13}, \href{https://www.youtube.com/watch?v=b7ADlJXkEe0&list=PLOAf1ViVP13jdhvy-wVS7aR02xnDxueuL&index=15}{14}, \href{https://www.youtube.com/watch?v=R_WlhwuqMJ0&list=PLOAf1ViVP13jdhvy-wVS7aR02xnDxueuL&index=16}{15}, and \href{https://www.youtube.com/watch?v=RQazaNdVLqI&list=PLOAf1ViVP13jdhvy-wVS7aR02xnDxueuL&index=17}{16} on YouTube
\item \href{https://en.wikibooks.org/wiki/Linear_Algebra/Definition_and_Examples_of_Isomorphisms}{Definition and Examples of Isomorphisms} at WikiBooks
\item \href{https://en.wikipedia.org/wiki/Isomorphism}{Isomorphism} at Wikipedia (be slightly careful -- this page talks about isomorphisms in a broader context than just linear algebra)
\end{itemize}\bigskip\bigskip

\noindent Extra textbook problems:
\begin{itemize}
\item[$\star$] 1.2.4(i,j), 1.3.1, 1.3.4(a--c), 1.3.5
\item[$\phantom{\star}\star\star$] 1.2.10, 1.2.13--1.2.15, 1.2.17, 1.2.24, 1.2.25, 1.3.6
\item[$\star\star\star$] 1.2.19, 1.2.21, 1.2.33
\item[$\skull$] none this week
\end{itemize}}

\newpage

This week, we look at several important properties of linear transformations that you already saw for matrices back in introductory linear algebra. Thanks to standard matrices, all of these properties can be computed or determined using methods that we are already familiar with.\\

\section*{Invertibility of Linear Transformations}

A linear transformation $T : \V \rightarrow \W$ is called \textbf{invertible} if there exists a linear transformation $T^{-1} : \W \rightarrow \V$ such that

\horlines{3}\vspace*{-0.5cm}

% $T^{-1}(T(\v)) = \v$ for all $\v \in \V$ and $T(T^{-1}(\w)) = \w$ for all $\w \in \W$.
% Equivalently, $T^{-1} \circ T = I_{\V}$ and $T \circ T^{-1} = I_{\W}$.

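For example, the transposition map $T : \M_2 \rightarrow \M_2$ defined by $T(A) = A^T$ is invertible, and it is its own inverse, since
\[
(T \circ T)(A) = \big(A^T\big)^T = A \quad \text{for all } A \in \M_2, \quad \text{so} \quad T^{-1} = T.
\]
We will see this particular map again later this week, when we compute its eigenvalues and find a square root of it.
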
The following theorem shows us that we can find the inverse of a linear transformation (if it exists) simply by inverting its standard matrix.

\begin{theorem}[Invertibility of Linear Transformations]
Let $T : \V \rightarrow \W$ be a linear transformation between $n$-dimensional vector spaces $\V$ and $\W$, which have bases $B$ and $D$, respectively. Then $T$ is invertible if and only if the matrix $[T]_{D \leftarrow B}$ is invertible. Furthermore,
\[
([T]_{D\leftarrow B})^{-1} = \big[T^{-1}\big]_{B\leftarrow D}.
\]
\end{theorem}

\begin{proof}
For the ``only if'' direction, note that if $T$ is invertible then we have

\horlines{7}\vspace*{-1.3cm}

%\[
%I = [I_{\V}]_{B} = \big[T^{-1} \circ T\big]_{B} = \big[T^{-1}\big]_{B \leftarrow D}[T]_{D \leftarrow B}.
%\]
%Since $[T^{-1}]_{B \leftarrow D}$ and $[T]_{D \leftarrow B}$ multiply to the identity matrix, it follows that they are inverses of each other.\smallskip

% From here, just say that the ``if'' direction is similar.

%For the ``if'' direction,\marginnote{We show in Exercise~\ref{exer:abstract_basis_diff_dim_not_inv} that if $\dim(\V) \neq \dim(\W)$ then $T$ cannot possibly be invertible.} suppose that $[T]_{D \leftarrow B}$ is invertible, with inverse matrix $A$. Then there is some linear transformation $S : \W \rightarrow \V$ such that $A = [S]_{B \leftarrow D}$, so for all $\v \in \V$ we have
%\[
%[\v]_B = A[T]_{D \leftarrow B}[\v]_B = [S]_{B \leftarrow D}[T]_{D \leftarrow B}[\v]_B = [(S \circ T)(\v)]_B.
%\]
%This implies $[S \circ T]_{B} = I$, so $S \circ T = I_{\V}$, and a similar argument shows that $T \circ S = I_{\W}$. It follows that $T$ is invertible, and its inverse is $S$.
\end{proof}

\newpage

\exx[14]{Compute $\displaystyle \int x^2 e^{3x} \, dx$. \hfill {\color{gray}(wait, what course is this?)}}

% Tough integral: integration by parts twice.
% Start by letting V be the span of e^{3x}, xe^{3x}, and x^2e^{3x}. This is a 3-dimensional vector space, and those three vectors form a basis of it.
% Matrix of D with respect to this basis is [3 1 0;0 3 2;0 0 3].
% Inverse of this matrix is [1/3 -1/9 2/27; 0 1/3 -2/9; 0 0 1/3].
% Integration is the inverse of differentiation. So the integral is D^{-1}*[0;0;1] = [2/27; -2/9; 1/3], which is (2/27)*e^{3x} - (2/9)*x*e^{3x} + (1/3)*x^2*e^{3x}.
% (plus constant)

\noindent \textcolor{red}{\textbf{Be careful:}} Differentiation is usually not an invertible transformation (why not?). The only reason it was invertible in the previous example is that we chose the vector space $\V$ so that it does not contain any constant functions.\\

All of our methods of checking invertibility of matrices carry over straightforwardly to linear transformations on finite-dimensional vector spaces. For example...

\horlines{3}

% if $\V$ and $\W$ are finite-dimensional and dim(V) = dim(W) then T : V \rightarrow W is invertible if and only if the only solution to T(v) = 0 is v = 0.

\newpage

\section*{Isomorphisms}

Recall that every finite-dimensional vector space $\V$ has a basis $B$, and we can use that basis to represent a vector $\v \in \V$ as a coordinate vector $[\v]_B \in \mathbb{F}^n$, where $\mathbb{F}$ is the ground field. We used this correspondence between $\V$ and $\mathbb{F}^n$ to motivate the idea that...

\horlines{2}

% these vector spaces are ``the same'': to do a linear algebraic calculation in $\V$, we can instead do the corresponding calculation on coordinate vectors in $\mathbb{F}^n$.\smallskip

We now make this idea of vector spaces being ``the same'' a bit more precise and clarify under exactly which conditions this ``sameness'' happens.

% DEFINITION: Isomorphisms
\begin{definition}[Isomorphisms]\label{defn:isomorphism}\index{isomorphism}\index{isomorphic}
Suppose $\V$ and $\W$ are vector spaces over the same field. We say that $\V$ and $\W$ are \textbf{isomorphic}, denoted by $\V \cong \W$, if there exists an invertible linear transformation $T : \V \rightarrow \W$ (called an \textbf{isomorphism} from $\V$ to $\W$).
\end{definition}
% END DEFINITION

The idea behind this definition is that if $\V$ and $\W$ are isomorphic then they have the same structure as each other---the only difference is the label given to their members ($\v$ for the members of $\V$ and $T(\v)$ for the members of $\W$).

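For instance, consider the space $\mathcal{P}^1$ of polynomials of degree at most $1$ and the linear transformation $T : \mathcal{P}^1 \rightarrow \R^2$ defined by $T(a + bx) = (a,b)$. Doing a calculation in $\mathcal{P}^1$ and then relabeling via $T$ gives the same result as relabeling first and then doing the corresponding calculation in $\R^2$:
\[
T\big((1 + 2x) + (3 + x)\big) = T(4 + 3x) = (4,3) = (1,2) + (3,1) = T(1 + 2x) + T(3 + x).
\]
Since $T$ is invertible (its inverse sends $(a,b)$ back to $a + bx$), it is an isomorphism, so $\mathcal{P}^1 \cong \R^2$.
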
\exx[3]{Show that $\M_{1,n}$ and $\M_{n,1}$ are isomorphic.}

Similarly, $\M_{1,n}$ and $\M_{n,1}$ are both isomorphic to...

\horlines{4}

% F^n.
% This is why we treat row vectors (M_{1,n}), column vectors (M_{n,1}), and vectors without a shape (F^n) the same.

\newpage

\exx[5]{Show that $\mathcal{P}^3$ and $\R^4$ are isomorphic.}

% Given obvious linear transformation, show it's invertible.

More generally, we have the following theorem that pins down the idea that every finite-dimensional vector space ``behaves like'' $\mathbb{F}^n$:

% THEOREM: Isomorphism via coordinate vectors
\begin{theorem}[Isomorphisms of Finite-Dimensional Vector Spaces]\label{thm:isomorphism_via_coord_vecs}
Suppose $\V$ is an $n$-dimensional vector space over a field $\mathbb{F}$. Then $\V \cong \mathbb{F}^n$.
\end{theorem}
% END THEOREM

\begin{proof}
Pick some basis $B$ of $\V$ and consider the function $T : \V \rightarrow \mathbb{F}^n$ defined by...

\horlines{7}\vspace*{-1.3cm}

%$T(\v) = [\v]_B$.
% We noted last week that [v+w]_B = [v]_B + [w]_B and similarly for scalar mult, so it is a linear transformation.
% Just need to show that it's invertible:
% To this end, notice that dim(V) = dim(F^n) = n, so we just need to check that T(v) = 0 implies v = 0. But it does: if $[\v]_B = \0$ then $\v = 0\v_1 + \cdots + 0\v_n = \0$.
\end{proof}

It is straightforward to check that if $\V \cong \W$ and $\W \cong \mathcal{X}$ then $\V \cong \mathcal{X}$. We thus get the following immediate corollary of the above theorem:

\horlines{2}

% Two finite-dimensional vector spaces over the same field are isomorphic if and only if they have the same dimension.

\newpage

\section*{Properties of Linear Transformations}

Now that we know we can think of arbitrary linear transformations (on finite-dimensional vector spaces) as matrices, we can apply all of our machinery from the previous course to them. For example, we can talk about the eigenvalues, range, null space, and rank of a linear transformation, and the definitions are just ``what you would expect'':

\horlines{5}

% $\range(T) \defeq \{T(\x) : \x \in \V\}$,\index{range}\smallskip
% $\nullspace(T) \defeq \{\x \in \V : T(\x) = \0\}$,\marginnote{We show that $\range(T)$ is a subspace of $\W$ and $\nullspace(T)$ is a subspace of $\V$ in Exercise~\ref{exer:abs_basis_range_null}.}[-0.5cm]\index{null space}\smallskip
% $\rank(T) \defeq \dim(\range(T))$, and\index{rank}\smallskip
% $\lambda$ is an eigenvalue with corresponding eigenvector $\v \neq \0$ if $T(\v) = \lambda\v$.

\noindent Furthermore, these properties can all be computed from the standard matrix.

\exx[11]{Find the eigenvalues of the transposition map $T : \M_2 \rightarrow \M_2$, as well as a set of corresponding eigenvectors.}

% Do it in three different ways: via the standard basis standard matrix, then via the Pauli basis standard matrix, then via the definition.

\newpage

\exx[10]{Find the range and rank of the differentiation map $D : \mathcal{P}^3 \rightarrow \mathcal{P}^3$.}

% Write down the definition of range and rank in this more general setting. Compute via standard matrix FIRST. Write down linear transformation version.

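Just as for matrices, the rank--nullity theorem carries over to this more general setting: if $T : \V \rightarrow \W$ is a linear transformation and $\V$ is finite-dimensional, then
\[
\rank(T) + \dim\big(\nullspace(T)\big) = \dim(\V),
\]
since each of these quantities can be computed from any standard matrix of $T$. This gives a quick way of double-checking answers like the one in the previous example.
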
\section*{Application: Diagonalization and Square Roots}

Recall from introductory linear algebra that we can diagonalize many matrices. That is, for many $A \in \M_n$ we can write...

\horlines{3}

% A = PDP^{-1}, where D is diagonal and P is invertible
% D has eigenvalues of A on its diagonal, P has corresponding eigenvectors as its columns, in the same order

Doing so lets us easily take arbitrary (even non-integer) powers of matrices:

\horlines{1}

% A^r = PD^rP^{-1}.

\noindent where $D^r$ can simply be computed entrywise.

\newpage

Thanks to standard matrices, we can now do the same thing for most linear transformations. We illustrate what we mean via an example.

\exx[12]{Find a square root of the transpose map acting on $\M_2$.}

% We want a linear transformation S such that S^2 = T. Well, what about T^(1/2)? That is, what about the linear transformation that we get by raising [T] to the power 1/2?
% [T] = swap matrix, found earlier. Better to use diagonal one via Pauli basis though.
% sqrt via sqrt of diagonal entries, then convert back to transformation from matrix. Give explicit formula.
% Mention that we can only do this over C: transpose does not have a square root over R.

As perhaps an even more striking example, recall from last week that we could take powers of the standard matrix of the derivative to compute (for example) the fourth derivative of a function. If we use this method based on diagonalization to take non-integer powers of the standard matrix, we can compute \emph{fractional} derivatives!

\exx[4]{Compute the half-derivative of $\sin(x)$ and $\cos(x)$. Then find a formula for the $r$-th derivative of these functions for arbitrary (not necessarily integer) $r \in \mathbb{R}$.}

\newpage

\horlines{22}

\end{document}