diff --git a/materials/section/linear.tex b/materials/section/linear.tex
index 7fb3811..0c4fbf3 100644
--- a/materials/section/linear.tex
+++ b/materials/section/linear.tex
@@ -17,7 +17,7 @@
 \begin{center}
     \Large\textbf{Linear Algebra Review}\\
-    \large\textit{Conner DiPaolo}
+    \large\textit{Conner DiPaolo (updated by Iraj Jelodari)}
 \end{center}
 
 \vspace*{1em}
 
@@ -592,7 +592,7 @@ \subsection{Eigendecomposition: $A = X\Lambda X^{-1}$}
 \[
     A = X \Lambda X^{-1}
 \]
-where $X = \m{\xx_1 & \xx_2 & \dots & \xx_n}$ are the $n$ eigenvalues of
+where $X = \m{\xx_1 & \xx_2 & \dots & \xx_n}$ are the $n$ eigenvectors of
 $A$ and $\Lambda = \mathrm{diag}(\lambda_1, \lambda_2,\dots, \lambda_n)$
 are the eigenvalues corresponding to $\xx_i$. If $A$ is symmetric, this becomes
 \[
@@ -681,6 +681,19 @@ \subsection{Singular Value Decomposition: $A = U\Sigma V^*$}
 factorization we will see later in the context of recommender systems.
 \end{enumerate}
 
+\subsection{QR Decomposition: $A = QR$}
+A $QR$ decomposition, also known as a $QR$ factorization or $QU$ factorization, is a decomposition of a matrix $A$ into the product of an orthogonal matrix $Q$ (i.e.\ $Q^TQ = I$) and an upper triangular matrix $R$:
+\[
+    A = QR
+\]
+
+The $QR$ decomposition is often used to solve the linear least squares problem, and it is the basis for a particular eigenvalue algorithm, the $QR$ algorithm.
+
+Analogously, we can define $QL$, $RQ$, and $LQ$ decompositions, with $L$ a lower triangular matrix.
+
+There are several methods for computing the $QR$ decomposition, such as the \textit{Gram–Schmidt process}, \textit{Householder transformations}, or \textit{Givens rotations}, each with its own advantages and disadvantages.
+
+
 \section{Matrix Calculus}
 Most of you probably haven't been taught all of this yet. That's okay. Matrix
 calculus
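As a sanity check of the eigenvector/eigenvalue wording the second hunk corrects, a small NumPy sketch of $A = X\Lambda X^{-1}$ (the matrix here is an arbitrary example, not from the patch):

```python
import numpy as np

# Arbitrary symmetric example matrix (assumed for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Columns of X are the eigenvectors; w holds the corresponding eigenvalues.
w, X = np.linalg.eig(A)
Lam = np.diag(w)

# Eigendecomposition: A = X Lambda X^{-1}.
assert np.allclose(X @ Lam @ np.linalg.inv(X), A)

# A is symmetric with distinct eigenvalues, so the unit-norm
# eigenvectors are orthonormal: X^{-1} = X^T.
assert np.allclose(X.T @ X, np.eye(2))
```

This mirrors the corrected text: $X$ collects the eigenvectors as columns, while $\Lambda$ is the diagonal matrix of eigenvalues.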
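To illustrate the new $QR$ subsection, including its least-squares use, a NumPy sketch (the matrix and right-hand side are assumed examples):

```python
import numpy as np

# Arbitrary tall example matrix with full column rank (assumed for illustration).
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])

# Reduced QR decomposition: Q has orthonormal columns, R is upper triangular.
Q, R = np.linalg.qr(A)

assert np.allclose(Q.T @ Q, np.eye(2))  # Q^T Q = I
assert np.allclose(np.triu(R), R)       # R is upper triangular
assert np.allclose(Q @ R, A)            # A = QR

# Least squares via QR: minimize ||Ax - b|| by solving R x = Q^T b.
b = np.array([1.0, 0.0, 1.0])
x = np.linalg.solve(R, Q.T @ b)
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
```

`np.linalg.qr` uses Householder reflections internally, one of the computation methods the subsection names; the least-squares step works because $Q$ has orthonormal columns, reducing the normal equations to the triangular system $Rx = Q^T b$.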