Linear Algebra - More on matrices
Friday, September 10, 2021
A matrix \(\matA\in\bbR^{m\times n}\) can be viewed as a collection of vectors concatenated together, either as \(m\) rows or as \(n\) columns \[ \matA \eqdef\mat{c}{-\vecr_1-\\-\vecr_2-\\\vdots\\-\vecr_m-} \eqdef \mat{cccc}{\vert&\vert&&\vert\\\vecc_1&\vecc_2&\cdots&\vecc_n\\\vert&\vert&&\vert} \]
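As a concrete illustration (a minimal NumPy sketch, not from the notes; the matrix entries are arbitrary), the two views correspond to slicing a 2-D array by rows or by columns:

```python
import numpy as np

# A small example matrix A in R^{2x3} (entries chosen arbitrarily)
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

r1 = A[0, :]   # first row vector r_1
c2 = A[:, 1]   # second column vector c_2

# Stacking the rows, or concatenating the columns, recovers A
assert np.allclose(np.vstack([A[i, :] for i in range(2)]), A)
assert np.allclose(np.column_stack([A[:, j] for j in range(3)]), A)
```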
Note that \(\text{col}(\matA)=\text{row}(\matA^\intercal)\)
These spaces play an important role, in particular when solving systems of equations \(\matA\bfx=\bfy\) (You’ll be surprised how often this shows up in ECE!)
In the following, \(\matA\) is an \(m\times n\) real-valued matrix
\(\text{row}(\matA)\) and \(\text{null}(\matA)\) are orthogonal complements, i.e., \(\text{null}(\matA)=\text{row}(\matA)^\perp\) and \(\text{row}(\matA)\oplus\text{null}(\matA)=\bbR^n\)
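A quick numerical check (a sketch using scipy.linalg.null_space, not part of the notes): every row of \(\matA\) is orthogonal to every null-space vector.

```python
import numpy as np
from scipy.linalg import null_space

# A rank-deficient matrix: the third row is the sum of the first two
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])

N = null_space(A)   # orthonormal basis for null(A), one column here

# Each row of A is orthogonal to each null-space basis vector,
# consistent with row(A) and null(A) being orthogonal complements in R^3
assert np.allclose(A @ N, 0.0)
```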
Given \(\matA\in\bbR^{n\times n}\), the matrix \(\matA_{ij}\in\bbR^{(n-1)\times(n-1)}\) is obtained by removing the \(i\)th row and \(j\)th column of \(\matA\)
This recursive definition of the determinant, \[ \det\matA \eqdef \sum_{j=1}^n(-1)^{i+j}a_{ij}\det\matA_{ij} \quad\text{for any fixed row }i, \] is indeed correct (but showing that every row or column expansion yields the same quantity requires some proof)
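For concreteness, a direct (and deliberately naive, \(O(n!)\)) implementation of this expansion might look like the following sketch; it is not from the notes.

```python
import numpy as np

def det_cofactor(A: np.ndarray) -> float:
    """Determinant via cofactor expansion along the first row."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        # A_{1j}: remove row 0 and column j
        minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)
        # with 0-based indexing, (-1)**j is the sign (-1)^{i+j} for i = 1
        total += (-1) ** j * A[0, j] * det_cofactor(minor)
    return total

A = np.array([[2.0, 1.0], [3.0, 4.0]])
assert np.isclose(det_cofactor(A), np.linalg.det(A))  # both give 5.0
```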
Diagonal matrices \[ \bfA = \mat{cccc}{a_{11}&0&\cdots&0\\0&\ddots&\ddots&\vdots\\\vdots&\ddots&\ddots&0\\0&\cdots&0&a_{nn}} \]
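As an aside (a NumPy sketch, not from the notes), diagonal matrices are easy to build, and multiplying by one simply scales each coordinate:

```python
import numpy as np

D = np.diag([1.0, 2.0, 3.0])   # diagonal matrix from its diagonal entries
d = np.diag(D)                 # extract the diagonal back: [1., 2., 3.]

# Multiplying by a diagonal matrix scales each coordinate independently
x = np.array([1.0, 1.0, 1.0])
assert np.allclose(D @ x, d * x)
```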
Other important matrices to recognize
Writing \(\bfx=\sum_{i=1}^n\alpha_i v_i'\) in the old basis \(\{v_i'\}\) and expanding each old basis vector \(v_i'=\sum_{j=1}^n\beta_{ij}v_j\) in the new basis \(\{v_j\}\), regroup the coefficients \[ \bfx = \cdots + \left(\sum_{i=1}^n\beta_{ij}\alpha_i\right) v_j + \cdots \]
In matrix form
\[ \bfx_{\text{new}} = \mat{cccc}{\beta_{11}&\beta_{21}&\cdots&\beta_{n1}\\ \beta_{12}&\beta_{22}&\cdots&\beta_{n2}\\\vdots&\vdots&\ddots&\vdots\\\beta_{1n}&\beta_{2n}&\cdots&\beta_{nn}}\bfx \]
A change of basis matrix \(\matS\) is full rank (basis vectors are linearly independent)
Any full rank matrix \(\matS\) can be viewed as a change of basis
\(\matS^{-1}\) takes you back to the original basis
Warning: the columns of \(\bfS\) express the old basis vectors in the new coordinates, not the new basis vectors in the old ones
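A small numerical sketch of this convention (NumPy, with an illustrative 2-D basis that is not from the notes):

```python
import numpy as np

# New basis, expressed in the old (standard) coordinates:
W = np.array([[1.0, -1.0],
              [1.0,  1.0]])   # columns: w1 = (1, 1), w2 = (-1, 1)

# Change-of-basis matrix: its columns express the OLD basis vectors
# in the NEW coordinates, so S = W^{-1}
S = np.linalg.inv(W)

x_old = np.array([2.0, 0.0])  # coordinates in the old basis
x_new = S @ x_old             # coordinates in the new basis: [1., -1.]

# S^{-1} (= W) takes you back to the original basis
assert np.allclose(np.linalg.inv(S) @ x_new, x_old)
```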
Intuition: similar matrices (\(\matB=\matS\matA\matS^{-1}\) for some full rank \(\matS\)) are the same linear map up to a change of basis
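One consequence that can be checked numerically (a sketch assuming the standard similarity relation \(\matB=\matS\matA\matS^{-1}\) above): similar matrices share the same eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))   # generically full rank, hence invertible

B = S @ A @ np.linalg.inv(S)      # B is similar to A

# The (sorted) eigenvalues agree: A and B represent the same
# linear map expressed in different bases
eig_A = np.sort_complex(np.linalg.eigvals(A))
eig_B = np.sort_complex(np.linalg.eigvals(B))
assert np.allclose(eig_A, eig_B)
```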