Linear Algebra - \(\bbR^n\), bases, matrices, rank
Monday August 30, 2021
The focus will be on \(\bbR^n\) (\(n\in\bbN^*\)) as the canonical (and very useful) exemplar of a finite-dimensional vector space.
\(\bbR^n\) is the collection of column vectors consisting of \(n\) real entries \(v_i\in\bbR\), \(i\in\intseq{1}{n}\), i.e., \[ \bfv=\mat{c}{v_1\\v_2\\\vdots\\v_n} = \mat{cccc}{v_1&v_2&\cdots&v_n}^\intercal \]
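A quick NumPy illustration of the column-vector and transpose notation (illustrative values, not from the notes):

```python
import numpy as np

# A vector in R^3 written as an explicit column (shape (3, 1))
v = np.array([[1.0], [2.0], [3.0]])

# Transposing gives the corresponding row vector; transposing twice recovers v
assert v.T.shape == (1, 3)
assert np.array_equal(v.T.T, v)
```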
\(\bbR^n\) is formally a vector space, but what matters is that you can scale vectors (multiply by a real number), add them together, and that scaling and addition work well together.
Relatedly, a subvector space \(\calV\) is a subset of \(\bbR^n\) that is stable under addition and scalar multiplication (and is therefore itself a vector space).
The span of a set of vectors \(\set{\bfv_i}_{i=1}^m\) is \[ \text{span}\left(\set{\bfv_i}_{i=1}^m\right)\eqdef \set{\sum_{i=1}^m\alpha_i\bfv_i:\set{\alpha_i}_{i=1}^m\in\bbR^m}, \] i.e., it is the set of all possible linear combinations of the vectors.
The span of a set of vectors \(\set{\bfv_i}_{i=1}^m\) is a subvector space.
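Span membership can be checked numerically: \(\bfw\) lies in the span iff appending it as an extra column does not increase the rank. A minimal NumPy sketch with illustrative vectors:

```python
import numpy as np

# Two vectors in R^3 (illustrative choices)
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
V = np.column_stack([v1, v2])  # columns span a subspace of R^3

# w is in span{v1, v2} iff adding it as a column leaves the rank unchanged
w = 2 * v1 - 3 * v2
in_span = (np.linalg.matrix_rank(np.column_stack([V, w]))
           == np.linalg.matrix_rank(V))
print(in_span)  # True: w is a linear combination of v1 and v2
```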
The vectors \(\set{\bfv_i}_{i=1}^m\) are linearly independent if and only if \[ \sum_{i=1}^m\alpha_i\bfv_i = \mathbf{0} \Rightarrow \forall i\in\intseq{1}{m}\quad\alpha_i=0. \] Otherwise the vectors are linearly dependent.
A set of vectors \(\set{\bfv_i}_{i=1}^m\) is linearly dependent if and only if one vector can be expressed as a linear combination of the others.
Any set of vectors \(\set{\bfv_i}_{i=1}^m\) contains a subset of linearly independent vectors with the same span.
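Such a subset can be extracted greedily, keeping each vector that increases the rank; a NumPy sketch with illustrative vectors:

```python
import numpy as np

# Three vectors in R^3 with v3 = v1 + v2, so the set is linearly dependent
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2
A = np.column_stack([v1, v2, v3])

# rank < number of vectors <=> linearly dependent
rank = np.linalg.matrix_rank(A)
print(rank)  # 2, not 3

# Greedily keep the columns that increase the rank: the kept columns are
# linearly independent and have the same span as the full set
keep, r = [], 0
for j in range(A.shape[1]):
    if np.linalg.matrix_rank(A[:, keep + [j]]) > r:
        keep.append(j)
        r += 1
print(keep)  # [0, 1]: v1, v2 form an independent subset with the same span
```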
A set \(\set{\bfv_i}_{i=1}^m\) is a basis for a subvector space \(\calV\subset\bbR^n\) if the vectors are linearly independent and \(\text{span}\left(\set{\bfv_i}_{i=1}^m\right)=\calV\).
The decomposition of a vector on a basis is unique, but the basis itself is not!
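The unique coordinates of a vector on a basis are obtained by solving a linear system; a small NumPy sketch with an illustrative basis of \(\bbR^2\):

```python
import numpy as np

# Columns of B form a basis of R^2 (illustrative choice; any basis works,
# and the coordinates change with the basis)
B = np.array([[1.0, 1.0],
              [0.0, 1.0]])
v = np.array([3.0, 2.0])

# Unique coordinates alpha such that B @ alpha = v
alpha = np.linalg.solve(B, v)
print(alpha)  # [1. 2.]: v = 1*b1 + 2*b2
```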
A basis is useful to reduce operations on vectors to operations on the components of the vectors w.r.t. the basis (crucial to do things beyond \(\bbR^n\))
In the following, \(\calV\) denotes a subvector space of \(\bbR^n\)
\(\bbR^n\) has other nice properties too!
For \(\bfu,\bfv\in\bbR^n\), \(\bfu^\intercal\bfv \eqdef \sum_{i=1}^n u_i v_i = \bfv^\intercal\bfu\) is an inner product: it is symmetric, bilinear, and positive definite (\(\bfu^\intercal\bfu\geq 0\), with equality if and only if \(\bfu=\mathbf{0}\)).
Strictly speaking, this is not a definition; we're checking that \(\bfu^\intercal\bfv\) defines a symmetric homogeneous bilinear form.
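These properties can be spot-checked numerically; a NumPy sketch on random vectors (illustrative, not part of the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
u, v, w = rng.standard_normal((3, 4))  # three random vectors in R^4
a, b = 2.0, -3.0

# Symmetry: u^T v = v^T u
assert np.isclose(u @ v, v @ u)
# Bilinearity in the first argument (symmetry gives the second)
assert np.isclose((a * u + b * w) @ v, a * (u @ v) + b * (w @ v))
# Positive definiteness: u^T u >= 0, with equality only for the zero vector
assert u @ u > 0
assert np.zeros(4) @ np.zeros(4) == 0
```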
A matrix \(\matA\in\bbR^{m\times n}\) is a collection of vectors concatenated together \[ \matA \eqdef\mat{c}{-\vecr_1-\\-\vecr_2-\\\vdots\\-\vecr_m-} \eqdef \mat{cccc}{\vert&\vert&&\vert\\\vecc_1&\vecc_2&\cdots&\vecc_n\\\vert&\vert&&\vert} \]
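The two viewpoints (stacked rows vs. concatenated columns) build the same matrix; a quick NumPy illustration with made-up values:

```python
import numpy as np

# Build a 2x2 matrix from its rows...
r1 = np.array([1.0, 2.0])
r2 = np.array([3.0, 4.0])
A_from_rows = np.vstack([r1, r2])

# ...and the same matrix from its columns
c1 = np.array([1.0, 3.0])
c2 = np.array([2.0, 4.0])
A_from_cols = np.column_stack([c1, c2])

print(np.array_equal(A_from_rows, A_from_cols))  # True: same matrix
```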
Let \(\matA\in\bbR^{m\times n}\) be a matrix with columns \(\set{\vecc_j}_{j=1}^n\) and rows \(\set{\vecr_i}_{i=1}^m\).
Note that \(\text{col}(\matA)=\text{row}(\matA^\intercal)\)
These spaces play an important role, in particular when solving systems of equations \(\matA\bfx=\bfy\) (you'll be surprised how often this shows up in ECE!)
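For a square invertible \(\matA\), the system \(\matA\bfx=\bfy\) can be solved directly; a NumPy sketch with illustrative values:

```python
import numpy as np

# A small square, invertible system (illustrative values)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
y = np.array([3.0, 5.0])

# Solve A x = y; a solution exists here because A has full rank
x = np.linalg.solve(A, y)
assert np.allclose(A @ x, y)
```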
In the following \(\matA\) is an \(m\times n\) real-valued matrix
\(\text{row}(\matA)\) and \(\text{null}(\matA)\) are orthogonal complements, i.e., \(\text{null}(\matA)=\text{row}(\matA)^\perp\) and \(\text{row}(\matA)=\text{null}(\matA)^\perp\) in \(\bbR^n\).
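This orthogonality can be verified numerically: a basis of \(\text{null}(\matA)\) can be read off the SVD (right singular vectors with zero singular value), and every row of \(\matA\) is orthogonal to it. A NumPy sketch with an illustrative rank-deficient matrix:

```python
import numpy as np

# A rank-deficient matrix so null(A) is nontrivial
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])  # second row = 2 * first row

# Null-space basis from the SVD: right singular vectors whose
# singular value is (numerically) zero
U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
null_basis = Vt[rank:].T  # columns span null(A)

# Every row of A is orthogonal to every null-space vector
print(np.allclose(A @ null_basis, 0))  # True
```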