The Mathematics of ECE

Linear Algebra - Vector spaces, bases, rank

Wednesday, August 31, 2022

Logistics

[Comic: https://xkcd.com/1050/]
  • The Mathematics of ECE

    • Review sessions not tied to any class, covering probability, linear algebra, and real analysis
    • Objective: small curriculum but opportunity to ask anything!
    • No grade, no pressure, no stakes
  • Instructor

    • Brighton Ancelin, Ph.D. student in Machine Learning
    • Piazza for course: sign-up
    • Slides available here
  • Today’s class: Linear algebra fundamentals

    • Objective #1: review concepts that you might be already familiar with
    • Objective #2: get used to the level of rigor expected in grad school
    • Objective #3: meet classmates!
  • Next workshop: Friday September 02, 2022

  • Please leave your email! We’d like to hear from you so we can do a better job

Vector space

  • A vector space \(\calV\) over a field (typically \(\bbR\) or \(\bbC\); we use \(\bbR\) below) consists of a set \(\calV\) of vectors, a closed addition rule \(+\), and a closed scalar multiplication \(\cdot\) such that the following 8 axioms are satisfied:
    1. \(\forall x,y,z\in\calV\) \(x+(y+z)=(x+y)+z\) (associativity)
    2. \(\forall x,y\in\calV\) \(x+y=y+x\) (commutativity)
    3. \(\exists 0\in\calV\) such that \(\forall x\in\calV\) \(x+0=x\) (identity element)
    4. \(\forall x\in\calV\) \(\exists y\in\calV\) such that \(x+y=0\) (inverse element)
    5. \(\forall x\in\calV\) \(1\cdot x= x\) (identity of scalar multiplication)
    6. \(\forall \alpha, \beta\in\bbR\) \(\forall x\in\calV\) \(\alpha\cdot(\beta x)=(\alpha\cdot\beta)\cdot x\) (associativity)
    7. \(\forall \alpha, \beta\in\bbR\) \(\forall x\in\calV\) \((\alpha+\beta)x = \alpha x+\beta x\) (distributivity)
    8. \(\forall \alpha\in\bbR\) \(\forall x,y\in\calV\) \(\alpha(x+y) = \alpha x+\alpha y\) (distributivity)
  • Drill: prove that the identity element is unique using axioms 2 and 3 (a sketch you can check your answer against appears below)

  • Drill: prove that inverse element is unique using axioms 1 through 4
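
  • For the first drill, here is a minimal sketch to check your answer against: suppose \(0\) and \(0'\) both satisfy axiom 3. Then \[ 0' = 0' + 0 = 0 + 0' = 0, \] where the first equality uses axiom 3 with identity \(0\), the second uses axiom 2, and the last uses axiom 3 with identity \(0'\).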

Span and linear independence

  • Let \(\set{v_i}_{i=1}^n\) be a set of vectors in a vector space \(\calV\).

  • For \(\set{a_i}_{i=1}^n\in\bbR^n\), \(\sum_{i=1}^na_iv_i\) is called a linear combination of the vectors \(\set{v_i}_{i=1}^n\).

  • The span of the vectors \(\set{v_i}_{i=1}^n\) is the set \[ \text{span}(\set{v_i}_{i=1}^n)\eqdef \{\sum_{i=1}^na_iv_i:\set{a_i}_{i=1}^n\in\bbR^n\} \]

  • Drill: show that the span of the vectors \(\set{v_i}_{i=1}^n\in\calV^n\) is a vector subspace of \(\calV\).

  • Let \(\set{v_i}_{i=1}^n\) be a set of vectors in a vector space \(\calV\)

  • \(\set{v_i}_{i=1}^n\) is linearly independent (or the vectors \(\set{v_i}_{i=1}^n\) are linearly independent) if (and only if) \[ \sum_{i=1}^na_iv_i = 0\Rightarrow \forall i\in\intseq{1}{n}\,a_i=0 \] Otherwise the set is (or the vectors are) linearly dependent.
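
  • A quick numerical check of linear independence in \(\bbR^m\), as a minimal NumPy sketch (the vectors below are made-up examples): stack the vectors as columns of a matrix; they are linearly independent exactly when the rank of that matrix equals the number of vectors.

```python
import numpy as np

# Candidate vectors in R^3, stacked as columns of a matrix.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 3.0])   # v3 = v1 + v2, so the set is linearly dependent
V = np.column_stack([v1, v2, v3])

# Independent iff the rank of V equals the number of vectors.
rank = np.linalg.matrix_rank(V)
print("independent" if rank == V.shape[1] else "dependent")   # -> dependent
```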

Span and linear independence

  • Every set of vectors contains a linearly independent subset with the same span (discard vectors that are linear combinations of the others).

Inner product and norm

  • An inner product space over \(\bbR\) is a vector space \(\calV\) equipped with a positive definite symmetric bilinear form \(\dotp{\cdot}{\cdot}:\calV\times\calV\to\bbR\) called an inner product

  • An inner product space is also called a pre-Hilbert space

  • An inner product satisfies \(\forall x,y\in\calV\) \(\dotp{x}{y}^2\leq\dotp{x}{x}\dotp{y}{y}\) (the Cauchy–Schwarz inequality; see the numerical check below)

  • A norm on a vector space \(\calV\) over \(\bbR\) is a function \(\norm{\cdot}:\calV\to\bbR\) that satisfies:
    • Positive definiteness: \(\forall x\in\calV\) \(\norm{x}\geq 0\) with equality iff \(x=0\)
    • Homogeneity: \(\forall x\in\calV\) \(\forall\alpha\in\bbR\) \(\norm{\alpha x}=\abs{\alpha}\norm{x}\)
    • Subadditivity: \(\forall x,y\in\calV\) \(\norm{x+y}\leq \norm{x}+\norm{y}\)
  • In an inner product space, the inner product induces a norm \(\norm{x} \eqdef \sqrt{\dotp{x}{x}}\)
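
  • A small numerical sanity check of the Cauchy–Schwarz inequality and of subadditivity for the standard inner product on \(\bbR^n\), as a minimal NumPy sketch (the random test vectors are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(5)
y = rng.standard_normal(5)

inner = lambda a, b: float(a @ b)        # standard inner product on R^n
norm = lambda a: inner(a, a) ** 0.5      # the induced norm

# Cauchy-Schwarz: <x,y>^2 <= <x,x><y,y>
print(inner(x, y) ** 2 <= inner(x, x) * inner(y, y))   # True

# Subadditivity (triangle inequality): ||x + y|| <= ||x|| + ||y||
print(norm(x + y) <= norm(x) + norm(y))                 # True
```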

Orthogonality

  • Two vectors \(x,y\in\calV\) are orthogonal if \(\dotp{x}{y}=0\). We write \(x\perp y\) for simplicity.

  • A vector \(x\in\calV\) is orthogonal to a set \(\calS\subset\calV\) if \(\forall s\in\calS\) \(\dotp{x}{s}=0\). We write \(x\perp \calS\) for simplicity.

  • Every linearly independent subset (of an inner product space) can be orthonormalized, e.g., via the Gram–Schmidt process.
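
  • A minimal NumPy sketch of the Gram–Schmidt process for vectors in \(\bbR^m\) with the standard inner product (the function name and test vectors are my own, for illustration):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a list of linearly independent vectors in R^m."""
    basis = []
    for v in vectors:
        w = np.asarray(v, dtype=float).copy()
        # Remove the components along the orthonormal vectors built so far.
        for q in basis:
            w -= (q @ w) * q
        basis.append(w / np.linalg.norm(w))   # normalize; w != 0 by independence
    return np.array(basis)

Q = gram_schmidt([np.array([1.0, 1.0, 0.0]),
                  np.array([1.0, 0.0, 1.0])])
print(np.round(Q @ Q.T, 6))   # 2x2 identity: the rows are orthonormal
```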

Bases

  • A basis of a vector subspace \(\calW\) of a vector space \(\calV\) is a countable set of vectors \(\calB\) such that:
    1. \(\text{span}(\calB)=\calW\)
    2. \(\calB\) is linearly independent
  • You should be somewhat familiar with this in \(\bbR^n\), where there are lots of nice features:
    • every subspace has a basis
    • every basis for a subspace has the same number of elements
    • the number of elements in a basis is called the dimension
    • the representation of a vector on a basis is unique (see the sketch after this list)
    • having a basis reduces the operations on vectors to operations on their components.
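
  • To see the unique-representation property in action: collect a basis of \(\bbR^n\) as the columns of a square matrix \(\matB\); the coordinate vector of \(\bfx\) in that basis is the unique solution of \(\matB\bfa=\bfx\). A minimal NumPy sketch (the basis and vector are made-up examples):

```python
import numpy as np

# Columns of B form a basis of R^3.
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
x = np.array([2.0, 3.0, 1.0])

a = np.linalg.solve(B, x)       # coordinates of x in the basis given by B's columns
print(a)
print(np.allclose(B @ a, x))    # reconstruction check -> True
```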

Linear subspaces

  • A subset \(\calW\) of a vector space \(\calV\) is a linear subspace if \(\forall x,y\in\calW\forall \lambda,\mu\in\bbR\) \(\lambda x+\mu y \in\calW\)

  • Every linear subspace of \(\bbR^n\) has a basis

  • If \(\bfx\in\text{span}(\set{\bfv_i}_{i=1}^n)\) and \(\forall i\quad \bfx^\intercal\bfv_i=0\), then \(\bfx=\mathbf{0}\)
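
  • A quick derivation of the last fact (assuming the standard inner product on \(\bbR^n\)): write \(\bfx=\sum_{i=1}^na_i\bfv_i\); then \[ \bfx^\intercal\bfx=\sum_{i=1}^na_i\,\bfx^\intercal\bfv_i=0, \] so \(\bfx=\mathbf{0}\) by positive definiteness.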

Matrices

  • A matrix \(\matA\in\bbR^{m\times n}\) can be viewed as \(m\) row vectors stacked on top of each other, or as \(n\) column vectors concatenated side by side \[ \matA \eqdef\mat{c}{-\vecr_1-\\-\vecr_2-\\\vdots\\-\vecr_m-} \eqdef \mat{cccc}{\vert&\vert&&\vert\\\vecc_1&\vecc_2&\cdots&\vecc_n\\\vert&\vert&&\vert} \]

  • Let \(\matA\in\bbR^{m\times n}\) be a matrix with columns \(\set{\vecc_j}_{j=1}^n\) and rows \(\set{\vecr_i}_{i=1}^m\). The column (image) space \(\text{col}(\matA)\) of \(\matA\) is \(\text{span}\left(\set{\vecc_j}_{j=1}^n\right)\). The row space \(\text{row}(\matA)\) of \(\matA\) is \(\text{span}\left(\set{\vecr_i}_{i=1}^m\right)\). The null (kernel) space of \(\matA\) is \(\text{null}(\matA)\eqdef\set{\vecx\in\bbR^n:\matA\vecx=\mathbf{0}}\)
  • Note that \(\text{col}(\matA)=\text{row}(\matA^\intercal)\)

  • These spaces play an important role, in particular when solving systems of equations \(\matA\bfx=\bfy\) (you’ll be surprised how often this shows up in ECE!); see the sketch below

  • \(\text{null}(\matA)\) is a vector subspace of \(\bbR^n\).
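
  • A concrete illustration of the previous points: \(\matA\bfx=\bfy\) has an exact solution precisely when \(\bfy\in\text{col}(\matA)\). A minimal NumPy sketch (the matrix and right-hand sides are made-up examples); a zero least-squares residual indicates that \(\bfy\) lies in the column space:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])          # col(A) is a 2-D plane inside R^3

y_in  = np.array([1.0, 2.0, 3.0])   # equals 1*c1 + 2*c2, so it lies in col(A)
y_out = np.array([1.0, 2.0, 0.0])   # not a combination of the columns

for y in (y_in, y_out):
    x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)   # best least-squares fit
    residual = np.linalg.norm(A @ x_hat - y)
    print("y in col(A)?", bool(np.isclose(residual, 0.0)))
```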

Rank

  • In the following \(\matA\) is an \(m\times n\) real-valued matrix

  • \(\text{dim}(\text{row}(\matA))=\text{dim}(\text{col}(\matA))\), which is called the rank of \(\matA\) denoted \(\text{rank}(\matA)\)
  • \(\text{row}(\matA)\) and \(\text{null}(\matA)\) are orthogonal complements, i.e.,

    • for any \(\bfu\in\text{row}(\matA)\), \(\bfv\in\text{null}(\matA)\) \(\bfu^\intercal\bfv=0\)
    • \(\text{dim}(\text{row}(\matA)) = n-\text{dim}(\text{null}(\matA))\) (the rank–nullity theorem)
    • for any \(\bfx\in\bbR^n\), there exist \(\bfx_1\in\text{row}(\matA)\), \(\bfx_2\in\text{null}(\matA)\) such that \(\bfx=\bfx_1+\bfx_2\)
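
  • These facts are easy to verify numerically. A minimal NumPy sketch (the matrix is a made-up example) that computes the rank from the SVD, extracts bases of \(\text{row}(\matA)\) and \(\text{null}(\matA)\), and checks both the dimension count and the orthogonality:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],      # a multiple of the first row, so rank(A) = 2
              [0.0, 1.0, 1.0]])
n = A.shape[1]

U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))

row_basis  = Vt[:rank]     # rows of Vt for nonzero singular values span row(A)
null_basis = Vt[rank:]     # remaining rows span null(A)

print(rank == np.linalg.matrix_rank(A))            # consistency check
print(null_basis.shape[0] == n - rank)             # dim(null(A)) = n - rank(A)
print(np.allclose(A @ null_basis.T, 0.0))          # A maps null-space vectors to 0
print(np.allclose(row_basis @ null_basis.T, 0.0))  # row(A) is orthogonal to null(A)
```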