Mathematical Foundations of Machine Learning
Prof. Matthieu Bloch
Wednesday, November 20, 2024
Last time
- Last class: Monday November 18, 2024
- We talked about Gaussian estimation
- Today: We will talk about Gaussian regression and maximum likelihood estimation
- To be effectively prepared for today's class, you should have:
- Gone over the slides and read the associated lecture notes here
- Worked on and submitted Homework 7 (due today!)
- Logistics:
- Jack Hill office hours: Wednesday 11:30am-12:30pm in TSRB and hybrid
- Anuvab Sen office hours: Thursday 12pm-1pm in TSRB and hybrid
- Dr. Bloch office hours: Friday November 22, 2024, 6pm-7pm online
- Homework 8: due Wednesday November 27, 2024
What's next for this semester
- Lecture 21 - Monday November 4, 2024: SVD and least squares
- Lecture 22 - Wednesday November 6, 2024: Gradient descent
- Homework 6 due on Thursday November 7, 2024
- Lecture 23 - Monday November 11, 2024: Estimation
- Lecture 24 - Wednesday November 13, 2024: Estimation
- Lecture 25 - Monday November 18, 2024: More estimation
- Lecture 26 - Wednesday November 20, 2024: Even more estimation
- Lecture 27 - Monday November 25, 2024: Principal Component Analysis
- Lecture 28 - Monday December 2, 2024: Principal Component Analysis
Final exam is coming
- Start reviewing notes and exams
- Thanksgiving is next week, exam the week after
- Reminder: final is on Friday December 6, 2024, 2:40pm-5:30pm
- We will be very available for help and review sessions…
- …but not last minute.
- Try to plan your studies accordingly
- Use Piazza
Gaussian estimation (last time)
For jointly Gaussian $X$ and $Y$, the conditional density of $Y$ given $X = x$ is a Normal distribution with mean $\mu_Y + \Sigma_{YX}\Sigma_{XX}^{-1}(x - \mu_X)$ and covariance matrix $\Sigma_{YY} - \Sigma_{YX}\Sigma_{XX}^{-1}\Sigma_{XY}$.
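As a quick illustration of the conditioning formula above, here is a minimal numerical sketch (the function name and toy numbers are my own, not from the lecture):

```python
import numpy as np

def gaussian_conditional(mu_x, mu_y, S_xx, S_xy, S_yy, x):
    """Mean and covariance of Y given X = x for jointly Gaussian (X, Y)."""
    S_yx = S_xy.T
    # inv(S_XX) @ (x - mu_X), computed via a linear solve for stability
    cond_mean = mu_y + S_yx @ np.linalg.solve(S_xx, x - mu_x)
    cond_cov = S_yy - S_yx @ np.linalg.solve(S_xx, S_xy)
    return cond_mean, cond_cov

# Toy example: scalar X and Y with correlation 0.8
mu_x = np.array([0.0]); mu_y = np.array([0.0])
S_xx = np.array([[1.0]]); S_xy = np.array([[0.8]]); S_yy = np.array([[1.0]])
m, C = gaussian_conditional(mu_x, mu_y, S_xx, S_xy, S_yy, np.array([1.0]))
# m = [0.8], C = [[1 - 0.8*0.8]] = [[0.36]]
```

Observing $x = 1$ pulls the conditional mean of $Y$ toward $0.8$ and shrinks its variance from $1$ to $0.36$.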
Gaussian random processes
A Gaussian random process is a collection $\{X_t\}_{t \in \mathcal{T}}$ characterized by
- a mean function $m(t) = \mathbb{E}[X_t]$
- a covariance function $k(t, s) = \mathbb{E}[(X_t - m(t))(X_s - m(s))]$, where $k$ is a PSD kernel,
such that for any $t_1, \dots, t_n \in \mathcal{T}$, the vector $(X_{t_1}, \dots, X_{t_n})$ is jointly Gaussian.
- Can we predict the value at unknown points given observations at others?
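The prediction question above is exactly Gaussian conditioning applied to a process. A minimal sketch, assuming a zero-mean process and an RBF kernel (the kernel choice is an assumption, not specified in the slides):

```python
import numpy as np

def rbf(s, t, ell=1.0):
    """Squared-exponential (RBF) covariance kernel, an assumed example PSD kernel."""
    return np.exp(-((s - t) ** 2) / (2 * ell ** 2))

def gp_predict(t_train, y_train, t_test, noise=1e-6):
    """Posterior mean and variance at t_test for a zero-mean Gaussian process."""
    K = rbf(t_train[:, None], t_train[None, :]) + noise * np.eye(len(t_train))
    k_star = rbf(t_test[:, None], t_train[None, :])  # cross-covariances
    mean = k_star @ np.linalg.solve(K, y_train)
    # prior variance k(t,t) = 1 minus the variance explained by the observations
    var = 1.0 - np.sum(k_star * np.linalg.solve(K, k_star.T).T, axis=1)
    return mean, var

t = np.array([0.0, 1.0, 2.0])
y = np.sin(t)
m, v = gp_predict(t, y, np.array([0.0, 1.5]))
# At the observed point t = 0 the posterior mean matches y and the variance is near 0
```

This is the same conditional-Gaussian formula as before, with the covariance matrices built from the kernel $k$.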
Maximum Likelihood Estimation
- We consider a different but related problem: parameter estimation
- We assume that $X \sim p_\theta$ for some unknown parameter $\theta \in \Theta$
- The goal is to estimate $\theta$ from samples $x_1, \dots, x_n$ of $X$
For a probabilistic model $\{p_\theta\}_{\theta \in \Theta}$ governing the distribution of the samples $x_1, \dots, x_n$, the likelihood function is $L(\theta) = \prod_{i=1}^{n} p_\theta(x_i)$. It is often convenient to work with the log-likelihood $\ell(\theta) = \log L(\theta)$. The maximum likelihood estimate of $\theta$ is $\hat{\theta}_{\mathrm{ML}} = \arg\max_{\theta \in \Theta} \ell(\theta)$.
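As a concrete instance, a sketch of the classical closed-form MLE for i.i.d. Gaussian samples (a standard textbook example, not taken from the slides): maximizing $\ell(\mu, \sigma^2)$ gives the sample mean and the $1/n$ sample variance.

```python
import numpy as np

def gaussian_mle(x):
    """Closed-form MLE of (mu, sigma^2) for i.i.d. Gaussian samples."""
    mu_hat = x.mean()
    # Note the 1/n normalization: the MLE, not the 1/(n-1) unbiased estimator
    var_hat = ((x - mu_hat) ** 2).mean()
    return mu_hat, var_hat

x = np.array([1.0, 2.0, 3.0, 4.0])
mu_hat, var_hat = gaussian_mle(x)
# mu_hat = 2.5, var_hat = (2.25 + 0.25 + 0.25 + 2.25) / 4 = 1.25
```

The $1/n$ factor in the variance estimate matters for the bias discussion that follows.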
Properties of Estimators
- There are three important properties of estimators that we will discuss:
- Bias
- Consistency
- Efficiency
An estimator $\hat{\theta}$ of $\theta$ has bias $B(\hat{\theta}) = \mathbb{E}[\hat{\theta}] - \theta$. The estimator is unbiased if the bias is zero for all $\theta$.
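A simulation sketch of bias, using the standard example of the Gaussian variance estimator (my illustration, not from the slides): the $1/n$ MLE is biased with $\mathbb{E}[\hat{\sigma}^2] = \frac{n-1}{n}\sigma^2$, while Bessel's $1/(n-1)$ correction is unbiased.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma2, trials = 5, 4.0, 200_000

# Many independent samples of size n from N(0, sigma2)
samples = rng.normal(0.0, np.sqrt(sigma2), size=(trials, n))

# Average each estimator over the trials to approximate its expectation
biased = samples.var(axis=1, ddof=0).mean()    # divides by n
unbiased = samples.var(axis=1, ddof=1).mean()  # divides by n - 1
# biased ~ (n-1)/n * sigma2 = 3.2, unbiased ~ sigma2 = 4.0
```

Averaging over many trials estimates $\mathbb{E}[\hat{\theta}]$, so the gap from $\sigma^2 = 4$ is exactly the bias.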
An estimator $\hat{\theta}_n$ of $\theta$ using $n$ observations is consistent if for every $\epsilon > 0$, $\lim_{n \to \infty} \mathbb{P}\big(|\hat{\theta}_n - \theta| > \epsilon\big) = 0$.
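A simulation sketch of consistency using the sample mean (my illustration, not from the slides): by the law of large numbers, $\mathbb{P}(|\hat{\theta}_n - \theta| > \epsilon)$ shrinks as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, eps, trials = 1.0, 0.1, 5_000

# Estimate P(|theta_hat_n - theta| > eps) for increasing sample sizes n
p_far = {}
for n in (10, 1000):
    est = rng.normal(theta, 1.0, size=(trials, n)).mean(axis=1)
    p_far[n] = (np.abs(est - theta) > eps).mean()
# p_far[10] is large; p_far[1000] is close to 0
```

The deviation probability drops toward zero with $n$, which is precisely the definition above.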
Next time
- Next class: Monday November 25, 2024
- To be effectively prepared for next class, you should:
- Go over today's slides and read the associated lecture notes here, there, and there
- Work on Homework 8
- Optional
- Export slides for next lecture as PDF (be on the lookout for an announcement when they're ready)