Lecture notes

Lecture notes will be posted here throughout the semester.


Table of contents

  1. Topic 1: Geometry and probability in high dimension
  2. Topic 2: Orthogonality, QR and least squares
  3. Topic 3: Matrix norms, low-rank approximations, and SVD
  4. Topic 4: Introduction to spectral graph theory
  5. Topic 5: Convexity, gradient descent and automatic differentiation
  6. Topic 6: Probabilistic modeling, inference and sampling


Topic 1: Geometry and probability in high dimension


Theory

  • a first data science example: species delimitation (html, ipynb)
  • review (html, ipynb)
  • high-dimensional space (html, ipynb)
  • clustering: an objective, an algorithm, and a toy example (html, ipynb)
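
As an informal complement to the clustering item above (not the posted notebook's code): a minimal sketch of Lloyd's algorithm for the k-means objective on a toy two-blob dataset, assuming only NumPy. The data and the helper name kmeans are illustrative.

    # A minimal k-means (Lloyd's algorithm) sketch on a toy two-cluster dataset.
    # Illustrative only -- not the code from the posted notes.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: two Gaussian blobs in the plane.
    X = np.vstack([rng.normal(0.0, 0.5, size=(50, 2)),
                   rng.normal(3.0, 0.5, size=(50, 2))])

    def kmeans(X, k, n_iter=50):
        # Initialize centers at k distinct random data points.
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(n_iter):
            # Assignment step: each point goes to its nearest center.
            dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            # Update step: each center moves to the mean of its assigned points
            # (assumes no cluster goes empty, which holds for this toy data).
            centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
        # k-means objective: total squared distance of points to their centers.
        obj = ((X - centers[labels]) ** 2).sum()
        return labels, centers, obj

    labels, centers, obj = kmeans(X, k=2)
    print(centers, obj)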

Applications


Topic 2: Orthogonality, QR and least squares


Theory

Applications


Topic 3: Matrix norms, low-rank approximations, and SVD


Theory

  • motivating example: movie recommendations (html, ipynb)
  • matrix norms and approximating subspaces (html, ipynb)
  • singular value decomposition (html, ipynb)
  • condition numbers (html, ipynb)
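
As an informal complement to the matrix-norm and SVD items above (not the posted notebook's code): a minimal sketch of best rank-k approximation via the truncated SVD (the Eckart-Young theorem) on a toy, ratings-style matrix, assuming only NumPy. The data is illustrative.

    # Low-rank approximation via the SVD: keep the top k singular triplets.
    # Illustrative only -- not the code from the posted notes.
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy "ratings" matrix: approximately rank 2, plus a little noise.
    A = rng.normal(size=(20, 2)) @ rng.normal(size=(2, 8)) \
        + 0.1 * rng.normal(size=(20, 8))

    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    k = 2
    # Best rank-k approximation in Frobenius (and spectral) norm.
    A_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    # Eckart-Young: the Frobenius error equals the norm of the discarded
    # singular values.
    err = np.linalg.norm(A - A_k, "fro")
    print(err, np.sqrt((s[k:] ** 2).sum()))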

Applications


Topic 4: Introduction to spectral graph theory


Theory

Applications


Topic 5: Convexity, gradient descent and automatic differentiation


Theory

Applications


Topic 6: Probabilistic modeling, inference and sampling


Theory

  • motivating example (html, ipynb, slides)
  • review (html, ipynb, slides)
  • joint distributions: marginalization and conditional independence (html, ipynb, slides)
  • inference and parameter estimation: variable elimination and expectation-maximization (see Sections 9.2.1-2, 9.3.1-3, 13.1, 13.2.1-2 in [Bis])
  • sampling: Markov chain Monte Carlo methods (see Sections 11.2-3 in [Bis])
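
As an informal complement to the sampling item above (not from [Bis] or the posted notes): a minimal random-walk Metropolis-Hastings sketch, one standard Markov chain Monte Carlo method, assuming only NumPy. The target density, step size, and helper name metropolis_hastings are illustrative.

    # Random-walk Metropolis-Hastings for a simple 1-D target density.
    # Illustrative only -- target and proposal are arbitrary toy choices.
    import numpy as np

    rng = np.random.default_rng(2)

    def log_target(x):
        # Unnormalized log-density of an equal-weight mixture of two Gaussians.
        return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

    def metropolis_hastings(n_samples, step=1.0, x0=0.0):
        x = x0
        samples = []
        for _ in range(n_samples):
            # Symmetric Gaussian random-walk proposal.
            x_prop = x + step * rng.normal()
            # Accept with probability min(1, target(x_prop) / target(x)).
            if np.log(rng.uniform()) < log_target(x_prop) - log_target(x):
                x = x_prop
            samples.append(x)
        return np.array(samples)

    samples = metropolis_hastings(10_000)
    print(samples.mean(), samples.std())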

Applications